It would be nice if it were possible for users to create UIs for their third-party plugins that could be used directly in the Devices panel, without needing to open each plugin's VST window.
Users would no longer have to open each VST's graphical window while working on them, and would enjoy the improved workflow of working directly in the Devices panel.
They would also get a view that improves on the default one Bitwig provides for third-party plugins.
Something like this is already implemented in Ableton Live, where Max for Live devices can have their own native UI like any first-party plugin.
As a stopgap, control over the scaling of the display values of knobs/macros would help with reading them quickly on screen or on hardware. Some VSTs report incomprehensible parameter values.
I think @andrei_olenev or @x.iso mentioned something about the possibility of having building blocks to create instruments and modulators, but now I can't remember for sure. These building blocks could perhaps also be used to produce native UIs for third-party plugins (a neat idea indeed).
Great idea!
Those interfaces should be exportable and importable, so users could share them and build a database.
I guess it wouldn't take long until we had nice UIs for most major third-party plugins.
Making a custom UI for a VST and making a custom UI for a custom device are entirely different things, though. Given this exact example, you just can't derive things like the waveform of a sample inside a plugin; you could fake a UI for an EQ, but it would not truly represent what the plugin actually does.
That being said, there's already a feature request for more options on buttons and macro controls; stuff like custom enum lists would be useful as well. And I would welcome a custom UI panel for devices in general, for cases where remote controls just don't cut it.
I'm afraid it wouldn't make sense to build a custom UI builder on top of Remote Controls, since Remote Controls are designed to be auto-mappable to 8 CCs. It seems better to offer such a thing only for VSTs and Grid devices, so a dedicated button on the device, rather than on its header, might make more sense.
tl;dr: limited, composable parameter maps
(this got out of hand)
I'd mostly use it for bringing controller <-> mod relationships forward rather than for deep editing, just for putting comprehensible data in the view screen.
Paring it right back, you could have something like the following.
All UI items are columns, of 3 types: continuous, discrete, and grid.
a) continuous, visualized as a fader.
b) discrete, as a vertical column of buttons, 8 high.
c) grid, connecting columns horizontally (8+8, or 8+8+8+8),
with the following mappings onto controllers:
i) continuous and discrete map to 1 parameter, with a numeric/enum view function.
ii) discrete can additionally use one boolean mapping per button in the column.
iii) grid maps to one or two parameters, either linear (0 up to width * height) or as an X/Y pad (x = column, y = row).
Examples: an EQ or envelope via several sequential discrete or continuous columns, one parameter each; a grid for X/Y-type controllers.
Add a phase input and you can simulate sequence position along a grid's horizontal axis.
Since there's no fine-tuning of the layout, columns just stack left to right where they can be seen when you're in that lane, and the result of the view functions can be passed to controllers.
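For concreteness, the column/mapping scheme above could be sketched as a small data model. All names here are hypothetical illustrations of the proposal, not any real Bitwig API:

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple, Union

# Hypothetical data model for the three column primitives described above.
ViewFn = Callable[[float], str]  # normalized value -> display text

@dataclass
class ContinuousColumn:
    """Shown as a fader; maps to one parameter via a view function."""
    param: str
    view: ViewFn = lambda v: f"{v:.2f}"

@dataclass
class DiscreteColumn:
    """A vertical column of 8 buttons; one parameter, or one boolean per button."""
    param: str
    labels: List[str] = field(default_factory=lambda: [str(i) for i in range(8)])
    booleans: Optional[List[str]] = None  # optional per-button boolean targets

@dataclass
class GridColumn:
    """Columns joined horizontally (8+8, 8+8+8+8); one linear param or an x/y pair."""
    width: int
    height: int = 8
    linear_param: Optional[str] = None          # values 0 .. width*height - 1
    xy_params: Optional[Tuple[str, str]] = None  # (x_param, y_param)

    def cell_to_value(self, col: int, row: int) -> Union[int, Tuple[int, int]]:
        if self.linear_param is not None:
            return row * self.width + col  # linear index across the grid
        return (col, row)                  # X/Y pad coordinates

# Columns simply stack left to right, as in the proposal:
layout = [
    ContinuousColumn("cutoff", view=lambda v: f"{20 * (1000 ** v):.0f} Hz"),
    DiscreteColumn("osc_mode", labels=["saw", "sqr", "tri", "sin", "nz", "s&h", "fm", "pd"]),
    GridColumn(width=16, linear_param="step"),
]
```

The point of the sketch is only that two primitives plus a composition rule already cover faders, enum selectors, and step/X-Y surfaces.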
Thanks BabelJenga, but it's hard to understand what you wrote here. Are you describing a way to improve the experience of using certain VSTs with hardware controllers?
Partially that, but it applies even in general use: what's the mapping between a knob and a value in a VST parameter? 0-1.0 is often all you'll get, so there's no means to set snapping values or scales that refer to what you'd actually like to play. The instrument window must remain open, which hurts the playability of your session as a whole.
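To illustrate the 0-1.0 problem: a host often only sees a normalized value per parameter, and any readable display or snapping has to be layered on top by the host or the user. A minimal sketch of user-defined view and snap functions follows; the log scaling and note-frequency targets are illustrative assumptions, not anything a real host exposes:

```python
# Illustrative only: hosts typically receive just a normalized 0..1 value.
# User-defined "view functions" could turn that into something playable.

def norm_to_hz(v: float, lo: float = 20.0, hi: float = 20000.0) -> float:
    """Map a normalized 0..1 value to a frequency on a log scale."""
    return lo * (hi / lo) ** v

def snap(value: float, choices: list) -> float:
    """Snap a value to the nearest entry in a user-supplied list."""
    return min(choices, key=lambda c: abs(c - value))

# A knob at 0.5 with no view function reads as "0.50", which is meaningless.
raw = 0.5
hz = norm_to_hz(raw)                        # ~632.5 Hz on the log scale
note_freqs = [220.0, 440.0, 880.0, 1760.0]  # assumed snapping targets (A notes)
snapped = snap(hz, note_freqs)              # snaps to 440.0
print(f"raw={raw:.2f}  display={hz:.1f} Hz  snapped={snapped} Hz")
```

This is exactly the kind of per-parameter scaling and snapping that raw 0-1.0 values from a plugin can't give you.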
So what’s the simplest design that would allow pulling up control surfaces from the depths of whatever signal path you’ve set up?
Macro knobs are great, but there are loads of things they don’t do, and they take up a lot of space for the information they represent.
So then you get to the question: what could you design that wouldn't be too hard to implement or use, that would give all the typical information and control over modulation you'd want, on the smallest number of UI primitives, so you can get your hands off the keyboard/mouse and onto the controller?
So the UI primitives are a vertical slider and a vertical column of 8 buttons, with a little composability to get X/Y controls.
You can see and control the 3rd EQ (or whatever) of a synth buried somewhere in an instrument layer that would otherwise be 2-5 seconds away with the mouse/keyboard.
These would stack left to right, and you’d need control over a few things to change what you see as the textual readout.
This way you get a UI covering most of what you need, working quickly, with the minimum amount of complexity possible.
Removing canvas-like placement simplifies it; limiting it to 2 primitives and three forms limits it further. I dropped in tabs as an afterthought because you'll have, say, an EQ where one parameter does gain and another Q, and you might want those as layers in a tab control to limit the width of your layout even further.
If the plugin standard allowed for custom GUIs intended for display in the devices view, then there wouldn't be a need for Bitwig to implement a Bitwig-specific UI configurator.
I think the Decapitator (Soundtoys) picture is a great example for the main post. I instantly got it when I saw that. I understood the main post, but I didn't really see the vision until I saw that Decapitator pic.
Elisabeth Homeland took this concept to an extreme with Ableton's Max for Live, creating wrappers for Kilohearts, Valhalla, Soundtoys, etc., so you don't need to pull up a bunch of plugin windows. It's a cleaner way of working, though going as far as making the UIs look basically identical is a bit extreme. She does a good job. Her GUIs also have buttons that can pull up the plugin window, which is great.
Her devices cost money, of course. I'm not trying to market her stuff, but it's interesting and shows why this can be useful.
This can also be done in FL Studio using Patcher. Obviously it's not the same rack workflow as above, but I thought I'd mention that you can build custom interface faceplates for VSTs.
Studio One also allows this, I believe, or was it Cubase? One of the two…