Partially that, but even in general use, what's the mapping between a knob and a value in a VST controller? A normalized 0-1.0 range is often all you'll get, so there's no way to set snapping values or scales that refer to what you'd actually like to play. And the instrument UI has to stay open, which hurts the playability of your session as a whole.
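To make the complaint concrete: here's a minimal, hypothetical sketch of the kind of indirection a raw 0-1.0 parameter doesn't give you, snapping a normalized knob value onto a list of musically meaningful values (the value list and function name are my own, not from any real plugin API).

```python
# Hypothetical: map a normalized 0-1.0 VST parameter onto the nearest
# entry of an ordered list of musically useful values.

def snap_normalized(value, choices):
    """Snap a 0-1.0 value to the nearest entry in an ordered list."""
    index = round(value * (len(choices) - 1))
    return choices[index]

# e.g. snapping a filter-cutoff knob to octave steps in Hz
OCTAVES_HZ = [55, 110, 220, 440, 880, 1760, 3520, 7040]

print(snap_normalized(0.0, OCTAVES_HZ))  # 55
print(snap_normalized(1.0, OCTAVES_HZ))  # 7040
```

A plain 0-1.0 knob sweeps continuously past all of these; the snapping layer is what the controller, not the plugin, would have to supply.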
So what’s the simplest design that would allow pulling up control surfaces from the depths of whatever signal path you’ve set up?
Macro knobs are great, but there are loads of things they don’t do, and they take up a lot of space for the information they represent.
So then you get to: what would you design, not too hard to implement or to use, that gives the typical information and modulation control you'd want on the smallest number of UI primitives, so you can get your hands off the keyboard/mouse and onto the controller?
So the UI primitives are a vertical slider and a vertical column of 8 buttons, with a little composability to get X/Y controls.
You can see and control the 3rd EQ band or whatever of your synth, buried somewhere in an instrument layer that would otherwise be 2-5 seconds away with the mouse/keyboard.
These would stack left to right, and you’d need control over a few things to change what you see as the textual readout.
This way you get a UI that covers most of what you need, working quickly, with minimal complexity.
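As a sketch of how small that model could stay, here's a hypothetical data layout (all names are mine, assumed for illustration): the two primitives, an X/Y composition of two sliders, and a surface that stacks columns left to right, each carrying its textual readout as a label.

```python
# Hypothetical sketch of the layout model: two primitives (a vertical
# slider, a column of 8 buttons), composable into X/Y controls, stacked
# left to right on a surface. Labels are the textual readout.
from dataclasses import dataclass, field

@dataclass
class Slider:
    label: str            # textual readout, e.g. "EQ3 gain"
    value: float = 0.0    # normalized 0-1.0, as the plugin exposes it

@dataclass
class ButtonColumn:
    label: str
    buttons: list = field(default_factory=lambda: [False] * 8)

@dataclass
class XYControl:
    # composing two sliders gives an X/Y pad
    x: Slider
    y: Slider

@dataclass
class Surface:
    columns: list = field(default_factory=list)  # stacked left to right

surface = Surface()
surface.columns.append(Slider("EQ3 gain"))
surface.columns.append(Slider("EQ3 Q"))
surface.columns.append(ButtonColumn("snapshots"))
print(len(surface.columns))  # 3
```

Three types and a list is roughly the whole model; everything else is just mapping labels and values to whatever the signal path exposes.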
Removing canvas-like placement simplifies it; limiting it to two primitives and three forms limits it further. I dropped in tabs as an afterthought because you'll have, say, an EQ where one parameter does gain and another Q, and you might want those as layers in a tab controller, which would narrow your layout even further.
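The tab idea could be as thin as this hypothetical sketch (names assumed, not from any real API): related controls share one column of horizontal space, and selecting a tab swaps which one is shown.

```python
# Hypothetical: a tab controller layering related controls (say an EQ
# band's gain and Q) behind one column of the layout.

class TabController:
    def __init__(self, pages):
        self.pages = dict(pages)              # tab label -> control
        self.active = next(iter(self.pages))  # first tab starts active

    def select(self, label):
        """Switch tabs if the label exists; return the visible control."""
        if label in self.pages:
            self.active = label
        return self.pages[self.active]

tabs = TabController({"gain": "EQ3 gain slider", "Q": "EQ3 Q slider"})
print(tabs.select("Q"))  # EQ3 Q slider
```

Two parameters, one column width: that's the whole trade the afterthought buys.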
I’m not sure if that clarifies it. I hope so.