Hi Z
You sound almost disappointed and astonished. Don’t be. Once you understand the few basics it will make sense. It may not be as “handy” a time saver as a digital workflow - but it’s “real”!
I guess that you have to get your head around one fundamental: the Voyager is an instrument, just like an oboe, clarinet or piano. Instead of using wind through a reed, or a vibrating wire, to produce sound, it uses electronics to create resonant tones. It's still totally analogue (excuse my "proper" spelling - I'm from the colonies! LOL) in that respect, and you must first treat it as you would any of the other instruments I've mentioned in order to play it.

Modern times have afforded a few developments, and we can now "drive" these analogue synths digitally - i.e. via MIDI - to the extent that we can "tell" the instrument what pitch to play, when to play it, for how long, and with what effect applied within the instrument, to ultimately output that result at the audio sockets. This is where you get to hear tangible music for the first time in the chain: on output, in real time. You get your DAW to record this audio just as you would ask a clarinet player to play their instrument in front of a mic. This isn't telling you anything new. Unless the clarinet player has two instruments in his/her mouth at the same time, they'll only play one note source at a time. Same with Voyager presets: the synthesiser only plays the current preset in real time, based on what either the keyboard or a MIDI stream dictates.
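To make the "telling the instrument what to play" part concrete, here's a rough sketch of the raw bytes a DAW actually streams down the MIDI cable: a three-byte Note On message, and a matching Note Off. The channel and note numbers are just illustrative.

```python
def note_on(channel, note, velocity):
    """Note On: status byte 0x90 OR'd with the channel (0-15),
    then note number and velocity (each 0-127)."""
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """Note Off: status byte 0x80 OR'd with the channel,
    then note number and a release velocity of 0."""
    return bytes([0x80 | channel, note, 0])

# "Play middle C (note 60) on channel 1 at velocity 100"
msg = note_on(0, 60, 100)
print(msg.hex())  # 903c64
```

The synth holds the note until the matching `note_off` arrives - which is exactly how "when to play it" and "how long for" get encoded.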
Your DAW is capable of performing on-the-fly automation (program and filter changes, etc.) on the Voyager as MIDI streams to it - i.e. as it's playing in real time. In this respect it's similar to a soft synth, for the most part anyway. What's different is explained here (look for the section on automating CCs - it's Live vs. Logic, but you'll get the idea): http://www.soundonsound.com/sos/feb09/articles/livetech_0209.htm
Here’s a typical MIDI recording scenario: you’re recording an instrument that responds to expression or performance controls, such as altering its vibrato with the mod wheel, or changing filter cutoff in response to a particular knob on your controller keyboard. This works using standard MIDI Continuous Controller (CC) messages. If you’re using a hardware MIDI sound source, you can record MIDI notes and CC data simultaneously in Live. Any control movement that generates CC data will be recorded as Clip automation within the Notes Clip in your MIDI track. You can also overdub controller data into an existing Clip if MIDI Overdub mode is active. Continuous Controller data is viewed in the same place as automation Clip Envelopes (there's a screenshot in the article).
MIDI continuous controller data recorded in a Clip can be viewed in the Clip Envelopes section.
In Clips that can have envelopes for more than one device (such as Effects devices), MIDI CC envelopes are grouped as ‘MIDI Ctrl’.
If you’re recording MIDI for an external device, such as a hardware synth, everything works as expected. Unfortunately, real-time recording of modulation for devices within Live, such as internal instruments, effects, mixer parameters and Clip parameters, is not, on the face of it, possible. Here’s why:
To control a parameter in Live, you use the MIDI Map mode to assign it to a knob on your MIDI controller. Surely, you think, you can now record MIDI CCs from that knob into a Clip (as you would with a hardware instrument) and the control will be automated. But no: once a Live parameter has been mapped to a MIDI controller, the MIDI CC channel used for the assignment is disabled from recording. This is ostensibly to avoid conflicts between multiple layers of the same controller. As an aside, I don’t really see the problem, as Live is already able to handle relative relationships between Clip Envelopes and on-screen controls, and can also suspend automation in the Arrangement when a control is moved manually, and subsequently pick up the automation when you press the Back To Arrangement button.
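Those CC messages the article talks about are just as simple on the wire as notes are: three bytes again, with a 0xB0 status. A knob sweep is nothing more than a stream of these. Sketch below - CC 1 is the mod wheel by convention; the filter-cutoff CC number varies from synth to synth, so the 74 here is only an assumption (it's a common choice).

```python
def control_change(channel, controller, value):
    """Control Change: status 0xB0 OR'd with the channel,
    then controller number and value (each 0-127)."""
    return bytes([0xB0 | channel, controller, value])

MOD_WHEEL = 1        # standard CC assignment
FILTER_CUTOFF = 74   # assumption: many synths map cutoff here, yours may differ

# A slow filter sweep, as the DAW would record it: a series of CC events
sweep = [control_change(0, FILTER_CUTOFF, v) for v in range(0, 128, 16)]
print(len(sweep))  # 8
```

When Live records "Clip automation" from your knob, it's capturing exactly this kind of event stream, time-stamped, inside the Clip.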
Digital soft synths are a completely different animal. They are not instruments as such; they are programs. Where they differ is that they "render" real sound from their instruction set through an interface. When you bounce a track containing a line of instructions to a software synth, the DAW looks at the set parameters and uses them to build the sound file on the fly, then saves it to disk. It doesn't need to "play" the track data in real time, at the pace you'd listen to it, but rather as fast as your CPU is capable of processing the data.
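A toy illustration of that last point, under my own assumptions (a fake one-oscillator "soft synth", made-up buffer sizes): the only difference between real-time playback and an offline bounce is whether the engine waits out each buffer's duration or just computes the next one immediately.

```python
import math
import time

SAMPLE_RATE = 44100
BUFFER = 512  # frames per processing block (illustrative)

def render_buffer(start_frame, freq=440.0):
    """Toy 'soft synth': render one buffer of a sine oscillator."""
    return [math.sin(2 * math.pi * freq * (start_frame + i) / SAMPLE_RATE)
            for i in range(BUFFER)]

def bounce(n_buffers, realtime=False):
    out = []
    for b in range(n_buffers):
        out.extend(render_buffer(b * BUFFER))
        if realtime:
            # Real-time playback: wait for the buffer to actually "play out"
            time.sleep(BUFFER / SAMPLE_RATE)
    return out

audio = bounce(8)  # offline bounce: runs as fast as the CPU allows
```

With a hardware synth like the Voyager you're always in the `realtime=True` case - the sound only exists at the audio sockets, at playback speed - which is why bouncing it means recording it like any acoustic instrument.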