How does MIDI Polyphonic Expression (MPE) work in modern DAWs?

Post

I've read that as of 2018 we have MPE, which works by assigning each note to its own channel and allowing pitch bends per channel:
https://www.midi.org/articles-old/midi- ... ession-mpe

This allows you to then bend or glide notes individually rather than all together.
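For illustration, here's a rough sketch (not any particular DAW's implementation) of the raw MIDI messages an MPE device sends. Notes go out on separate "member" channels, so a pitch bend message on one channel moves only that one note:

```python
# Hypothetical sketch of how an MPE controller transmits a chord so each
# note can be bent independently. Channel 1 is the MPE "master" channel;
# notes go on member channels 2-16.

def note_on(channel, note, velocity):
    """Build a raw MIDI note-on message (channel is 1-based)."""
    return [0x90 | (channel - 1), note, velocity]

def pitch_bend(channel, bend_14bit):
    """Build a raw MIDI pitch-bend message (0..16383, 8192 = no bend)."""
    return [0xE0 | (channel - 1), bend_14bit & 0x7F, (bend_14bit >> 7) & 0x7F]

# An A minor chord: each note gets its own member channel.
chord = [(2, 57), (3, 60), (4, 64)]   # (channel, MIDI note): A3, C4, E4
messages = [note_on(ch, n, 100) for ch, n in chord]

# Bend only the C (channel 3) upward; the A and E are unaffected because
# pitch bend is a per-channel message.
messages.append(pitch_bend(3, 8192 + 1024))
```

This is exactly why a part recorded onto one shared channel can't have a single note bent: the bend message has no note number in it, only a channel.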

How does this work in modern DAWs like Cubase? Do you need to record your MIDI with every note on its own channel first? Or does it just reassign things to their own channels as needed?

I.e. let's say you've recorded a simple keyboard part, an A minor chord, with a regular MIDI keyboard onto one shared channel. Then you want to bend the C to a C#.

Do you need to separate all the notes out into different channels to do this? Or what?

Has anyone played with this function to any extent or know how to use it?

Post

Bitwig: Native
Tracktion Waveform: Native
Ableton: Nope, but doable.
Cubase: YES (e.g. https://support.roli.com/support/soluti ... ith-cubase)

See following lists for others...

See here for a good list of can-do's and workarounds: https://support.roli.com/support/soluti ... nstruments

Also here: http://www.rogerlinndesign.com/ls-recom ... ounds.html
Last edited by Karbon L. Forms on Sun Dec 02, 2018 12:44 am, edited 1 time in total.
.................................
"Hell is other People" J.P.Sartre
.................................

Post

If you have a multi-timbral instrument you can fake it with channels. Or you can hook up an array of identical plugins, each responding to one channel. It's really not rocket science, just a pain in the hole. I did all this crap whilst waiting on my Seaboard/writing an Android controller.
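The channel-rotation part of that workaround could be sketched like this (names are illustrative, not any real API): each new note is routed to the next free channel, so a per-channel pitch bend only touches that one note.

```python
# Sketch of the "array of identical plugins" workaround: a simplistic
# allocator that gives each held note its own MIDI channel. Real
# implementations also need to handle running out of free channels.

class ChannelAllocator:
    def __init__(self, channels=range(2, 17)):
        self.free = list(channels)       # member channels 2-16
        self.held = {}                   # note number -> channel

    def note_on(self, note):
        ch = self.free.pop(0)            # assumes a channel is free
        self.held[note] = ch
        return ch

    def note_off(self, note):
        ch = self.held.pop(note)
        self.free.append(ch)             # channel becomes reusable
        return ch

alloc = ChannelAllocator()
# Play an A minor chord: each note lands on its own channel.
channels = [alloc.note_on(n) for n in (57, 60, 64)]   # -> [2, 3, 4]
```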

Post

And yes, bending one note of a chord is pure joy. Guitar style double bends and all that.

Hopefully someone better at explaining shit will be along shortly. lol

Post

Karbon L. Forms wrote: Sat Dec 01, 2018 9:42 pm Bitwig: Native
Tracktion Waveform: Native
Ableton: Nope, but doable.
Cubase: No but doable (e.g. https://support.roli.com/support/soluti ... ith-cubase)

See following lists for others...

See here for a good list of can-do's and workarounds: https://support.roli.com/support/soluti ... nstruments

Also here: http://www.rogerlinndesign.com/ls-recom ... ounds.html
Actually, Cubase 10 supposedly added proper MPE support. I’ve upgraded but haven’t tested that out yet.
Incomplete list of my gear: 1/8" audio input jack.

Post

I'm sure jancivil uses Cubase and has been using MPE lately...?

Post

I've got Cubase. I can't see why it wouldn't support MPE.

It says it's supported here:
Digital audio workstations (DAWs)

Bitwig 8-Track & Studio
GarageBand macOS
Logic Pro X
Reaper
Steinberg Cubase
Steinberg Cubasis 2.6
Tracktion Waveform
https://www.midi.org/articles-old/midi- ... ession-mpe
https://support.roli.com/support/soluti ... ith-cubase

I started two threads over at Juce forum, one of which provided some more info:

https://forum.juce.com/t/is-the-mpesynt ... s/30644/10

The other hasn't gotten any replies yet (specifically on how to use the data from aftertouch pressure, and how pitch bend data gets in there from the Roli vibratos and slides):

https://forum.juce.com/t/typical-usage- ... roli/30647

If anyone knows more on this subject and can chime in here that would be great. I asked:
From what I've seen, it appears the pitch bends in MPE controllers like the Roli ones (whether from shaking your finger or sliding it down the bottom pad between notes) are being transmitted as simple pitch bends. I.e. the only differences between them and a pitch wheel are:

The MPE bends are being transmitted per note (ie. per channel).
The MPE bends are easier to calibrate on a per-pitch basis using an appropriate controller. E.g. for vibrato, you can set the controller/synth to allow a maximum of half a semitone in either direction; a glide from C down to A can then be recognized clearly as a three-semitone pitch change.

I presume that the vibrato pitch bend range must be established on the device, since vibratos and finger slides between notes are both transmitted as pitch bends in the MIDI stream, so the synth can't tell the difference between them.

I'll have to review the example synths, but I think this should be pretty straightforward. Is that correct? Anything I'm missing there?
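A minimal sketch of that decoding, assuming the standard 14-bit pitch bend encoding. The bend range itself (e.g. the common MPE default of ±48 semitones on member channels) is indeed whatever the device and synth agree on; the synth just scales the incoming value by it:

```python
# Sketch of decoding a per-note pitch bend into semitones. The synth can't
# distinguish vibrato from a finger slide; both arrive as bends, and only
# the agreed-upon bend range gives them meaning in semitones.

def bend_to_semitones(bend_14bit, bend_range=48.0):
    """Map 0..16383 (centre 8192) to -bend_range..+bend_range semitones."""
    return (bend_14bit - 8192) / 8192.0 * bend_range

# A glide from C down to A is simply a bend of -3 semitones on that channel.
# With a 48-semitone range: 8192 - (3 / 48) * 8192 = 7680.
semis = bend_to_semitones(7680)   # -> -3.0
```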

What is the typical application of the aftertouch “pressure” data? From what I got in the recent ADC video, it looks like they're using it as a “gain” multiplier.

I.e. in a typical ADSR-based synthesiser, you could multiply the envelope by another gain variable. The envelope's maximum level would be set by the velocity as usual. But if you then lean into the note or lighten your touch, the gain multiplier changes accordingly, amplifying or reducing the total envelope output by whatever factor you dictate.

I presume you would want somewhere in the synth to dictate the range of the gain control, e.g. a max of +6 dB and a min of -6 dB, mapping the pressure data linearly to the decibel scale before converting those dB changes to a basic multiplication factor.

I also presume you would want the aftertouch pressure data to be relative to the initial pressure level, so the pressure starting point always gives a factor of “1” in the gain multiplier.
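Putting those last three paragraphs together, a minimal sketch of that idea (the ±6 dB range and all the names are just illustrative assumptions, not any synth's actual API):

```python
# Sketch of pressure-as-gain: 7-bit aftertouch pressure maps linearly onto
# a dB range, and the factor is normalised so the note's initial pressure
# level always yields exactly 1.0.

def pressure_to_db(pressure, min_db=-6.0, max_db=6.0):
    """Map 7-bit aftertouch pressure (0..127) linearly onto decibels."""
    return min_db + (pressure / 127.0) * (max_db - min_db)

def db_to_gain(db):
    """Convert decibels to a linear amplitude multiplier."""
    return 10.0 ** (db / 20.0)

def relative_gain(pressure, initial_pressure):
    """Gain factor of exactly 1.0 at the note's initial pressure level."""
    return db_to_gain(pressure_to_db(pressure) - pressure_to_db(initial_pressure))
```

In an ADSR synth, `relative_gain(...)` would then simply multiply the envelope output per sample or per block; leaning in pushes it above 1.0, lightening the touch pulls it below.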

Is this the correct idea behind a basic application for these functions? Thanks.
Last edited by mikejm on Sat Dec 01, 2018 11:26 pm, edited 1 time in total.

Post

This video is good if anyone wants a primer on the subject:

https://www.youtube.com/watch?v=Xqj8BVdI0Tg

Post

deastman wrote: Sat Dec 01, 2018 10:38 pm
Karbon L. Forms wrote: Sat Dec 01, 2018 9:42 pm Bitwig: Native
Tracktion Waveform: Native
Ableton: Nope, but doable.
Cubase: No but doable (e.g. https://support.roli.com/support/soluti ... ith-cubase)

See following lists for others...

See here for a good list of can-do's and workarounds: https://support.roli.com/support/soluti ... nstruments

Also here: http://www.rogerlinndesign.com/ls-recom ... ounds.html
Actually, Cubase 10 supposedly added proper MPE support. I’ve upgraded but haven’t tested that out yet.
Cool. I got muddled up with my quick google.

Post

Jury-rigging a JUCE MPESynthesiser and using Tracktion Waveform was how I got started with MPE. It's braindead easy.

Post

MPE matters most if you have a controller like the Seaboard, a LinnStrument or a Continuum. They simply spit out the notes on different MIDI channels, and Cubase can deal with that natively! Cubase also has Note Expression, which does the same thing within the DAW, kind of decoupled from controllers: you can edit a pitch curve for each note...
Of course the instrument plug-ins need to support it as well. Note Expression is part of the VST 3 spec, and the newest version of that spec also supports MPE. That means the host will translate MPE to note expressions. Bitwig, for example, is one of the hosts that supports that translation as well; I was able to play HALion patches expressively with my LinnStrument even though HALion wasn't (yet) prepared for MPE...
If you don't have such a controller and Cubase is your go-to DAW, you are already set... If you do get such a controller (I love my LinnStrument, a real game changer...) you will sooner or later reject all those instrument plug-ins that don't support MPE, for their lack of expressivity...

Post Reply

Return to “Production Techniques”