The world of controllers has evolved quite rapidly since I started using computers for music back in 2009. The choices are almost endless, and you can get your hands on almost any configuration of knobs, buttons, sliders, and even touch screens. Integration options have evolved right alongside these controllers. There are complex marriages between hardware and software, such as Push and Live, or the Maschine line of products. How do you resist the urge to continually update and reshape your collection of controllers? You learn how to control them. Defining the interaction between hardware and software is easier than ever before, and doesn’t require any advanced programming (although it definitely can go that route).
This series of articles will go through some of the basics of communicating with a MIDI controller (in this case the QuNeo from KMI), observing properties of Live via Max for Live (M4L from here on out) and the Live API, and controlling aspects of the Live set from within an M4L patch/device. We will use a pair of devices that I recently made as an example, which you can download HERE.
These devices work together to monitor the audio levels on the first four tracks and translate them to stereo VU meters on the QuNeo pads. You can solo and mute these tracks using the lower left and right corners of the corresponding meters, and launch clips using the other pads on the meter. You can use this with factory preset 5 right out of the box, but you might want to turn off the local LED control for the pads. Place a copy of the ‘Volume_Observer’ device on the first four tracks, and select which column you want to send the data to. Place the ‘QuNeo_Meters’ device on an empty MIDI track, with the input set to QuNeo, the output set to QuNeo on channel 2, and the track’s monitoring set to ‘In’. This is all set up in the demo set for you.
I’m not going to go into the gritty detail of every patch cord and object, but I will explain some of the fundamental concepts of using the Live API and the M4L API objects. If you’re a Max user, you should have no trouble following along. If you’re a Live user who wants to get into M4L patching, this will hopefully get you started and show you some ways of interacting with the Live API.
Let’s start by looking at the ‘Volume_Observer’ device.
This is a very simple device that gets information about the audio of a track to the device that will use that data to display meters. It has a row select menu, a gain value, and two meters that show the input level. If you click on the device’s Edit button, it will open up in the Max editor. It will look just as it did in Live until you click the ‘Patching Mode’ button in the toolbar at the bottom of the Max window. This will expand the device to show all of the Max objects and the patch cords connecting them.
This device takes the audio from the track, sends it through a meter, then uses the [forward] object to send the stereo data to a named [receive] in our ‘QuNeo_Meters’ device. There are many different ways to track the amplitude of audio signals, and I chose one of the simplest. When using audio data to drive LEDs, you don’t want to update the display at audio rate; this is unnecessary and can cause performance loss if not dealt with. Luckily, the [meter~] object has an interval parameter that lets you define how often the object reports its value. Using this, you can limit the amount of data and still get very pleasing visual results on your controller.
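The device does this with [meter~]’s interval attribute, but the same throttling idea applies if you ever handle levels in JavaScript instead. Here’s a minimal sketch for Max’s [js] object; the 50 ms interval is my own choice, not something taken from the device:

```javascript
// Throttle incoming level values so the display is only
// updated at a fixed rate instead of at audio rate.
inlets = 1;
outlets = 1;

var latest = 0; // most recent level received

// store each incoming value without outputting it yet
function msg_float(v) {
    latest = v;
}

// a Task fires every 50 ms and forwards the stored value
var updater = new Task(function () {
    outlet(0, latest);
}, this);
updater.interval = 50;
updater.repeat();
```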
Although this is a simple device, it demonstrates a pretty powerful feature of using Max inside of Live. You can communicate with other devices in your set using [send] and [receive] pairs (or the dynamically assignable [forward] object). This allows you to make devices that work together, or are influenced by each other, and place them on any track in Live. This can also be used to get around the limitation of only being able to process audio OR MIDI data. Using these objects, you can create devices that handle both types of data. (In the past this technique introduced some delay that was dependent on Live’s audio buffer, but upon doing some research, it seems this may have been improved in more recent versions.)
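The same named-send idea is also available from JavaScript via messnamed(), which sends a message to any [receive] object with a matching name. A tiny sketch of the idea; the name “meter_data_1” is just an example, not the name used in these devices:

```javascript
// Forward a stereo level pair to a named [receive] object
// anywhere in the Live set, from inside a [js] object.
inlets = 1;

function list(left, right) {
    // "meter_data_1" is an example name, not the devices' actual one
    messnamed("meter_data_1", "list", left, right);
}
```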
Let’s now move on to where the magic happens: the ‘QuNeo_Meters’ device. It’s another very simple interface, with gain controls for each channel, meters representing what will be seen on the QuNeo, and an option to toggle the observing between Max objects and JavaScript. Upon opening the device and switching to patching mode, you can see some distinct sections of the patch. Let’s start by looking at where the data from our Volume_Observers enters the patch: the four [r] objects.
This data is then processed a bit to put it in the range of 0-8 for our LEDs. All of the math is done in the [LED Math] sub patchers, which you can see below. We unpack the audio data coming into our patch, scale it to the range 0-8, apply the gain scaling, and then do some interpolation using the [line] object. I actually fudged this a little and allow the data to reach 8, which makes it possible for the red clipping LEDs to light up despite the smoothing we do with the [line] object.
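If it helps to see that arithmetic outside of patch cords, here is the same scaling written out in JavaScript. The 0.-1. input range and the gain behavior are my reading of the patch, so treat this as a sketch rather than the device’s exact math:

```javascript
// Map an incoming amplitude (assumed 0.-1. from the metering)
// into the 0-8 range that drives one column of QuNeo LEDs.
inlets = 2;  // left: amplitude, right: gain
outlets = 1;

var gain = 1.0; // gain scaling from the device UI

function msg_float(v) {
    if (inlet == 1) {         // right inlet sets the gain
        gain = v;
        return;
    }
    var leds = v * 8. * gain; // scale to the LED range
    if (leds > 8.) leds = 8.; // clamp; reaching 8 lights the red clip LED
    outlet(0, leds);
}
```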
This is all contained in a sub patcher. Sub patchers are pieces of code that are wrapped up, or encapsulated, in the language of Max. This is useful for consolidating ideas and keeping clutter under control. It also allows you to reuse bits of code throughout the patch. You can prototype ideas in one instance, and then just ‘paste replace’ (right click to bring up the contextual menu) the finished code to update all the other instances.
The data is then fed into a [quneoMeters] abstraction. Abstractions are similar to sub patchers but have a few features that extend their usefulness. Abstractions are actually separate patcher files that function similarly to Max objects in a few key ways. They can be used elsewhere in the patch, and any edits you make to the file on your hard drive will immediately be present in all instances of the abstraction. This can make maintaining, updating, and reusing code much easier, especially if the abstraction has a lot of patch cords, objects, or connections. Repatching a mess of patch cords can quickly become time consuming, and using abstractions takes away much of the tedium. Abstractions can also be given arguments. Notice the 0, 1, 2, 3 in the [quneoMeters] objects. If you double click the abstraction to open it, you will see that the [loadmess] object takes this argument as its own. If you then unlock the patcher, you will see the argument change to a #1. That tells the patcher to replace that symbol with the first argument given to the abstraction. If you needed more variables/arguments, you’d simply add a #2 somewhere in the patch, and that would take the second argument given. You can now reuse the code but alter its innards on initialization using arguments.
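JavaScript objects in Max get the same kind of initialization arguments through the jsarguments array, in case you prefer scripting. A quick illustrative sketch; the script name and the column index are hypothetical:

```javascript
// [js columnArg.js 2] -- like #1 in an abstraction, jsarguments
// lets a script take typed-in arguments. jsarguments[0] is the
// script name; jsarguments[1] is the first argument.
outlets = 1;

var column = 0; // default when no argument is given

if (jsarguments.length > 1) {
    column = jsarguments[1]; // e.g. 0, 1, 2 or 3
}

function bang() {
    outlet(0, column);
}
```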
Each of the abstractions contains two [LED_handling] subpatches that take care of lighting and unlighting the LEDs based on the volume level sent into them. If you open those up, you can see a JavaScript object off to the right containing a few lines of code that do exactly what the huge group of Max objects at the top of the subpatch does. I like to do iterations and checks in JavaScript because it’s just so much cleaner, and generally easier to read, than with Max objects. You can compare the code on the right with that collection of [split] objects and decide which one is easier to read. If you’d like to know more about JavaScript in Max, there are some good resources floating around the web, but it falls outside the scope of this discussion. I was going to do a lot of this patch with JavaScript but ran into some bugs. Max 7 has some quirks that hopefully will be worked out, but for the time being I decided to go with straight Max objects. It’s messy but stable.
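To give a flavor of that iterate-and-check style, here is a minimal sketch of the same job: walk up the column of pads and decide which should be lit for the current level. The exact object in the device may differ; this is just the general shape:

```javascript
// For a level of 0-8, light the bottom 'level' LEDs in a column
// and unlight the rest -- the same job as the pile of [split]
// objects, written as one loop.
outlets = 1;

function msg_int(level) {
    for (var i = 0; i < 8; i++) {
        var on = (i < level) ? 127 : 0; // velocity: lit or unlit
        outlet(0, [i, on]);             // pair: LED index, on/off
    }
}
```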
The numbers are then sent through some objects that scale the data to fit the 0-127 range of MIDI data and assign the proper note numbers to light the LEDs. This data is sent to our [midiformat] object for proper MIDI formatting, and then out of [midiout] to continue on down the track in Live and out to our QuNeo.
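For reference, that note-and-velocity assignment could be expressed like this in JavaScript. The base note number here is a made-up placeholder; the real numbers depend on your QuNeo preset’s pad mapping:

```javascript
// Turn (LED index, level) into a note/velocity pair ready for
// [midiformat]. NOTE_BASE is a placeholder -- check your QuNeo
// preset for the actual pad note numbers.
outlets = 1;

var NOTE_BASE = 36; // hypothetical note number of the first pad

function list(led, level) {
    var note = NOTE_BASE + led;               // address the right pad
    var vel = Math.round((level / 8.) * 127); // scale 0-8 to 0-127
    outlet(0, [note, vel]);
}
```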
So we’ve taken care of getting some audio data from our set and translating it into LED feedback on the QuNeo. I hope the techniques we used are clear and that you can adapt them to your own creative endeavors. In the next part of this article we will start digging into the Live API and examine different ways to observe and interact with Live. As always, direct any questions or comments to evanbeta@keithmcmillen.com.
Till next time!