
In the previous article I walked you through one approach for getting information from Live out to a controller using M4L and some remote MIDI routing. In this section we’ll look at how we can use the Live API to fully integrate our devices into the Live environment. 

One thing I wanted to do with this patch was interact with the Live set: get information from it into the patch, via the API, to influence the brightness of the LEDs. Being able to mute and solo tracks and launch clips seemed like a good exercise. You might be thinking that you could get the same functionality with plain MIDI mapping. Yes, you could, but doing it this way affords you greater flexibility, complexity, and portability. We will see evidence of this in future articles, but for now just trust me on this. You can download the devices in the example set HERE.

The Live API gives us access to the nuts and bolts of Live itself. Tracks, clips, devices: pretty much everything you can interact with in the application can be manipulated from inside an M4L device. There's a lot of documentation on this available from Cycling '74 and Ableton, so I won't get into the nitty gritty, but I'll tell you how I visualize it. You can see it as a hierarchy of objects, with the live_set at the top. The live_set has properties like 'tempo', functions like 'stop_playing', and children like 'tracks'. You can observe properties, you can call functions, and you can navigate to children. This holds true for all levels of the API. It might take a bit to wrap your head around, but once you learn how to find your way around the API you can get anywhere you want very quickly. I have found the group of objects to the right very handy for getting information about the API.
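If it helps to see that hierarchy as code, here is a minimal sketch using the LiveAPI object available inside an M4L [js] object (the property and function names follow the Live Object Model; everything else here is purely illustrative):

    // Sketch: walking the Live Object Model from an M4L [js] object.
    // live_set sits at the top of the hierarchy.
    var song = new LiveAPI("live_set");

    // Read a property...
    post("tempo:", song.get("tempo"), "\n");

    // ...call a function...
    song.call("stop_playing");

    // ...and navigate to a child.
    var firstTrack = new LiveAPI("live_set tracks 0");
    post("track name:", firstTrack.get("name"), "\n");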

This will print to the Max window all of the information about the object at the path 'live_set'. Find one of its children in that list, append the child's name to the path in the message at the top, and you get all of the information about that object. Rinse and repeat to go anywhere in the Live set.
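The same discovery trick is available from JavaScript. A quick sketch, assuming the code runs inside an M4L device:

    // Sketch: print everything the API knows about an object,
    // the code equivalent of banging that helper patch.
    var song = new LiveAPI("live_set");
    post(song.info, "\n");   // lists the object's children, properties, and functions

    // Append a child's name to the path and repeat to drill down.
    var track = new LiveAPI("live_set tracks 0");
    post(track.info, "\n");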

For example, if we want to control the mute button on the first track, we send the message 'path live_set tracks 0' to a [live.path], then send the id it outputs to a [live.object]. To do this programmatically I used an [uzi] to count up the tracks (0 to 3; all counts begin at zero, as is the norm in most programming) and a [gate] to route the paths to the live.objects. You can see that construction over to the left here. You could statically define all the paths, but this looks nicer and is a bit more elegant. Now we simply send the message 'set mute 1' to mute the track, and 'set mute 0' to unmute it. You can see this in the 'solo_mute_fire' subpatch.
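For reference, the same mute control written in JavaScript against the API might look roughly like this (the wrapper function is mine; the property name comes from the Live Object Model):

    // Sketch: mute or unmute one track by index, the [js] equivalent of
    // sending 'path live_set tracks N' to live.path and 'set mute 1' to live.object.
    function setTrackMute(trackIndex, muted) {
        var track = new LiveAPI("live_set tracks " + trackIndex);
        track.set("mute", muted ? 1 : 0);
    }

    // setTrackMute(0, 1);   // mute the first track
    // setTrackMute(0, 0);   // unmute it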

That patch also contains the math for converting the incoming MIDI notes into a 4-track by 7-clip grid. Because all the notes on this preset are consecutive, some simple division and modulo math does this neatly, as you can see over to the right.
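In code, that math boils down to a couple of lines. This sketch assumes the clip notes start at 8 (right after the mute and solo notes) and run across the tracks first; the exact offset and orientation depend on the preset, so treat the constants as placeholders:

    // Sketch: turn a consecutive MIDI note into a (track, scene) pair
    // for a 4-track by 7-clip grid. CLIP_NOTE_OFFSET is an assumption here.
    var NUM_TRACKS = 4;
    var CLIP_NOTE_OFFSET = 8;

    function noteToClip(note) {
        var index = note - CLIP_NOTE_OFFSET;
        var track = index % NUM_TRACKS;             // which column
        var scene = Math.floor(index / NUM_TRACKS); // which row
        return [track, scene];
    }

Swap the modulo and the division if your grid runs down each track instead of across the rows.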

Whenever a MIDI note above 7 is received, the patch calculates the path of the clip that note refers to, assigns that path to a live.object, and calls the 'fire' function to launch the clip. Dynamically assigning a path to an object before calling a function like this saves us from having 28 live.objects in our patch. That would be a huge mess!
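The firing itself is just one dynamically built path and one call. A rough JavaScript equivalent, reusing the noteToClip() sketch above:

    // Sketch: build the path for one clip slot on the fly and fire it,
    // instead of keeping 28 separate live.objects around.
    function fireClipForNote(note) {
        var pair = noteToClip(note);     // [track, scene] from the sketch above
        var slot = new LiveAPI("live_set tracks " + pair[0] +
                               " clip_slots " + pair[1]);
        slot.call("fire");               // launches whatever clip sits in that slot
    }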

So now we are telling Live to do things, but one of the great things about controllers these days is the amount of visual feedback you can get. I wanted to monitor the mute and solo properties of the tracks and set the LED brightness based on whether you are actually hearing the audio on those tracks. This turned out to be the most challenging and problematic part of the patch, due to the logic involved in getting all the lights to turn off. Let's try to write it out in plain English:

The lights will be bright if the track is not muted, or if the track is soloed. The lights will be dim if the track is muted but not soloed, or if any of the other tracks are soloed but this one is not.

That sentence is really hard to understand, and even harder to implement with Max objects. Open up Solo_Observers_MaxObjects to see the mess it created. It does work, but I was disappointed in how unreadable it became, so I decided to educate myself on using JavaScript to access the API and do all of the checks and iteration inside a [js] object, which you can see in solo_observers.js. This is much more readable and much more compact. You can use the toggle to switch between the two methods and should see absolutely no difference. If you're interested in the JavaScript, open up the [js] object and have a look at some readable, commented code.
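If you want to roll your own version, the two ingredients look roughly like this in a [js] object. First, the brightness rule from the sentence above, written as a single check per track (the variable names are mine):

    // Sketch: decide the LED brightness for one track.
    // Bright means you can hear the track; dim means you cannot.
    function isBright(muted, soloed, anyOtherTrackSoloed) {
        if (soloed) return true;          // a soloed track always lights up
        if (muted) return false;          // muted and not soloed: dim
        return !anyOtherTrackSoloed;      // otherwise, bright only if nothing else is soloed
    }

And second, the observers that keep it up to date, with one LiveAPI observer per track per property (the shipped solo_observers.js may differ in the details):

    // Sketch: watch mute and solo on every track and react when either changes.
    var observers = [];

    function watchTracks(numTracks) {
        for (var i = 0; i < numTracks; i++) {
            watchProperty(i, "mute");
            watchProperty(i, "solo");
        }
    }

    function watchProperty(trackIndex, prop) {
        var api = new LiveAPI(function (args) {
            // args arrives as something like ["mute", 0] or ["solo", 1]; from here
            // you would re-run the brightness rule above and send new LED values out.
            post("track", trackIndex, "changed:", args, "\n");
        }, "live_set tracks " + trackIndex);
        api.property = prop;     // fire the callback whenever prop changes
        observers.push(api);     // keep a reference so the observer stays alive
    }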

That concludes our (somewhat) brief exploration of defining how your controller works with Live using M4L, the Live API, and a little bit of JavaScript. Although these devices don't do anything extraordinary, they demonstrate how to get information to and from your controller within Live. In the next installment, we will look at accessing, observing, and manipulating the other side of the API: the Control Surface object. As always, direct any questions, comments, and concerns to evanbeta@keithmcmillen.com.