
In the previous article we looked at some ways that you can use Live to interact with controllers. We used the API to get information about Live, we used the API to control Live, we monitored audio across tracks, and we used all of this information to get useful visual feedback on a controller. If you’d like to check that out you can do so HERE. This was all very useful and demonstrated that you can build interaction like this from scratch for any controller in your arsenal, depending on its particular MIDI implementation. One very powerful (although fairly undocumented) part of the API hasn’t been touched upon yet, and that is the Control Surface section of the API.

This section refers to, as you can probably guess, the Control Surfaces that are available from the MIDI preferences in Live. Live control scripts are collections of Python files that allow for the scripting of complex interactions between Live and controllers. These scripts are one of the most exciting features of Live, and if you aren’t using one with your MIDI controllers, you are missing out on a lot of tactile music making. Live comes with a huge list of supported controllers out of the box, and you can access properties of all of these control scripts through M4L and the API. Sadly the documentation on this is pretty lacking (or, more accurately, non-existent), so it is up to the intrepid computer musician to figure things out on their own. There is one fantastic resource, which can be found HERE. This is a repository of all the decompiled Python Control Surface scripts, and with a little digging you can figure out quite a bit about how the control surface scripts are written. Of particular interest is the ‘_Framework’ folder. This contains prototype classes for all of the common things you might want to do with control scripts.

Some of the scripts have nice features that allow for user-defined functionality. The Launchpad’s User 2 mode is one example: it provides all of its buttons as control elements that are available to the API. The Push has its Grab/Release functionality that allows you to take control of any of the buttons, the matrix, the encoders, and the display. For this particular discussion, though, we’re going to look at ways to observe and interact with active control scripts from an external MIDI controller. A SoftStep user recently contacted us about wanting to launch the scenes contained in a session mode ‘redbox’ (SessionComponent from here on out) using the SoftStep keys. This involves assigning a live.object to follow the SessionComponent of the script, reporting where it is in session view, and then firing off scenes according to MIDI notes sent into the device.

This isn’t too complicated to do for one particular script, and we will walk through that as a starting point. But the real challenge comes when you want to make this device work for ANY script that has a SessionComponent. Each script is written differently, so this involves a series of iterations and checks that took a bit of time to get straight. You can grab the finished device HERE, as well as the ‘Session_Box’ control surface script. Alright, let’s get to it!

I will reference again the importance of getting comfortable navigating the Live API with this basic sequence of Max objects. This little setup becomes indispensable when prototyping interactions using the API. It sends a path into the live.path object, which sends an id to the live.object, which then reports all the info about that object. If we run that query on the Live set, we will get a huge list of things in our Max window, but no mention of any control surfaces. What gives? Control surfaces aren’t actually part of the Live set; they live in a parallel dimension that’s a little hard to find.
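If you prefer typing to patching, the same query can be sketched inside a [js] object using Max’s LiveAPI object. This is just an illustration of the live.path/live.object construction described above, not part of the finished device:

```js
// A minimal sketch of the 'getinfo' query, done from a [js] object.
// Sending it a bang dumps the root of the Live set to the Max window --
// and, as noted above, no control surfaces show up in this listing.
function bang() {
    var api = new LiveAPI("live_set");  // same as sending 'path live_set' to live.path
    post("id:", api.id, "\n");          // the id that a live.object would receive
    post(api.info, "\n");               // the same dump that 'getinfo' produces
}
```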

The easiest way to get to a control surface is to send the message ‘path control_surfaces x’ into a live.path, where “x” is the slot of the control surface (counting from 0) in Live’s MIDI preferences. This will get all the info about the control surface loaded in that slot. To follow along with this, load up a control surface that has a session redbox implemented (Launchpad, any APC script, Push, QuNeo, etc.). Keep in mind that all of the Ableton-sanctioned controller scripts won’t display the session box unless that controller is connected to Live. To get around this I made a script called ‘Session_Box’, which is available from the device link above. We’ll talk about writing simple scripts like this in a later part of this series.
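In [js] terms, the same lookup looks something like this. It’s a sketch that assumes your script sits in slot 0 of the MIDI preferences:

```js
// Sketch of 'path control_surfaces 0': grab the script in the first slot of
// Live's MIDI preferences and dump its info to the Max window.
function dump_surface() {
    var cs = new LiveAPI("control_surfaces 0");
    if (cs.id == 0) {                   // an id of 0 means nothing useful is at that path
        post("no control surface found in slot 0\n");
        return;
    }
    post(cs.info, "\n");                // children, properties, and functions of the script
}
```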

In the long info list that we printed to the Max window we can see a variety of information, including children, properties, and functions. We don’t need to worry about most of these, as they are functions that the script uses for housekeeping, assignment, and various other tasks. What we ARE interested in are the children of the script. This is where all the heavy lifting takes place. The children are split into two types of objects: components and controls. Controls represent the ways that you interface with the script on your controller; knobs, buttons, and sliders are all examples of things that are attached to controls. Components are the representation of things in the Live set; mixers, devices, parameters, and the session mode ring are all examples of components. We would like to investigate the list of components.
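If you’d rather not click around, here is a rough [js] sketch that walks the ‘components’ children and posts the type of each one. It again assumes the script is in slot 0, and leans on the fact that each component’s info reports its class as the object type:

```js
// List the components of the control surface in slot 0 by index and type.
function list_components() {
    var cs = new LiveAPI("control_surfaces 0");
    var count = cs.getcount("components");          // how many component children there are
    for (var i = 0; i < count; i++) {
        var comp = new LiveAPI("control_surfaces 0 components " + i);
        post(i, comp.type, "\n");                   // e.g. "0 SessionComponent" for the APC20
    }
}
```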

You could programmatically poll the list of components to find our SessionComponent (and we will do that further down the road), but that is more trouble than it’s worth in our current investigation. Using a construction like this allows us to scroll through the list and find what we’re looking for.
Use the number box to increment through the components, and the name will print to the Max window. Luckily we don’t have to go far, as the SessionComponent pops up as the first component. So now we know that for the APC20, the path to our SessionComponent is ‘path control_surfaces 0 components 0’. Put that path into our API ‘getinfo’ construction and we can see a list of its properties and functions. One strange thing that I found is that the SessionComponent doesn’t have any properties that will tell us where it is located. It would be great if we could assign a live.observer to report the location whenever it changed, but we’ll have to find another way to accomplish this. Upon closer inspection we can see that the SessionComponent does have two functions that might be of use: scene_offset and track_offset. We are only concerned with the scenes contained in the session view, so we can disregard track_offset and just use scene_offset. If we send the message ‘call scene_offset’ into the live.object that is assigned to the SessionComponent, we will get the distance of the box from the top of the session view. If you haven’t moved the redbox at all, it should spit out a 0. This is very good.
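For the record, here’s what that call looks like from a [js] object, assuming the APC20-style path we just found and assuming call() hands the function’s return value back to you:

```js
// Ask the SessionComponent at 'control_surfaces 0 components 0' where it is.
function get_scene_offset() {
    var session = new LiveAPI("control_surfaces 0 components 0");
    var offset = session.call("scene_offset");   // distance of the box from the top of session view
    post("scene_offset:", offset, "\n");
    outlet(0, offset);                           // pass it downstream, e.g. to the note routing below
}
```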

The downside to this is that we have to call that function to get the offset. It’s by no means a deal breaker; we will just have to call this function at a regular interval to make sure that our scene offset gets reported in a timely fashion. A [metro 50] object should do the trick in this case. I doubt that someone would want to fire a scene from a separate controller within 50 ms of moving the box, so that should do just fine.
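If you were doing the polling inside a [js] object instead of with a [metro], a Task at the same 50 ms interval does the job. A quick sketch, using the same assumed path as before:

```js
// Poll the SessionComponent every 50 ms and send the scene offset out the outlet.
var poller = new Task(poll, this);
poller.interval = 50;                    // milliseconds -- the [metro 50] from the text

function start() { poller.repeat(); }    // send 'start' to begin polling
function stop()  { poller.cancel(); }    // send 'stop' to halt it

function poll() {
    // A finished device would cache this LiveAPI object; creating it here keeps the sketch simple.
    var session = new LiveAPI("control_surfaces 0 components 0");
    outlet(0, session.call("scene_offset"));
}
```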

So now that we have successfully gotten the location of our session redbox, it should be fairly easy to assign some MIDI notes to fire the scenes contained within it. In the case of the APC20 script we know that the box only contains 5 scenes (you can verify this by calling the function ‘height’), so we simply need to route 5 MIDI notes to fire 5 scenes in sequence, adding the scene_offset value. You can generate the numbers however you want, but I used a route object that looks for specified MIDI notes and then sends the proper number.
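Here’s the same idea as a [js] sketch. The five trigger notes (60 through 64) are purely placeholders for whatever your controller actually sends; the route-object version in the patch is what the finished device uses:

```js
// Fire one of the five scenes inside the box, shifted by the current scene offset.
var triggerNotes = [60, 61, 62, 63, 64];      // placeholder notes -- substitute your own

function note(pitch, velocity) {
    if (velocity == 0) return;                // ignore note-offs
    var row = triggerNotes.indexOf(pitch);
    if (row == -1) return;                    // not one of our trigger notes
    var session = new LiveAPI("control_surfaces 0 components 0");
    var offset = session.call("scene_offset");
    var scene = new LiveAPI("live_set scenes " + (offset + row));
    scene.call("fire");                       // launch the scene at offset + row
}
```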

It would also be useful to move the session box from another controller if we wanted to, and that’s a simple matter of calling the ‘set_offsets’ function. It takes two arguments that specify the intended track and scene offsets. You can see this construction in the lower left of the patch. It also shows a technique for incrementing or decrementing the values reported by the location of the session box. It’s a nice way to be able to move the box with both the original controller and whatever you choose to use alongside it.
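As a rough [js] equivalent of that corner of the patch: moving the box is just a ‘set_offsets’ call, and the increment/decrement trick reads the current offsets back first. This still assumes the SessionComponent lives at components 0 and that both offset functions return their values:

```js
// Jump the box to an absolute position: set_offsets(track_offset, scene_offset).
function move_box(trackOffset, sceneOffset) {
    var session = new LiveAPI("control_surfaces 0 components 0");
    session.call("set_offsets", trackOffset, sceneOffset);
}

// Nudge the box down one scene, keeping its current track position.
function scene_down() {
    var session = new LiveAPI("control_surfaces 0 components 0");
    var track = session.call("track_offset");
    var scene = session.call("scene_offset");
    session.call("set_offsets", track, scene + 1);
}
```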

That wraps up this episode of ‘Controlling the Controllers’. Stay tuned for a walkthrough of how we move through the API to observe the redbox of ANY script in the next installment. Direct any questions, comments, or concerns to evanbeta@keithmcmillen.com