At some point in your life as a computer musician, you'll be tempted to experiment with making your own tools. Whether that means using MIDI mapping to create a performance template or coding a plugin in C, the options for getting into tool making are quite expansive. For me, this meant repurposing controllers with Max for Live to make them do interesting things and help me perform music. Eventually I became interested in tying self-made audio processing directly to hardware like the Push or the SoftStep. While developing the audio portion of these M4L patches, I was confronted with a few problems that weren't easily solved, nor cleanly developed, using standard Max objects. I eventually took the dive into the gen~ environment, a visual (and text-based) low-level DSP environment. In this article (and others in this series), we'll take a look at gen~ and the GenExpr language that's included in recent versions of Cycling '74's Max. We'll create an audio effect, then tie that to a piece of hardware and implement some LED feedback to create a tightly integrated way to interact with our sound.
After spending time over the past few years creating some of these tools, I realized that some things can be better defined in a text-based format rather than with the graphical approach that Max provides. Max is great for doing all types of things, and you can get pretty low-level. But some things, like iteration, sample-accurate timing, and ideas that are more naturally expressed in text, leave much to be desired in the patching paradigm. That's where some fluency in basic DSP and text-based programming can save you time, headaches, and processor cycles. As someone who has always enjoyed mangling, chopping, and stuttering audio, I thought it would be a good exercise to create an M4L device that accomplishes some of this, using the gen~ environment to code the processes whenever possible or advantageous.
This series of articles will assume basic knowledge of programming in Max and M4L, and will cover basic buffer operations in gen~, a small amount of Python scripting, and a bit of the Live API. In the end we will be writing samples to a buffer, and playing them back in (hopefully) interesting rhythmic ways from a SoftStep or QuNeo while being synchronized to Live’s (or your patch’s) clock.
Let's first define what we will get done in this article. My plan for this series is to introduce just a few concepts per article, and this first one will focus on synchronization and quantization. Some audio processes, especially ones that are meant to be rhythmic in nature, benefit from being synchronized to other musical elements. Before we look at actually recording and playing back audio, we will set up some synchronization systems that will really come in handy as we move along.
Let's open up a blank M4L audio device and grab some information about Live's tempo and clock. This will give us what we need to define recording length, playback length and speed, and other timing values. There are quite a few ways to synchronize an M4L device to Live's clock:
- the [plugsync~] object, which reports the transport state, song position, and tempo
- the [plugphasor~] object, which outputs a ramp signal synchronized to the beat
- Max's own [transport] object, which follows Live's transport inside an M4L device
We will definitely want to use the plugsync~ method for this particular device. It gives us the two things we need: the transport state out of the first outlet and the BPM out of the sixth. The transport state simply gives us a 1 when the transport starts and a 0 when it stops. Pretty simple. We will need to calculate how many milliseconds one beat is, and from that we can calculate the length of any musical division. To get the length of a bar (in 4/4), simply multiply this by four; for sixteenth notes, multiply it by 0.25 (or divide by four). The formula for converting BPM to milliseconds looks like this:
ms per beat = 60000 / BPM

For example, at 120 BPM one beat lasts 60000 / 120 = 500 ms, so a bar is 2000 ms and a sixteenth note is 125 ms.
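If you'd rather do this arithmetic inside gen~ itself, here's a minimal codebox sketch; bpm is a param name I've chosen for illustration, fed from plugsync~'s tempo outlet:

```
// gen~ codebox: derive division lengths from the tempo
Param bpm(120);                // hypothetical param driven by plugsync~'s tempo

beat_ms = 60000 / bpm;         // one quarter note in milliseconds
bar_ms = beat_ms * 4;          // one bar, assuming 4/4
sixteenth_ms = beat_ms * 0.25; // one sixteenth note

out1 = beat_ms;
out2 = bar_ms;
out3 = sixteenth_ms;
```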
Now that we have information about our transport and the musical time divisions we are working with, we can set up a simple quantizer. This will let us send messages only on the next beat (of course we can disable this for more freeform glitching later). For our quantizer we'll use the plugphasor~ object, which produces a sawtooth wave that is sample-synchronized to Live's clock, going from 0 to 1 every quarter note. It's important to get familiar and comfortable with manipulating values normalized to the range of 0 to 1; a lot of audio programming and DSP (and other creative coding) uses normalized values. So we have this saw wave, now what? If we want to send values every quarter note, we need to send out a bang whenever we see a 1-to-0 transition. We can use the delta~ object to report the difference between the current sample and the one before it; when the difference is less than 0, the wave has wrapped around, which marks our beat. We can also use the change~ object to report when the wave has changed direction. These methods are equivalent as far as I can tell.
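In gen~, the same detection takes a couple of lines in a codebox; a sketch, assuming the plugphasor~ signal arrives at the first inlet:

```
// gen~ codebox: emit a single-sample pulse at each wrap of a beat-synced ramp
phase = in1;      // ramp from plugphasor~, 0..1 per quarter note
d = delta(phase); // current sample minus the previous one
out1 = d < 0;     // a negative jump means the ramp wrapped: our beat
```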

So we are getting bangs every quarter note. That's all well and good, but we will probably want a smaller quantization value. Remember when I said that we will want to get comfortable working with normalized values? We'll get a chance to practice some of that here. Say we want 16th-note quantization: we need a ramp that goes from 0 to 1 four times as often, so we multiply the output of our phasor by four. But that only scales the ramp so it runs from 0 to 4, and because both of the objects we are using to detect the beat look for a change in direction, we would still get the same quarter-note quantization. The answer to this problem is the modulo operation. Modulo finds the remainder after division of one number by another, or, in practical terms, it 'wraps' positive values within a certain range. If we insert a [%~ 1.] object after our multiplication, the wave that was going from 0 to 4 within a quarter note will now go from 0 to 1 four times within that quarter note, i.e. produce sixteenth notes.
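Here's how the whole quantizer clock might look inside a gen~ codebox; division is a param name I've made up for this sketch:

```
// gen~ codebox: subdivided quantization pulses from a beat-synced phasor
Param division(4);            // 1 = quarter notes, 4 = sixteenths, etc.

phase = in1;                  // plugphasor~ ramp, 0..1 per quarter note
sub = (phase * division) % 1; // wrap the scaled ramp back into 0..1
out1 = delta(sub) < 0;        // 1-sample pulse at every wrap
```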

You can adjust the rate of quantization with the number box in the upper right; higher values result in finer quantization. All that is left to do is create a small logic structure that only passes messages on a quantization event. Max provides everything we need in a few objects: we use [onebang] and [int] to store a number and release the last integer received only when a quantization event arrives.
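For reference, the equivalent gate inside gen~ could use the latch operator, which passes its input through only when the control signal is nonzero; a sketch:

```
// gen~ codebox: hold an incoming value, releasing it only on quantization pulses
value = in1;               // value to be quantized (e.g. a playback parameter)
trig = in2;                // single-sample pulse from the quantizer above
out1 = latch(value, trig); // output updates only when trig is nonzero
```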
Now that we are all set up with timing information and synchronization options, in the next article we will set up a rolling buffer to continually capture incoming audio for manipulation.