Squarepusher: Recording 'Music For Robots'

Pushing Musical Machines To The Limit


WHEN TOM Jenkinson, aka Squarepusher, debuted in 1996 with his now-classic Warp album Feed Me Weird Things, its combination of tremor-inducing drum and bass rhythms, twisted synth phrasing, and Jaco Pastorius-inspired electric bass playing was simply startling. Jenkinson accompanied similarly minded and freakily futuristic English programmers such as Aphex Twin, Chemical Brothers, Luke Vibert and Underworld, diving headlong into a brave new world of sampled/sequenced/cut-up collage.

The Z-Machines guitar robot plays with 78 “fingers.”

While his compatriots wandered off into dithering head space or circled the same terrain, Jenkinson continued to push himself and recorded prolifically—not always with predictable, or even pleasing, musical results, but he never lost the plot. He never stopped searching.

Fifteen albums and 18 years later, Jenkinson collaborated with a team of roboticists from the University of Tokyo, composing music for their robot band, Z-Machines, and Music for Robots (Warp) was born. Writing for a guitarist (“Mach”) with 78 fingers that can play at 1,184 bpm, a drummer (“Ashura”) with 22 arms, and a keyboardist (“Cosmo”) that fires green lasers to “play” his keys, Jenkinson sent MIDI files to music producer Kenjiro Matsuo, who worked with the roboticists to realize Jenkinson’s music; Jenkinson then mixed the recordings himself.

Z-Machines have performed metal and J-Pop, but nothing as musically sophisticated as the sounds of Squarepusher. Indeed, the first notes of “Sad Robot Goes Funny” sound exactly like Jenkinson’s lyrical Fender Jazz Bass playing, before the music turns baroque as the guitars enter, then free as the drums execute a Vinnie Colaiuta-worthy cadence and more guitars spin webs of dulcet melodies. It sounds like humans, and it sounds like machines.

Music for Robots is very ambitious, from the pastoral lute sounds of “World Three,” to the sweetly plucked guitars of “You Endless” (Jenkinson has always had a soft spot for mushy ballads), to the madly churning Stravinsky-fusion-in-Hell sensations of “Dissolver.” Some may dismiss the album as heartless prog rock on steroids, or a fusion fatalist’s notion of musical self-indulgence, but in a world increasingly controlled by whirring hard drives, overseen by global surveillance and entertained by programmed music that barely resembles music, Music for Robots is a beautiful embryonic first salvo from a future where glistening sounds transport us, where styles can be dialed in, where the impossible becomes possible, and the final frontier is endless.

What was your goal when you began writing for the robots?

Before embarking on the project my question was, “Can the Z-Machines do something which is compelling for the listener? Can they make music which survives the novelty value of music being performed by a robot?” A lot of the interest in this project comes from the gimmick value of these machines, the robots. But I don’t want this to be a gimmick. There’s no point in recording a music box with attitude. There has to be something at the core of it which is valid, musically. It has to go beyond the simple technical details of how it’s done. It has to make you feel something.

Did you have a plan?

I had four weeks to write the music, so there wasn’t time for an overarching plan. I just had to see what I could do in that time. The whole thing was restrictive. The robots require a hell of a lot of maintenance, they need a team of people to work them, and storage is financially intensive in Japan, so this amounts to a monstrous financial burden. So it had to be done in a short space of time, with no time for refinement or developing the big concept. “What can we do that is good?” was the main idea.

So how did you infuse soul into the Z-Machines? Did you use writing software?

I started off experimenting, trying to establish how fast the robots could play, the degree to which I could get them to play consistently, and whether there was any kind of interdependence between what the individual robots play. Trying to establish the degrees of freedom, if you like. Then it was how fast the drum robot could play consistently; they’re all run by compressed air.

Z-Machines use pneumatics to play the instruments in the same way as Pat Metheny’s Orchestrion.

Yes, it’s all run pneumatically, so if you play all the drums at once, does that drummer robot literally run out of air? These considerations are what I started off with. I sent MIDI files for [Z-Machines] to do tests with, and they returned the recordings of those tests so I could work out the limits of the machines and use those as guidelines to move forward. But this is not a particularly different process for me. I am always working out the boundaries of the instruments; that is one of the concepts I’ve always liked playing with. The boundaries are explored in this project, as well as what happens when you push them into the zone where they’re not able to completely, consistently deal with the data proficiently anymore.

Do you see the similarities between Z-Machines and Pat Metheny’s Orchestrion?

The Orchestrion is really cool, but Pat should give it to me. [laughs] At least lend it to me! The Orchestrion is impressive, but you need to take music to the outer limits, really see what the machines can do. If I want to play music with sensitivity, nuance, and subtlety, then I will play it. I won’t get a f*cking robot to play it. I’ll do it myself. Like I did on Music Is Rotted One Note or Just a Souvenir, where all the drums were live and nothing was sequenced. I want to use the robots for their strengths. The robots are amazing at playing music of ridiculous complexity with accurate timing. They are not so great at playing music with nuance.

On “Dissolver,” the Z-Machines play the equivalent of 128th notes, the fastest music I’ve heard that wasn’t a total blur; basically broad sweeps of glissandos. Once locked in, did the robots play back the music consistently each time?

Even though they’re machines, the performances are not identical each time. There’s a degree of inconsistency. There’s a sort of fluidity in the way the robots deliver the performances. It can be extremely fast, and by and large what they play is locked and tight. But they’re not 100 percent consistent; I find that quite interesting, especially when you push the robots to the very upper regions of how fast they can play. Then they start behaving in a less predictable way. There is a section in “Dissolver” where the guitar robot does what you call “shredding.” There’s a section beyond that where the guitarist robot is playing data which is generated mathematically using sets of equations rather than me composing as such. I let the equations run and they generate the data. That’s where I was really testing the upper limits of what the robots can do. It’s playing notes all over the fretboard at virtually the same time; exploring not just in terms of speed but in terms of making sounds with the guitar. It would never be possible for a human being to do that.

Jenkinson in the studio.

Tell me about those equations.

I set up equations and recorded the output in MIDI form. In that respect, the robots are not as outlandish as you’d think. They are still running the basic MIDI protocol using the conventional MIDI notes and MIDI controllers, admittedly being employed in a unique fashion.
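The article doesn’t specify which equations Jenkinson used, but the idea of letting a mathematical process generate note data rather than composing by hand can be sketched simply. As an assumption for illustration, this uses the logistic map, a well-known chaotic equation, folding each iterate into a playable MIDI note number:

```python
# Hypothetical sketch: a note sequence generated by iterating an
# equation instead of composing by hand. The logistic map is an
# assumption for illustration; the source does not say which
# equations were actually used.

def logistic_notes(n, r=3.9, x0=0.5, low=40, high=88):
    """Iterate the logistic map and map each value in (0, 1)
    to a MIDI note number in the range [low, high]."""
    notes = []
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)              # chaotic iteration
        notes.append(low + int(x * (high - low)))
    return notes

sequence = logistic_notes(16)
```

With r in the chaotic regime the output never settles into a repeating phrase, which loosely mirrors the “less predictable” behavior Jenkinson describes when pushing the robots to their limits.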

What about MIDI and dynamics?

None of it plays with any dynamics. The drummer has many sticks but they play at fixed velocities. There are three sticks for each drum: one that plays soft, another medium, and another loud. But the guitar robot doesn’t play with any dynamics whatsoever. You have to build the nuances into the arrangements of the notes. They have this classic robot thing going on, like in Doctor Who with [fictional mutant race] the Daleks. They can take over the universe, but they can’t go upstairs. They can play a million notes a second, but they can’t play with any amplitude control. The guitar robot has this odd rotating pick arrangement for his right hand; it’s a stick that rotates one way then the other way; it has a pick attached to the end so when it rotates one way, it plucks a string, and then after that it plucks it the opposite way, to resemble the sound of up and down strokes.

Is the drummer robot playing all the drums behind his head and in front as well?

He plays everything; all of the drums. It’s basically about the limitations. The guitar robot will move the position of his right hand to strike the strings in different places to emphasize harmonics, but it’s uniform tonality. I am interested in baroque music, and these early instruments such as organ and harp, they are very uniform in their dynamics as well. So I find that a different type of challenge to write for. I thrive on limits; my mind is adapted to dealing with music-making situations that are difficult. When I began in 1996, I didn’t even have a proper sequencer.

When I interviewed you at your studio in Hackney in 1996, all you had were a simple computer, Fender Jazz Bass, and lots of Dizzy Gillespie and Charlie Parker records.

Yes, I had the basses, an Akai sampler, drum machine, a synth, an eight-track recorder, but it was all very limited. My studio doesn’t look that different now, here in Essex. I thrive in the situations where it’s difficult. Necessity is the mother of invention, they say.

How were the Z-Machines recorded?

I’ve never actually seen them. This was all conducted online. But given that the Z-Machines are essentially playing conventional instruments, albeit being played in an unconventional way, it’s still a conventional recording process. So the guitar was DI'd, the drums were miked, and so forth. The recording process was totally old-school. On the one hand, you have this set of instruments that are utterly familiar to everyone. Everyone knows these sounds. They are so drowned in cliché.

Did you apply any plug-ins, any effects during the recording or mixing process?

I have tried to represent the performance that they would give at a show; that was my goal, as if I was at the mixing desk. You can hear a little dynamic processing, a little reverb, a slight bit of motion on the faders for emphasis. But basically, I am representing the actual performance of the Z-Machines. There is no editing, no tweaking, no cutting up of phrases and duplicating them in Pro Tools. It’s not a bunch of machines in a lab that I’ve processed the f*ck out of and you don’t know what is what. This is a robot band and this is what a robot band sounds like.

There is a section in “Remote Amber” in which the drummer is playing a free section on the snare drum, and it sounds like a live jazz drummer.

I played that on my sequencer, just bashing it in. That’s where the robot drummer starts to sound natural. And you can even hear the squeaks of the robot’s arms before it strikes the drum. I could have edited those out to make it perfect, but that’s part of it, its hinges squeaking. Admittedly they could have put a bit of WD40 on it! That was interesting. I am essentially asking, “Can robots play jazz?”

What’s the future of you and the Z-Machines?

I have my own versions of software that will make up sequences of notes; perhaps you can call that improvising. I don’t know if I classify that as improvisation; that has to come from a human mind. Or at least a mind. Improvisation comes from spontaneity, and spontaneity is one thing artificial intelligence hasn’t got a grasp of. It is very good once you set it in a direction, but coming up with new ideas? That’s where improvisation gets exciting, coming up with something new on the spot. Expecting a robot to do that is a little bit optimistic.

Ken Micallef is a freelance writer and photographer based in New York City. His work has appeared in many publications, including DownBeat, eMusic, and Modern Drummer.