Mr. MPC: Roger Linn

Read the Remix article in the special "25 Sure Things for 2007" feature on Roger Linn and his thoughts on the present and future of music equipment. Linn created the Akai MPC and the first sampled-sound drum machine, and thinks new musical instrument interfaces will be needed to take advantage of progressing technology.

In January 2007, we talked to Roger Linn to get his thoughts about the future of electronic music gear. Read the complete interview below.

Roger Linn created the first sampled-sound drum machine in 1979 with the Linn Electronics LM-1. Later he perfected the sampler/MIDI sequencer/rubber pad workstation with the Linn 9000 (1984), Akai MPC60 (1988) and Akai MPC3000 (1994). Recently, his Roger Linn Design company wowed critics with the AdrenaLinn II beat-synced effects processor/amp modeler/drum machine and co-designed M-Audio's Black Box.

In what direction do you foresee the development of music recording and performance equipment heading?
Roger Linn: People don't really play traditional instruments so much anymore. Rather, they play computers. Everyone uses found art, in the sense of sampled loops. People are basically just manipulating objects that they found. It's the same way in the art world: Nobody's really a painter anymore; they're more mixed-media artists. They take found objects and manipulate them. It's all about concept, and not so much about craft. I like to call it OOC, Object Oriented Composition, where you're taking little objects, and the art is in your combination of the objects as opposed to creating every note on an instrument.

I think the challenge faced today is that you can't manipulate those loops too much. So you get a lot of that BOOM-CHK music that has one chord going on for quite a long time with background dance rhythms and stuff like that. And you can filter it, but you can't really deconstruct it. Not only that, but the user interfaces aren't terribly good. You're limited in what you can do. With all those limitations, I think basically an evolution toward new musical instruments will come at some point and is already starting to happen in some ways. Ableton [Live] was a very good stab at basically turning the tape recorder into a musical instrument and making it very live and real-time in its manipulations. And all these wonderful products are coming out, like Novation's controllers—the Zero SL—where you just have a variety of different ways to control Live and other DAWs.

But still you have the problem of when you're working with loops, there's not so much you can do in terms of deconstructing them. But that's something that's coming. There's some really interesting research being done into separating an actual mixed recording out into its component parts. Doing an analysis of it, then a manipulation and then resynthesis. You can already do some of that in music if you have clean signals with things like fast Fourier transform (FFT) and such, where you can take a single-note instrument, for example, analyze it, turn it into its harmonic components and then resynthesize it. There's a very interesting sample library on the market called Synful by Eric Lindemann, where he does that. You think it's a sample library, but you're actually listening to an extremely complex additive synthesizer; it's just very, very accurate. How it works is all the samples he started with—which are orchestral samples—he reduces them to their mathematical representations, so he can manipulate them at a very fine level in real time, and then reassembles them at the last minute. This is the sort of stuff that I think will only get more and more complex as computers get faster and algorithms get smarter for actually deconstructing mixed music. Once you can do that, you'll be able to have a plug-in program where you could take a loop off an old recording that has a major chord, and turn it into exactly the same loop, except with a minor chord or a 7th. Or turn the piano into an electric piano or brass but leave all the other instruments unchanged. And when you're able to do that, you're still working with loops. You don't have to have the craft—you can concentrate on concept—but you're able to go to a further level of atomization and really manipulate music to a much finer degree and create your own art. And then the next challenge will be coming up with the proper user interface for that technology.

Can you please explain more about Fast Fourier Transforms?
RL: Fast Fourier transform (FFT) is actually a fairly common mathematical transform, but what it basically does is take a single-note signal—of a piano, guitar or something like that—and convert it into its individual harmonics in the harmonic series. Once you have the audio analyzed in that way—you've probably seen some of these programs that show audio as kind of a three-dimensional mountain that goes over time—you have a bunch of envelopes for each harmonic. For instance, the first harmonic will vary in volume over time, and the second harmonic will vary over time. And when you put them all together, up to 60, 100 or 200 harmonics, it looks like this mountain range. That would be sort of a Fourier analysis. But it's basically just an algorithm for converting recorded audio of single notes into its component parts. Similar things are used to encode MP3s, but with a slightly different process.
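The analysis Linn describes can be sketched in a few lines of code. This is a toy illustration only (using NumPy, with a synthesized note standing in for a recording): the FFT recovers the amplitude of each harmonic of a single-note signal, which is exactly the per-harmonic envelope data that an additive resynthesizer would work from.

```python
import numpy as np

# Synthesize one second of a single 220 Hz "note" with three harmonics
# (amplitudes 1.0, 0.5, 0.25, chosen arbitrarily for illustration).
sample_rate = 8000
t = np.arange(sample_rate) / sample_rate
note = (1.00 * np.sin(2 * np.pi * 220 * t)
        + 0.50 * np.sin(2 * np.pi * 440 * t)
        + 0.25 * np.sin(2 * np.pi * 660 * t))

# The FFT converts the waveform into amplitude per frequency bin,
# i.e. the "component parts" of the sound.
spectrum = np.abs(np.fft.rfft(note)) / (len(note) / 2)
freqs = np.fft.rfftfreq(len(note), d=1 / sample_rate)

# Read the amplitude of each harmonic of the 220 Hz fundamental back
# off the spectrum; resynthesis would rebuild the note from these numbers.
for n in (1, 2, 3):
    bin_index = int(np.argmin(np.abs(freqs - 220 * n)))
    print(f"harmonic {n}: {spectrum[bin_index]:.2f}")  # 1.00, 0.50, 0.25
```

A real analysis would run many short, windowed FFTs over time to capture how each harmonic's amplitude evolves, producing the "mountain range" picture Linn mentions.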

The kind of plug-in you talked about that could replace a single instrument within a full mix is amazing. It's the kind of thing people never thought would be possible.
RL: Yeah, but nobody ever thought that a computer in your pocket would be possible.

Regarding user interfaces, how do you think they are lacking now, and what are your ideas for the directions they should head?
RL: Well, it's really a loaded topic, but if you take a look at what the common musical interfaces are right now, they're severely limited, because they were created back in a time when there was no electronic technology and no ability to separate the playing interface from the sound generator. When MIDI came along, you could play a keyboard and get a guitar sound, or play a guitar controller and get a keyboard sound. They may not have sounded great, but you could at least get it to happen. Well, you've got all kinds of limitations with the current playing interfaces. Take piano, for example. If you want to learn to play piano, you have to learn to play in 12 different positions. If you're doing a standard 1, 6, 2, 5 chord progression and you're in the key of C, that's great, because you're all on the white keys, but try doing it in C#. You have to play entirely different positions for every chord, and basically you have to learn 12 different positions. Well, that's absurd; why should you have to do that?

You could retort with "well, just get a keyboard controller that has a transpose button on it, and then you can play all the white keys." But music doesn't always fall on the notes of the major scale. Sometimes you want to do a harmonic minor scale, or throw a chromatic scale into the middle of a major or other sort of chord. So piano by its very nature is a lousy instrument. The other thing, too, is that with piano, you don't have much in the way of solo expressivity. And even synthesizers haven't helped much. Even though you've got a pitch bend wheel and a mod wheel on a synthesizer, they're way off to the side, and they require you to make two hand movements to do something that should be logically integrated in one finger. If you're playing, for example, a cello, your vibrato and selection of a note is done all with one finger. If you try that on a synthesizer, you find that it's difficult to get really good pitch control with pitch bend wheels or strips, because it doesn't work very well. The other problem is that you're using your other hand for that pitch control, and you can't use it to play chords. So you have this compromise. First of all, you're not even able to get that great pitch nuance the way you can with guitar, coupled with dynamic nuance. On the other hand, a keyboard allows you to play both accompaniment and solo at the same time. And there have been attempts to give you pitch nuance control. There was a product in the '30s called the Ondes Martenot, where you could wiggle the keys left and right and get a vibrato. It wasn't the best system, but at least it was an attempt in that direction. But ever since that time, there haven't been many great solutions for getting very fine and beautiful pitch nuances out of a keyboard.

Now, take a guitar. This is the other most popular instrument today, and it takes both of your hands to play a single note. That's really inefficient. You've got all these body gestures, but if you want to play chords and solo, you have to be very adept at the sort of work that a jazz player or even an acoustic folk player can do by suggesting melody on the high notes while still holding the chords. But even that's severely limited, because they're not independent. So guitar is great, because it's this great compromise between polyphony and solo expressivity, but still, it's severely limited.

What you really want is an interface to an instrument that allows you to have both accompaniment and fine solo expressivity with fine pitch control in one instrument in real time at the same time. Along those lines, there have been a number of really wonderful attempts at this. I really wouldn't say they're very successful, but they're attempts. There are a lot of people who have experimented with what are called hex keyboards, where you have a bunch of little hexagonal chiclet-style keys in a keyboard arrangement. You have a horizontal row, but in the next row the keys are always sort of in between and above, so you get this arrangement of triangles, and you put them together and you get hexagons. And you get a number of rows up and a number of rows across. The advantage to that is, once you learn the finger positions for one key, if you want to transpose to another key, unlike piano, you just move up, down, left or right, and all the same finger positions work just by changing your relative position.
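The transposition property Linn describes can be shown numerically. In this toy sketch (the interval steps are assumptions for illustration, not any particular product's layout), a note's pitch is a linear function of its grid position, so sliding a chord shape across the grid transposes it without changing the fingering:

```python
# Toy isomorphic (hex-style) keyboard: a note's MIDI pitch is a linear
# function of grid position. The interval steps here are an assumption
# for illustration, not a specific product's layout.
RIGHT_STEP = 2  # semitones gained per column (a whole tone)
UP_STEP = 5     # semitones gained per row (a fourth)

def midi_note(row, col, base=60):  # base 60 = middle C
    return base + col * RIGHT_STEP + row * UP_STEP

# A chord "shape" is a set of (row, col) offsets. Because pitch is linear
# in position, shifting the whole shape transposes the chord while the
# finger pattern stays identical.
major_triad = [(0, 0), (0, 2), (1, 1)]  # root, major third, fifth

c_major = sorted(midi_note(r, c) for r, c in major_triad)
d_major = sorted(midi_note(r, c + 1) for r, c in major_triad)  # one column right
print(c_major)  # [60, 64, 67]
print(d_major)  # [62, 66, 69]: same finger shape, a whole tone up
```

On a piano, by contrast, the C major and D major shapes are physically different; here every key gets the same shape, which is the whole appeal of the isomorphic layout.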

There's an interesting little product out of Australia called the Thummer. And while most guys have tried to make esoteric, academic products out of controllers like these hex keyboards, they're actually trying to sell Thummers for $375 each. They're basically MIDI controllers with a couple of joysticks on each one, two hexagonal keyboards and a few other buttons—and all of the buttons are pressure sensitive, too. So it packs a lot of wallop in there. They're slowly getting it together down there.

There's something called the Starrlabs ZBoard. There's a company called C-Thru, and they have a hexagonal keyboard that's arranged on a flat base. There's a product in development at Yamaha that's really interesting called the Tenori-On, which has a 16x16 matrix of lit buttons, and the combination of the lit buttons and the software inspires you to find your own musical path by the feedback you get from the lights. Haken Audio is a great company that has a keyboard called the Continuum, and it's basically wet suit material that has the outlines of the keys of a black and white keyboard silkscreened onto it. Any time you press it, you get a note sent out over MIDI. But the beauty of it is that as you move your finger left and right, it slides the pitch up and down continuously, and you get another control for moving forward and back and another one for how much pressure you use on this spongy material. That's one that's great. There's another one called the Lemur from JazzMutant. That's wonderful and is along the lines of using the mixer as a musical metaphor, as Novation does with their ReMotes. And then you've got Buchla's Thunder and Lightning controllers. The Thunder has these little touch-sensitive strips that you move your fingers on to control MIDI. The Lightning is one where you have these two little remote controls in your hands, and a sensor that sits out in front of you. And you can map any cube in a three-dimensional space to be a particular MIDI note or MIDI message. People have used this for all kinds of very interesting stuff in performances, such as dance performances.

I could go on for hours. There's another thing called the Sonalog GypsyMIDI motion-capture suit. It's sort of like an exoskeleton for your upper body, and as you move, it puts out MIDI controller messages proportional to your upper-body gestures.

Does this relate to where you want to take your own designs?
RL: There are things that are exciting to talk about, which for me are often things far out into the future, and then there are things that you can make money with in the short term, which are usually a step in that direction. So what I like to do is to try to imagine the future way, way out, and then step back and find some stuff along the way where you can find enough people who see the advantages, but it's a step toward where you want to go. Alternate music interfaces have always been important to me, and that's what I originally did back with the drum machines. And you could argue that the 4-by-4 pad matrix is in fact a new musical interface that's taken root and found popularity in contemporary culture.

I would argue that. Especially when you see some hip-hop producers play pads, it's definitely an instrument for them.
RL: Another one is the mixer as metaphor. If you look at all these control surfaces people are using to control Ableton, they're basically using the recording mixer as a metaphor. So you could say that the recording mixer, which started out as just some slide controls and some knobs, is now in a sense a musical creation interface. So those are a couple of examples of metaphors that have taken hold in the culture.

But the fun thing for me is to ask the question "if you look about 50 years out, what will people use to make music?" Will they still be using piano and guitar? Well, I think they will be, but they'll be using other things too. And the question is what will it be? Somebody's got to think of something that's great, and I'd sure like to be a part of that decision. For example, if you ask somebody, will there be a brand new instrument that's completely technologically based 50 years from now? A lot of people will say that's hard to imagine. Nothing can beat what piano and guitar or violin or cello or something like that can do. But I think that the odds are very slim that those traditional mechanical instruments will still have precedence. Basically, a synthesizer keyboard is just a new version of a piano. And a piano layout is the way it is because it was convenient to have a key in front of each one of the strings. You don't have that limitation when you have an electronic keyboard.

What you were saying about looking ahead 50 years and then taking a step toward that future reminds me a lot of what Ray Kurzweil says with his projections and in his latest book, The Singularity Is Near.
RL: Yeah, that's a good book. He's a great communicator. He's able to take a lot of these ideas that the futurists, or the Singularitarians as he calls them, are presenting in the world and disseminate them well.