Studio Junkie

Games, film, remixes, electronica—Junkie XL does it all.

BONUS MATERIAL
Web Clip: Watch a video of Junkie XL showing off the gear in his multiroom project studio.
Podcast: Listen to more of the interview with Junkie XL.

FIG. 1: Guitar is one of the many instruments Holkenborg plays, and he likes to come up with ideas for sounds by playing through an array of (mostly Electro-Harmonix) effects pedals.

When Tom Holkenborg was a young musician growing up in Holland, he spent so much time playing music and recording that his friends dubbed him “Junkie.” The name has stuck — he now records under the moniker Junkie XL — and so has his obsession with music making. From recording his newly released album of hard-edged electronica, Booming Back at You (Artwerk Music, 2008), to writing music for hit games such as Need for Speed and SSX Blur, composing music for films like Blade and Dead or Alive, and remixing songs by artists such as Coldplay, Elvis Presley, and Britney Spears, Holkenborg has developed a busy and diverse career.

Ground zero for Holkenborg's work is his impressive project-studio complex, a few blocks from the boardwalk in the Venice district of Los Angeles (see Web Clip 1). Here, Holkenborg has built a studio that would be the envy of any recording musician (see Fig. 1), complete with a maxed-out Digidesign Pro Tools HD system; a separate Mac for running Apple Logic Pro, which he uses for MIDI work; several additional PCs for running Native Instruments Kontakt; two huge Apple displays; Dynaudio and M-Audio surround speaker systems; a collection of guitars, basses, and effects pedals; and much more — and that's just for his studio. There are also two other setups in the building at which his assistants, Sam Estes and Andre Ettama, work.

I had a chance to visit Holkenborg's studio recently. And with a cappuccino in hand from the studio's espresso maker (another key piece of gear), I sat down with him to talk about recording, equipment, and his career.

I notice there is an absence of outboard gear in your studio; there are just a couple of Empirical Labs Distressors that I can see. Are you recording pretty much all in the box these days?

Not really. I've got an outboard gear rack in the other room. And then I still have a full-blown analog studio in Amsterdam that I'm dismantling step-by-step and just getting stuff out here. I've got a 132-channel analog desk with Neve EQs, a 24-track Studer, a 2-track Studer, Fairchild compressors, Klein and Hummel compressors and equalizers, Telefunken compressors. I've got all the synths made by Korg since the very beginning: Yamahas, an Oberheim 4-voice and an 8-voice. I've got about 50 synths.

Did you do this most recent CD here, in Amsterdam, or in both places?

I did it all here. One of the main reasons why I work almost completely digitally these days is time. When you work on video games or movies or you work on commercials, everything needs to be done yesterday. And you need total recall to change the slightest little detail. And after directors and film studios and ad agencies have signed off on a certain product that you have delivered, you can't deliver something else afterwards that is even slightly different from what you sent them before. When I work on my artist material, that's the only situation where I can really take the time and just noodle with sounds forever until I'm happy with them.

I was particularly impressed with the synth sounds on your new CD. Did you program those all yourself?

I work like this nowadays: with music, it goes back and forth between different programs all the time [see Fig. 2]. For instance, I program a kick drum, just as a kick click, and then I start jamming with the bass guitar. And then I come up with this bass riff, and I just jam and jam and jam — and then at some point it's like, “Oh, that's pretty cool.” So then I take that section, and I bounce out the bass guitar sound. And then I go to Sam, and I say, “Sam, I've got eight bars of bass guitar here. Load that up in [U&I Software] MetaSynth, and then I want you to do this and this and this with it.” Then I go to my other guy, Andre, with the same bass line, and I say, “Why don't you program 15 or 20 sounds in that synth, in that synth, or in that synth, and copy whatever you did with the bass line?” Within half an hour, I've got both those things back, and I start noodling around with the results from MetaSynth and from [Native Instruments] Reaktor, for instance. I come up with a new sequence, I chop it up, I do my own stuff with it. (I've got [Symbolic Sound] Kyma running here as well. I do a lot of things in Kyma.) That results in a new bass line. Again, it goes back to Sam and it goes back to Andre. So the process is adding sounds to a riff, and then resampling it, chopping it up, reworking it. And then it goes back to the software programs, gets resampled, goes back in. At the end, you're listening to sounds that people are like, “What is that? What synth is that?” It's not a synth, it's not a bass guitar or whatever. It's like a complex sound that has its origins sometimes in three or four different things at the same time.
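The bounce-chop-resequence loop Holkenborg describes can be sketched in a few lines of numpy. This is only an illustration of the general technique, not his actual toolchain; the synthesized tone stands in for a bounced bass riff, and the slice order is arbitrary.

```python
# Minimal sketch of the chop-and-resample workflow described above,
# using a synthesized stand-in for a bounced bass riff (numpy only).
import numpy as np

SR = 44100  # sample rate

# Stand-in for an eight-bar bass bounce: a 2-second 55 Hz saw-like tone.
t = np.arange(2 * SR) / SR
riff = 2 * (55 * t % 1.0) - 1.0

def chop(audio, n_slices):
    """Split a buffer into equal slices (drops any remainder)."""
    hop = len(audio) // n_slices
    return [audio[i * hop:(i + 1) * hop] for i in range(n_slices)]

def resequence(slices, order):
    """Rebuild a new phrase from slices in an arbitrary order."""
    return np.concatenate([slices[i] for i in order])

def resample(audio, ratio):
    """Crude linear-interpolation repitch (ratio > 1 raises pitch)."""
    idx = np.arange(0, len(audio) - 1, ratio)
    lo = idx.astype(int)
    frac = idx - lo
    return audio[lo] * (1 - frac) + audio[lo + 1] * frac

slices = chop(riff, 8)
new_riff = resequence(slices, [0, 3, 3, 1, 6, 2, 7, 5])
repitched = resample(new_riff, 1.5)  # roughly a fifth up
```

In practice each pass through a tool like MetaSynth or Kyma replaces the simple `resample` step here, which is what buries the sound's origins after a few iterations.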

What other kinds of processing do you use a lot when coming up with your sounds?

FIG. 2: Holkenborg, often with the help of his two assistants, puts a lot of time and energy into programming custom sounds, using a wide range of plug-ins and processing programs.

We do a lot with cross-convolving, where we do like FFT envelope-filter analysis and apply that to something else. Let's say that you want to create an airy pad, but it doesn't have the quality of a pad; it has the quality of something unique. What you can do, for instance, is to take a crash and take the section of the crash where the volume of sounds is really loud, like just after the attack. And just take a section and loop it forever so you have [he makes a shhhhhhh sound]. So take that section, do a filter envelope of the frequencies in there. And then, for instance, play guitar; you play the chords of the song, and then you apply the frequency analysis that you got for the crash and apply it to the guitar. The result that comes out of that is already insane. But what if the result of that gets cross-convolved with a female choir? And what if that gets cross-convolved with the lead vocal that you have in your song? You get all these weird frequencies that are working with each other, and at the same time it's getting all this melodic information from different instruments — like the guitar, like the choir, and like the female voice — to create these really complex harmonic sounds that are impossible to make with one synthesizer or two synthesizers.
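The "FFT envelope-filter analysis" Holkenborg describes amounts to taking one sound's frame-by-frame spectral envelope and imposing it on another. A rough numpy-only sketch of that spectral transfer follows; real tools (MetaSynth, Kyma) do far more sophisticated analysis, and the "crash" and "guitar" here are synthesized stand-ins.

```python
# Rough sketch of spectral transfer: shape a carrier (the "guitar")
# with the frame-by-frame FFT magnitude envelope of another sound
# (the looped "crash" section). Assumes equal-length mono buffers.
import numpy as np

def stft(x, n_fft=1024, hop=256):
    frames = []
    for start in range(0, len(x) - n_fft, hop):
        frame = x[start:start + n_fft] * np.hanning(n_fft)
        frames.append(np.fft.rfft(frame))
    return np.array(frames)

def istft(spec, n_fft=1024, hop=256):
    out = np.zeros(len(spec) * hop + n_fft)
    for i, frame in enumerate(spec):
        out[i * hop:i * hop + n_fft] += np.fft.irfft(frame) * np.hanning(n_fft)
    return out

def spectral_transfer(envelope_src, carrier, n_fft=1024, hop=256):
    """Impose envelope_src's magnitude envelope onto carrier."""
    S = stft(envelope_src, n_fft, hop)
    C = stft(carrier, n_fft, hop)
    n = min(len(S), len(C))
    mag = np.abs(S[:n])
    mag /= (mag.max() + 1e-12)       # normalize the envelope
    return istft(C[:n] * mag, n_fft, hop)

SR = 44100
t = np.arange(SR) / SR
rng = np.random.default_rng(0)
crash = rng.standard_normal(SR) * np.exp(-3 * t)   # noisy decaying "crash"
guitar = np.sin(2 * np.pi * 196 * t)               # G3 stand-in
hybrid = spectral_transfer(crash, guitar)
```

Chaining this operation (guitar shaped by crash, then by a choir, then by a vocal) is what stacks up the "impossible" harmonic complexity he mentions.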

On your new CD, there was a really cool, elastic-sounding bass line on the song “Booming Back at You.” How did you come up with it?

That was actually not really hard. That was just a saw wave that sounded really fat. It's funny that every plug-in synth and every hardware synth out there can produce a saw wave, but if you play that same note on 40 different keyboards, it will sound completely different. It's the same note, it's the same saw wave. But it sounds completely different.

Because the rest of the synth architecture is different?

Well, I'm talking with all the filters off. Everything off. Just play it on one synth, and it has full overtones and undertones — whatever they're called. It's almost like picking up ten Gibsons. Like those two Gibsons [pointing to his two Les Pauls] are technically identical. But if I played the E string on one guitar and the E string on the other one, they have a completely different flavor. The guitars feel different, yet they're the same model.

Then again, you get the differences of wood and all that on guitars. Theoretically, that's not the case with a synth.

You would say that, but there's still a difference. With that thing [the “Booming Back at You” bass sound], the trick was to find a saw wave [with] bigness to it, and then just edit it a lot with portamento and glide so that the timing sort of felt right.
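The portamento/glide editing he describes can be sketched as a saw oscillator whose pitch slews toward each new note instead of jumping, which is what produces that elastic feel. This is a generic illustration in numpy, not the actual patch; the note values and glide time are made up.

```python
# Sketch of a saw bass line with glide: the oscillator frequency
# slews exponentially toward each note target (one-pole smoother),
# then a naive (non-bandlimited) saw is read from the running phase.
import numpy as np

SR = 44100

def glide_saw(midi_notes, note_len=0.25, glide=0.05):
    """Render a saw line where pitch glides between notes."""
    samples_per_note = int(note_len * SR)
    freq_target = np.repeat(
        [440 * 2 ** ((n - 69) / 12) for n in midi_notes], samples_per_note)
    alpha = 1 - np.exp(-1 / (glide * SR))   # per-sample slew coefficient
    freq = np.empty_like(freq_target)
    f = freq_target[0]
    for i, target in enumerate(freq_target):
        f += alpha * (target - f)
        freq[i] = f
    phase = np.cumsum(freq) / SR
    return 2 * (phase % 1.0) - 1.0

line = glide_saw([33, 36, 31, 33])  # a short A1/C2/G1/A1 riff
```

A longer `glide` value exaggerates the rubber-band pitch bends between notes; a very short one approaches plain note-on pitch jumps.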

How does your work split up percentage-wise between film scoring, game scoring, your own albums, and so forth?

Film and games are 60 percent a year. That includes commercials, video games, movies, doing a title song for a video game, or doing a special version of a song for a movie. Then 5 percent is remixing, and the rest is Junkie XL being the artist and going on the road and doing gigs. Because I work a lot of hours in a year, 5 percent is still a lot of time. Last year I did Coldplay, I did Bloc Party, I did Justin Timberlake, Britney Spears, and Avril Lavigne. I usually do about four or five remixes a year.

Let's focus on your remixes for a moment. I guess that you have to take a completely different approach for them than you do when you're producing tracks for your own solo efforts.

Absolutely. You work with a song that is great or horrible or challenging, or a ballad, and you turn it into something completely different. But I always try to keep the original song in mind, and then do something with it that has a lot of the original flavor in there, even though it's a completely different [type of treatment of the] song.


Holkenborg's diverse workload includes albums; movie, game, and TV music; remixes; and touring.

Besides the vocal track, what are you usually given for source material? Do you get any stems of the other tracks?

It depends. Certain types of mixes, like with Britney Spears, sometimes I get the full Pro Tools session, and sometimes I get just a stereo track with her vocal, with all the backing vocals, and with all the processing and the reverbs — the whole shebang — and that's what I have to deal with. With Coldplay, for instance, I got the full multitrack [session]. Even including the demo recordings and all the 20 vocal takes that Chris Martin did, including the one that they comped, and so that was pretty interesting.

Does having so many of the original tracks make it harder, in a way, to get away from the original feel?

No, I like it. When I remixed Depeche Mode a couple of years back, I got the full multitrack, and I was like, “Damn, those guys were ill — just 24 tracks to create that massive sound.” And I was able to listen to all of the sounds individually and say, “Mmm, that's how they did that. Mmm, that's how they did that.”

I read about your remix of an Elvis Presley song you did a few years ago. Tell me about that.

That was a bitch, because there were no out-takes or multitracks. That was just a mono file that I chopped up into thousands of pieces to work with a click track.

A mono file of his vocal?

No, of the whole song, that was it. And I had to make it work with programming, and I was recording a lot of instruments. I recorded a Hammond organ and lots of female vocals on top of it, and extra brass. I recorded a bunch of guitars and a live bass line, and I programmed drums. The only reason I got away with it was because, luckily, in those days, the vocal was mixed way too loud in the track. That was really hard because there was a percussion player playing on the original song who was all over the place.

What was the song?

“A Little Less Conversation.” And I did that for a Nike commercial for the soccer world championship in 2002. Besides the fact that Elvis is massive in America, and massive outside America, soccer outside of America is like mayhem. So that campaign had a lot of money put behind it by Nike. And that track just started living a life on its own, and it got to No. 1 in more than 26 countries.

What are some of the films you've done, and have you actually scored them, or contributed songs, or what?

How it starts with film music is that you do a couple of little things on a movie. Or you work together with one of the big film composers in this town, and you become one of his assistants/ghostwriters — and you deliver music for that person because he's more overseeing the film. And sometimes you get a credit for it, and sometimes you don't get a credit for it. That's the world that we live in, and everybody has to go through that.

So it's fairly common that a big composer will sort of subcontract out some of the work?

Not even subcontract; those guys are already working for them. That's just the system — that's how it works, and you just have to fight your way through there. A really good example is Hans Zimmer. He doesn't necessarily work on all the movies himself, but he just orchestrates all those people [working for him] and makes sure that there's quality control. He's always in charge of the vibe. It's been very beneficial for the school of composers that have worked for him. A huge amount of people that worked four or five years for Hans have become supersuccessful on their own.

Where do you fit in this film-scoring scene?

I've worked with all those people, and I've done stuff for Hans and for Harry Gregson-Williams and for a couple of other people, and you just get experience. You pick up on how things work, what the whole organization is, how people communicate with each other — like what's the tone of how people talk to each other — and you just sit on the sideline. Even though you do the bulk of the work, you sit on the sideline and you watch all that, and you learn, and you absorb what goes on in that world. And then at a certain point you break out of that, and you start doing things on your own. But since I'm an artist, I've already done a lot of stuff on my own, like little bits and pieces that you get the full credit for.

Like a song in a movie, that kind of thing.

Yeah. Or like a scene or two scenes. Like when I worked on Blade in '96 and '97, I was approached as Junkie XL. “We want your sound. We want whatever you do for that scene. Can you do it?” And then of course you get full credit for that. But with some of the movies that I worked on here in town — like, for instance, when I worked on Catwoman, which was a Hans Zimmer gig, and they had some issues with some of the modern things and some of the modern music in the movie, and I got approached, it was like, “Hey, can you help us out with that?” — then you work with stuff that's already there. There's already orchestra recorded, and you take that and put like a bunch of beats around it and make it sound as cool as you can. And then your function is completely different. For instance, when I did Dead or Alive last year, they approached me like, “Hey, we want you to do the whole film,” so then it's a Tom Holkenborg score or a Junkie XL score no matter what. The same thing goes on in video games. I've been doing video games longer, and I've had more success in the end result of the video game. So I'm way further in the video-game-composer career — I'm pretty much at the top, with scoring games like SSX Blur and Forza Motorsport and Need for Speed. Those are the flagship games of those companies, and they trust those games to me. That's comparable to, I don't know, a Spider-Man movie or The Simpsons Movie. In the movie world, I'm far from being there. I'm still in the growing process.

But you're heading that way?

Yeah, the only way is up for me.

Regarding video-game composition, I guess you have to be really careful writing melodies, knowing that they're likely to be repeated so much, right?

If you make music for a movie, it's a linear experience: movie starts, movie ends. It goes like this and like that, and then a grand finale, happy ending, whatever. So you see all that, and you just have to be sure that the music really fits. But a video game is a dynamic experience; it's interactive. Yeah, you start a race, but you don't know when it's going to end. It might end here [he plays a low note on the piano], and it might end over there [he plays a high note]. And you might play it for 55 minutes. So it's hard to make a linear piece of music for a game. So you're talking in-depth and interactive, and that's the hardest part. Especially when game consoles are getting more sophisticated, with more DSP and more processing. It allows you to do such crazy stuff with the music, and that whole market is breaking open at the moment.

You mean breaking open in terms of what you can do musically in games?

Yeah. For instance, with Need for Speed: Pro Street, we analyzed every race lap, and we said, “Okay, what can happen in every race lap that will trigger some sort of emotion with the player?” And we came up with about 20 or 30 things — like a great start, or if you take your first corner and you're doing really well, or if you hit somebody in the back and your front end is falling off — all the kinds of things that trigger some sort of emotion with the player.

And you try to reinforce that musically?

Yes. So you need software that allows you, without any weird glitches or weird musical vibes, to go to a different musical section when the player does something, so you can underscore that specific moment the best. You also need to translate all those emotions into music. It's like, “Hmm, how am I going to do that?” When somebody gets hit but the car is slightly damaged, how do I translate that into a musical vibe? Then, if you win, obviously it's euphoric; those are the easy parts. But it's the little things in between. And then you go back and forth with the technical team of those games, the audio leads, or the programmers. It's like, “How do we make this happen?” I end up sending every track that I made — like up to 100, 150 audio files, and sometimes way more — that contain transition files that contain multiple layers that can be played at the same time and [with] different balances. It's a lot of technical blah blah and a lot of thinking about that. Because yes, it's really technical, but at the end of the day, it should sound completely natural, as if it was meant to be like that.
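The transition logic he describes (game events queue a new musical section, and the switch waits for a musically sensible boundary) can be sketched as a tiny state machine. The class, event names, and bar-boundary rule below are all hypothetical illustrations, not EA's actual audio engine.

```python
# Hypothetical sketch of event-driven adaptive music: game events map
# to score sections, and changes are committed only on bar lines so
# transitions stay musical rather than glitchy.
from dataclasses import dataclass
from typing import Optional

BEATS_PER_BAR = 4

@dataclass
class AdaptiveScore:
    sections: dict                      # event name -> section name
    current: str = "cruise"
    pending: Optional[str] = None
    beat: int = 0

    def on_event(self, event: str):
        """Queue the section mapped to a game event (if any)."""
        target = self.sections.get(event)
        if target and target != self.current:
            self.pending = target

    def on_beat(self):
        """Advance the musical clock; commit changes on bar lines."""
        self.beat += 1
        if self.pending and self.beat % BEATS_PER_BAR == 0:
            self.current = self.pending
            self.pending = None

score = AdaptiveScore(sections={
    "great_start": "energetic",
    "collision": "tense",
    "race_won": "euphoric",
})
score.on_event("collision")   # queued, not yet switched
for _ in range(4):
    score.on_beat()           # the switch lands on the bar line
```

The 100-plus files he mentions would correspond to the per-section stems and transition pieces such a system selects among at runtime.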

The game software is what controls what triggers which piece of music, right? You just have to deliver the music to the game developer, and they figure out how to make that work?

That's what you do hand in hand [with the game developer's tech team]. They say, “What can you do?” And I say, “I can do this, but are you guys able to implement that and that and that in the game?” And they say, “Yeah, we can do that.” And the software that they use is all secret company stuff. But they'll send me beta versions of that audio software, and I'll try it here in the studio and say, “Man, that's not working.” And then they're like, “Okay, well maybe we should do this and that.” And then it's dead-on, and it's awesome. And then once you've set it out, then it's the bulk process of applying that to all the pieces of music that you've done.

How long does it take you to score a game?

Usually you have three months to do it. But sometimes it's a rush job — another composer fell through or they have different ideas out of the blue for how a game needs to be done. On Need for Speed, I had like close to four months to do it, and I was very active that whole time period. And on SSX Blur, I only had four weeks to do it.

So your assistants probably come in really handy during game jobs.

With movies and video games, there's no way that you can do it on your own. It's just too much work.

Finally, with all the different kinds of jobs you do and the huge amount of synths and audio material that you draw from, how do you keep track of all your sounds? You must have about a million of them.

That's the weirdest thing. My girlfriend tries to explain to me over and over again how to program our microwave at home. And every time I just mess it up. Or DirecTV or something — it has a logic to it that works for consumers, but it doesn't work for me. It's not logical, you know? But then again, if you ask me where that kick drum is that I used five years ago on that song, it's like, “Oh, it's on drive 83, and there's that folder in there that I think I made yellow, and there's a folder in there that's called ‘Heavy S**t,’ and it's in there.” I have like a photographic memory when it comes to that kind of stuff. I just know where everything is.

(Editor's note: For more of this interview, in Podcast format, go

Mike Levine is EM's executive editor and senior media producer and the host of the twice-monthly Podcast “EM Cast” (

Junkie XL: Selected Credits

Albums
  • Radio JXL: A Broadcast from the Computer Hell Cabin (Koch, 2003)
  • Saturday Teenage Kick (Roadrunner, 1998)

Songs
  • “Today” from Today (Roadrunner, 2006)
  • “Beauty Never Fades” from Radio JXL: A Broadcast from the Computer Hell Cabin (Koch, 2003)
  • “Future in Computer Hell” from the Sasha CD Global Underground 013: Ibiza (Global Underground, 1999)

Remixes
  • Britney Spears, “Gimme More” (Jive, 2007)
  • Justin Timberlake, “What Goes Around” (Jive, 2007)
  • Coldplay, “Talk” (Capitol, 2006)
  • Elvis Presley, “A Little Less Conversation” (RCA, 2002)
  • Rammstein, “Feuer Frei!” (Motor Music, 2001)

Film Music

  • Blind (Klas Film, 2007)
  • Catwoman (Warner Bros. Pictures, 2004), “Who's in Control” remix
  • Chronicles of Riddick: Dark Fury (Universal Studios, 2004)
  • Resident Evil (Sony Pictures Entertainment, 2002)

Game Music

  • Need for Speed series (Electronic Arts, 1995-2007)
  • Forza Motorsport (Microsoft Game Studios, 2005)
  • The Matrix: Path of Neo (Atari for PlayStation, 2005)
  • Quantum Redshift (Microsoft Game Studios, 2002)

A Tale of Two Sequencers

Although Holkenborg's main recording software is Pro Tools, he also uses Logic a lot and has two Macs (each with its own Apple Cinema Display) at his main work area.

In Holkenborg's studio, two is better than one when it comes to sequencer software. He uses Digidesign Pro Tools for his audio needs and Apple Logic Pro for some of his MIDI work, especially orchestral arrangements. Both applications run on separate Macs and are synced using MIDI Machine Control. “So if I press start, stop, or record on one computer, the other will follow,” Holkenborg explains.
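The MIDI Machine Control sync he describes runs on short, standardized SysEx messages; the sketch below builds the standard Stop, Play, and Record Strobe commands from the MIDI 1.0 MMC specification. Actually transmitting them would require a MIDI library (for example mido), which is assumed rather than shown.

```python
# Build MIDI Machine Control (MMC) SysEx messages. Format per the MIDI
# spec: F0 7F <device id> 06 <command> F7. Device 0x7F addresses all
# devices, which is how one transport can drive every machine at once.
MMC_COMMANDS = {
    "stop": 0x01,
    "play": 0x02,
    "deferred_play": 0x03,
    "record_strobe": 0x06,   # i.e., punch in / start recording
}

def mmc_message(command: str, device_id: int = 0x7F) -> bytes:
    """Assemble a complete MMC SysEx packet as raw bytes."""
    return bytes([0xF0, 0x7F, device_id, 0x06,
                  MMC_COMMANDS[command], 0xF7])

play = mmc_message("play")   # F0 7F 7F 06 02 F7
```

Pressing play on the master machine sends a packet like this; the slave recognizes the command byte and starts its own transport.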

“Logic, for me, is just a very expensive sampler,” he says. “I just look for the sound that I want, I hit it, I record it into Pro Tools, and I do everything in Pro Tools. When it comes to film scoring and video-game scoring, same thing — all the electronic parts are recorded into Pro Tools, and then I edit everything. But then, all the orchestration for an orchestra happens in Logic and stays in Logic until everything has been approved by a director. And then everything gets written out, and [then] recorded with real instruments.”

When working in Logic on orchestral parts, Holkenborg doesn't use the notation features. “I usually edit it in the grid [piano-roll] mode,” he says. “I got really used to the grid mode from Pro 24 even, and the early Cubase. I'm so stuck on that.”

I asked him what he thinks of Logic Pro 8, Apple's latest version. “The new Logic Pro is a massive update compared to what they had before,” he says. He calls the program's MIDI capabilities “phenomenal.” But despite his affection for, and use of, Logic, Pro Tools is the final destination of all his projects. “When it comes to audio, to me, Pro Tools [HD] is absolutely superior. It comes with massive DSP.”
