It's Only Make-Believe

The sounds that you hear in movies, TV, and video games fit so seamlessly with the visuals that it's easy to forget just how much work goes into producing them. Whether the sound you're hearing is a bomb exploding, a spaceship taking off, or silverware clinking, you can bet that a lot of effort was put into designing and recording those sounds.

One of the preeminent sound designers working today is Scott Gershin, who has worked on many major motion pictures, including American Beauty, The Chronicles of Riddick, and Shrek, and on a host of video games, such as the James Bond and MechWarrior series and Lost Planet: Extreme Condition. Gershin is the executive creative director of Soundelux Design Music Group, a postproduction sound company located in Hollywood, California. He is also the cofounder of the Interactive Entertainment Sound Developers, a new branch of the Game Audio Network Group. I had the opportunity to speak with him recently about his work, his gear, and his techniques.

Is the term sound designer a good way to describe what you do?

Sure, but sound design has become a very generalized and overused description. There are music sound designers, theatre sound designers, movie sound designers, game sound designers, and dialogue designers. What I do is take a look at the story or the action that I'm dealing with, and [then] use and manipulate sound to help tell the story or enhance the experience of the moviegoer or game player. My job is to use sounds, whether captured from real life or manufactured and manipulated, to push the emotional buttons of the audience.

Are you always working with a visual component?

Mostly.

Our readers are interested in the nuts and bolts of how you create sounds. Let's take, for example, a film situation. Can you describe your work flow? That is, when do you start preparing, and when do you see materials or visuals?

Every film is a little different, but in a nutshell, I'll receive a script or a rough version of the film. I'll go through the script or video and start looking at potential audio opportunities. Sometimes the movie is already being shot when I get the script; other times, it's still in preproduction. In the latter case, I'll try and sit down with the director and talk about audio opportunities during filming that we might be able to record. If he or she is going to be filming large crowds, weapons, special vehicles (armored attack vehicles, race cars, aircraft, submarines, and so on) or something that's unique and hard to get access to, then I'll recommend we go out and record those sound sources during the filming, as well as [during] those days before, between, and after the filming of those scenes.

You're talking about recording at the set of the movie?

FIG. 1: Gershin (left) and Peter Zinda mic the avionics of a jet plane while gathering sonic material for The Chronicles of Riddick.

Yes, I do it quite often. On Herbie: Fully Loaded, the second unit had full access to California Speedway filming the NASCAR race. I discussed this opportunity with the director and Disney, and they gave my crew and me full access. On one of the days, there was cloud cover at the speedway; they couldn't match the scene previously shot with the actors, so they had to wait while the weather cleared up. That gave us a chance to wire the cars up, race them around the track, and get some amazing recordings. Disney also built 30 different Herbies to use in different parts of the film. Each one had a slightly different engine setup, so I auditioned them and chose four that I thought had unique sounds. We miked them up and spent days just recording Volkswagen Herbies at an airfield we rented north of Los Angeles. We installed switches in the car that allowed us to turn different spark plugs on and off, giving us the ability to record a palette of sounds to help create Herbie's “emotions.”

Which do you use for your remote recordings — field recorders or laptops?

I'm using mostly field recorders. I've tried laptops, but because of battery issues and glare, I've found them a little inconvenient. Most of the time I need to be fast, flexible, and agile. When I go out to record, I use multiple recorders and a team of recordists covering different perspectives of the sound, similar to a multicamera shoot (see Fig. 1). We use a combination of Sound Devices 744Ts (see Fig. 2) and Fostex FR-1s. Sometimes we'll use a [Zaxcom] Deva V or a Nagra. Microphone choices depend on what we're recording; each mic has a different color, reach, and purpose. We choose the right mics for the right purpose, similar to tracking an album. I just bought a Sanken 5-channel microphone, which I've been quite impressed with. Between Soundelux, myself, and the guys I record with, we have a very impressive mic arsenal to choose from.

Once you're done gathering raw material, where do you go to work on the sounds?

The studio that I work out of most is at Soundelux, although I also have a setup at home. The Soundelux studio is a minitheater; it's about 30 feet long and 24 feet wide with a full screen in front. I'm surrounded by computers, outboard gear, controllers, and anything that can manipulate sound (see Fig. 3).

Do you want the room to sound kind of like a theater does?

I try to simulate as much as possible the environment that the sound is going to play back in. So I've got my film set of speakers, and I've also got another set of speakers for my video-game work.

Are those other speakers studio monitors?

Yeah, my mid/close-fields. I'm using JBL LSR32s with a Bryston amp.

I actually have four speaker setups that I use. I have my EAW theater speakers behind the screen, my JBL LSR32s on stands at my main studio, and a 5.1 Dynaudio Air system that I run digitally out of my Pro Tools setup in my second studio. For critical listening, I also use my home-theater setup in which I'm using Thiels and a Sunfire sub.

Do you work mainly in Digidesign Pro Tools?

Yes. I originally started off with a Synclavier, then became the first Waveframe user in postproduction using the Waveframe 1000. And then I converted over to Pro Tools. One of the advantages of Pro Tools is the ability to interface relatively well with the Avid and dubbing-stage systems. Pro Tools is the de facto standard in the industry. There are other systems, but because of time constraints and flexibility around town, Pro Tools is a good solution.

FIG. 2: Gershin uses a variety of field recorders, including the 4-track Sound Devices 744T, when gathering sonic material for his work.

I assume there are times when you create sounds without any field-recorded material?

Sure — with synths, [with] sounds I record on Foley stages, [and by] manipulating sounds that I have collected. The great thing about Soundelux is that we've got an ever-growing library; it never gets stale. It's a combination of what we have recorded and what we've collected. For example, tomorrow I'm going to be recording llamas. While doing a Podcast, which I do every other week at http://nowcastnetwork.com, I met a family of musicians on the show who raise llamas. They were gracious enough to allow me and [sound designer] Peter Zinda to spend the day with them recording their llamas. I'm not necessarily using the llamas for any specific show, but the opportunity came up and we found out that llamas have a great wealth of vocalizations. Eventually, I'll need them for some creature sounds or [will] use them to embellish something else.

Can you describe how you typically work with a director on a film job?

Every director likes to interface differently. But basically the goal is to meet with the director to interpret what his or her vision is for the film. Some of the directors are very audio savvy and some are not. So then it's my job and my taste — that's why they've hired me — to interpret that vision and come up with my take on what they're trying to create. And then I'll start playing back scenes with the director, getting a feel for what his or her tastes and likes are, and I'll form a creative relationship with that director and with the picture cutter [the film editor]. Sometimes the sounds I need to create are very simple. When I did American Beauty, one of the things the director wanted was for the sound to help embellish the story [without calling] attention to itself. In other words, to totally blend with the production while paying special attention to negative space — that is, using silence as a tool. I had to find a whole new way of recording Foley and Walla group [actors who speak dialog that's used as a background effect] to seamlessly blend with production. The bottom line is that my job is to make sure that nobody knows I exist. My job is to create illusions with audio. Because what we're really trying to do is tell a story.

A lot of times you make things larger than life, sonically?

So many times you hear the phrase "the Hollywood Sound." It's when you embellish real-life sounds in a movie, because many times realism is kind of boring. For instance, the classic example is the Hollywood punch. If anyone hit somebody and produced that sound, it would be a one-punch fight. Like in Herbie, I needed to create a Volkswagen with personality. A Volkswagen by itself doesn't have much personality, let alone the ability to emote joy, sadness, and jealousy. So what I needed to do was what all sound designers strive for — to create the right balance between realism and fantasy.

How did you do that with Herbie's personality?

FIG. 3: Gershin's main studio at Soundelux is set up in a circular work space inside a large room designed to resemble a movie theater acoustically.

Recording the realistic Herbie and manipulating its engines and driving in unusual ways turned out to be really effective, but it wasn't enough to produce the range of emotions I needed for the movie. Over my career I've used my voice to help make a lot of inanimate objects and characters come to life, giving [them] some personality. So I processed my voice and kind of emulated what the engine was doing and created another level of sound that, when combined with the organic sound, gave it a bit of a personality. Same thing when Herbie smiles or frowns using his bumper — I ended up manipulating real metal from a junkyard, and then added my vocalizations so it had the effect that makes the audience feel like it's talking. But I had to be careful not to go too far; otherwise, it would have become cartoonlike.

Can you take a sound and repitch it and completely change its character?

Yes. These days, there are two ways that I deal with pitch. One, I use pitch plug-ins that have pitch-envelope capabilities, and the two that I like the most are Serato Pitch 'n Time and the Waves SoundShifter. For other types of pitch-shifting, I use samplers. Lately, I've been using Native Instruments Kontakt. I'm very interested in seeing Digi's new sampler. I've used the Waveframe 1000, the Emulator 4, and the MOTU Mach Five, among others. Ultimately, it's great if I can take a sound from a track that I've been manipulating in Pro Tools and drag it into a sampler. But the bottom line is that the sampler or plug-in has to sound good.

What are the basic techniques that you typically use when you're manipulating a sound?

Pitch and time manipulation are huge. I might change the pitch and keep the speed the same, adjust the speed but keep the pitch the same, or adjust both pitch and speed. There are many ways to manipulate pitch and time [speed]. The other area I address is tone, using EQs and filters. You want to be able to reshape a sound and mangle it into something totally different. Another technique is to layer one sound against another and create something totally new. It could be as simple as grabbing five or six gun recordings, which, when combined and manipulated, get the results you're looking for. Or blending multiple animals together to come up with a new creature. Then there are, obviously, modulation-style and spatial effects that take a sound to a whole other level. Those can be used when you need it to be otherworldly — to sound like something that doesn't exist in nature.
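For readers who want to try these basic moves in software, here is a minimal sketch (not Gershin's actual toolchain) using the Python libraries librosa, NumPy, and soundfile. It pitches one source down without changing its length, slows another down without changing its pitch, and layers the two into a composite; the file names are placeholders for your own recordings.

```python
# Minimal sketch of pitch/time manipulation and layering, assuming
# librosa, numpy, and soundfile are installed. File names are placeholders.
import numpy as np
import librosa
import soundfile as sf

gun, sr = librosa.load("gunshot.wav", sr=None, mono=True)
roar, _ = librosa.load("lion_roar.wav", sr=sr, mono=True)  # resampled to match

# Pitch the gunshot down a tritone while keeping its duration.
gun_low = librosa.effects.pitch_shift(gun, sr=sr, n_steps=-6)

# Slow the roar to half speed without changing its pitch.
roar_slow = librosa.effects.time_stretch(roar, rate=0.5)

# Layer the two sources: sum into a common buffer, then normalize the peak.
length = max(len(gun_low), len(roar_slow))
mix = np.zeros(length)
mix[:len(gun_low)] += gun_low
mix[:len(roar_slow)] += 0.7 * roar_slow       # blend the roar at a lower level
mix /= max(1e-9, np.abs(mix).max())           # avoid clipping

sf.write("creature_composite.wav", mix, sr)
```

From there, EQ, filtering, and modulation effects would typically be applied in a DAW or with further DSP before the result is judged against picture.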

Can you give an example of a sound you created that would fit in the otherworldly category?

When I did The Chronicles of Riddick (see Fig. 4), I used a lot of synthesizers and other instruments. I recorded a guitar through a full Marshall stack with all of the strings loosened, bending the strings up and running them through two or three harmonizers — all pitching up at the same time — to create the sound of the starship lifting off. Again, there's no right or wrong way. There's a lot of experimentation and trying to come up with ideas that may sound great or that sometimes aren't so great. Or totally wonderful accidents [can] occur that make you go, "Wow, this is really unique."

What about the whole ambience issue? Do you use convolution reverbs with odd impulse responses like, say, the inside of a vacuum-cleaner tube or something like that?

Absolutely. [Audio Ease] Altiverb and [Waves] IR-1 are huge components. We've only scratched the surface of all the things that they can do. When you start doing convolution recordings in spaces, with materials, it becomes fascinating.

What do you mean?

Again, like a vacuum-cleaner tube or something that you would never think you'd want as an acoustical space, but something that may manipulate the sound in very unique and odd ways. That's something I've been using for a couple of years now, and it's pretty much an amazing thing. I also still use a lot of classic reverbs — a lot of Lexicon and TC Electronic stuff. It really depends on what you're trying to create. With Riddick, I used musical instruments to create the core of the design, rather than going with stuff that's more predictable.
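To illustrate the underlying idea (a generic sketch, not the Altiverb or IR-1 workflow), convolving a dry recording with any recorded impulse response, whether a concert hall or a vacuum-cleaner tube, stamps that space's character onto the sound. Here is a rough example using SciPy and soundfile, with placeholder file names:

```python
# Rough sketch of convolving a dry sound with an arbitrary recorded
# impulse response (IR). File names are placeholders.
import numpy as np
import soundfile as sf
from scipy.signal import fftconvolve

dry, sr = sf.read("dry_metal_clang.wav")
ir, sr_ir = sf.read("vacuum_cleaner_tube_ir.wav")
assert sr == sr_ir, "resample first so both files share one sample rate"

# Fold multichannel files down to mono for simplicity.
if dry.ndim > 1:
    dry = dry.mean(axis=1)
if ir.ndim > 1:
    ir = ir.mean(axis=1)

wet = fftconvolve(dry, ir)              # output length: len(dry) + len(ir) - 1
wet /= max(1e-9, np.abs(wet).max())     # normalize the peak to avoid clipping

sf.write("clang_through_tube.wav", wet, sr)
```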

So that it's a more original sound?

Yeah, just something a little bit different. And again, we're kind of audio photographers — constantly going through life listening for sounds. For example, my washing machine at my house sits above a downstairs bathroom, and when the washing machine is on, it creates this interesting resonance in the bathroom. It feels like a starship or a submarine, or something to that effect.

Do you carry a recorder around all the time in case you hear something you want to capture?

FIG. 4: Gershin is shown with the guitar he used, with strings loosened, to create part of the starship-liftoff sound in The Chronicles of Riddick.

I've got a couple of recorders that I could grab at a moment's notice. I did a submarine movie a couple of years back and recorded a Jacuzzi with spinning jets that I had in my house. And that was the key element to the sound of the torpedoes' blades. I bought an underwater microphone and recorded Doppler bys with it. And it's just basically a Jacuzzi jet, but against picture it gives off a different illusion.

How about Foley artists — is it their job to put in the regular sounds, like the footsteps and door slams? And what's the difference between a Foley artist and a sound designer?

Foley, location recording, and sound manipulation are all tools for the sound supervisor/sound designer to use in creating a soundtrack. When I create sound effects on the Foley stage with the Foley artists [or Foley walkers], I will take the results and combine them with the design to help add definition, detail, and many times something interesting and new. I will, for instance, create a really big explosion — I move the room. But let's say that during the explosion there's dirt that falls on metal. Or I want to crack something before the explosion. I want to enhance it with another level of detail. I'll do that on the Foley stage. Most of the time, the designer will create sounds and combine them with the artistry of the Foley walkers to add a level of detail.

So how would you define the Foley artist's job?

Foley artists and their recording engineers work in studios that are filled with everyday objects that they know will make specific noises. They also have a multitude of shoes and surfaces so that they can re-create sounds, movement, and footsteps in sync with the picture. In their bag of tricks, they know that if they grab, say, a whippoorwill branch, they can create whooshes that sound like a boomerang. And they've got an arsenal and a knowledge base to create sounds based on the junk that they've collected. For example, we've simulated snow by using baking powder to get the right crunch and texture.

From a sound-design standpoint, how does working on a video game compare with working on a film? Is it similar or totally different?

We're using similar artistries, but the release format is a bit different. In film, I'm trying to create a blend of sounds as part of the storytelling — like an aural painting. It's something you observe.

With video games, you are trying to create each individual sound event, which, when combined in an infinite number of ways, will be able to create another aural experience — something you partake in. One difference is that there's a lot more repetition in a video game. When you watch a film, the sound goes by in a linear fashion; you only hear those tracks at that given time within the movie.

In a video game, the player will hear sounds over and over. When you create a sound or a melody that will be heard again and again, you need to be very conscientious that the end user won't get sick of it. In terms of sound quality, we treat games the same way we treat movies.
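A common way game-audio teams fight that repetition (a general industry technique, offered here as an illustration rather than anything Gershin describes) is to record several takes of each event and randomize which take plays, along with small pitch and level offsets, every time the event fires. A hypothetical sketch in Python:

```python
# Hypothetical sketch of repetition avoidance in game audio: pick a random
# take and apply small random pitch/volume offsets each time an event fires.
# Take file names are placeholders.
import random

FOOTSTEP_TAKES = ["step_01.wav", "step_02.wav", "step_03.wav", "step_04.wav"]
_last_take = None

def trigger_footstep():
    """Return playback parameters for one footstep event."""
    global _last_take
    # Never play the same take twice in a row.
    choices = [t for t in FOOTSTEP_TAKES if t != _last_take]
    take = random.choice(choices)
    _last_take = take
    return {
        "file": take,
        "pitch_semitones": random.uniform(-0.5, 0.5),  # subtle detune
        "gain_db": random.uniform(-2.0, 0.0),          # subtle level variation
    }

# Example: parameters for a short run of footsteps.
for _ in range(5):
    print(trigger_footstep())
```

In a real engine these parameters would be handed to the audio middleware rather than printed.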

Do you have some advice for those who are interested in developing their sound-design chops?

The main thing I tell people when I give any lectures about sound design is to listen. Everybody has learned how to stop listening. [Instead, they] filter out those things around them that they find distracting. When I interview young sound designers, one of the things I ask them is to name five types of rain. Some answer, “Rain, I don't know: rain, more rain, heavy rain.” I am looking for people who can hear and have an attention to detail — things like rain tapping on the window, rain hitting puddles, rain going down the gutter, the rhythm of the rain. Is it a slow drip-drip like a relaxed southern afternoon shower? Is it heavy pelting against a wood or plastic roof? All these sounds add to storytelling. You have to deal with sounds that evoke childhood emotions and that take you to a special place. I tell them to listen to the birds; listen to cars going by. Listen to anything that's machinery around your house. Listen to thunder and the way it echoes in the area that you live in, and how it slaps against the wall behind you. It's really stopping and relistening to everything around you. It's creating an aural vocabulary.

So like your example of the Jacuzzi jets or the washing machine that sounds starshiplike, people should listen for sounds in their home that can be used as raw material for sound design?

Yes. There is stuff that you can do in a home that would just blow your head away. Like taking [electric] razors and putting them in metal bowls and recording them. Or using the windows in your house on a windy day and opening them a little bit to make it sound like a wind storm or a hurricane or wind whispering across the wing of an aircraft. There are endless amounts of sounds that you can come up with.

Do you generally recommend using stereo mics or stereo miking for capturing remote sounds?

It really depends. It's the same thing as in music: it's like asking, when you mic a drum set or a guitar, what do you always use?

Good point.

I come from a music background, and I draw parallels between both industries. For example, when you're recording a gun, it's got a low thump that's kind of like a kick drum. So maybe I'll use an AKG D112 or an EV RE20 or a Sennheiser MD 421. I'll then use other mics to capture other frequencies that guns can make. Or maybe when I record glass breaking, which is kind of like percussion, I'll use a small-diaphragm condenser mic or a ribbon mic similar to overheads on a drum kit. I know it sounds corny, but I feel lucky to be able to be creative and work consistently on projects that I care about. I get to make noise for a living.

Mike Levine is an EM senior editor.

SCOTT GERSHIN: SELECTED CREDITS

Films:

Underworld Evolution (Screen Gems, 2006)

Herbie: Fully Loaded (Walt Disney Pictures, 2005)

The Chronicles of Riddick (Universal Pictures, 2004)

Team America: World Police (Paramount Pictures, 2004)

Blade II (New Line Cinema, 2002)

Shrek (DreamWorks SKG, 2001)

American Beauty (DreamWorks SKG, 1999)

Godzilla (Tri-Star Pictures, 1998)

Braveheart (Paramount Pictures, 1995)

JFK (Warner Brothers Pictures, 1991)

Games:

Lost Planet: Extreme Condition (Capcom, 2007)

Transformers: The Game (Activision, 2007)

Onimusha: Dawn of Dreams (Capcom, 2006)

Need for Speed: Most Wanted (EA Mobile, 2005)

Devil May Cry 2 (Capcom, 2003)

James Bond: Everything or Nothing (Electronic Arts, 2003)

James Bond: Nightfire (Electronic Arts, 2002)

MechWarrior series (various publishers, 1994-2002)

A full list of credits can be found at www.IMDB.com.