For many film composers, nothing is more exhilarating than standing in front of an orchestra and hearing one's music come alive. However, recording the live orchestras heard in today's blockbuster releases is no easy feat. It requires both a good crew and reliable gear. This article takes a look at the people and technology that go into preparing for a Hollywood scoring session.
SEE IT IN BLACK AND WHITE
For much of the film-scoring process, the composer is working on a computer, but during the weeks of recording, much of the focus shifts to traditional music notation. Although computers are an excellent way to create music, nothing is more satisfying or efficient, or produces better results, than recording live acoustic instruments playing music off the page. Of course, to get the music onto the page, a number of technical steps need to be addressed.
FIG. 1a: Sibelius MIDI file before quantization.
FIG. 1b: After quantization.
If the music was written in a computer sequencer, the composer or music programmer will export a MIDI file and send it to a copyist or orchestrator along with an MP3 of the mockup (a demo of the music using sampled instruments). The copyist will import the MIDI file into his or her notation program of choice and begin to stylize the layout of the score page while cleaning up the raw MIDI data. To assist in the cleanup process, notation programs will quantize the MIDI on import, snapping each note's start time and duration to a rhythmic grid, such as 8th notes or 16th notes (see Fig. 1).
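The grid-snapping that import quantization performs can be sketched in a few lines. This is a minimal illustration, not any notation program's actual algorithm, and the tick values are made up:

```python
# Sketch of grid quantization, as a notation program might apply on
# MIDI import: note start times (in ticks) are snapped to the nearest
# grid point, here a 16th-note grid.

PPQ = 480            # ticks per quarter note (a common resolution)
GRID = PPQ // 4      # 16th-note grid: 120 ticks

def quantize(tick, grid=GRID):
    """Snap a tick position to the nearest grid line."""
    return round(tick / grid) * grid

# A sloppily played run of 16ths, slightly ahead of and behind the beat:
played = [2, 115, 247, 355]
print([quantize(t) for t in played])   # -> [0, 120, 240, 360]
```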
The Sibelius notation program allows copyists to set up a “house style” with all of the composer's score preferences. This helps the copyist quickly conform the MIDI into the composer's preferred score layout. After cleaning up the MIDI in the notation program, the composer or orchestrator can add written expressive elements such as dynamics or techniques. Depending on the preferences of the composer, files can go back and forth between the composer and the copyist, with the composer making updates or orchestration changes to the score by hand.
FIG. 2: The action line on a film score.
There are a few elements specific to film scores that are useful to incorporate into the notated score. Unlike a concert score, it is helpful for a film score to have an action staff at the top of the score that indicates hit points—places in the film that need to be highlighted with accents or other musical hits (see Fig. 2). These are very useful for the composer and the conductor in shaping the music and orchestration.
Hit points are usually created when composing in a sequencer such as MOTU Digital Performer, which uses these hit points to calculate tempo and meter changes. In the pre-digital days, a music editor would often have to calculate tempos using a stopwatch, but these days the software can determine tempos for you.
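The stopwatch arithmetic the software now handles is straightforward: a tempo is chosen so that a hit point lands on a particular beat. A minimal sketch (the function name is illustrative):

```python
def tempo_for_hit(beats_to_hit, seconds_to_hit):
    """BPM that places a hit point exactly `beats_to_hit` beats
    after the cue start, `seconds_to_hit` seconds into the film."""
    return beats_to_hit * 60.0 / seconds_to_hit

# A hit 16 beats into the cue that must land 8 seconds after the downbeat:
print(tempo_for_hit(16, 8.0))   # -> 120.0
```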
When the MIDI file is imported into the notation program, the hit points are also imported and incorporated onto the action line at the top of the score. Sibelius imports these hit points directly, whereas Finale requires the inexpensive plug-in TGTools. The hit points will later be imported into Avid Pro Tools when these same MIDI files are used to create the recording templates.
FIG. 3: Big time signatures are important for conductors who are sight-reading a score.
Because creating tight synchronization with the visuals often means changing the number of beats in each measure, or speeding up or slowing down some sections, film music can often have many tempo and meter changes. To assist the conductor in sight-reading the written score, it is often helpful to have large meter changes (see Fig. 3). Finale and Sibelius default to small time signatures, but with some tweaking, one can create custom layouts of time signatures to allow for quick reference of meter changes. It is also very helpful to have measure numbers listed on every measure of every score and part, and to include each cue's start time on the written score. Starts change often due to picture edits during the scoring process, and if the start is clearly stated on the written score, the technical staff can all be sure that their equipment is set to start at the right time.
After the scores have been created in Finale or Sibelius, the digital files are sent to a music librarian at the recording session who will supervise the printing, binding, organization, and delivery of the music to the musicians. The librarian is often hired by the music contractor, who also hires all the musicians for the recordings.
IT'S ABOUT TIME
The film scoring process often feels as if it's devoted to the management and wrangling of time: there is the start time of the musical cue in the film measured in SMPTE timecode (hours, minutes, seconds, frames); the running time of the cue; the bars, beats, and tempo of the music; the sample rate and frame rate of the audio and video; not to mention the short deadlines for finishing the work.
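Most of the SMPTE arithmetic involved comes down to frame counting. A minimal sketch, assuming non-drop-frame timecode counted at the nominal integer rate:

```python
def smpte_to_frames(tc, fps):
    """Convert an HH:MM:SS:FF timecode string to an absolute frame
    count. Assumes non-drop-frame counting, with `fps` the nominal
    integer rate (e.g. 24 for 23.976 material)."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# Length of a cue from its start and end timecodes, at 24 fps:
start = smpte_to_frames("01:02:10:12", 24)
end = smpte_to_frames("01:03:05:00", 24)
print(end - start)   # running time of the cue, in frames
```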
The music editor has a central role in managing all of these timing details. The role of the music editor can vary widely depending on the film and the composer, but there are a few responsibilities which most music editors share. The first is the spotting of the film, at which point the music editor, composer, director, and sometimes the producer, review the film and decide on basic start and stop times for each scene, as well as a general character and feel for the score.
FIG. 4: A typical SMPTE start-time sheet.
Once the general start times have been determined, the music editor can begin the temp music, which is the process of placing existing music cues into the soundtrack to create a temporary score. This is increasingly common on many films, as the movie studios often want to screen the film to select test audiences before recording the score, and music is needed to sell the emotion of each scene. As the music editor assembles the spotting notes and temp music, he or she begins to compile a list of the start times for the music cues. This list is often called a SMPTE start-time sheet and lists important technical information about each cue, such as the title, video reel, length, and start time (see Fig. 4). As the film is edited, this list must constantly be updated to ensure the start time of the music is correct with each new picture version. In addition, many music editors are also asked to conform the music to fit new picture edits by cutting sections out, changing meters or tempos, or requesting that the composer add new sections of music to go under added footage.
As the week of the recording session gets closer, the music editor will start to prep recording templates for Pro Tools. Although much of the composing process is done in Digital Performer, Logic, or a variety of other sequencers, the majority of top-notch, film-music recording studios work with Pro Tools. These templates can be made in Pro Tools LE as well as HD. The Pro Tools session templates need to have the proper frame rate and start time, the tempo and meter map, and any guide tracks such as mockups or other pre-records.
FIG. 5: Setting the session start time and frame rates in Pro Tools.
FRAME RATES AND SESSION START
Each cue in the film will have its own Pro Tools session. Pro Tools sessions should be created using the film standard of a 48kHz or, more commonly, 96kHz sample rate, 24-bit resolution, and the Broadcast WAV (BWF) file format. BWF files place a timestamp on the audio files, so that when files are later imported into new sessions at the mix, each file will automatically synchronize to the correct start. Once the session is created, the editor adds the Session Start Time and the picture frame rate. This is done in the Setup > Session menu (see Fig. 5).
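The BWF timestamp lives in the file's bext chunk as a TimeReference counted in samples since midnight. A quick sketch of how that value relates to a timecode start:

```python
# Sketch of the Broadcast WAV timestamp: the bext chunk's
# TimeReference is a sample count since midnight, which is how a file
# recorded at a given timecode can later drop into a new session at
# exactly the right spot.

SAMPLE_RATE = 48000  # 96000 is also common for film work

def time_reference(hh, mm, ss, ff, fps=24, rate=SAMPLE_RATE):
    """TimeReference (in samples) for a file starting at HH:MM:SS:FF."""
    seconds = hh * 3600 + mm * 60 + ss + ff / fps
    return round(seconds * rate)

# A file starting at 01:00:00:00 in a 48 kHz session:
print(time_reference(1, 0, 0, 0))   # -> 172800000
```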
The session start time is determined by the reel of video that is being used. Each video reel will start on an hour marking. For example, Reel 1 will start at 1:00:00:00 (1 hour, 0 minutes, 0 seconds, and 0 frames), Reel 2 at 2:00:00:00, and so on.
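That reel convention is trivial to express in code (a toy helper for illustration):

```python
def reel_start_timecode(reel):
    """Session start for a given reel: Reel N starts at N:00:00:00."""
    return f"{reel:02d}:00:00:00"

print(reel_start_timecode(1))   # -> "01:00:00:00"
print(reel_start_timecode(2))   # -> "02:00:00:00"
```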
FIG. 6: The Audio Rate pull-up/down menu in Pro Tools.
Next, the music editor will set the time-code rate. Most projects shot on HD video these days use the 23.976 frame rate, which has the 24 frames per second (fps) look of film, but accommodates the 29.97 specification for US television using a technique known as a 3:2 pull-down. While the film rate of 24fps is often used for projects shot entirely on film, the final mix, or dub, is often at 23.976fps. In these cases, the score recording, or more often the layback of the final mixes, will be done using a pull-down (see Fig. 6). This pull-down process will slightly slow down the audio to make it compatible with the different frame rate being used at the dub. If the score is recorded and mixed at the wrong frame rate, the dub stage can accommodate this mistake using sample rate conversion at the final mix, but this is not as desirable.
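The underlying ratio is exact: 23.976 is really 24 × 1000/1001, and a pull-down slows the audio by that same factor, about 0.1 percent:

```python
# The 23.976 rate is exactly 24 * 1000/1001; a pull-down slows the
# audio by the same ratio so it stays in sync at the dub.

PULL = 1000 / 1001

print(24 * PULL)          # -> 23.976023976...
print(48000 * PULL)       # pulled-down sample rate, ~47952.05 Hz
print((1 - PULL) * 100)   # speed change, ~0.1 percent
```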
CUE START TIME AND TEMPO/METER MAPS
Once the session start time has been created for the reel, the music editor needs to set the cue's start time and import the tempo/meter map. Because film music is usually recorded to a click track to ensure sync, the bar numbers in Pro Tools have to align with the bars in the score. This also allows the composer to do “pick-ups” at the recording session by starting a recording in the middle of a take. The easiest way to set the tempo and meter changes in the Pro Tools templates is to import a MIDI file from the music programmer or composer. Import the MIDI file and set it to the SMPTE start time for the cue; this will place all of the meter and tempo changes directly into the Pro Tools file. As with any technical process, it is a good idea to check the tempo and meter in Pro Tools against the written score to make sure no mistakes were made in the MIDI file import.
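That cross-check can be scripted: given the cue's tempo/meter map, compute where each bar falls in time and compare against the score and the Pro Tools timeline. A sketch using a hypothetical (bar, bpm, beats-per-bar) map format, not a real Pro Tools export:

```python
def bar_times(tempo_map, num_bars):
    """tempo_map: list of (bar, bpm, beats_per_bar) changes, sorted
    by bar and starting at bar 1. Returns seconds elapsed at the
    start of each bar, relative to the cue start."""
    changes = {bar: (bpm, beats) for bar, bpm, beats in tempo_map}
    bpm, beats = changes[1]
    times, t = {}, 0.0
    for bar in range(1, num_bars + 1):
        if bar in changes:
            bpm, beats = changes[bar]
        times[bar] = t
        t += beats * 60.0 / bpm   # duration of this bar
    return times

# 4/4 at 120 bpm, switching to 3/4 at 90 bpm in bar 3:
tm = bar_times([(1, 120, 4), (3, 90, 3)], 4)
print(tm[3])   # bars 1-2 at 120 bpm in 4/4 -> 4.0 seconds
```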
Once the template sessions have been created they can be delivered to the Pro Tools operator for the recording session, who will import the templates into his or her recording layout template, using File > Import Session Data.
Conducting is a skill that requires not only a great ear and deep musical knowledge, but also a lot of coordination. Cueing orchestra members, coaxing an emotional performance, listening for mistakes, and counting the beat of the music make for a difficult task. Film composers not only have to accomplish all of this, but they also have to synchronize the music to the film at the same time. To assist in this task, many conductors use visual cues such as punches and streamers.
Since the early days of film music, punches and streamers have been used to assist the conductor. Music editors in the pre-digital days would literally punch a hole in the film at designated intervals to flash the downbeat, providing a visual metronome: a bright flash would appear over the film to keep the conductor in time. For important musical hits, a vertical line, called a streamer, would slowly progress across the screen to indicate the entrance of the next musical section. This was accomplished by marking a diagonal scratch across the film stock. When the film was played back, this would manifest itself as a moving line across the screen.
In these modern days of video, these early film techniques have long been abandoned, and now computers provide these important visual cues for the conductor. One of the more beloved systems used by established film composers is the Auricle Time Processor. This is a DOS 3.0-based computer system developed by Richard Grant in 1983, who went on to receive an Academy Award in 1987 for his creation. This computer chases (or generates) the SMPTE timecode coming from the master recording computer, and generates MIDI messages that create the visual punches and streamers at designated times. Like much in the film-music recording process, proper Auricle setup relies on having the tempo/meter maps from MIDI files, the start times, and the frame rate settings. The Auricle operator converts the MIDI file into an Auricle file for each cue, and sets the start time and the frame rate. Based on the composer's or conductor's preferences, streamers, punches, flutters, and other visual indications are added in the Auricle program. During the recordings, an NTSC QuickTime video is played out of Pro Tools via FireWire and through a digital-to-analog converter, such as a Canopus ADVC, and then to the Auricle system. When the Auricle receives the start time in the SMPTE timecode, it starts generating the proper visual indications, all while following the tempo and meter of the MIDI file. The conductor can then watch the movie after it has passed through the Auricle with all of the proper visual overlays.
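The timing behind a streamer is simple: it must finish its travel exactly on the hit, so its start frame is the hit frame minus the travel time. A sketch with illustrative numbers (actual streamer lengths follow the conductor's preference):

```python
def streamer_start(hit_frame, seconds, fps=24):
    """Frame at which the streamer must begin its travel so the
    moving line reaches the edge of the screen exactly on the hit."""
    return hit_frame - round(seconds * fps)

# A 3-second streamer for a hit at frame 90840 (01:03:05:00 at 24 fps):
print(streamer_start(90840, 3.0))   # -> 90768
```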
MOTU Digital Performer has also implemented visual streamers and punches in its software and can output these visuals directly onto the QuickTime movie. Figure 53 also makes a software solution for visual cues called Streamers.
Although the technology for creating mockups has been available since the 1960s, delivering mockups has only been common practice for the past 10 to 15 years. Composers used to play demos on solo piano to illustrate the musical themes, or they'd use small instrumental ensembles. These days emulations of entire orchestras can be convincingly created using a computer. In addition, many composers now compose using a sequencer, and use the mockup as a composition tool in the process. As the quality of mockups has improved over the past few years with advances in sampling technology and sequencer proficiency, many film scores include the pre-recorded electronic elements in conjunction with the live orchestra to make a fuller, more modern-sounding score. This work happens in the MIDI domain, but once the arrangement is completed, the electronic instruments should be rendered as multitrack audio files. Before the MIDI is converted to audio, the music programmer will want to confirm that the orchestra is tuning to A440 (many European orchestras prefer to tune to A442 or even A444). Almost all modern software samplers allow the user to pitch the samples up or down a few cents to match the live orchestra.
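The retuning amount is a simple logarithm; for an orchestra at A442, the offset works out to just under 8 cents:

```python
import math

def cents_offset(target_hz, reference_hz=440.0):
    """Cents to retune samples recorded at A440 so they match an
    orchestra tuning to `target_hz`."""
    return 1200 * math.log2(target_hz / reference_hz)

print(round(cents_offset(442), 2))   # A442: about +7.85 cents
print(round(cents_offset(444), 2))   # A444: about +15.67 cents
```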
When recording the MIDI into audio tracks in the sequencer, it's helpful to give the mix engineer as many options as possible. Some music programmers provide the multitrack stems based on the orchestral section—strings, percussion, brass, synths, guitars, and so on. If the stems can be broken down further into instrument sections (violin I, violin II, horns, trumpets, etc.), it provides the recording engineer with greater control over the mix. How the samples are routed to the audio tracks depends on the music programmer's setup, and will be determined by which samplers are used: a host-based sampler such as Native Instruments Kontakt, an inter-application sampler running over ReWire or Vienna Ensemble Pro, or an external sampler running on another computer.
Recording engineers usually prefer to have the samples printed dry, without any effects processing. If the processing is a key element to the sound, such as an amp simulator, pattern generator, or audio mangler, then the samples should be printed after passing through the effects plugin. If desired, the music programmer can also create separate audio tracks of only the reverb.
There are a growing number of European studios and orchestras that cater to film-music recording and provide all of the technical and music services required. One such orchestra, known as the F.A.M.E.S orchestra and based in Macedonia, provides a large ensemble with conductor, recording and mix engineers, and music preparation services.
With the use of remote recording technology such as Source Elements' Source Connect audio plug-in, the composer can even attend the recording sessions from his or her own studio by listening to the live playback in MP3 quality over the Web. With the addition of a talkback microphone, the composer can lead the recording sessions just as if he or she were in the actual control room.
James Sizemore is a professional composer and producer in NY, and has been the Scoring Technical Director for recent films The Twilight Saga: Eclipse and The Edge of Darkness.