


In music, as in life, timing is everything, and the technology of synchronization has evolved to address the persistent timing issues that arise from our increasingly complex desktop music systems. If you have more than one piece of gear in your studio, you probably need to start thinking about synchronization. As your studio becomes more complex, you must get a better understanding of how timing information is shared among devices and what level of timing accuracy is appropriate for your needs.

As you'll see, accurate timing is a context-sensitive notion that begins with the first rule of synchronization: no two clocks are identical, no matter how expensive they may be. World-class studios have to synchronize all of their equipment to a single clock to prevent things from drifting apart, and, to one degree or another, so do you.


The second rule is that accurate synchronization depends on all devices getting their timing information from a single master clock. Of course, this means that all of your other gear must be able to slave to an external clock, usually through a switch or a software checkbox labeled something like "Internal/External Clock." Ideally, you would use the most accurate clock in your studio as the master clock; if you have another device that functions only as a timing master, your options are more limited.

Timing information is carried from device to device by a signal known as time code. There are several different types of time code (see the sidebar "Syncspeak"), all of which are commonly lumped together under the term SMPTE time code, or just SMPTE. SMPTE divides the time line into hours, minutes, seconds, and frames in the format HH:MM:SS:FF. (The term frames refers to frames of film and reveals the origin of the standard.) A second is typically divided into a number of frames ranging from 24 to 30, depending on the film or video format being used.
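The HH:MM:SS:FF format is easy to compute. Here is a minimal sketch, assuming a whole-number frame rate (it ignores the drop-frame counting used with 29.97 fps video); the helper name `to_smpte` is made up for illustration:

```python
# Hypothetical helper: convert elapsed time in seconds to an
# HH:MM:SS:FF SMPTE-style string at a given integer frame rate.
# Drop-frame counting (used at 29.97 fps) is deliberately ignored.
def to_smpte(elapsed_seconds, fps=30):
    total_frames = int(round(elapsed_seconds * fps))
    frames = total_frames % fps
    total_seconds = total_frames // fps
    seconds = total_seconds % 60
    minutes = (total_seconds // 60) % 60
    hours = total_seconds // 3600
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

# 1 hour, 2 minutes, 3.5 seconds at 30 fps:
print(to_smpte(3723.5))  # 01:02:03:15
```

Half a second at 30 fps lands on frame 15, which is why the example above ends in ":15".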

Simply put, proper synchronization depends on conveying both location and rate. Without location information, two multitrack recorders can't start playback from the same point; without rate information, they will gradually drift apart from each other.

With analog tape machines, synchronization is accomplished by adjusting the speed of the drive motors to counteract any drift from the SMPTE time line. With digital devices, this coordination of speeds is achieved with word clock. A word-clock signal cues every digital audio device in the system to record, play back, or transfer each sample at the same time. Variations in the timing of the master device are precisely duplicated in every slave device.


Horticulturists will tell you that a weed is just the right plant in the wrong place. The same could be said of a wrong note. The question is, how far out of place does a note have to be before it becomes "wrong"? If members of an orchestral violin section sneak into a quiet note over the span of a quarter of a second, the note will blossom beautifully. However, that same quarter-second discrepancy between a trumpet and a tenor sax can turn an intricate bebop line into a chaotic echofest.

Synchronization technology enables us to guarantee accuracy in terms of picoseconds (trillionths of a second), but when does that kind of precision matter? It certainly doesn't in a typical multitrack session. Part of what makes live musicians sound alive is the subtle interaction of their minor imperfections. That's one reason why quantized MIDI sequences often sound overly mechanical: the parts line up too precisely. To counteract this excess precision, most sequencing software now features algorithms for "humanizing" (randomizing) a sequence and offers degrees of quantization for fixing only the most egregious rhythmic errors.

The more definite a sound's attack, the more it suffers from timing errors. If you record two drummers playing the same part, timing discrepancies introduced during the recording or playback process will be more obvious than with vocals or strings. To see for yourself just how forgiving our ears are of timing "errors" within an ensemble, try this experiment. Open your sequencer and record a simple drum or percussion part, such as a scale exercise on a marimba patch. Now copy it into another track and assign the new track to a xylophone patch. (Be sure to assign this track to a different MIDI channel.) Both parts should play back in perfect unison, as if the two timbres were layered at the patch level.

Now slide one part a tick or two later and see what happens. The results, of course, will depend on the tempo and the MIDI resolution you're using; but after a couple of ticks, you'll start to hear some "flamming" of each attack. As you slide the parts further away from each other, you'll hear each one more and more distinctly from the other, until at some point they cross the line and end up just sounding sloppy. Now change both parts to string patches and see how much further apart you can slide them before they sound wrong. Your ears will probably forgive about twice the discrepancy between string parts as between percussion parts.
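How much real time a "tick or two" represents depends on tempo and resolution, as noted above. A quick sketch of the arithmetic, using illustrative values (120 bpm and 480 PPQN are common defaults, but check your own sequencer's settings):

```python
# How many milliseconds one sequencer tick represents, given a
# tempo in beats per minute and a resolution in PPQN
# (pulses per quarter note). Values below are illustrative.
def ms_per_tick(bpm, ppqn):
    ms_per_quarter = 60_000 / bpm  # one quarter note in ms
    return ms_per_quarter / ppqn

# At 120 bpm and 480 PPQN, one tick is about 1.04 ms,
# so a two-tick offset is roughly 2 ms.
print(round(ms_per_tick(120, 480), 2))  # 1.04
```

At a coarser resolution such as 96 PPQN, a single tick is already more than 5 ms at the same tempo, which is why the experiment's results vary with your sequencer's settings.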

If you're trying to sync a digital audio sequencer to your MDM for typical multitrack sequencing and recording, then perhaps you don't require the highest-resolution synchronization hardware on the market. In fact, countless major recordings from the past decade have relied on MIDI Time Code (MTC) for this sort of arrangement. MTC is accurate to a quarter of a frame, which translates to a maximum error of about 8 milliseconds (see Fig. 1). That may sound like a lot, but consider two things: first, the typical margin of error is significantly less; and second, the errors in subsequent overdubs aren't cumulative.
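The quarter-frame figure is easy to verify. Assuming a 30 fps frame rate (the worst case, 24 fps, gives a slightly larger number):

```python
# Back-of-the-envelope check on MTC's quarter-frame resolution:
# at 30 fps, one frame lasts 1000/30 ms, so a quarter frame
# is about 8.3 ms of maximum positional error.
frame_ms = 1000 / 30          # ~33.3 ms per frame
quarter_frame_ms = frame_ms / 4
print(round(quarter_frame_ms, 1))  # 8.3
```

At 24 fps the same calculation yields about 10.4 ms, so "about 8 milliseconds" holds for the frame rates used in music production.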

Why does synchronization matter at all? After all, once an audio file is in your computer, you're going to drag it where you want it anyway. In fact, if you mix all of the audio in your computer, sync doesn't matter. But if you want to lay edited audio back to your MDM, you'll want to return it to where it came from with reasonable precision, and MTC does this adequately under many circumstances. (It pays to be resourceful, though. Once, caught without a viable sync arrangement, I re-recorded an edit back to tape, on the fly, and then nudged it into place with the recorder's track offset.)


Lest you think that synchronization is all hype, consider the task of fixing a digital glitch in the left channel of a stereo mix. You dump the single track from tape into your audio-editing program, clean up the problem using the Pencil tool, and lay the track back to tape. The result is chorusing, flanging, phase cancellation, you name it. Stereo pairs are extremely unforgiving of timing errors between channels.

If you open any stereo mix in your editor and drag one channel more than a few samples in either direction, the discrepancy is immediately apparent. Do a little math, and it's easy to see why. One complete cycle of a waveform at 440 Hz (the standard tuning A) takes about 100 samples at a 44.1 kHz sample rate, or 2.27 milliseconds. Drag the wave 50 samples in either direction and you've achieved complete phase cancellation (see Fig. 2). It takes only a fraction-of-a-millisecond discrepancy to create pronounced chorusing.
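The arithmetic above can be checked in a few lines, assuming the 44.1 kHz sample rate implied by the figures in the text:

```python
# Verifying the 440 Hz phase-cancellation arithmetic,
# assuming a 44.1 kHz sample rate.
sample_rate = 44_100            # samples per second
freq = 440.0                    # standard tuning A, in Hz
samples_per_cycle = sample_rate / freq  # ~100.2 samples
period_ms = 1000 / freq                 # ~2.27 ms per cycle
half_cycle = samples_per_cycle / 2      # ~50 samples: a half-cycle
                                        # shift means full cancellation
print(round(samples_per_cycle, 1),      # 100.2
      round(period_ms, 2),              # 2.27
      round(half_cycle, 1))             # 50.1
```

The same half-cycle logic explains the chorusing: shifts much smaller than 50 samples still put many partials of a complex mix partly out of phase.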

If you need to work within such tight tolerances, you'll want the ability to sync your gear to within at least a couple of samples. With the right combination of gear, you can even get sample-accurate synchronization. This is the purpose of ADAT's proprietary 9-pin sync connection: it enables you to achieve single-sample positioning accuracy. Whereas older DA-88s need a separate connection for sample-accurate sync, the TDIF connection carries all the required data.

Even lacking sample-accurate sync, you can move tracks back and forth without creating phase problems if you're resourceful. Using the aforementioned example, you could transfer both channels at once (instead of just the problematic left channel) to the computer for editing. They would arrive still in phase, and if you didn't change their relative position they would stay in phase when you transferred them together back to tape. They might end up offset by a few samples from their original position on tape, but if you choose your edit points according to phrase structure, this discrepancy shouldn't be noticeable. If your audio interface has enough inputs and outputs, you can even transfer an entire multimicrophone set of drum tracks at once.


Synchronization doesn't have to be a nightmare, and it doesn't necessarily require expensive gear. When you understand the timing issues of different musical contexts, keeping everything in sync is not that difficult. Ultimately, it boils down to a few points:

1. Use your most accurate clock source as the master timing reference.

2. Slave all other devices to the master clock.

3. Use sample-accurate sync to prevent phase problems between similar audio content.

4. Don't be afraid to rely on MTC in most situations.

5. Be resourceful; cheat whenever necessary.

Above all, use your ears. If it sounds right, it is right. Remember, it's all in the timing.

Brian Smithers is searching for a master clock to sync his work/sleep cycle with Earth's day/night cycle. Contact him through his Web site,

black burst Synonymous with house sync, its name derives from the fact that it is a video signal with no picture, which would yield a black screen if displayed. Also known as video sync.

frame rate The number of frames of film or video displayed per second. Film runs at 24 fps, video at 29.97 fps. Most music-only production is done at 30 fps.

house sync A video signal used as a master timing reference for video and audio devices. Like word clock, it conveys rate information but not location information.

LTC Longitudinal Time Code, the most common form of SMPTE in audio applications. The time code is converted into a modulated audio tone similar to modem noise and then striped (recorded) to one track of an audiotape recorder. The playback of that tone is subsequently read by a synchronizer, which controls the speed of slave devices.

MTC MIDI Time Code, a form of SMPTE that can be transmitted over MIDI connections.

SMPTE time code The now-ubiquitous format of hours:minutes:seconds:frames adopted by the Society of Motion Picture and Television Engineers for conveying location and timing information to video and audio devices. Also known simply as SMPTE.

Superclock Digidesign's version of word clock that runs at 256 times the sample rate for extra precision.

VITC Vertical Interval Time Code, a form of SMPTE commonly used in video applications. A video frame is drawn in two interlaced passes of the cathode ray gun. The point at which the gun resets itself from the bottom corner to start over at the top corner is called the vertical blanking interval, and time-code information is inserted at this point. The time-code display window on a video screen is derived from VITC.

word clock The signal that defines the precise timing by which each sample is recorded, played, or transferred. Unlike time code, word clock doesn't carry location information.
