Maximum Conversion

Many people regard A/D converters simply as boxes that translate analog voltages into digital 1s and 0s, changing nothing else in the process. But every A/D also has analog front-end circuitry that affects the signal levels going into the digital domain in much the same way that a mic preamp affects levels going to tape. That front-end circuitry can be calibrated to either boost or attenuate the analog signal before it's converted to digital, in effect raising or lowering the sensitivity of the converter.

In a perfect — that is, imaginary — world, the analog-circuitry calibration for all A/Ds would be permanently set at the factory to always deliver a 0 dBFS (0 dB Full Scale) meter reading in the digital domain, regardless of the signal you fed it. But in the real world, your A/D's calibration should routinely be readjusted to accommodate the varying levels that tracking and mixing applications present, so that robust input levels and optimal signal-to-noise ratios are always attained at the converter. Sure, you can get by with a set-it-and-forget-it approach, but your recordings might suffer audible signal-to-noise degradation if you do.

For example, say your A/D's calibration is set at too low a level (thus lowering the converter's sensitivity). In that case, you might not be able to get enough gain out of a typical mic preamp to attain full-scale levels (0 dBFS) at the A/D when tracking, say, delicately picked acoustic guitar or soft-spoken voice-overs. Or you might have to crank your preamp to the point at which its gain-boost circuitry is maxed out and thus not operating optimally. In either case, the noise floor and apparent resolution of your recordings will suffer when recording such quiet sources.

Conversely, if your A/D's calibration is set at too high a level (thus increasing the converter's sensitivity), superhot signal feeds from an analog mixing console during mixdown will probably clip the A/D's input, causing ugly-sounding digital “overs” (levels exceeding 0 dBFS on the analog side of things). You could lower your mixer's stereo-bus output levels to avoid that fate, but why pull levels down at the board's output only to jack them up again at the A/D's analog front end? Such poor gain-structure practices can sabotage your signal-to-noise ratio and therefore the resolution of your recordings. And if you use a digital console's analog outputs when mixing (to insert analog gear before the A/D, for instance), you're faced with the same dilemma. You don't want to be forced into lowering your digital console's (or analog outboard gear's) output levels to accommodate a rip-snortin' A/D front end. Peak console-output levels that are well below 0 dBFS are a recipe for veiled, flat-sounding (as opposed to round-sounding) mixes.

To avoid such booby traps and capture the highest-quality recordings, you'll probably need to recalibrate your A/D (depending on the ancillary gear you use with your converter) when switching from tracking to mixing and vice versa. This article will guide you through the procedures for calibrating A/Ds and will make suggestions for ballpark settings appropriate to each application. Because most outboard A/D converters are optimized to integrate with professional levels and balanced signals, I'll limit my discussion of calibration settings to those pertinent to working with +4 dBu-nominal gear.

Calibrating an A/D is actually a quick and simple process, once you understand how analog levels relate to digital-meter readings. I'll make that relationship crystal clear in a moment, but first let's look at the calibration controls found on currently available A/D converters.


Most A/D converters offer some means of calibrating their analog inputs, typically supplying an independent calibration control for each channel of A/D conversion. Assuming that the model A/D you own can be calibrated, the controls for adjusting the settings are usually in the form of either a trim pot (adjustable with a small screwdriver); a dual- or multiposition switch, such as that found on the Benchmark AD2402-96 (see Fig. 1); or continuously variable gain-control knobs, such as those found on the Lucid AD9624 (see Fig. 2). Some units, such as the Sonifex Redbox (see Fig. 3), use a combination of switches and level trims for setting levels. Check your unit's owner's manual for the type of calibration controls offered and their location on the unit.

Both trim pots and continuously variable gain-control knobs are generally preferable to switches for calibration purposes, as they offer infinite adjustment resolution within their range. Obviously, trim pots are more of a hassle to adjust than gain-control knobs because you have to deliberately turn them with a tiny screwdriver. Then again, that makes them less apt to get changed unintentionally, so the settings are pretty much locked in. Switches constrain you to a limited choice of preset calibration levels. And though switches do lock in your calibration levels, analog circuitry has a nasty habit of “drifting” over time, raising the possibility that calibration levels for different channels could eventually become mismatched.


The procedure for calibrating an A/D converter is to route a sine-wave generator's output to the analog input(s) of your converter and then tweak the A/D's calibration controls to achieve the desired level on the converter's digital meters. (More on that digital-meter reading, called a reference level, in a moment.) Just about any sine-wave generator (also known as an oscillator) that can output a 1 kHz tone at +4 dBu will work for calibrating an A/D, whether it's an outboard unit or one that's built in to a mixing console. The oscillator's frequency should be set to 1 kHz because 1 kHz is smack in the middle of the audible frequency range and thus provides a good proxy for average levels that the converter is likely to encounter when presented with broadband material. The 1 kHz tone should be output at +4 dBu because that is the standard for all pro-level gear that the A/D is likely to interface with in your studio. In fact, most converter manufacturers suggest you use a +4 dBu signal to calibrate their A/D converters.

How do you know when your oscillator is outputting +4 dBu? If you're using an outboard unit, it should have an output level knob and an accompanying readout that tells you when the level is at +4 dBu. If you're using a pro-level (+4 dBu-nominal) analog mixer with an onboard oscillator, 0 VU should be equivalent to +4 dBu level. Route the generator's output to your console's master L/R outputs and adjust the generator's output level for a 0 VU reading on your console's L/R meters.

Adjusting the sine-wave generator on a digital console for +4 dBu output takes a bit more thought. Your owner's manual should give the level the master L/R digital meters should be in dBFS for an equivalent +4 dBu output level at your analog (+4 dBu-nominal) stereo-bus outputs. For example, many consoles are set up to output +4 dBu at their analog stereo-bus outputs when their L/R digital meters read -18 or -20 dBFS (18 or 20 dB below a 0 dB full-scale reading).

But what if the owner's manual for your digital console doesn't specify an equivalent +4 dBu level in dBFS? In that case, you can determine yourself which digital-meter reading on the console equates to +4 dBu by checking the specifications table in the owner's manual to see what the maximum output level is for the console's analog (+4 dBu) L/R outputs. For example, the Yamaha 02RV2's maximum output level for its analog L/R outputs is stated to be +24 dBu. Knowing that, we can surmise that +4 dBu is 20 dB down from 0 dBFS on the 02RV2's meters (that is, +24 dBu minus 20 dB equals +4 dBu). Put another way, +4 dBu equals -20 dBFS on the 02RV2's L/R meters, and +24 dBu equals 0 dBFS on those same meters. So, when I want to send the 02RV2's internal oscillator's 1 kHz tone out the console's balanced analog outputs at +4 dBu level, all I need to do is adjust the generator's output level for a -20 dBFS reading on the console's L/R meters.
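The 02RV2 arithmetic generalizes to any console whose maximum analog output level is published: in decibels, the conversion is a simple subtraction. Here's a minimal Python sketch of that calculation (the function name is my own illustration; the +24 dBu figure is the 02RV2 spec quoted above):

```python
def dbfs_for_dbu(level_dbu, max_output_dbu):
    """Return the dBFS meter reading that corresponds to a given analog
    level, given the analog level (in dBu) that produces a full-scale
    (0 dBFS) reading. In dB terms this is simple subtraction."""
    return level_dbu - max_output_dbu

# Yamaha 02RV2: maximum analog L/R output is specified as +24 dBu,
# so a +4 dBu tone should read -20 dBFS on the console's meters.
print(dbfs_for_dbu(4.0, 24.0))  # -20.0
```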


Once you've set the oscillator's 1 kHz tone to output a +4 dBu level, route the signal to your A/D converter, making sure that no other equipment that could alter the signal's level is placed between the oscillator source and converter. (Rather than remove any gain-altering gear from the signal path, you could simply switch the gear to hardwire bypass, assuming that there is one.) With a +4 dBu tone now routed to an analog input on your A/D converter, it's time to adjust the input's calibration control to attain the desired reference level on the converter's digital meters.

I generally like to calibrate my Apogee Rosetta 96 A/D converter (see Fig. 4) to a reference level of -14 dBFS when tracking with outboard mic preamps capable of delivering 60 dB of gain maximum. In plain English, that means I adjust the A/D's respective trim pots for each channel clockwise or counterclockwise as needed until a +4 dBu oscillator tone at the Rosetta's inputs results in a reading of -14 dBFS on the Rosetta's meters. Your needs may vary. You may, in fact, find that a reference level of -12, or even -10 dBFS, works better for recording ultraquiet sources in which an additional 2 or 4 dB of gain is needed at the A/D's inputs. (This would especially be the case when recording with dynamic or other mics with very low output.) With my Rosetta calibrated such that +4 dBu equals -14 dBFS, my mic pre needs to deliver only +18 dBu output to get a 0 dBFS reading on my Rosetta's meters. (That is, 14 dB plus -14 dBFS equals 0 dBFS. Add the same amount, 14 dB, to +4 dBu to arrive at +18 dBu as the analog equivalent of 0 dBFS at this particular calibration setting.)
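Working that same arithmetic in the other direction tells you how hot your preamp must run to hit full scale at a given calibration. A short Python sketch of the calculation (the helper name is mine, not from any converter's documentation):

```python
def analog_level_at_full_scale(ref_dbu, ref_dbfs):
    """Analog level (in dBu) that produces a 0 dBFS reading, given that
    a reference tone of ref_dbu reads ref_dbfs on the converter's
    meters. The gap between ref_dbfs and 0 dBFS is added to ref_dbu."""
    return ref_dbu - ref_dbfs

# Rosetta calibrated so that a +4 dBu tone reads -14 dBFS:
print(analog_level_at_full_scale(4.0, -14.0))  # 18.0 -> +18 dBu hits 0 dBFS
```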

Of course, if your A/D converter offers only switched calibration presets, you'll need to choose the setting with the value that's closest to your desired reference level (assuming your preferred level setting is not provided). Also, you should check to see if the converter has a special calibration mode for the A/D's meters, as this will typically shrink the range of the meters down to where each meter segment represents a 1 dB step up or down in level from adjacent segments, allowing for more exact adjustments. The Apogee Rosetta's calibration mode goes one step further, using a combination of meter LEDs and Over indicators to guide you to calibrate your reference level to within ±0.1 dB tolerance (which comes in handy for exacting mastering applications).

Some A/D converters lack meter increments that are fine enough to accurately set a +4 dBu reference level. Don't fret. There is an alternative method for calibrating such ill-equipped A/Ds. Instead of using +4 dBu as your analog reference level, calculate which analog level would be needed to deliver 0 dBFS and use that level for your oscillator's output to align your A/D to a full-scale reading. For example, I stated previously that my mic pre needs to deliver +18 dBu to get a 0 dBFS reading on my Rosetta when the Rosetta is calibrated to a -14 dBFS reference level. So it follows that, rather than feed my A/D a +4 dBu-level oscillator tone, I can accomplish the same thing by feeding it a +18 dBu signal and calibrating my Rosetta (or any other model A/D) so that the meters read 0 dBFS.

Actually, the Rosetta's meters can show only -0.5 dBFS or Over (a level exceeding 0 dBFS) at the very top of their range. To obtain an exact 0 dBFS reading on a given Rosetta channel, I simply turn the channel's calibration pot clockwise until the corresponding Over LED lights up, and then back off the pot barely enough to make the Over LED go out when I clear the meters. (The Rosetta has peak-hold metering, so you must clear the meters to make the Over LEDs go out and show current levels.)

I've discussed calibration techniques in general and ballpark reference levels for tracking in particular. Next I'll discuss calibrating A/D converters for mixing applications.


Depending on how hot your mixes are, the reference level you calibrate your A/D to at mixdown might need to be set anywhere from -24 dBFS to -14 dBFS. Provided that no gain-altering signal processors (compressors, for example) are placed between the mixer and A/D converter, the mixer's maximum analog output level will dictate which reference level you should calibrate the A/D to. That is true regardless of whether you mix on an analog or digital console.

For example, the Mackie Analog 8-Bus Series analog consoles are specified to deliver a maximum balanced output level of +28 dBu. You want your A/D to produce a 0 dBFS reading when the Mackie 8-Bus console is cranking out +28 dBu, so that both console and converter top out at the exact same level. Therefore, for basic mixdown applications, calibrate the A/D so that 0 VU (+4 dBu) on the Mackie 8-Bus registers as -24 dBFS on the converter's meters. That gives you 24 dB of headroom (above +4 dBu nominal) on both the console and converter before they distort.

As another example, my 02RV2 dishes out +26 dBu at its balanced analog stereo outputs when the console's meters read 0 dBFS (that is in spite of the fact that the mixer is specified to deliver only +24 dBu maximum output levels). Since +26 dBu is 22 dB hotter than +4 dBu, I would need to calibrate my A/D converter to a reference level of -22 dBFS when mixing with my 02RV2 and unity-gain outboard gear.

Yamaha's new 02R96 console, on the other hand, is currently calibrated at the factory to output only +18 dBu, which is the European standard. (According to Yamaha Corporation of America, the company is working with the main headquarters in Japan to change the 02R96's calibration so that the board outputs a maximum +24 dBu, which is what American engineers are used to working with.) Because +18 dBu is only 14 dB hotter than +4 dBu, you would need to calibrate your A/D to -14 dBFS when feeding it signal from the 02R96's balanced, stereo analog outputs and any unity-gain analog outboard gear placed between the mixer and A/D converter. For mixing on most consoles issued in the United States, however, you would need to set your calibration levels considerably lower (typically to -20 dBFS or lower) to accommodate the hot output levels typically characteristic of mixdowns.
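All three mixdown examples follow the same rule: subtract the console's maximum analog output level from +4 dBu to get the reference level in dBFS. A quick Python sketch (the console figures are the ones quoted above; the helper name is my own):

```python
def mixdown_reference_dbfs(console_max_dbu, nominal_dbu=4.0):
    """dBFS reading a nominal-level (+4 dBu) tone should produce so that
    the console's clipping point and the converter's 0 dBFS coincide."""
    return nominal_dbu - console_max_dbu

# Maximum balanced output levels cited in the text:
consoles = [("Mackie 8-Bus", 28.0), ("02RV2 (measured)", 26.0), ("02R96", 18.0)]
for name, max_out in consoles:
    print(f"{name}: calibrate so +4 dBu reads {mixdown_reference_dbfs(max_out):+.0f} dBFS")
```

This prints -24, -22, and -14 dBFS respectively, matching the reference levels worked out above.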


Unless you're using a mic preamp that can deliver 65 to 70 dB of gain, basic tracking applications generally call for calibrating your A/D to a higher reference level than that used for mixing. No matter whether you're tracking or mixing, however, always make sure you adjust the controls for each channel to the same exact reference level when calibrating two channels of A/D for use on stereo program material. Otherwise, your stereo image may become skewed.

I've covered only the basics of calibrating A/D converters in this article. The procedures and optimal reference levels change as soon as you start using analog dynamics processors and other gain-altering gear with an A/D converter. Moreover, certain creative mixing and mastering applications suggest alternative calibration methods — but those will have to wait for another article. Until then, I hope you'll use the techniques discussed here to make higher-quality recordings. The sonic rewards will be well worth your efforts.

EM contributing editor Michael Cooper is the owner of Michael Cooper Recording, located in beautiful Sisters, Oregon.


Apogee Electronics Corporation
tel. (310) 915-1000

Benchmark Media Systems, Inc./Sonic Sense, Inc. (distributor)
tel. (800) 262-4675 or (315) 437-6300

tel. (425) 742-1518

Mackie Designs
tel. (800) 898-3211 or (425) 487-4333

Sonifex, Ltd./Independent Audio (distributor)
tel. (207) 773-2424

Yamaha Corporation of America
tel. (714) 522-9011