6/8/2012 9:22 AM
Now that computers are becoming more powerful, are people starting to record at 96kHz/24-bit, or even 192kHz instead of 44.1kHz/16-bit? Does the resolution of the DAW affect this? In other words, is there any value to 32-bit floating point or 64-bit audio engines if you’re just recording with 24-bit converters? —Phil Hassenger
The resolution at which you record and your DAW’s internal resolution have different ramifications. Most people can hear a definite sonic improvement when recording at 24 bits compared to 16 bits, but this also depends on your converters. With a 16-bit converter, you’re probably getting more like 14 “real” bits, and with a 24-bit converter, 20 “real” bits, because the lowest bits toggle randomly with converter noise, and there are also noise-floor issues. Recording 16-bit files with 24-bit converters sounds much better than it did back in the days of 16-bit converters, but most engineers would agree that recording 24-bit files is preferred.
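The “real bits” distinction matters because each bit is worth about 6 dB of dynamic range. As a rough illustration (using the standard 20·log10(2^N) rule of thumb for an ideal converter, ignoring dither and converter noise), here is what those bit depths work out to:

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of an ideal N-bit converter: 20*log10(2^N)."""
    return 20 * math.log10(2 ** bits)

for bits in (14, 16, 20, 24):
    print(f"{bits}-bit: {dynamic_range_db(bits):.1f} dB")
# 16 bits works out to about 96 dB, 24 bits to about 144 dB
```

So even a 24-bit converter delivering only 20 “real” bits still gives roughly 120 dB of usable dynamic range, comfortably beyond 16-bit’s theoretical 96 dB.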
As for sample rate, few listeners can tell the difference between material recorded at 44.1 and 96kHz. For broadcast or video work, 48kHz is the standard sample rate, and some engineers swear that recording at 88.2kHz is superior to 44.1kHz—but there’s no definitive, “one-size-fits-all” answer. Besides, some interfaces don’t even offer 88.2kHz or sample rates higher than 96kHz because they aren’t commonly used.
Also note that working at higher sample rates may have disadvantages; you can’t stream as many tracks through a USB or FireWire interface, and some plug-ins don’t handle 96kHz well. Recording at 44.1/24 or 48/24 is fine for most people.
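The track-count tradeoff is simple arithmetic: uncompressed audio bandwidth is sample rate × bit depth × track count. A back-of-envelope sketch (ignoring protocol overhead, which in practice reduces usable bandwidth further):

```python
def data_rate_mbps(sample_rate_hz: int, bit_depth: int, tracks: int) -> float:
    """Raw uncompressed audio data rate in megabits per second."""
    return sample_rate_hz * bit_depth * tracks / 1e6

# Doubling the sample rate doubles the data every track must stream:
print(data_rate_mbps(44_100, 24, 16))  # 16 tracks at 44.1kHz/24-bit, ~16.9 Mbit/s
print(data_rate_mbps(96_000, 24, 16))  # same 16 tracks at 96kHz, ~36.9 Mbit/s
```

Over a fixed-bandwidth bus, moving from 44.1kHz to 96kHz cuts the achievable track count roughly in half.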
The DAW’s internal audio engine resolution affects the accuracy of calculations for internal processing, so you want at least 32-bit floating point to minimize the round-off errors that accumulate over multiple calculations. There’s no significant penalty for running DAWs at a high internal resolution (e.g., 64-bit), so unless you encounter problems, use the highest available resolution.—The Editors
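To see why internal resolution matters independently of your converters, here is a toy illustration (not how any particular DAW works): re-quantizing a sample to 16-bit steps after a gain change, as old fixed-point mixers effectively did, versus carrying the same gain moves in floating point.

```python
def quant16(x: float) -> float:
    """Round a sample to the nearest 16-bit step."""
    return round(x * 32767) / 32767

sample = 12345 / 32767            # a legal 16-bit sample value

# Fixed-point path: attenuate ~30 dB (divide by 32), re-quantize, boost back up.
fixed = quant16(quant16(sample / 32) * 32)

# Floating-point path: the same gain moves with no intermediate quantization.
flt = (sample / 32) * 32

print(abs(fixed - sample) * 32767)  # several 16-bit steps of error
print(abs(flt - sample))            # 0.0 — powers of two are exact in float
```

The fixed-point version permanently loses the low bits on the way down and can’t get them back on the way up; the float version round-trips losslessly. Chain dozens of plug-ins and gain stages together and the argument for a 32-bit float (or 64-bit) engine follows directly.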