THE CASE FOR MIDI IN THE 21ST CENTURY


Fig. 1. The MIDI Devices tab has been selected in Apple’s Audio MIDI Setup utility. Line 6’s
KB37 and M-Audio’s Axiom 49 have been set up as MIDI devices, and their “properties” windows
have been opened. Here, you can further edit the devices’ functionality.

FROM AUTOMATION TO SONGWRITING, THIS DECADES-OLD PROTOCOL STILL ROCKS

“MIDI—hey, wasn’t that the thing that happened, like, more than 25 years ago where a bunch of manufacturers thought that maybe computers were going to be the Next Big Thing in music? Wasn’t it kind of slow and stupid, but people got behind it because it was cheap? Yeah, something like that. Well, now we have digital audio, so we’ll just file MIDI under ‘interesting historical footnotes,’ and. . . .”

Hold on there. MIDI is alive, well, and a vital part of what we do with recording—whether you know it or not. Sure, the 5-pin DIN connector isn’t king of the MIDI hill any more; the data is more likely to fly over USB, and inside your computer. But it’s controlling your virtual instruments, big chunks of your automation, and letting your control surfaces talk to your computer—and that’s not all MIDI can do by a long shot. Here are some tips and techniques you'll find essential for deep dives into the ever-evolving world of MIDI.

MIDI VS. DIGITAL AUDIO

Fig. 2. In Presonus Studio One Pro, the Options menu has a tab for External Devices, where you can select your main MIDI ports from a dropdown list that shows all available MIDI devices.

MIDI isn’t sound; MIDI is a computer language that consists of commands and data. For example, a command that emanates from a MIDI keyboard might be “play a note,” with data that specifies the note pitch and the dynamics with which you played the note. Or the command might be, “change a mixer channel’s level,” with data that expresses the level of a control surface fader. Or the command might even be, “notify the guy holding this smart phone that someone’s trying to call him,” at which point MIDI triggers a ring tone. There are even MIDI commands for stage lighting, machine control, and my personal favorite—pyrotechnics. Why play a recording of the cannon in the 1812 Overture when you can trigger the real thing from a MIDI footswitch?
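Under the hood, each of those commands is just a short packet of bytes, as defined by the MIDI 1.0 specification. Here’s a minimal sketch in Python (the helper name is illustrative, not part of any real API) showing how a “play a note” command is encoded: a status byte identifying the command and channel, followed by data bytes for pitch and dynamics.

```python
# A MIDI message is one status byte followed by data bytes.
# Note On: status 0x90 plus a channel (0-15), then note number and velocity.

def note_on(channel, note, velocity):
    """Build the three bytes of a Note On message (illustrative helper)."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

# "Play middle C (note 60) on channel 1 at velocity 100"
msg = note_on(0, 60, 100)
print(msg.hex())  # -> 903c64
```

Every note you play on a controller boils down to three bytes like these, which is why MIDI is so compact compared to digital audio.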

But seriously, being data gives MIDI unique characteristics, so let’s discuss a few potential applications. However, first we need to set up our MIDI devices.

GETTING MIDI AND DAWS TO “PLAY NICE”

Here’s the general setup procedure for getting MIDI devices like keyboard controllers and control surfaces to work with your DAW. The process is different for Windows and Mac systems; on a Mac, you can open the Audio MIDI Setup application (found in Applications > Utilities—see Figure 1) and add, edit, and set up inter-application communications among various devices. In Windows, you take care of your MIDI housekeeping within individual programs.

If needed, install a MIDI driver. With an audio interface that has MIDI capability, any required driver will be installed as part of the interface’s installation process. Gear that communicates over USB often won’t need a specific driver, because the MIDI device will be class-compliant, meaning it provides basic MIDI communications. However, you may need a specialized, sophisticated driver like those from Yamaha and Korg, which allow their keyboards to exchange MIDI data with your computer over USB for parameter editing—or even for using the keyboard as a physical “plug-in” in the virtual world.

Fig. 3. Reason supports the various control aspects of M-Audio’s Axiom keyboard series, and even
presents the helpful hint that you should use Axiom preset 10 with Reason.

Tell your DAW where to find your MIDI device. Your interface may have a hardware MIDI input, where you plug in a hardware keyboard controller’s MIDI out. Your DAW will have some kind of Preferences file that lets you tell it what you’re using for a MIDI interface; in this case, you’d specify that particular port (Figure 2). Alternatively, your MIDI device might send/receive data over USB. In this case, the USB MIDI connection itself will be listed as one of the available MIDI devices.

Tell your DAW the specific MIDI device you’re using. This feature won’t be available in all programs, but many DAWs have a list of “supported” devices; if the device you’re using is supported, you may be able to take advantage of special features like mapping keyboard faders to mixer channels and the like (Figure 3). If you’re using a device that’s not supported, don’t worry—there will generally be an option for “generic” controllers, and you can describe your particular controller’s features to your DAW. This setup may not offer as many functions as an officially supported device, but ultimately, all that matters is being able to get data into (and out of) your DAW.

USING MIDI WITHIN YOUR DAW
Now that everything’s configured, you won’t have to think about setup again unless you change controllers or need to set up another program. At this point, you can get into using MIDI within your DAW. These days, it’s really quite simple.

Create a MIDI track. You need a track that can record and play back MIDI data. This may be distinct from an audio track, or there may be no obvious differentiation other than how these two types of data are handled “under the hood.” Note that you may not need to create a specific MIDI track with virtual instruments—some programs create “instrument tracks” that automatically create a MIDI track for receiving notes, while providing an audio output for the instrument.

Fig. 4. Xpand for Pro Tools can play back four instrument sounds at once—it’s an instant rhythm section for your songwriting endeavors.

Specify the MIDI track’s input. The MIDI track will have some kind of input field that lets you choose the MIDI device you told your DAW about during setup. If you have several MIDI devices, you’ll see a list where you choose which one you’re using to provide input data. You may also have the option to specify a certain MIDI channel. This is because MIDI can “channelize” data, so the MIDI input can be set to accept only data coming in over, for example, channel 1. Another option, “omni,” means that the MIDI track will accept incoming data on any channel. In most cases this is what you’ll use, because it’s convenient and there are better ways to channelize data. Namely . . .

Specify the MIDI track’s output. This could be a virtual instrument, a physical MIDI output port on your audio interface that feeds a hardware synthesizer, a signal processor plug-in that accepts MIDI input for control, etc. This is also where you’ll likely be able to specify an output channel. We’ll see why this is important when we get into using MIDI for songwriting.
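The channel logic behind these input and output settings is simple: the low nibble (bottom four bits) of every channel-message status byte carries the channel number, and “omni” just means ignoring it. A sketch in Python (function names are illustrative):

```python
def message_channel(status_byte):
    """Extract the 0-based MIDI channel from a channel-message status byte."""
    return status_byte & 0x0F

def accept(status_byte, listen_channel=None):
    """True if a track set to listen_channel (None = omni) accepts the message."""
    if listen_channel is None:   # omni: take everything
        return True
    return message_channel(status_byte) == listen_channel

# Note On, channel 3 (status 0x92 = 0x90 | 2; channels display as 1-16)
print(accept(0x92, 2))     # True
print(accept(0x92, 0))     # False
print(accept(0x92, None))  # True -- omni mode
```

This is why one cable (or one USB connection) can carry 16 independent instrument parts at once.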

GOTCHAS!

Here are a few potential MIDI issues to watch for when you’re working in your DAW:

Fig. 5. This shows three Cubase MIDI FX—StepDesigner, Context Gate, and Arpache 5—and a fourth is about to be selected. Note how Cakewalk’s MIDI FX show up in the menu, because a wrapper has been added to Cubase that lets it recognize MIDI FX from other manufacturers.

Record filters. Some DAWs include record filters that let you record or exclude certain types of MIDI data. This feature is partly a holdover from the early days of MIDI, when computers weren’t fast enough to handle large amounts of incoming data. MIDI parameters such as aftertouch (which indicates how much pressure you’ve applied to a key while holding it down—useful for adding vibrato and the like to sustaining notes) generated quite a bit of data, and could “clog” the MIDI stream. The ability to filter this out improved computer efficiency. While filtering MIDI data isn’t as important today as it once was, there’s no need to record data you don’t need, especially if it clutters up your GUI and makes it harder to see the data that does matter.
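Conceptually, a record filter like this is just a pass over the message stream that drops particular status types. A minimal sketch (Python, illustrative names; messages modeled as tuples of bytes) that filters out polyphonic aftertouch (status 0xAn) and channel pressure (status 0xDn):

```python
def is_aftertouch(message):
    """True for poly aftertouch (0xAn) and channel pressure (0xDn) messages."""
    status_type = message[0] & 0xF0   # high nibble = message type
    return status_type in (0xA0, 0xD0)

def filter_aftertouch(messages):
    """Drop aftertouch messages, keeping everything else."""
    return [m for m in messages if not is_aftertouch(m)]

stream = [
    (0x90, 60, 100),  # Note On
    (0xD0, 64),       # channel pressure -- filtered out
    (0x80, 60, 0),    # Note Off
]
print(len(filter_aftertouch(stream)))  # 2 messages remain
```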

Unintended note doubling. An issue arises when you use a hardware synthesizer as both a controller and a tone generator being driven by your DAW. Typically, the notes you play will trigger the sound generator directly, but your DAW will also pass your performance along to the MIDI out, which will trigger the same notes again—giving an unintended “doubling” effect. There may be a MIDI track “input echo” feature that you can turn off at the DAW (thus preventing the input data from making it to the output), but if not, your keyboard will likely have a feature called “local control on/off.” This disconnects the keyboard from its internal tone generator; the tone generator then responds only to notes that travel from your keyboard into the DAW and back out through the DAW’s MIDI out.
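Incidentally, “local control off” is itself a standard MIDI message: Control Change number 122, with value 0 for off and 127 for on, which is how a DAW can switch it for you automatically. A sketch of how that message is built (the helper name is illustrative):

```python
def local_control(channel, on):
    """Build a CC#122 Local Control message: value 127 = on, 0 = off."""
    # 0xB0 is the Control Change status; OR in the 0-based channel.
    return bytes([0xB0 | channel, 122, 127 if on else 0])

print(local_control(0, False).hex())  # -> b07a00
```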

Fig. 6. Three Cakewalk MFX have been inserted as effects into a Sonar X1 MIDI track: Quantize, Velocity, and Echo. Note that an arpeggiator is included in every MIDI track, as shown in the track Inspector on the left.

One instance at a time for hardware synths. When using a hardware synth as a plug-in within a DAW (not all synths can do this, but many can), you can only insert one instance because the hardware is generating the sound, and there’s only one piece of hardware. With virtual instruments, the computer generates sound based on instructions it receives, so it can create instrument sounds until it runs out of CPU power.

SONGWRITING: THE MIDI ADVANTAGE
Songwriting can be a very fluid process, as ideas come fast and furious—and part of that fluidity may involve changing key, tempo, or even instruments. With digital audio, all these types of changes are possible, but they’re not always easy. However, MIDI, being data, doesn’t care whether it spits out data at 85 or 175 bpm, or in the key of C or E.
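Key changes illustrate the point: because MIDI notes are just numbers, one per semitone, transposing a part is simple arithmetic rather than pitch-shifting audio. A sketch (Python, illustrative function name):

```python
def transpose(notes, semitones):
    """Shift MIDI note numbers, clamping to the valid range 0-127."""
    return [max(0, min(127, n + semitones)) for n in notes]

c_major = [60, 64, 67]        # C, E, G
print(transpose(c_major, 4))  # [64, 68, 71] -- now an E major chord
```

Tempo changes are equally painless: the notes’ timestamps are simply played back against a different clock, with no time-stretching artifacts.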

For keyboard players, one of the best aspects of MIDI and songwriting involves multi-timbral instruments (Figure 4), which took off in the virtual world with instruments such as IK Multimedia’s SampleTank, Native Instruments’ Kontakt, and arguably the ultimate “MIDI studio,” Propellerheads’ Reason. With these, you can load up a collection of instruments—drums, bass, piano, effects, whatever—and basically lay down parts as fast as you can assign your keyboard controller to a particular sound. Each instrument responds to data over a specific MIDI channel, so all you need to do to trigger a specific instrument is to change a MIDI track’s output channel assignment.

Fig. 7. Tobybear’s “Humanisator” MIDI effect has been inserted between the Sequencer and Synthesizer modules in energyXT’s Modular page. When the sequencer plays back, its notes are processed before feeding the synthesizer.

With today’s DAWs, it’s easy to route your keyboard controller to a particular instrument. Your DAW will often “know” which MIDI instruments are available, so when you select a track output, you may not have to think about a MIDI channel—you’ll see a list of instruments. This even happens if, for example, you ReWire Reason into another DAW: When assigning an output to a Reason instrument, you can specify it by name rather than MIDI channel.

When songwriting, multi-timbral instruments let you lay down tracks easily. However, another advantage of MIDI is that it’s so easy to replace instrument sounds. If you’re playing a bass part, you can choose any bass sound as a placeholder, then concentrate on choosing the perfect option later.

MIDI EFFECTS

MIDI data lends itself to data processing, and some programs support MIDI plug-ins that process MIDI data in a way similar to how audio plug-ins process audio signals, while others include their own proprietary types of MIDI processors. Either way, MIDI processing allows for a variety of effects—some are utilitarian functions, like compressing the dynamics of MIDI data or quantizing non-destructively in real time, while others can work like mini-drum machines or sophisticated step sequencers.
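Real-time, non-destructive quantizing is a good example of how simple such processing can be: on playback, each event’s time is nudged toward the nearest grid line, while the recorded data stays untouched. A sketch (Python; assumes times measured in ticks, names illustrative):

```python
def quantize(time_ticks, grid_ticks, strength=1.0):
    """Move an event time toward the nearest grid line.

    strength 1.0 = full snap; smaller values preserve some of the
    original feel by moving the event only part of the way.
    """
    target = round(time_ticks / grid_ticks) * grid_ticks
    return round(time_ticks + (target - time_ticks) * strength)

# 480 ticks per quarter note, quantizing to sixteenths (120 ticks)
print(quantize(130, 120))       # 120 -- full snap
print(quantize(130, 120, 0.5))  # 125 -- half strength keeps some feel
```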

As with audio plug-in formats, there are unfortunately multiple MIDI effects formats, and of course they’re incompatible—although the good news is that some “wrappers” allow programs to use formats they don’t natively support.

The Windows-only MFX format is the granddaddy of MIDI effects. It’s supported by Steinberg DAWs (Cubase and Nuendo; see Figure 5) as well as all Cakewalk DAWs (Sonar, Home Studio, Guitar Tracks, etc.—see Figure 6). With a suitable “wrapper,” you can use Cakewalk’s MFX (and MFX from other manufacturers) in Cubase without any problems—they show up on the MFX menu along with all of the other Steinberg MIDI FX.

Download the MFX wrapper from www.soundtrek.com/catalog/product_info.php?cPath=6&products_id=35, or just go to Steinberg’s Knowledge Base and do a keyword search on MFX. Then drop the mfxwrapper.dll file into the Cubase “components” folder (C:\Program Files\Steinberg\Cubase 5\components), and you’re good to go.

Fig. 8. You can drag-and-drop MIDI effects (in this case, Arpeggiator and Random) from Live’s browser into MIDI tracks, just as you would drag audio effects into audio tracks.

This doesn’t work the other way around, though; you can’t use Cubase/Nuendo’s MIDI FX in Sonar because they’re compiled into the executable program file, not separate DLLs as is the case with Sonar. Most Cakewalk-compatible MFX files have an installer that takes care of getting them into your system, although if all you have is the DLL, you can drop it into Cakewalk’s Shared MIDI Plugins folder (C:\Program Files\Cakewalk\Shared MIDI Plugins) and use Windows’ regsvr32.exe routine to register them. To register, use the Windows “run” command line interface and type regsvr32 [filepath]\[name of dll]. Hit Return, and the plug-in will be registered.

Cubase is arguably the king of MIDI effects; the program includes 18 MIDI effects, and some of them are remarkably sophisticated—almost like mini-sequencers in themselves. Cakewalk’s array of MFX processors is impressive as well, but Sonar also includes an arpeggiator that is built in to every MIDI track, and it can run Cakewalk Application Language (CAL) files—created with a LISP-like scripting language that provides MIDI functions like splitting notes, “strumming” chords, and the like. Cakewalk has de-emphasized CAL files over the years; however, there are many CAL files in existence, and a quick web search will unearth them.

In addition to MFX, there are also two types of VST MIDI plug-ins: “standard,” and the less-used VST Module Architecture. These require a host where VST plug-ins can receive MIDI data, and the plug-in then outputs MIDI data. Compatible hosts include Cubase/Nuendo, Ableton Live, energyXT (Figure 7), FL Studio, and Tracktion.

Several programs offer their own ways of handling MIDI effects, which may or may not involve plug-ins. For example, Reaper has its own format, JS MIDI plug-ins, and comes with a JS plug-in scripting engine (very much like writing in C) so you can write your own scripts. Kontakt foregoes the MIDI plug-in concept but includes extensive scripting options, also similar to C, for processing incoming MIDI data. These aren’t the only two programs that take this approach, but they’re representative of the power of scripting. Ableton Live includes a wide variety of useful MIDI processing effects (Figure 8), but MAX for Live lets you take that whole concept even further by designing your own data processors. And while Reason doesn’t offer MIDI plug-ins, per se, the RPG-8 arpeggiator is extremely full-featured as a note processor.

Logic Pro takes yet another approach by letting you construct MIDI processing within Logic’s Environment. The possibilities are pretty much unlimited, including the ability to control channel strips, but casual users will probably find it intimidating. Fortunately, you don’t have to get into creating MIDI effects, because Logic includes MIDI processors in the Track Inspector—you can even think of the Transform window as a MIDI effect.

MOTU’s Digital Performer includes destructive MIDI effects (echo, transpose, arpeggiator, re-assign controller data, etc.) in the Regions menu, but you can also insert MIDI effects into mixer MIDI tracks. These process data non-destructively, in real time, and include effects such as humanize, invert pitch, deflam, arpeggiator, quantize, transpose, etc.

Adobe Audition was a latecomer to the world of MIDI, but still manages to include limited MIDI processors: humanize, quantize, randomize velocity, and transpose.

Here’s one example of why MIDI effects are cool. When songwriting, you can quantize your parts non-destructively using plug-ins just to make sure all your rhythms are lining up, then take out the plug-ins to regain the “feel” of your original parts, and edit only those notes that are in real need of quantization. Another useful MIDI effect applies dynamics compression to MIDI notes. You can usually accomplish this sort of thing without plug-ins by editing, but it’s a much more tedious process—for example with compression you have to divide all MIDI velocities by a particular amount (e.g., 50% for 2:1 compression), then add a value to all velocities to provide the MIDI compression equivalent of the “makeup gain” function found in analog compressors.
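The velocity-compression recipe described above, sketched as code (Python; the makeup value of 32 is just an illustrative choice):

```python
def compress_velocity(velocity, ratio=2.0, makeup=32):
    """Scale a MIDI velocity by 1/ratio, then add makeup gain.

    Clamped to 1-127 (velocity 0 would read as a Note Off).
    """
    v = velocity / ratio + makeup
    return max(1, min(127, round(v)))

for v in (40, 80, 120):
    print(v, "->", compress_velocity(v))  # 40->52, 80->72, 120->92
```

Note the result: an 80-step input range (40 to 120) becomes a 40-step output range (52 to 92), exactly the 2:1 squeeze an audio compressor would apply to dynamics, with the makeup value keeping the average level up.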

SCRATCHING THE SURFACE
What? I’m out of space already? Well hopefully, the above will have inspired you to take a fresh look at the things MIDI can do in today’s generation of DAWs. Don’t forget that MIDI can also be a powerful tool for automating not just the usual suspects, like level and pan, but various effect and instrument parameters. And if you ever run out of hands while recording, a MIDI footswitch setup like the kind favored by guitarists might be just the ticket for remote foot control of crucial parameters and functions.

Sure, MIDI is over a quarter-century old . . . but it’s definitely not ready for retirement.