Honoring the people and technology that have shaped the way we create music
The world of musical electronics has been graced with giants, and we have benefited from their contributions. For many of them, simply seeing the results of their inspiration has been reward enough—when Bob Moog saw Reason’s “patch cords” sway when the program switched views to the back of the “rack,” he witnessed his past become the future. And he loved it.
As the beneficiaries of the geniuses in our industry, it’s time to give them the recognition they deserve. In this spirit, we have inaugurated The Electronic Musician Hall of Fame to honor those who have given us the tools, the technology, and the music we enjoy today. We’ve deliberately limited the number of categories so the spotlight can shine that much more brightly on the selected inductees, but we already have plenty of worthy nominations for the future.
We hope you will join us in this yearly tradition to honor a select group of people, as well as the gear and technologies they created, that changed our world. For that, we are thankful and appreciative.
Quick—name some other technology introduced in 1983 that’s still with us. Are you running Lotus 1-2-3 and Microsoft Word V1.0 on your IBM PC XT? Or computing on an Apple IIe with a whopping 64K RAM—and undecided about whether to go with Beta or VHS?
Maybe the 5-pin DIN connector isn’t MIDI’s dominant lifeform anymore, but the protocol is still going strong in USB interfaces, ringtones, the controller protocol within your DAW, and much more. Why?
First, rival companies set aside their rivalry for the good of the industry—MIDI represents the opposite of the “not invented here” syndrome. Although Sequential Circuits’ Dave Smith and Roland’s Ikutaro Kakehashi were the major players in birthing MIDI, giants like Yamaha understood the potential and fully supported the open, royalty-free specification. Second, high-tech companies like E-Mu (which would have preferred a more sophisticated Ethernet-based protocol) nonetheless adopted MIDI because implementing it was so inexpensive that including it was an economically sensible gamble—even if it failed.
Of course, MIDI did anything but fail. It linked the world of computers with musical instruments, live performance, recording, and even theater, leading to ramifications we still feel today. And let that be a lesson to the rest of the world: cooperation can trump competition. Our industry can be proud not just for adopting MIDI, but crucially, for adopting the philosophy that made it possible.
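Part of what made MIDI so cheap to adopt was the simplicity of the messages themselves. As a rough illustration (a minimal sketch in Python, with helper names of our own invention; the status and data byte values follow the MIDI 1.0 specification), here is how a Note On/Note Off pair is assembled:

```python
def note_on(channel: int, note: int, velocity: int) -> bytes:
    # Status byte 0x90 plus the channel number (0-15);
    # note and velocity are each a single 7-bit value (0-127).
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(channel: int, note: int) -> bytes:
    # Status byte 0x80 plus the channel; a release velocity of 0 is common.
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Middle C (note number 60) at moderate velocity on channel 1 (index 0):
msg = note_on(0, 60, 100)
```

Three bytes per note event, transmitted at 31.25 kbaud over a 5-pin DIN cable—modest by any standard, which is exactly why even the smallest manufacturers could afford to include it.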
If you’ve ever tweaked an ADSR envelope, changed the cutoff on a 24dB/octave lowpass filter, or clamored for the sound of true analog synthesis, you’re already familiar with Robert A. Moog, one of electronic music’s most influential engineers. However, although his contributions to contemporary music made his name a household word in the ’60s, Moog’s influence extends far beyond his pioneering work in subtractive analog synthesis.
As a precocious teen, Moog began marketing Theremin kits under his own name, and his deep appreciation for the instrument remained throughout his life: He not only helped rediscover virtuoso Thereminist Clara Rockmore, but decades later launched a Theremin resurgence when he started making the instruments again under the Big Briar name.
Moog was a curious, restless, no-nonsense person whose wide-ranging interests extended well beyond instrument design—and whose instrument designs were never confined to analog circuitry. Whether with Big Briar, or with Moog Music after he regained the legal rights to his name, Bob continued innovating, creating digital products for companies such as Kurzweil and Bomb Factory and designing custom controllers for musicians and composers. When he died in 2005, he left behind a breathtaking number of ideas and unfinished projects, spanning decades of work detailed in his notebooks and prototypes.
The Bob Moog Foundation was created to preserve this work, and honor his youthful inventiveness by providing science education through the use of music via the MoogLab Student Outreach Program.
In the early ’90s, digital tape recording wasn’t news. But in 1991, Alesis, known primarily for low-cost MIDI effect modules, blew the minds of recordists everywhere by creating a new multitrack recording paradigm: modular digital multitrack (MDM). Combining several tape transport technologies and software development from Fast Forward Designs, the ADAT was a 3U 8-track digital recorder with eight VU meters, an S-VHS videotape-based transport, high-quality 16-bit converters, ample I/O, an onboard sample-rate/speed control, and a sleeper feature dubbed Lightpipe, which is still in use today.
Seemingly overnight, recordists went from clunky analog tape-based multitrack machines to something ideal for a home-studio Goldilocks: a form factor that was not too big, not too small, but just right.
Not content to stop with the innovation of recording digitally onto readily available videotape cassettes, Alesis included the LRC remote control to complement the well-designed interface, and created an architecture that allowed synchronizing up to 16 machines for a total of 128 tracks. It was a trip to hear the machines shuffling back and forth to get their sync on, but it worked.
Long after the transports were idled by technological successors, recordists still used ADATs for their converters and Lightpipe interfacing. And despite Alesis’ assumption that ADAT would be a transitional technology, the original model evolved into additional tape-based versions (including one designed in conjunction with Studer) and, around a decade later, transitioned to the hard-disk-based HD24—which, yes, had the same basic form factor.
Composer, singer-songwriter, and producer Trent Reznor has been chopping, tweaking, and layering dissonant chords with razor-sharp expertise for over 20 years. Primarily known as founder and leader of Nine Inch Nails, Reznor channeled his self-loathing and vitriol (Pretty Hate Machine, 1989; Broken, 1992; The Downward Spiral, 1994), despair and loss (The Fragile, 1999), anger and sobriety (With Teeth, 2005), cynicism (Year Zero, 2007), and struggle and catharsis (Ghosts I-IV and The Slip, 2008) to create an evolutionary catalog of music.
In addition to evolving musically, Reznor adapted to music-industry changes as an artist by taking the DIY route and creating his own paradigm. Tired of record labels, he started Null Corporation in 2007—shattering traditional distribution models and doing business and art on his own terms. You can download entire albums (and even the multitrack stems for many songs) for free. Fans can still buy physical CDs, LPs, and special limited-edition versions; favoring fidelity, he offers much of his work in FLAC, CD-quality M4A, and even 24-bit/96kHz formats.
Reznor and longtime creative partner Atticus Ross ventured into film scoring, melding music and sound design for The Social Network (2010) and The Girl With the Dragon Tattoo (2011), with the former garnering them a Golden Globe and an Oscar, and the latter a Golden Globe nomination.
While Reznor has eschewed the cornstarch, eyeliner, drugs, and alcohol, he’s held onto the black clothing—only now you can occasionally catch him in a tux while he picks up an award or two for his moody, tension-filled compositions.
Before 1983, FM synthesis had yet to leave academia and enter the music industry. Then Yamaha incorporated the FM research of Stanford’s John Chowning into the DX7, making an FM instrument for the masses. And the masses went nuts.
When early adopters first gathered around a DX7 and stepped through the presets, they couldn’t believe their ears: gasp-inducing tubular bells, a carillon choir of eerie metallic verisimilitude (complete with ambience), and an arsenal of digital pianos—not just several variations of ultra-realistic Rhodes sounds, but convincing clavinet and Hammond organ, complete with key click. Anything with a “struck and plucked” quality, like the gorgeous marimba, leaped to life with the DX7’s algorithms. As one musician said, “The DX7 was not only lighter and smaller than a Rhodes, but better.” For several years after its introduction, a keyboardist couldn’t even get an audition without owning a DX7. A typical ad read: “Wanted: Keyboardist with DX7 who can play.” Those requirements were listed in order of priority.
The DX7 transmitted on only one MIDI channel, was difficult to program, and was capable of full-range MIDI velocity only if you wielded a sledgehammer. But once you heard those bell, piano, and percussion sounds, all was forgiven. The DX7’s superlatively crispy transients and stunning realism just couldn’t be matched by subtractive synthesis. FM—and by implication, digital synthesis—became part of the permanent vocabulary of synthesists, sound designers, and musicians everywhere.
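The technique Chowning formalized—using one sine oscillator to modulate the frequency of another—can be suggested in a few lines of Python. This is a minimal two-operator sketch, not the DX7’s actual six-operator algorithms, and the ratio and modulation-index values are purely illustrative:

```python
import math

def fm_sample(t, carrier_hz=440.0, ratio=3.5, index=2.0):
    # Two-operator FM: the modulator's output deviates the carrier's phase.
    # Non-integer frequency ratios (like 3.5) produce the inharmonic,
    # bell-like spectra that made the DX7's tubular bells so striking.
    modulator = math.sin(2 * math.pi * carrier_hz * ratio * t)
    return math.sin(2 * math.pi * carrier_hz * t + index * modulator)

# Render one second of the tone at 44.1 kHz:
sr = 44100
samples = [fm_sample(n / sr) for n in range(sr)]
```

Sweeping the modulation index over time is what gives FM its characteristic evolving attack transients—brightness that subtractive filters of the era simply couldn’t match.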
Les Paul and Mary Ford:
“How High The Moon” (1951)
In 1946, Bing Crosby suggested that Les Paul assemble a collection of gear in his garage—and thus began a series of recordings that has influenced music making for 60 years. His first sound-on-sound experiments involved cutting a second groove into an acetate record, but he eventually settled on recording to one disc lathe, then playing along with the music while recording to a second lathe. Although his 1948 single “Lover” demonstrates his overdubbing prowess (and his mastery of half-speed recording to achieve double-speed, octave-transposed melodies), we’ve chosen a subsequent release to mark the beginning of modern recording because it exemplifies the DIY aesthetic upon which Electronic Musician was founded.
By 1951, Paul’s studio consisted of an Ampex 900 tape recorder, homemade mixer, RCA 44BX ribbon mic, Lansing speaker, and military-surplus headphones—state of the art for the time, with analog tape providing the breakthrough that allowed Paul to maintain audio fidelity despite numerous generations in overdubbing. The mixer helped maximize each overdub’s effectiveness, as he recorded his guitar simultaneously with Mary Ford’s vocal; Ford sang only a few inches from the mic to increase her voice’s intimacy, thus introducing close miking into the engineer’s vocabulary.
When the duo cut “How High the Moon,” all the pieces were in place. The tight harmonies, bathed in Paul’s trademark slapback echo, rocketed the single to Number One for nine weeks in 1951—and ultimately inspired generations of artists and engineers.
Roland stands tall among full-line electronic music manufacturers, offering a deep inventory of amps, keyboards, electronic drums, digital pianos, effects, guitar synths, and even MIDI accordions. Yet many people are surprised to learn that, unlike some other monolithic Japanese electronics companies, Roland is the product of a single bootstrapping visionary: Ikutaro Kakehashi.
Mr. Kakehashi was born in Osaka, Japan, in 1930, and after overcoming a tough childhood, gained entry into a competitive chemical engineering school. He instead used his mechanical engineering skills and considerable tinkerer’s wherewithal to operate first a clock-and-watch repair business, and later an appliance-repair shop. During this time, he plied his innovations in the field of electronic instruments, inventing and repairing electronic organs. In 1972 Kakehashi founded Roland Corporation, where he served as president until 2001. He still serves the company in a senior advisory role.
Today, Roland stands at the forefront of progressive electronic gear. From now-established lines, like its iconic BOSS effects, V-Studio series, and V-Drums, to more recent forays into the V-Piano and the Capture interfaces, Roland continues to innovate. But to see more evidence of a visionary in action, look no further than the electronic guitar market—where Roland virtually curated the industry by releasing new instruments, modules, and ever-evolving technology for more than three decades. One can well assume that the far-seeing Ikutaro Kakehashi is standing watch over this, as well as other trends both emerging and established, at Roland.
LIVE SOUND RE-INVENTION
Bose L1 System
The L1’s originators, Ken Jacob and Cliff Henricksen, felt something was wrong with the traditional “P.A./monitors/backline” way of amplifying music—it put too much gear on stage, separated musicians from their audience, and was difficult to control in a nuanced way.
After defining what was wrong with existing systems, they set about defining what would be right. They drew inspiration from multiple sources, but in particular from the way live performance traditionally involved individual musicians creating sounds from their own spatial locations. In 1995, this concept translated into a breakthrough: deploying an individual amplification system for each instrument or voice—in essence, the amplification system became an extension of the instrument rather than a separate entity. Furthermore, they decided that the system had to be both portable and cost-effective.
After going through several prototypes, they made another breakthrough—the separate bass box—and the team grew over the years as more people became involved in testing the concept and gear. Before its release in 2003, the final element—ToneMatch, which voiced the system to various instruments—fell into place.
When the L1 hit the market, the initial wave of “this can’t possibly work” skepticism dissipated as more musicians had the chance to hear the benefits of this radical approach. The transportable, line array-plus-bass bin “personal P.A.” has now become a standard—but it all traces back to the pioneering work that created the L1.
Stephan Schmitt, with Volker Hinz, founded Native Instruments in 1996—before the 1999 introduction of VST instruments. Since then, CEO Daniel Haver and CTO/President Mate Galic have helped guide the company from two employees to 300, and into position as an undisputed leader in both virtual instruments and hardware. Yet this growth was built in large part on the foundation Schmitt created with Reaktor (née Generator).
Two of Schmitt’s early decisions accurately anticipated the future. The first was realizing that native processing would become powerful enough to support sophisticated virtual instruments, which led him to drop DSP-based development. The second was creating a modular system (inspired by software’s inherently modular nature) that offered vastly more potential than simply emulating a particular synthesizer or pursuing a sample-based route.
Reaktor also became a prototyping environment for NI’s instruments and effects, including FM7, Guitar Rig, Pro-52, B4 organ (one of the tipping points that caused keyboard players to ditch their B3 in favor of a laptop!), Massive, various Maschine and Traktor effects, and more. NI also provides a free version of Reaktor that can host new instruments, like Razor.
Schmitt left NI in 2011 to start the company Nonlinear Labs, with the goal of designing more specialized instruments for niche markets. Yet he continues to be involved with NI on various projects, and remains a part of the company he helped found.