30th Anniversary Special: The History of MIDI

A Look at the Format's Past, Present, and Future

This article is part of Electronic Musician's special 30th Anniversary issue. To read more commemorative content, visit www.emusician.com/30thAnniversary.

Maybe it goes without saying, but electronic musicians’ lives would be radically different without MIDI. Alongside synthesizers and digital audio, MIDI is the technology that has had the greatest impact on how we create electronic music. Its influence on the instruments we play and the software and peripherals we use can’t be overstated.

Before MIDI became a standard more than 30 years ago, most electronic musicians lived in a sort of pre-digital Dark Ages, waiting for something to sweep them into the music industry’s mainstream. Since then, it has given us sophisticated, affordable tools essential to modern music production and performance. MIDI makes possible software that every component in an integrated system can understand on its own terms. It is the bridge between musical instruments and computers.

Because all MIDI gear is inherently compatible, you can start with a modest system and build onto it gradually while minimizing obsolescence. At the 2013 NAMM Show, as part of celebrating MIDI’s 30th anniversary, members of the MIDI Manufacturers Association (MMA) connected a sequencer running on a Commodore 64 computer to Moog Music’s Animoog app running on an iPad and guess what? It just worked.


You probably know that MIDI is an acronym for Musical Instrument Digital Interface. It is a communications protocol that comprises both a set of instructions and the physical connections between compatible devices.

The language and rules of MIDI were precisely defined in a document called the MIDI Specification 1.0 and published by the MMA, a consortium of synth manufacturers who agreed to make products that adhere to the standard. Now comprising hardware and software companies that include the likes of Apple, Microsoft, and Google, the MMA defines, extends, and enforces the standard, ensuring that MIDI-compatible software and hardware are interoperable, meaning that they work together in a system as seamlessly as possible.


At its most basic level, MIDI is a collection of messages, or commands, that transmit information such as when a sound plays and at what pitch. Some of the most important messages for music carry data about which notes are played, for how long, and at what velocity, but the role of Control Change (CC) messages has become more critical in recent years because they are used for so many purposes.
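The byte layout of these basic channel messages is simple enough to sketch directly. Here is a minimal Python illustration of how Note On, Note Off, and Control Change messages are packed, following the MIDI 1.0 message format (the helper function names are ours, not part of any API):

```python
# Raw MIDI 1.0 channel messages as byte sequences.
# Status byte = message type in the high nibble, channel (0-15) in the low nibble;
# the two data bytes that follow are each limited to 7 bits (0-127).

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Note On: status 0x90 | channel, then note number and velocity."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def note_off(channel: int, note: int, velocity: int = 0) -> bytes:
    """Note Off: status 0x80 | channel."""
    return bytes([0x80 | channel, note & 0x7F, velocity & 0x7F])

def control_change(channel: int, controller: int, value: int) -> bytes:
    """Control Change (CC): status 0xB0 | channel, controller number, value."""
    return bytes([0xB0 | channel, controller & 0x7F, value & 0x7F])

# Middle C (note 60) at velocity 100 on channel 1 (status-byte index 0):
msg = note_on(0, 60, 100)
print(msg.hex())  # 903c64
```

Three bytes are enough to say "play this note, this hard, on this channel," which is a big part of why MIDI has proven cheap to implement in everything from synths to fountains.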

All of these commands were originally intended for synthesizers, but because MIDI is a cheap and ubiquitous standard, today it’s used in all manner of technology, musical and otherwise. DJ controllers use MIDI messages to trigger clips and control their playback speed, for example. MIDI is an integral part of stage lighting consoles, and it even controls rides in theme parks and choreographed fountains outside of Las Vegas casinos.

In the early days, MIDI was inseparable from the 5-pin DIN connectors that you see on many instruments. When MIDI became a standard, the companies that ratified the specification chose that particular connector because it was inexpensive and readily available. Since then, the MMA has specified alternate hardware for MIDI transmission, most often because USB, FireWire, and Ethernet offer much faster transmission speeds and correspondingly greater bandwidth (see Figure 1). Most newer instruments sport a single USB port rather than a trio of DIN connectors. Although you can’t connect two MIDI devices using USB unless you have a computer between them, USB connections offer enough advantages that they’ve become the preferred choice for most manufacturers.

Fig. 1. Because it furnishes MIDI connections via 5-pin DIN, USB, and Ethernet, iConnectivity's iConnectMIDI4+ is well-equipped for advances in MIDI hardware.

HAMMERING OUT A STANDARD


In the late 1970s and early 1980s, companies that built electronic musical instruments were transitioning from designing instruments based entirely on analog technology to instruments with microprocessors at their cores. As the price of digital electronics fell, they sought to augment and replace the discrete circuitry in their products with affordable computer chips. To realize the potential of these new instruments fully, however, many realized that the microprocessors in their machines would need to communicate with each other.

Companies such as Oberheim, Roland, and Sequential Circuits introduced proprietary communications buses that allowed a synth, a drum machine, and a sequencer to record, store, and play thousands of notes in an integrated performance, revolutionizing music production by radically advancing the concept of a one-man band. Unfortunately, all the components in a system had to come from the same manufacturer to be compatible. The obvious solution to the problem was to convince manufacturers to agree to some sort of standard.

At the 1981 AES convention, Dave Smith, president of Sequential Circuits, presented a proposal inviting other synth makers to create a standard for digital communication, but most of his competitors had no interest in cooperating. The major exception was Roland founder Ikutaro Kakehashi, who had earlier suggested the same kind of standardization to synth designer Tom Oberheim.

A few months later, at the NAMM Show in January 1982, Smith and Kakehashi met with other synth makers and proposed the basic elements of what became MIDI. Yamaha, Korg, and Kawai agreed to the standard, and along with Sequential Circuits and Roland, they announced ratification of the MIDI Specification at the 1983 NAMM Show and published the spec in August of that year.

During the year that followed, practically every electronic musical instrument maker began building products that conformed to the MIDI standard, and synthesizer sales increased dramatically. Since then, the MIDI Specification has been a continually evolving document, subject to extensions that keep it current. A few extensions approved since 1983 are Standard MIDI Files (1990), General MIDI (1991), MIDI over FireWire (2000), and Downloadable Sounds (2004). Other extensions to MIDI 1.0 currently under consideration by the MMA include Polyphonic Expression Controllers, Web MIDI, and MIDI over Bluetooth.



Looking to the future, the MMA’s members imagine a time when they may want to go beyond MIDI’s inherent constraints. To that end, they’ve been discussing the HD Protocol, which some have erroneously called MIDI 2.0 or MIDI HD. First proposed in 2006, development of the specification has continued for almost a decade. The official position, according to MMA president Tom White, is that “HD Protocol is just a proposal until it is ratified by the industry, and we don’t know if and when that will happen.”

HD’s most crucial aspect is that it must be completely interoperable with MIDI 1.0: because HD would be backward compatible and recognize current MIDI messages, all your existing MIDI hardware and software would operate exactly as they do now, while products that support the new protocol would gain extended capabilities. Although Ethernet is HD’s preferred hardware connection, USB is an acceptable alternative, with options for any additional hardware standards that arise in the future.

A few of the protocol’s most outstanding improvements would include thousands of channels and controllers, massively high resolution, and Direct Pitch control. Whereas now you need to specify a MIDI Note Number and a Pitch Bend message to achieve a non-standard pitch, Direct Pitch specifies any possible pitch, regardless of tuning. Because Direct Pitch could alter pitch at any instant, you’d no longer need to offset the current pitch with Pitch Bend. In addition, a new Note Update message would modify controllers or other parameters over the duration of a single note, allowing more precise articulation than is currently possible.
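For context, here is what that MIDI 1.0 workaround looks like at the byte level: a 14-bit Pitch Bend message offsets whatever note is sounding on its channel, and the resulting pitch depends on the receiver’s bend range. This sketch assumes the common default range of ±2 semitones; the helper function is illustrative, not code from the spec:

```python
# MIDI 1.0 Pitch Bend: a 14-bit value sent LSB-first after status 0xE0 | channel.
# 8192 is the midpoint (no bend); how far the extremes detune depends on the
# receiver's bend-range setting, assumed here to be the common +/-2 semitones.

PB_CENTER = 8192           # 14-bit midpoint = no bend
PB_RANGE_SEMITONES = 2.0   # assumed default bend range

def pitch_bend(channel: int, semitones: float) -> bytes:
    """Build a Pitch Bend message offsetting the current note by `semitones`."""
    value = PB_CENTER + round(semitones / PB_RANGE_SEMITONES * 8191)
    value = max(0, min(16383, value))  # clamp to the 14-bit range
    return bytes([0xE0 | channel, value & 0x7F, (value >> 7) & 0x7F])

# A quarter-tone (+0.5 semitone) above whatever note is sounding on channel 1:
msg = pitch_bend(0, 0.5)
```

The clamp is the limitation the HD proposal targets: any pitch farther from the note than the bend range allows simply can’t be expressed, whereas Direct Pitch would name the destination pitch outright.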

In the meantime, new extensions to the MIDI spec are ongoing, with various working groups ensuring that MIDI 1.0 remains a viable document today and into the future. Even if the MMA finally approves the HD Protocol and new products begin to appear, MIDI is here to stay.

Multidimensional Polyphonic Expression: Polyphonic multidimensional controllers (PMCs) such as the Roli Labs Seaboard, Roger Linn Design LinnStrument, and Haken Continuum Fingerboard expand the limits of real-time MIDI performance. PMCs can generate data that most MIDI software doesn’t understand, putting unnecessary restrictions on what you can do with them. Because PMC manufacturers have a vested interest in extending MIDI 1.0 so that it supports the new capabilities their products offer, they have joined together to create standards for Multidimensional Polyphonic Expression (MPE).

One of MPE’s most significant features is per-note expression. When it’s turned on, you’ll be able to address as many as 15 notes on their own channels and apply Aftertouch to each channel individually, while still retaining a single channel for controlling parameters globally—sustain, in particular. You’ll also be able to apply Pitch Bend to individual notes and gain direct, continuous control over their timbre. In addition, MPE will allow you to extend Pitch Bend to a default range of 48 semitones and a maximum of 96 semitones.
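The channel-per-note idea can be sketched in a few lines. This illustrative Python allocator (our own construction, not code from the MPE proposal) hands each new note one of the 15 member channels, so that per-channel messages such as Pitch Bend and Aftertouch affect only that note while channel 1 stays free for global control:

```python
# Sketch of MPE-style per-note channel allocation: member channels 2-16
# (status-byte indices 1-15) each carry one note at a time, so channel-wide
# messages become per-note messages. Channel 1 (index 0) is reserved for
# global parameters. Allocation order here is illustrative.

class MPENoteAllocator:
    def __init__(self, member_channels=range(1, 16)):
        self.free = list(member_channels)  # channel indices 1-15
        self.active = {}                   # note number -> channel index

    def note_on(self, note: int) -> int:
        """Assign the next free member channel to this note."""
        channel = self.free.pop(0)
        self.active[note] = channel
        return channel

    def note_off(self, note: int) -> int:
        """Release the note's channel back to the pool."""
        channel = self.active.pop(note)
        self.free.append(channel)
        return channel

alloc = MPENoteAllocator()
ch_a = alloc.note_on(60)  # first note lands on channel 2 (index 1)
ch_b = alloc.note_on(64)  # second note gets its own channel 3 (index 2)
alloc.note_off(60)        # channel 2 returns to the pool
```

Because each note owns a channel, bending one note of a chord no longer drags the other notes with it, which is exactly the restriction PMC makers set out to remove.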

MPE-compatible hardware and software will offer an MPE mode you can toggle on and off. When it’s on, controllers will automatically reconfigure themselves for it. The MMA’s MPE working group expects to have a finished version of the MPE proposal by the time you read this.


MIDI over Bluetooth LE: Bluetooth LE (Low Energy—BTLE for short) is a wireless connection specification designed to extend the battery life of mobile accessories that don’t stream data continuously, including MIDI keyboards and controllers. Apple was the first to develop and implement MIDI over BTLE. Latency and jitter are just two of the challenges facing any kind of wireless MIDI, but Microsoft and Google have already expressed their support for MIDI over BTLE.

As with all Bluetooth devices, you’ll need to pair MIDI devices before they can communicate their MIDI capabilities to each other and transfer data between them. Although MIDI over BTLE is designed for use with mobile devices, you should be able to use the same accessories with your computer, just as you may already use a Bluetooth QWERTY keyboard and mouse. By the time you read this, MMA members will have voted on making MIDI over BTLE part of the MIDI Spec, meaning that someday you could be using wireless MIDI devices both onstage and in your studio.

Web MIDI: A few months ago, Google announced that the latest update to its Chrome browser offered built-in support for MIDI, allowing Web apps to communicate with MIDI devices. A Web-based synth emulation soon appeared, enabling anyone with a MIDI keyboard connected to a computer to play it online. Soon after that came a Web-based drum machine with MIDI I/O capabilities. If you have Chrome, you can access compatible apps at webaudiodemos.appspot.com.

Browser-based MIDI applications rely on the Web MIDI API and Web Audio API, both supplied by the World Wide Web Consortium (W3C), whose members include Google, Mozilla, Apple, and Microsoft. The MIDI API lets you select MIDI devices connected to your computer, tablet, or phone and use them to change parameters and play music on Web apps. A generally accepted standard for Web MIDI is still a work in progress, but once it is approved, support for MIDI devices will become a standard feature of browsers and operating systems across hardware platforms.

Android MIDI: Demand for MIDI apps on mobile devices is exploding. Unlike iOS users, Android users have been limited by the platform’s lack of standards for connecting MIDI devices and handling MIDI data. When it’s released, however, the latest update to Google’s mobile OS, Android 6.0 Marshmallow, will offer built-in support for a MIDI API that helps programmers write new MIDI apps and modify existing ones. Android users will be able to use MIDI keyboards and peripherals to control apps and use the apps to control external instruments and devices, either via BTLE or a USB connection. You’ll also be able to generate MIDI data in one Android app and route it to another. What’s more, the MIDI API makes it possible to write apps that use your Android device as a multi-touch controller for controlling MIDI software running on your laptop or desktop computer.


Windows 10: One reason that Apple computers and devices have been so popular with musicians is that their operating systems integrate Core MIDI and Core Audio, making music software and hardware peripherals inherently compatible with minimal hassle for both product developers and users. Windows users have often relied on third-party solutions to make everything in their studios play well together, leading to inevitable incompatibilities because software that works with one product might not work with another. Microsoft seeks to remedy this situation by addressing compatibility at the system level in Windows 10.

Pete Brown, Microsoft’s principal program manager working with music and audio hardware and software developers, expects to see Windows 10 installed on as many as a billion devices in the next three years. That’s because the same operating system will run on smartphones, notebooks, desktops, tablets, and Xboxes, and even on a Raspberry Pi or Microsoft’s augmented-reality headset HoloLens. In fact, the next version of the Akai MPC will run an embedded version of Windows 10 under the hood.

To take full advantage of all this newfound compatibility, software must adhere to the Universal Windows Platform (UWP) model, a set of design guidelines that ensure software will run on a range of devices. Once UWP is widely adopted, you should be able to run the same DAW on all your Windows gear. Several developers are already updating their software for UWP. Bitwig has a UWP version of Bitwig Studio in the works, and Propellerhead has demonstrated a UWP version of Figure.

To enable compatibility for music applications across hardware platforms, Microsoft has introduced a new MIDI API for software developers. This modern, multiclient set of development tools, routines, and protocols allows multiple applications running in Windows 10 to share multiple MIDI interfaces—something you couldn’t do with previous APIs. In addition, Microsoft is providing new APIs for audio that improve performance by reducing buffer sizes for lower latency and preventing spikes caused by external processes, for example. In future APIs, Microsoft plans to address capabilities such as wireless MIDI, MIDI routing, and time stamping.


The MIDI Manufacturers Association is a group of product developers, engineers, and other technology leaders from member companies who implement and enforce the MIDI Specification, ensuring interoperability now and in the future. Because of budget constraints, the MMA’s role in promoting MIDI and educating users has been mostly limited to publishing the specification and hosting a website that disseminates information about MIDI. But all that’s about to change with the creation of The MIDI Association (TMA), a spinoff that aims to create a global community of people who create music and companies that create products using MIDI.

Recent developments—particularly the growth of MIDI’s potential audience thanks to Windows 10 and Android—suggest a greater need to make users aware of what they can do with MIDI. TMA’s role is to appeal to all MIDI users, including musicians, sound designers, audio engineers, hobbyists, retailers, manufacturers, developers, and educators. The focus of TMA’s online community is the website MIDI.org, which has been the MMA’s domain until now. For end-users, it will be a source of information about anything related to MIDI technology.

These are exciting times in the world of MIDI. Almost 33 years after it became a standard, its influence on music, creativity, and commerce is greater than ever, and you can fully expect that influence to grow. You have more opportunity than ever to be an active part of that growth.

Thanks to his embracing MIDI networking, synth programming, sample editing, and sequencing software well before most other musicians in Atlanta in the 1980s, former EM senior editor Geary Yelton enjoyed a studio career that lasted only until better players caught on to what he was doing.