Electronic Music Wiki

MIDI (an acronym for “Musical Instrument Digital Interface”) is a communications system and set of software protocols that allows synthesizers, sequencers, and other equipment to control one another remotely, adjust patch parameters, load and save patch data, and perform a variety of other tasks. The standard specifies both a set of message protocols and a physical interface; in the years since the original standard was published, methods have been developed for carrying the protocols over other media such as USB and Ethernet.

History

Precursors

Prior to MIDI, companies that would later play important roles in its development had designed proprietary control interfaces for their own products.

Roland developed the DCB interface, which was integrated into much of its product line in the early 1980s. Its primary function was to allow synths to be played by Roland's sequencers, and to allow those sequencers to be synchronized with Roland's drum machines. It used a parallel-style interface, which was fast but expensive to implement. Performers disliked dealing with the bulky multi-conductor cables, and options for building networks of devices were limited.

In 1980, Roland introduced the DIN sync interface to synchronize different electronic musical instruments; it first appeared on the TR-808 and was followed by other Roland equipment in 1981. Digital Control Bus (DCB), introduced in 1981, was the direct precursor to MIDI, which adopted most of its features from the DCB protocol and used the same type of connectors as DIN sync.[1]

Sequential Circuits had an interface known as the Remote Prophet, which was available as a factory option on the Prophet-5. It allowed the synth to be played remotely via a keytar-style controller keyboard over a high-speed serial interface. It did not sell well, and Sequential did not develop any other uses for it.

Development

In 1981, Roland founder Ikutaro Kakehashi proposed the concept of standardization to Oberheim Electronics and Sequential Circuits, and they then discussed it with Yamaha, Korg and Kawai.[2] A common MIDI standard was developed by Roland, Yamaha, Korg, Kawai, and Sequential Circuits,[2][3] using Roland's pre-existing DCB as a basis.[1]

As the story goes, Sequential's Dave Smith met Roland's Ikutaro Kakehashi at a trade show in 1981, and they began discussing the possibility of an industry-standard interface. With some input from Tom Oberheim, they sketched out an interface that would be less expensive to implement, using off-the-shelf communications integrated circuits, and inexpensive 5-pin DIN cables (already widely used as a means to connect tape decks to amplifiers in semi-pro audio equipment). Based on their discussion, Sequential built a prototype, which the other participants used to evaluate ideas over the next two years. The manufacturers involved formed a trade group, the MIDI Manufacturers Association, which became the owner of the rights to the specification, as well as a means of pooling research money.

The standard was then discussed and modified by representatives of the participating companies,[2][3] and was named Musical Instrument Digital Interface.[4] MIDI's development was announced to the public by Robert Moog in the October 1982 edition of Keyboard magazine.[5]

Introduction

Following the public announcement,[5] the first MIDI specification was published by Sequential and Roland in 1983. Roland's JX-3P was the first synth on the market with MIDI installed at the factory. Sequential quickly followed with the Prophet 600, its first MIDI-equipped synth, and offered MIDI as a factory option on the Prophet-5, with retrofits available for some existing units.

MIDI allowed different manufacturers' instruments to communicate with one another, and allowed general-purpose computers to play a role in music production.[6] Since its introduction, MIDI has remained the musical instrument industry's standard interface.[7] Kakehashi received the 2013 Technical Grammy Award for the invention of MIDI.[8][9]

Physical and Electrical Interface

The MIDI physical interface takes the form of serial transmission cables with a 5-pin male DIN plug at each end. Each cable carries data in one direction only; two cables are required for two-way communication. Data is carried on the cable via an asynchronous start/stop serial method that somewhat resembles the common RS-232 serial standard. The baud rate (the rate at which state transitions can occur on the line, which along with the byte format determines the data rate) is 31,250 baud (a somewhat unfortunate choice, since the computer industry was settling on 38,400 baud for this speed range). With an 8-bit byte, plus one start bit time and one stop bit time, this equates to 3,125 data bytes per second, or 25,000 data bits per second. Electrically, the line is a 5 mA current loop driven from a nominal +5 volt supply; a logic 0 corresponds to current flowing in the loop, and a logic 1 (the idle state) to no current flowing.
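
As a quick check of the arithmetic above, the short sketch below (plain Python, purely illustrative) derives the byte rate and the time needed to transmit a typical three-byte message such as a note-on.

    # Illustrative arithmetic only: derives the MIDI byte rate and the
    # time needed to send a three-byte message (e.g. a note-on).
    BAUD = 31250           # line state transitions per second
    BITS_PER_BYTE = 10     # 1 start bit + 8 data bits + 1 stop bit

    bytes_per_second = BAUD / BITS_PER_BYTE          # 3125.0
    data_bits_per_second = bytes_per_second * 8      # 25000.0
    note_on_ms = 3 * 1000 / bytes_per_second         # ~0.96 ms for a 3-byte message

    print(bytes_per_second, data_bits_per_second, note_on_ms)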

The cable has identical male connectors at both ends; the jacks on devices are all female. All conductors are wired straight through, with a pin at one end connected to the same-numbered pin at the other end; there is no need for "crossover" cables. Only three pins of the 5-pin connector are used. Pin 4 is the "positive" pin, on which the transmitter places the signal; pin 5 is the "return". Pins 4 and 5 form a current loop that is isolated from ground. Pin 2 at the cable ends is connected to the cable shield, and the transmitting device grounds pin 2. Pins 1 and 3 are not used. (However, many cables have all five pins wired; this allows the cable to be used for other purposes, such as DIN sync.) At the receiver end, the standard mandates that an optoisolator be used. This electrically isolates the cable, and the transmitter connected at its other end, from the receiving device. At the receiving end, pin 2 is not to be connected. These measures prevent ground loops and help reduce noise on the line by creating a "balanced line".

A fully equipped standard MIDI device has three jacks. An "in" jack receives data from a transmitter. An "out" jack sends data to a receiver. A "thru" jack electrically retransmits everything that appears at the "in" jack. This allows several receiving devices to be daisy-chained to a single transmitter. (The transmitter can use MIDI channel assignments to direct data to specific receiving devices.) Not all devices will have all three jacks; for instance, an effects unit that does not produce any data may not have an "out" jack. All devices that have an "in" jack are supposed to also have a "thru" jack, but not every manufacturer follows the standard in this regard.

Data Protocols

The MIDI communications protocol uses sequences of 8-bit bytes. Depending on its numeric value, a byte is categorized as either a "command" byte or a "data" byte. A command byte has its most significant bit set, so that in hexadecimal its value is between 80 and FF; a data byte has a value between 0 and 7F. Most command bytes are divided into two four-bit chunks (each of which can be thought of as a single hex digit): the first digit is the message type, and the second digit is a channel number.
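
As an illustration of the framing just described, the following sketch (Python; the helper names are invented for this example) classifies a byte and splits a channel-message command byte into its message-type and channel nibbles.

    # Illustrative sketch: classify MIDI bytes and split a channel-message
    # command byte into its message type and channel nibbles.

    def is_command_byte(b):
        # Command bytes have the most significant bit set: hex 80 through FF.
        return (b & 0x80) != 0

    def split_channel_command(b):
        # For channel messages, the high nibble is the message type and the
        # low nibble is the channel as it appears on the wire (0-15).
        return (b & 0xF0) >> 4, b & 0x0F

    # Example: 0x92 is a note-on (message type 9) on wire channel 2,
    # i.e. "channel 3" in the conventional 1-16 numbering.
    print(is_command_byte(0x92), split_channel_command(0x92))   # True (9, 2)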

The concept of a channel allows several receiving devices to be chained off of a single transmitter, such that the transmitter can send data to each receiver individually, without confusion. By convention, channel numbers are referred to as being from 1 to 16, although in the actual command byte these translate to hex values from 0 to F (15 in decimal). Generally, the channel number that a receiving device receives on is set by the performer using a setup menu selection on the device, although some devices may instead use a rotary switch or a set of DIP switches. Multitimbral synths usually have the capability to receive on more than one channel. Some very old devices (e.g., early versions of the Yamaha DX7) can only receive on channel 1. There are some message types which are not channel-specific; they are received by every device connected to the transmitter. For these message types, all 8 bits of the command byte specify the message type.
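
To make the channel convention concrete, here is a minimal sketch (Python; the constant and function names are invented) of the decision a receiver set to a particular channel might make. System messages, whose command bytes use all 8 bits for the message type, are accepted regardless of channel.

    # Minimal sketch of a receiver's channel filter.
    RECEIVE_CHANNEL = 3      # performer-selected, using the 1-16 convention

    def accepts(command_byte, receive_channel=RECEIVE_CHANNEL):
        if command_byte >= 0xF0:
            # System messages (F0-FF) are not channel-specific.
            return True
        wire_channel = command_byte & 0x0F            # 0-15 on the wire
        return wire_channel == receive_channel - 1    # convert 1-16 to 0-15

    print(accepts(0x92))   # note-on, channel 3  -> True
    print(accepts(0x95))   # note-on, channel 6  -> False
    print(accepts(0xF8))   # MIDI Clock          -> True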

The list below describes the most commonly used basic message types, with the command byte value (in hex) for each message type. In the command byte values, an 'x' represents a channel number, which may be any hex digit from 0 to F. (A short sketch showing how a few of these messages are assembled appears after the list.)

  • Note on (9x), which tells a synth to begin playing a note. Corresponds to a key being pressed on a keyboard. Each note on a conventional music keyboard is assigned a note number, with middle C being note 60. The note numbers range from 0 to 127, covering 128 notes, or about 10-1/2 octaves. The message format also conveys a velocity value, which corresponds to how sharply the key is struck.
  • Note off (8x), which tells a synth to end playing a note (or, more specifically, transition the note to its release phase). This corresponds to a key being released on a keyboard. The message format also allows for the conveyance of a release velocity.
  • Program change (Cx), which instructs the synth to load a patch.
  • Pitch wheel (Ex), which corresponds to the movement of a pitch bend lever or wheel. The message format allows for a 14-bit value, or roughly 16,000 individual steps.
  • Channel pressure, better known as aftertouch (Dx). This corresponds to the force applied to the keyboard on an aftertouch-sensitive keyboard.
  • Key pressure, better known as polyphonic aftertouch (Ax).
  • Control change (Bx). This message format allows for the transmission of a variety of control message values corresponding to either common synth features, or to modes of MIDI operation. The format allows for a Controller number and a 7-bit value. (In the MIDI standard, when the word Controller appears capitalized, it refers specifically to a controller number or name that is assigned a fixed meaning in the standard.) To extend the range of possible values, some Controller types have been assigned two numbers, and the value is transmitted as two separate messages when the range extension is needed.
  • MIDI Clock (F8), used to synchronize devices with a recording device, sequencer, or other master timing source.
  • MIDI Time Code (F1). A more advanced synchronization mechanism.
  • Active Sensing (FE). Sometimes used to prevent stuck notes in the event that a cable becomes disconnected while playing.
  • System exclusive, commonly referred to as sysex (F0) allows individual synth and device manufacturers to define device-specific message formats on top of the standard formats. Sysex formats are totally under the control of the individual manufacturer. Each manufacturer is assigned a manufacturer number, which they are required to use in their sysex formats, to prevent conflicts. Sysex messages are used for a wide variety of tasks, ranging from patch loading/dumping, to real-time patch editing via remote control, to sample loading, to operating system upgrades, and other uses.
  • End of system exclusive (F7). Needed to indicate the end of a system exclusive message, since the standard allows such messages to be of variable length.
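
As a concrete illustration of several of the message formats above, the sketch below (Python; the helper functions are invented for this example) assembles a note-on message, a pitch wheel message from a 14-bit value, and a minimal system exclusive frame.

    # Illustrative message construction following the formats listed above.
    # The helper names are invented for this sketch.

    def note_on(channel, note, velocity):
        # 9x nn vv: channel is given in the conventional 1-16 numbering.
        return bytes([0x90 | (channel - 1), note & 0x7F, velocity & 0x7F])

    def pitch_wheel(channel, value):
        # Ex ll mm: a 14-bit value (0-16383) sent as two 7-bit data bytes,
        # least significant 7 bits first.
        return bytes([0xE0 | (channel - 1), value & 0x7F, (value >> 7) & 0x7F])

    def sysex(manufacturer_id, payload):
        # F0 <id> <data...> F7: every payload byte must have its top bit clear.
        return bytes([0xF0, manufacturer_id]) + bytes(b & 0x7F for b in payload) + bytes([0xF7])

    print(note_on(1, 60, 100).hex())        # 903c64 - middle C, velocity 100
    print(pitch_wheel(1, 8192).hex())       # e00040 - pitch wheel centered
    print(sysex(0x43, [0x10, 0x4C]).hex())  # f043104cf7 - arbitrary example payload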

Protocol Additions and Extensions

Later additions to the original standard include MIDI Files, General MIDI and MIDI Machine Control. Over time, a number of smaller additions and extensions to the data protocols have been added, such as the MIDI Sample Dump Standard (SDS), which is a means of using sysex messages to transfer sample data between devices. In several places, further definition has been put onto areas which were originally left open or undefined, such as the standard definitions for continuous controller numbers, and the mechanism for extending the range of manufacturer ID numbers. Many change pages and edits were published for the original standard through the 1980s and 1990s.

In 1996, the MMA finally decided to codify all of the changes in a new version of the standard. For some reason, they decided to return to referring to this as the "MIDI 1.0" standard. It is still known by that name today, despite having been revised several times since 1996. It can be downloaded from the MMA Web site (registration is required).

Several extensions and revisions of the MIDI standard have been developed for control of theatrical lighting. One extension approved by the MMA is called MIDI Show Control; it consists of a set of system exclusive messages allocated for the purpose, allowing music and lighting applications to exist within the same MIDI system (although this is seldom done). Some other de facto standards redefine MIDI standard messages and continuous controller numbers. Controller devices are made specifically for MIDI lighting applications, and gateway devices are available that convert MIDI lighting protocols into the widely used DMX512 lighting control system. MIDI has also seen some use in home automation applications.

Computer interfaces

Although the possibility of computers having MIDI interfaces was envisioned during the original early-1980s development, it was not known at the time what form such an interface might take, or what it might be required to do, so the original specification had no specific provisions for computer interfaces. The Atari ST microcomputer, introduced in 1985, was the first computer to include built-in MIDI in and out ports. System calls in the computer's operating system allowed programs to read and write MIDI data directly to and from the ports. As a result, the first software sequencer applications appeared on the ST platform.

However, the trend did not catch on; neither Apple nor IBM ever adopted built-in MIDI ports in their computers. As the Atari ST lost market share in the late '80s, third-party manufacturers developed small MIDI interfaces that connected to a DOS/Windows PC, or an Apple, using a serial port. Originally these were just "write-through" devices, but soon the hardware vendors, in response to the increasingly large MIDI studios and rigs being built by some performers, began building MIDI interfaces that included multiple in and out ports. These were intended to solve two problems: (1) multitimbral devices using multiple channel numbers were limiting the number of devices that could be put on one cable chain, and (2) intensive use of things like sysex messages, controllers, and aftertouch was reaching the maximum data rate that the MIDI medium can accommodate, resulting in the dreaded "MIDI choke" and erratic, glitchy performances.

Multi-port interfaces created the need to devise a protocol, known to the sequencer software and the computer's MIDI interface, to tell the interface which MIDI out jack to send data to, or to tell the sequencer which MIDI in jack data came from. Sequencer manufacturers began collaborating with the hardware vendors to devise protocols. None of these were standardized and protocols proliferated. As data rates became higher, a new problem appeared: it took too long for data to transition from sequencer software, through the various layers of the computer's operating system, and through the interface hardware, before it was sent over the MIDI cable to the device. Performers began to complain of sloppy timing when executing complex sequences with larger setups. This problem got worse with the advent of USB interfaces on computers; USB is capable of high data rates but tends to have highly variable latency. So the next addition to the sequencer-to-interface protocol was time stamping. This allowed a sequencer application, when playing back a sequence, to "look ahead" and send data to the MIDI interface device with a time stamp saying, "Send this data at time X". When the interface was properly designed and time synchronized with the computer, time stamping improved timing accuracy when playing back sequences.
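
The idea behind time stamping can be sketched as follows (Python; the class and its methods are purely illustrative and do not correspond to any particular vendor's or operating system's API). The sequencer submits events ahead of time, and the interface side releases each one onto the MIDI cable when its stamp comes due.

    # Illustrative sketch of time-stamped MIDI output scheduling.
    # Nothing here corresponds to a real driver API; it only shows the idea.
    import heapq, time

    class TimestampedOutput:
        def __init__(self):
            self._queue = []      # (send_time, midi_bytes) pairs, kept as a heap

        def submit(self, send_time, midi_bytes):
            # The sequencer "looks ahead" and hands events over early,
            # each stamped with the moment it should hit the wire.
            heapq.heappush(self._queue, (send_time, midi_bytes))

        def run(self, send):
            # The interface side: wait until each stamp comes due, then send.
            while self._queue:
                send_time, midi_bytes = heapq.heappop(self._queue)
                delay = send_time - time.monotonic()
                if delay > 0:
                    time.sleep(delay)
                send(midi_bytes)

    out = TimestampedOutput()
    now = time.monotonic()
    out.submit(now + 0.500, bytes([0x90, 60, 100]))  # note-on, due at +500 ms
    out.submit(now + 0.250, bytes([0xB0, 7, 96]))    # volume controller, due at +250 ms
    out.run(lambda msg: print(round(time.monotonic() - now, 3), msg.hex()))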

By the late '90s, most MIDI computer interfaces were being made by one of two manufacturers, MOTU and Opcode, and each had its own standard for the sequencer-to-interface protocol. Apple superseded both of these with the advent of its OS X operating system in 2002, which included the Core MIDI standard interface software. All sequencer and DAW applications now being marketed for the Macintosh platform use the Core MIDI standard to communicate with the hardware.

Today, computer MIDI interfaces are available which range from one-in-one-out basic interfaces, to large rack mount interfaces with eight or more of each. These often also have mechanisms that interface to other synchronization methods such as SMPTE or DIN sync, and provide translation back and forth to MIDI synchronization protocols.

MIDI protocols on other media

A rapidly growing trend is for MIDI devices to interface directly with computers using USB. (Some of the lower end of such devices no longer have traditional MIDI jacks.) Such devices usually conform to the "class compliant" standard published by the USB trade association in 1999, which allows them to interface with most Windows, OS X, or Linux computers without needing a device-specific driver to be loaded. Apple developed a similar standard for FireWire, although it is little used. Yamaha also developed a MIDI-over-FireWire standard, called mLAN. At one time there was great industry interest in mLAN, but the development of faster USB bus standards led computer hardware manufacturers to stop building FireWire interfaces into their products, and interest in mLAN faded as a result. By 2010, no mLAN products remained on the market.

Several standards have been developed for transmitting MIDI over Ethernet. Some proprietary solutions in the past have relied on creating custom transport layers on top of the Internet Protocol. However, support for custom transport layers in most computer operating systems is problematic, often requiring the user to have administrator-level access, as well as needing system calls that are not well standardized. A newer standard called RTP-MIDI transmits MIDI within the Internet-standard Real-time Transport Protocol (RTP). So far the main adopter has been Yamaha, which has implemented the interface on some models of its Motif workstations and some other products.

References

  1. Kirn, Peter (2011). Keyboard Presents the Evolution of Electronic Dance Music. Backbeat Books. ISBN 978-1-61713-446-3. Archived from the original on 1 February 2017. https://books.google.co.uk/books?id=IbtJAgAAQBAJ&pg=PT72&lpg=PT72&dq=%22mark+vail%22+808&source=bl&ots=dOOpEyQGfI&sig=nPF6yAIeQlupw3Pw0Drg6LE34r4&hl=en&sa=X&ved=0ahUKEwir3b7qhsfRAhUFJcAKHfSNCyMQ6AEIHzAB#v=onepage&q=%22mark%20vail%22%20808&f=false
  2. Chadabe, Joel (1 May 2000). "Part IV: The Seeds of the Future". Electronic Musician (Penton Media) XVI (5). http://www.emusician.com/gear/0769/the-electronic-century-part-iv-the-seeds-of-the-future/145415
  3. Holmes, Thom. Electronic and Experimental Music: Pioneers in Technology and Composition. New York: Routledge, 2003.
  4. Huber, David Miles (1991). The MIDI Manual. Carmel, Indiana: SAMS. ISBN 9780672227578.
  5. Manning, Peter. Electronic and Computer Music. 1985. Oxford: Oxford University Press, 1994. Print.
  6. Russ, Martin (2012). Sound Synthesis and Sampling. CRC Press. p. 192. ISBN 1136122141. https://books.google.co.uk/books?id=X9h5AgAAQBAJ&pg=PA192. Retrieved 26 April 2017.
  7. "The life and times of Ikutaro Kakehashi, the Roland pioneer modern music owes everything to". Fact.
  8. "Technical GRAMMY Award: Ikutaro Kakehashi And Dave Smith" (29 January 2013).
  9. "Ikutaro Kakehashi, Dave Smith: Technical GRAMMY Award Acceptance" (9 February 2013).