Example of music created in MIDI format
Using MIDI, a single controller (often a musical keyboard, as pictured here) can play multiple electronic instruments, which increases the portability and flexibility of stage setups. This system fits into a single rack case, but before the advent of MIDI, it would have required four separate full-size keyboard instruments, plus outboard mixing and effects units.
Musical Instrument Digital Interface (/ˈmɪdi/; MIDI) is an American-Japanese technical standard that describes a communication protocol, digital interface, and electrical connectors that connect a wide variety of electronic musical instruments, computers, and related audio devices for playing, editing, and recording music.[1] A single MIDI cable can carry up to sixteen channels of MIDI data, each of which can be routed to a separate device. Each interaction with a key, button, knob or slider is converted into a MIDI event, which specifies musical instructions, such as a note's pitch, timing and velocity. One common MIDI application is to play a MIDI keyboard or other controller and use it to trigger a digital sound module (which contains synthesized musical sounds) to generate sounds, which the audience hears produced by a keyboard amplifier. MIDI data can be transferred via MIDI or USB cable, or recorded to a sequencer or digital audio workstation to be edited or played back.[2]
MIDI also defines a file format that stores and exchanges the data. Advantages of MIDI include small file size, ease of modification and manipulation, and a wide choice of electronic instruments and synthesizer or digitally sampled sounds.[3]: 4 A MIDI recording of a performance on a keyboard could sound like a piano or other keyboard instrument; however, since MIDI records the messages and information about the notes rather than the specific sounds, the recording could be changed to many other sounds, ranging from synthesized or sampled guitar or flute to full orchestra.
Before the development of MIDI, electronic musical instruments from different manufacturers could generally not communicate with each other. This meant that a musician could not, for example, plug a Roland keyboard into a Yamaha synthesizer module. With MIDI, any MIDI-compatible keyboard (or other controller device) can be connected to any other MIDI-compatible sequencer, sound module, drum machine, synthesizer, or computer, even if they are made by different manufacturers.
MIDI technology was standardized in 1983 by a panel of music industry representatives and is maintained by the MIDI Manufacturers Association (MMA). All official MIDI standards are jointly developed and published by the MMA in Los Angeles, and the MIDI Committee of the Association of Musical Electronics Industry (AMEI) in Tokyo. In 2016, the MMA established The MIDI Association (TMA) to support a global community of people who work, play, or create with MIDI.[4]
In the early 1980s, there was no standardized means of synchronizing electronic musical instruments manufactured by different companies.[5] Manufacturers had their own proprietary standards to synchronize instruments, such as CV/gate, DIN sync and Digital Control Bus (DCB).[6] Ikutaro Kakehashi, the president of Roland, felt the lack of standardization was limiting the growth of the electronic music industry.[6] In June 1981, he proposed developing a standard to the Oberheim Electronics founder Tom Oberheim,[5] who had developed his own proprietary interface, the Oberheim Parallel Bus.[7]
Kakehashi felt that Oberheim's system was too cumbersome, and spoke to Dave Smith, the president of Sequential Circuits, about creating a simpler, cheaper alternative.[7] While Smith discussed the concept with American companies, Kakehashi discussed it with Japanese companies Yamaha, Korg and Kawai.[5] Representatives from all companies met to discuss the idea in October.[5] Initially, only Sequential Circuits and the Japanese companies were interested.[8]
Dave Smith (right), one of the creators of MIDI
Using Roland's DCB as a basis,[6] Smith and Sequential Circuits engineer Chet Wood devised a universal interface to allow communication between equipment from different manufacturers. Smith and Wood proposed this standard in a paper, Universal Synthesizer Interface,[9] at the Audio Engineering Society show in October 1981.[10][11]: 4 The standard was discussed and modified by representatives of Roland, Yamaha, Korg, Kawai, and Sequential Circuits.[5][12]: 20 Kakehashi favored the name Universal Musical Interface (UMI), pronounced you-me,[7] but Smith felt this was "a little corny".[13] However, he liked the use of instrument instead of synthesizer, and proposed Musical Instrument Digital Interface (MIDI).[13][11]: 4 Robert Moog, the president of Moog Music, announced MIDI in the October 1982 issue of Keyboard.[14]: 276
At the 1983 Winter NAMM Show, Smith demonstrated a MIDI connection between Prophet 600 and Roland JP-6 synthesizers. The MIDI specification was published in August 1983.[5] The MIDI standard was unveiled by Kakehashi and Smith, who received Technical Grammy Awards in 2013 for their work.[15][16][17] In 1983, the first instruments were released with MIDI, the Roland Jupiter-6 and the Prophet 600. The same year, the first MIDI drum machine, the Roland TR-909,[18][19] and the first MIDI sequencer, the Roland MSQ-700, were released.[20]
The MIDI Manufacturers Association (MMA) was formed following a meeting of "all interested companies" at the 1984 Summer NAMM Show in Chicago. The MIDI 1.0 Detailed Specification was published at the MMA's second meeting at the 1985 Summer NAMM Show. The standard continued to evolve, adding standardized song files in 1991 (General MIDI) and adapting to new connection standards such as USB and FireWire. In 2016, the MIDI Association was formed to continue overseeing the standard.[8] In 2017, an abridged version of MIDI 1.0 was published as an international standard, IEC 63035.[21] An initiative to create a 2.0 standard was announced in January 2019.[22] The MIDI 2.0 standard was introduced at the 2020 Winter NAMM Show.[23]
The BBC cited MIDI as an early example of open-source technology. Smith believed MIDI could only succeed if every manufacturer adopted it, and so "we had to give it away".[24]
MIDI's appeal was originally limited to professional musicians and record producers who wanted to use electronic instruments in the production of popular music. The standard allowed different instruments to communicate with each other and with computers, and this spurred a rapid expansion of the sales and production of electronic instruments and music software.[12]: 21 This interoperability allowed one device to be controlled from another, which reduced the amount of hardware musicians needed.[25] MIDI's introduction coincided with the dawn of the personal computer era and the introduction of samplers and digital synthesizers.[26] The creative possibilities brought about by MIDI technology are credited for helping revive the music industry in the 1980s.[27]
MIDI introduced capabilities that transformed the way many musicians work. MIDI sequencing makes it possible for a user with no notation skills to build complex arrangements.[28] A musical act with as few as one or two members, each operating multiple MIDI-enabled devices, can deliver a performance similar to that of a larger group of musicians.[29] The expense of hiring outside musicians for a project can be reduced or eliminated,[2]: 7 and complex productions can be realized on a system as small as a synthesizer with integrated keyboard and sequencer.
MIDI also helped establish home recording. By performing preproduction in a home environment, an artist can reduce recording costs by arriving at a recording studio with a partially completed song.[2]: 7–8 In 2022, the Guardian wrote that MIDI remained as important to music as USB was to computing, and represented "a crucial value system of cooperation and mutual benefit, one all but thrown out by today's major tech companies in favour of captive markets". In 2005, Smith's MIDI Specification was inducted into the TECnology Hall of Fame, an honor given to "products and innovations that have had an enduring impact on the development of audio technology."[30] As of 2022, Smith's original MIDI design was still in use.[31]
MIDI was invented so that electronic or digital musical instruments could communicate with each other and so that one instrument can control another. For example, a MIDI-compatible sequencer can trigger beats produced by a drum sound module. Analog synthesizers that have no digital component and were built prior to MIDI's development can be retrofitted with kits that convert MIDI messages into analog control voltages.[14]: 277 When a note is played on a MIDI instrument, it generates a digital MIDI message that can be used to trigger a note on another instrument.[2]: 20 The capability for remote control allows full-sized instruments to be replaced with smaller sound modules, and allows musicians to combine instruments to achieve a fuller sound, or to create combinations of synthesized instrument sounds, such as acoustic piano and strings.[32] MIDI also enables other instrument parameters (volume, effects, etc.) to be controlled remotely.
Synthesizers and samplers contain various tools for shaping an electronic or digital sound. Filters adjust timbre, and envelopes automate the way a sound evolves over time after a note is triggered.[33] The frequency of a filter and the envelope attack (the time it takes for a sound to reach its maximum level) are examples of synthesizer parameters, and can be controlled remotely through MIDI. Effects devices have different parameters, such as delay feedback or reverb time. When a MIDI continuous controller number (CCN) is assigned to one of these parameters, the device responds to any messages it receives that are identified by that number. Controls such as knobs, switches, and pedals can be used to send these messages. A set of adjusted parameters can be saved to a device's internal memory as a patch, and these patches can be remotely selected by MIDI program changes.[a][34]
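As an illustration of what such control change and program change messages look like on the wire, here is a minimal Python sketch that builds the raw bytes. The controller number 74 and the patch number are illustrative placeholders, not fixed assignments of any particular device.

```python
# A minimal sketch of building MIDI Control Change and Program Change
# messages as raw bytes. Controller 74 and patch 12 are placeholders;
# actual parameter mappings depend on the receiving device.

def control_change(channel: int, controller: int, value: int) -> bytes:
    """Build a 3-byte Control Change message (channel 0-15, data 0-127)."""
    assert 0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127
    status = 0xB0 | channel          # 0xB0 = Control Change status nibble
    return bytes([status, controller, value])

def program_change(channel: int, program: int) -> bytes:
    """Build a 2-byte Program Change message to select a stored patch."""
    assert 0 <= channel <= 15 and 0 <= program <= 127
    return bytes([0xC0 | channel, program])

# Sweep a hypothetical controller, then recall patch 12 on channel 1:
messages = [control_change(0, 74, v) for v in range(0, 128, 8)]
messages.append(program_change(0, 12))
```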
MIDI events can be sequenced with computer software, or in specialized hardware music workstations. Many digital audio workstations (DAWs) are specifically designed to work with MIDI as an integral component. MIDI piano rolls have been developed in many DAWs so that the recorded MIDI messages can be easily modified.[35] These tools allow composers to audition and edit their work much more quickly and efficiently than older methods, such as multitrack recording, did. Compositions can be programmed for MIDI that are impossible for human performers to play.[36]
Because a MIDI performance is a sequence of commands that create sound, MIDI recordings can be manipulated in ways that audio recordings cannot. It is possible to change the key, instrumentation or tempo of a MIDI arrangement,[37]: 227 and to reorder its individual sections,[38] or even edit individual notes. The ability to compose ideas and quickly hear them played back enables composers to experiment.[39]: 175
Algorithmic composition programs provide computer-generated performances that can be used as song ideas or accompaniment.[2]: 122
Some composers may take advantage of the standard, portable set of commands and parameters in MIDI 1.0 and General MIDI (GM) to share musical data files among various electronic instruments. The data composed via the sequenced MIDI recordings can be saved as a standard MIDI file (SMF), digitally distributed, and reproduced by any computer or electronic instrument that also adheres to the same MIDI, GM, and SMF standards. MIDI data files are much smaller than corresponding recorded audio files.
The personal computer market stabilized at the same time that MIDI appeared, and computers became a viable option for music production.[14]: 324 In 1983 computers started to play a role in mainstream music production.[40] In the years immediately after the 1983 ratification of the MIDI specification, MIDI features were adapted to several early computer platforms. The Yamaha CX5M introduced MIDI support and sequencing in an MSX system in 1984.[41]
The spread of MIDI on home computers was largely facilitated by Roland Corporation's MPU-401, released in 1984, as the first MIDI-equipped sound card, capable of MIDI sound processing[42] and sequencing.[43][44] After Roland sold MPU sound chips to other sound card manufacturers,[42] it established a universal standard MIDI-to-PC interface.[45] The widespread adoption of MIDI led to computer-based MIDI software being developed.[40] Soon after, a number of platforms began supporting MIDI, including the Apple II, Macintosh, Commodore 64, Amiga, Acorn Archimedes, and IBM PC compatibles.[14]: 325–7 The 1985 Atari ST shipped with MIDI ports as part of the base system.
In 2015, Retro Innovations released the first MIDI interface for a VIC-20, making the computer's four voices available to electronic musicians and retro-computing enthusiasts for the first time.[46] Retro Innovations also makes a MIDI interface cartridge for Tandy Color Computer and Dragon computers.[47]
MIDI files contain sound events such as a finger striking a key, which can be visualized using software such as Synthesia.
A MIDI file is not an audio recording. Rather, it is a set of instructions – for example, for pitch or tempo – and can use a thousand times less disk space than the equivalent recorded audio.[52][53] Due to their tiny file size, fan-made MIDI arrangements became an attractive way to share music online, before the advent of broadband internet access and multi-gigabyte hard drives.[54] The major drawback to this is the wide variation in quality of users' audio cards, and in the actual audio contained as samples or synthesized sound in the card that the MIDI data only refers to symbolically. Even a sound card that contains high-quality sampled sounds can have inconsistent quality from one sampled instrument to another.[52] Early budget-priced cards, such as the AdLib and the Sound Blaster and its compatibles, used a stripped-down version of Yamaha's frequency modulation synthesis (FM synthesis) technology[55] played back through low-quality digital-to-analog converters. The low-fidelity reproduction[52] of these ubiquitous[55] cards was often assumed to somehow be a property of MIDI itself. This created a perception of MIDI as low-quality audio, while in reality MIDI itself contains no sound,[56] and the quality of its playback depends entirely on the quality of the sound-producing device.[37]: 227
The Standard MIDI File (SMF) is a file format that provides a standardized way for music sequences to be saved, transported, and opened in other systems. The standard was developed and is maintained by the MMA, and usually uses a .mid extension.[57] The compact size of these files led to their widespread use in computers, mobile phone ringtones, webpage authoring and musical greeting cards. These files are intended for universal use and include such information as note values, timing and track names. Lyrics may be included as metadata, and can be displayed by karaoke machines.[58]
SMFs are created as an export format of software sequencers or hardware workstations. They organize MIDI messages into one or more parallel tracks and time-stamp the events so that they can be played back in sequence. A header contains the arrangement's track count, tempo and an indicator of which of three SMF formats the file uses. A type 0 file contains the entire performance, merged onto a single track, while type 1 files may contain any number of tracks that are performed synchronously. Type 2 files are rarely used[59] and store multiple arrangements, with each arrangement having its own track and intended to be played in sequence.
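The SMF header described here has a fixed 14-byte layout, which makes it easy to inspect programmatically. The following Python sketch, using a placeholder filename, reads the chunk tag, format type, track count, and timing division.

```python
import struct

# A minimal sketch of reading the 14-byte Standard MIDI File header:
# a 4-byte "MThd" tag, a 32-bit chunk length (always 6), then three
# big-endian 16-bit values (format, track count, division).

def read_smf_header(data: bytes):
    tag, length, fmt, ntracks, division = struct.unpack(">4sIHHH", data[:14])
    assert tag == b"MThd" and length == 6, "not a Standard MIDI File"
    # fmt: 0 = single merged track, 1 = synchronous tracks, 2 = sequential
    return fmt, ntracks, division   # division: usually ticks per quarter note

with open("song.mid", "rb") as f:   # "song.mid" is a placeholder filename
    fmt, ntracks, division = read_smf_header(f.read(14))
    print(f"format {fmt}, {ntracks} track(s), {division} ticks/quarter")
```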
The main advantage of the personal computer in a MIDI system is that it can serve a number of different purposes, depending on the software that is loaded.[2]: 55 Multitasking allows simultaneous operation of programs that may be able to share data with each other.[2]: 65
Sequencing software can be used to manipulate recorded MIDI data with standard computer editing features such as cut, copy and paste and drag and drop. Keyboard shortcuts can be used to streamline workflow, and, in some systems, editing functions may be invoked by MIDI events. The sequencer can set each channel to play a different sound and gives a graphical overview of the arrangement. A variety of editing tools are made available, including a notation display or scorewriter that can be used to create printed parts for musicians. Tools such as looping, quantization, randomization, and transposition simplify the arranging process.
Beat creation is simplified, and groove templates can be used to duplicate another track's rhythmic feel. Realistic expression can be added through the manipulation of real-time controllers. Mixing can be performed, and MIDI can be synchronized with recorded audio and video tracks. Work can be saved, and transported between different computers or studios.[61][62]: 164–6
Sequencers may take alternate forms, such as drum pattern editors with which users create beats by clicking on pattern grids,[2]: 118 and loop sequencers such as ACID Pro, which combine MIDI with prerecorded audio loops whose tempos and keys are matched to each other. Cue-list sequencing is used to trigger dialogue, sound effect, and music cues in stage and broadcast production.[2]: 121
With MIDI, notes played on a keyboard can automatically be transcribed to sheet music.[12]: 213 Scorewriting software typically lacks advanced sequencing tools and is optimized for the creation of a neat, professional printout designed for live instrumentalists.[62]: 157 These programs provide support for dynamics and expression markings, chord and lyric display, and complex score styles.[62]: 167 Software is available that can print scores in braille.[63]
Patch editors allow users to program their equipment through a computer interface. These became essential with the appearance of complex synthesizers such as the Yamaha FS1R,[65] which contained several thousand programmable parameters but had an interface that consisted of fifteen tiny buttons, four knobs and a small LCD.[66] Digital instruments typically discourage users from experimentation, due to their lack of the feedback and direct control that switches and knobs would provide,[67]: 393 but patch editors give owners of hardware instruments and effects devices the same editing functionality that is available to users of software synthesizers.[68] Some editors are designed for a specific instrument or effects device, while other, universal editors support a variety of equipment, and ideally can control the parameters of every device in a setup through the use of System Exclusive messages.[2]: 129 System Exclusive messages use the MIDI protocol to send information about the synthesizer's parameters.
Patch librarians have the specialized function of organizing the sounds in a collection of equipment and exchanging entire banks of sounds between an instrument and a computer. In this way the device's limited patch storage is augmented by a computer's much greater disk capacity.[2]: 133 Once transferred to the computer, custom patches can be shared with other owners of the same instrument.[69] Universal editor/librarians that combine the two functions were once common, and included Opcode Systems' Galaxy, eMagic's SoundDiver, and MOTU's Unisyn. Although these older programs have been largely abandoned with the trend toward computer-based synthesis using virtual instruments, several editor/librarians remain available, including Coffeeshopped Patch Base,[70] Sound Quest's Midi Quest, and several editors from Sound Tower. Native Instruments' Kore was an effort to bring the editor/librarian concept into the age of software instruments,[71] but was abandoned in 2011.[72]
Programs that can dynamically generate accompaniment tracks are called auto-accompaniment programs. These create a full-band arrangement in a style that the user selects and send the result to a MIDI sound-generating device for playback. The generated tracks can be used as educational or practice tools, as accompaniment for live performances, or as a songwriting aid.[73]: 42
Computers can use software to generate sounds, which are then passed through a digital-to-analog converter (DAC) to a power amplifier and loudspeaker system.[12]: 213 The number of sounds that can be played simultaneously (the polyphony) is dependent on the power of the computer's CPU, as are the sample rate and bit depth of playback, which directly affect the quality of the sound.[74] Synthesizers implemented in software are subject to timing issues that are not necessarily present with hardware instruments, whose dedicated operating systems are not subject to interruption from background tasks as desktop operating systems are. These timing issues can cause synchronization problems, and clicks and pops when sample playback is interrupted. Software synthesizers also may exhibit additional latency in their sound generation.[75]
The roots of software synthesis go back as far as the 1950s, when Max Mathews of Bell Labs wrote the MUSIC-N programming language, which was capable of non-real-time sound generation.[76] Reality, by Dave Smith's Seer Systems, was an early synthesizer that ran directly on a host computer's CPU. Reality achieved a low latency through tight driver integration, and therefore could run only on Creative Labs soundcards.[77][78] Syntauri Corporation's Alpha Syntauri was another early software-based synthesizer. It ran on the Apple IIe computer and used a combination of software and the computer's hardware to produce additive synthesis.[79] Some systems use dedicated hardware to reduce the load on the host CPU, as with Symbolic Sound Corporation's Kyma System,[76] and the Creamware/Sonic Core Pulsar/SCOPE systems,[80] which power an entire recording studio's worth of instruments, effect units, and mixers.[81] The ability to construct full MIDI arrangements entirely in computer software allows a composer to render a finalized result directly as an audio file.[32]
Early PC games were distributed on floppy disks, and the small size of MIDI files made them a viable means of providing soundtracks. Games of the DOS and early Windows eras typically required compatibility with either Ad Lib or Sound Blaster audio cards. These cards used FM synthesis, which generates sound through modulation of sine waves. John Chowning, the technique's pioneer, theorized that the technology would be capable of accurate recreation of any sound if enough sine waves were used, but budget computer audio cards performed FM synthesis with only two sine waves. Combined with the cards' 8-bit audio, this resulted in a sound described as "artificial"[82] and "primitive".[83]
Wavetable daughterboards that were later available provided audio samples that could be used in place of the FM sound. These were expensive, but often used the sounds from respected MIDI instruments such as the E-mu Proteus.[83] The computer industry moved in the mid-1990s toward wavetable-based soundcards with 16-bit playback, but standardized on 2 MB of wavetable storage, a space too small for good-quality samples of 128 General MIDI instruments plus drum kits. To make the most of the limited space, some manufacturers stored 12-bit samples and expanded them to 16 bits on playback.[84]
Despite its association with music devices, MIDI can control any electronic or digital device that can read and process a MIDI command. MIDI has been adopted as a control protocol in a number of non-musical applications. MIDI Show Control uses MIDI commands to direct stage lighting systems and to trigger cued events in theatrical productions. VJs and turntablists use it to cue clips and to synchronize equipment, and recording systems use it for synchronization and automation. Wayne Lytle, the founder of Animusic, devised a system he dubbed MIDIMotion to produce the Animusic series of computer-animated music video albums; Animusic would later design its own animation software specifically for MIDIMotion, called Animotion.[85] Apple Motion allows similar control of animation parameters through MIDI. The 1987 first-person shooter game MIDI Maze and the 1990 Atari ST computer puzzle game Oxyd used MIDI to network computers together.
Per the original MIDI 1.0 standard, cables terminate in a 180° five-pin DIN connector (DIN 41524). Typical applications use only three of the five conductors: a ground wire (pin 2), and a balanced pair of conductors (pins 4 and 5) that carry the MIDI signal as an electric current.[86][73]: 41 This connector configuration can only carry messages in one direction, so a second cable is necessary for two-way communication.[2]: 13 Some proprietary applications, such as phantom-powered footswitch controllers, use the spare pins for direct current (DC) power transmission.[87]
Opto-isolators keep MIDI devices electrically separated from their MIDI connections, which prevents ground loops[88]: 63 and protects equipment from voltage spikes.[14]: 277 There is no error detection capability in MIDI, so the maximum cable length is set at 15 meters (49 ft) to limit interference.[89]
To save space, some MIDI devices (smaller ones in particular) started using 3.5 mm TRS phone connectors (also known as audio minijack connectors).[90] This became widespread enough that the MIDI Manufacturers' Association standardized the wiring.[91] The MIDI-over-minijack standards document also recommends the use of 2.5 mm connectors over 3.5 mm ones to avoid confusion with audio connectors.[92]
Most devices do not copy messages from their input to their output port. A third type of port, the thru port, emits a copy of everything received at the input port, allowing data to be forwarded to another instrument[14]: 278 in a daisy-chain arrangement.[93] Not all devices feature thru ports, and devices that lack the ability to generate MIDI data, such as effects units and sound modules, may not include out ports.[67]: 384
Each device in a daisy chain adds delay to the system. This can be avoided by using a MIDI thru box, which contains several outputs that provide an exact copy of the box's input signal. A MIDI merger is able to combine the input from multiple devices into a single stream, and allows multiple controllers to be connected to a single device. A MIDI switcher allows switching between multiple devices, and eliminates the need to physically repatch cables. MIDI routers combine all of these functions. They contain multiple inputs and outputs, and allow any combination of input channels to be routed to any combination of output channels. Routing setups can be created using computer software, stored in memory, and selected by MIDI program change commands.[2]: 47–50 This enables the devices to function as standalone MIDI routers in situations where no computer is present.[2]: 62–3 [94] MIDI data processors are used for utility tasks and special effects. These include MIDI filters, which remove unwanted MIDI data from the stream, and MIDI delays, effects that send a repeated copy of the input data at a set time.[2]: 51
A computer MIDI interface's main function is to synchronize communications between the MIDI device and the computer.[93] Some computer sound cards include a standard MIDI connector, whereas others connect by any of various means that include the D-subminiature DA-15 game port, USB, FireWire, Ethernet or a proprietary connection. The increasing use of USB connectors in the 2000s has led to the availability of MIDI-to-USB data interfaces that can transfer MIDI channels to USB-equipped computers. Some MIDI keyboard controllers are equipped with USB jacks, and can be connected directly to computers that run music software.
MIDI's serial transmission leads to timing problems. A three-byte MIDI message requires nearly 1 millisecond for transmission.[95] Because MIDI is serial, it can only send one event at a time. If an event is sent on two channels at once, the event on the second channel cannot transmit until the first one is finished, and so is delayed by 1 ms. If an event is sent on all channels at the same time, the last channel's transmission is delayed by as much as 16 ms. This contributed to the rise of MIDI interfaces with multiple in- and out-ports, because timing improves when events are spread between multiple ports as opposed to multiple channels on the same port.[75] The term MIDI slop refers to audible timing errors that result when MIDI transmission is delayed.[96]
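These figures follow directly from the link parameters: with 8-N-1 framing, each byte occupies ten bit-times on the wire. A quick Python check reproduces the delays quoted above.

```python
# Back-of-the-envelope check of the timing figures above. With 8-N-1
# framing, each byte costs 10 bits on the wire (start + 8 data + stop).

BAUD = 31250                 # MIDI 1.0 transmission rate, bits per second
BITS_PER_BYTE = 10           # 1 start bit + 8 data bits + 1 stop bit

def transmission_ms(num_bytes: int) -> float:
    return num_bytes * BITS_PER_BYTE / BAUD * 1000

print(transmission_ms(3))        # one 3-byte message: 0.96 ms
print(transmission_ms(3 * 16))   # same message on all 16 channels: 15.36 ms
```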
Smaller MIDI controllers are popular due to their portability. This two-octave unit provides a variety of controls for manipulating various sound design parameters of computer-based or standalone hardware instruments, effects, mixers and recording devices.
There are two types of MIDI controllers: performance controllers that generate notes and are used to perform music,[97] and controllers that may not send notes, but transmit other types of real-time events. Many devices are a combination of the two types.
Keyboards are by far the most common type of MIDI controller.[69] MIDI was designed with keyboards in mind, and any controller that is not a keyboard is considered an "alternative" controller.[98] This was seen as a limitation by composers who were not interested in keyboard-based music, but the standard proved flexible, and MIDI compatibility was introduced to other types of controllers, including guitars and other stringed instruments, drum controllers and wind controllers, which emulate the playing of drum kits and wind instruments respectively, and specialized and experimental controllers.[12]: 23 Nevertheless, some features of the keyboard playing for which MIDI was designed do not fully capture other instruments' capabilities; Jaron Lanier cites the standard as an example of technological "lock-in" that unexpectedly limited what was possible to express.[99] Some of these shortcomings have been addressed in extensions to the protocol.
Software synthesizers offer great power and versatility, but some players feel that division of attention between a MIDI keyboard and a computer keyboard and mouse robs some of the immediacy from the playing experience.[100] Devices dedicated to real-time MIDI control provide an ergonomic benefit and can provide a greater sense of connection with the instrument than an interface that is accessed through a computer. Controllers may be general-purpose devices that are designed to work with a variety of equipment, or they may be designed to work with a specific piece of software. Examples of the latter include Akai's APC40 controller for Ableton Live, and Korg's MS-20ic controller, a reproduction of the control panel on their MS-20 analog synthesizer. The MS-20ic controller includes patch cables that can be used to control signal routing in their virtual reproduction of the MS-20 synthesizer and can also control third-party devices.[101]
A sound module, which requires an external controller (e.g., a MIDI keyboard) to trigger its sounds. These devices are highly portable, but their limited programming interface requires computer-based tools for comfortable access to their sound parameters.
A MIDI instrument contains ports to send and receive MIDI signals, a CPU to process those signals, an interface for user programming, audio circuitry to generate sound, and controllers. The operating system and factory sounds are often stored in a read-only memory (ROM) unit.[2]: 67–70
A MIDI instrument can also be a stand-alone module (without a piano-style keyboard) consisting of a General MIDI sound board (GM, GS and XG) and onboard editing features, including transposing, MIDI instrument selection, and adjustment of volume, pan, reverb levels and other MIDI controllers. Typically, the MIDI module includes a screen, so the user can view information for the currently selected function.
Synthesizers may employ any of a variety of sound generation techniques. They may include an integrated keyboard or may exist as sound modules that generate sounds when triggered by an external controller, such as a MIDI keyboard. Sound modules are typically designed to be mounted in a 19-inch rack.[2]: 70–72 Manufacturers commonly produce a synthesizer in both standalone and rack-mounted versions, and often offer the keyboard version in a variety of sizes.
A sampler can record and digitize audio, store it in random-access memory (RAM), and play it back. With a sampler, users typically can edit a sample and save it to a hard disk, apply effects to it, and shape it with the same tools that subtractive synthesizers use. They also may be available in either keyboard or rack-mounted form.[2]: 74–8 Instruments that generate sounds through sample playback, but have no recording capabilities, are known as "ROMplers".
Samplers did not become established as viable MIDI instruments as quickly as synthesizers did, due to the expense of memory and processing power at the time.[14]: 295 The first low-cost MIDI sampler was the Ensoniq Mirage, introduced in 1984.[14]: 304 MIDI samplers are typically limited by displays that are too small to use to edit sampled waveforms, although some can be connected to a computer monitor.[14]: 305
Drum machines typically are sample playback devices that specialize in drum and percussion sounds. They commonly contain a sequencer for creating drum patterns and arranging them into a song. There often are multiple audio outputs so that each sound or group of sounds can be routed to a separate output. The individual drum voices may be playable from another MIDI instrument or from a sequencer.[2]: 84
Yamaha's Tenori-on controller allows arrangements to be built by "drawing" on its array of lighted buttons. The resulting arrangements can be played back using its internal sounds or external sound sources, or recorded in a computer-based sequencer.
Sequencer technology predates MIDI. Analog sequencers use CV/Gate signals to control pre-MIDI analog synthesizers. MIDI sequencers typically are operated by transport features modeled after those of tape decks. They are capable of recording MIDI performances and arranging them into individual tracks using a multitrack recording paradigm. Music workstations combine controller keyboards with an internal sound generator and a sequencer. These can be used to build complete arrangements and play them back using their own internal sounds and function as self-contained music production studios. They commonly include file storage and transfer capabilities.[2]: 103–4
Some effects units can be remotely controlled via MIDI. For example, the Eventide H3000 Ultra-harmonizer allows such extensive MIDI control that it is playable as a synthesizer.[14]: 322 The Drum Buddy, a pedal-format drum machine, has a MIDI connection so that it can have its tempo synchronized with a looper pedal or time-based effects such as delay.
MIDI messages are made up of 8-bit bytes transmitted at 31,250[b] (±1%) baud using 8-N-1 asynchronous serial communication. The first bit of each byte identifies whether the byte is a status byte or a data byte, and is followed by seven bits of information.[2]: 13–14
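A short Python sketch of this status/data distinction: testing the high bit is all a receiver needs to find message boundaries in an incoming byte stream.

```python
# A minimal sketch of the status/data distinction: the high bit of each
# byte tells a receiver whether it starts a new message or carries data.

def classify(byte: int) -> str:
    if byte & 0x80:                      # high bit set: status byte
        message_type = byte & 0xF0       # upper nibble: message type
        channel = byte & 0x0F            # lower nibble: channel (voice msgs)
        return f"status (type 0x{message_type:02X}, channel {channel + 1})"
    return f"data ({byte})"              # high bit clear: 7-bit data value

for b in (0x90, 60, 100):                # note-on, middle C, velocity 100
    print(classify(b))
```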
A MIDI link can carry sixteen independent channels, numbered 1–16. A device may listen to specific channels and ignore messages on other channels (omni off mode), or it can listen to all channels, effectively ignoring the channel address (omni on).
A device that is polyphonic can sound multiple notes simultaneously, until the device's polyphony limit is reached, or the notes reach the end of their decay envelope, or explicit note-off MIDI commands are received. A device that is monophonic instead terminates any previous note when new note-on commands arrive.
Some receiving devices may be set to all four combinations of omni off/on and mono/poly modes.[2]: 14–18
A MIDI message is an instruction that controls some aspect of the receiving device. A MIDI message consists of a status byte, which indicates the type of the message, followed by up to two data bytes that contain the parameters.[37] MIDI messages can be channel messages sent on only one of the 16 channels and monitored only by devices on that channel, or system messages that all devices receive. Each receiving device ignores data not relevant to its function.[67]: 384 There are five types of message: Channel Voice, Channel Mode, System Common, System Real-Time, and System Exclusive.[103]
Channel Voice messages transmit real-time performance data over a single channel. Examples include note-on messages, which contain a MIDI note number that specifies the note's pitch, a velocity value that indicates how forcefully the note was played, and the channel number; note-off messages that end a note; program change messages that change a device's patch; and control changes that allow adjustment of an instrument's parameters. MIDI notes are numbered from 0 to 127, assigned to C−1 to G9. This extends beyond the 88-note piano range, from A0 to C8, and corresponds to a frequency range of 8.175799 to 12543.85 Hz.[c]
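This numbering implies the standard equal-temperament conversion, anchored at note 69 = 440 Hz (see the General MIDI discussion below); a small Python sketch reproduces the frequency range quoted above.

```python
# Note-number-to-frequency conversion under equal temperament, anchored
# at MIDI note 69 = A440: each semitone multiplies the frequency by 2^(1/12).

def note_to_hz(note: int) -> float:
    return 440.0 * 2 ** ((note - 69) / 12)

print(note_to_hz(0))     # 8.1757989... Hz (C-1, the lowest MIDI note)
print(note_to_hz(60))    # 261.6255... Hz (middle C)
print(note_to_hz(127))   # 12543.85... Hz (G9, the highest MIDI note)
```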
System Exclusive (SysEx) messages send information about a synthesizer's functions, rather than performance data such as which notes are being played and how loud. Because they can include functionality beyond what the MIDI standard provides, they are a major reason for the flexibility and longevity of the MIDI standard. Manufacturers use them to create proprietary messages that control their equipment more thoroughly than standard MIDI messages allow.[14]: 287
The MIDI Manufacturers Association issues a unique identification number to MIDI companies.[104] These are included in SysEx messages, to ensure that only the specifically addressed device responds to the message, while all others know to ignore it. Many instruments also include a SysEx ID setting, so a controller can address two devices of the same model independently.[105]
Universal System Exclusive messages are a special class of SysEx messages used for extensions to MIDI that are not intended to be exclusive to one manufacturer.[106]
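As a rough sketch of the framing involved, the following Python snippet wraps a payload in the SysEx start (0xF0) and end (0xF7) bytes after a manufacturer ID. The ID 0x7D (reserved for non-commercial/test use) and the payload bytes are placeholders, not any real manufacturer's format.

```python
# A minimal sketch of SysEx framing. The 0xF0/0xF7 delimiters and the
# leading manufacturer ID come from the standard; the payload below is
# a made-up placeholder, not a real manufacturer's message format.

def sysex(manufacturer_id: int, payload: bytes) -> bytes:
    assert all(b < 0x80 for b in payload), "SysEx data bytes must be 7-bit"
    return bytes([0xF0, manufacturer_id]) + payload + bytes([0xF7])

msg = sysex(0x7D, bytes([0x01, 0x20, 0x55]))   # 0x7D is reserved for
print(msg.hex(" "))                            # non-commercial/test use
```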
Devices typically do not respond to every type of message defined by the MIDI specification. The MIDI implementation chart was standardized by the MMA as a way for users to see what specific capabilities an instrument has, and how it responds to messages.[2]: 231 A populated MIDI implementation chart is usually published as part of the documentation for MIDI devices.
To transmit a logic 0 and a start bit, the sender's UART[f] produces a low voltage. This results in a nominal 5 milliampere[102] current flow sourced from the sender's high voltage supply,[g] which travels rightwards along the red lines through the shielded[h] twisted-pair cable and into the receiver's opto-isolator. The current exits the opto-isolator and returns leftwards along the blue lines into the sender's UART, which sinks the current.[i] Resistors R1 and R2 limit the current and are equal to provide a balanced impedance. The diode is for protection.[110] This current turns on the opto-isolator's[j] LED and phototransistor, so the receiver's UART can read the signal with the help of pull-up resistor R3 to the receiver's voltage supply. While the supplies in the original specification are 5 volts, the receiver and sender may use different voltage levels.
To transmit a logic 1, a stop bit, and while idle, the sender's UART produces the same high voltage as its voltage supply provides, which results in no current flow. This avoids wasting power when idle.
MIDI's flexibility and widespread adoption have led to many refinements of the standard, and have enabled its application to purposes beyond those for which it was originally intended.
General MIDI's Percussion Key Map specifies the percussion sound that a given note triggers. MIDI note numbers are shown in parentheses next to their corresponding keyboard notes.
MIDI allows the selection of an instrument's sounds through program change messages, but there is no guarantee that any two instruments have the same sound at a given program location.[111] Program #0 may be a piano on one instrument, or a flute on another. The General MIDI (GM) standard was established in 1991, and provides a standardized sound bank that allows a Standard MIDI File created on one device to sound similar when played back on another. GM specifies a bank of 128 sounds arranged into 16 families of eight related instruments, and assigns a specific program number to each instrument.[112] Any given program change selects the same instrument sound on any GM-compatible instrument.[113] Percussion instruments are placed on channel 10, and a specific MIDI note value is mapped to each percussion sound.
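The 16-families-of-8 layout means a program number's family can be found by simple integer division, as this small Python sketch shows (the family names follow the GM specification; the example program numbers are 0-indexed).

```python
# Sketch of GM's program-number layout: 128 programs in 16 families of 8,
# so integer division by 8 maps a program number to its family.

GM_FAMILIES = [
    "Piano", "Chromatic Percussion", "Organ", "Guitar", "Bass", "Strings",
    "Ensemble", "Brass", "Reed", "Pipe", "Synth Lead", "Synth Pad",
    "Synth Effects", "Ethnic", "Percussive", "Sound Effects",
]

def family(program: int) -> str:   # program: 0-127
    return GM_FAMILIES[program // 8]

print(family(0))    # "Piano"
print(family(73))   # "Pipe" (program 73, 0-indexed, is Flute in GM)
```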
The GM standard eliminates variation in note mapping. Some manufacturers had disagreed over what note number should represent middle C, but GM specifies that note number 69 plays A440, which in turn fixes middle C as note number 60.
GM-compliant devices must offer 24-note polyphony.[114] GM-compatible devices are required to respond to velocity, aftertouch, and pitch bend, to be set to specified default values at startup, and to support certain controller numbers such as for the sustain pedal, and Registered Parameter Numbers (RPNs).[115]
A simplified version of GM, called GM Lite, is used for devices with limited processing power.[111][116]
A general opinion quickly formed that GM's 128-instrument sound set was not large enough. Roland's General Standard, or Roland GS, included additional sounds, drumkits and effects, provided a bank select command that could be used to access them, and used MIDI Non-Registered Parameter Numbers (NRPNs) to access its new features. Yamaha's Extended General MIDI, or Yamaha XG, followed in 1994. XG similarly offered extra sounds, drumkits and effects, but used standard controllers instead of NRPNs for editing, and increased polyphony to 32 voices. Both standards feature backward compatibility with the GM specification but are not compatible with each other.[117] Neither standard has been adopted beyond its creator, but both are commonly supported by music software titles.
Member companies of Japan's AMEI developed the General MIDI Level 2 specification in 1999. GM2 maintains backward compatibility with GM, but increases polyphony to 32 voices, standardizes several controller numbers such as for sostenuto and soft pedal (una corda), RPNs and Universal System Exclusive Messages, and incorporates the MIDI Tuning Standard.[118] GM2 is the basis of the instrument selection mechanism in Scalable Polyphony MIDI (SP-MIDI), a MIDI variant for low-power devices that allows the device's polyphony to scale according to its processing power.[111]
Most MIDI synthesizers use equal temperament tuning. The MIDI tuning standard (MTS), ratified in 1992, allows alternate tunings.[119] MTS allows microtunings that can be loaded from a bank of up to 128 patches, and allows real-time adjustment of note pitches.[120] Manufacturers are not required to support the standard. Those who do are not required to implement all of its features.[119]
A sequencer can drive a MIDI system with its internal clock, but when a system contains multiple sequencers, they must synchronize to a common clock. MIDI timecode (MTC), developed by Digidesign,[121] implements SysEx messages[122] developed specifically for timing purposes, and can translate to and from the SMPTE timecode standard.[14]: 288 MIDI interfaces such as Mark of the Unicorn's MIDI Timepiece can convert SMPTE code to MTC.[123] While MIDI clock is based on tempo, timecode is based on frames and is independent of tempo. MTC, like SMPTE timecode, includes position information and can recover in the event of a dropout.[124]
MIDI Machine Control (MMC) consists of a set of SysEx commands[125] that operate the transport controls of hardware recording devices. MMC lets a sequencer send Start, Stop, and Record commands to a connected tape deck or hard disk recording system, and to fast-forward or rewind the device to start playback at the same point as the sequencer. No synchronization data is involved, although the devices may synchronize through MTC.[126]
MIDI Show Control (MSC) is a set of SysEx commands for sequencing and remotely cueing show control devices such as lighting, music and sound playback, and motion control systems.[128] Applications include stage productions, museum exhibits, recording studio control systems, and amusement park attractions.[127]
One solution to MIDI timing problems is to mark MIDI events with the times they are to be played, transmit them beforehand, and store them in a buffer in the receiving device. Sending data beforehand reduces the likelihood that a busy passage overwhelms the transmission link. Once stored in the receiver, the information is no longer subject to timing issues associated with MIDI or USB interfaces and can be played with a high degree of accuracy.[129] MIDI timestamping only works when both hardware and software support it. MOTU's MTS, eMagic's AMT, and Steinberg's Midex 8 had implementations that were incompatible with each other, and required users to own software and hardware manufactured by the same company to work.[75] Timestamping is built into FireWire MIDI interfaces,[130] Mac OS X Core Audio, and Linux ALSA Sequencer.
An unforeseen capability of SysEx messages was their use for transporting audio samples between instruments. This led to the development of the sample dump standard (SDS), which established a new SysEx format for sample transmission.[14]: 287 SDS was later augmented with a pair of commands that allow the transmission of information about sample loop points, without requiring that the entire sample be transmitted.[131]
The Downloadable Sounds (DLS) specification, ratified in 1997, allows mobile devices and computer sound cards to expand their wave tables with downloadable sound sets.[132] The DLS Level 2 specification followed in 2006, and defined a standardized synthesizer architecture. The Mobile DLS standard calls for DLS banks to be combined with SP-MIDI, as self-contained Mobile XMF files.[133]
MIDI Polyphonic Expression (MPE) is a method of using MIDI that enables pitch bend, and other dimensions of expressive control, to be adjusted continuously for individual notes.[134] MPE works by assigning each note to its own MIDI channel so that controller messages can be applied to each note individually.[135][134] The specifications were released in November 2017 by AMEI and in January 2018 by the MMA.[136] Instruments like the Continuum Fingerboard, LinnStrument, ROLI Seaboard, Sensel Morph, and Eigenharp let users control pitch, timbre, and other nuances for individual notes within chords.[137]
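A rough sketch of the per-note channel idea, in Python: three chord tones are sent on separate member channels so that a pitch bend can target just one of them. The channel numbering and bend amount here are illustrative only, not a complete MPE implementation.

```python
# A rough sketch of the MPE idea: each sounding note gets its own channel,
# so per-note pitch bend can be applied without affecting the other notes.

def note_on(ch, note, vel):
    return bytes([0x90 | ch, note, vel])

def pitch_bend(ch, value):              # value: 0..16383, 8192 = centered
    return bytes([0xE0 | ch, value & 0x7F, (value >> 7) & 0x7F])  # LSB, MSB

# Three chord tones on member channels 2-4 (0-indexed 1-3), then bend only
# the middle note upward while the others stay at their original pitch:
stream = b"".join(note_on(ch, n, 96) for ch, n in [(1, 60), (2, 64), (3, 67)])
stream += pitch_bend(2, 10000)
```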
In addition to using a 31.25 kbit/s current-loop over a DIN connector, the same data can be transmitted over different hardware transports such as USB, FireWire, and Ethernet.
Members of the USB-IF in 1999 developed a standard for MIDI over USB, the "Universal Serial Bus Device Class Definition for MIDI Devices".[138] MIDI over USB has become increasingly common as other interfaces that had been used for MIDI connections (ISA card, game port, etc.) disappeared from personal computers. Linux, Microsoft Windows, Macintosh OS X, and Apple iOS operating systems include standard class drivers to support devices that use the "Universal Serial Bus Device Class Definition for MIDI Devices".
Apple Computer developed the FireWire interface during the 1990s. It began to appear on digital video (DV) cameras toward the end of the decade, and on G3 Macintosh models in 1999.[139] It was created for use with multimedia applications.[130] Unlike USB, FireWire uses intelligent controllers that can manage their own transmission without attention from the main CPU.[140] As with standard MIDI devices, FireWire devices can communicate with each other with no computer present.[141]
The Octave-Plateau Voyetra-8 synthesizer was an early MIDI implementation using XLR3 connectors in place of the 5-pin DIN. It was released in the pre-MIDI years and later retrofitted with a MIDI interface but kept its XLR connector.[142]
As computer-based studio setups became common, MIDI devices that could connect directly to a computer became available. These typically used the 8-pin mini-DIN connector that was used by Apple for serial ports prior to the introduction of the Blue and White G3 models. MIDI interfaces intended for use as the centerpiece of a studio, such as the Mark of the Unicorn MIDI Time Piece, were made possible by a fast transmission mode that could take advantage of these serial ports' ability to operate at 20 times the standard MIDI speed.[2]: 62–3 [141] Mini-DIN ports were built into some late-1990s MIDI instruments and enabled such devices to be connected directly to a computer.[143] Some devices connected via a PC's DB-25 parallel port, or through the DA-15 game port found on many PC sound cards.[141]
Yamaha introduced the mLAN protocol in 1999. It was conceived as a local area network for musical instruments using FireWire as the transport and was designed to carry multiple MIDI channels together with multichannel digital audio, data file transfers, and timecode.[139][140] mLan was used in a number of Yamaha products, notably digital mixing consoles and the Motif synthesizer, and in third-party products such as the PreSonus FIREstation and the Korg Triton Studio.[144] No new mLan products have been released since 2007.
Computer network implementations of MIDI provide network routing capabilities, and the high-bandwidth channel that earlier alternatives to MIDI, such as ZIPI, were intended to bring. Proprietary implementations have existed since the 1980s, some of which use fiber optic cables for transmission.[2]: 53–4 The Internet Engineering Task Force's RTP-MIDI open specification has gained industry support. Apple has supported this protocol from Mac OS X 10.4 onwards, and a Windows driver based on Apple's implementation exists for Windows XP and newer versions.[148]
Systems for wireless MIDI transmission have been available since the 1980s.[2]: 44 Several commercially available transmitters allow wireless transmission of MIDI and OSC signals over Wi-Fi and Bluetooth.[149] iOS devices are able to function as MIDI control surfaces, using Wi-Fi and OSC.[150] An XBee radio can be used to build a wireless MIDI transceiver as a do-it-yourself project.[151] Android devices are able to function as full MIDI control surfaces using several different protocols over Wi-Fi and Bluetooth.[152]
The MIDI 2.0 standard was unveiled on January 17, 2020, at the Winter NAMM Show in Anaheim, California. Representatives of Yamaha, ROLI, Microsoft, Google, and the MIDI Association introduced the update,[153] which enables bidirectional communication while maintaining backward compatibility.[154]
AMEI and the MMA announced that complete specifications would be published following interoperability testing of prototype implementations from major manufacturers such as Google, Yamaha, Steinberg, Roland, Ableton, Native Instruments, and ROLI, among others.[161][162][163] In January 2020, Roland announced the A-88mkII controller keyboard that supports MIDI 2.0.[164] MIDI 2.0 includes the MIDI Capability Inquiry specification for property exchange and profiles, and the new Universal MIDI Packet format for high-speed transports, which supports both MIDI 1.0 and MIDI 2.0 voice messages.
Some devices operating under MIDI 1.0 can "retrofit" some 2.0 features. As of 2021, the standard comprised five component specifications: M2-100-U v1.0 MIDI 2.0 Specification Overview, M2-101-UM v1.1 MIDI-CI Specification, M2-102-U v1.0 Common Rules for MIDI-CI Profiles, M2-103-UM v1.0 Common Rules for MIDI-CI PE, and M2-104-UM v1.0 UMP and MIDI 2.0 Protocol Specification. MIDI 2.0 also allows the use of 32,000 controllers and wide-range note enhancements, made more practical through property exchange.[165] In June 2023, updated and new MIDI 2.0 specifications were released, consisting of M2-100-U MIDI 2.0 Specification Overview, Version 1.1; M2-101-UM MIDI Capability Inquiry (MIDI-CI), Version 1.2; M2-102-U Common Rules for MIDI-CI Profiles, Version 1.1; M2-104-UM Universal MIDI Packet (UMP) Format and MIDI 2.0 Protocol, Version 1.1; and M2-116-U MIDI Clip File (SMF2), Version 1.0.[166]
^The MIDI standard allows selection of 128 different programs, but devices can provide more by arranging their patches into banks of 128 programs each and combining a program change message with a bank select message.
^The original MIDI 1.0 specification mandated DIN-5. The current source pin or hot pin ("H" in this schematic) corresponds to pin 4 of a 5-pin DIN. The current sink or cold pin ("C" in this schematic) corresponds to pin 5 of that DIN. The shield pin ("S" in this schematic) corresponds to pin 2 of that DIN.
^ Three variants on how to use TRS phone connectors are called Type A, Type B, and TS (a.k.a. Type C or Non-TRS). Type A became part of the MIDI standard in 2018. Type A pin assignments are: the current source or hot pin ("H" in the schematic) is the ring of the TRS, the current sink or cold pin ("C" in the schematic) is the tip of the TRS, and the shield ("S" in the schematic) is the sleeve of the TRS.
^ Universal Asynchronous Receiver/Transmitter (UART) is hardware that transports bytes between digital devices. When MIDI was new, most synthesizers used discrete, external UART chips, such as the 8250 or 16550 UART, but UARTs have since moved into microcontrollers.[109]
^MIDI nominally uses a +5 volt source, in which case the resistance assignments are R1=R2=R4=220Ω and R3=280Ω. But it is possible to change the resistance values to achieve a similar current with other voltage supplies (in particular, for 3.3 volt systems).
^The MIDI specification provides for a ground "wire" and a braid or foil shield, connected on the Shield pin, protecting the two signal-carrying conductors on the Hot and Cold pins. Although the MIDI cable is supposed to connect this Shield pin and the braid or foil shield to chassis ground, it should do so only at the MIDI out port; the MIDI in port should leave its Shield pin unconnected and isolated. Some large manufacturers of MIDI devices use modified MIDI in-only DIN 5-pin sockets with the metallic conductors intentionally omitted at pin positions 1, 2, and 3 so that the maximum voltage isolation is obtained.
^ It is often easier to use NPN or nMOS transistors to sink current than to use PNP or pMOS transistors to source current, because electron mobility is better than hole mobility.
^ MIDI's original reference design uses the obsolete Sharp PC900, but modern designs frequently use the 6N138.[109] The opto-isolator provides galvanic isolation, so there is no conductive path between the two MIDI devices. Properly designed MIDI devices are therefore relatively immune to ground loops and similar interference.
^ Swift, Andrew (May 1997), "A brief Introduction to MIDI", SURPRISE, Imperial College of Science Technology and Medicine, archived from the original on August 30, 2012, retrieved August 22, 2012
^ "The MIDI Association Launches at NAMM 2016". Electronic Musician. Archived from the original on October 14, 2016. Retrieved August 31, 2016.
^ Battino, David. Finally: MIDI 2.0. Archived 16 August 2012 at the Wayback Machine. O'Reilly Digital Media Blog. O'Reilly Media, Inc. 6 October 2005. Web. 22 August 2012.
^"MIDI Files".midi.org. Music Manufacturers Association. Archived fromthe original on August 22, 2012.a Type 2 was also specified originally but never really caught on
^ Gellerman, Elizabeth. "Audio Editing SW Is Music to Multimedia Developers' Ears". Technical Horizons in Education Journal. Vol. 22, No. 2. Sep 1994
^ Desmond, Peter. "ICT in the Secondary Music Curriculum". Aspects of Teaching Secondary Music: Perspectives on Practice. ed. Gary Spruce. New York: RoutledgeFalmer, 2002
^ Johnson, Derek; Poyser, Debbie (December 1998). "Yamaha FS1R". Sound on Sound. Archived from the original on April 15, 2007.
^ Gibbs, Jonathan (Rev. by Peter Howell). "Electronic Music". Sound Recording Practice, 4th Ed. Ed. John Borwick. Oxford: Oxford University Press, 1996
^"Patch Base".Archived from the original on September 7, 2022. RetrievedSeptember 7, 2022.
^ Price, Simon (July 2006). "Native Instruments Kore". Soundonsound.com. Sound on Sound. Archived from the original on June 2, 2013. Retrieved November 27, 2012.
^ Russ, Martin (January 1, 1988). "Practically MIDI (SOS Jan 1988)". Sound on Sound (Jan 1988): 56–59. Archived from the original on December 14, 2023. Retrieved December 14, 2023.