With all that has been going on around the world these past six months, audio professionals can be forgiven if they missed that back in January 2020, at NAMM, the MIDI 2.0 spec was introduced, bringing with it the promise of significantly more powerful and expressive functionality. Make no mistake, this is a very big deal, as the new standard is finally real after years in development. At the same time, it doesn’t mean there will be a flood of MIDI 2.0 keyboards available for the holidays. Big deals take time.
Good old MIDI 1.0, which has remained surprisingly relevant for 37 years, isn’t going away anytime soon. Even when manufacturers and software developers start to put out MIDI 2.0 products on a regular basis, there will still be a massive base of MIDI 1.0 products out there that musicians won’t want to cast aside.
Fortunately, one of the main pillars of the MIDI 2.0 project was to maintain backward-compatibility with MIDI 1.0 gear and software. There are even some aspects of MIDI 2.0, such as its improved time-stamping, that can benefit MIDI 1.0 gear.
Two Into One
One of the problems with 37-year-old technology is that, well, it’s 37-year-old technology. Although it’s still plenty usable, MIDI 1.0 lags far behind the times when it comes to data handling. Back in 1983, 128 steps seemed like plenty of resolution. At today’s processing speeds, it’s just a blip. That will change in a big way with MIDI 2.0.
To put it quite simply, says esteemed technology journalist Craig Anderton, “[MIDI 2.0] will provide an analog feel to controls where there isn’t stair-stepping or audible quantization.”
“Just in terms of raw numbers, you’re going from 128 steps at seven bits of data to 4 billion steps at 32 bits,” says Brett Porter, lead engineer at Art+Logic, a software development company that has been a key partner in building code for the MIDI 2.0 project over the past couple of years. He was scheduled to do a presentation at South by Southwest before it was canceled.
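The jump Porter describes is easy to see in code. Here is a rough sketch of widening a 7-bit MIDI 1.0 value into a 32-bit range by simple bit repetition, so that 0 stays 0 and 127 maps to full scale. (The MIDI 2.0 spec defines its own translation rules; this is only an illustration of the resolution gap, and the function name is invented.)

```python
# Rough illustration (not the official MIDI 2.0 translation algorithm):
# widen a 7-bit value (0-127) to 32 bits (0 to 4,294,967,295) by
# repeating its bit pattern, so 0 maps to 0 and 127 maps to 0xFFFFFFFF.
def upscale_7_to_32(value: int) -> int:
    assert 0 <= value <= 127
    out, bits = 0, 0
    while bits < 32:
        out = (out << 7) | value   # append another copy of the 7 bits
        bits += 7
    return out >> (bits - 32)      # drop the 3 surplus low bits

print(hex(upscale_7_to_32(0)))     # 0x0
print(hex(upscale_7_to_32(127)))   # 0xffffffff
```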
But before that cancellation, he recalls, there was a key MIDI 2.0 demo at NAMM that showed how the higher resolution would affect motorized faders. “An engineer from Roland had two motorized faders side by side, and he was driving one of them with 7-bit data, and one of them was 32-bit data. You could see the 7-bit one jumping, while the 32-bit one was smooth.”
Oddly, whereas MIDI 1.0’s shortcoming was not enough data, MIDI 2.0 could have the opposite problem, says Porter. “If you’ve got that 32 bits of data and you’ve got a high-resolution encoder, it’s going to be easy to turn that knob and create way too much data. We’re going to have to start learning how to thin it out intelligently.”
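Porter’s thinning problem can be sketched simply: with a high-resolution encoder, nearly every tick produces a new 32-bit value, and most of them are redundant. One naive strategy (purely illustrative; the names and threshold are invented) is to forward a value only when it has moved far enough from the last value sent:

```python
# Naive data thinning: pass a 32-bit controller value through only when
# it differs from the last value sent by at least `threshold`.
def thin(values, threshold=1 << 20):
    last = None
    for v in values:
        if last is None or abs(v - last) >= threshold:
            last = v
            yield v

dense = range(0, 1 << 24, 1 << 12)     # 4,096 closely spaced values
sparse = list(thin(dense))
print(len(sparse))                     # 16 -- most values were dropped
```

A real implementation would likely also consider timing, so that slow sweeps still land on their final value, but the principle is the same.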
Another longtime complaint about MIDI 1.0 has been that 127 steps of velocity don’t provide enough resolution to faithfully reproduce the nuances of a performance on a MIDI instrument, particularly a piano. That number, of course, is going up in MIDI 2.0. The new spec calls for 16-bit velocity, compared to the current 7-bit data, for a whopping 65,536 possible velocity values. And that’s not all.
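As with controllers, an old 7-bit velocity has to be widened to fill the new 16-bit field. A simple bit-repetition sketch (again illustrative, not the spec’s official translator) shows how the full 65,536-value range is reached:

```python
# Widen a 7-bit velocity (0-127) into 16 bits (0-65,535) by repeating
# its bit pattern: 7 bits + 7 bits + the top 2 bits fills all 16.
def velocity_7_to_16(v: int) -> int:
    assert 0 <= v <= 127
    return ((v << 9) | (v << 2) | (v >> 5)) & 0xFFFF

print(velocity_7_to_16(127))   # 65535
```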
“One other thing added to the note-on and note-off message was this idea of an attribute,” Porter says. “Every note has this other chunk of data that can go with it.”
That chunk can hold additional note data, which Porter says will particularly benefit “anybody who is doing microtonal music or xenharmonic music. They’ll no longer have to set up a tuning table in a synth. They can send the pitch data out as part of the note.”
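One of the attribute formats defined for this purpose is a 16-bit fixed-point pitch: 7 bits of semitone number and 9 bits of fraction, so pitch can travel with the note itself. Packing a fractional pitch into that shape is straightforward (the function name is invented; consult the spec for the exact field layout):

```python
# Pack a pitch, in semitones, into a 16-bit "7.9" fixed-point field:
# 7 integer bits (0-127 semitones) and 9 fractional bits (1/512 steps).
def pitch_7_9(semitones: float) -> int:
    assert 0.0 <= semitones < 128.0
    return round(semitones * 512) & 0xFFFF

# A quarter-tone above middle C (MIDI note 60):
print(hex(pitch_7_9(60.5)))   # 0x7900
```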
Forward and Back
Communication in MIDI 1.0 is strictly one way: MIDI In, MIDI Out. That changes in MIDI 2.0, with the adoption of bi-directional data flow over USB or Ethernet. If you’ve ever taken on the tedious job of mapping controllers to software functions, you’ll appreciate one of the most talked-about additions in MIDI 2.0: Profile Configuration.
“When two pieces of gear connect under MIDI 2.0,” Porter says, “the first thing they’ll do is ask each other, ‘Who are you? What do you know how to do?'” The devices (or a device and a software application) will then automatically configure themselves to work together based on a series of standardized profiles.
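The handshake Porter describes amounts to a capability exchange. In this toy model (every name here is invented; the real mechanism is MIDI-CI messaging, not Python sets), each side advertises the profiles it supports and the pair enables the ones they share:

```python
# Toy model of profile negotiation: each device advertises supported
# profiles; the connection enables only those both sides understand.
controller = {"name": "Keyboard", "profiles": {"analog-synth", "drawbar-organ"}}
plugin     = {"name": "Synth",    "profiles": {"analog-synth", "piano"}}

enabled = controller["profiles"] & plugin["profiles"]
print(sorted(enabled))   # ['analog-synth']
```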
Anderton describes a keyboard controller connecting to a DAW running a virtual analog synth plug-in. “A controller can query the instrument, say ‘What are you?’ and upon receiving an answer, assign its controls in a logical way to common parameters like filter cutoff, resonance, envelopes, pulse width, et cetera.
“If you opened a different virtual analog synth, the same controls with which you’re familiar would apply,” he continues. “You could control anything with relatively standard parameters—including amp sims, delays, whatever—similarly. Not having to customize hardware constantly to work with software will be a big deal for improving setup, as well as for workflow.”
Another new feature, Property Exchange, goes beyond the generic configuration of Profile Configuration and allows communication of specific data, such as patches, between devices. Porter gives the following example of how it might work:
“My DX7 is sitting right next to me connected to a rackmount synth. When I press a patch change on the controller, there’s no way for me, when looking at the keyboard, to know that patch 20 happens to be a Rhodes. It’s just a number. With Property Exchange being able to go across that, a keyboard with a much nicer display is going to be able to say to that synth, ‘What are your patch names?’ Which is so obvious. But now we have a mechanism to do that.”
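Property Exchange carries its payloads as JSON, which makes scenarios like Porter’s patch-name query easy to picture. This fragment invents a plausible reply (the field names are hypothetical, not taken from the spec) and turns it into the human-readable list a controller’s display would show:

```python
import json

# Hypothetical Property Exchange-style reply: a synth reporting its
# patch list as JSON so the controller can show names, not numbers.
reply = json.loads("""
[
  {"program": 19, "name": "Strings"},
  {"program": 20, "name": "Rhodes"}
]
""")
names = {p["program"]: p["name"] for p in reply}
print(names[20])   # Rhodes
```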
“It will be easy to store hardware synth parameters within a DAW,” adds Anderton, “in a far more obvious—and human-readable—way than Sys Ex. Signal processors will then become, as far as a DAW is concerned, a plug-in. It would even be possible to retrofit some older synths to do this, but whether there would be an economic incentive is questionable.”
What’s the Timeline?
All of these new capabilities sound fantastic, but gear with MIDI 2.0 support isn’t yet available, nor is there support among DAWs and plug-ins. One exception is the Roland A-88MKII keyboard. On its website, the company says it is “ready to take advantage of the extended capabilities of MIDI 2.0,” although it hasn’t specified what that readiness entails.
Several companies have shown working prototypes, as was evident at the MMA annual meeting at NAMM. “Roland, Korg, and Yamaha sent engineers over from Japan with prototype gear,” Porter says. “They were playing with a Yamaha-modified keyboard controlling a Korg-modified synth over MIDI 2.0, with all the new expressivity and resolution, and it worked.”
But there are still more hurdles preventing widespread adoption of the new standard. One of them is bureaucratic. “We have to wait for the USB people to approve the way we want to send MIDI 2.0 data over it,” Porter says. “And that’s in process. I have no idea how long that takes. Once that happens, then all the companies like Focusrite and MOTU that make interface boxes have to update their systems. Apple and Microsoft have to update their operating systems. It’s a chicken-and-egg thing, and everybody behind the scenes is doing what they can, waiting for those other pieces.”
It appears that the adoption of MIDI 2.0 is going to happen gradually. “It’s not like we can flick a switch and all of a sudden, MIDI 2.0 comes out the other end,” Anderton points out. “The prioritization on backward-compatibility means new features can roll out over time. It will also be possible to buy products where MIDI 2.0 is dormant, as it waits for other products to appear. Then one day, that product wakes up to find other MIDI 2.0 gear with which it can converse.”
While we wait for MIDI 2.0 to gradually penetrate the market, we have time to say an extended goodbye to MIDI 1.0, and to marvel at its longevity.