You can get the feeling of “pushing into” an iPhone as of the iPhone 6S. It’s an expressive, intimate gesture, which is generally used for … wait, really, shortcut menus? That’s pretty boring.
Ever since I saw the feature, I wanted to see it used for music applications. And one obvious fit is an emerging standard for sending expressive pressure-based control over MIDI.
The futuristic, sleek black ROLI Seaboard does it. The lovely, wooden Madrona Labs Soundplane does it. Roger Linn’s innovative grid-covered LinnStrument does it. They all share a (draft) specification for control called MPE – Multidimensional Polyphonic Expression. (Early on, people wanted to dub this “expressive MIDI,” but that might imply that MIDI is somehow not normally expressive, when it is.)
MPE is cool because, instead of just velocity (when you hit a note) or a single monophonic pressure value (like channel aftertouch), it lets you send continuous control data for every note independently. Maybe your pinkie is pushing a little less than your ring finger, and so on.
Aftertouch is an app that uses the iPhone’s 3D Touch capability to send both velocity and polyphonic pressure messages. So instead of just feeling like your fingers are pressing glass, you can actually use all those different fingers as nature intended.
On its own, Aftertouch lets you play, in the author’s words, “a silly little phase mod synth.”
But you can also send actual MPE data, making this compatible with instruments like Apple’s Sculpture in Logic. (Dear Apple: why oh why is Sculpture not available on iOS?) For each finger, you send pitch bend, modulation, and pressure via MIDI. That’ll work with a MIDI interface if you’ve got one, or wirelessly with Bluetooth MIDI.
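For a sense of what’s on the wire, here’s a minimal sketch of how MPE typically carries those per-finger messages: each sounding note gets its own MIDI channel, so one finger’s bend and pressure never interfere with another’s. This is an illustration of the convention, not the app’s actual code; the `note_messages` helper and the channel assignments are hypothetical.

```python
# A sketch of per-finger MPE messages. Assumption: the common "lower zone"
# layout, where channel 1 is the master and each note rotates through
# member channels 2-16 (0-based 1-15 in the status bytes below).

def note_messages(channel, note, velocity, bend_14bit, pressure):
    """Build raw MIDI bytes for one finger on its own channel.

    channel    -- 0-based MIDI channel (1-15 for MPE member channels)
    bend_14bit -- pitch bend, 0-16383 (8192 = center)
    pressure   -- channel pressure ("aftertouch"), 0-127
    """
    return [
        # Pitch bend: status 0xE0 | channel, then LSB, MSB (7 bits each)
        [0xE0 | channel, bend_14bit & 0x7F, (bend_14bit >> 7) & 0x7F],
        # Channel pressure: status 0xD0 | channel, then value
        [0xD0 | channel, pressure & 0x7F],
        # Note on: status 0x90 | channel, then note, velocity
        [0x90 | channel, note & 0x7F, velocity & 0x7F],
    ]

# Two fingers, two notes, different pressure, each on its own channel:
finger1 = note_messages(channel=1, note=60, velocity=100, bend_14bit=8192, pressure=40)
finger2 = note_messages(channel=2, note=64, velocity=90, bend_14bit=8300, pressure=90)
```

Because each finger lives on its own channel, a receiving synth can apply that finger’s pressure and bend to its own voice only, which is exactly what makes the ring-finger-versus-pinkie distinction possible.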
The one and only R. Kevin Nelson created the app. It’s yours for 99 cents.
And there’s a little site for it:
I’d actually been talking for some time about wanting to make a little app like this, but Mr. Nelson beat me to it. That said, I notice there are actually some things this doesn’t do or does differently, so I’m curious to hear readers talk about what they want or how they imagine using this!
And if you have a compatible device with 3D Touch, you should absolutely also download ROLI’s own app. “Noise” is actually like having a virtual Seaboard on your phone or iPad. You can use it as a sound bank for the hardware, or take it on the go and practice the keyboard technique in miniature. It’s really clever, and I’m happy to own it along with my Seaboard RISE.
Noise is free and native both on iPad and iPhone. (If you’ve got an iPad Pro, by the way, Apple Pencil supports 3D Touch, though I’m not sure it’ll make so much sense here!) The ROLI app lacks MIDI output, though, so it’s not direct competition for Aftertouch. It’s worth having both.
ROLI have written up a little guide recommending apps for Seaboard owners, which is worth a look:
https://support.roli.com/article/recommended-mobile-apps/
There are already several iOS synth apps (Moog Model 15, Mitosynth, etc.) that use 3D Touch for polyphonic aftertouch, but this is nice!
Oh, actually I think it’s “only” aftertouch for Mitosynth.
Yeah, I should do a survey of those, too…
FYI, many of the GarageBand instruments, from the Keyboard to the new Chinese instruments, utilize polyphonic aftertouch via 3D Touch.
You can’t go wrong for 99 cents; I’ll definitely try the app once I own a 3D Touch device.
The only problem with “expressive” (strange category, indeed) digital instruments or apps is the lack of body intuition: the most “expressive” acoustic instruments, like strings, woodwinds and brass, you can play with your eyes shut. Even on LinnStrument – which does give physical feedback like a fretted instrument – you have to look at the surface every now and then. With a piano layout like on the Seaboard you may come closest to intuitive MPE control, but then again, it isn’t isomorphic. So there is still some work to do.
Well, to say something can be “expressive” is not to say “I’m throwing out my violin and replacing it with an iPhone…” Of course there’s some difference.
I have to disagree a little here. For one, I don’t see why you couldn’t play this with your eyes closed, using your ears (and the edge of your phone, and muscle memory) as reference. It might not be as easy for me as playing a piano, but it still seems possible.
Also, as a lifelong piano player, I don’t find it necessary for all instruments to be isomorphic. 🙂 And you can play non-isomorphic instruments blind (some fairly famous people springing to mind). That’s not to question the advantages of isomorphism, though. It does make some sense. And I’m biased; I spent time understanding how to transpose on the piano.
I agree with you on most points except the smartphone muscle memory thing: I guess it would take you months of intense practise to play a simple Abm7 blind without previous contact.
These devices aren’t designed to “express” musical ideas in a solo performance (I guess they are designed to express your feelings in selfies, texts and emojis). The zone of physical contact between you and the screen is so small that it’s simply not fun to play as a live solo instrument (maybe it is for pads and subtle sliding stuff). Just look at touchscreen-based aftertouch and the distance your finger can travel while pressing, and compare it to an expression pedal, a breath controller, a violin bow, etc.
But maybe my notion of expressiveness is just outdated.
Sometimes you don’t need to have that precise level of control. You basically just want more or less velocity now, or more or less aftertouch now; that seems to work very well.
I’m unsure if velocity really works. On the 6s and iOS 9 it was kind of too slow for velocity. Haven’t heard about the new stuff from trusted sources yet.
Hi, app author here. When you say it was kind of too slow for velocity, do you mean the Aftertouch app, or the hardware?
I don’t know the Aftertouch app. Someone I know experimented with velocity/aftertouch on the 6s and iOS 9, and he told me to forget about real velocity, but aftertouch did work pleasantly.
In one word: hardware.
Have a beer and a joint, go onstage and try to do the trick; it doesn’t work.
You can’t feel where one trigger button ends and the next starts on a flat piece of glass. You have to look.
BTW, you are coming from the piano too.
Do you have any use for pitchbend on each finger?
To me this is totally over the top.
Everybody that keeps banging on about how expressive this is seems to come from some fretless instrument, meh.
Polyphonic finger vibrato, I couldn’t care less.
And also don’t care about these one finger pitch slides.
Clearly, you don’t play stringed instruments, or even tabla ….
Have you heard about this very exotic thing called a pitch wheel?
It comes with MIDI controllers that don’t cost an arm and a leg.
Pitch just isn’t that interesting to me. I want more control over timbre; meh about pitch.
Cool, but I would need an iPad version to make this practical.
Yes, please!
Aftertouch will work on iPad in compatibility mode (full support will be provided in a future version), though obviously the drawback there is that iPads don’t yet support 3D Touch. One day, on an iPad with 3D Touch, I could see Aftertouch being very useful as a split screen app 🙂
The one problem with this is the one Randy @ Madrona identified seven years ago: to do this sort of thing right, if you want to drive a really expressive synthesis engine, you need a sampling rate on the surface of at least 1 kHz. iPhones, and current multitouch surfaces in general, do not offer this. Doesn’t mean it isn’t fun, but ….
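To put rough numbers on that argument, here’s a quick back-of-the-envelope comparison; the 60 and 120 Hz figures are assumed typical touchscreen scan rates, not measurements of any particular device.

```python
# The gap between touch samples limits how finely a surface can track a
# fast attack or vibrato. Compare assumed touchscreen scan rates (60 and
# 120 Hz) with the 1 kHz figure cited for expressive control.
def sample_interval_ms(rate_hz):
    return 1000.0 / rate_hz

for rate_hz in (60, 120, 1000):
    print(f"{rate_hz:>4} Hz scan -> {sample_interval_ms(rate_hz):.1f} ms between samples")
```

At a 60 Hz scan, roughly 17 ms can pass between samples, which is in the range of an entire percussive attack; at 1 kHz the surface sees about one sample per millisecond.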
In this regard, you should check out the Sensel Morph device, which is just about to start shipping.