brettworks

thinking through music, sound and culture

Category: musical controllers

On Expressivity In Musical Performance: The Korg Wavedrum

When we talk about “expressivity” in musical performance we’re usually referring to the degree to which a musician is able to coax emotion or affect out of his or her instrument and make it seem to “sing” (the human voice remains our gold standard of musicality).  We also expect some kind of obviously perceivable, one-to-one relationship between the musician’s actions and the resulting sound: a gentle bow stroke on the violin should produce a correspondingly gentle sound, and a vigorous roll on the timpani should make a booming one.  Part of what makes a great musician is his or her fluid command over that bow/violin or mallet/timpani pairing, to the point that you forget there is in fact a lot of technology involved in what seems like a transparent connection between musicians and their gear.  Virtuosity makes you forget about the gear and carries you away to wonderment.

In the course of writing a short article on electronic percussion for the Grove Dictionary Of American Music, I hit upon an interesting instrument by Korg called the Wavedrum.  What makes it interesting is that it combines digital sampling and synthesis with what looks like a remarkably sensitive and playable control surface.

Whaack.

The Wavedrum seems to play just about like a real drum: you can strike it with sticks or mallets or your hands anywhere on its drumhead or along its rim, and the sound shifts accordingly along a continuum so finely graded that it just might trick you into feeling, “hey, wait a second, I can really express myself here.”

Booom.

But the Wavedrum doesn’t have an acoustic bone in its electronic body.  What it has is clever coding and advanced triggering technology.

Just as a device like the Apple iPad–with its fluid touchscreen that responds to our fingers as if it were an extension of them–reminds us that what we liked about reading the newspaper was really its responsiveness to our handling of it, the Wavedrum reminds us that what we like about acoustic instruments is how they can’t really lie.  Like ventriloquists, we throw our voices and try to get the instruments to sing on our behalf, or else we end up looking and sounding like…dummies.

Here is a clip of a drummer demonstrating the Wavedrum at a music trade show.  (Yes, the clip is sponsored by Korg, but I’m not endorsing it, just using it to make some observations.)

If you want more grist for this mill, watch this clip of drum machine pioneer Roger Linn demonstrating one of his experimental, not yet released musical instruments.  Again, the relationship between gesture and sound is fluid:

On Techlust: Native Instruments’ Maschine

I’m at Tekserve, in the audio department, and I spot a beauty: Native Instruments’ Maschine, a hardware-software rhythm machine.  I move in for a closer inspection.  Its top is made of metal and I run my fingers across the smooth, cool brushed surface.  I pick the musical object up off the display table and assess its weight: a solid few pounds.  I put it back down and continue exploring.  Its dials are smooth and rotate infinitely, and so I twist them around and around, imagining what parameters they might control.  Its buttons produce subtle clicks–confident sounds that will surely respond to my touch and help me, one day, switch something on or off in an instant.  And then there are those sixteen beautiful 1.5-inch-square rubber pads.  Soft like gummy bears, they’re mini drums that can absorb the impact of an incoming finger, and so I start drumming on them, my fingers playing silent patterns across the four-by-four grid.  Feels nice.  I pick up Maschine again, rotating it in my hands, and even consider smelling it–after all, I’m sizing up a potential musical mate.  (This from someone who regularly smells his Kindle as if it were a paper book!)  What, I’m wondering, might I do with this thing?  Will this be, finally, the instrument that allows me to create fluidly, or will it lure me down a wormhole of complicated procedures that will blunt the creative process?

Maschine is a recent example of electronic music software taking on a physical presence in order to attract musicians.  The thinking is that we like tangible things–vibrating strings, membranes, or even smooth-moving knobs and smushy rubber pads–with which to interact and make music.  But the fascinating paradox about the tools of electronic music is that as the palette of sound possibilities has expanded exponentially with software innovations, the music-making process has become less and less physical.  There are two ways to think about this.  On the one hand, the shift has encouraged many people without traditional music training to just go ahead and make music.  On the other hand, those of us with training are always looking for a foothold, a link to the physical.  So far, that foothold comes in the form of MIDI keyboards and other controllers such as the Akai APC series and the Korg Kaoss touch pads.  Maschine harks back to hardware instruments from the late 1980s and early 1990s such as Akai’s MPC workstations, like the unit in the pic below:

These instruments are still popular with hip hop beat makers who program their patterns like a potter plays with and molds clay: the boxes allow them to feel like they’re getting their hands dirty.  This is a good thing, because our hands often know as much or even more than our minds, and letting our hands play with instruments is a direct route to new ideas.  Maschine is both an attractive piece of hardware and a powerful piece of software, hence its appeal for electronic musicians.  Below is a Native Instruments promotional video for the instrument featuring Jeremy Ellis hammering away on those rubber pads:

On Designing New Musical Controllers

A while back I wrote about MIDI hardware controllers, which are used by musicians who want to control their computer software.  (You can read the post here.)  Why does one need a controller when performing music with, say, a laptop?  For one thing, it gives you the sense of having physical, tactile control over your music.  Rather than using a mouse pad to point and click your way through musical actions, a hardware controller makes making electronic music feel a little more like playing a “real” musical instrument.  (Whether or not a laptop computer running software is in fact a musical instrument is another question altogether.)  Controllers are also good at letting their users do many things at once.  For instance, you can easily “map” several different parameters in your software to one knob, button, or fader on your controller, so with one turn, tap, or slide you can set in motion a whole bunch of musical transformations, making you feel, well, bionic.  To make an analogy with the symphony orchestra: the conductor can cue or “trigger” (in electronic music parlance) several instruments or sections at once with the wave of a hand.  Now that is power.  Similarly, MIDI hardware controllers give the electronic musician that feeling of potential musical control.
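To make that one-to-many idea a bit more concrete, here is a minimal sketch of what such a “macro” mapping amounts to, written in Python.  This is not how any particular controller or program implements it, and the parameter names and ranges are invented purely for illustration; the point is simply that a single 0–127 knob value can be scaled into several parameter ranges at once.

```python
# A minimal sketch of "one knob, many parameters" mapping.
# The parameter names and ranges are invented for illustration;
# in practice this kind of mapping is set up inside software like Ableton Live.

def scale(cc_value, lo, hi):
    """Scale a 0-127 MIDI control-change value into the range [lo, hi]."""
    return lo + (cc_value / 127.0) * (hi - lo)

# One hypothetical knob drives three hypothetical parameters at once.
MACRO_MAP = {
    "filter_cutoff_hz": (200.0, 8000.0),
    "reverb_send": (0.0, 1.0),
    "delay_feedback": (0.0, 0.75),
}

def on_knob_turn(cc_value):
    """Return new values for every mapped parameter when the knob moves."""
    return {name: scale(cc_value, lo, hi) for name, (lo, hi) in MACRO_MAP.items()}

print(on_knob_turn(96))  # one gesture, three simultaneous changes
```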

As of 2011, you can find many kinds of musical controllers for sale at your local music store.  Most of these units are small plastic boxes with buttons, knobs, and faders, and are designed to work easily with popular software programs such as Ableton Live.  But adventurous musicians sometimes go above and beyond by designing their own MIDI controllers.  Nick Francis, the music director at KPLU-FM in Seattle, is one such musician.  In the video below, Francis describes how he set about building his own custom controller in order to perform live remixes of some of his favorite jazz recordings.  Francis then demonstrates his live remixing/mash-up of Fats Waller’s jazz classic “Honeysuckle Rose” (1928).

How did he go about doing it?  By taking audio samples from four different recordings of “Honeysuckle Rose” and importing them into Ableton Live software (which Francis aptly describes as a “spreadsheet” for sound).  These sound samples are then combined with other rhythmic loops.  If you watch the video closely, you can get a sense of how and when Francis is triggering the various “Honeysuckle Rose” samples, as well as sliding faders to switch from one sound to another (listen and watch for the back and forth between the piano and the bass).
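As a toy illustration of that fader move (and only an illustration, since Francis is doing all of this inside Ableton Live rather than in code), here is what a simple crossfade between two sources looks like.  The short lists below are stand-ins for the actual piano and bass samples.

```python
# Toy sketch of riding a fader between two sources, say the piano and the bass.
# The lists below are stand-ins for real audio samples.

piano = [0.20, 0.50, -0.30, 0.10]
bass = [0.00, -0.40, 0.60, 0.20]

def crossfade(a, b, fader):
    """fader = 0.0 plays only `a`, fader = 1.0 plays only `b`."""
    return [(1.0 - fader) * x + fader * y for x, y in zip(a, b)]

print(crossfade(piano, bass, 0.25))  # mostly piano, a little bass
```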

This clip has already been viewed over 14,000 times on YouTube, and viewers are especially impressed by how “natural” the controller looks and by the fact that electronic music remixing is (or always was) open to all ages.

If you are now curious about Fats Waller’s original song, you might enjoy this clip of him performing:

On The Monome Community Earthquake Disaster Emergency Album

The other day I talked about music in terms of its having no specific meanings, and therefore being available for us to project what we want onto its designs.  But this doesn’t mean music can’t be made in the service of a worthy cause beyond its own pleasures.

In the days immediately following the recent huge earthquake in Japan, a number of electronic musicians who use hardware controllers made by monome (a very interesting company based in NY state) coalesced and composed new music.  Many of the tracks on this (free) release are based on sampled and synthesized representations of seismic data collected from the earthquakes.  Usually, basing music composition on this kind of non-musical “input” can seem contrived, but in this case it feels like an appropriate response.  The musicians write that their compilation is “intended as a cathartic response to the impermanence of our existence on this planet.  If you are moved by these musics, please donate something to aid the ongoing rescue and reconstruction efforts in and around Japan.”

The music on this compilation is very strong, and gives you a sense of the kinds of exciting things electronic musicians are doing in 2011 with computers, software, and very, very cool open-ended hardware controllers such as the monome (see pic below).

The download is free here and you can also donate to aid organizations:

www.doctorswithoutborders.org/donate/overview.cfm
www.jrc.or.jp/english/index.html
shelterbox.org

On Becoming A Virtuoso Of Knobs, Buttons, and Sliders

In the course of preparing for a paper on laptop music making as creative practice that I’m giving this summer at Cambridge University, I’ve been thinking about how exactly one goes about performing music with/on a laptop: What are the decision-making and problem-solving processes musicians use in performance and in preparing for performance?  I’m approaching the topic as a “traditional” musician who is used to grappling with sticks and membranes (drumming), vibrating strings (dulcimer playing), and fingers on a keyboard.  And composing for me has been, up until now, a linear kind of thing where I play or improvise parts to build pieces with beginnings, middles, and ends.  Even my recent electronic recording, Views From A Flying Machine (which I’ve written about elsewhere on this blog), was “through-composed”–layered one part at a time.  Kinda old-fashioned.

But now I’m moving forward and being a futurist!  One of the aims of my paper is to explore how exactly an electronic musician who uses a laptop goes about rendering a piece of pre-composed music in performance.  Many musicians augment their laptops with a MIDI hardware controller of some kind.  This allows them to literally control various parameters of the software running on their laptops.  So, if a musician uses Ableton Live as their software (as I do), the knobs, buttons, and sliders on their MIDI controller can be “mapped” to whichever Ableton parameter the musician would like to control.  A simple mapping might be an effect send, such as a delay or reverb.  Map that effect to one of the knobs on your controller and voila: when you turn the knob, you activate the effect.  The idea is to allow the musician to feel as if they have some tactile control over their sounds (something perhaps taken for granted by acoustic musicians, I might add).
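For anyone curious about what that mapping layer is doing underneath, here is a rough sketch in Python using the mido MIDI library.  In Ableton Live you would set this up with MIDI Learn rather than with code, and the CC number below is just a placeholder for whatever your own knob transmits; the sketch simply shows a knob’s control-change messages arriving and being scaled into a 0.0 to 1.0 send level.

```python
# Rough sketch: listen to a hardware knob and turn its 0-127 values
# into a 0.0-1.0 effect-send level. Assumes a MIDI controller is connected;
# CC number 21 is a placeholder for whatever your knob actually sends.

import mido

KNOB_CC = 21

def cc_to_send_level(value):
    """Map a 0-127 control-change value onto a 0.0-1.0 send amount."""
    return value / 127.0

port_name = mido.get_input_names()[0]  # first available MIDI input
with mido.open_input(port_name) as port:
    for msg in port:
        if msg.type == "control_change" and msg.control == KNOB_CC:
            print(f"delay send -> {cc_to_send_level(msg.value):.2f}")
```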

On the website of Livid Instruments, a company that makes beautiful, hand-made controllers, co-founder Peter Nyboer points out that musical instruments are no longer the only controllers in town and that new electronic products offer creative possibilities:

“Strings, reeds, and resonating bodies are no longer the only musical controls, but the industrial conveniences of knobs, buttons, and sliders have augmented musical reality such that they demand their own vocabulary of virtuosity.”

I like this quote.  First, I never really thought of instruments like the piano or drums as “musical controls.”  Rather, I thought of them as pretty simple extensions of the human body that wants to make sound.  (Note: the piano is a far from simple instrument!)  Now that I think about it, no musical instrument is simple.  It’s just that after years of playing, an instrument can feel like an extension of my body, when in fact it remains an object with which I am constantly negotiating!  Second, there is no question that electronic music controllers have “augmented musical reality.”  Configured the right way, the twist of a knob can trigger exponential musical processes and seismic sonic changes.  It all depends on how one sets them up.  Which brings us to Nyboer’s third point: these controllers make their own demands on us, specifically on how we think about our musical processes, the software programs we use (or write!), even our philosophy of what music should be.  Thus, Nyboer’s “vocabulary of virtuosity” is not just a matter of getting good at knob-twiddling, button-pushing, and fader-sliding.  What is hinted at here is nothing less than using an electronic “black box”–the best controllers, by the way, are pretty much blank slates onto which we can map whatever kinds of musical systems we want–to bring our music making to a new place outside the box.

Here is a clip of the DJ/Producer Eliot Lipp using the Livid Ohm64 with Ableton Live.  Notice what he says at 3:56:

“To me, I’m trying to set up [his controller] in a way where I can have the ability to do a live remix of a track.  Even if it’s something I just brought into Ableton.  If I want to play it out that night, I have all these parameters set up so I can do live edits of the track: loop it wherever I want, and deal with the blend from one track to the next.”

Here is a clip of another intriguing, blank slate controller, the monome, in action:

Interviews with Roger Linn

Instrument designer and musician Roger Linn is perhaps most famous for inventing the first drum machines (in the early 1980s) to use digitally sampled drum sounds, the LM-1 and LinnDrum.  In the years since, Linn teamed up with Akai to invent the MPC-series of drum machines/sequencers, and lately Linn has turned his attention to making unusual products for guitarists such as the AdrenaLinn filter/effects/sequencer units.  In a way, through his products he has deeply shaped the sounds of popular music over the past thirty years.

I found several interviews with Linn at sweetwater.com (they won’t show up in a Google search).  Here, Linn outlines the history of some of his innovations, such as the LinnDrum.  In the final video, Linn offers his perspective on what is missing in musical hardware innovation today.  He’s amazed that we are still largely playing guitars, basses, keyboards, and drums, and muses about what a new controller or control surface needs in order to compete with these old-fashioned instruments and to be truly interactive.  He says:

“What is the new control surface that allows people not only to just do superb solo work…but also work with other people, to use wireless sync to be able to synchronize together?  Instead of the old-fashioned way, which is to read the same sheet music…You’ve got this interactive merging of the worlds of real-time and recording editing in the form of looping, where you basically record something and immediately play back on top of it.  There are all kinds of great ideas happening, but people are implementing those ideas using fairly poor hardware interfaces–just a keyboard or guitar or mouse–you’re rolling a bar of soap around on a table.  It’s silly.”

An unintentionally funny part of this is that musicians playing acoustic music together do the “wireless sync” thing pretty effortlessly!  (It’s called listening.)  Another thing to remember is that widely used, time-tested instruments like guitars, basses, keyboards, and drums are more than simply musical “controllers.”  What makes them enduring is that they have deep histories of use, a kind of collective consciousness embedded in them that includes all the things that were ever played on them–a repertoire of expressive possibilities.  When someone decides to take up the drums, for instance, he or she is dipping into an ocean of other people who have played that same constellation of instruments (I’m thinking of a drumkit here), struggled to overcome the resistance of the instrument, struggled to learn the technical facility that enables the feeling of being “expressive,” and so on.  Simply put, there is a lot going on beneath the seemingly simple surface of an “old-fashioned” instrument, and that might be part of why we continue to be drawn to them.

The kinds of new electronic/digital controllers Linn is describing don’t have this collective history embedded in them (yet), and this may be part of the reason why it will take time and a lot of work by a lot of musicians to move them securely into our collective embrace.  But consider what happened to turntables after DJs began using them in new ways in discos, in hip hop: thirty years later we now have digital turntables that are widely used and considered expressive instruments.  (And DJs are considered a breed of musician too.)

You can view interviews with Roger Linn here.

You Are The Controller

Last week Microsoft released the Kinect controller for its Xbox video game console.  The Kinect is being hailed/hyped as the next step in gaming technology because it does away with the most annoying part of the gaming experience: those little handheld controllers that serve as an interface between the player and the game.  Nintendo’s Wii got us part of the way there with handheld controllers that respond to body movement.  So what puts Kinect on another level?  It scans the player’s body movements in real time, making the human body itself the controller.  No more wires, no more joysticks, no more buttons to press, nothing to hold.  In the words of the Xbox commercial: “You don’t need to know anything you don’t already know.  Or do anything you don’t already do.  All you have to do is be you.  You are the controller.”

I imagine that the Kinect technology will have resonance for many electronic musicians because musical controllers have long been something that we need to address when composing and performing music.  Pick up a music store catalog and you’ll see lots of controllers for sale, each of them offering the musician the prospect of ever better “control” over their music.  Controllers are always aiming for the kind of almost perfect transparency demonstrated by an acoustic musician at his or her instrument–with maybe only a pair of drumsticks or a violin bow or a mouthpiece or set of piano keys as the “interface” between player and expression.  In my conversations with electronic musicians over the years, one recurring theme is the tantalizing prospect of having nothing come between them and their music.  It’s the dream of having one’s physical (and possibly mental) gestures directly translated into sound, a situation where, as Kinect puts it, you are the controller.
