Multimedia artist Jason Snell (98BA) saw an advertisement for an EEG device a couple of years ago and had a thought. The high-tech headpiece had been marketed to assist with calming the mind during meditation. But Snell wondered if he could somehow pair the device, which measures brain activity through forehead sensors, with a music synthesizer.
After purchasing the biosensor, doing a day's worth of coding with a developer's kit, and connecting it via Bluetooth to his home-brewed MIDI app, he had his answer. "I put on the EEG, focused my brain, then let go," recalls Snell. "I saw the little light turn on for the note, and it was a complete eureka. I was like, 'Oh my God, I just played a note with my brain.'"
Snell has been refining his unique brand of mind-controlled electronic music ever since. The Cedar Rapids native, who divides his time between Iowa and New York, has performed solo at techno festivals across the U.S. and as far away as Germany under the moniker Primary Assembly. On stage, Snell meditates while wearing the biosensor—sold commercially as the Muse headband—and shifts his brain activity to create an otherworldly symphony of sound. Keyboard chords ebb and flow. Drum machines pulse. Behind the motionless musician, a live readout of his brain's electrical waves unspools on a screen.
The result is a futuristic fusion of physiology, music, and performance art. Using MIDI coding, Snell assigns different sounds from multiple synthesizers to modulations in the electrical activity of his brain—the alpha, beta, gamma, delta, and theta waves measured by the headpiece. He says he's able to compose pieces on the fly by consciously causing peaks and valleys in those brainwaves.
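For readers curious how such a mapping might work in code, here is a minimal, illustrative sketch: it converts normalized EEG band-power readings into MIDI-style note events. The band names are the standard ones mentioned above, but the threshold, scaling, and note assignments are hypothetical assumptions for illustration—not Snell's actual setup.

```python
# Illustrative sketch: map EEG frequency-band power to MIDI note events.
# Thresholds, scaling, and note choices are assumptions, not Snell's setup.

BANDS = ["delta", "theta", "alpha", "beta", "gamma"]

# Hypothetical mapping: each band drives one synth voice (a MIDI note number).
BAND_TO_NOTE = {"delta": 36, "theta": 43, "alpha": 48, "beta": 55, "gamma": 60}

def band_power_to_midi(powers, threshold=0.5):
    """Convert normalized band powers (0.0-1.0) into (note, velocity) events.

    A band whose power exceeds the threshold triggers its note, with
    velocity scaled from how far the power rises above the threshold.
    """
    events = []
    for band in BANDS:
        p = powers.get(band, 0.0)
        if p > threshold:
            # Scale the excess power into the 1-127 MIDI velocity range.
            velocity = max(1, min(127, round((p - threshold) / (1 - threshold) * 127)))
            events.append((BAND_TO_NOTE[band], velocity))
    return events

# Example: a relaxed, meditative state with strong alpha activity.
print(band_power_to_midi({"alpha": 0.9, "beta": 0.3, "theta": 0.6}))
# -> [(43, 25), (48, 102)]
```

In a live setup, events like these would be sent to a synthesizer over a MIDI connection; here they are simply printed.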
"The way I learned to control it was by moving to and from different parts of my brain—going from verbal thinking, to thinking about movement, to meditating and clearing my brain and relaxing," says Snell. "By moving from area to area, that starts to control the different waves. The process is similar to learning a new instrument. The more I use it, the more skilled I get."
Snell, who majored in journalism and mass communication at the UI, has worked in web development and graphic design. Since buying his first drum machine at 19, he's been drawn to the convergence of music and technology. In 2009, he developed an iPhone app that composed "generative" music based on the user's inputs. More recently, he devised a way to translate dance movements into music using motion sensors, leading to collaborations with the UI Department of Dance and the UI Stanley Museum of Art. Last year, Snell demonstrated his brain-controlled music at events hosted by the City of Iowa City and the UI Virginia A. Myers NEXUS of Engineering and the Arts program, which brings together artists and engineers.
"The music that comes out can be very beautiful, very organic," Snell says of his Primary Assembly performances. "My brain creates and perceives simultaneously. So, as it's creating music, it's hearing that music in real time, and that influences the creation process. It blurs the line between cause and effect."