Friends,
A while back I posted a thread about making music with our minds.
This could very well happen… and soon.
There is a product called EMOTIV, a brain machine interface designed for gaming. It is a wireless EEG headset that is supposed to detect neurological impulses and translate them into things your computer can recognize: a gyroscope for cursor control, a sensor that reads your moods, and a sensor that reads your facial expressions and routes them to keystrokes.
This machine is $299 and is supposed to ship on Dec 21st. emotiv.com
Now, the reason I'm posting this here is that it would essentially revolutionize the way we make music. If the data output from that unit could be translated into MIDI CC, then imagine being able to assign facial expressions or certain thoughts to assignable parameters of your instrument.
I know I have wanted another set of arms to play a second instrument. Imagine playing your guitar and controlling pitches on your Voyager, Phatty or Midi Murf. Imagine playing the drums and controlling a bass synthesizer along with it.
Imagine just playing your voyager and tilting your head to bend notes or to sweep the filter.
This would definitely change the way we interface with our computer synthesizers, lighting rigs, drum machines, anything.
Perhaps we should email them and ask that they consider MIDI implementation in a unit such as this. We are already using Wii Remote theremins and other things like the Buchla Lightning controllers.
I have long been an advocate of alternative controllers. As for making music with your mind, Alvin Lucier actually did it a long time ago, translating an EEG signal into music.
This is a more expensive version of what you are talking about from I-Cube X:
As you can see, it's not exactly someone playing Toccata and Fugue in D Minor with their mind, but Lucier's work is groundbreaking in pointing toward new ways to make music. Some of the most interesting electronic music was done when synthesizers were first coming out, or even before that using tape. Commercialism has often squelched the experimental spirit.
I think these things are in their infancy still but well worth spending some time with and seeing what is possible. Certainly the technology has gone well beyond that available to Lucier.
That is exactly what this unit is doing: translating EEG signals into something a game can read, although this one appears to offer accurate control.
Supposedly it reads emotions, motor movements, and some cognitive thoughts, so motor movements can control left, right, up, down, and rotation of objects on the screen. If that is true, then that data could definitely be sent to a destination such as filter cutoff, MIDI notes, pitch bend, or modulation.
The motor-skill functions are mapped to mimic certain keystrokes; for example, one could "lift" an object in a game by levitation, via motor movements of the face, personalized to your specific desire and brain profile. This could definitely be routed to something specific that MIDI can do.
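To make the idea concrete, here is a minimal sketch of what "routing a sensor reading to something MIDI can do" looks like at the byte level. The sensor values and CC assignment are assumptions (a real headset would expose its data through a vendor SDK); CC 74 is just a conventional choice for filter cutoff.

```python
def sensor_to_cc(value, cc=74, channel=0):
    """Scale a normalized 0.0-1.0 sensor reading into a 3-byte MIDI
    Control Change message. CC 74 is commonly mapped to filter cutoff."""
    data = max(0, min(127, int(round(value * 127))))
    return bytes([0xB0 | channel, cc, data])

def sensor_to_pitch_bend(value, channel=0):
    """Scale a -1.0..1.0 reading (e.g. head tilt) into a 14-bit
    MIDI Pitch Bend message (LSB first, then MSB)."""
    bend = max(0, min(16383, int(round((value + 1.0) / 2.0 * 16383))))
    return bytes([0xE0 | channel, bend & 0x7F, (bend >> 7) & 0x7F])
```

Anything that can emit these three-byte messages out a MIDI port could drive a Voyager's filter or bend notes, regardless of whether the source was a knob, a head tilt, or an EEG reading.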
Now imagine being a guitar player and using motor movements (we all know those air-guitar facial expressions) to trigger synthesizer functions. There is also a sensor that detects your emotions, such as whether you are surprised by how you kill a friend in a video game (or by the musical passage you play), which could generate other MIDI events, such as triggering strobe lights.
Now, I first researched this for a paper on the history of prosthetic devices. I got into the "future" of this technology, which discussed the BMI (Brain Machine Interface) and how it would enable people affected by strokes to communicate better using computers. At the time, researchers had gotten a monkey to drink with a prosthetic arm by thinking.
This was due to tack-like implants in the cortex of the brain that more accurately read neurological impulses, amplified them, and sent them to a large computer. The results at that time amounted to barely moving a cursor across the screen.
This technology hadn't been tried with human implantation of the "tacks". It probably won't develop to the limits of our imagination until regular implantation of these neurological implants in the cortex of the brain becomes commonplace.
MC,
That would depend on your motor movement. It would probably be Moogasmic.
Lux,
That is exactly the point. Emotiv doesn't mention musical instruments; this is merely a brain machine interface for GAMERS. A waste, if you ask me, but once they translate the neural output into something a computer can understand, how hard is it then to translate that to MIDI? They are already doing that with Wii Remotes (controlling the LP).
I emailed them about this and registered on their forum to talk to them about it. It is definitely a part of the market they should capitalize on.
I wasn't aware of those other, specifically music-related links you had sent.
I'm glad you got in touch with the company, because yes, if programmed right this could have great musical applications. I think that in general, as computer speeds increase and software becomes more sophisticated, recognition of patterns, be they visual or from brain waves, could have incredible applications for music.
Even now, while it's possible to interface the Wii with Max/MSP, no one has realized the potential of simply creating a cheap box with a MIDI out that sends a MIDI controller message for each of the Wii motions.
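The logic inside such a "cheap box" would be almost trivial. Here is a rough sketch, assuming the original Wii Remote's roughly 8-bit-per-axis accelerometer readings; the axis-to-CC assignments are arbitrary choices (General Purpose Controllers 1-3), and actually reading the remote would require a Bluetooth HID library, which is omitted here.

```python
# Hypothetical mapping from accelerometer axes to MIDI CC numbers.
AXIS_TO_CC = {"x": 16, "y": 17, "z": 18}  # General Purpose Controllers 1-3

def axes_to_cc_messages(axes, channel=0):
    """Convert raw accelerometer readings, e.g. {'x': 0-255, ...},
    into one 3-byte MIDI Control Change message per axis."""
    msgs = []
    for axis, raw in sorted(axes.items()):
        data = max(0, min(127, raw // 2))  # 8-bit reading -> 7-bit CC value
        msgs.append(bytes([0xB0 | channel, AXIS_TO_CC[axis], data]))
    return msgs
```

Pipe those bytes out a hardware MIDI port and any synth with assignable CCs could respond to waving the remote, no computer in the loop.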
I have a significant interest in gestural controllers but just have not had the time to get into Max/MSP, which is really the best environment for experimenting with alternative controllers.
Buchla also has the Lightning, but I have been extremely disappointed with the video demos. One looks great; the other I could find is terrible, but even the first does not explain which motions are being mapped.
The secret of any gestural controller, be it driven by brain waves or otherwise, is in the mapping of the sensors to meaningful MIDI data, or OSC as the case may be with controllers such as the Lemur.
This particular sensor measures the levels of different colors. Now imagine a controller that you don't move but that moves itself and can sense color. There are many ways of using this as a controller. I thought of getting some of these sensors because they send out voltages, and I think that with some modifications they might work with Moogerfoogers.
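The "some modifications" would mostly come down to rescaling the sensor's output voltage into whatever range the CV input expects. A back-of-the-envelope sketch, where both the 0-3.3 V sensor range and the 0-5 V target are assumptions; check the actual specs of your sensor and the input you are driving before wiring anything up:

```python
def scale_voltage(v_in, in_min=0.0, in_max=3.3, out_min=0.0, out_max=5.0):
    """Linearly rescale a sensor output voltage into a target CV range,
    clamping to the output bounds. Ranges here are illustrative only."""
    t = (v_in - in_min) / (in_max - in_min)  # normalize to 0..1
    t = max(0.0, min(1.0, t))                # clamp out-of-range readings
    return out_min + t * (out_max - out_min)
```

In hardware this is just an op-amp gain (and possibly offset) stage, but running the numbers first tells you what gain to design for.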
There is actually software that converts images to music, and I even remember a video of a bot doing this, although I can't remember its name, so I can't supply a link. Many people in music are thinking way outside the box right now, but these innovations remain on the DIY side in hardware, often in Max/MSP in software, and usually a combination of the two.
The real trick is to find something that is effective musically. That was the genius of Moog. At the time Moog made his famous modular, it seemed as foreign as any of these ideas, if not more so. Moog also lived in a time that was receptive to musical innovation. I am not sure ours is anymore.
Moog is one of the few companies out there that is not just making boxes with clichéd sounds for clichéd music.
Hopefully some innovation can come of it, but getting it to the commercial side is a hard pill for a lot of music-company executives to swallow.