Brain control interfaces (BCI)

Something I’ve been keeping an eye on lately is brain control interfaces, i.e. devices which let you control things with your mind. There seems to be a sudden flurry of activity in this area this year, with a small handful of products about to hit the consumer and research market! Ten years ago I tried to rig up a throat-mic and speech recognition software to save my poor aching hands, but it couldn’t cut it and besides, who wants to be sitting in an office saying “OPEN FIREFOX”, “NAVIGATE TO FACEBOOK”? Mind control would be so much more efficient and private.

At the dumbed-down end of the market there’s a game called MindFlex, which confuses some people with the levitation aspect of the game, but that’s done with fans. There’s only one sensor plus a reference clip on the ear, so there can’t be many variables to control, but the game looks quite fun. At least the actors in the promo video seemed to be genuinely enjoying themselves and had in fact briefly forgotten their failing acting careers.

At the other end of the spectrum is Honda’s research offering, described in this great Gizmag article, which combines two technologies – brainwaves (EEG) and near-infrared spectroscopy (NIRS) – and requires the rather snappily dressed researcher to sit perfectly still while his slave Asimo is compelled to act out his intentions and wave its arms around. I’m not sure we’ll be seeing that in our living rooms soon. Well, unless you happen to live at NASA or one of the darker military research organisations.

In the middle somewhere are two EEG headsets, the Neurosky and the Emotiv Epoc. My partner (who not coincidentally does EEG for a living) couldn’t see how the Neurosky can be doing much more than detecting muscle activity, as it only has one sensor located in a spot above the eyebrow that most researchers avoid. However, the Emotiv headset looks awesome and is aimed at both the game-developer and researcher markets, with extensive SDKs and integration with the right scientific software. They claim to be able to detect three types of activity: actual active cognitive changes, the usual more indirect brain-wave levels, and muscle activity to control avatars. It’s an interesting combined approach, actually using the muscle signals instead of just being annoyed by them interfering with the true EEG. It’s the detection of intent that really excites me here though, as I can’t see using relaxation to help me do my day job. It turns out that the Emotiv was the brainchild of Dr Allan Snyder, amongst others, whom I coincidentally ran into once while briefly working for the Centre for the Mind at Sydney University. So it’s not surprising this piece of equipment is good, as he’s a very interesting chap.

So while I’m waiting for one of these headsets to appear, I decided to build my own, partly for fun and partly for a science/art installation my partner and I are planning to demonstrate EEG technology (and for me to use Processing for something artitechnic). I decided to write a new game to control as a test case – something we can project on a big screen and get kids to take turns trying to play. We will also display raw EEG graphs and some processed signals to explain how the device works. I convinced my partner to bring home some spare components from her lab, but was slightly surprised to see the hospital-esque qualities of the syringes, cleaning swabs and contact gel… No wonder they have to distract the kids with a fake Hedwig before strapping the Davros-like helmet on them.
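While I wait for real data, here’s roughly the kind of Processing sketch I have in mind for the raw-trace part of the display. The signal() stand-in (a slow sine plus jitter) and the buffer size are purely placeholders of my own until the sensor exists:

```processing
// Rolling "raw EEG" trace for the installation screen.
// signal() is a fake stand-in until real sensor data arrives.

float[] trace = new float[600];   // one sample per pixel column
int head = 0;                     // index of the newest sample

void setup() {
  size(600, 200);
}

float signal() {
  // Fake EEG-ish signal: slow oscillation plus jitter, roughly -1..1
  return 0.6 * sin(frameCount * 0.05) + random(-0.4, 0.4);
}

void draw() {
  trace[head] = signal();
  head = (head + 1) % trace.length;

  background(0);
  stroke(0, 255, 0);
  noFill();
  beginShape();
  for (int i = 0; i < trace.length; i++) {
    float v = trace[(head + i) % trace.length];   // oldest to newest
    vertex(i, height / 2 - v * height * 0.4);
  }
  endShape();
}
```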

I thought I’d build the electronic interface with Arduino, partly because I just want to play with it! So I’ve started (ahem, yes, started…) planning the circuitry necessary to amplify the EEG signals without unintentionally re-programming myself at the same time. It seems I should be able to get everything I need for a couple of hundred dollars (AU).
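The circuitry is still only on paper, but the software end of the interface is easy enough to sketch in Processing. This assumes the Arduino will simply stream one analogRead value (0–1023) per line over USB serial at 9600 baud, which is a protocol I haven’t actually settled on yet:

```processing
import processing.serial.*;

// Software end of the Arduino interface: read whatever the amplifier
// board streams over USB serial. Assumes one ASCII integer per line.

Serial port;
float eegLevel = 0;   // latest sample, normalised to 0..1

void setup() {
  size(200, 200);
  // Serial.list()[0] is a guess; pick the Arduino's port on your machine.
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line != null) {
    float v = float(trim(line));
    if (!Float.isNaN(v)) {
      eegLevel = constrain(v / 1023.0, 0, 1);
    }
  }
}

void draw() {
  background(0);
  fill(0, 255, 0);
  rect(0, height * (1 - eegLevel), width, height * eegLevel);  // simple level meter
}
```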

So on to the game. I wanted a game with a very simple control. I only have one sensor, so I’m assuming I only have one variable. This might not necessarily be true, as I could detect a range of brain-waves and variations, but let’s keep things simple. The first game that came to mind was the old classic Lunar Lander! All you have to do is press one key to boost, then let it fall as the lander drifts sideways by itself. So what if the brain control works the same way: concentrate to fly higher, relax to fall.
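Here’s a rough Processing sketch of that single-variable control, with a held key standing in for “concentrating” until the sensor is ready (the gravity and thrust constants are just guesses of mine to tune later):

```processing
// One-variable lander: "concentration" in 0..1 is the only control.
// Holding any key stands in for concentrating until the headset exists.

float concentration = 0;     // the single control variable
float y, vy;                 // lander height and vertical speed
float x = 0;                 // drifts sideways by itself
final float GRAVITY = 0.15;
final float THRUST  = 0.35;

void setup() {
  size(600, 400);
  y = height / 2;
}

void draw() {
  // Keyboard stand-in: ramp the control up while a key is held, decay otherwise.
  concentration += keyPressed ? 0.05 : -0.02;
  concentration = constrain(concentration, 0, 1);

  vy += GRAVITY - THRUST * concentration;   // concentrate to rise, relax to fall
  y += vy;
  y = constrain(y, 0, height - 10);
  if (y <= 0 || y >= height - 10) vy = 0;   // don't accumulate speed at the edges
  x = (x + 1) % width;                      // constant sideways drift

  background(0);
  fill(255);
  rect(x, y, 10, 10);                       // placeholder lander
}
```

Once the headset works, the keyPressed line just gets swapped for the smoothed signal coming off the serial port.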

I wanted to capture both girls’ and boys’ imaginations, so I designed two versions – one with a bee landing on a flower (with yucky bugs on either side), and of course the classic lunar version. With a little bit of help from Dreamstime and Processing, voilà! We have a little pair of games ready for the plugging-in of a brain.

You can play the games here with the old-fashioned manual methods until I finish the sensor. Now where did I put that old ’80s headband…
