Tag Archives: brain-computer interface

Mind-controlled prostheses ready for real world activities

There’s some exciting news from Sweden’s Chalmers University of Technology about prosthetics. From an Oct. 8, 2014 news item on ScienceDaily,

For the first time, robotic prostheses controlled via implanted neuromuscular interfaces have become a clinical reality. A novel osseointegrated (bone-anchored) implant system gives patients new opportunities in their daily life and professional activities.

In January 2013 a Swedish arm amputee was the first person in the world to receive a prosthesis with a direct connection to bone, nerves and muscles. …

An Oct. 8, 2014 Chalmers University press release (also on EurekAlert), which originated the news item, provides more details about the research and this ‘real world’ prosthetic device,

“Going beyond the lab to allow the patient to face real-world challenges is the main contribution of this work,” says Max Ortiz Catalan, research scientist at Chalmers University of Technology and leading author of the publication.

“We have used osseointegration to create a long-term stable fusion between man and machine, where we have integrated them at different levels. The artificial arm is directly attached to the skeleton, thus providing mechanical stability. Then the human’s biological control system, that is nerves and muscles, is also interfaced to the machine’s control system via neuromuscular electrodes. This creates an intimate union between the body and the machine; between biology and mechatronics.”

The direct skeletal attachment is created by what is known as osseointegration, a technology in limb prostheses pioneered by associate professor Rickard Brånemark and his colleagues at Sahlgrenska University Hospital. Rickard Brånemark led the surgical implantation and collaborated closely with Max Ortiz Catalan and Professor Bo Håkansson at Chalmers University of Technology on this project.

The patient’s arm was amputated over ten years ago. Before the surgery, his prosthesis was controlled via electrodes placed over the skin. Robotic prostheses can be very advanced, but such a control system makes them unreliable and limits their functionality, and patients commonly reject them as a result.

Now, the patient has been given a control system that is directly connected to his own. He has a physically challenging job as a truck driver in northern Sweden, and since the surgery he has experienced that he can cope with all the situations he faces; everything from clamping his trailer load and operating machinery, to unpacking eggs and tying his children’s skates, regardless of the environmental conditions (read more about the benefits of the new technology below).

The patient is also one of the first in the world to take part in an effort to achieve long-term sensation via the prosthesis. Because the implant is a bidirectional interface, it can also be used to send signals in the opposite direction – from the prosthetic arm to the brain. This is the researchers’ next step, to clinically implement their findings on sensory feedback.

“Reliable communication between the prosthesis and the body has been the missing link for the clinical implementation of neural control and sensory feedback, and this is now in place,” says Max Ortiz Catalan. “So far we have shown that the patient has a long-term stable ability to perceive touch in different locations in the missing hand. Intuitive sensory feedback and control are crucial for interacting with the environment, for example to reliably hold an object despite disturbances or uncertainty. Today, no patient walks around with a prosthesis that provides such information, but we are working towards changing that in the very short term.”

The researchers plan to treat more patients with the novel technology later this year.

“We see this technology as an important step towards more natural control of artificial limbs,” says Max Ortiz Catalan. “It is the missing link for allowing sophisticated neural interfaces to control sophisticated prostheses. So far, this has only been possible in short experiments within controlled environments.”

The researchers have provided an image of the patient using his prosthetic arm in the context of his work as a truck driver,

[downloaded from http://www.chalmers.se/en/news/Pages/Mind-controlled-prosthetic-arms-that-work-in-daily-life-are-now-a-reality.aspx]

The news release offers some additional information about the device,

The new technology is based on the OPRA treatment (osseointegrated prosthesis for the rehabilitation of amputees), where a titanium implant is surgically inserted into the bone and becomes fixated to it by a process known as osseointegration (Osseo = bone). A percutaneous component (abutment) is then attached to the titanium implant to serve as a metallic bone extension, where the prosthesis is then fixated. Electrodes are implanted in nerves and muscles as the interfaces to the biological control system. These electrodes record signals which are transmitted via the osseointegrated implant to the prostheses, where the signals are finally decoded and translated into motions.
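
The release stops short of saying how the recorded signals become motions. For the curious, here's a minimal sketch (my own illustration, not the Chalmers group's algorithm) of how windowed myoelectric signals are commonly turned into motion commands, assuming two hypothetical recording channels and simple amplitude thresholds,

```python
import numpy as np

def mean_absolute_value(window):
    """Classic myoelectric amplitude feature: mean of the rectified signal."""
    return np.mean(np.abs(window), axis=0)

def decode_motion(emg_window, thresholds):
    """Map per-channel amplitude features to a motion command.

    emg_window: (n_samples, n_channels) array of raw EMG.
    thresholds: per-channel activation thresholds (hypothetical values).
    """
    active = mean_absolute_value(emg_window) > thresholds
    if active[0] and active[1]:
        return "co_contraction"  # both muscle groups firing at once
    if active[0]:
        return "open_hand"       # e.g., an extensor-muscle channel
    if active[1]:
        return "close_hand"      # e.g., a flexor-muscle channel
    return "rest"

# Example: a 200 ms window at 1 kHz sampling, with only the second channel active.
rng = np.random.default_rng(0)
window = rng.normal(0.0, [0.05, 0.30], size=(200, 2))
print(decode_motion(window, thresholds=np.array([0.1, 0.1])))  # -> close_hand
```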

There are also some videos of the patient demonstrating various aspects of this device available here (keep scrolling) along with more details about what makes this device so special.

Here’s a link to and a citation for the research paper,

An osseointegrated human-machine gateway for long-term sensory feedback and motor control of artificial limbs by Max Ortiz-Catalan, Bo Håkansson, and Rickard Brånemark. Sci. Transl. Med. Vol. 6, Issue 257, p. 257re6 (8 October 2014). DOI: 10.1126/scitranslmed.3008933

This article is behind a paywall and appears to be part of a special issue or special section, so keep scrolling down the linked-to page to find more articles on this topic.

I have written about similar research in the past. Notably, there’s a July 19, 2011 post about work on Intraosseous Transcutaneous Amputation Prosthesis (ITAP) and a May 17, 2012 post featuring a video of a woman reaching with a robotic arm for a cup of coffee using her thoughts alone to control the arm.

DARPA (US Defense Advanced Research Projects Agency) wants to crowdsource cheap brain-computer interfaces

The US Defense Advanced Research Projects Agency wants the DIY (or Maker) community to develop inexpensive brain-computer interfaces, according to a Sept. 27, 2013 news item by John Hewitt on phys.org,

This past Saturday [Sept. 21, 2013], at the Maker Faire in New York, a new low-cost EEG recording front end was debuted at DARPA’s booth. Known as OpenBCI, the device can process eight channels of high quality EEG data, and interface it to popular platforms like Arduino. …

DARPA program manager William Casebeer said that his goal was to return next year to the Maker meeting with a device that costs under $30.

Adrianne Jeffries’ Sept. 22, 2013 article for The Verge provides more information (Note: Links have been removed),

A working prototype of a low-cost electroencephalography device funded by the US Defense Advanced Research Projects Agency (DARPA) made its debut in New York this weekend [Sept. 21 – 22, 2013], the first step in the agency’s effort to jumpstart a do-it-yourself revolution in neuroscience.
There are some products like those in the Neurosky lineup, which range from $99 to $130. But most neural monitoring tools are relatively expensive and proprietary, the OpenBCI [OpenBCI, an open source device built to capture signals from eight electrodes at a time] team explained, which makes it tough for the casual scientist, hacker, or artist to play with EEG. If neural monitoring were cheap and open, we’d start to see more science experiments, art projects, mind-controlled video games, and even serious research using brainwaves. You could use an at-home EEG to create a brain-powered keyboard, for example, Dr. Allen [Lindsey Allen, engineer for Creare;  OpenBCI was built by Creare and biofeedback scientist Joel Murphy, and the prototype was finished only two weeks ago] said, and learn how to type with your mind.
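
For readers who want a sense of what interfacing eight channels of EEG to a computer involves, here's a minimal sketch of reading such a stream over a serial port with pyserial. The packet layout (0xA0 start byte, sample number, eight 24-bit big-endian samples, accelerometer bytes, 0xC0 stop byte) follows what OpenBCI later documented for its boards; the 2013 prototype may well have differed, and the port name and scale factor below are assumptions,

```python
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # hypothetical port name; depends on your machine
START, STOP = 0xA0, 0xC0
SCALE_UV = 0.02235      # approx. microvolts per count at gain 24 (assumed)

def to_signed_24bit(b):
    """Convert three big-endian bytes into a signed integer."""
    value = (b[0] << 16) | (b[1] << 8) | b[2]
    return value - (1 << 24) if value & 0x800000 else value

with serial.Serial(PORT, 115200, timeout=1) as port:
    while True:
        if port.read(1) != bytes([START]):
            continue                 # resynchronize on the start byte
        packet = port.read(32)       # sample no. + 8 ch x 3 B + accel + stop
        if len(packet) < 32 or packet[-1] != STOP:
            continue                 # drop incomplete or misaligned packets
        sample_no = packet[0]
        channels = [to_signed_24bit(packet[1 + 3*i : 4 + 3*i]) * SCALE_UV
                    for i in range(8)]
        print(sample_no, [round(c, 1) for c in channels])
```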

I have written about various brain-computer interfaces previously, the most recent being a Dec. 5, 2012 posting about Muse, a $199 brainwave computer controller.

‘Touching’ infrared light, if you’re a rat followed by announcement of US FDA approval of first commercial artificial retina (bionic eye)

Researcher Miguel Nicolelis and his colleagues at Duke University have implanted a neuroprosthetic device in the portion of a rat’s brain related to touch that allows the rats to see infrared light. From the Feb. 12, 2013 news release on EurekAlert,

Researchers have given rats the ability to “touch” infrared light, normally invisible to them, by fitting them with an infrared detector wired to microscopic electrodes implanted in the part of the mammalian brain that processes tactile information. The achievement represents the first time a brain-machine interface has augmented a sense in adult animals, said Duke University neurobiologist Miguel Nicolelis, who led the research team.

The experiment also demonstrated for the first time that a novel sensory input could be processed by a cortical region specialized in another sense without “hijacking” the function of this brain area said Nicolelis. This discovery suggests, for example, that a person whose visual cortex was damaged could regain sight through a neuroprosthesis implanted in another cortical region, he said.

Although the initial experiments tested only whether rats could detect infrared light, there seems no reason that these animals in the future could not be given full-fledged infrared vision, said Nicolelis. For that matter, cortical neuroprostheses could be developed to give animals or humans the ability to see in any region of the electromagnetic spectrum, or even magnetic fields. “We could create devices sensitive to any physical energy,” he said. “It could be magnetic fields, radio waves, or ultrasound. We chose infrared initially because it didn’t interfere with our electrophysiological recordings.”

Interestingly, the research was supported by the US National Institute of Mental Health (as per the news release).

The researchers have more to say about what they’re doing,

“The philosophy of the field of brain-machine interfaces has until now been to attempt to restore a motor function lost to lesion or damage of the central nervous system,” said Thomson [Eric Thomson], first author of the study. “This is the first paper in which a neuroprosthetic device was used to augment function—literally enabling a normal animal to acquire a sixth sense.”

Here’s how they conducted the research,

The mammalian retina is blind to infrared light, and mammals cannot detect any heat generated by the weak infrared light used in the studies. In their experiments, the researchers used a test chamber that contained three light sources that could be switched on randomly. Using visible LED lights, they first taught each rat to choose the active light source by poking its nose into an attached port to receive a reward of a sip of water.

After training the rats, the researchers implanted in their brains an array of stimulating microelectrodes, each roughly a tenth the diameter of a human hair. The microelectrodes were implanted in the cortical region that processes touch information from the animals’ facial whiskers.

Attached to the microelectrodes was an infrared detector affixed to the animals’ foreheads. The system was programmed so that orientation toward an infrared light would trigger an electrical signal to the brain. The signal pulses increased in frequency with the intensity and proximity of the light.
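
In other words, the system encodes light intensity as stimulation frequency. A toy version of that mapping (the threshold and pulse rates here are hypothetical; the paper's actual parameters may differ) might look like this,

```python
# All parameters are hypothetical; the paper's actual ranges may differ.
MIN_RATE_HZ = 0.0     # no stimulation when no infrared is detected
MAX_RATE_HZ = 400.0   # assumed ceiling on the stimulation pulse rate
THRESHOLD = 0.05      # detector readings below this count as "no light"

def stimulation_rate(ir_reading):
    """Map a normalized IR detector reading (0..1) to a pulse rate in Hz.

    Mirrors the described scheme: pulse frequency grows with the
    intensity and proximity of the infrared source.
    """
    if ir_reading < THRESHOLD:
        return MIN_RATE_HZ
    return MIN_RATE_HZ + (MAX_RATE_HZ - MIN_RATE_HZ) * min(ir_reading, 1.0)

for reading in (0.0, 0.1, 0.5, 1.0):
    print(f"IR={reading:.1f} -> {stimulation_rate(reading):.0f} pulses/s")
```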

The researchers returned the animals to the test chamber, gradually replacing the visible lights with infrared lights. At first in infrared trials, when a light was switched on the animals would tend to poke randomly at the reward ports and scratch at their faces, said Nicolelis. This indicated that they were initially interpreting the brain signals as touch. However, over about a month, the animals learned to associate the brain signal with the infrared source. They began to actively “forage” for the signal, sweeping their heads back and forth to guide themselves to the active light source. Ultimately, they achieved a near-perfect score in tracking and identifying the correct location of the infrared light source.

To ensure that the animals were really using the infrared detector and not their eyes to sense the infrared light, the researchers conducted trials in which the light switched on, but the detector sent no signal to the brain. In these trials, the rats did not react to the infrared light.

Their finding could have an impact on notions of mammalian brain plasticity,

A key finding, said Nicolelis, was that enlisting the touch cortex for light detection did not reduce its ability to process touch signals. “When we recorded signals from the touch cortex of these animals, we found that although the cells had begun responding to infrared light, they continued to respond to whisker touch. It was almost like the cortex was dividing itself evenly so that the neurons could process both types of information.”

This finding of brain plasticity is in contrast with the “optogenetic” approach to brain stimulation, which holds that a particular neuronal cell type should be stimulated to generate a desired neurological function. Rather, said Nicolelis, the experiments demonstrate that a broad electrical stimulation, which recruits many distinct cell types, can drive a cortical region to adapt to a new source of sensory input.

All of this work is part of Nicolelis’ larger project ‘Walk Again’ which is mentioned in my March 16, 2012 posting and includes a reference to some ethical issues raised by the work. Briefly, Nicolelis and an international team of collaborators are developing a brain-machine interface that will enable full mobility for people who are severely paralyzed. From the news release,

The Walk Again Project has recently received a $20 million grant from FINEP, a Brazilian research funding agency to allow the development of the first brain-controlled whole body exoskeleton aimed at restoring mobility in severely paralyzed patients. A first demonstration of this technology is expected to happen in the opening game of the 2014 Soccer World Cup in Brazil.

Expanding sensory abilities could also enable a new type of feedback loop to improve the speed and accuracy of such exoskeletons, said Nicolelis. For example, while researchers are now seeking to use tactile feedback to allow patients to feel the movements produced by such “robotic vests,” the feedback could also be in the form of a radio signal or infrared light that would give the person information on the exoskeleton limb’s position and encounter with objects.

There’s more information, including videos, about the work with infrared light and rats at the Nicolelis Lab website. Here’s a citation for and link to the team’s research paper,

Perceiving invisible light through a somatosensory cortical prosthesis by Eric E. Thomson, Rafael Carra, & Miguel A.L. Nicolelis. Nature Communications, published 12 Feb. 2013. DOI: 10.1038/ncomms2497

Meanwhile, the US Food and Drug Administration (FDA) has approved the first commercial artificial retina, from the Feb. 14, 2013 news release,

The U.S. Food and Drug Administration (FDA) granted market approval to an artificial retina technology today, the first bionic eye to be approved for patients in the United States. The prosthetic technology was developed in part with support from the National Science Foundation (NSF).

The device, called the Argus® II Retinal Prosthesis System, transmits images from a small, eye-glass-mounted camera wirelessly to a microelectrode array implanted on a patient’s damaged retina. The array sends electrical signals via the optic nerve, and the brain interprets a visual image.

The FDA approval currently applies to individuals who have lost sight as a result of severe to profound retinitis pigmentosa (RP), an ailment that affects one in every 4,000 Americans. The implant allows some individuals with RP, who are completely blind, to locate objects, detect movement, improve orientation and mobility skills and discern shapes such as large letters.

The Argus II is manufactured by, and will be distributed by, Second Sight Medical Products of Sylmar, Calif., which is part of the team of scientists and engineers from the university, federal and private sectors who spent nearly two decades developing the system with public and private investment.

Scientists are often inspired by family to pursue research in a particular area,

“Seeing my grandmother go blind motivated me to pursue ophthalmology and biomedical engineering to develop a treatment for patients for whom there was no foreseeable cure,” says the technology’s co-developer, Mark Humayun, associate director of research at the Doheny Eye Institute at the University of Southern California and director of the NSF Engineering Research Center for Biomimetic MicroElectronic Systems (BMES). …”

There’s also been considerable government investment,

The effort by Humayun and his colleagues has received early and continuing support from NSF, the National Institutes of Health and the Department of Energy, with grants totaling more than $100 million. The private sector’s support nearly matched that of the federal government.

“The retinal implant exemplifies how NSF grants for high-risk, fundamental research can directly result in ground-breaking technologies decades later,” said Acting NSF Assistant Director for Engineering Kesh Narayanan. “In collaboration with the Second Sight team and the courageous patients who volunteered to have experimental surgery to implant the first-generation devices, the researchers of NSF’s Biomimetic MicroElectronic Systems Engineering Research Center are developing technologies that may ultimately have as profound an impact on blindness as the cochlear implant has had for hearing loss.”

Leaving aside controversies about cochlear implants and the possibility of such controversies with artificial retinas (bionic eyes), it’s interesting to note that this device is dependent on an external camera,

The researchers’ efforts have bridged cellular biology–necessary for understanding how to stimulate the retinal ganglion cells without permanent damage–with microelectronics, which led to the miniaturized, low-power integrated chip for performing signal conversion, conditioning and stimulation functions. The hardware was paired with software processing and tuning algorithms that convert visual imagery to stimulation signals, and the entire system had to be incorporated within hermetically sealed packaging that allowed the electronics to operate in the vitreous fluid of the eye indefinitely. Finally, the research team had to develop new surgical techniques in order to integrate the device with the body, ensuring accurate placement of the stimulation electrodes on the retina.

“The artificial retina is a great engineering challenge under the interdisciplinary constraint of biology, enabling technology, regulatory compliance, as well as sophisticated design science,” adds Liu [Wentai Liu of the University of California, Los Angeles]. “The artificial retina provides an interface between biotic and abiotic systems. Its unique design characteristics rely on system-level optimization, rather than the more common practice of component optimization, to achieve miniaturization and integration. Using the most advanced semiconductor technology, the engine for the artificial retina is a ‘system on a chip’ of mixed voltages and mixed analog-digital design, which provides self-contained power and data management and other functionality. This design for the artificial retina facilitates both surgical procedures and regulatory compliance.”

The Argus II design consists of an external video camera system matched to the implanted retinal stimulator, which contains a microelectrode array that spans 20 degrees of visual field. [emphasis mine] …

“The external camera system – built into a pair of glasses – streams video to a belt-worn computer, which converts the video into stimulus commands for the implant,” says Weiland [USC researcher Jim Weiland]. “The belt-worn computer encodes the commands into a wireless signal that is transmitted to the implant, which has the necessary electronics to receive and decode both wireless power and data. Based on those data, the implant stimulates the retina with small electrical pulses. The electronics are hermetically packaged and the electrical stimulus is delivered to the retina via a microelectrode array.”
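
To make the video-to-stimulus step concrete, here's a rough sketch of the general idea: downsample each camera frame onto the electrode grid and scale brightness into stimulation amplitude. The 6×10 grid matches the Argus II's 60 electrodes, but the processing, scaling, and current ceiling below are my own assumptions, not Second Sight's algorithm,

```python
import numpy as np

GRID_ROWS, GRID_COLS = 6, 10   # 60 electrodes, as in the Argus II array
MAX_CURRENT_UA = 100.0         # hypothetical per-electrode amplitude ceiling

def frame_to_stimulus(frame):
    """Reduce a grayscale camera frame to per-electrode stimulation levels.

    frame: 2D array of pixel intensities in [0, 255].
    Returns a (6, 10) array of stimulation amplitudes in microamps.
    """
    h, w = frame.shape
    bh, bw = h // GRID_ROWS, w // GRID_COLS
    levels = np.empty((GRID_ROWS, GRID_COLS))
    for r in range(GRID_ROWS):
        for c in range(GRID_COLS):
            block = frame[r*bh:(r+1)*bh, c*bw:(c+1)*bw]
            levels[r, c] = block.mean() / 255.0 * MAX_CURRENT_UA
    return levels

# Example: a bright vertical bar in an otherwise dark 120x200 frame.
frame = np.zeros((120, 200))
frame[:, 90:110] = 255
print(frame_to_stimulus(frame).round())
```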

You can see some footage of people using artificial retinas in the context of Grégoire Cosendai’s TEDx Vienna presentation. As I noted in my Aug. 18, 2011 posting where this talk and developments in human enhancement are mentioned, the relevant material can be seen at approximately 13 mins., 25 secs. in Cosendai’s talk.

Second Sight Medical Products can be found here.

A brainwave computer controller named Muse

Toronto-based (Canada) company InteraXon has just presented a portable brainwave controller at the ParisLeWeb 2012 meeting, according to a Dec. 5, 2012 article by Nancy Owano for phys.org,

A Canadian company is talking about having a window, aka computer screen, into your mind. Another of the many ways to put it—they believe your computer can be so into you. And vice-versa. InteraXon, a Canadian company, is focused on making a business out of mind-control technology via a headband device, and they are planning to launch this as a $199 brainwave computer controller called Muse. The company is running an Indiegogo campaign to obtain needed funds. Muse is a Bluetooth-connected headset with four electroencephalography sensors, communicating with the person’s computer via the Bluetooth connection.

Here’s more about the technology from InteraXon’s How It Works webpage,

Your brain generates electrical patterns that resonate outside your head, which accumulate into brainwaves detectable by an Electroencephalograph (EEG). The EEG can’t read your thoughts, just your brain’s overall pattern of activity, like how relaxed or alert you are. With practice you can learn to manipulate your brainwave pattern, like flexing a muscle you’ve never used before.

InteraXon’s interface works by turning brainwaves into binary (ones and zeros). We’re like interpreters fluent in the language of the mind: our system analyses the frequency of your brainwaves and then translates them into a control signal for the computer to understand.

Just like a button or switch can activate whatever it’s connected to, your translated brainwaves can now control anything electric. InteraXon designers and engineers make the experience so seamless, the connected technology seems like an extension of your own body.

It would be nice to have found a little more technical detail.
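
In the absence of those details, here's roughly what “analysing the frequency of your brainwaves and translating them into a control signal” usually means in practice: compute the power in standard EEG bands and map a band ratio to a value between 0 and 1. Everything below (the sampling rate, the bands, the alpha/beta heuristic) is an assumption on my part, not InteraXon's published method,

```python
import numpy as np

FS = 220          # sampling rate in Hz (assumed; actual Muse specs may vary)
ALPHA = (8, 12)   # alpha band, conventionally linked to relaxation
BETA = (13, 30)   # beta band, conventionally linked to focus

def band_power(signal, band, fs=FS):
    """Total spectral power of `signal` inside a frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0/fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].sum()

def control_signal(eeg_window):
    """Map one EEG window to a 0..1 'relaxation' control value.

    Uses an alpha/(alpha+beta) ratio, a common heuristic; InteraXon's
    actual translation scheme is not published here.
    """
    a, b = band_power(eeg_window, ALPHA), band_power(eeg_window, BETA)
    return a / (a + b) if (a + b) > 0 else 0.0

# Example: one second of synthetic EEG dominated by a 10 Hz alpha rhythm.
t = np.arange(FS) / FS
eeg = np.sin(2*np.pi*10*t) + 0.3*np.sin(2*np.pi*20*t)
print(f"control value: {control_signal(eeg):.2f}")  # high value -> relaxed
```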

InteraXon currently features on its website its work for the 2010 Olympics in Vancouver (Canada) as an example of past work,

When visitors arrive at Bright Ideas, InteraXon’s thought-controlled computing experience custom designed and built for the 2010 Olympics, they are led to their own pod. In front of each pod is a large projection screen as well as a small training screen. Once seated, a trained host hands them a headset that will measure their brain’s electrical signals.

With help from the host, the participants learn to deliberately alter their brainwaves. By focusing or relaxing their mind, they learn to change the display on their training screen; music and seat vibrations provide immediate feedback to speed the learning process to five minutes or less. Now they are ready for the main event.

Thoughts are turned into light patterns instantaneously as their brain’s digital signal is beamed over the Rocky Mountains, across vast prairies all the way to three major Ontario icons – a distance of 3000 km.

This project – a first at this grand scale – allows each participant to experience a very personal connection with these massive Ontario landmarks, and with every Canadian watching the lightshow, whether online, or in-person.

As for Muse, InteraXon’s latest project, the company has a campaign on Indiegogo to raise money. Here’s the video on the campaign website,

They seem very excited about it all, don’t they? The question that arises is whether or not you actually need a device to let you know when you’re concentrating or when your thoughts are wandering.  Apparently, the answer is yes. The campaign has raised over $240,000 (they asked for $150,000) and it’s open until Dec. 7, 2012.  If you go today, you will find that in addition to the other pledge inducements there’s a special ParisLeWeb $149 pledge for one day only (Dec. 5, 2012). Here’s where you go.

The Canadian Broadcasting Corporation’s Spark radio programme featured an interview (either in Nov. or Dec. 2012) with Ariel Garten, Chief Executive Officer of InteraXon, discussing her company’s work. You can find podcast no. 197 here (it is approximately 55 mins. and there are other interviews bundled with Garten’s). Thanks to Richard Boyer for the tip about the Spark interview.

I have mentioned brain-computer interfaces previously. There’s the Brain-controlled robotic arm means drinking coffee by yourself for the first time in 15 years May 17, 2012 posting and the Advertising for the 21st Century: B-Reel, ‘storytelling’, and mind control Oct. 6, 2011 posting amongst others.

Less confused about Europe’s FET (Future and Emerging Technologies programme)

I’ve had problems trying to figure out the European Union’s Future and Emerging Technologies programme, so I’m glad to say that the Feb. 10, 2012 news item on Nanowerk offers to clear up a few matters for me (and presumably a few other people too).

From the news item,

Go forth and explore the frontiers of science and technology! This is the unspoken motto of the Future and Emerging Technologies programme (FET), which has for more than 20 years been funding and inspiring researchers across Europe to lay new foundations for information and communication technology (ICT). [emphasis mine]

The vanguard researchers of frontier ICT research don’t always come from IT backgrounds or follow the traditional academic career path. The European Commission’s FET programme encourages unconventional match-ups like chemistry and IT, physics and optics, biology and data engineering. Researchers funded by FET are driven by ideas and a sense of purpose which push the boundaries of science and technology.

They have three funding programmes (from the news item),

To address these challenges, the FET scheme supports long-term ICT programmes under three banners:

  • FET-Open, which has simple and fast mechanisms in place to receive new ideas for projects without pre-conceived boundaries or deadlines;
  • FET-Proactive, which spearheads ‘transformative’ research and supports community-building around a number of fundamental long-term ICT challenges; and
  • FET Flagships, which cut across national and European programmes to unite top research teams pursuing ambitious, large-scale, science-driven research with a visionary goal.

The news item goes on to describe a number of projects including the GRAPHENE-CA flagship pilot currently under consideration, along with five other flagship projects, for one of two 1 Billion Euro prizes. I have commented before (my Feb. 6, 2012 posting) on the communication strategies being employed by at least some of the members of this particular flagship project. Amazingly, they’ve done it again; theirs is the only flagship pilot project mentioned.

You can see the original article on the European Union website here where they have described other projects including this one, PRESENCCIA,

‘Light switches, TV remote controls and even house keys could become a thing of the past thanks to brain-computer interface (BCI) technology being developed in Europe that lets users perform everyday tasks with thoughts alone.’ So begins a story on ICT Results about a pioneering EU-funded FET project called Presenccia*.

Primary applications of BCI are in gaming/virtual reality (VR), home entertainment and domestic care, but the project partners also see their work helping the medical profession. ‘A virtual environment could be used to train a disabled person to control an electric wheelchair through a BCI,’ explained Mel Slater, the project coordinator. ‘It is much safer for them to learn in VR than in the real world, where mistakes could have physical consequences.’

So, PRESENCCIA is a project whereby people will be trained to use a BCI in virtual reality before attempting it in real life. I wish there were a bit more information about this BCI technology being developed in Europe, as I am deeply fascinated and horrified by this notion of thought waves that ‘turn light switches on and off’ or possibly allow you to make a phone call, as Professor Mark Welland at Cambridge University was speculating in 2010 (mentioned in my April 30, 2010 posting [scroll 1/2 way down]). Welland did mention that you would need some sort of brain implant to make a phone call with your thought waves, which is the aspect that makes me most uncomfortable.