Transparent graphene electrode technology and complex brain imaging

Michael Berger has written a May 24, 2018 Nanowerk Spotlight article about some of the latest research on transparent graphene electrode technology and the brain (Note: A link has been removed),

In new work, scientists from the labs of Kuzum [Duygu Kuzum, an Assistant Professor of Electrical and Computer Engineering at the University of California, San Diego {UCSD}] and Anna Devor report a transparent graphene microelectrode neural implant that eliminates light-induced artifacts to enable crosstalk-free integration of 2-photon microscopy, optogenetic stimulation, and cortical recordings in the same in vivo experiment. The new class of transparent brain implant is based on monolayer graphene. It offers a practical pathway to investigate neuronal activity over multiple spatial scales extending from single neurons to large neuronal populations.

Conventional metal-based microelectrodes cannot be used for simultaneous measurements of multiple optical and electrical parameters, which are essential for comprehensive investigation of brain function across spatio-temporal scales. Since they are opaque, they block the field of view of the microscopes and generate optical shadows impeding imaging.

More importantly, they cause light-induced artifacts in electrical recordings, which can significantly interfere with neural signals. The transparent graphene electrode technology presented in this paper addresses these problems and allows seamless, crosstalk-free integration of optical and electrical sensing and manipulation technologies.

In their work, the scientists demonstrate that by careful design of key steps in the fabrication process for transparent graphene electrodes, the light-induced artifact problem can be mitigated and virtually artifact-free local field potential (LFP) recordings can be achieved within operating light intensities.
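
To make the claim of “virtually artifact-free” concrete, a simple way to check it is to compare recordings made while the imaging or stimulation light is on against recordings made while it is off. The sketch below uses entirely hypothetical data; the sampling rate, noise level, and artifact size are invented for illustration and are not the authors’ analysis pipeline.

```python
import numpy as np

def light_artifact_estimate(lfp, light_on):
    """Estimate the light-induced artifact as the difference in mean LFP
    amplitude between light-on and light-off samples (both in microvolts)."""
    return abs(lfp[light_on].mean() - lfp[~light_on].mean())

# Hypothetical example: 10 s of simulated LFP sampled at 1 kHz,
# with a 1 s light pulse that adds a tiny 2 uV offset.
rng = np.random.default_rng(0)
fs = 1000
t = np.arange(0, 10, 1 / fs)
lfp = 50 * rng.standard_normal(t.size)      # background activity, in uV
light_on = (t >= 4.0) & (t < 5.0)           # light pulse between 4 s and 5 s
lfp[light_on] += 2.0                        # small residual artifact, in uV

print(f"Estimated artifact: {light_artifact_estimate(lfp, light_on):.1f} uV")
```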

“Optical transparency of graphene enables seamless integration of imaging, optogenetic stimulation and electrical recording of brain activity in the same experiment with animal models,” Kuzum explains. “Different from conventional implants based on metal electrodes, graphene-based electrodes do not generate any electrical artifacts upon interacting with light used for imaging or optogenetics. That enables crosstalk-free integration of three modalities: imaging, stimulation and recording to investigate brain activity over multiple spatial scales extending from single neurons to large populations of neurons in the same experiment.”

The team’s new fabrication process avoids any crack formation in the transfer process, resulting in a 95-100% yield for the electrode arrays. This fabrication quality is important for expanding this technology to high-density large area transparent arrays to monitor brain-scale cortical activity in large animal models or humans.

“Our technology is also well-suited for neurovascular and neurometabolic studies, providing a ‘gold standard’ neuronal correlate for optical measurements of vascular, hemodynamic, and metabolic activity,” Kuzum points out. “It will find application in multiple areas, advancing our understanding of how microscopic neural activity at the cellular scale translates into macroscopic activity of large neuron populations.”

“Combining optical techniques with electrical recordings using graphene electrodes will allow us to connect the large body of neuroscience knowledge obtained from animal models to human studies mainly relying on electrophysiological recordings of brain-scale activity,” she adds.

Next steps for the team involve employing this technology to investigate coupling and information transfer between different brain regions.

This work is part of the US BRAIN (Brain Research through Advancing Innovative Neurotechnologies) initiative and there’s more than one team working with transparent graphene electrodes. John Hewitt in an Oct. 21, 2014 posting on ExtremeTech describes two other teams’ work (Note: Links have been removed),

The solution [to the problems with metal electrodes], now emerging from multiple labs throughout the universe is to build flexible, transparent electrode arrays from graphene. Two studies in the latest issue of Nature Communications, one from the University of Wisconsin-Madison and the other from Penn [University of Pennsylvania], describe how to build these devices.

The University of Wisconsin researchers are either a little bit smarter or just a little bit richer, because they published their work open access. It’s a no-brainer then that we will focus on their methods first, and also in more detail. To make the arrays, these guys first deposited the parylene (polymer) substrate on a silicon wafer, metalized it with gold, and then patterned it with an electron beam to create small contact pads. The magic was to then apply four stacked single-atom-thick graphene layers using a wet transfer technique. These layers were then protected with a silicon dioxide layer, another parylene layer, and finally molded into brain signal recording goodness with reactive ion etching.

The researchers went with four graphene layers because that provided optimal mechanical integrity and conductivity while maintaining sufficient transparency. They tested the device in opto-enhanced mice whose neurons expressed proteins that react to blue light. When they hit the neurons with a laser fired in through the implant, the protein channels opened and fired the cell beneath. The masterstroke that remained was then to successfully record the electrical signals from this firing, sit back, and wait for the Nobel prize office to call.

The Penn State group [Note: Every researcher mentioned in the paper Hewitt linked to is from the University of Pennsylvania] used a similar 16-spot electrode array (pictured above right), and proceeded — we presume — in much the same fashion. Their angle was to perform high-resolution optical imaging, in particular calcium imaging, right out through the transparent electrode arrays, which simultaneously recorded high-temporal-resolution electrical signals. They did this in slices of the hippocampus where they could bring to bear the complex and multifarious hardware needed to perform confocal and two-photon microscopy. These latter techniques provide a boost in spatial resolution by zeroing in over narrow planes inside the specimen, and limiting the background by the requirement of two photons to generate an optical signal. We should mention that there are voltage-sensitive dyes available, in addition to standard calcium dyes, which can almost record the fastest single spikes, but electrical recording still reigns supreme for speed.

What a mouse looks like with an optogenetics system plugged in

One concern of both groups in making these kinds of simultaneous electro-optic measurements was the generation of light-induced artifacts in the electrical recordings. This potential complication, called the Becquerel photovoltaic effect, has been known to exist since it was first demonstrated back in 1839. When light hits a conventional metal electrode, a photoelectrochemical (or more simply, a photovoltaic) effect occurs. If present in these recordings, such artifacts could be difficult to disambiguate from genuine neural signals. The Penn researchers reported that they saw no significant artifact, while the Wisconsin researchers saw some small effects with their device. In particular, when compared with platinum electrodes implanted in the opposite cortical hemisphere, the Wisconsin researchers found that the artifact from their graphene electrodes was similar to that obtained from platinum.

Here’s a link to and a citation for the latest research from UCSD,

Deep 2-photon imaging and artifact-free optogenetics through transparent graphene microelectrode arrays by Martin Thunemann, Yichen Lu, Xin Liu, Kıvılcım Kılıç, Michèle Desjardins, Matthieu Vandenberghe, Sanaz Sadegh, Payam A. Saisan, Qun Cheng, Kimberly L. Weldy, Hongming Lyu, Srdjan Djurovic, Ole A. Andreassen, Anders M. Dale, Anna Devor, & Duygu Kuzum. Nature Communications, volume 9, Article number: 2035 (2018). DOI: 10.1038/s41467-018-04457-5. Published: 23 May 2018

This paper is open access.

You can find out more about the US BRAIN initiative here and, if you’re curious, you can find out more about the project at UCSD here. Duygu Kuzum (now at UCSD) was at the University of Pennsylvania in 2014 and participated in the work mentioned in Hewitt’s 2014 posting.

Stretchy optical materials for implants that could pulse light

An Oct. 17, 2016 Massachusetts Institute of Technology (MIT) news release (also on EurekAlert) by Emily Chu describes research that could lead to long-lasting implants offering preventive health strategies,

Researchers from MIT and Harvard Medical School have developed a biocompatible and highly stretchable optical fiber made from hydrogel — an elastic, rubbery material composed mostly of water. The fiber, which is as bendable as a rope of licorice, may one day be implanted in the body to deliver therapeutic pulses of light or light up at the first sign of disease. [emphasis mine]

The researchers say the fiber may serve as a long-lasting implant that would bend and twist with the body without breaking down. The team has published its results online in the journal Advanced Materials.

Using light to activate cells, and particularly neurons in the brain, is a highly active field known as optogenetics, in which researchers deliver short pulses of light to targeted tissues using needle-like fibers, through which they shine light from an LED source.

“But the brain is like a bowl of Jell-O, whereas these fibers are like glass — very rigid, which can possibly damage brain tissues,” says Xuanhe Zhao, the Robert N. Noyce Career Development Associate Professor in MIT’s Department of Mechanical Engineering. “If these fibers could match the flexibility and softness of the brain, they could provide long-term, more effective stimulation and therapy.”

Getting to the core of it

Zhao’s group at MIT, including graduate students Xinyue Liu and Hyunwoo Yuk, specializes in tuning the mechanical properties of hydrogels. The researchers have devised multiple recipes for making tough yet pliable hydrogels out of various biopolymers. The team has also come up with ways to bond hydrogels with various surfaces such as metallic sensors and LEDs, to create stretchable electronics.

The researchers only thought to explore hydrogel’s use in optical fibers after conversations with the bio-optics group at Harvard Medical School, led by Associate Professor Seok-Hyun (Andy) Yun. Yun’s group had previously fabricated an optical fiber from hydrogel material that successfully transmitted light through the fiber. However, the material broke apart when bent or slightly stretched. Zhao’s hydrogels, in contrast, could stretch and bend like taffy. The two groups joined efforts and looked for ways to incorporate Zhao’s hydrogel into Yun’s optical fiber design.

Yun’s design consists of a core material encased in an outer cladding. To transmit the maximum amount of light through the core of the fiber, the core and the cladding should be made of materials with very different refractive indices, or degrees to which they can bend light.

“If these two things are too similar, whatever light source flows through the fiber will just fade away,” Yuk explains. “In optical fibers, people want to have a much higher refractive index in the core, versus cladding, so that when light goes through the core, it bounces off the interface of the cladding and stays within the core.”
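
Yuk’s point about refractive indices can be made concrete with the standard step-index fiber relations: light stays in the core only if it strikes the core-cladding boundary beyond the critical angle, and the index contrast also sets the numerical aperture. The short sketch below uses illustrative index values (a water-rich hydrogel core around 1.41 and a lower-index cladding around 1.34); these are assumptions for the example, not the values reported in the paper.

```python
import math

def fiber_guidance(n_core, n_clad):
    """Return the critical angle (degrees, measured from the normal) and the
    numerical aperture of a step-index fiber. Light hitting the core/cladding
    interface at angles beyond the critical angle is totally internally
    reflected and stays in the core."""
    critical_angle = math.degrees(math.asin(n_clad / n_core))
    numerical_aperture = math.sqrt(n_core**2 - n_clad**2)
    return critical_angle, numerical_aperture

# Illustrative indices only; the closer the two values, the weaker the guidance.
theta_c, na = fiber_guidance(n_core=1.41, n_clad=1.34)
print(f"critical angle ~{theta_c:.1f} degrees, numerical aperture ~{na:.2f}")
```

With nearly matched indices the numerical aperture collapses toward zero, which is the quantitative version of Yuk’s remark that the light “will just fade away.”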

Happily, they found that Zhao’s hydrogel material was highly transparent and possessed a refractive index that was ideal as a core material. But when they tried to coat the hydrogel with a cladding polymer solution, the two materials tended to peel apart when the fiber was stretched or bent.

To bond the two materials together, the researchers added conjugation chemicals to the cladding solution, which, when coated over the hydrogel core, generated chemical links between the outer surfaces of both materials.

“It clicks together the carboxyl groups in the cladding, and the amine groups in the core material, like molecular-level glue,” Yuk says.

Sensing strain

The researchers tested the optical fibers’ ability to propagate light by shining a laser through fibers of various lengths. Each fiber transmitted light without significant attenuation, or fading. They also found that fibers could be stretched over seven times their original length without breaking.

Now that they had developed a highly flexible and robust optical fiber, made from a hydrogel material that was also biocompatible, the researchers began to play with the fiber’s optical properties, to see if they could design a fiber that could sense when and where it was being stretched.

They first loaded a fiber with red, green, and blue organic dyes, placed at specific spots along the fiber’s length. Next, they shone a laser through the fiber and stretched, for instance, the red region. They measured the spectrum of light that made it all the way through the fiber, and noted the intensity of the red light. They reasoned that this intensity relates directly to the amount of light absorbed by the red dye, as a result of that region being stretched.

In other words, by measuring the amount of light at the far end of the fiber, the researchers can quantitatively determine where and by how much a fiber was stretched.

“When you stretch a certain portion of the fiber, the dimensions of that part of the fiber changes, along with the amount of light that region absorbs and scatters, so in this way, the fiber can serve as a sensor of strain,” Liu explains.
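
As a rough illustration of that readout, the sketch below infers which dyed region was stretched, and approximately by how much, from the drop in transmitted intensity at each dye’s wavelength. The calibration factor relating intensity loss to strain, and all of the numbers, are hypothetical; in practice such a calibration would be measured experimentally.

```python
def strain_from_spectrum(baseline, stretched, sensitivity):
    """Estimate strain in each dyed region of the fiber from transmitted
    intensities measured at that dye's wavelength.

    baseline    : dict of color -> intensity with the fiber relaxed
    stretched   : dict of color -> intensity with the fiber deformed
    sensitivity : dict of color -> fractional intensity drop per unit strain
    """
    strains = {}
    for color, i0 in baseline.items():
        fractional_drop = (i0 - stretched[color]) / i0
        strains[color] = fractional_drop / sensitivity[color]
    return strains

# Hypothetical readings: stretching the red-dyed region absorbs more red light,
# so only the red channel shows a large intensity drop.
baseline    = {"red": 1.00, "green": 1.00, "blue": 1.00}
stretched   = {"red": 0.70, "green": 0.98, "blue": 0.99}
sensitivity = {"red": 0.60, "green": 0.60, "blue": 0.60}

print(strain_from_spectrum(baseline, stretched, sensitivity))
```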

“This is like a multistrain sensor through a single fiber,” Yuk adds. “So it can be an implantable or wearable strain gauge.”

The researchers imagine that such stretchable, strain-sensing optical fibers could be implanted or fitted along the length of a patient’s arm or leg, to monitor for signs of improving mobility.

Zhao envisions the fibers may also serve as sensors, lighting up in response to signs of disease.

“We may be able to use optical fibers for long-term diagnostics, to optically monitor tumors or inflammation,” he says. “The applications can be impactful.”

Here’s a link to and a citation for the paper,

Highly Stretchable, Strain Sensing Hydrogel Optical Fibers by Jingjing Guo, Xinyue Liu, Nan Jiang, Ali K. Yetisen, Hyunwoo Yuk, Changxi Yang, Ali Khademhosseini, Xuanhe Zhao, and Seok-Hyun Yun. Advanced Materials, DOI: 10.1002/adma.201603160. Version of Record online: 7 Oct. 2016

© 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.

The sense of touch via artificial skin

Scientists have been working for years to allow artificial skin to transmit what the brain would recognize as the sense of touch. For anyone who has lost a limb and gotten a prosthetic replacement, the loss of touch is reputedly one of the more difficult losses to accept. The sense of touch is also vital in robotics if the field is to expand to include activities reliant on the sense of touch, e.g., how much pressure do you use to grasp a cup; how much strength do you apply when moving an object from one place to another?

For anyone interested in the ‘electronic skin and pursuit of touch’ story, I have a Nov. 15, 2013 posting which highlights the evolution of the research into e-skin and what was then some of the latest work.

This posting is a 2015 update of sorts featuring the latest e-skin research from Stanford University and Xerox PARC. (Dexter Johnson in an Oct. 15, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] site) provides a good research summary.) For anyone with an appetite for more, there’s this from an Oct. 15, 2015 American Association for the Advancement of Science (AAAS) news release on EurekAlert,

Using flexible organic circuits and specialized pressure sensors, researchers have created an artificial “skin” that can sense the force of static objects. Furthermore, they were able to transfer these sensory signals to the brain cells of mice in vitro using optogenetics. For the many people around the world living with prosthetics, such a system could one day allow them to feel sensation in their artificial limbs. To create the artificial skin, Benjamin Tee et al. developed a specialized circuit out of flexible, organic materials. It translates static pressure into digital signals that depend on how much mechanical force is applied. A particular challenge was creating sensors that can “feel” the same range of pressure that humans can. Thus, on the sensors, the team used carbon nanotubes molded into pyramidal microstructures, which are particularly effective at tunneling the signals from the electric field of nearby objects to the receiving electrode in a way that maximizes sensitivity. Transferring the digital signal from the artificial skin system to the cortical neurons of mice proved to be another challenge, since conventional light-sensitive proteins used in optogenetics do not stimulate neural spikes for sufficient durations for these digital signals to be sensed. Tee et al. therefore engineered new optogenetic proteins able to accommodate longer intervals of stimulation. Applying these newly engineered optogenetic proteins to fast-spiking interneurons of the somatosensory cortex of mice in vitro sufficiently prolonged the stimulation interval, allowing the neurons to fire in accordance with the digital stimulation pulse. These results indicate that the system may be compatible with other fast-spiking neurons, including peripheral nerves.

And, there’s an Oct. 15, 2015 Stanford University news release on EurekAlert describing this work from another perspective,

The heart of the technique is a two-ply plastic construct: the top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells. The top layer in the new work featured a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake.

Five years ago, Bao’s [Zhenan Bao, a professor of chemical engineering at Stanford] team members first described how to use plastics and rubbers as pressure sensors by measuring the natural springiness of their molecular structures. They then increased this natural pressure sensitivity by indenting a waffle pattern into the thin plastic, which further compresses the plastic’s molecular springs.

To exploit this pressure-sensing capability electronically, the team scattered billions of carbon nanotubes through the waffled plastic. Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.

This allowed the plastic sensor to mimic human skin, which transmits pressure information as short pulses of electricity, similar to Morse code, to the brain. Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor, and those varied impulses are sent as short pulses to the sensing mechanism. Remove pressure, and the flow of pulses relaxes, indicating light touch. Remove all pressure and the pulses cease entirely.
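
The pressure-to-pulses idea can be summarized as a simple monotonic mapping: more pressure squeezes the nanotubes together, more current flows, and the circuit emits pulses at a higher rate. The toy model below captures only that qualitative behavior; the pressure range, maximum rate, and linear shape are invented for illustration and are not Bao’s circuit.

```python
def pulse_rate(pressure_kpa, max_pressure_kpa=100.0, max_rate_hz=200.0):
    """Toy 'digital mechanoreceptor': map applied pressure to an output pulse
    rate. Zero pressure produces no pulses; harder presses produce more
    frequent pulses, saturating at max_rate_hz."""
    if pressure_kpa <= 0:
        return 0.0
    return min(pressure_kpa / max_pressure_kpa, 1.0) * max_rate_hz

# From a light finger tap to a firm handshake (pressures are illustrative).
for p in (0, 5, 20, 50, 100):
    print(f"{p:3d} kPa -> {pulse_rate(p):6.1f} pulses/s")
```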

The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.

Importing the signal

Bao’s team has been developing flexible electronics that can bend without breaking. For this project, team members worked with researchers from PARC, a Xerox company, which has a technology that uses an inkjet printer to deposit flexible circuits onto plastic. Covering a large surface is important to making artificial skin practical, and the PARC collaboration offered that prospect.

Finally the team had to prove that the electronic signal could be recognized by a biological neuron. It did this by adapting a technique developed by Karl Deisseroth, a fellow professor of bioengineering at Stanford who pioneered a field that combines genetics and optics, called optogenetics. Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch cells, or the processes being carried on inside them, on and off.

For this experiment the team members engineered a line of neurons to simulate a portion of the human nervous system. They translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.

Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices. Bao’s team has already worked with Bianxiao Cui, an associate professor of chemistry at Stanford, to show that direct stimulation of neurons with electrical pulses is possible.

Bao’s team envisions developing different sensors to replicate, for instance, the ability to distinguish corduroy versus silk, or a cold glass of water from a hot cup of coffee. This will take time. There are six types of biological sensing mechanisms in the human hand, and the experiment described in Science reports success in just one of them.

But the current two-ply approach means the team can add sensations as it develops new mechanisms. And the inkjet printing fabrication process suggests how a network of sensors could be deposited over a flexible layer and folded over a prosthetic hand.

“We have a lot of work to take this from experimental to practical applications,” Bao said. “But after spending many years in this work, I now see a clear path where we can take our artificial skin.”

Here’s a link to and a citation for the paper,

A skin-inspired organic digital mechanoreceptor by Benjamin C.-K. Tee, Alex Chortos, Andre Berndt, Amanda Kim Nguyen, Ariane Tom, Allister McGuire, Ziliang Carter Lin, Kevin Tien, Won-Gyu Bae, Huiliang Wang, Ping Mei, Ho-Hsiu Chou, Bianxiao Cui, Karl Deisseroth, Tse Nga Ng, & Zhenan Bao. Science, 16 October 2015, Vol. 350, no. 6258, pp. 313-316. DOI: 10.1126/science.aaa9306

This paper is behind a paywall.

‘Touching’ infrared light, if you’re a rat, followed by announcement of US FDA approval of first commercial artificial retina (bionic eye)

Researcher Miguel Nicolelis and his colleagues at Duke University have implanted a neuroprosthetic device in the touch-processing region of rats’ brains that allows the animals to sense infrared light. From the Feb. 12, 2013 news release on EurekAlert,

Researchers have given rats the ability to “touch” infrared light, normally invisible to them, by fitting them with an infrared detector wired to microscopic electrodes implanted in the part of the mammalian brain that processes tactile information. The achievement represents the first time a brain-machine interface has augmented a sense in adult animals, said Duke University neurobiologist Miguel Nicolelis, who led the research team.

The experiment also demonstrated for the first time that a novel sensory input could be processed by a cortical region specialized in another sense without “hijacking” the function of this brain area, said Nicolelis. This discovery suggests, for example, that a person whose visual cortex was damaged could regain sight through a neuroprosthesis implanted in another cortical region, he said.

Although the initial experiments tested only whether rats could detect infrared light, there seems no reason that these animals in the future could not be given full-fledged infrared vision, said Nicolelis. For that matter, cortical neuroprostheses could be developed to give animals or humans the ability to see in any region of the electromagnetic spectrum, or even magnetic fields. “We could create devices sensitive to any physical energy,” he said. “It could be magnetic fields, radio waves, or ultrasound. We chose infrared initially because it didn’t interfere with our electrophysiological recordings.”

Interestingly, the research was supported by the US National Institute of Mental Health (as per the news release).

The researchers have more to say about what they’re doing,

“The philosophy of the field of brain-machine interfaces has until now been to attempt to restore a motor function lost to lesion or damage of the central nervous system,” said Thomson [Eric Thomson], first author of the study. “This is the first paper in which a neuroprosthetic device was used to augment function—literally enabling a normal animal to acquire a sixth sense.”

Here’s how they conducted the research,

The mammalian retina is blind to infrared light, and mammals cannot detect any heat generated by the weak infrared light used in the studies. In their experiments, the researchers used a test chamber that contained three light sources that could be switched on randomly. Using visible LED lights, they first taught each rat to choose the active light source by poking its nose into an attached port to receive a reward of a sip of water.

After training the rats, the researchers implanted in their brains an array of stimulating microelectrodes, each roughly a tenth the diameter of a human hair. The microelectrodes were implanted in the cortical region that processes touch information from the animals’ facial whiskers.

Attached to the microelectrodes was an infrared detector affixed to the animals’ foreheads. The system was programmed so that orientation toward an infrared light would trigger an electrical signal to the brain. The signal pulses increased in frequency with the intensity and proximity of the light.
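
In code, that programming amounts to a mapping from the infrared detector’s reading to a stimulation pulse rate that grows with both intensity and proximity. The sketch below is only a guess at the shape of such a mapping; the maximum rate, the normalization, and the distance term are assumptions made for illustration, not the parameters used by the Nicolelis lab.

```python
def stimulation_rate_hz(ir_intensity, distance_m, max_rate_hz=400.0):
    """Toy mapping from an infrared detector reading to a microstimulation
    pulse rate: brighter and closer sources yield higher rates.

    ir_intensity : detector reading normalized to the range 0..1
    distance_m   : estimated distance to the infrared source, in meters
    """
    if ir_intensity <= 0:
        return 0.0
    proximity = 1.0 / (1.0 + distance_m)   # closer sources count for more
    return max_rate_hz * min(ir_intensity, 1.0) * proximity

for intensity, dist in [(0.2, 0.5), (0.8, 0.5), (0.8, 0.1)]:
    rate = stimulation_rate_hz(intensity, dist)
    print(f"intensity {intensity:.1f} at {dist:.1f} m -> {rate:.0f} pulses/s")
```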

The researchers returned the animals to the test chamber, gradually replacing the visible lights with infrared lights. At first in infrared trials, when a light was switched on the animals would tend to poke randomly at the reward ports and scratch at their faces, said Nicolelis. This indicated that they were initially interpreting the brain signals as touch. However, over about a month, the animals learned to associate the brain signal with the infrared source. They began to actively “forage” for the signal, sweeping their heads back and forth to guide themselves to the active light source. Ultimately, they achieved a near-perfect score in tracking and identifying the correct location of the infrared light source.

To ensure that the animals were really using the infrared detector and not their eyes to sense the infrared light, the researchers conducted trials in which the light switched on, but the detector sent no signal to the brain. In these trials, the rats did not react to the infrared light.

Their finding could have an impact on notions of mammalian brain plasticity,

A key finding, said Nicolelis, was that enlisting the touch cortex for light detection did not reduce its ability to process touch signals. “When we recorded signals from the touch cortex of these animals, we found that although the cells had begun responding to infrared light, they continued to respond to whisker touch. It was almost like the cortex was dividing itself evenly so that the neurons could process both types of information.”

This finding of brain plasticity is in contrast with the “optogenetic” approach to brain stimulation, which holds that a particular neuronal cell type should be stimulated to generate a desired neurological function. Rather, said Nicolelis, the experiments demonstrate that a broad electrical stimulation, which recruits many distinct cell types, can drive a cortical region to adapt to a new source of sensory input.

All of this work is part of Nicolelis’ larger project ‘Walk Again’ which is mentioned in my March 16, 2012 posting and includes a reference to some ethical issues raised by the work. Briefly, Nicolelis and an international team of collaborators are developing a brain-machine interface that will enable full mobility for people who are severely paralyzed. From the news release,

The Walk Again Project has recently received a $20 million grant from FINEP, a Brazilian research funding agency, to allow the development of the first brain-controlled whole-body exoskeleton aimed at restoring mobility in severely paralyzed patients. A first demonstration of this technology is expected to happen in the opening game of the 2014 Soccer World Cup in Brazil.

Expanding sensory abilities could also enable a new type of feedback loop to improve the speed and accuracy of such exoskeletons, said Nicolelis. For example, while researchers are now seeking to use tactile feedback to allow patients to feel the movements produced by such “robotic vests,” the feedback could also be in the form of a radio signal or infrared light that would give the person information on the exoskeleton limb’s position and encounter with objects.

There’s more information including videos about the work with infrared light and rats at the Nicolelis Lab website.  Here’s a citation for and link to the team’s research paper,

Perceiving invisible light through a somatosensory cortical prosthesis by Eric E. Thomson, Rafael Carra, & Miguel A.L. Nicolelis. Nature Communications, published 12 Feb. 2013. DOI: 10.1038/ncomms2497

Meanwhile, the US Food and Drug Administration (FDA) has approved the first commercial artificial retina. From the Feb. 14, 2013 news release,

The U.S. Food and Drug Administration (FDA) granted market approval to an artificial retina technology today, the first bionic eye to be approved for patients in the United States. The prosthetic technology was developed in part with support from the National Science Foundation (NSF).

The device, called the Argus® II Retinal Prosthesis System, transmits images from a small, eye-glass-mounted camera wirelessly to a microelectrode array implanted on a patient’s damaged retina. The array sends electrical signals via the optic nerve, and the brain interprets a visual image.

The FDA approval currently applies to individuals who have lost sight as a result of severe to profound retinitis pigmentosa (RP), an ailment that affects one in every 4,000 Americans. The implant allows some individuals with RP, who are completely blind, to locate objects, detect movement, improve orientation and mobility skills and discern shapes such as large letters.

The Argus II is manufactured by, and will be distributed by, Second Sight Medical Products of Sylmar, Calif., which is part of the team of scientists and engineers from the university, federal and private sectors who spent nearly two decades developing the system with public and private investment.

Scientists are often compelled to do research in an area inspired by family,

“Seeing my grandmother go blind motivated me to pursue ophthalmology and biomedical engineering to develop a treatment for patients for whom there was no foreseeable cure,” says the technology’s co-developer, Mark Humayun, associate director of research at the Doheny Eye Institute at the University of Southern California and director of the NSF Engineering Research Center for Biomimetic MicroElectronic Systems (BMES). …”

There’s also been considerable government investment,

The effort by Humayun and his colleagues has received early and continuing support from NSF, the National Institutes of Health and the Department of Energy, with grants totaling more than $100 million. The private sector’s support nearly matched that of the federal government.

“The retinal implant exemplifies how NSF grants for high-risk, fundamental research can directly result in ground-breaking technologies decades later,” said Acting NSF Assistant Director for Engineering Kesh Narayanan. “In collaboration with the Second Sight team and the courageous patients who volunteered to have experimental surgery to implant the first-generation devices, the researchers of NSF’s Biomimetic MicroElectronic Systems Engineering Research Center are developing technologies that may ultimately have as profound an impact on blindness as the cochlear implant has had for hearing loss.”

Leaving aside controversies about cochlear implants and the possibility of such controversies with artificial retinas (bionic eyes), it’s interesting to note that this device is dependent on an external camera,

The researchers’ efforts have bridged cellular biology–necessary for understanding how to stimulate the retinal ganglion cells without permanent damage–with microelectronics, which led to the miniaturized, low-power integrated chip for performing signal conversion, conditioning and stimulation functions. The hardware was paired with software processing and tuning algorithms that convert visual imagery to stimulation signals, and the entire system had to be incorporated within hermetically sealed packaging that allowed the electronics to operate in the vitreous fluid of the eye indefinitely. Finally, the research team had to develop new surgical techniques in order to integrate the device with the body, ensuring accurate placement of the stimulation electrodes on the retina.

“The artificial retina is a great engineering challenge under the interdisciplinary constraint of biology, enabling technology, regulatory compliance, as well as sophisticated design science,” adds Liu [Wentai Liu of the University of California, Los Angeles]. “The artificial retina provides an interface between biotic and abiotic systems. Its unique design characteristics rely on system-level optimization, rather than the more common practice of component optimization, to achieve miniaturization and integration. Using the most advanced semiconductor technology, the engine for the artificial retina is a ‘system on a chip’ of mixed voltages and mixed analog-digital design, which provides self-contained power and data management and other functionality. This design for the artificial retina facilitates both surgical procedures and regulatory compliance.”

The Argus II design consists of an external video camera system matched to the implanted retinal stimulator, which contains a microelectrode array that spans 20 degrees of visual field. [emphasis mine] …

“The external camera system, built into a pair of glasses, streams video to a belt-worn computer, which converts the video into stimulus commands for the implant,” says Weiland [USC researcher Jim Weiland]. “The belt-worn computer encodes the commands into a wireless signal that is transmitted to the implant, which has the necessary electronics to receive and decode both wireless power and data. Based on those data, the implant stimulates the retina with small electrical pulses. The electronics are hermetically packaged and the electrical stimulus is delivered to the retina via a microelectrode array.”
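
Weiland’s description suggests a straightforward image-to-stimulus pipeline: downsample each camera frame onto the electrode grid and turn the brightness of each patch into a stimulation command. The sketch below is a loose illustration of that idea only; the 6x10 grid size, the grayscale input, and the linear brightness-to-current scaling are assumptions made for the example, not Second Sight’s actual processing.

```python
import numpy as np

def frame_to_stimulus(frame, grid_shape=(6, 10), max_current_ua=200.0):
    """Convert a grayscale camera frame (2-D array, pixel values 0..255) into
    a per-electrode current command by averaging the pixels over each
    electrode's patch of the image and scaling brightness to current."""
    rows, cols = grid_shape
    h, w = frame.shape
    commands = np.zeros(grid_shape)
    for r in range(rows):
        for c in range(cols):
            patch = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            commands[r, c] = patch.mean() / 255.0 * max_current_ua
    return commands

# Hypothetical frame: a bright vertical bar on a dark background, which maps
# to a column of strongly driven electrodes.
frame = np.zeros((120, 200))
frame[:, 90:110] = 255.0
print(np.round(frame_to_stimulus(frame), 1))
```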

You can see some footage of people using artificial retinas in the context of Grégoire Cosendai’s TEDx Vienna presentation. As I noted in my Aug. 18, 2011 posting where this talk and developments in human enhancement are mentioned, the relevant material can be seen at approximately 13 mins., 25 secs. in Cosendai’s talk.

Second Sight Medical Products can be found here.