
Neuristors and brainlike computing

As you might suspect, a neuristor is based on a memristor. (For a description of a memristor there’s this Wikipedia entry, and you can search this blog with the tags ‘memristor’ and ‘neuromorphic engineering’ for more here.)

Being new to neuristors, I needed a little more information before reading the latest news and found this Dec. 24, 2012 article by John Timmer for Ars Technica (Note: Links have been removed),

Computing hardware is composed of a series of binary switches; they’re either on or off. The other piece of computational hardware we’re familiar with, the brain, doesn’t work anything like that. Rather than being on or off, individual neurons exhibit brief spikes of activity, and encode information in the pattern and timing of these spikes. The differences between the two have made it difficult to model neurons using computer hardware. In fact, the recent, successful generation of a flexible neural system required that each neuron be modeled separately in software in order to get the sort of spiking behavior real neurons display.

But researchers may have figured out a way to create a chip that spikes. The people at HP labs who have been working on memristors have figured out a combination of memristors and capacitors that can create a spiking output pattern. Although these spikes appear to be more regular than the ones produced by actual neurons, it might be possible to create versions that are a bit more variable than this one. And, more significantly, it should be possible to fabricate them in large numbers, possibly right on a silicon chip.

The key to making the devices is something called a Mott insulator. These are materials that would normally be able to conduct electricity, but are unable to because of interactions among their electrons. Critically, these interactions weaken with elevated temperatures. So, by heating a Mott insulator, it’s possible to turn it into a conductor. In the case of the material used here, NbO2, the heat is supplied by resistance itself. By applying a voltage to the NbO2 in the device, it becomes a resistor, heats up, and, when it reaches a critical temperature, turns into a conductor, allowing current to flow through. But, given the chance to cool off, the device will return to its resistive state. Formally, this behavior is described as a memristor.
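To get a feel for the thermal feedback loop described above, here is a rough, illustrative sketch in Python (my own toy model, not the researchers’ device model): a film switches from an insulating to a metallic resistance when Joule heating pushes it past a critical temperature, and relaxes back once the drive voltage is removed. Every parameter value is invented for illustration.

# Illustrative sketch (not the authors' model) of Joule-heating feedback in an
# NbO2-like Mott film: self-heating drives it past its insulator-to-metal
# transition, and it cools back to the insulating state when the drive stops.
T_AMBIENT = 300.0    # K
T_CRIT = 1000.0      # K, illustrative transition temperature
R_INS = 1.0e4        # ohms, cool insulating state
R_MET = 1.0e2        # ohms, hot metallic state
R_LOAD = 1.0e3       # ohms, series resistor limiting the current
C_TH = 1.0e-9        # J/K, thermal capacitance (made up)
G_TH = 1.0e-6        # W/K, thermal conductance to the surroundings (made up)

def step(T, v_supply, dt=1e-6):
    """Advance the film temperature by one time step; return (T, current)."""
    r_dev = R_MET if T >= T_CRIT else R_INS       # state depends on temperature
    i = v_supply / (R_LOAD + r_dev)               # series circuit current
    p_joule = i * i * r_dev                       # resistive self-heating
    p_cool = G_TH * (T - T_AMBIENT)               # cooling toward ambient
    return T + dt * (p_joule - p_cool) / C_TH, i

T, dt = T_AMBIENT, 1e-6
for n in range(10000):                            # 5 ms of drive, then 5 ms off
    v = 5.0 if n < 5000 else 0.0
    T, i = step(T, v, dt)
    if n in (0, 4999, 9999):
        state = "metallic" if T >= T_CRIT else "insulating"
        print(f"t = {n*dt*1e3:.1f} ms  V = {v} V  T = {T:.0f} K  I = {i*1e3:.2f} mA  ({state})")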

To get the sort of spiking behavior seen in a neuron, the authors turned to a simplified model of neurons based on the proteins that allow them to transmit electrical signals. When a neuron fires, sodium channels open, allowing ions to rush into a nerve cell, and changing the relative charges inside and outside its membrane. In response to these changes, potassium channels then open, allowing different ions out, and restoring the charge balance. That shuts the whole thing down, and allows various pumps to start restoring the initial ion balance.
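That channel description is essentially the classic Hodgkin-Huxley picture of a neuron. For readers who want to see how a fast (sodium-like) variable and a slow (potassium-like) recovery variable combine to produce spikes, here is a FitzHugh-Nagumo model, a standard textbook reduction; it is an analogy only, not the memristor circuit model from the Nature Materials paper.

# FitzHugh-Nagumo sketch: a fast activation variable and a slow recovery
# variable produce repetitive spiking under a constant input current.
import numpy as np

a, b, eps = 0.7, 0.8, 0.08     # standard FitzHugh-Nagumo parameters
I_ext = 0.5                    # constant input current, enough to spike

dt, t_end = 0.01, 200.0
n = int(t_end / dt)
v = np.zeros(n)                # fast variable (membrane-voltage-like)
w = np.zeros(n)                # slow recovery variable (potassium-like)
v[0], w[0] = -1.0, 1.0

for k in range(n - 1):
    dv = v[k] - v[k]**3 / 3.0 - w[k] + I_ext   # fast activation
    dw = eps * (v[k] + a - b * w[k])           # slow recovery shuts the spike down
    v[k + 1] = v[k] + dt * dv
    w[k + 1] = w[k] + dt * dw

# Count upward threshold crossings as "spikes"
spikes = np.sum((v[:-1] < 1.0) & (v[1:] >= 1.0))
print(f"spikes in {t_end:.0f} time units: {spikes}")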

Here’s a link to and a citation for the research paper described in Timmer’s article,

A scalable neuristor built with Mott memristors by Matthew D. Pickett, Gilberto Medeiros-Ribeiro, & R. Stanley Williams. Nature Materials 12, 114–117 (2013) doi:10.1038/nmat3510 Published online 16 December 2012

This paper is behind a paywall.

A July 28, 2017 news item on Nanowerk provides an update on neuristors,

A future android brain like that of Star Trek’s Commander Data might contain neuristors, multi-circuit components that emulate the firings of human neurons.

Neuristors already exist today in labs, in small quantities, and to fuel the quest to boost neuristors’ power and numbers for practical use in brain-like computing, the U.S. Department of Defense has awarded a $7.1 million grant to a research team led by the Georgia Institute of Technology. The researchers will mainly work on new metal oxide materials that buzz electronically at the nanoscale to emulate the way human neural networks buzz with electric potential on a cellular level.

A July 28, 2017 Georgia Tech news release, which originated the news item, delves further into neuristors and the proposed work leading to an artificial retina that can learn (!). This was not where I was expecting things to go,

But let’s walk expectations back from the distant sci-fi future into the scientific present: The research team is developing its neuristor materials to build an intelligent light sensor, and not some artificial version of the human brain, which would require hundreds of trillions of circuits.

“We’re not going to reach circuit complexities of that magnitude, not even a tenth,” said Alan Doolittle, a professor at Georgia Tech’s School of Electrical and Computer Engineering. “Also, currently science doesn’t really know yet very well how the human brain works, so we can’t duplicate it.”

Intelligent retina

But an artificial retina that can learn autonomously appears well within reach of the research team from Georgia Tech and Binghamton University. Despite the term “retina,” the development is not intended as a medical implant, but it could be used in advanced image recognition cameras for national defense and police work.

At the same time, it would significantly advance brain-mimicking, or neuromorphic, computing: the research field that takes its cues from what science already does know about how the brain computes to develop exponentially more powerful computing.

The retina would be comprised of an array of ultra-compact circuits called neuristors (a word combining “neuron” and “transistor”) that sense light, compute an image out of it and store the image. All three of the functions would occur simultaneously and nearly instantaneously.

“The same device senses, computes and stores the image,” Doolittle said. “The device is the sensor, and it’s the processor, and it’s the memory all at the same time.” A neuristor itself is comprised in part of devices called memristors inspired by the way human neurons work.

Brain vs. PC

That cuts out loads of processing and memory lag time that are inherent in traditional computing.

Take the device you’re reading this article on: Its microprocessor has to tap a separate memory component to get data, then do some processing, tap memory again for more data, process some more, etc. “That back-and-forth from memory to microprocessor has created a bottleneck,” Doolittle said.

A neuristor array breaks the bottleneck by emulating the extreme flexibility of biological nervous systems: When a brain computes, it uses a broad set of neural pathways that flash with enormous data. Then, later, to compute the same thing again, it will use quite different neural paths.

Traditional computer pathways, by contrast, are hardwired. For example, look at a present-day processor and you’ll see lines etched into it. Those are pathways that computational signals are limited to.

The new memristor materials at the heart of the neuristor are not etched, and signals flow through the surface very freely, more like they do through the brain, exponentially increasing the number of possible pathways computation can take. That helps the new intelligent retina compute powerfully and swiftly.
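One standard way to see how memristive arrays cut out the memory-to-processor round trip (this illustration is mine, not from the news release) is the crossbar vector-matrix multiply: the stored conductances are the memory, and applying input voltages to the rows does the arithmetic in place, with Ohm’s law at each crossing and Kirchhoff’s current law summing down each column. The numbers below are arbitrary.

# Ideal memristor crossbar: the conductance matrix is the stored "memory",
# and the read-out currents are a vector-matrix product computed in place.
import numpy as np

conductances = np.array([            # siemens; one memristor per row/column crossing
    [1.0e-3, 2.0e-3, 0.5e-3],
    [0.2e-3, 1.0e-3, 1.5e-3],
])
input_voltages = np.array([0.3, 0.7])   # volts applied to the two rows

# Each column current is sum_i V_i * G_ij: a multiply-accumulate done "in memory".
column_currents = input_voltages @ conductances
print(column_currents)                  # amps flowing out of each column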

Terrorists, missing children

The retina’s memory could also store thousands of photos, allowing it to immediately match up what it sees with the saved images. The retina could pinpoint known terror suspects in a crowd, find missing children, or identify enemy aircraft virtually instantaneously, without having to trawl databases to correctly identify what is in the images.

Even if you take away the optics, the new neuristor arrays still advance artificial intelligence. Instead of light, a surface of neuristors could absorb massive data streams at once, compute them, store them, and compare them to patterns of other data, immediately. It could even autonomously learn to extrapolate further information, like calculating the third dimension out of data from two dimensions.

“It will work with anything that has a repetitive pattern like radar signatures, for example,” Doolittle said. “Right now, that’s too challenging to compute, because radar information is flying out at such a high data rate that no computer can even think about keeping up.”

Smart materials

The research project’s title acronym CEREBRAL may hint at distant dreams of an artificial brain, but what it stands for spells out the present goal in neuromorphic computing: Cross-disciplinary Electronic-ionic Research Enabling Biologically Realistic Autonomous Learning.

The intelligent retina’s neuristors are based on novel metal oxide nanotechnology materials, unique to Georgia Tech. They allow computing signals to flow flexibly across pathways that are electronic, which is customary in computing, and at the same time make use of ion motion, which is more commonly known from the way batteries and biological systems work.

The new materials have already been created, and they work, but the researchers don’t yet fully understand why.

Much of the project is dedicated to examining quantum states in the materials and how those states help create useful electronic-ionic properties. Researchers will view them by bombarding the metal oxides with extremely bright x-ray photons at the recently constructed National Synchrotron Light Source II.

Grant sub-awardee Binghamton University is located close by, and Binghamton physicists will run experiments and hone them via theoretical modeling.

‘Sea of lithium’

The neuristors are created mainly by the way the metal oxide materials are grown in the lab, which has advantages over building neuristors in a more wired way.

This materials-growing approach is conducive to mass production. Also, though neuristors in general free signals to take multiple pathways, Georgia Tech’s neuristors do it much more flexibly thanks to chemical properties.

“We also have a sea of lithium, and it’s like an infinite reservoir of computational ionic fluid,” Doolittle said. The lithium niobite imitates the way ionic fluid bathes biological neurons and allows them to flash with electric potential while signaling. In a neuristor array, the lithium niobite helps computational signaling move in myriad directions.

“It’s not like the typical semiconductor material, where you etch a line, and only that line has the computational material,” Doolittle said.

Commander Data’s brain?

“Unlike any other previous neuristors, our neuristors will adapt themselves in their computational-electronic pulsing on the fly, which makes them more like a neurological system,” Doolittle said. “They mimic biology in that we have ion drift across the material to create the memristors (the memory part of neuristors).”

Brains are far superior to computers at most things, but not all. Brains recognize objects and do motor tasks much better. But computers are much better at arithmetic and data processing.

Neuristor arrays can meld both types of computing, making them biological and algorithmic at once, a bit like Commander Data’s brain.

The research is being funded through the U.S. Department of Defense’s Multidisciplinary University Research Initiatives (MURI) Program under grant number FOA: N00014-16-R-FO05. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of those agencies.

Fascinating, non?

Clinical trial for bionic eye (artificial retinal implant) shows encouraging results (safety and efficacy)

The Argus II artificial retina was first mentioned here in a Feb. 15, 2013 posting (scroll down about 50% of the way) when it received US Food and Drug Administration (FDA) commercial approval. In retrospect that seems puzzling since the results of a three-year clinical trial have just been reported in a June 23, 2015 news item on ScienceDaily (Note: There was one piece of information about the approval which didn’t make its way into the information disseminated in 2013),

The three-year clinical trial results of the retinal implant popularly known as the “bionic eye” have proven the long-term efficacy, safety and reliability of the device that restores vision in those blinded by a rare, degenerative eye disease. The findings show that the Argus II significantly improves visual function and quality of life for people blinded by retinitis pigmentosa. They are being published online in Ophthalmology, the journal of the American Academy of Ophthalmology.

A June 23, 2015 American Academy of Ophthalmology news release (also on EurekAlert), which originated the news item, describes the condition the Argus II is designed for and that crucial bit of FDA information,

Retinitis pigmentosa is an incurable disease that affects about 1 in 4,000 Americans and causes slow vision loss that eventually leads to blindness.[1] The Argus II system was designed to help provide patients who have lost their sight due to the disease with some useful vision. Through the device, patients with retinitis pigmentosa are able to see patterns of light that the brain learns to interpret as an image. The system uses a miniature video camera stored in the patient’s glasses to send visual information to a small computerized video processing unit, which can be stored in a pocket. This computer turns the image into electronic signals that are sent wirelessly to an electronic device implanted on the retina, the layer of light-sensing cells lining the back of the eye.
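To give a rough sense of what “turning the image into electronic signals” involves, here is a toy sketch of that processing chain: down-sample a grayscale camera frame to a coarse grid of electrode sites and map brightness to stimulation amplitude. The grid size, amplitude ceiling and linear mapping are my own illustrative choices, not Second Sight’s actual algorithms (the Argus II array reportedly has 60 electrodes).

# Toy pipeline: camera frame -> coarse electrode grid -> stimulation amplitudes.
import numpy as np

ROWS, COLS = 6, 10            # illustrative electrode grid
MAX_CURRENT_UA = 200.0        # illustrative per-electrode amplitude ceiling

def frame_to_stimulation(frame):
    """Down-sample a 2-D grayscale frame (values 0-255) to per-electrode amplitudes."""
    h, w = frame.shape
    bh, bw = h // ROWS, w // COLS
    # Average each block of pixels down to one electrode site.
    blocks = frame[:bh * ROWS, :bw * COLS].reshape(ROWS, bh, COLS, bw)
    brightness = blocks.mean(axis=(1, 3))            # (ROWS, COLS) mean brightness
    return (brightness / 255.0) * MAX_CURRENT_UA     # brightness -> stimulation amplitude

# Usage with a synthetic 240x320 frame containing a bright vertical bar.
frame = np.zeros((240, 320), dtype=float)
frame[:, 128:192] = 255.0
pattern = frame_to_stimulation(frame)
print(pattern.round(1))       # the bar shows up as bright columns on the grid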

The Argus II received Food and Drug Administration (FDA) approval as a Humanitarian Use Device (HUD) in 2013, which is an approval specifically for devices intended to benefit small populations and/or rare conditions. [emphasis mine]

I don’t recall seeing “Humanitarian Use Device (HUD)” in the 2013 materials which focused on the FDA’s commercial use approval. I gather from this experience that commercial use doesn’t necessarily mean they’ve finished with clinical trials and are ready to start selling the product. In any event, I will try to take a closer look at the actual approvals the next time, assuming I can make sense of the language.

After all the talk about it, here’s what the device looks like,

Caption: Figure A, The implanted portions of the Argus II System. Figure B, The external components of the Argus II System. Images in real time are captured by a camera mounted on the glasses. The video processing unit down-samples and processes the image, converting it to stimulation patterns. Data and power are sent via radiofrequency link from the transmitter antenna on the glasses to the receiver antenna around the eye. A removable, rechargeable battery powers the system. Credit: Photo courtesy of Second Sight Medical Products, Inc.

The news release offers more details about the recently completed clinical trial,

To further evaluate the safety, reliability and benefit of the device, a clinical trial of 30 people, aged 28 to 77, was conducted in the United States and Europe. All of the study participants had little or no light perception in both eyes. The researchers conducted visual function tests using both a computer screen and real-world conditions, including finding and touching a door and identifying and following a line on the ground. A Functional Low-vision Observer Rated Assessment (FLORA) was also performed by independent visual rehabilitation experts at the request of the FDA to assess the impact of the Argus II system on the subjects’ everyday lives, including extensive interviews and tasks performed around the home.

The visual function results indicated that up to 89 percent of the subjects performed significantly better with the device. The FLORA found that among the subjects, 80 percent received benefit from the system when considering both functional vision and patient-reported quality of life, and no subjects were affected negatively.

After one year, two-thirds of the subjects had not experienced device- or surgery-related serious adverse events. After three years, there were no device failures. Throughout the three years, 11 subjects experienced serious adverse events, most of which occurred soon after implantation and were successfully treated. One of these treatments, however, was to remove the device due to recurring erosion after the suture tab on the device became damaged.

“This study shows that the Argus II system is a viable treatment option for people profoundly blind due to retinitis pigmentosa – one that can make a meaningful difference in their lives and provides a benefit that can last over time,” said Allen C. Ho, M.D., lead author of the study and director of the clinical retina research unit at Wills Eye Hospital. “I look forward to future studies with this technology which may make possible expansion of the intended use of the device, including treatment for other diseases and eye injuries.”

Here’s a link to a PDF of and a citation for the paper,

Long-Term Results from an Epiretinal Prosthesis to Restore Sight to the Blind by Allen C. Ho, Mark S. Humayun, Jessy D. Dorn, Lyndon da Cruz, Gislin Dagnelie, James Handa, Pierre-Olivier Barale, José-Alain Sahel, Paulo E. Stanga, Farhad Hafezi, Avinoam B. Safran, Joel Salzmann, Arturo Santos, David Birch, Rand Spencer, Artur V. Cideciyan, Eugene de Juan, Jacque L. Duncan, Dean Eliott, Amani Fawzi, Lisa C. Olmos de Koo, Gary C. Brown, Julia A. Haller, Carl D. Regillo, Lucian V. Del Priore, Aries Arditi, Duane R. Geruschat, Robert J. Greenberg. Ophthalmology, June 2015 http://dx.doi.org/10.1016/j.ophtha.2015.04.032

This paper is open access.

Graphene and an artificial retina

A graphene-based artificial retina project has managed to intermingle the European Union’s two major FET (Future and Emerging Technologies) flagship projects, each funded at 1 billion euros to be disbursed over 10 years: the Graphene Flagship and the Human Brain Project. From an Aug. 7, 2014 Technische Universitaet Muenchen (TUM) news release (also on EurekAlert),

Because of its unusual properties, graphene holds great potential for applications, especially in the field of medical technology. A team of researchers led by Dr. Jose A. Garrido at the Walter Schottky Institut of the TUM is taking advantage of these properties. In collaboration with partners from the Institut de la Vision of the Université Pierre et Marie Curie in Paris and the French company Pixium Vision, the physicists are developing key components of an artificial retina made of graphene.

Retina implants can serve as optical prostheses for blind people whose optic nerves are still intact. The implants convert incident light into electrical impulses that are transmitted to the brain via the optic nerve. There, the information is transformed into images. Although various approaches for implants exist today, the devices are often rejected by the body and the signals transmitted to the brain are generally not optimal.

Already funded by the Human Brain Project as part of the Neurobotics effort, Garrido and his colleagues will now also receive funding from the Graphene Flagship. As of July 2014, the Graphene Flagship has added 86 new partners, including TUM, according to the news release.

Here’s an image of an ‘invisible’ graphene sensor (a precursor to developing an artificial retina),

Graphene electronics can be prepared on flexible substrates. Only the gold metal leads are visible in the transparent graphene sensor. (Photo: Natalia Hutanu / TUM)

Artificial retinas were first featured on this blog in an Aug. 18, 2011 posting about the video game Deus Ex: Human Revolution, which features a human character with artificial sight. The post includes links to a video of a scientist describing an artificial retina trial with 30 people and an Israeli start-up company, ‘Nano Retina’, along with information about ‘Eyeborg’, a Canadian filmmaker who, after losing an eye in an accident, had a camera implanted in the empty eye socket.

More recently, a Feb. 15, 2013 posting featured news about the US Food and Drug Administration’s decision to allow sale of the first commercial artificial retinas in the US in the context of news about a neuroprosthetic implant in a rat which allowed it to see in the infrared range, normally an impossible feat.

‘Touching’ infrared light, if you’re a rat, followed by announcement of US FDA approval of first commercial artificial retina (bionic eye)

Researcher Miguel Nicolelis and his colleagues at Duke University have implanted a neuroprosthetic device in the touch-processing portion of rats’ brains that allows the animals to see infrared light. From the Feb. 12, 2013 news release on EurekAlert,

Researchers have given rats the ability to “touch” infrared light, normally invisible to them, by fitting them with an infrared detector wired to microscopic electrodes implanted in the part of the mammalian brain that processes tactile information. The achievement represents the first time a brain-machine interface has augmented a sense in adult animals, said Duke University neurobiologist Miguel Nicolelis, who led the research team.

The experiment also demonstrated for the first time that a novel sensory input could be processed by a cortical region specialized in another sense without “hijacking” the function of this brain area, said Nicolelis. This discovery suggests, for example, that a person whose visual cortex was damaged could regain sight through a neuroprosthesis implanted in another cortical region, he said.

Although the initial experiments tested only whether rats could detect infrared light, there seems no reason that these animals in the future could not be given full-fledged infrared vision, said Nicolelis. For that matter, cortical neuroprostheses could be developed to give animals or humans the ability to see in any region of the electromagnetic spectrum, or even magnetic fields. “We could create devices sensitive to any physical energy,” he said. “It could be magnetic fields, radio waves, or ultrasound. We chose infrared initially because it didn’t interfere with our electrophysiological recordings.”

Interestingly, the research was supported by the US National Institute of Mental Health (as per the news release).

The researchers have more to say about what they’re doing,

“The philosophy of the field of brain-machine interfaces has until now been to attempt to restore a motor function lost to lesion or damage of the central nervous system,” said Thomson [Eric Thomson], first author of the study. “This is the first paper in which a neuroprosthetic device was used to augment function—literally enabling a normal animal to acquire a sixth sense.”

Here’s how they conducted the research,

The mammalian retina is blind to infrared light, and mammals cannot detect any heat generated by the weak infrared light used in the studies. In their experiments, the researchers used a test chamber that contained three light sources that could be switched on randomly. Using visible LED lights, they first taught each rat to choose the active light source by poking its nose into an attached port to receive a reward of a sip of water.

After training the rats, the researchers implanted in their brains an array of stimulating microelectrodes, each roughly a tenth the diameter of a human hair. The microelectrodes were implanted in the cortical region that processes touch information from the animals’ facial whiskers.

Attached to the microelectrodes was an infrared detector affixed to the animals’ foreheads. The system was programmed so that orientation toward an infrared light would trigger an electrical signal to the brain. The signal pulses increased in frequency with the intensity and proximity of the light.
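In other words, the infrared reading is rate-coded into a pulse train. A minimal sketch of that mapping (the frequency range and linear relation are my assumptions, not the Nicolelis lab’s actual parameters) might look like this:

# Illustrative rate coding: stronger/closer infrared -> higher pulse frequency.
MIN_RATE_HZ = 0.0       # no pulses when no infrared is detected
MAX_RATE_HZ = 400.0     # illustrative ceiling for the pulse train

def detector_to_pulse_rate(ir_reading):
    """Map a normalized detector reading (0.0-1.0) to a stimulation pulse rate."""
    ir_reading = max(0.0, min(1.0, ir_reading))          # clamp to valid range
    return MIN_RATE_HZ + ir_reading * (MAX_RATE_HZ - MIN_RATE_HZ)

def pulse_times(ir_reading, duration_s=0.1):
    """Return the times (s) of evenly spaced stimulation pulses over duration_s."""
    rate = detector_to_pulse_rate(ir_reading)
    if rate == 0.0:
        return []
    period = 1.0 / rate
    return [k * period for k in range(int(duration_s * rate))]

# A faint, a moderate and a strong/near source give increasingly dense pulse trains.
for reading in (0.1, 0.5, 1.0):
    print(reading, detector_to_pulse_rate(reading), "Hz,",
          len(pulse_times(reading)), "pulses in 100 ms")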

The researchers returned the animals to the test chamber, gradually replacing the visible lights with infrared lights. At first in infrared trials, when a light was switched on the animals would tend to poke randomly at the reward ports and scratch at their faces, said Nicolelis. This indicated that they were initially interpreting the brain signals as touch. However, over about a month, the animals learned to associate the brain signal with the infrared source. They began to actively “forage” for the signal, sweeping their heads back and forth to guide themselves to the active light source. Ultimately, they achieved a near-perfect score in tracking and identifying the correct location of the infrared light source.

To ensure that the animals were really using the infrared detector and not their eyes to sense the infrared light, the researchers conducted trials in which the light switched on, but the detector sent no signal to the brain. In these trials, the rats did not react to the infrared light.

Their finding could have an impact on notions of mammalian brain plasticity,

A key finding, said Nicolelis, was that enlisting the touch cortex for light detection did not reduce its ability to process touch signals. “When we recorded signals from the touch cortex of these animals, we found that although the cells had begun responding to infrared light, they continued to respond to whisker touch. It was almost like the cortex was dividing itself evenly so that the neurons could process both types of information.”

This finding of brain plasticity is in contrast with the “optogenetic” approach to brain stimulation, which holds that a particular neuronal cell type should be stimulated to generate a desired neurological function. Rather, said Nicolelis, the experiments demonstrate that a broad electrical stimulation, which recruits many distinct cell types, can drive a cortical region to adapt to a new source of sensory input.

All of this work is part of Nicolelis’ larger project ‘Walk Again’ which is mentioned in my March 16, 2012 posting and includes a reference to some ethical issues raised by the work. Briefly, Nicolelis and an international team of collaborators are developing a brain-machine interface that will enable full mobility for people who are severely paralyzed. From the news release,

The Walk Again Project has recently received a $20 million grant from FINEP, a Brazilian research funding agency, to allow the development of the first brain-controlled whole body exoskeleton aimed at restoring mobility in severely paralyzed patients. A first demonstration of this technology is expected to happen in the opening game of the 2014 Soccer World Cup in Brazil.

Expanding sensory abilities could also enable a new type of feedback loop to improve the speed and accuracy of such exoskeletons, said Nicolelis. For example, while researchers are now seeking to use tactile feedback to allow patients to feel the movements produced by such “robotic vests,” the feedback could also be in the form of a radio signal or infrared light that would give the person information on the exoskeleton limb’s position and encounter with objects.

There’s more information, including videos, about the work with infrared light and rats at the Nicolelis Lab website. Here’s a citation for and link to the team’s research paper,

Perceiving invisible light through a somatosensory cortical prosthesis by Eric E. Thomson, Rafael Carra, & Miguel A.L. Nicolelis. Nature Communications Published 12 Feb 2013 DOI: 10.1038/ncomms2497

Meanwhile, the US Food and Drug Administration (FDA) has approved the first commercial artificial retina, from the Feb. 14, 2013 news release,

The U.S. Food and Drug Administration (FDA) granted market approval to an artificial retina technology today, the first bionic eye to be approved for patients in the United States. The prosthetic technology was developed in part with support from the National Science Foundation (NSF).

The device, called the Argus® II Retinal Prosthesis System, transmits images from a small, eye-glass-mounted camera wirelessly to a microelectrode array implanted on a patient’s damaged retina. The array sends electrical signals via the optic nerve, and the brain interprets a visual image.

The FDA approval currently applies to individuals who have lost sight as a result of severe to profound retinitis pigmentosa (RP), an ailment that affects one in every 4,000 Americans. The implant allows some individuals with RP, who are completely blind, to locate objects, detect movement, improve orientation and mobility skills and discern shapes such as large letters.

The Argus II is manufactured by, and will be distributed by, Second Sight Medical Products of Sylmar, Calif., which is part of the team of scientists and engineers from the university, federal and private sectors who spent nearly two decades developing the system with public and private investment.

Scientists are often compelled to do research in an area inspired by family,

“Seeing my grandmother go blind motivated me to pursue ophthalmology and biomedical engineering to develop a treatment for patients for whom there was no foreseeable cure,” says the technology’s co-developer, Mark Humayun, associate director of research at the Doheny Eye Institute at the University of Southern California and director of the NSF Engineering Research Center for Biomimetic MicroElectronic Systems (BMES). …”

There’s also been considerable government investment,

The effort by Humayun and his colleagues has received early and continuing support from NSF, the National Institutes of Health and the Department of Energy, with grants totaling more than $100 million. The private sector’s support nearly matched that of the federal government.

“The retinal implant exemplifies how NSF grants for high-risk, fundamental research can directly result in ground-breaking technologies decades later,” said Acting NSF Assistant Director for Engineering Kesh Narayanan. “In collaboration with the Second Sight team and the courageous patients who volunteered to have experimental surgery to implant the first-generation devices, the researchers of NSF’s Biomimetic MicroElectronic Systems Engineering Research Center are developing technologies that may ultimately have as profound an impact on blindness as the cochlear implant has had for hearing loss.”

Leaving aside controversies about cochlear implants and the possibility of such controversies with artificial retinas (bionic eyes), it’s interesting to note that this device is dependent on an external camera,

The researchers’ efforts have bridged cellular biology–necessary for understanding how to stimulate the retinal ganglion cells without permanent damage–with microelectronics, which led to the miniaturized, low-power integrated chip for performing signal conversion, conditioning and stimulation functions. The hardware was paired with software processing and tuning algorithms that convert visual imagery to stimulation signals, and the entire system had to be incorporated within hermetically sealed packaging that allowed the electronics to operate in the vitreous fluid of the eye indefinitely. Finally, the research team had to develop new surgical techniques in order to integrate the device with the body, ensuring accurate placement of the stimulation electrodes on the retina.

“The artificial retina is a great engineering challenge under the interdisciplinary constraint of biology, enabling technology, regulatory compliance, as well as sophisticated design science,” adds Liu.  [Wentai Liu of the University of California, Los Angeles] “The artificial retina provides an interface between biotic and abiotic systems. Its unique design characteristics rely on system-level optimization, rather than the more common practice of component optimization, to achieve miniaturization and integration. Using the most advanced semiconductor technology, the engine for the artificial retina is a ‘system on a chip’ of mixed voltages and mixed analog-digital design, which provides self-contained power and data management and other functionality. This design for the artificial retina facilitates both surgical procedures and regulatory compliance.”

The Argus II design consists of an external video camera system matched to the implanted retinal stimulator, which contains a microelectrode array that spans 20 degrees of visual field. [emphasis mine] …

“The external camera system, built into a pair of glasses, streams video to a belt-worn computer, which converts the video into stimulus commands for the implant,” says Weiland [USC researcher Jim Weiland]. “The belt-worn computer encodes the commands into a wireless signal that is transmitted to the implant, which has the necessary electronics to receive and decode both wireless power and data. Based on those data, the implant stimulates the retina with small electrical pulses. The electronics are hermetically packaged and the electrical stimulus is delivered to the retina via a microelectrode array.”

You can see some footage of people using artificial retinas in the context of Grégoire Cosendai’s TEDx Vienna presentation. As I noted in my Aug. 18, 2011 posting where this talk and developments in human enhancement are mentioned, the relevant material can be seen at approximately 13 mins., 25 secs. in Cosendai’s talk.

Second Sight Medical Products can be found here.