Category Archives: human enhancement

Fixed: The Science/Fiction of Human Enhancement

First the news, Fixed: The Science/Fiction of Human Enhancement is going to be broadcast on KCTS 9 (PBS [Public Broadcasting Service] station for Seattle/Yakima) on Wednesday, Aug. 26, 2015 at 7 pm PDT. From the KCTS 9 schedule,

From botox to bionic limbs, the human body is more “upgradeable” than ever. But how much of it can we alter and still be human? What do we gain or lose in the process? Award-winning documentary, Fixed: The Science/Fiction of Human Enhancement, explores the social impact of human biotechnologies. Haunting and humorous, poignant and political, Fixed rethinks “disability” and “normalcy” by exploring technologies that promise to change our bodies and minds forever.

This 2013 documentary has a predecessor titled ‘Fixed’, which I wrote about in an August 3, 2010 posting. The director for both ‘Fixeds’ is Regan Brashear.

It seems the latest version of Fixed builds on the themes present in the first, while integrating the latest scientific work (to 2013) in the field of human enhancement (from my August 3, 2010 posting),

As for the film, I found this at the University of California, Santa Cruz,

Fixed is a video documentary that explores the burgeoning field of “human enhancement” technologies from the perspective of individuals with disabilities. Fixed uses the current debates surrounding human enhancement technologies (i.e. bionic limbs, brain machine interfaces, prenatal screening technologies such as PGD or pre-implantation genetic diagnosis, etc.) to tackle larger questions about disability, inequality, and citizenship. This documentary asks the question, “Will these technologies ‘liberate’ humanity, or will they create even more inequality?”

You can find out more about the 2013 Fixed on its website or Facebook page (they list opportunities in the US, in Canada, and internationally to see the documentary). There is also a listing of PBS broadcasts available from the Fixed: The Science/Fiction of Human Enhancement Press page.

I recognized two names from the cast list on the Internet Movie Database (IMDB) page for Fixed: The Science/Fiction of Human Enhancement, Gregor Wolbring (he also appeared in the first ‘Fixed’) and Hugh Herr.

Gregor has been mentioned here a few times in connection with human enhancement. A Canadian professor at the University of Calgary, he’s active in the field of bioethics and you can find out more about Gregor and his work here.

Hugh Herr was first mentioned here in a January 30, 2013 posting titled: The ultimate DIY: ‘How to build a robotic man’ on BBC 4. He is a roboticist at the Massachusetts Institute of Technology (MIT).

The two men offer contrasting perspectives: Gregor Wolbring, ‘we should re-examine the notion that some people are impaired and need to be fixed’, and Hugh Herr, ‘we will eliminate all forms of impairment’. Hopefully, the 2013 documentary has managed to present more of the nuances than I have.

Brain-friendly interface to replace neural prosthetics one day?

This research will not find itself occupying anyone’s brain for some time to come but it is interesting to find out that neural prosthetics have some drawbacks and there is work being done to address them. From an Aug. 10, 2015 news item on Azonano,

Instead of using neural prosthetic devices–which suffer from immune-system rejection and are believed to fail due to a material and mechanical mismatch–a multi-institutional team, including Lohitash Karumbaiah of the University of Georgia’s Regenerative Bioscience Center, has developed a brain-friendly extracellular matrix environment of neuronal cells that contain very little foreign material. These by-design electrodes are shielded by a covering that the brain recognizes as part of its own composition.

An Aug. 5, 2015 University of Georgia news release, which originated the news item, describes the new approach and technique in more detail,

Although once believed to be devoid of immune cells and therefore of immune responses, the brain is now recognized to have its own immune system that protects it against foreign invaders.

“This is not by any means the device that you’re going to implant into a patient,” said Karumbaiah, an assistant professor of animal and dairy science in the UGA College of Agricultural and Environmental Sciences. “This is proof of concept that extracellular matrix can be used to ensheathe a functioning electrode without the use of any other foreign or synthetic materials.”

Implantable neural prosthetic devices in the brain have been around for almost two decades, helping people living with limb loss and spinal cord injury become more independent. However, not only do neural prosthetic devices suffer from immune-system rejection, but most are believed to eventually fail because of a mismatch between the soft brain tissue and the rigid devices.

The collaboration, led by Wen Shen and Mark Allen of the University of Pennsylvania, found that the extracellular matrix derived electrodes adapted to the mechanical properties of brain tissue and were capable of acquiring neural recordings from the brain cortex.

“Neural interface technology is literally mind boggling, considering that one might someday control a prosthetic limb with one’s own thoughts,” Karumbaiah said.

The study’s joint collaborators were Ravi Bellamkonda, who conceived the new approach and is chair of the Wallace H. Coulter Department of Biomedical Engineering at the Georgia Institute of Technology and Emory University, as well as Allen, who at the time was director of the Institute for Electronics and Nanotechnology.

“Hopefully, once we converge upon the nanofabrication techniques that would enable these to be clinically translational, this same methodology could then be applied in getting these extracellular matrix derived electrodes to be the next wave of brain implants,” Karumbaiah said.

Currently, one out of every 190 Americans is living with limb loss, according to the National Institutes of Health. There is a significant burden in cost of care and quality of life for people suffering from this disability.

The research team is one part of many in the prosthesis industry, which includes those who design the robotics for the artificial limbs, others who make the neural prosthetic devices and developers who design the software that decodes the neural signal.

“What neural prosthetic devices do is communicate seamlessly to an external prosthesis,” Karumbaiah said, “providing independence of function without having to have a person or a facility dedicated to their care.”

Karumbaiah hopes further collaboration will allow them to make positive changes in the industry, saying that, “it’s the researcher-to-industry kind of conversation that now needs to take place, where companies need to come in and ask: ‘What have you learned? How are the devices deficient, and how can we make them better?'”

Here’s a link to and a citation for the paper,

Extracellular matrix-based intracortical microelectrodes: Toward a microfabricated neural interface based on natural materials by Wen Shen, Lohitash Karumbaiah, Xi Liu, Tarun Saxena, Shuodan Chen, Radhika Patkar, Ravi V. Bellamkonda, & Mark G. Allen. Microsystems & Nanoengineering 1, Article number: 15010 (2015) doi:10.1038/micronano.2015.10

This appears to be an open access paper.

One final note, I have written frequently about prosthetics and neural prosthetics, which you can find by using either of those terms and/or human enhancement. Here’s my latest piece, a March 25, 2015 posting.

Clinical trial for bionic eye (artificial retinal implant) shows encouraging results (safety and efficacy)

The Argus II artificial retina was first mentioned here in a Feb. 15, 2013 posting (scroll down about 50% of the way) when it received US Food and Drug Administration (FDA) commercial approval. In retrospect that seems puzzling since the results of a three-year clinical trial have just been reported in a June 23, 2015 news item on ScienceDaily (Note: There was one piece of information about the approval which didn’t make its way into the information disseminated in 2013),

The three-year clinical trial results of the retinal implant popularly known as the “bionic eye,” have proven the long-term efficacy, safety and reliability of the device that restores vision in those blinded by a rare, degenerative eye disease. The findings show that the Argus II significantly improves visual function and quality of life for people blinded by retinitis pigmentosa. They are being published online in Ophthalmology, the journal of the American Academy of Ophthalmology.

A June 23, 2015 American Academy of Ophthalmology news release (also on EurekAlert), which originated the news item, describes the condition the Argus II is designed for and that crucial bit of FDA information,

Retinitis pigmentosa is an incurable disease that affects about 1 in 4,000 Americans and causes slow vision loss that eventually leads to blindness.[1] The Argus II system was designed to help provide patients who have lost their sight due to the disease with some useful vision. Through the device, patients with retinitis pigmentosa are able to see patterns of light that the brain learns to interpret as an image. The system uses a miniature video camera stored in the patient’s glasses to send visual information to a small computerized video processing unit which can be stored in a pocket. This computer turns the image to electronic signals that are sent wirelessly to an electronic device implanted on the retina, the layer of light-sensing cells lining the back of the eye.
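The camera-to-implant pipeline described above can be sketched in a few lines of code. This is a toy illustration only, not Second Sight's actual algorithm: the 6×10 grid matches the Argus II's 60-electrode array, but the block-averaging down-sampling and the brightness-to-amplitude mapping here are simplifying assumptions.

```python
import numpy as np

ELECTRODE_ROWS, ELECTRODE_COLS = 6, 10  # the Argus II array has 60 electrodes

def downsample_to_stimulation(frame, max_amplitude=1.0):
    """Down-sample a grayscale camera frame to per-electrode stimulation
    amplitudes (a simplified, hypothetical mapping)."""
    h, w = frame.shape
    bh, bw = h // ELECTRODE_ROWS, w // ELECTRODE_COLS
    # Average the brightness over the block of pixels each electrode covers.
    blocks = frame[:bh * ELECTRODE_ROWS, :bw * ELECTRODE_COLS]
    blocks = blocks.reshape(ELECTRODE_ROWS, bh, ELECTRODE_COLS, bw)
    # Scale 0-255 brightness to a 0..max_amplitude stimulation level.
    return blocks.mean(axis=(1, 3)) / 255.0 * max_amplitude

frame = np.random.randint(0, 256, size=(120, 200)).astype(float)
pattern = downsample_to_stimulation(frame)
print(pattern.shape)  # (6, 10)
```

In the real device this pattern would then be sent wirelessly to the retinal implant, with the brain learning to interpret the resulting spots of light.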

The Argus II received Food and Drug Administration (FDA) approval as a Humanitarian Use Device (HUD) in 2013, which is an approval specifically for devices intended to benefit small populations and/or rare conditions. [emphasis mine]

I don’t recall seeing “Humanitarian Use Device (HUD)” in the 2013 materials which focused on the FDA’s commercial use approval. I gather from this experience that commercial use doesn’t necessarily mean they’ve finished with clinical trials and are ready to start selling the product. In any event, I will try to take a closer look at the actual approvals the next time, assuming I can make sense of the language.

After all the talk about it, here’s what the device looks like,

Caption: Figure A, The implanted portions of the Argus II System. Figure B, The external components of the Argus II System. Images in real time are captured by camera mounted on the glasses. The video processing unit down-samples and processes the image, converting it to stimulation patterns. Data and power are sent via radiofrequency link from the transmitter antenna on the glasses to the receiver antenna around the eye. A removable, rechargeable battery powers the system. Credit: Photo courtesy of Second Sight Medical Products, Inc.

The news release offers more details about the recently completed clinical trial,

To further evaluate the safety, reliability and benefit of the device, a clinical trial of 30 people, aged 28 to 77, was conducted in the United States and Europe. All of the study participants had little or no light perception in both eyes. The researchers conducted visual function tests using both a computer screen and real-world conditions, including finding and touching a door and identifying and following a line on the ground. A Functional Low-vision Observer Rated Assessment (FLORA) was also performed by independent visual rehabilitation experts at the request of the FDA to assess the impact of the Argus II system on the subjects’ everyday lives, including extensive interviews and tasks performed around the home.

The visual function results indicated that up to 89 percent of the subjects performed significantly better with the device. The FLORA found that among the subjects, 80 percent received benefit from the system when considering both functional vision and patient-reported quality of life, and no subjects were affected negatively.

After one year, two-thirds of the subjects had not experienced device- or surgery-related serious adverse events. After three years, there were no device failures. Throughout the three years, 11 subjects experienced serious adverse events, most of which occurred soon after implantation and were successfully treated. One of these treatments, however, was to remove the device due to recurring erosion after the suture tab on the device became damaged.

“This study shows that the Argus II system is a viable treatment option for people profoundly blind due to retinitis pigmentosa – one that can make a meaningful difference in their lives and provides a benefit that can last over time,” said Allen C. Ho, M.D., lead author of the study and director of the clinical retina research unit at Wills Eye Hospital. “I look forward to future studies with this technology which may make possible expansion of the intended use of the device, including treatment for other diseases and eye injuries.”

Here’s a link to a PDF of and a citation for the paper,

Long-Term Results from an Epiretinal Prosthesis to Restore Sight to the Blind by Allen C. Ho, Mark S. Humayun, Jessy D. Dorn, Lyndon da Cruz, Gislin Dagnelie, James Handa, Pierre-Olivier Barale, José-Alain Sahel, Paulo E. Stanga, Farhad Hafezi, Avinoam B. Safran, Joel Salzmann, Arturo Santos, David Birch, Rand Spencer, Artur V. Cideciyan, Eugene de Juan, Jacque L. Duncan, Dean Eliott, Amani Fawzi, Lisa C. Olmos de Koo, Gary C. Brown, Julia A. Haller, Carl D. Regillo, Lucian V. Del Priore, Aries Arditi, Duane R. Geruschat, Robert J. Greenberg. Ophthalmology, June 2015 http://dx.doi.org/10.1016/j.ophtha.2015.04.032

This paper is open access.

Is it time to invest in a ‘brain chip’ company?

This story takes a few twists and turns. First, ‘brain chips’ as they’re sometimes called would allow, theoretically, computers to learn and function like human brains. (Note: There’s another type of ‘brain chip’, which could be implanted in human brains to help deal with diseases such as Parkinson’s and Alzheimer’s. *Today’s [June 26, 2015] earlier posting about an artificial neuron points at some of the work being done in this area.*)

Returning to the ‘brain chip’ at hand. Second, there’s a company called BrainChip, which has one patent and another pending for, yes, a ‘brain chip’.

The company, BrainChip, founded in Australia and now headquartered in California’s Silicon Valley, recently sparked some investor interest in Australia. From an April 7, 2015 article by Timna Jacks for the Australian Financial Review,

Former mining stock Aziana Limited has whetted Australian investors’ appetite for science fiction, with its share price jumping 125 per cent since it announced it was acquiring a US-based tech company called BrainChip, which promises artificial intelligence through a microchip that replicates the neural system of the human brain.

Shares in the company closed at 9¢ before the Easter long weekend, having been priced at just 4¢ when the backdoor listing of BrainChip was announced to the market on March 18.

Creator of the patented digital chip, Peter Van Der Made told The Australian Financial Review the technology has the capacity to learn autonomously, due to its composition of 10,000 biomimic neurons, which, through a process known as synaptic time-dependent plasticity, can form memories and associations in the same way as a biological brain. He said it works 5000 times faster and uses a thousandth of the power of the fastest computers available today.

Mr Van Der Made is inviting technology partners to license the technology for their own chips and products, and is donating the technology to university laboratories in the US for research.

The Netherlands-born Australian, now based in southern California, was inspired to create the brain-like chip in 2004, after working at the IBM Internet Security Systems for two years, where he was chief scientist for behaviour analysis security systems. …

A June 23, 2015 article by Tony Malkovic on phys.org provides a few more details about BrainChip and about the deal,

Mr Van der Made and the company, also called BrainChip, are now based in Silicon Valley in California and he returned to Perth last month as part of the company’s recent merger and listing on the Australian Stock Exchange.

He says BrainChip has the ability to learn autonomously, evolve and associate information and respond to stimuli like a brain.

Mr Van der Made says the company’s chip technology is more than 5,000 times faster than other technologies, yet uses only 1/1,000th of the power.

“It’s a hardware only solution, there is no software to slow things down,” he says.

“It doesn’t execute instructions, it learns and applies what it has learnt to new information.

“BrainChip is on the road to position itself at the forefront of artificial intelligence,” he says.

“We have a clear advantage, at least 10 years, over anybody else in the market, that includes IBM.”

BrainChip is aiming at the global semiconductor market, which includes almost anything that involves a microprocessor.

You can find out more about the company, BrainChip here. The site does have a little more information about the technology,

Spiking Neuron Adaptive Processor (SNAP)

BrainChip’s inventor, Peter van der Made, has created an exciting new Spiking Neural Networking technology that has the ability to learn autonomously, evolve and associate information just like the human brain. The technology is developed as a digital design containing a configurable “sea of biomimic neurons”.

The technology is fast, completely digital, and consumes very low power, making it feasible to integrate large networks into portable battery-operated products, something that has never been possible before.

BrainChip neurons autonomously learn through a process known as STDP (Synaptic Time Dependent Plasticity). BrainChip’s fully digital neurons process input spikes directly in hardware. Sensory neurons convert physical stimuli into spikes. Learning occurs when the input is intense, or repeating through feedback and this is directly correlated to the way the brain learns.
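The STDP rule described above can be sketched as the standard pair-based weight update from the neuroscience literature. This is a generic textbook form, not BrainChip's proprietary hardware implementation, and the learning rates and time constant below are illustrative values.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.05, a_minus=0.055, tau=20.0):
    """Pair-based STDP: strengthen the synapse when the presynaptic spike
    precedes the postsynaptic spike, weaken it otherwise, with an
    exponentially decaying window (times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fired before post: causal pairing, potentiate
        w += a_plus * math.exp(-dt / tau)
    elif dt < 0:  # post fired before pre: anti-causal, depress
        w -= a_minus * math.exp(dt / tau)
    return max(0.0, min(1.0, w))  # clamp weight to [0, 1]

w = stdp_update(0.5, t_pre=10.0, t_post=15.0)  # causal pairing raises w
```

Repeated, intense input produces many causal pairings, which is the sense in which "learning occurs when the input is intense, or repeating".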

Computing Artificial Neural Networks (ANNs)

The brain consists of specialized nerve cells that communicate with one another. Each such nerve cell is called a neuron. The inputs are memory nodes called synapses. When the neuron associates information, it produces a ‘spike’ or a ‘spike train’. Each spike is a pulse that triggers a value in the next synapse. Synapses store values, similar to the way a computer stores numbers. In combination, these values determine the function of the neural network. Synapses acquire values through learning.

In Artificial Neural Networks (ANNs) this complex function is generally simplified to a static summation and compare function, which severely limits computational power. BrainChip has redefined how neural networks work, replicating the behaviour of the brain. BrainChip’s artificial neurons are completely digital and biologically realistic, resulting in increased computational power, high speed and extremely low power consumption.
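The "static summation and compare function" mentioned above is essentially the classic McCulloch-Pitts/perceptron unit, which can be shown in a few lines (a generic textbook sketch, not any company's code):

```python
def artificial_neuron(inputs, weights, threshold):
    """Classic ANN unit: a weighted sum of the inputs compared against a
    fixed threshold. Fires (returns 1) if the sum reaches the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

artificial_neuron([1, 0, 1], [0.4, 0.9, 0.3], threshold=0.5)  # -> 1
```

Note how this collapses all spike timing into a single static number, which is exactly the simplification the spiking-neuron approach is meant to avoid.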

The Problem with Artificial Neural Networks

Standard ANNs, running on computer hardware, are processed sequentially; the processor runs a program that defines the neural network. This consumes considerable time, and because the neurons are processed one after another, the delays add up, resulting in a significant linear decline in network performance with size.

BrainChip neurons are all mapped in parallel, so the performance of the network does not depend on its size, a clear speed advantage. Learning also takes place in parallel within each synapse, making STDP learning very fast.

A hardware solution

BrainChip’s digital neural technology is the only custom hardware solution that is capable of STDP learning. The hardware requires no coding and has no software as it evolves learning through experience and user direction.

The BrainChip neuron is unique in that it is completely digital, behaves asynchronously like an analog neuron, and has a higher level of biological realism. It is more sophisticated than software neural models and is many orders of magnitude faster. The BrainChip neuron consists entirely of binary logic gates with no traditional CPU core. Hence, there are no ‘programming’ steps. Learning and training take the place of programming and coding, like a child learning a task for the first time.

Software ‘neurons’, to compensate for limited processing power, are simplified to a point where they do not resemble any of the features of a biological neuron. This is due to the sequential nature of computers, whereby all data has to pass through a central processor in chunks of 16, 32 or 64 bits. In contrast, the brain’s network is parallel and processes the equivalent of millions of data bits simultaneously.

A significantly faster technology

Performing emulation in digital hardware has distinct advantages over software. As software is processed sequentially, one instruction at a time, Software Neural Networks perform slower with increasing size. Parallel hardware does not have this problem and maintains the same speed no matter how large the network is. Another advantage of hardware is that it is more power efficient by several orders of magnitude.

The speed of the BrainChip device is unparalleled in the industry.

For large neural networks a GPU (Graphics Processing Unit) is ~70 times faster than the Intel i7 executing a similar size neural network. The BrainChip neural network is faster still and takes far fewer CPU (Central Processing Unit) cycles, with just a little communication overhead, which means that the CPU is available for other tasks. The BrainChip network also responds much faster than a software network accelerating the performance of the entire system.

The BrainChip network is completely parallel, with no sequential dependencies. This means that the network does not slow down with increasing size.

Endorsed by the neuroscience community

A number of the world’s pre-eminent neuroscientists have endorsed the technology and agreed to jointly develop projects.

BrainChip has the potential to become the de facto standard for all autonomous learning technology and computer products.

Patented

BrainChip’s autonomous learning technology patent was granted on the 21st September 2008 (Patent number US 8,250,011 “Autonomous learning dynamic artificial neural computing device and brain inspired system”). BrainChip is the only company in the world to have achieved autonomous learning in a network of Digital Neurons without any software.

A prototype Spiking Neuron Adaptive Processor was designed as a ‘proof of concept’ chip.

The first tests were completed at the end of 2007 and this design was used as the foundation for the US patent application which was filed in 2008. BrainChip has also applied for a continuation-in-part patent filed in 2012, the “Method and System for creating Dynamic Neural Function Libraries”, US Patent Application 13/461,800 which is pending.

Van der Made doesn’t seem to have published any papers on this work and the description of the technology provided on the website is frustratingly vague. There are many acronyms for processes but no mention of what this hardware might be. For example, is it based on a memristor or some kind of atomic ionic switch or something else altogether?

It would be interesting to find out more but, presumably, van der Made wishes to withhold details. There are many companies following the same strategy while pursuing what they view as a business advantage.

* Artificial neuron link added June 26, 2015 at 1017 hours PST.

Magnetic sensitivity under the microscope

Humans do not have the sense of magnetoreception (the ability to detect magnetic fields) unless they’ve been enhanced. On the other hand, species of fish, insects, birds, and some mammals (other than humans) possess the sense naturally. Scientists at the University of Tokyo (Japan) have developed a microscope capable of observing magnetoreception, according to a June 4, 2015 news item on Nanowerk (Note: A link has been removed),

Researchers at the University of Tokyo have succeeded in developing a new microscope capable of observing the magnetic sensitivity of photochemical reactions believed to be responsible for the ability of some animals to navigate in the Earth’s magnetic field, on a scale small enough to follow these reactions taking place inside sub-cellular structures (Angewandte Chemie International Edition, “Optical Absorption and Magnetic Field Effect Based Imaging of Transient Radicals”).

A June 4, 2015 University of Tokyo news release on EurekAlert, which originated the news item, describes the research in more detail,

Several species of insects, fish, birds and mammals are believed to be able to detect magnetic fields – an ability known as magnetoreception. For example, birds are able to sense the Earth’s magnetic field and use it to help navigate when migrating. Recent research suggests that a group of proteins called cryptochromes and particularly the molecule flavin adenine dinucleotide (FAD) that forms part of the cryptochrome, are implicated in magnetoreception. When cryptochromes absorb blue light, they can form what are known as radical pairs. The magnetic field around the cryptochromes determines the spins of these radical pairs, altering their reactivity. However, to date there has been no way to measure the effect of magnetic fields on radical pairs in living cells.
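As a rough illustration of how a magnetic field can alter radical-pair reactivity, the textbook toy model has the two unpaired electron spins precessing at slightly different rates, so a pair born in the singlet state oscillates between singlet and triplet states at a frequency set by that rate difference. The closed-form expression below is the standard two-spin result, not a model of the cryptochrome/FAD system itself, and the numbers are purely illustrative.

```python
import math

def singlet_probability(delta_omega, t):
    """Toy radical-pair model: if the two electron spins precess at rates
    differing by delta_omega (rad/s), a pair created in the singlet state
    has singlet population P_S(t) = cos^2(delta_omega * t / 2)."""
    return math.cos(delta_omega * t / 2.0) ** 2

# The pair starts fully singlet...
p0 = singlet_probability(1.0e6, 0.0)
# ...and has fully interconverted to triplet after t = pi / delta_omega.
p_triplet = singlet_probability(1.0e6, math.pi / 1.0e6)
```

Because singlet and triplet pairs react differently, anything that changes delta_omega (such as an external magnetic field) changes the reaction outcome, which is what the TOAD microscope is built to observe.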

The research group of Associate Professor Jonathan Woodward at the Graduate School of Arts and Sciences are specialists in radical pair chemistry and investigating the magnetic sensitivity of biological systems. In this latest research, PhD student Lewis Antill made measurements using a special microscope to detect radical pairs formed from FAD, and the influence of very weak magnetic fields on their reactivity, in volumes less than 4 millionths of a billionth of a liter (4 femtoliters). This was possible using a technique the group developed called TOAD (transient optical absorption detection) imaging, employing a microscope built by postdoctoral research associate Dr. Joshua Beardmore based on a design by Beardmore and Woodward.

“In the future, using another mode of the new microscope called MIM (magnetic intensity modulation), also introduced in this work, it may be possible to directly image only the magnetically sensitive regions of living cells,” says Woodward. “The new imaging microscope developed in this research will enable the study of the magnetic sensitivity of photochemical reactions in a variety of important biological and other contexts, and hopefully help to unlock the secrets of animals’ miraculous magnetic sense.”

Here’s a link to and a citation for the paper,

Optical Absorption and Magnetic Field Effect Based Imaging of Transient Radicals by Dr. Joshua P. Beardmore, Lewis M. Antill, and Prof. Jonathan R. Woodward. Angewandte Chemie International Edition DOI: 10.1002/anie.201502591 Article first published online: 3 JUN 2015

© 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.

I mentioned human enhancement earlier with regard to magnetoreception. There are people (body hackers) who’ve had implants that give them this extra sense. Dann Berg in a March 21, 2012 post on his website blog (iamdann.com) describes why he implanted a magnet into his finger and his experience with it (at that time, three years and counting),

I quickly learned that magnetic surfaces provided almost no sensation at all. Rather, it was movement that caused my finger to perk up. Things like power cord transformers, microwaves, and laptop fans became interactive in a whole new way. Each object has its own unique field, with different strength and “texture.” I started holding my finger over almost everything that I could, getting a feeling for each object’s invisible reach.

Portable electronics proved to be an experience as well. There were two fairly large electronic items that hit the shelves around the same time as I got my implant: the first iPad and the Kindle 2.

Something to consider,

Courtesy: iamdann.com (Dann Berg)

Gray Matters volume 2: Integrative Approaches for Neuroscience, Ethics, and Society issued March 2015 by US Presidential Bioethics Commission

The second and final volume in the Gray Matters set (from the US Presidential Commission for the Study of Bioethical Issues, produced in response to a request from President Barack Obama regarding the BRAIN [Brain Research through Advancing Innovative Neurotechnologies] initiative) has just been released.

The formal title of the latest volume is Gray Matters: Topics at the Intersection of Neuroscience, Ethics, and Society, volume two. (The first was titled Gray Matters: Integrative Approaches for Neuroscience, Ethics, and Society, volume one.)

According to volume 2 of the report’s executive summary,

… In its first volume on neuroscience and ethics, Gray Matters: Integrative Approaches for Neuroscience, Ethics, and Society, the Bioethics Commission emphasized the importance of integrating ethics and neuroscience throughout the research endeavor.1 This second volume, Gray Matters: Topics at the Intersection of Neuroscience, Ethics, and Society, takes an in-depth look at three topics at the intersection of neuroscience and society that have captured the public’s attention.

The Bioethics Commission found widespread agreement that contemporary neuroscience holds great promise for relieving human suffering from a number of devastating neurological disorders. Less agreement exists on multiple other topics, and the Bioethics Commission focused on three cauldrons of controversy—cognitive enhancement, consent capacity, and neuroscience and the legal system. These topics illustrate the ethical tensions and societal implications of advancing neuroscience and technology, and bring into heightened relief many important ethical considerations.

A March 26, 2015 post by David Bruggeman on his Pasco Phronesis blog further describes the 168 pp. second volume of the report,

There are fourteen main recommendations in the report:

Prioritize Existing Strategies to Maintain and Improve Neural Health

Continue to examine and develop existing tools and techniques for brain health

Prioritize Treatment of Neurological Disorders

As with the previous recommendation, it would be valuable to focus on existing means of addressing neurological disorders and working to improve them.

Study Novel Neural Modifiers to Augment or Enhance Neural Function

Existing research in this area is limited and inconclusive.

Ensure Equitable Access to Novel Neural Modifiers to Augment or Enhance Neural Function

Access to cognitive enhancements will need to be handled carefully to avoid exacerbating societal inequities (think the stratified societies of the film Elysium or the Star Trek episode “The Cloud Minders“).

Create Guidance About the Use of Neural Modifiers

Professional societies and expert groups need to develop guidance for health care providers that receive requests for prescriptions for cognitive enhancements (something like an off-label use of attention deficit drugs, beta blockers or other medicines to boost cognition rather than address perceived deficits).

If you don’t have time to look at the 2nd volume, David’s post covers many of the important points.

Think of your skin as a smartphone

A March 5, 2015 news item on Azonano highlights work on flexible, transparent electronics designed to adhere to your skin,

Someone wearing a smartwatch can look at a calendar or receive e-mails without having to reach further than their wrist. However, the interaction area offered by the watch face is both fixed and small, making it difficult to actually hit individual buttons with adequate precision. A method currently being developed by a team of computer scientists from Saarbrücken in collaboration with researchers from Carnegie Mellon University in the USA may provide a solution to this problem. They have developed touch-sensitive stickers made from flexible silicone and electrically conducting sensors that can be worn on the skin.

Here’s what the sticker looks like,

Caption: The stickers are skin-friendly and are attached to the skin with a biocompatible, medical-grade adhesive. Credit: Oliver Dietze. Courtesy: Saarland University

A March 4, 2015 University of Saarland press release on EurekAlert, which originated the news item, expands on the theme of connecting technology to the body,

… The stickers can act as an input space that receives and executes commands and thus controls mobile devices. Depending on the type of skin sticker used, applying pressure to the sticker could, for example, answer an incoming call or adjust the volume of a music player. ‘The stickers allow us to enlarge the input space accessible to the user as they can be attached practically anywhere on the body,’ explains Martin Weigel, a PhD student in the team led by Jürgen Steimle at the Cluster of Excellence at Saarland University. The ‘iSkin’ approach enables the human body to become more closely connected to technology. [emphasis mine]

Users can also design their iSkin patches on a computer beforehand to suit their individual tastes. ‘A simple graphics program is all you need,’ says Weigel. One sticker, for instance, is based on musical notation, another is circular in shape like an LP. The silicone used to fabricate the sensor patches makes them flexible and stretchable. ‘This makes them easier to use in an everyday environment. The music player can simply be rolled up and put in a pocket,’ explains Jürgen Steimle, who heads the ‘Embodied Interaction Group’ in which Weigel is doing his research. ‘They are also skin-friendly, as they are attached to the skin with a biocompatible, medical-grade adhesive. Users can therefore decide where they want to position the sensor patch and how long they want to wear it.’

In addition to controlling music or phone calls, the iSkin technology could be used for many other applications. For example, a keyboard sticker could be used to type and send messages. Currently the sensor stickers are connected via cable to a computer system. According to Steimle, in-built microchips may in future allow the skin-worn sensor patches to communicate wirelessly with other mobile devices.
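The press release describes pressure on a sticker answering a call or adjusting volume, with different stickers attached at different places on the body. A minimal sketch of that input-routing idea might look like the following. All of the names here (the `StickerDispatcher` class, the gesture and sticker labels) are invented for illustration; they are not part of the iSkin project.

```python
# Hypothetical sketch of routing touch events from skin-worn sensor
# patches to device actions, as described in the press release.
# Names and gestures are illustrative only, not iSkin's actual API.

from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass(frozen=True)
class TouchEvent:
    sticker_id: str   # which patch on the body was touched
    gesture: str      # e.g. "tap", "press", "slide_up"

class StickerDispatcher:
    """Maps (sticker, gesture) pairs to device actions."""

    def __init__(self) -> None:
        self._bindings: Dict[Tuple[str, str], Callable[[], str]] = {}

    def bind(self, sticker_id: str, gesture: str,
             action: Callable[[], str]) -> None:
        self._bindings[(sticker_id, gesture)] = action

    def dispatch(self, event: TouchEvent) -> str:
        action = self._bindings.get((event.sticker_id, event.gesture))
        return action() if action else "ignored"

dispatcher = StickerDispatcher()
dispatcher.bind("forearm", "tap", lambda: "answer call")
dispatcher.bind("forearm", "slide_up", lambda: "volume up")
```

The point of the sketch is the enlarged input space the researchers mention: because bindings are keyed by sticker location as well as gesture, the same tap can mean different things on different parts of the body.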

The publication about ‘iSkin’ won the ‘Best Paper Award’ at the SIGCHI conference, which ranks among the most important conferences within the research area of human computer interaction. The researchers will present their project at the SIGCHI conference in April [2015] in Seoul, Korea, and beforehand at the computer expo Cebit, which takes place from the 16th until the 20th of March [2015] in Hannover (hall 9, booth E13).

Hopefully, you’ll have a chance to catch the researchers’ presentations at the SIGCHI or Cebit events.

That quote about enabling “the human body to become more closely connected to technology” reminds me of a tag (machine/flesh) I created to categorize research of this nature. I explained the idea being explored in a May 9, 2012 posting titled: Everything becomes part machine,

Machine/flesh. That’s what I’ve taken to calling this process of integrating machinery into our and, as I newly realized, other animals’ flesh.

I think my most recent post on this topic was a Jan. 10, 2014 post titled: Chemistry of Cyborgs: review of the state of the art by German researchers.

More about MUSE, a Canadian company and its brain sensing headband; women and startups; Canadianness

I first wrote about Ariel Garten and her Toronto-based (Canada) company, InteraXon, in a Dec. 5, 2012 posting where I featured a product, MUSE (Muse), then described as a brainwave controller. A March 5, 2015 article by Lydia Dishman for Fast Company provides an update on the product now described as a brainwave-sensing headband and on the company (Note: Links have been removed),

The technology that had captured the imagination of millions was then incorporated to develop a headband called Muse. It sells at retail stores like BestBuy for about $300 and works in conjunction with an app called Calm as a tool to increase focus and reduce stress.

If you always wanted to learn to meditate without those pesky distracting thoughts commandeering your mind, Muse can help by taking you through a brief exercise that translates brainwaves into the sound of wind. Losing focus or getting antsy brings on the gales. Achieving calm rewards you with a flock of birds across your screen.

The company has grown to 50 employees and has raised close to $10 million from investors including Ashton Kutcher. Garten [Ariel Garten, founder and Chief Executive Officer] says they’re about to close on a Series B round, “which will be significant.”

She says that listening plays an important role at InteraXon. Reflecting back on what you think you heard is an exercise she encourages, especially in meetings. When the development team is building a tool, for example, they use their Muses to meditate and focus, which then allows for listening more attentively and nonjudgmentally.
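Dishman’s description of Muse amounts to a neurofeedback loop: a measure of calm derived from brainwaves drives the audio, with lost focus bringing on wind and sustained calm rewarded with birds. A toy sketch of that loop, assuming an invented calm score between 0.0 and 1.0 with made-up thresholds (this is not Muse’s actual algorithm), might look like this:

```python
# Illustrative neurofeedback mapping as described in the article:
# a per-moment "calm" score (0.0-1.0) selects the audio feedback.
# The score and thresholds are invented; Muse's real processing
# of EEG data is proprietary and considerably more involved.

def feedback(calm_score: float) -> str:
    """Map a calm score to the feedback the article describes."""
    if calm_score < 0.4:
        return "strong wind"   # losing focus brings on the gales
    if calm_score < 0.8:
        return "light breeze"
    return "birds"             # sustained calm is rewarded
```

The design point is that the user never sees the raw brainwave data, only an ambient soundscape that nudges them back toward a calm state.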

Women and startups

Dishman references gender and high tech financing in her article about Garten,

Garten doesn’t dwell on her status as a woman in a mostly male-dominated sector. That goes for securing funding for the startup too, despite the notorious bias venture-capital investors have against women startup founders.

“I am sure I lost deals because I am a woman, but also because the idea didn’t resonate,” she says, adding, “I’m sure I gained some because I am a woman, so it is unfair to put a blanket statement on it.”

Yet Garten is the only female member of her C-suite, something she says “is just the way it happened.” Casting the net recently to fill the role of chief operating officer [COO], Garten says there weren’t any women in the running, in part because the position required hardware experience as well as knowledge of working with the Chinese.

She did just hire a woman to be senior vice president of sales and marketing, and says, “When we are hiring younger staff, we are gender agnostic.”

I can understand wanting to introduce nuance into the ‘gender bias and tech startup’ discussion by noting that some rejections could have been due to issues with the idea or its implementation. But the comment that being the only female member of her C-suite is “just the way it happened” suggests she is either extraordinarily naïve or willfully blind; given her followup statement about her hiring practices, I’m inclined to go with willfully blind. It’s hard to believe she couldn’t find any woman with both hardware experience and experience working with China. It seems more likely she needed a male COO to counterbalance a company with a female CEO. As for being gender agnostic where younger staff are concerned, that’s nice but not reassuring, since women have long been able to get junior positions. It’s the senior positions, such as COO, that remain out of reach and, troublingly, Garten seems to have blown off the question with a weak explanation and a glib assurance of equality at the lower levels of the company.

For more about gender, high tech companies, and hiring/promoting practices, you can read a March 5, 2015 article titled, Ellen Pao Trial Reveals the Subtle Sexism of Silicon Valley, by Amanda Marcotte for Slate.

Getting back to MUSE, you can find out more here. You can find out more about InteraXon here. Unusually, there doesn’t seem to be any information about the management team on the website.

Canadianness

I thought it was interesting that InteraXon’s status as a Canada-based company was mentioned nowhere in Dishman’s article. This is in stark contrast to Nancy Owano’s Dec. 5, 2012 article for phys.org,

A Canadian company is talking about having a window, aka computer screen, into your mind. … InteraXon, a Canadian company, is focused on making a business out of mind-control technology via a headband device, and they are planning to launch this as a $199 brainwave computer controller called Muse. … [emphases mine]

This is not the only recent instance I’ve noticed. My Sept. 1, 2014 posting mentions what was then an upcoming Margaret Atwood event at Arizona State University,

… (from the center’s home page [Note: The center is ASU’s Center for Science and the Imagination]),

Internationally renowned novelist and environmental activist Margaret Atwood will visit Arizona State University this November [2014] to discuss the relationship between art and science, and the importance of creative writing and imagination for addressing social and environmental challenges.

Atwood’s visit will mark the launch of the Imagination and Climate Futures Initiative … Atwood, author of the MaddAddam trilogy of novels that have become central to the emerging literary genre of climate fiction, or “CliFi,” will offer the inaugural lecture for the initiative on Nov. 5.

“We are proud to welcome Margaret Atwood, one of the world’s most celebrated living writers, to ASU and engage her in these discussions around climate, science and creative writing,” …  “A poet, novelist, literary critic and essayist, Ms. Atwood epitomizes the creative and professional excellence our students aspire to achieve.”

There’s not a single mention that she is Canadian there or in a recent posting by Martin Robbins about a word purge from the Oxford Junior Dictionary published by the Guardian science blog network (March 3, 2015 posting). In fact, Atwood was initially described by Robbins as one of Britain’s literary giants. I assume there were howls of anguish once Canadians woke up to read the article since the phrase was later amended to “a number of the Anglosphere’s literary giants.”

The omission of InteraXon’s Canadianness from Dishman’s article for an American online magazine, and of Atwood’s from the Arizona State University website, together with Martin Robbins’ initial appropriation and later change to the vague-sounding “Anglosphere” in his post for the British newspaper, The Guardian, means the bulk of their readers will likely assume InteraXon is American and that Margaret Atwood, depending on where you read about her, is either an American or a Brit.

It’s flattering that others want to grab a little bit of Canada for themselves.

Coda: The Oxford Junior Dictionary and its excision of ‘nature’ words


Robbins’ March 3, 2015 posting focused on a heated literary discussion about the excision of these words from the Oxford Junior Dictionary (Note:  A link has been removed),

“The deletions,” according to Robert Macfarlane in another article on Friday, “included acorn, adder, ash, beech, bluebell, buttercup, catkin, conker, cowslip, cygnet, dandelion, fern, hazel, heather, heron, ivy, kingfisher, lark, mistletoe, nectar, newt, otter, pasture and willow. The words taking their places in the new edition included attachment, block-graph, blog, broadband, bullet-point, celebrity, chatroom, committee, cut-and-paste, MP3 player and voice-mail.”

I’m surprised the ‘junior’ dictionary didn’t have “attachment,” “celebrity,” and “committee” prior to the 2007 purge. By the way, it seems no one noticed the purge until recently. Robbins has an interesting take on the issue, one with which I do not entirely agree. I understand needing to purge words, but what happens when a child reading a classic such as ‘The Wind in the Willows’ attempts to look up the word ‘willows’? (Thanks to Susan Baxter, who in a private communication pointed out the problems inherent in reading new and/or classic books and not being able to find basic vocabulary.)

Disability and technology

There’s a human enhancement or, more specifically, a ‘technology and disability’ event being held by Future Tense (a collaboration between Slate.com, New America, and Arizona State University) on March 4, 2015. Here’s more from the Feb. 20, 2015 Slate.com post,

Attention-grabbing advances in robotics and neurotechnology have caused many to rethink the concept of human disability. A paraplegic man in a robotic suit took the first kick at the 2014 World Cup, for instance, and the FDA has approved a bionic arm controlled with signals from the brain. It’s not hard to imagine that soon these advances may allow people to run, lift, and even think better than what is currently considered “normal”—challenging what it means to be human. But some in the disability community reject these technologies; for others, accessing them can be an overwhelmingly expensive and bureaucratic process. As these technological innovations look more and more like human engineering, will we need to reconsider what it means to be able and disabled?

We’ll discuss these questions and more at noon [EST] on Wednesday, March 4, at the New America office in Washington, D.C. The event is presented by Future Tense in collaboration with the award-winning documentary on disability and technology Fixed: The Science/Fiction of Human Enhancement [mentioned in an Aug. 3, 2010 posting]. You can find the event agenda and the trailer for Fixed below; to RSVP, click here. The venue is wheelchair accessible, and an American Sign Language interpreter will be present.

The Will Technology Put an End to Disability? event page includes an agenda,

Agenda:

12:00 pm Engineering Ability

Jennifer French
Executive Director, Neurotech Network

Larry Jasinski
CEO, ReWalk Robotics
@ReWalk_Robotics

Will Oremus
Senior Technology Writer, Slate
@WillOremus

12:45 pm T​he Promise and Peril of Human Enhancement

​Gregor Wolbring
Associate Professor, University of Calgary
@Wolbring

Julia Bascom
Director of Programs, Autistic Self Advocacy Network
@autselfadvocacy

Teresa Blankmeyer Burke
Assistant Professor of Philosophy, Gallaudet University
@teresaburke

Moderator:
Lawrence Carter-Long
Public Affairs Specialist, National Council on Disability
@LCarterLong

Gregor Wolbring, who’s scheduled for the 12:45 pm EST panel, has been mentioned here more than once (most recently in the Jan. 10, 2014 posting titled, Chemistry of Cyborgs: review of the state of the art by German researchers, which includes further links). Gregor is also mentioned in the Aug. 3, 2010 posting about the movie ‘Fixed’. You can find out more about Wolbring and his work here.

Coincidentally, there’s a March 2, 2015 article titled: Deus Ex and Human Enhancement by Adam Koper for nouse.co.uk, which conflates the notions of nanotechnology and human enhancement. It’s a well written and interesting article (with one proviso) about a game, Deus Ex, which features nanotechnology-enabled human enhancement. Despite Koper’s description, not all human enhancement is nanotechnology-enabled and not all nanotechnology-enabled solutions are oriented to human enhancement, though many human enhancement efforts are enabled by nanotechnology.

By the way, the game is published in Montréal (Québec, Canada) by Eidos (you will need your French language skills; I was not able to find an English language site).

Mind-controlled prostheses ready for real world activities

There’s some exciting news from Sweden’s Chalmers University of Technology about prosthetics. From an Oct. 8, 2014 news item on ScienceDaily,

For the first time, robotic prostheses controlled via implanted neuromuscular interfaces have become a clinical reality. A novel osseointegrated (bone-anchored) implant system gives patients new opportunities in their daily life and professional activities.

In January 2013 a Swedish arm amputee was the first person in the world to receive a prosthesis with a direct connection to bone, nerves and muscles. …

An Oct. 8, 2014 Chalmers University press release (also on EurekAlert), which originated the news item, provides more details about the research and this ‘real world’ prosthetic device,

“Going beyond the lab to allow the patient to face real-world challenges is the main contribution of this work,” says Max Ortiz Catalan, research scientist at Chalmers University of Technology and leading author of the publication.

“We have used osseointegration to create a long-term stable fusion between man and machine, where we have integrated them at different levels. The artificial arm is directly attached to the skeleton, thus providing mechanical stability. Then the human’s biological control system, that is nerves and muscles, is also interfaced to the machine’s control system via neuromuscular electrodes. This creates an intimate union between the body and the machine; between biology and mechatronics.”

The direct skeletal attachment is created by what is known as osseointegration, a technology in limb prostheses pioneered by associate professor Rickard Brånemark and his colleagues at Sahlgrenska University Hospital. Rickard Brånemark led the surgical implantation and collaborated closely with Max Ortiz Catalan and Professor Bo Håkansson at Chalmers University of Technology on this project.

The patient’s arm was amputated over ten years ago. Before the surgery, his prosthesis was controlled via electrodes placed over the skin. Robotic prostheses can be very advanced, but such a control system makes them unreliable and limits their functionality, and patients commonly reject them as a result.

Now, the patient has been given a control system that is directly connected to his own. He has a physically challenging job as a truck driver in northern Sweden, and since the surgery he has experienced that he can cope with all the situations he faces; everything from clamping his trailer load and operating machinery, to unpacking eggs and tying his children’s skates, regardless of the environmental conditions (read more about the benefits of the new technology below).

The patient is also one of the first in the world to take part in an effort to achieve long-term sensation via the prosthesis. Because the implant is a bidirectional interface, it can also be used to send signals in the opposite direction – from the prosthetic arm to the brain. This is the researchers’ next step, to clinically implement their findings on sensory feedback.

“Reliable communication between the prosthesis and the body has been the missing link for the clinical implementation of neural control and sensory feedback, and this is now in place,” says Max Ortiz Catalan. “So far we have shown that the patient has a long-term stable ability to perceive touch in different locations in the missing hand. Intuitive sensory feedback and control are crucial for interacting with the environment, for example to reliably hold an object despite disturbances or uncertainty. Today, no patient walks around with a prosthesis that provides such information, but we are working towards changing that in the very short term.”

The researchers plan to treat more patients with the novel technology later this year.

“We see this technology as an important step towards more natural control of artificial limbs,” says Max Ortiz Catalan. “It is the missing link for allowing sophisticated neural interfaces to control sophisticated prostheses. So far, this has only been possible in short experiments within controlled environments.”

The researchers have provided an image of the patient using his prosthetic arm in the context of his work as a truck driver,

[downloaded from http://www.chalmers.se/en/news/Pages/Mind-controlled-prosthetic-arms-that-work-in-daily-life-are-now-a-reality.aspx]


The news release offers some additional information about the device,

The new technology is based on the OPRA treatment (osseointegrated prosthesis for the rehabilitation of amputees), where a titanium implant is surgically inserted into the bone and becomes fixated to it by a process known as osseointegration (Osseo = bone). A percutaneous component (abutment) is then attached to the titanium implant to serve as a metallic bone extension, where the prosthesis is then fixated. Electrodes are implanted in nerves and muscles as the interfaces to the biological control system. These electrodes record signals which are transmitted via the osseointegrated implant to the prostheses, where the signals are finally decoded and translated into motions.
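The decode step the news release describes — implanted electrode signals transmitted through the osseointegrated implant and translated into motions — can be sketched in grossly simplified form. Real myoelectric decoding relies on trained pattern-recognition models over many channels; the feature, thresholds, and two-channel rule below are invented for illustration and are not the Chalmers group’s actual algorithm.

```python
# Highly simplified sketch of translating recorded muscle signals
# into motion commands, per the OPRA description above. The feature
# (mean absolute value) is a common EMG feature, but the decision
# rules and thresholds here are invented for illustration.

from typing import List

def mean_absolute_value(samples: List[float]) -> float:
    """A common, simple EMG feature: mean absolute amplitude."""
    return sum(abs(s) for s in samples) / len(samples)

def decode(flexor: List[float], extensor: List[float]) -> str:
    """Translate two muscle channels into a hand command."""
    f = mean_absolute_value(flexor)
    e = mean_absolute_value(extensor)
    if f < 0.1 and e < 0.1:
        return "rest"                  # neither muscle active
    return "close hand" if f > e else "open hand"
```

The bidirectional interface mentioned in the release would add the reverse path: sensor readings from the prosthesis encoded as stimulation patterns sent back along the implanted electrodes.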

There are also some videos of the patient demonstrating various aspects of this device available here (keep scrolling) along with more details about what makes this device so special.

Here’s a link to and a citation for the research paper,

An osseointegrated human-machine gateway for long-term sensory feedback and motor control of artificial limbs by Max Ortiz-Catalan, Bo Håkansson, and Rickard Brånemark. Science Translational Medicine, 8 October 2014: Vol. 6, Issue 257, p. 257re6. DOI: 10.1126/scitranslmed.3008933

This article is behind a paywall and it appears to be part of a special issue or a special section of an issue, so keep scrolling down the linked-to page to find more articles on this topic.

I have written about similar research in the past. Notably, there’s a July 19, 2011 post about work on Intraosseous Transcutaneous Amputation Prosthesis (ITAP) and a May 17, 2012 post featuring a video of a woman reaching with a robotic arm for a cup of coffee using her thoughts alone to control the arm.