Tag Archives: human enhancement

Chemistry of Cyborgs: review of the state of the art by German researchers

Communication between man and machine – a fascinating area at the interface of chemistry, biomedicine, and engineering. (Figure: KIT/S. Giselbrecht, R. Meyer, B. Rapp)

German researchers at the Karlsruhe Institute of Technology (KIT), Professor Christof M. Niemeyer and Dr. Stefan Giselbrecht of the Institute for Biological Interfaces 1 (IBG 1) and Dr. Bastian E. Rapp of the Institute of Microstructure Technology (IMT), have written a good overview of the current state of cyborgs while pointing out some of the ethical issues associated with this field. From the Jan. 10, 2014 news item on ScienceDaily,

Medical implants, complex interfaces between brain and machine or remotely controlled insects: Recent developments combining machines and organisms have great potentials, but also give rise to major ethical concerns. In a new review, KIT scientists discuss the state of the art of research, opportunities, and risks.

The Jan. ?, 2014 KIT press release (also on EurekAlert with a release date of Jan. 10, 2014), which originated the news item, describes the innovations and the work at KIT in more detail,

They are known from science fiction novels and films – technically modified organisms with extraordinary skills, so-called cyborgs. This name originates from the English term “cybernetic organism”. In fact, cyborgs that combine technical systems with living organisms are already reality. The KIT researchers Professor Christof M. Niemeyer and Dr. Stefan Giselbrecht of the Institute for Biological Interfaces 1 (IBG 1) and Dr. Bastian E. Rapp, Institute of Microstructure Technology (IMT), point out that this especially applies to medical implants.

In recent years, medical implants based on smart materials that automatically react to changing conditions, computer-supported design and fabrication based on magnetic resonance tomography datasets, and surface modifications for improved tissue integration have allowed major progress to be achieved. For successful tissue integration and the prevention of inflammation reactions, special surface coatings were developed at KIT as well, e.g. under the multidisciplinary Helmholtz program “BioInterfaces”.

Progress in microelectronics and semiconductor technology has been the basis of electronic implants controlling, restoring or improving the functions of the human body, such as cardiac pacemakers, retina implants, hearing implants, or implants for deep brain stimulation in pain or Parkinson therapies. Currently, bioelectronic developments are being combined with robotics systems to design highly complex neuroprostheses. Scientists are working on brain-machine interfaces (BMI) for the direct physical contacting of the brain. BMI are used among others to control prostheses and complex movements, such as gripping. Moreover, they are important tools in neurosciences, as they provide insight into the functioning of the brain. Apart from electric signals, substances released by implanted micro- and nanofluidic systems in a spatially or temporarily controlled manner can be used for communication between technical devices and organisms.

BMI are often considered data suppliers. However, they can also be used to feed signals into the brain, which is a highly controversial issue from the ethical point of view. “Implanted BMI that feed signals into nerves, muscles or directly into the brain are already used on a routine basis, e.g. in cardiac pacemakers or implants for deep brain stimulation,” Professor Christof M. Niemeyer, KIT, explains. “But these signals are neither planned to be used nor suited to control the entire organism – brains of most living organisms are far too complex.”

Brains of lower organisms, such as insects, are less complex. As soon as a signal is coupled in, a certain movement program, such as running or flying, is started. So-called biobots, i.e. large insects with implanted electronic and microfluidic control units, are used in a new generation of tools, such as small flying objects for monitoring and rescue missions. In addition, they are applied as model systems in neurosciences in order to understand basic relationships.

Electrically active medical implants that are used for longer terms depend on reliable power supply. Presently, scientists are working on methods to use the patient body’s own thermal, kinetic, electric or chemical energy.

In their review the KIT researchers sum up that developments combining technical devices with organisms have a fascinating potential. They may considerably improve the quality of life of many people in the medical sector in particular. However, ethical and social aspects always have to be taken into account.

After skimming the paper, I can say the researchers are most interested in the science and technology aspects, but they do have this to say about ethical and social issues in the paper’s conclusion (Note: Links have been removed),

The research and development activities summarized here clearly raise significant social and ethical concerns, in particular, when it comes to the use of BMIs for signal injection into humans, which may lead to modulation or even control of behavior. The ethical issues of this new technology have been discussed in the excellent commentary of Jens Clausen,33 which we highly recommend for further reading. The recently described engineering of a synthetic polymer construct, which is capable of propulsion in water through a collection of adhered rat cardiomyocytes,77 a “medusoid” also described as a “cyborg jellyfish with a rat heart”, brings up an additional ethical aspect. The motivation of the work was to reverse-engineer muscular pumps, and it thus represents fundamental research in tissue engineering for biomedical applications. However, it is also an impressive, early demonstration that autonomous control of technical devices can be achieved through small populations of cells or microtissues. It seems reasonable that future developments along this line will strive, for example, to control complex robots through the use of brain tissue. Given the fact that the robots of today are already capable of autonomously performing complex missions, even in unknown territories,78 this approach might indeed pave the way for yet another entirely new generation of cybernetic organisms.

Here’s a link to and a citation for the English language version of the paper, which is open access (as of Jan. 10, 2014),

The Chemistry of Cyborgs—Interfacing Technical Devices with Organisms by Dr. Stefan Giselbrecht, Dr. Bastian E. Rapp, & Prof. Dr. Christof M. Niemeyer. Angewandte Chemie International Edition, Volume 52, Issue 52, pages 13942–13957, December 23, 2013. Article first published online: Nov. 29, 2013. DOI: 10.1002/anie.201307495

Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

For those with German language skills,

Chemie der Cyborgs – zur Verknüpfung technischer Systeme mit Lebewesen by Stefan Giselbrecht, Bastian E. Rapp, and Christof M. Niemeyer. Angewandte Chemie, Volume 125, Issue 52, page 14190, December 23, 2013. DOI: 10.1002/ange.201307495

I have written many times about cyborgs and neuroprosthetics, including an Aug. 30, 2011 posting titled ‘Eye, arm, & leg prostheses, cyborgs, eyeborgs, Deus Ex, and ableism’, where I mention Gregor Wolbring, a Canadian academic (University of Calgary) who has written extensively on the social and ethical issues of human enhancement technologies. You can find out more on his blog, Nano and Nano- Bio, Info, Cogno, Neuro, Synbio, Geo, Chem…

For anyone wanting to search this blog for these pieces, try using the term machine/flesh as a tag, as well as human enhancement, neuroprostheses, cyborgs …

Review of ‘You Are Very Star’ transmedia show (in Vancouver, Canada)

Blasting backwards (1968) and forwards (2048) in time, the You Are Very Star immersive, transmedia experience offered by Vancouver’s Electric Company Theatre is an exciting experiment as I discovered on opening night, June 15, 2013.

Don’t expect to sit passively in a theatre seat. We trotted around the building to visit, or remember, 1968 in one theatre, then zipped out to a 20-minute 2013 intermission where we played a game (they gave us maps with our programmes, which we were invited to return at the end of the intermission), and, finally, were escorted to the planetarium theatre to encounter 2048.

I’m not sure about the artistic intention for the 1968 portion of the show. It was one of those situations where my tiny bit of knowledge and considerable fund of ignorance combined to create confusion. For example, one of the characters, Earle Birney, a poet, writer, and titan of Canadian literature, did found the creative writing programme at the University of British Columbia as they note in the show but by 1968 he’d left Vancouver for Toronto. One of the other characters in this segment is called Esther, a black feminist and more, with whom Birney’s character appears to establish a liaison. Birney was married to an Esther whom I met some years ago. She was a white Englishwoman and a feminist but of a somewhat different character than the Esther of the play.

In addition, the clothing wasn’t quite right. No tie dye, no macrame, no beads, no granny dresses, and not enough fringe. Plus, I can’t recall seeing any bell bottom pants, mini dresses and skirts, and/or go go boots.

There were some interesting tonal changes in this section, ranging from humour to political angst and anger to pathos. The depiction of the professor who’s decided to let people grade themselves and who takes a hallucinogenic drug in front of his class seemed pretty typical of a few of the crackpot professors of the time.

Unexpectedly, the professor decides to get high on ayahuasca. LSD, magic mushrooms, marijuana and hashish would have been more typical. I can understand clothing and some of the dialogue not being typical of the period, but getting the preferred drugs wrong seems odd, which is why I questioned *whether the artists introduced these incongruities intentionally.

The actors all shone at one time or another as they delivered some pretty snappy dialogue. I’m hoping they tighten this section up so there’s less waiting for the snappy stuff, and perhaps they could find some device other than ‘xx hours/days earlier’ to signify a change in the timeframe. I lost count of how often they flashed a slide onscreen notifying us that the next scene took place at an earlier time. Finally, I loved the shadow puppets, but they were on for a brief time only, never to return.

Our intermission was pretty active. There were lots of places on the map, given with the programme, where one was meant to discover things. I never did figure out what was happening with the stuffed toys that were being given out but I’m ok with those kinds of mysteries.

The last stop was the planetarium theatre for 2048. Very interesting costuming, especially the head gear. Still, I have to ask why do people in the future, in the more ‘optimistic’ versions of it, tend to wear white?

I found 2048 the most interesting part but that may be due to the references to human enhancement (a topic I’ve covered here a number of times). The playwrights also seem to have spent some time studying Ray Kurzweil and the singularity he’s predicting. From the Technological singularity essay on Wikipedia (Note: Links and footnotes have been removed),

The technological singularity is the theoretical emergence of superintelligence through technological means. Since the capabilities of such intelligence would be difficult for an unaided human mind to comprehend, the technological singularity is seen as an occurrence beyond which events cannot be predicted.

The first use of the term “singularity” in this context was by mathematician John von Neumann. Neumann in the mid-1950s spoke of “ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue”. The term was popularized by science fiction writer Vernor Vinge, who argues that artificial intelligence, human biological enhancement, or brain-computer interfaces could be possible causes of the singularity. Futurist Ray Kurzweil cited von Neumann’s use of the term in a foreword to von Neumann’s classic The Computer and the Brain.

Proponents of the singularity typically postulate an “intelligence explosion”, where superintelligences design successive generations of increasingly powerful minds, that might occur very quickly and might not stop until the agent’s cognitive abilities greatly surpass that of any human.

Kurzweil predicts the singularity to occur around 2045 whereas Vinge predicts some time before 2030. At the 2012 Singularity Summit, Stuart Armstrong did a study of artificial general intelligence (AGI) predictions by experts and found a wide range of predicted dates, with a median value of 2040. His own prediction on reviewing the data is that there’s an 80% probability that the singularity will occur in a range of 5 to 100 years. An alternative view, the “mitochondrial singularity,” proposed by microbiologist Joan Slonczewski, holds that the singularity is a gradual process that began centuries ago, as humans have outsourced our intelligence to machines, and that we may become reduced to vestigial power providers analogous to the mitochondria of living cells.

I thank the playwrights for introducing into this piece some of the more difficult aspects of the science and technology discussions that are taking place. For example, those who are enhanced and moving towards the singularity and those who are not enhanced are both represented here, so the playwrights have introduced some ideas about the social implications of employing new and emerging technologies.

You Are Very Star is not a perfect production, but it is, as I noted earlier, very exciting, both for the ways the company is trying to immerse audiences in an experience and for the ideas and dialogue they are attempting to stimulate.

The show goes on until June 29, 2013, and tickets are $30.

This production is being held at:

H.R. MacMillan Space Centre
1100 Chestnut Street, in Vanier Park
8:00pm Tues – Sun
2:00pm Sun
12:00pm Thurs June 20

Do enjoy!

* Correction June 19, 2013: ‘where’ changed to ‘whether’

ETA June 24, 2013: I noticed that where I use the word ‘enhancement’, other reviewers, such as Colin Thomas in his June 17, 2013 review for the Georgia Straight, are using ‘augment’.

Whose Electric Brain? the video

After a few fits and starts, the video of my March 15, 2012 presentation to the Canadian Academy of Independent Scholars at Simon Fraser University has been uploaded to Vimeo. Unfortunately, the original recording was fuzzy (camera issues), so Sama Shodjai [samashodjai@gmail.com] (camera operator, director, and editor) and I rerecorded the presentation, and this second version is the one we’ve uploaded.

Whose Electric Brain? (Presentation) from Maryse de la Giroday on Vimeo.

I’ve come across a few errors: at one point, I refer to Buckminster Fuller as Buckminster Fullerene, and I state that the opening image visualizes a neuron from someone with Parkinson’s disease when I should have said Huntington’s disease. Perhaps you’ll come across more; please do let me know. If this should become a viral sensation (no doubt feeding a pent-up demand for grey-haired women talking about memristors and brains), it’s important that corrections be added.

Finally, a big thank you to Mark Dwor who provides my introduction at the beginning, the Canadian Academy of Independent Scholars whose grant made the video possible, and Simon Fraser University.

ETA March 29, 2012: This is an updated version of the presentation I was hoping to give at ISEA (International Symposium on Electronic Arts) 2011 in Istanbul. Sadly, I was never able to raise all of the funds I needed for that venture. The funds I raised separately from the CAIS grant are being held until I can find another suitable opportunity to present my work.

Skin as art and as haptic device

I stumbled across an essay, Nano-Bio-Info-Cogno Skin by Natasha Vita-More, on the IEET (Institute for Ethics & Emerging Technologies) website, newly republished on Mar. 19, 2012. (The essay was originally published Jan. 19, 2009 on the Nanotechnology Now website.) No matter the date, it has proved quite timely in light of the application by Nokia (the Finnish telephone company) to patent magnetic tattoos. From the March 20, 2012 story on BBC News online, ‘Vibrating tattoo alerts patent filed by Nokia in US’,

Vibrating magnetic tattoos may one day be used to alert mobile phone users to phone calls and text messages if Nokia follows up a patent application.

The Finnish company has described the idea in a filing to the US Patent and Trademark Office.

It describes tattooing, stamping or spraying “ferromagnetic” material onto a user’s skin and then pairing it with a mobile device.

It suggests different vibrations could be used to create a range of alerts.

The application is dated March 15, 2012. From United States Patent Application no. 20120062371 (abstract),

1. An apparatus comprising: a material attachable to skin, the material capable of detecting a magnetic field and transferring a perceivable stimulus to the skin, wherein the perceivable stimulus relates to the magnetic field.

2. An apparatus according to claim 1, wherein the material comprises at least one of a visible image, invisible image, invisible tattoo, visible tattoo, visible marking, invisible marking, visible marker, visible sign, invisible sign, visible label, invisible label, visible symbol, invisible symbol, visible badge and invisible badge.

3. An apparatus according to claim 1, wherein the perceivable stimulus comprises vibration.

4. An apparatus according to claim 1, wherein the magnetic field originates from an electronic device and relates to digital content stored in the electronic device.

5. An apparatus according to claim 1, wherein the perceivable stimulus is related to the magnetic field.

6. An apparatus according to claim 1, wherein the perceivable stimulus relates to a time variation of at least one of a magnetic field pulse, height, width and period.

7. An apparatus according to claim 1, wherein the magnetic field originates from a remote source.

8. An apparatus according to claim 7, wherein the perceivable stimulus relates to digital content of the remote source.

If you want the full listing, there are 13 more claims for a total of 21 in the application. Nokia’s initial plans are to create a material that you’d wear; the notion of tattoos arises later in the application, according to Vlad Bobleanta in his March 15, 2012 article for unwiredview.com. He describes the potential tattoos in some detail,

The tattoo would be applied using ferromagnetic inks. The ink material would first be exposed to high temperatures to demagnetize it. Then the tattoo would be applied. You’ll apparently be able to choose the actual image you want as the tattoo. The procedure is identical to that of getting a ‘normal’ tattoo – only the ink is special.

After the tattoo has been applied, you’ll need to magnetize it. That means bringing the tattooed area in the close proximity of an external magnet, and going “several times over this magnet to magnetize the image material again”. The tattoo will then have enhanced sensitivity towards external alternating magnet fields, and will basically function the same way the aforementioned material attached to your skin did. Only in a more permanent fashion, so to speak.

I suggest reading Bobleanta’s article as he includes diagrams of the proposed tattoo, fabric, and fingernail applications. Yes, this could be attached to your fingernails.
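Claim 6 of the application ties the perceivable stimulus to the time variation of pulse height, width, and period, which suggests how a phone might encode different alerts as different magnetic pulse trains for the tattoo to convert into felt vibration. Here is a minimal, purely hypothetical sketch of such an encoding; the alert names and pulse values are invented for illustration and come from neither the patent nor Nokia:

```python
# Toy sketch of the alert scheme suggested by the patent's claims:
# each alert type maps to a distinct train of magnetic-field pulses
# (varying in strength, width, and spacing), which the magnetized
# tattoo would render as a distinct vibration pattern on the skin.
# All names and numbers below are hypothetical.

ALERT_PULSE_PATTERNS = {
    # alert type: list of (field_strength, pulse_ms, gap_ms) triples
    "incoming_call": [(1.0, 200, 100)] * 3,   # three long, strong pulses
    "text_message":  [(0.6, 80, 80)] * 2,     # two short, softer pulses
    "low_battery":   [(0.4, 40, 300)],        # a single faint blip
}

def encode_alert(alert_type):
    """Return the pulse train a device would emit for a given alert."""
    if alert_type not in ALERT_PULSE_PATTERNS:
        raise ValueError(f"unknown alert type: {alert_type}")
    return ALERT_PULSE_PATTERNS[alert_type]

def pattern_duration_ms(pulses):
    """Total time the wearer would feel the pattern, in milliseconds."""
    return sum(pulse_ms + gap_ms for _, pulse_ms, gap_ms in pulses)
```

The point of keeping the patterns distinguishable by strength, width, and spacing, rather than strength alone, is that skin resolves timing far better than small differences in intensity, which is presumably why the claim singles out time variation.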

Getting back to Vita-More’s essay, she was exploring the integration of nanotechnology, biotechnology, cognitive and neuro sciences (nano-bio-info-cogno- or NBIC) as applied to skin (from the essay),

NBIC is a far cry from the biological touch, taste and smell of our skin because it suggests a cold, mechanical and invasive integration. While the cognitive and neuro sciences are a bit more familiar from a biological viewpoint, they too suggest tampering with our thoughts and probing our privacy. Nonetheless, the enhancement of our human skin is not only lifesaving; it offers new textures, sensations and smells which will have their own sensorial capabilities. [emphasis mine]

“New sensorial capabilities” certainly evokes Nokia’s proposed magnetic tattoo. She also comments from an artist’s perspective,

What does this mean for designers and media artists? From the perspective of my own artistic practice, it means that it is natural that humans integrate with other types of organisms, that we will evolve with other types of systems, and that this evolution is essential for our future.

The idea of fusing skin with technology is not new, as you can see from Vita-More’s essay and countless science fiction stories; as well, there’s research of this kind being done globally. For example, there’s research on electronic tattoos, as I noted in my Aug. 12, 2011 posting (and you can find more references elsewhere online). However, these magnetic tattoos represent the first time I’ve seen interest from a commercial enterprise.

Nanocellulose as scaffolding for nerve cells

Swedish scientists have announced success with growing nerve cells using nanocellulose as the scaffolding. From the March 19, 2012 news item on Nanowerk,

Researchers from Chalmers and the University of Gothenburg have shown that nanocellulose stimulates the formation of neural networks. This is the first step toward creating a three-dimensional model of the brain. Such a model could elevate brain research to totally new levels, with regard to Alzheimer’s disease and Parkinson’s disease, for example.

“This has been a great challenge,” says Paul Gatenholm, Professor of Biopolymer Technology at Chalmers. “Until recently the cells were dying after a while, since we weren’t able to get them to adhere to the scaffold. But after many experiments we discovered a method to get them to attach to the scaffold by making it more positively charged. Now we have a stable method for cultivating nerve cells on nanocellulose.”

When the nerve cells finally attached to the scaffold, they began to develop and generate contacts with one another, so-called synapses. A neural network of hundreds of cells was produced. The researchers can now use electrical impulses and chemical signal substances to generate nerve impulses that spread through the network in much the same way as they do in the brain. They can also study how nerve cells react with other molecules, such as pharmaceuticals.

I found the original March 19, 2012 press release and an image on the Chalmers website,

Nerve cells growing on a three-dimensional nanocellulose scaffold. One of the applications the research group would like to study is destruction of synapses between nerve cells, which is one of the earliest signs of Alzheimer’s disease. Synapses are the connections between nerve cells. In the image, the functioning synapses are yellow and the red spots show where synapses have been destroyed. Illustration: Philip Krantz, Chalmers

This latest research from Gatenholm and his team will be presented at the American Chemical Society annual meeting in San Diego on March 25, 2012.

The research team from Chalmers University and its partners are working on other applications for nanocellulose including one for artificial ears. From the Chalmers University Jan. 22, 2012 press release,

As the first group in the world, researchers from Chalmers will build up body parts using nanocellulose and the body’s own cells. Funding will be from the European network for nanomedicine, EuroNanoMed.

Professor Paul Gatenholm at Chalmers is leading and co-ordinating this European research programme, which will construct an outer ear using nanocellulose and a mixture of the patient’s own cartilage cells and stem cells.

Previously, Paul Gatenholm and his colleagues succeeded, in close co-operation with Sahlgrenska University Hospital, in developing artificial blood vessels using nanocellulose, where small bacteria “spin” the cellulose.

In the new programme, the researchers will build up a three-dimensional nanocellulose network that is an exact copy of the patient’s healthy outer ear and construct an exact mirror image of the ear. It will have sufficient mechanical stability for it to be used as a bioreactor, which means that the patient’s own cartilage and stem cells can be cultivated directly inside the body or on the patient, in this case on the head. [Presumably the patient has one ear that is healthy and the researchers are attempting to repair or replace an unhealthy ear on the other side of the head.]

As for the Swedish perspective on nanocellulose (from the 2010 press release),

Cellulose-based material is of strategic significance to Sweden and materials science is one of Chalmers eight areas of advance. Biopolymers are highly interesting as they are renewable and could be of major significance in the development of future materials.

Further research into using the forest as a resource for new materials is continuing at Chalmers within the new research programme that is being built up with different research groups at Chalmers and Swerea – IVF. The programme is part of the Wallenberg Wood Science Center, which is being run jointly by the Royal Institute of Technology in Stockholm and Chalmers under the leadership of Professor Lars Berglund at the Royal Institute of Technology.

The 2012 press release announcing the work on nerve cells had this about nanocellulose,

Nanocellulose is a material that consists of nanosized cellulose fibers. Typical dimensions are widths of 5 to 20 nanometers and lengths of up to 2,000 nanometers. Nanocellulose can be produced by bacteria that spin a close-meshed structure of cellulose fibers. It can also be isolated from wood pulp through processing in a high-pressure homogenizer.

I last wrote about the Swedes and nanocellulose in a Feb. 15, 2012 posting about recovering it (nanocellulose) from wood-based sludge.

As for anyone interested in the Canadian scene, there is an article by David Manly in the Jan.-Feb. 2012 issue of Canadian Biomass Magazine that focuses largely on economic impacts and value-added products as they pertain to nanocellulose production in Canada. You can also search this blog, as I have covered the nanocellulose story in Canada and elsewhere as extensively as I can.

Monkeys, mind control, robots, prosthetics, and the 2014 World Cup (soccer/football)

The idea that a monkey in the US could control a robot’s movements in Japan is stunning. Even more stunning is the fact that the research is four years old. It was discussed publicly in a Jan. 15, 2008 article by Sharon Gaudin for Computer World,

Scientists in the U.S. and Japan have successfully used a monkey’s brain activity to control a humanoid robot — over the Internet.

This research may only be a few years away from helping paralyzed people walk again by enabling them to use their thoughts to control exoskeletons attached to their bodies, according to Miguel Nicolelis, a professor of neurobiology at Duke University and lead researcher on the project.

“This is an attempt to restore mobility to people,” said Nicolelis. “We had the animal trained to walk on a treadmill. As it walked, we recorded its brain activity that generated its locomotion pattern. As the animal was walking and slowing down and changing his pattern, his brain activity was driving a robot in Japan in real time.”

This video clip features an animated monkey simulating control of a real robot in Japan (the Computational Brain Project of the Japan Science and Technology Agency (JST) in Kyoto partnered with Duke University for this project),

I wonder if the Duke researchers or communications staff thought that the sight of real rhesus monkeys on treadmills might be too disturbing. While we’re on the topic of simulation, I wonder where the robot in the clip actually resides. Quibbles about the video clip aside, I have no doubt that the research took place.

There’s a more recent (Oct. 5, 2011) article about the work being done in Nicolelis’ laboratory at Duke University, by Ed Yong for Discover Magazine (previously mentioned in my Oct. 6, 2011 posting),

This is where we are now: at Duke University, a monkey controls a virtual arm using only its thoughts. Miguel Nicolelis had fitted the animal with a headset of electrodes that translates its brain activity into movements. It can grab virtual objects without using its arms. It can also feel the objects without its hands, because the headset stimulates its brain to create the sense of different textures. Monkey think, monkey do, monkey feel – all without moving a muscle.
And this is where Nicolelis wants to be in three years: a young quadriplegic Brazilian man strolls confidently into a massive stadium. He controls his four prosthetic limbs with his thoughts, and they in turn send tactile information straight to his brain. The technology melds so fluidly with his mind that he confidently runs up and delivers the opening kick of the 2014 World Cup.

This sounds like a far-fetched dream, but Nicolelis – a big soccer fan – is talking to the Brazilian government to make it a reality.

According to Yong, Nicolelis has created an international consortium to support the Walk Again Project. From the project home page,

The Walk Again Project, an international consortium of leading research centers around the world represents a new paradigm for scientific collaboration among the world’s academic institutions, bringing together a global network of scientific and technological experts, distributed among all the continents, to achieve a key humanitarian goal.

The project’s central goal is to develop and implement the first BMI [brain-machine interface] capable of restoring full mobility to patients suffering from a severe degree of paralysis. This lofty goal will be achieved by building a neuroprosthetic device that uses a BMI as its core, allowing the patients to capture and use their own voluntary brain activity to control the movements of a full-body prosthetic device. This “wearable robot,” also known as an “exoskeleton,” will be designed to sustain and carry the patient’s body according to his or her mental will.

In addition to proposing to develop new technologies that aim at improving the quality of life of millions of people worldwide, the Walk Again Project also innovates by creating a complete new paradigm for global scientific collaboration among leading academic institutions worldwide. According to this model, a worldwide network of leading scientific and technological experts, distributed among all the continents, come together to participate in a major, non-profit effort to make a fellow human being walk again, based on their collective expertise. These world renowned scholars will contribute key intellectual assets as well as provide a base for continued fundraising capitalization of the project, setting clear goals to establish fundamental advances toward restoring full mobility for patients in need.

It’s the exoskeleton described on the Walk Again Project home page that Nicolelis is hoping will enable a young Brazilian quadriplegic to deliver the opening kick for the 2014 World Cup (soccer/football) in Brazil.

Advertising for the 21st Century: B-Reel, ‘storytelling’, and mind control

Erin Schulte at Fast Company introduced me to B-Reel, a digital production company, via her Sept. 30, 2011 posting,

Though Swedish hybrid production company B-Reel has been around since 1999, merging film, interactive, games, and mobile to create new methods of storytelling, it exploded into the broader consciousness with 2010’s “The Wilderness Downtown.”

The interactive short film dreamed up by Chris Milk and the band Arcade Fire for its song “We Used To Wait” is a Gen-Y paean of childhood nostalgia, where the singer pines for a simpler, not-so-far away yesteryear where people wrote love letters on paper and anxiously awaited the arrival of an envelope in return.

Here’s a description (followed by B-Reel’s promotional video) of the Wilderness Downtown project, which was initiated by Google, from the company website,

Featuring Arcade Fire’s new single “We Used To Wait” from their latest album The Suburbs, The Wilderness Downtown is an interactive music video built in HTML 5, using Google Maps and Street-view for Google Chrome Experiments. The film takes an intimate approach by prompting users to input an address from their childhood which then places them at the center of the film’s narrative. Viewers see themselves in the film as they run through the streets of their old neighborhood and finally reach their childhood home. This is tied very closely to the song’s lyrics to make for a powerful emotional experience.

Here’s the video,

The making of the Wilderness Downtown. from B-Reel & B-Reel Films on Vimeo.

A subtle form of advertising for Google, this showcases some of the more innovative approaches that B-Reel takes to its work.

I did watch the Fast Company video interview with Anders Wahlquist, B-Reel Chief Executive Officer, which is included with Schulte’s posting, and he mentions that he founded the company with the intention of combining filmmaking, storytelling, and interactivity. It’s interesting how often the words ‘storytelling’ and ‘story’ are used in the service of advertising and marketing, often to replace those very words, i.e., it’s no longer about advertising; it’s about telling your story, or possibly it’s about mind control. From the July 21, 2011 posting on the B-Reel website,

From B-Reel’s secret laboratory comes a brain-bending experimental project utilising a number of cutting edge tech tools. B-Reel’s UK creative director Riccardo Giraldi led the development of the project, and you can view the explanatory video here, as well as some of the creative musings in a write up below.

The idea is quite simple.

What if you could run a slot car race using your brain?

We did a bit of research on this and it didn’t take long to realise we already have all we need to make these ideas come to life; we just needed to connect the dots and find an easier way to integrate different disciplines to make the magic happen.

These are the steps B-Reel went through:

– researched components and libraries we could use

– procured a device that reads mind signals, a Scalextric, Arduino, some tools and electric components

– designed a small electronic circuit to connect Arduino to Scalextric

– wrote the Arduino script to control the Scalextric

– wrote a small Processing application to control the car with the computer mouse

– connected the brain reader device signal to the Scalextric

There are a few commercial devices that claim to safely read your brain signals. We ended up choosing the Mindwave headset from Neurosky for this experiment because of its unobtrusive design and its affordable price.

Then we got a basic version Scalextric and started to play around with it. Slot cars are awesome. Digital is already the past – tangible is the future. The principle is straightforward: there are two cars on separate tracks that you can control with a handset. The more current you let pass through the handset, the faster the cars go.
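The steps B-Reel lists can be sketched in code. Here is a minimal, hypothetical Python sketch of the core control mapping: the Mindwave does report an “attention” value from 0 to 100, and Arduino PWM output is 8-bit (0 to 255), but the function name, the threshold, and the linear scaling are my illustrative assumptions, not B-Reel’s actual code,

```python
def attention_to_pwm(attention, threshold=40):
    """Map a Mindwave attention reading (0-100) to a PWM duty cycle (0-255).

    The Scalextric car's speed depends on how much current the Arduino
    lets through; here that is modelled as an 8-bit PWM value. Below
    `threshold` (an assumed dead zone) the car stays stopped, so stray
    low-level readings don't creep it forward; above it, speed scales
    linearly up to full throttle at attention 100.
    """
    if not 0 <= attention <= 100:
        raise ValueError("attention must be between 0 and 100")
    if attention < threshold:
        return 0
    # Scale the usable range (threshold..100) onto the full PWM range.
    return round((attention - threshold) / (100 - threshold) * 255)

if __name__ == "__main__":
    # An Arduino script would receive these values over serial and drive
    # the track; here we just print the mapping for a few readings.
    for reading in (10, 40, 70, 100):
        print(reading, attention_to_pwm(reading))
```

A dead zone like this matters in practice: consumer EEG headsets produce noisy readings, so without a threshold the car would twitch constantly.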

Here’s the ‘mind control’ video,

B-Reel Performs Mind Tricks from B-Reel & B-Reel Films on Vimeo.

I wrote about rats with robotic brains and monkeys (at Duke University in the US) that control robots in Japan with their thoughts in my Oct. 4, 2011 posting. I find the resemblance between those projects and B-Reel’s disconcertingly close, and I have to admit I would not have considered advertising applications at this stage of the technology’s development.

If you are interested in more about mind control projects, Ed Yong at his Not Exactly Rocket Science blog (on the Discover blog network) has written an Oct. 5, 2011 posting titled, Monkeys grab and feel virtual objects with thoughts alone (and what this means for the World Cup). Excerpted from the posting,

This is where we are now: at Duke University, a monkey controls a virtual arm using only its thoughts. Miguel Nicolelis had fitted the animal with a headset of electrodes that translates its brain activity into movements. It can grab virtual objects without using its arms. It can also feel the objects without its hands, because the headset stimulates its brain to create the sense of different textures. Monkey think, monkey do, monkey feel – all without moving a muscle.
And this is where Nicolelis wants to be in three years: a young quadriplegic Brazilian man strolls confidently into a massive stadium. He controls his four prosthetic limbs with his thoughts, and they in turn send tactile information straight to his brain. The technology melds so fluidly with his mind that he confidently runs up and delivers the opening kick of the 2014 World Cup.

This sounds like a far-fetched dream, but Nicolelis – a big soccer fan – is talking to the Brazilian government to make it a reality. He has created an international consortium called the Walk Again Project, consisting of non-profit research institutions in the United States, Brazil, Germany and Switzerland. Their goal is to create a “high performance brain-controlled prosthetic device that enables patients to finally leave the wheelchair behind.”

I’m not sure what the* intention was in 1999 when the company name, B-Reel, was chosen but today the wordplay has a haunting quality. Especially when you consider that mind control doesn’t necessarily mean people are in control. After all, there’s my Sept. 28, 2011 posting about full size vehicles titled Cars that read minds? If you notice, the researcher at B-Reel has to shift his brain function in order to exert control, so who’s in charge, the researcher or the model car? Extending that question, will you have to change your mind so the car can read it?

* ‘the’ added May 15, 2014.

Step closer to integrating electronics into the body

The Sept. 20, 2011 news item (Proton-based transistor could let machines communicate with living things) on Nanowerk features a rather interesting development,

Human devices, from light bulbs to iPods, send information using electrons. Human bodies and all other living things, on the other hand, send signals and perform work using ions or protons.

Materials scientists at the University of Washington have built a novel transistor that uses protons, creating a key piece for devices that can communicate directly with living things.

Here’s a diagram from the University of Washington Sept. 20, 2011 article about the proton transistor by Hannah Hickey,


On the left is a colored photo of the UW device overlaid on a graphic of the other components. On the right is a magnified image of the chitosan fibers. The white scale bar is 200 nanometers. (Marco Rolandi, UW)

Here’s a little more about the proton transistor (from the Hickey article),

In the body, protons activate “on” and “off” switches and are key players in biological energy transfer. Ions open and close channels in the cell membrane to pump things in and out of the cell. Animals including humans use ions to flex their muscles and transmit brain signals. A machine that was compatible with a living system in this way could, in the short term, monitor such processes. Someday it could generate proton currents to control certain functions directly.

A first step toward this type of control is a transistor that can send pulses of proton current. The prototype device is a field-effect transistor, a basic type of transistor that includes a gate, a drain and a source terminal for the current. The UW prototype is the first such device to use protons. It measures about 5 microns wide, roughly a twentieth the width of a human hair.

As for the device (from the Hickey article),

The device uses a modified form of the compound chitosan originally extracted from squid pen, a structure that survives from when squids had shells. The material is compatible with living things, is easily manufactured, and can be recycled from crab shells and squid pen discarded by the food industry.

There is a minor Canadian connection,

Computer models of charge transport developed by co-authors M.P. Anantram, a UW professor of electrical engineering, and Anita Fadavi Roudsari at Canada’s University of Waterloo, were a good match for the experimental results.

If I understand this correctly, the computer models were confirmed by the experimental results, which means they can be used, with a fair degree of confidence, to augment expensive experiments.

I am finding this integration of electronics into the body both fascinating and disturbing as per my paper, Whose electric brain? More about that when I have more time.

Eye, arm, & leg prostheses, cyborgs, eyeborgs, Deus Ex, and ableism

Companies are finding more ways to publicize and promote themselves and their products. For example, there’s Intel, which seems to have been especially active lately with its Tomorrow Project (my August 22, 2011 posting) and its sponsorship (being one of only four companies to do so) of the Discovery Channel’s Curiosity television programme (my July 15, 2011 posting). What I find interesting in these efforts is their range and the use of old and new techniques.

Today I found (via an August 30, 2011 article by Nancy Owano) a documentary made by Rob Spence, Canadian filmmaker and eyeborg, for the recently released Deus Ex: Human Revolution game (both the game and Spence are mentioned in my August 18, 2011 posting) from the company, Eidos Montréal. If you’re squeamish (a medical operation is featured), you might want to miss the first few minutes,

I found it quite informative but curiously US-centric. How could they discuss prostheses for the legs and not mention Oscar Pistorius, the history-making South African double amputee runner who successfully petitioned the Court of Arbitration for Sport for the right to compete with able-bodied athletes? (In July this year, Pistorius qualified for the 2012 Olympics.) By the way, they do mention the Icelandic company, Össur, which created Pistorius’ “cheetah” legs. (There’s more about Pistorius and human enhancement in my Feb. 2, 2010 posting. [scroll down about 1/3 of the way])

There’s some very interesting material about augmented reality masks for firefighters in this documentary. Once functional and commercially available, the masks would give firefighters information about toxic gases, temperature, etc. as they move through a burning building. There’s a lot of interest in making augmented reality commercially available via smartphones as Kit Eaton notes in an August 29, 2011 article for Fast Company,

Junaio’s 3.0 release is a big transformation for the software–it included limited object recognition powers for about a year, but the new system is far more sophisticated. As well as relying on the usual AR sensor suite of GPS (to tell the software where the smartphone is on the planet), compass, and gyros to work out what angle the phone’s camera is looking, it also uses feature tracking to give it a better idea of the objects in its field of view. As long as one of Junaio’s channels or databases or the platforms of its developer partners has information on the object, it’ll pop up on screen.

When it recognizes a barcode, for example, the software “combines and displays data sources from various partner platforms to provide useful consumer information on a given product,” which can be a “website, a shopping micro-site or other related information” such as finding recipes based on the ingredients. It’s sophisticated enough so you can scan numerous barcoded items from your fridge and add in extras like “onions” and then get it to find a recipe that uses them.
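The fridge-to-recipe flow Eaton describes can be illustrated with a small sketch. This is a hypothetical Python example with an invented recipe table, not Junaio’s actual code: scanned barcode items plus typed extras like “onions” are pooled into a pantry set, and any recipe whose ingredients are all on hand is returned,

```python
# Invented example data standing in for a partner platform's recipe database.
RECIPES = {
    "french onion soup": {"onions", "beef stock", "bread", "cheese"},
    "omelette": {"eggs", "butter", "cheese"},
    "tomato salad": {"tomatoes", "onions", "olive oil"},
}

def find_recipes(scanned_items, extras=()):
    """Return names of recipes whose ingredients are all in the pantry.

    `scanned_items` stands in for products identified by barcode scans;
    `extras` stands in for ingredients the user types in by hand.
    """
    pantry = set(scanned_items) | set(extras)
    # A recipe qualifies when its ingredient set is a subset of the pantry.
    return sorted(name for name, needed in RECIPES.items()
                  if needed <= pantry)

if __name__ == "__main__":
    print(find_recipes({"eggs", "butter", "cheese", "tomatoes", "olive oil"},
                       extras=["onions"]))
```

The subset test (`needed <= pantry`) is the whole trick: recognition turns physical objects into set members, and the “sophistication” is in the recognition layer, not the lookup.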

Eaton notes that people might have an objection to holding up their smartphones for long periods of time. That’s a problem that could be solved of course if we added a prosthetic to the eye or replaced an organic eye with a bionic eye as they do in the game and as they suggest in the documentary.

Not everyone is quite so sanguine about this bright new future. I featured a documentary, Fixed, about some of the discussion regarding disability, ability, and human enhancement in my August 3, 2010 posting. One of the featured academics is Gregor Wolbring, assistant professor, Dept of Community Health Sciences, Program in Community Rehabilitation and Disability Studies, University of Calgary; and president of the Canadian Disability Studies Association.  From Gregor’s June 17, 2011 posting on the FedCan blog,

The term ableism evolved from the disabled people rights movements in the United States and Britain during the 1960s and 1970s.  It questions and highlights the prejudice and discrimination experienced by persons whose body structure and ability functioning were labelled as ‘impaired’, as sub species-typical. Ableism of this flavor is a set of beliefs, processes and practices, which favors species-typical normative body structure based abilities. It labels ‘sub-normative’ species-typical biological structures as ‘deficient’, as not able to perform as expected.

The disabled people rights discourse and disability studies scholars question the assumption of deficiency intrinsic to ‘below the norm’ labeled body abilities and the favoritism for normative species-typical body abilities. The discourse around deafness and Deaf Culture would be one example where many hearing people expect the ability to hear. This expectation leads them to see deafness as a deficiency to be treated through medical means. In contrast, many Deaf people see hearing as an irrelevant ability and do not perceive themselves as ill and in need of gaining the ability to hear. Within the disabled people rights framework ableism was set up as a term to be used like sexism and racism to highlight unjust and inequitable treatment.

Ableism is, however, much more pervasive.

Ableism based on biological structure is not limited to the species-typical/ sub species-typical dichotomy. With recent science and technology advances, and envisioned advances to come, we will see the dichotomy of people exhibiting species-typical and the so-called sub species-typical abilities labeled as impaired, and in ill health. On the other side we will see people exhibiting beyond species-typical abilities as the new expectation norm. An ableism that favours beyond species-typical abilities over species-typical and sub species-typical abilities will enable a change in meaning and scope of concepts such as health, illness, rehabilitation, disability adjusted life years, medicine, health care, and health insurance. For example, one will only be labeled as healthy if one has received the newest upgrade to one’s body – meaning one would by default be ill until one receives the upgrade.

Here’s an excerpt from my Feb. 2, 2010 posting which reinforces what Gregor is saying,

This influx of R&D cash, combined with breakthroughs in materials science and processor speed, has had a striking visual and social result: an emblem of hurt and loss has become a paradigm of the sleek, modern, and powerful. Which is why Michael Bailey, a 24-year-old student in Duluth, Georgia, is looking forward to the day when he can amputate the last two fingers on his left hand.

“I don’t think I would have said this if it had never happened,” says Bailey, referring to the accident that tore off his pinkie, ring, and middle fingers. “But I told Touch Bionics I’d cut the rest of my hand off if I could make all five of my fingers robotic.” [originally excerpted from Paul Hochman’s Feb. 1, 2010 article, Bionic Legs, i-Limbs, and Other Super Human Prostheses You’ll Envy for Fast Company]

I don’t really know how to take the fact that the documentary is in fact product placement for the game, Deus Ex: Human Revolution. On the up side, it opens up a philosophical discussion in a very engaging way. On the down side, it closes down the discussion because drawbacks are not seriously mentioned.

God from the machine: Deus ex machina and augmentation

Wherever you go, there it is: ancient Greece. Deus Ex, a game series from Eidos Montréal, is likely referencing ‘deus ex machina’, a term applied to a theatrical device (in both senses of the word) attributed to playwrights of ancient Greece. (For anyone who’s unfamiliar with the term, at the end of a play, all of the conflicts would be resolved by a god descending from the heavens. The term refers both to the plot device itself and to the mechanical device used to lower the ‘god’.)

The latest game in the series, Deus Ex: Human Revolution, a role-playing shooter, will be released August 23, 2011. From the August 16, 2011 article by Susan Karlin for Fast Company,

The result—Deus Ex: Human Revolution, a role-playing shooter that comes out August 23–extrapolates MicroTransponder, prosthetics, robotics, and other current augmentation technology into a vision of how technologically enhanced people might gain superhuman abilities and at what cost.

… “We built a timeline that traces the history of augmentation, creating new things, and predicting how would it get out into society. We wanted to ground it in today, and make something where everyone could say, ‘I can see the world going that way.'” [Mary DeMarle, Human Revolution’s lead writer]

Human Revolution, although the third in the series, is a prequel to the original Deus Ex, which took place 25 years after Human Revolution.

I’m glad to see games that bring up interesting philosophical questions and possible social impacts of emerging technologies along with the action. In a February 3, 2011 interview with Mary DeMarle, Quintin Smith of Rock, Paper, Shotgun posed these questions,

RPS: Finally, with anti-augmentation groups featuring in Human Revolution, I was just wondering what your own opinions on human augmentation and human bioengineering are.

MD: Oh, gosh. Well I have to tell you that the joke on the team is that for the duration of this story I’d be supporting the anti-technology view, because most people on the team wouldn’t be anti-technology, and it’d help me make the game more human, you know? And now that the project’s over I bought my first iPad, and I have to admit I’m suddenly like “You know, if I could get one of those InfoLinks in my head, it’d be really useful.”

But you know, all of this stuff is already out there. We already have people putting cameras in their eyes to improve their vision. [emphasis mine] The technology’s there, we’re just not aware of it. As far as our team’s technology expert is concerned, human augmentation’s been going on for decades. If you look at all the sports controversy regarding drugs, that is augmentation. It’s already happening.

RPS: But you have no qualms with our using technology to make ourselves more than we can be?

MD: From my perspective, I think mankind will always try to be more than he is. That’s part of being human. But I do admit we have to be careful about how we do it.

In my February 2, 2010 posting (scroll down about 1/2 way), I featured a quote that resonates with DeMarle’s comments about humans trying to be more,

“I don’t think I would have said this if it had never happened,” says Bailey, referring to the accident that tore off his pinkie, ring, and middle fingers. “But I told Touch Bionics I’d cut the rest of my hand off if I could make all five of my fingers robotic.”

Bailey went on to say that having machinery incorporated into his body made him feel “above human”.

As for cameras being implanted in eyes to improve vision, I would be delighted to hear from anyone who has information about this. The only project I could find in my search was EyeBorg, a project with a one-eyed Canadian filmmaker who was planning to have a video camera implanted into his eye socket to record images. From the About the Project page,

Take a one eyed film maker, an unemployed engineer, and a vision for something that’s never been done before and you have yourself the EyeBorg Project. Rob Spence and Kosta Grammatis are trying to make history by embedding a video camera and a transmitter in a prosthetic eye. That eye is going in Rob’s eye socket, and will record the world from a perspective that’s never been seen before.

There are more details about the EyeBorg project in a June 11, 2010 posting by Tim Hornyak for the Automaton blog (on the IEEE [Institute of Electrical and Electronics Engineers] website),

When Canadian filmmaker Rob Spence was a kid, he would peer through the bionic eye of his Six Million Dollar Man action figure. After a shooting accident left him partially blind, he decided to create his own electronic eye. Now he calls himself Eyeborg.

Spence’s bionic eye contains a battery-powered, wireless video camera. Not only can he record everything he sees just by looking around, but soon people will be able to log on to his video feed and view the world through his right eye.

I don’t know how the Eyeborg project is proceeding as there haven’t been any updates on the site since August 25, 2010.

While I wish Quintin Smith had asked for more details about the science information DeMarle was passing on in the February 3, 2011 interview, I think it’s interesting to note that information about science and technology comes to us in many ways: advertisements, popular television programmes, comic books, interviews, and games, as well as, formal public science outreach programmes through museums and educational institutions.

ETA August 19, 2011: I found some information about visual prosthetics at the European Commission’s Future and Emerging Technologies (FET) website, We can rebuild you page featuring a TEDxVienna November 2010 talk by electrical engineer, Grégoire Cosendai, from the Swiss Federal Institute of Technology. He doesn’t mention the prosthetics until approximately 13 minutes, 25 seconds into the talk. The work is being done to help people with retinitis pigmentosa, a condition that is incurable at this time, but it may have implications for others. There are 30 people worldwide in a clinical trial testing a retinal implant that requires the person to wear special glasses containing a camera and an antenna. For Star Trek fans, this seems similar to Geordi LaForge‘s special glasses.

ETA Sept. 13, 2011: Better late than never, here’s an excerpt from Dexter Johnson’s Sept. 2, 2011 posting (on his Nanoclast blog at the Institute of Electrical and Electronics Engineers [IEEE] website) about a nano retina project,

The Israel-based company [Nano Retina] is a joint venture between Rainbow Medical and Zyvex Labs, the latter being well known for its work in nanotechnology and its founder Jim Von Ehr, who has been a strong proponent of molecular mechanosynthesis.

It’s well worth contrasting the information in the company video that Dexter provides with the information in the FET video mentioned in the Aug. 19, 2011 update preceding this one. The company makes a vastly more optimistic claim for the vision these implants will provide than one would expect after viewing the FET video’s information about the clinical trials currently under way for another, similar (to me) system.