
University of Waterloo researchers use 2.5M (virtual) neurons to simulate a brain

I hinted at some related work at the University of Waterloo earlier this week in my Nov. 26, 2012 posting (Existential risk) about a proposed centre at the University of Cambridge which would be tasked with examining possible risks associated with ‘ultra intelligent machines’. Today, Science (magazine) published an article [behind a paywall] about Spaun (Semantic Pointer Architecture Unified Network) and its ability to solve simple arithmetic problems and perform other tasks.

Ed Yong, writing for Nature magazine (Simulated brain scores top test marks, Nov. 29, 2012), offers this description,

Spaun sees a series of digits: 1 2 3; 5 6 7; 3 4 ?. Its neurons fire, and it calculates the next logical number in the sequence. It scrawls out a 5, in legible if messy writing.

This is an unremarkable feat for a human, but Spaun is actually a simulated brain. It contains 2.5 million virtual neurons — many fewer than the 86 billion in the average human head, but enough to recognize lists of numbers, do simple arithmetic and solve reasoning problems.

Here’s a video demonstration, from the University of Waterloo’s Nengo Neural Simulator home page,

The University of Waterloo’s Nov. 29, 2012 news release offers more technical detail,

… The model captures biological details of each neuron, including which neurotransmitters are used, how voltages are generated in the cell, and how they communicate. Spaun uses this network of neurons to process visual images in order to control an arm that draws Spaun’s answers to perceptual, cognitive and motor tasks. …

“This is the first model that begins to get at how our brains can perform a wide variety of tasks in a flexible manner—how the brain coordinates the flow of information between different areas to exhibit complex behaviour,” said Professor Chris Eliasmith, Director of the Centre for Theoretical Neuroscience at Waterloo. He is Canada Research Chair in Theoretical Neuroscience, and professor in Waterloo’s Department of Philosophy and Department of Systems Design Engineering.

Unlike other large brain models, Spaun can perform several tasks. Researchers can show patterns of digits and letters to the model’s eye, which it then processes, causing it to write its responses to any of eight tasks. And, just like the human brain, it can shift from task to task, recognizing an object one moment and memorizing a list of numbers the next. [emphasis mine] Because of its biological underpinnings, Spaun can also be used to understand how changes to the brain affect changes to behaviour.

“In related work, we have shown how the loss of neurons with aging leads to decreased performance on cognitive tests,” said Eliasmith. “More generally, we can test our hypotheses about how the brain works, resulting in a better understanding of the effects of drugs or damage to the brain.”

In addition, the model provides new insights into the sorts of algorithms that might be useful for improving machine intelligence. [emphasis mine] For instance, it suggests new methods for controlling the flow of information through a large system attempting to solve challenging cognitive tasks.
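For anyone curious about the ‘biological details of each neuron’ mentioned in the release: Spaun is built in the Nengo simulator (linked above), whose basic unit is a simplified spiking neuron. Here’s a rough sketch (my own, purely illustrative, and not the project’s actual code) of a leaky integrate-and-fire neuron, the kind of simplified model such simulators typically use; the parameter values are assumptions chosen only to produce plausible firing rates.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: an illustrative sketch only,
# not code from Spaun or Nengo. Parameter values are arbitrary but plausible.
tau_m = 0.02       # membrane time constant (s)
tau_ref = 0.002    # refractory period (s)
v_thresh = 1.0     # spike threshold (normalized voltage)
dt = 0.001         # simulation time step (s)

def simulate_lif(input_current, duration=1.0):
    """Return spike times for a constant input current."""
    steps = int(duration / dt)
    v = 0.0                 # membrane voltage
    refractory_until = 0.0  # time until which the neuron cannot spike
    spikes = []
    for step in range(steps):
        t = step * dt
        if t < refractory_until:
            continue
        # Leaky integration: voltage decays toward rest and is driven by input.
        v += (-v + input_current) * (dt / tau_m)
        if v >= v_thresh:
            spikes.append(t)
            v = 0.0                       # reset after a spike
            refractory_until = t + tau_ref
    return spikes

if __name__ == "__main__":
    # A stronger input current produces a higher firing rate.
    for current in (1.2, 2.0, 4.0):
        rate = len(simulate_lif(current)) / 1.0
        print(f"input {current:.1f} -> {rate:.0f} spikes/s")
```

The point of Spaun isn’t the individual neuron model, which is deliberately simple, but the architecture that coordinates the flow of information between millions of units like this one.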

Laura Sanders’ Nov. 29, 2012 article for ScienceNews suggests there is some controversy over whether or not Spaun really resembles a human brain,

… Henry Markram, who leads a different project to reconstruct the human brain called the Blue Brain, questions whether Spaun really captures human brain behavior. Because Spaun’s design ignores some important neural properties, it’s unlikely to reveal anything about the brain’s mechanics, says Markram, of the Swiss Federal Institute of Technology in Lausanne. “It is not a brain model.”

Personally, I have a little difficulty seeing lines of code as ever being able to truly simulate brain activity. I think the notion of moving to something simpler (using fewer neurons as the Eliasmith team does) is a move in the right direction but I’m still more interested in devices such as the memristor and the electrochemical atomic switch and their potential.

Blue Brain Project

Memristor and artificial synapses in my April 19, 2012 posting

Atomic or electrochemical atomic switches and neuromorphic engineering briefly mentioned (scroll 1/2 way down) in my Oct. 17, 2011 posting.

ETA Dec. 19, 2012: There was an AMA (ask me anything) session on Reddit with the Spaun team in early December; if you’re interested, you can still access the questions and answers,

We are the computational neuroscientists behind the world’s largest functional brain model

Zombies, brains, collapsing boundaries, and entanglements at the 4th annual S.NET conference

My proposal, Zombies, brains, collapsing boundaries, and entanglements, for the 4th annual S.NET (Society for the Study of Nanoscience and Emerging Technologies) conference was accepted. Mentioned in my Feb. 9, 2012 posting, the conference will be held at the University of Twente (Netherlands) from Oct. 22 – 25, 2012.

Here’s the abstract I provided,

The convergence between popular culture’s current fascination with zombies and their appetite for human brains (first established in the 1985 movie, The Return of the Living Dead) and an extraordinarily high level of engagement in brain research by various medical and engineering groups around the world is no coincidence.

Amongst other recent discoveries, the memristor (a concept from nanoelectronics) is collapsing the boundaries between humans and machines/robots and ushering in an age where humanistic discourse must grapple with cognitive entanglements.

Perceptible only at the level of molecular electronics (nanoelectronics), the memristor was a theoretical concept until 2008. Traditionally in electrical engineering, there are three circuit elements: resistors, inductors, and capacitors. The new circuit element, the memristor, was postulated in a paper by Dr. Leon Chua in 1971 to account for anomalies that had been experienced and described in the literature since the 1950s.

According to Chua’s theory, since confirmed by the research team headed by R. Stanley Williams, the memristor remembers how much current has flowed through it and when. The memristor is capable of in-between states similar to certain brain states, and this capacity lends itself to learning. As some have described it, the memristor is a synapse on a chip, making neural computing a reality and/or raising the possibility of repairing brains stricken with neurological conditions. In other words, with post-human engineering exploiting discoveries such as the memristor, we will have machines/robots that can learn and think and human brains that could incorporate machines.

As Jacques Derrida used the zombie to describe a state that is neither life nor death, i.e., undecidable, the memristor can be described as an agent of transformation conferring on robots the ability to learn (a human trait), thereby rendering them undecidable, i.e., neither machine nor life. Mirroring its transformative agency in robots, the memristor could also confer machine/robot status, and undecidability, on the human brain when used for repair or enhancement.

The memristor moves us past Jacques Derrida’s notion of undecidability as largely theoretical to a world where we confront this reality in a type of cognitive entanglement on a daily basis.
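A brief technical aside on the memristor described in the abstract: in the linear ion-drift model published by the Williams (HP Labs) team, the device’s resistance depends on how much charge has already passed through it, which is the ‘memory’ I’m referring to. Here’s a rough sketch of that idea (my own illustration with made-up parameter values, not the HP model code),

```python
# Illustrative sketch of the HP Labs linear ion-drift memristor model.
# Parameter values are assumptions for demonstration, not measured device data.
R_ON, R_OFF = 100.0, 16000.0    # fully-doped / undoped resistances (ohms)
MU_V = 1e-14                    # ion mobility (m^2 s^-1 V^-1)
D = 1e-8                        # device thickness (m)
DT = 1e-4                       # time step (s)

def step(x, voltage):
    """Advance the normalized doped-region width x given an applied voltage."""
    resistance = R_ON * x + R_OFF * (1.0 - x)
    current = voltage / resistance
    # Linear drift: the boundary between doped and undoped regions moves
    # in proportion to the current, i.e. the device integrates its history.
    x += (MU_V * R_ON / D**2) * current * DT
    return min(max(x, 0.0), 1.0), resistance

if __name__ == "__main__":
    x = 0.1
    # Apply a positive voltage: resistance drops as charge flows through.
    for _ in range(2000):
        x, r = step(x, 1.0)
    print(f"after +1 V for 0.2 s: R = {r:.0f} ohms")
    # Remove the voltage: the state, and hence the resistance, is retained.
    for _ in range(2000):
        x, r = step(x, 0.0)
    print(f"after 0 V for 0.2 s:  R = {r:.0f} ohms (unchanged -> memory)")
```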

You can find the preliminary programme here.  My talk is scheduled for Thursday, Oct. 25, 2012 in one of the last sessions for the conference, 11 – 12:30 pm in the Tracing Transhuman Narratives strand.

I do see a few names I recognize: Wickson, Pat (Roy) Mooney and Youtie. I believe Wickson is Fern Wickson from the University of Bergen, last mentioned here in a July 7, 2010 posting about nature, nanotechnology, and metaphors. Pat Roy Mooney is from The ETC Group (an activist or civil society group) and was last mentioned here in my Oct. 7, 2011 posting, and I believe Youtie is Jan Youtie, who was mentioned in my March 29, 2012 posting about nanotechnology, economic impacts, and full life cycle assessments.

Brain, brains, brains: a roundup

I’ve decided to do a roundup of the various brain-related projects I’ve been coming across in the last several months. I was inspired by this article (Real-life Jedi: Pushing the limits of mind control) by Katia Moskvitch,

You don’t have to be a Jedi to make things move with your mind.

Granted, we may not be able to lift a spaceship out of a swamp like Yoda does in The Empire Strikes Back, but it is possible to steer a model car, drive a wheelchair and control a robotic exoskeleton with just your thoughts.

We are standing in a testing room at IBM’s Emerging Technologies lab in Winchester, England.

On my head is a strange headset that looks like a black plastic squid. Its 14 tendrils, each capped with a moistened electrode, are supposed to detect specific brain signals.

In front of us is a computer screen, displaying an image of a floating cube.

As I think about pushing it, the cube responds by drifting into the distance.

Moskvitch goes on to discuss a number of projects that translate thought into movement via various pieces of equipment before she mentions a project at Brown University (US) where researchers are implanting computer chips into brains,

Headsets and helmets offer cheap, easy-to-use ways of tapping into the mind. But there are other …

Imagine some kind of a wireless computer device in your head that you’ll use for mind control – what if people hacked into that?

At Brown Institute for Brain Science in the US, scientists are busy inserting chips right into the human brain.

The technology, dubbed BrainGate, sends mental commands directly to a PC.

Subjects still have to be physically “plugged” into a computer via cables coming out of their heads, in a setup reminiscent of the film The Matrix. However, the team is now working on miniaturising the chips and making them wireless.

The researchers are recruiting for human clinical trials, from the BrainGate Clinical Trials webpage,

Clinical Trials – Now Recruiting

The purpose of the first phase of the pilot clinical study of the BrainGate2 Neural Interface System is to obtain preliminary device safety information and to demonstrate the feasibility of people with tetraplegia using the System to control a computer cursor and other assistive devices with their thoughts. Another goal of the study is to determine the participants’ ability to operate communication software, such as e-mail, simply by imagining the movement of their own hand. The study is invasive and requires surgery.

Individuals with limited or no ability to use both hands due to cervical spinal cord injury, brainstem stroke, muscular dystrophy, or amyotrophic lateral sclerosis (ALS) or other motor neuron diseases are being recruited into a clinical study at Massachusetts General Hospital (MGH) and Stanford University Medical Center. Clinical trial participants must live within a three-hour drive of Boston, MA or Palo Alto, CA. Clinical trial sites at other locations may be opened in the future. The study requires a commitment of 13 months.

They have been recruiting since at least November 2011, from the Nov. 14, 2011 news item by Tanya Lewis on MedicalXpress,

Stanford University researchers are enrolling participants in a pioneering study investigating the feasibility of people with paralysis using a technology that interfaces directly with the brain to control computer cursors, robotic arms and other assistive devices.

The pilot clinical trial, known as BrainGate2, is based on technology developed at Brown University and is led by researchers at Massachusetts General Hospital, Brown and the Providence Veterans Affairs Medical Center. The researchers have now invited the Stanford team to establish the only trial site outside of New England.

Under development since 2002, BrainGate is a combination of hardware and software that directly senses electrical signals in the brain that control movement. The device — a baby-aspirin-sized array of electrodes — is implanted in the cerebral cortex (the outer layer of the brain) and records its signals; computer algorithms then translate the signals into digital instructions that may allow people with paralysis to control external devices.
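The ‘computer algorithms’ that translate neural signals into instructions are, at their core, decoders that map the firing rates of the recorded neurons onto a quantity such as intended cursor velocity. The BrainGate team’s actual decoders are considerably more sophisticated, but here’s a minimal, purely illustrative sketch of the idea, a linear decoder fitted by least squares on synthetic data (everything in it is invented for the example),

```python
import numpy as np

# Purely illustrative sketch of a linear neural decoder, not BrainGate's code.
# Idea: learn a matrix W that maps binned firing rates of ~100 neurons onto
# intended 2-D cursor velocity, then apply it to new neural activity.
rng = np.random.default_rng(0)

n_neurons, n_samples = 100, 5000
true_tuning = rng.normal(size=(n_neurons, 2))      # each neuron's (vx, vy) preference
velocity = rng.normal(size=(n_samples, 2))         # intended cursor velocities
rates = velocity @ true_tuning.T + 5.0             # baseline-shifted firing rates
rates += rng.normal(scale=0.5, size=rates.shape)   # recording noise

# Fit the decoder by least squares (with a constant bias column).
X = np.hstack([rates, np.ones((n_samples, 1))])
W, *_ = np.linalg.lstsq(X, velocity, rcond=None)

# Decode a new pattern of firing rates into a velocity command.
new_velocity = np.array([[0.3, -0.8]])
new_rates = new_velocity @ true_tuning.T + 5.0
decoded = np.hstack([new_rates, [[1.0]]]) @ W
print("intended:", new_velocity[0], "decoded:", decoded[0].round(2))
```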

Confusingly, there seem to be two BrainGate organizations. One appears to be a research entity where a number of institutions collaborate and the other is some sort of jointly held company. From the About Us webpage of the BrainGate research entity,

In the late 1990s, the initial translation of fundamental neuroengineering research from “bench to bedside” – that is, to pilot clinical testing – would require a level of financial commitment ($10s of millions) available only from private sources. In 2002, a Brown University spin-off/startup medical device company, Cyberkinetics, Inc. (later, Cyberkinetics Neurotechnology Systems, Inc.) was formed to collect the regulatory permissions and financial resources required to launch pilot clinical trials of a first-generation neural interface system. The company’s efforts and substantial initial capital investment led to the translation of the preclinical research at Brown University to an initial human device, the BrainGate Neural Interface System [Caution: Investigational Device. Limited by Federal Law to Investigational Use]. The BrainGate system uses a brain-implantable sensor to detect neural signals that are then decoded to provide control signals for assistive technologies. In 2004, Cyberkinetics received from the U.S. Food and Drug Administration (FDA) the first of two Investigational Device Exemptions (IDEs) to perform this research. Hospitals in Rhode Island, Massachusetts, and Illinois were established as clinical sites for the pilot clinical trial run by Cyberkinetics. Four trial participants with tetraplegia (decreased ability to use the arms and legs) were enrolled in the study and further helped to develop the BrainGate device. Initial results from these trials have been published or presented, with additional publications in preparation.

While scientific progress towards the creation of this promising technology has been steady and encouraging, Cyberkinetics’ financial sponsorship of the BrainGate research – without which the research could not have been started – began to wane. In 2007, in response to business pressures and changes in the capital markets, Cyberkinetics turned its focus to other medical devices. Although Cyberkinetics’ own funds became unavailable for BrainGate research, the research continued through grants and subcontracts from federal sources. By early 2008 it became clear that Cyberkinetics would eventually need to withdraw completely from directing the pilot clinical trials of the BrainGate device. Also in 2008, Cyberkinetics spun off its device manufacturing to new ownership, BlackRock Microsystems, Inc., which now produces and is further developing research products as well as clinically-validated (510(k)-cleared) implantable neural recording devices.

Beginning in mid 2008, with the agreement of Cyberkinetics, a new, fully academically-based IDE application (for the “BrainGate2 Neural Interface System”) was developed to continue this important research. In May 2009, the FDA provided a new IDE for the BrainGate2 pilot clinical trial. [Caution: Investigational Device. Limited by Federal Law to Investigational Use.] The BrainGate2 pilot clinical trial is directed by faculty in the Department of Neurology at Massachusetts General Hospital, a teaching affiliate of Harvard Medical School; the research is performed in close scientific collaboration with Brown University’s Department of Neuroscience, School of Engineering, and Brown Institute for Brain Sciences, and the Rehabilitation Research and Development Service of the U.S. Department of Veteran’s Affairs at the Providence VA Medical Center. Additionally, in late 2011, Stanford University joined the BrainGate Research Team as a clinical site and is currently enrolling participants in the clinical trial. This interdisciplinary research team includes scientific partners from the Functional Electrical Stimulation Center at Case Western Reserve University and the Cleveland VA Medical Center. As was true of the decades of fundamental, preclinical research that provided the basis for the recent clinical studies, funding for BrainGate research is now entirely from federal and philanthropic sources.

The BrainGate Research Team at Brown University, Massachusetts General Hospital, Stanford University, and Providence VA Medical Center comprises physicians, scientists, and engineers working together to advance understanding of human brain function and to develop neurotechnologies for people with neurologic disease, injury, or limb loss.

I think they’re saying there was a reverse takeover of Cyberkinetics, from the BrainGate company About webpage,

The BrainGate™ Co. is a privately-held firm focused on the advancement of the BrainGate™ Neural Interface System.  The Company owns the Intellectual property of the BrainGate™ system as well as new technology being developed by the BrainGate company.  In addition, the Company also owns  the intellectual property of Cyberkinetics which it purchased in April 2009.

Meanwhile, in Europe there are two projects, BrainAble and the Human Brain Project. The BrainAble project is similar to BrainGate in that it is intended for people with injuries, but the researchers seem to be concentrating on a helmet or cap for thought transmission (as per Moskvitch’s experience at the beginning of this posting). From the Feb. 28, 2012 news item on Science Daily,

In the 2009 film Surrogates, humans live vicariously through robots while safely remaining in their own homes. That sci-fi future is still a long way off, but recent advances in technology, supported by EU funding, are bringing this technology a step closer to reality in order to give disabled people more autonomy and independence than ever before.

“Our aim is to give people with motor disabilities as much autonomy as technology currently allows and in turn greatly improve their quality of life,” says Felip Miralles at Barcelona Digital Technology Centre, a Spanish ICT research centre.

Mr. Miralles is coordinating the BrainAble* project (http://www.brainable.org/), a three-year initiative supported by EUR 2.3 million in funding from the European Commission to develop and integrate a range of different technologies, services and applications into a commercial system for people with motor disabilities.

Here’s more from the BrainAble home page,

In terms of HCI [human-computer interface], BrainAble improves both direct and indirect interaction between the user and his smart home. Direct control is upgraded by creating tools that allow controlling inner and outer environments using a “hybrid” Brain Computer Interface (BNCI) system able to take into account other sources of information such as measures of boredom, confusion, frustration by means of the so-called physiological and affective sensors.

Furthermore, interaction is enhanced by means of Ambient Intelligence (AmI) focused on creating a proactive and context-aware environments by adding intelligence to the user’s surroundings. AmI’s main purpose is to aid and facilitate the user’s living conditions by creating proactive environments to provide assistance.

Human-Computer Interfaces are complemented by an intelligent Virtual Reality-based user interface with avatars and scenarios that will help the disabled move around freely, and interact with any sort of devices. Even more the VR will provide self-expression assets using music, pictures and text, communicate online and offline with other people, play games to counteract cognitive decline, and get trained in new functionalities and tasks.
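The ‘hybrid’ BCI idea, combining a decoded command with physiological measures of the user’s state, can be illustrated with a toy example (entirely my own, with invented names and thresholds, not BrainAble’s software): the command is executed only when the classifier is confident and the affect sensors don’t suggest the user is confused or frustrated.

```python
from dataclasses import dataclass

# Toy illustration of a "hybrid" BCI decision rule, not BrainAble code.
# All names and thresholds here are invented for the example.

@dataclass
class BrainState:
    command: str          # e.g. "lights_on", decoded from EEG
    confidence: float     # classifier confidence, 0..1
    frustration: float    # estimate from physiological/affective sensors, 0..1

def decide(state: BrainState, conf_threshold=0.7, frustration_threshold=0.6):
    """Execute the decoded command only when it looks reliable."""
    if state.confidence < conf_threshold:
        return "ignore (low confidence)"
    if state.frustration > frustration_threshold:
        # High frustration often accompanies misclassification; ask instead of acting.
        return f"confirm before executing '{state.command}'"
    return f"execute '{state.command}'"

print(decide(BrainState("lights_on", 0.9, 0.2)))   # execute
print(decide(BrainState("tv_off", 0.9, 0.8)))      # ask for confirmation
print(decide(BrainState("door_open", 0.5, 0.1)))   # ignore
```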

Perhaps this video helps,

Another European project, NeuroCare, which I discussed in my March 5, 2012 posting, is focused on creating neural implants to replace damaged and/or destroyed sensory cells in the eye or the ear.

The Human Brain Project is, despite its title, a neuromorphic engineering project (although the researchers do mention some medical applications on the project’s home page), in common with the work being done at the University of Michigan/HRL Labs mentioned in my April 19, 2012 posting (A step closer to artificial synapses courtesy of memristors). From the April 11, 2012 news item about the Human Brain Project on Science Daily,

Researchers at the EPFL [Ecole Polytechnique Fédérale de Lausanne] have discovered rules that relate the genes that a neuron switches on and off, to the shape of that neuron, its electrical properties and its location in the brain.

The discovery, using state-of-the-art informatics tools, increases the likelihood that it will be possible to predict much of the fundamental structure and function of the brain without having to measure every aspect of it. That in turn makes the Holy Grail of modelling the brain in silico — the goal of the proposed Human Brain Project — a more realistic, less Herculean, prospect. “It is the door that opens to a world of predictive biology,” says Henry Markram, the senior author on the study, which is published this week in PLoS ONE.

Here’s a bit more about the Human Brain Project (from the home page),

Today, simulating a single neuron requires the full power of a laptop computer. But the brain has billions of neurons and simulating all of them simultaneously is a huge challenge. To get round this problem, the project will develop novel techniques of multi-level simulation in which only groups of neurons that are highly active are simulated in detail. But even in this way, simulating the complete human brain will require a computer a thousand times more powerful than the most powerful machine available today. This means that some of the key players in the Human Brain Project will be specialists in supercomputing. Their task: to work with industry to provide the project with the computing power it will need at each stage of its work.

The Human Brain Project will impact many different areas of society. Brain simulation will provide new insights into the basic causes of neurological diseases such as autism, depression, Parkinson’s, and Alzheimer’s. It will give us new ways of testing drugs and understanding the way they work. It will provide a test platform for new drugs that directly target the causes of disease and that have fewer side effects than current treatments. It will allow us to design prosthetic devices to help people with disabilities. The benefits are potentially huge. As world populations grow older, more than a third will be affected by some kind of brain disease. Brain simulation provides us with a powerful new strategy to tackle the problem.

The project also promises to become a source of new Information Technologies. Unlike the computers of today, the brain has the ability to repair itself, to take decisions, to learn, and to think creatively – all while consuming no more energy than an electric light bulb. The Human Brain Project will bring these capabilities to a new generation of neuromorphic computing devices, with circuitry directly derived from the circuitry of the brain. The new devices will help us to build a new generation of genuinely intelligent robots to help us at work and in our daily lives.

The Human Brain Project builds on the work of the Blue Brain Project. Led by Henry Markram of the Ecole Polytechnique Fédérale de Lausanne (EPFL), the Blue Brain Project has already taken an essential first step towards simulation of the complete brain. Over the last six years, the project has developed a prototype facility with the tools, know-how and supercomputing technology necessary to build brain models, potentially of any species at any stage in its development. As a proof of concept, the project has successfully built the first ever, detailed model of the neocortical column, one of the brain’s basic building blocks.
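The ‘multi-level simulation’ strategy mentioned in the project description, simulating only the most active groups of neurons in detail, can be sketched very roughly like this (my own illustration, not HBP software): each group gets the expensive detailed update only while its recent activity exceeds a threshold, and a cheap summary update otherwise.

```python
import random

# Rough illustration of activity-gated level-of-detail simulation, not HBP code.
ACTIVITY_THRESHOLD = 0.5  # arbitrary value chosen for the example

class NeuronGroup:
    def __init__(self, name):
        self.name = name
        self.activity = random.random()  # stand-in for recent firing activity

    def detailed_step(self):
        # Placeholder for an expensive, biologically detailed update.
        return f"{self.name}: detailed update"

    def summary_step(self):
        # Placeholder for a cheap population-level (rate-based) update.
        return f"{self.name}: summary update"

def simulate_step(groups):
    """Spend detailed-simulation effort only on the most active groups."""
    return [
        g.detailed_step() if g.activity > ACTIVITY_THRESHOLD else g.summary_step()
        for g in groups
    ]

random.seed(1)
groups = [NeuronGroup(f"group{i}") for i in range(5)]
for line in simulate_step(groups):
    print(line)
```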

The Human Brain Project is a flagship project in contention for the €1 billion research prize that I’ve mentioned in the context of the GRAPHENE-CA flagship project (my Feb. 13, 2012 posting gives a better description of these flagship projects while mentioning both GRAPHENE-CA and another brain-computer interface project, PRESENCCIA).

Part of the reason for doing this roundup, is the opportunity to look at a number of these projects in one posting; the effect is more overwhelming than I expected.

For anyone who’s interested in Markram’s paper (open access),

Georges Khazen, Sean L. Hill, Felix Schürmann, Henry Markram. Combinatorial Expression Rules of Ion Channel Genes in Juvenile Rat (Rattus norvegicus) Neocortical Neurons. PLoS ONE, 2012; 7 (4): e34786 DOI: 10.1371/journal.pone.0034786

I do have earlier postings on brains and neuroprostheses; one of the more recent ones is this March 16, 2012 posting. Meanwhile, there are new announcements from Northwestern University (US) and the US National Institutes of Health (National Institute of Neurological Disorders and Stroke). From the April 18, 2012 news item (originating from the National Institutes of Health) on Science Daily,

An artificial connection between the brain and muscles can restore complex hand movements in monkeys following paralysis, according to a study funded by the National Institutes of Health.

In a report in the journal Nature, researchers describe how they combined two pieces of technology to create a neuroprosthesis — a device that replaces lost or impaired nervous system function. One piece is a multi-electrode array implanted directly into the brain which serves as a brain-computer interface (BCI). The array allows researchers to detect the activity of about 100 brain cells and decipher the signals that generate arm and hand movements. The second piece is a functional electrical stimulation (FES) device that delivers electrical current to the paralyzed muscles, causing them to contract. The brain array activates the FES device directly, bypassing the spinal cord to allow intentional, brain-controlled muscle contractions and restore movement.

From the April 19, 2012 news item (originating from Northwestern University) on Science Daily,

A new Northwestern Medicine brain-machine technology delivers messages from the brain directly to the muscles — bypassing the spinal cord — to enable voluntary and complex movement of a paralyzed hand. The device could eventually be tested on, and perhaps aid, paralyzed patients.

The research was done in monkeys, whose electrical brain and muscle signals were recorded by implanted electrodes when they grasped a ball, lifted it and released it into a small tube. Those recordings allowed the researchers to develop an algorithm or “decoder” that enabled them to process the brain signals and predict the patterns of muscle activity when the monkeys wanted to move the ball.

These experiments were performed by Christian Ethier, a post-doctoral fellow, and Emily Oby, a graduate student in neuroscience, both at the Feinberg School of Medicine. The researchers gave the monkeys a local anesthetic to block nerve activity at the elbow, causing temporary, painless paralysis of the hand. With the help of the special devices in the brain and the arm — together called a neuroprosthesis — the monkeys’ brain signals were used to control tiny electric currents delivered in less than 40 milliseconds to their muscles, causing them to contract, and allowing the monkeys to pick up the ball and complete the task nearly as well as they did before.

“The monkey won’t use his hand perfectly, but there is a process of motor learning that we think is very similar to the process you go through when you learn to use a new computer mouse or a different tennis racquet. Things are different and you learn to adjust to them,” said Miller [Lee E. Miller], also a professor of physiology and of physical medicine and rehabilitation at Feinberg and a Sensory Motor Performance Program lab chief at the Rehabilitation Institute of Chicago.
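The decoder described here works on the same principle as the cursor decoder sketched earlier in this posting, except that its output drives muscle stimulation rather than a cursor. Here’s a minimal, purely illustrative sketch (not the Northwestern group’s algorithm; names and numbers are invented) of turning decoded muscle activity into functional electrical stimulation commands,

```python
import numpy as np

# Illustrative sketch of a brain-to-muscle neuroprosthesis loop, not the
# Northwestern/FES system's actual code. W is assumed to have been fitted
# beforehand (e.g. by regression on recorded grasping data, as in the study).
rng = np.random.default_rng(0)

n_neurons, n_muscles = 100, 4
W = rng.normal(scale=0.1, size=(n_neurons, n_muscles))  # stand-in decoder weights
MAX_CURRENT_MA = 20.0  # safety cap on stimulation current (invented value)

def decode_muscle_activity(firing_rates):
    """Predict normalized activity for each muscle from binned firing rates."""
    activity = firing_rates @ W
    return np.clip(activity, 0.0, 1.0)

def to_stimulation(activity):
    """Convert predicted muscle activity into FES current commands (mA)."""
    return activity * MAX_CURRENT_MA

# One ~40 ms cycle: record rates, decode, stimulate.
firing_rates = rng.poisson(lam=5.0, size=n_neurons).astype(float)
currents = to_stimulation(decode_muscle_activity(firing_rates))
print("stimulation currents (mA):", currents.round(1))
```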

The National Institutes of Health news item supplies a little history and background for this latest breakthrough, while the Northwestern University news item offers more technical details.

You can find the researchers’ paper with this citation (assuming you can get past the paywall),

C. Ethier, E. R. Oby, M. J. Bauman, L. E. Miller. Restoration of grasp following paralysis through brain-controlled stimulation of muscles. Nature, 2012; DOI: 10.1038/nature10987

I was surprised to find the Health Research Fund of Québec listed as one of the funders but perhaps Christian Ethier has some connection with the province.

Rats with robot brains

A robotic cerebellum has been implanted into a rat’s skull. From the Oct. 4, 2011 news item on Science Daily,

With new cutting-edge technology aimed at providing amputees with robotic limbs, a Tel Aviv University researcher has successfully implanted a robotic cerebellum into the skull of a rodent with brain damage, restoring its capacity for movement.

The cerebellum is responsible for co-ordinating movement, explains Prof. Matti Mintz of TAU’s [Tel Aviv University] Department of Psychology. When wired to the brain, his “robo-cerebellum” receives, interprets, and transmits sensory information from the brain stem, facilitating communication between the brain and the body. To test this robotic interface between body and brain, the researchers taught a brain-damaged rat to blink whenever they sounded a particular tone. The rat could only perform the behavior when its robotic cerebellum was functional.

This is the third item I’ve found in the last few weeks about computer chips being implanted in brains. I found the other two items in a discussion about extreme human enhancement on Slate.com (first mentioned in my Sept. 15, 2011 posting). One of the Brad Allenby [the other two discussants are Nicholas Agar and Kyle Munkittrick] entries (posted Sept. 16, 2011) featured these two references,

Experiments that began here at Arizona State University and have been continued at Duke and elsewhere have involved monkeys learning to move mechanical arms to which they are wirelessly connected as if they were part of themselves, using them effectively even when the arms (but not the monkey) are shifted up to MIT and elsewhere. More recently, monkeys with chips implanted in their brains [2008 according to the video on the website] at Duke University have kept a robot wirelessly connected to their chip running in Japan. Similar technologies are being explored to enable paraplegics and other injured people to interact with their environments and to communicate effectively, as well. The upshot is that “the body” is becoming more than just a spatial presence; rather, it becomes a designed extended cognitive network.

The projects are almost mirror images of each other. The rat can’t move without input from its robotic cerebellum, while the monkeys control the robots’ movement with their thoughts. From the Oct. 3, 2011 news release on EurekAlert!,

According to the researcher, the chip is designed to mimic natural neuronal activity. “It’s a proof of the concept that we can record information from the brain, analyze it in a way similar to the biological network, and then return it to the brain,” says Prof. Mintz, who recently presented his research at the Strategies for Engineered Negligible Senescence meeting in Cambridge, UK.

In reading these items, I can’t help but remember that plastic surgery began as a means of helping soldiers with horrendous wounds and has now become part of the cosmetics industry. Given that history, it is possible to imagine (or to assume) that these brain ‘repairs’ could be used to augment or reshape our brains to increase intelligence, heighten senses, improve motor coordination, etc. In short, to accomplish very different goals than those originally set out.