Tag Archives: biotechnology

World heritage music stored in DNA

It seems a Swiss team from the École Polytechnique Fédérale de Lausanne (EPFL) has collaborated with American companies Twist Bioscience and Microsoft, as well as the University of Washington (state), to preserve two iconic pieces of music on DNA (deoxyribonucleic acid), according to a Sept. 29, 2017 news item on phys.org,

Thanks to an innovative technology for encoding data in DNA strands, two items of world heritage – songs recorded at the Montreux Jazz Festival [held in Switzerland] and digitized by EPFL – have been safeguarded for eternity. This marks the first time that cultural artifacts granted UNESCO heritage status have been saved in such a manner, ensuring they are preserved for thousands of years. The method was developed by US company Twist Bioscience and is being unveiled today in a demonstrator created at the EPFL+ECAL Lab.

“Tutu” by Miles Davis and “Smoke on the Water” by Deep Purple have already made their mark on music history. Now they have entered the annals of science, for eternity. Recordings of these two legendary songs were digitized by the Ecole Polytechnique Fédérale de Lausanne (EPFL) as part of the Montreux Jazz Digital Project, and they are the first to be stored in the form of a DNA sequence that can be subsequently decoded and listened to without any reduction in quality.

A Sept. 29, 2017 EPFL press release by Emmanuel Barraud, which originated the news item, provides more details,

This feat was achieved by US company Twist Bioscience working in association with Microsoft Research and the University of Washington. The pioneering technology is actually based on a mechanism that has been at work on Earth for billions of years: storing information in the form of DNA strands. This fundamental process is what has allowed all living species, plants and animals alike, to live on from generation to generation.

The entire world wide web in a shoe box

All electronic data storage involves encoding data in binary format – a series of zeros and ones – and then recording it on a physical medium. DNA works in a similar way, but is composed of long strands of series of four nucleotides (A, T, C and G) that make up a “code.” While the basic principle may be the same, the two methods differ greatly in terms of efficiency: if all the information currently on the internet was stored in the form of DNA, it would fit in a shoe box!

Recent advances in biotechnology now make it possible for humans to do what Mother Nature has always done. Today’s scientists can create artificial DNA strands, “record” any kind of genetic code on them and then analyze them using a sequencer to reconstruct the original data. What’s more, DNA is extraordinarily stable, as evidenced by prehistoric fragments that have been preserved in amber. Artificial strands created by scientists and carefully encapsulated should likewise last for millennia.

To help demonstrate the feasibility of this new method, EPFL’s Metamedia Center provided recordings of two famous songs played at the Montreux Jazz Festival: “Tutu” by Miles Davis, and “Smoke on the Water” by Deep Purple. Twist Bioscience and its research partners encoded the recordings, transformed them into DNA strands and then sequenced and decoded them and played them again – without any reduction in quality.

The amount of artificial DNA strands needed to record the two songs is invisible to the naked eye, and the amount needed to record all 50 years of the Festival’s archives, which have been included in UNESCO’s [United Nations Educational, Scientific and Cultural Organization] Memory of the World Register, would be equal in size to a grain of sand. “Our partnership with EPFL in digitizing our archives aims not only at their positive exploration, but also at their preservation for the next generations,” says Thierry Amsallem, president of the Claude Nobs Foundation. “By taking part in this pioneering experiment which writes the songs into DNA strands, we can be certain that they will be saved on a medium that will never become obsolete!”

A new concept of time

At EPFL’s first-ever ArtTech forum, attendees got to hear the two songs played after being stored in DNA, using a demonstrator developed at the EPFL+ECAL Lab. The system shows that being able to store data for thousands of years is a revolutionary breakthrough that can completely change our relationship with data, memory and time. “For us, it means looking into radically new ways of interacting with cultural heritage that can potentially cut across civilizations,” says Nicolas Henchoz, head of the EPFL+ECAL Lab.

Quincy Jones, a longstanding Festival supporter, is particularly enthusiastic about this technological breakthrough: “With advancements in nanotechnology, I believe we can expect to see people living prolonged lives, and with that, we can also expect to see more developments in the enhancement of how we live. For me, life is all about learning where you came from in order to get where you want to go, but in order to do so, you need access to history! And with the unreliability of how archives are often stored, I sometimes worry that our future generations will be left without such access… So, it absolutely makes my soul smile to know that EPFL, Twist Bioscience and their partners are coming together to preserve the beauty and history of the Montreux Jazz Festival for our future generations, on DNA! I’ve been a part of this festival for decades and it truly is a magnificent representation of what happens when different cultures unite for the sake of music. Absolute magic. And I’m proud to know that the memory of this special place will never be lost.”

A Sept. 29, 2017 Twist Bioscience news release is repetitive in some ways but interesting nonetheless,

Twist Bioscience, a company accelerating science and innovation through rapid, high-quality DNA synthesis, today announced that, working with Microsoft and University of Washington researchers, they have successfully stored archival-quality audio recordings of two important music performances from the archives of the world-renowned Montreux Jazz Festival.
These selections are encoded and stored in nature’s preferred storage medium, DNA, for the first time. These tiny specks of DNA will preserve a part of UNESCO’s Memory of the World Archive, where valuable cultural heritage collections are recorded. This is the first time DNA has been used as a long-term archival-quality storage medium.
Quincy Jones, world-renowned Entertainment Executive, Music Composer and Arranger, Musician and Music Producer said, “With advancements in nanotechnology, I believe we can expect to see people living prolonged lives, and with that, we can also expect to see more developments in the enhancement of how we live. For me, life is all about learning where you came from in order to get where you want to go, but in order to do so, you need access to history! And with the unreliability of how archives are often stored, I sometimes worry that our future generations will be left without such access…So, it absolutely makes my soul smile to know that EPFL, Twist Bioscience and others are coming together to preserve the beauty and history of the Montreux Jazz Festival for our future generations, on DNA!…I’ve been a part of this festival for decades and it truly is a magnificent representation of what happens when different cultures unite for the sake of music. Absolute magic. And I’m proud to know that the memory of this special place will never be lost.”
“Our partnership with EPFL in digitizing our archives aims not only at their positive exploration, but also at their preservation for the next generations,” says Thierry Amsallem, president of the Claude Nobs Foundation. “By taking part in this pioneering experiment which writes the songs into DNA strands, we can be certain that they will be saved on a medium that will never become obsolete!”
The Montreux Jazz Digital Project is a collaboration between the Claude Nobs Foundation, curator of the Montreux Jazz Festival audio-visual collection and the École Polytechnique Fédérale de Lausanne (EPFL) to digitize, enrich, store, show, and preserve this notable legacy created by Claude Nobs, the Festival’s founder.
In this proof-of-principle project, two quintessential music performances from the Montreux Jazz Festival – Smoke on the Water, performed by Deep Purple and Tutu, performed by Miles Davis – have been encoded onto DNA and read back with 100 percent accuracy. After being decoded, the songs were played on September 29th [2017] at the ArtTech Forum (see below) in Lausanne, Switzerland. Smoke on the Water was selected as a tribute to Claude Nobs, the Montreux Jazz Festival’s founder. The song memorializes a fire and Funky Claude’s rescue efforts at the Casino Barrière de Montreux during a Frank Zappa concert promoted by Claude Nobs. Miles Davis’ Tutu was selected for the role he played in music history and the Montreux Jazz Festival’s success. Miles Davis died in 1991.
“We archived two magical musical pieces on DNA of this historic collection, equating to 140MB of stored data in DNA,” said Karin Strauss, Ph.D., a Senior Researcher at Microsoft, and one of the project’s leaders.  “The amount of DNA used to store these songs is much smaller than one grain of sand. Amazingly, storing the entire six petabyte Montreux Jazz Festival’s collection would result in DNA smaller than one grain of rice.”
Luis Ceze, Ph.D., a professor in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, said, “DNA, nature’s preferred information storage medium, is an ideal fit for digital archives because of its durability, density and eternal relevance. Storing items from the Montreux Jazz Festival is a perfect way to show how fast DNA digital data storage is becoming real.”
Nature’s Preferred Storage Medium
Nature selected DNA as its hard drive billions of years ago to encode all the genetic instructions necessary for life. These instructions include all the information necessary for survival. DNA molecules encode information with sequences of discrete units. In computers, these discrete units are the 0s and 1s of “binary code,” whereas in DNA molecules, the units are the four distinct nucleotide bases: adenine (A), cytosine (C), guanine (G) and thymine (T).
“DNA is a remarkably efficient molecule that can remain stable for millennia,” said Bill Peck, Ph.D., chief technology officer of Twist Bioscience.  “This is a very exciting project: we are now in an age where we can use the remarkable efficiencies of nature to archive master copies of our cultural heritage in DNA.   As we develop the economies of this process new performances can be added any time.  Unlike current storage technologies, nature’s media will not change and will remain readable through time. There will be no new technology to replace DNA, nature has already optimized the format.”
DNA: Far More Efficient Than a Computer 
Each cell within the human body contains approximately three billion base pairs of DNA. With 75 trillion cells in the human body, this equates to the storage of 150 zettabytes (10²¹ bytes) of information within each body. By comparison, the largest data centers can be hundreds of thousands to even millions of square feet to hold a comparable amount of stored data.
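
A quick back-of-envelope check of those figures (my arithmetic, not the press release's): with four bases, each position in a strand stores log2(4) = 2 bits.

```python
# Rough sanity check of the storage figures quoted above.
# Assumptions (mine): 2 bits per base, plus the release's counts of
# ~3 billion base pairs per cell and 75 trillion cells per body.

base_pairs_per_cell = 3e9
cells_per_body = 75e12

bytes_per_cell = base_pairs_per_cell * 2 / 8      # ~0.75 GB per cell
bytes_per_body = bytes_per_cell * cells_per_body

print(f"{bytes_per_body / 1e21:.0f} zettabytes")  # ~56 ZB; counting the
# diploid genome's ~6 billion base pairs instead gives ~110 ZB, the same
# order of magnitude as the release's 150 zettabytes
```
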
The Elegance of DNA as a Storage Medium
Like music, which can be widely varied with a finite number of notes, DNA encodes individuality with only four different letters in varied combinations. When using DNA as a storage medium, there are several advantages in addition to the universality of the format and incredible storage density. DNA can be stable for thousands of years when stored in a cool dry place and is easy to copy using polymerase chain reaction to create back-up copies of archived material. In addition, because of PCR, small data sets can be targeted and recovered quickly from a large dataset without needing to read the entire file.
How to Store Digital Data in DNA
To encode the music performances into archival storage copies in DNA, Twist Bioscience worked with Microsoft and University of Washington researchers to complete four steps: Coding, synthesis/storage, retrieval and decoding. First, the digital files were converted from the binary code using 0s and 1s into sequences of A, C, T and G. For purposes of the example, 00 represents A, 10 represents C, 01 represents G and 11 represents T. Twist Bioscience then synthesizes the DNA in short segments in the sequence order provided. The short DNA segments each contain about 12 bytes of data as well as a sequence number to indicate their place within the overall sequence. This is the process of storage. And finally, to ensure that the file is stored accurately, the sequence is read back to ensure 100 percent accuracy, and then decoded from A, C, T or G into a two-digit binary representation.
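
To make the release's example mapping concrete, here is a minimal Python sketch of that encode/decode round trip. The two-bit-per-base table and the roughly 12-byte numbered segments come from the description above; everything else (the function names, and the omission of the error correction, primers and redundancy a real synthesis pipeline needs) is my own simplification.

```python
# Illustrative sketch of the two-bit encoding described above -- not the
# actual Twist/Microsoft/UW pipeline, which adds error correction and
# sequencing primers omitted here.

BITS_TO_BASE = {"00": "A", "10": "C", "01": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes, payload_bytes: int = 12) -> list[tuple[int, str]]:
    """Turn raw bytes into numbered DNA segments of ~12 payload bytes."""
    bitstring = "".join(f"{byte:08b}" for byte in data)
    bases = "".join(BITS_TO_BASE[bitstring[i:i + 2]]
                    for i in range(0, len(bitstring), 2))
    seg = payload_bytes * 4  # 4 bases encode 1 byte
    return [(n, bases[n * seg:(n + 1) * seg])
            for n in range((len(bases) + seg - 1) // seg)]

def decode(segments: list[tuple[int, str]]) -> bytes:
    """Reassemble segments by sequence number and map bases back to bits."""
    bases = "".join(chunk for _, chunk in sorted(segments))
    bits = "".join(BASE_TO_BITS[base] for base in bases)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

assert decode(encode(b"Smoke on the Water")) == b"Smoke on the Water"
```
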
Importantly, to encapsulate and preserve encoded DNA, the collaborators are working with Professor Dr. Robert Grass of ETH Zurich. Grass has developed an innovative technology inspired by preservation of DNA within prehistoric fossils.  With this technology, digital data encoded in DNA remains preserved for millennia.
About UNESCO’s Memory of the World Register
UNESCO established the Memory of the World Register in 1992 in response to a growing awareness of the perilous state of preservation of, and access to, documentary heritage in various parts of the world.  Through its National Commissions, UNESCO prepared a list of endangered library and archive holdings and a world list of national cinematic heritage.
A range of pilot projects employing contemporary technology to reproduce original documentary heritage on other media began. These included, for example, a CD-ROM of the 13th Century Radzivill Chronicle, tracing the origins of the peoples of Europe, and Memoria de Iberoamerica, a joint newspaper microfilming project involving seven Latin American countries. These projects enhanced access to this documentary heritage and contributed to its preservation.
“We are incredibly proud to be a part of this momentous event, with the first archived songs placed into the UNESCO Memory of the World Register,” said Emily Leproust, Ph.D., CEO of Twist Bioscience.
About ArtTech
The ArtTech Foundation, created by renowned scientists and dignitaries from Crans-Montana, Switzerland, wishes to stimulate reflection and support pioneering and innovative projects beyond the known boundaries of culture and science.
Benefitting from the establishment of a favorable environment for the creation of technology companies, the Foundation aims to position itself as key promoter of ideas and innovative endeavors within a landscape of “Culture and Science” that is still being shaped.
Several initiatives, including our annual global platform launched in the spring of 2017, are helping to create a community that brings together researchers, celebrities in the world of culture and the arts, as well as investors and entrepreneurs from Switzerland and across the globe.
 
About EPFL
EPFL, one of the two Swiss Federal Institutes of Technology, based in Lausanne, is Europe’s most cosmopolitan technical university with students, professors and staff from over 120 nations. A dynamic environment, open to Switzerland and the world, EPFL is centered on its three missions: teaching, research and technology transfer. EPFL works together with an extensive network of partners including other universities and institutes of technology, developing and emerging countries, secondary schools and colleges, industry and economy, political circles and the general public, to bring about real impact for society.
About Twist Bioscience
At Twist Bioscience, our expertise is accelerating science and innovation by leveraging the power of scale. We have developed a proprietary semiconductor-based synthetic DNA manufacturing process featuring a high throughput silicon platform capable of producing synthetic biology tools, including genes, oligonucleotide pools and variant libraries. By synthesizing DNA on silicon instead of on traditional 96-well plastic plates, our platform overcomes the current inefficiencies of synthetic DNA production, and enables cost-effective, rapid, high-quality and high throughput synthetic gene production, which in turn, expedites the design, build and test cycle to enable personalized medicines, pharmaceuticals, sustainable chemical production, improved agriculture production, diagnostics and biodetection. We are also developing new technologies to address large scale data storage. For more information, please visit www.twistbioscience.com. Twist Bioscience is on Twitter. Sign up to follow our Twitter feed @TwistBioscience at https://twitter.com/TwistBioscience.

If you hadn’t read the EPFL press release first, it might have taken a minute to figure out why EPFL is being mentioned in the Twist Bioscience news release. Presumably someone was rushing to make a deadline. Ah well, I’ve seen and written worse.

I haven’t been able to find any video or audio recordings of the DNA-preserved performances but there is an informational video (originally published July 7, 2016) from Microsoft and the University of Washington describing the DNA-based technology,

I also found this description of listening to the DNA-preserved music in an Oct. 6, 2017 blog posting for the Canadian Broadcasting Corporation’s (CBC) Day 6 radio programme,

To listen to them, one must first suspend the DNA holding the songs in a solution. Next, one can use a DNA sequencer to read the letters of the bases forming the molecules. Then, algorithms can determine the digital code those letters form. From that code, comes the music.

It’s complicated but Ceze says his team performed this process without error.

You can find out more about UNESCO’s Memory of the World and its register here, more about the EPFL+ECAL Lab here, and more about Twist Bioscience here.

CRISPR corn to come to market in 2020

It seems most of the recent excitement around CRISPR/Cas9 (clustered regularly interspaced short palindromic repeats) has focused on germline editing, specifically of human embryos. Most people don’t realize that the first ‘CRISPR’ product is slated to enter the US market in 2020. A June 14, 2017 American Chemical Society news release (also on EurekAlert) provides a preview,

The gene-editing technique known as CRISPR/Cas9 made a huge splash in the news when it was initially announced. But the first commercial product, expected around 2020, could make it to the market without much fanfare: It’s a waxy corn destined to contribute to paper glue and food thickeners. The cover story of Chemical & Engineering News (C&EN), the weekly newsmagazine of the American Chemical Society, explores what else is in the works.

Melody M. Bomgardner, a senior editor at C&EN [Chemical & Engineering News], notes that compared to traditional biotechnology, CRISPR allows scientists to add and remove specific genes from organisms with greater speed, precision and oftentimes, at a lower cost. Among other things, it could potentially lead to higher quality cotton, non-browning mushrooms, drought-resistant corn and — finally — tasty, grocery store tomatoes.

Some hurdles remain, however, before more CRISPR products become available. Regulators are assessing how they should approach crops modified with the technique, which often (though not always) splices genes into a plant from within the species rather than introducing a foreign gene. And scientists still don’t understand all the genes in any given crop, much less know which ones might be good candidates for editing. Luckily, researchers can use CRISPR to find out.

Melody M. Bomgardner’s June 12, 2017 article for C&EN describes in detail how CRISPR could significantly change agriculture (Note: Links have been removed),

When the seed firm DuPont Pioneer first announced the new corn in early 2016, few people paid attention. Pharmaceutical companies using CRISPR for new drugs got the headlines instead.

But people should notice DuPont’s waxy corn because using CRISPR—an acronym for clustered regularly interspaced short palindromic repeats—to delete or alter traits in plants is changing the world of plant breeding, scientists say. Moreover, the technique’s application in agriculture is likely to reach the public years before CRISPR-aided drugs hit the market.

Until CRISPR tools were developed, the process of finding useful traits and getting them into reliable, productive plants took many years. It involved a lot of steps and was plagued by randomness.

“Now, because of basic research in the lab and in the field, we can go straight after the traits we want,” says Zachary Lippman, professor of biological sciences at Cold Spring Harbor Laboratory. CRISPR has been transformative, Lippman says. “It’s basically a freight train that’s not going to stop.”

Proponents hope consumers will embrace gene-edited crops in a way that they did not accept genetically engineered ones, especially because they needn’t involve the introduction of genes from other species—a process that gave rise to the specter of Frankenfood.

But it’s not clear how consumers will react or if gene editing will result in traits that consumers value. And the potential commercial uses of CRISPR may narrow if agriculture agencies in the U.S. and Europe decide to regulate gene-edited crops in the same way they do genetically engineered crops.

DuPont Pioneer expects the U.S. to treat its gene-edited waxy corn like a conventional crop because it does not contain any foreign genes, according to Neal Gutterson, the company’s vice president of R&D. In fact, the waxy trait already exists in some corn varieties. It gives the kernels a starch content of more than 97% amylopectin, compared with 75% amylopectin in regular feed corn. The rest of the kernel is amylose. Amylopectin is more soluble than amylose, making starch from waxy corn a better choice for paper adhesives and food thickeners.

Like most of today’s crops, DuPont’s current waxy corn varieties are the result of decades of effort by plant breeders using conventional breeding techniques.

Breeders identify new traits by examining unusual, or mutant, plants. Over many generations of breeding, they work to get a desired trait into high-performing (elite) varieties that lack the trait. They begin with a first-generation cross, or hybrid, of a mutant and an elite plant and then breed several generations of hybrids with the elite parent in a process called backcrossing. They aim to achieve a plant that best approximates the elite version with the new trait.
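
As a rough illustration of why backcrossing takes so many generations (my own sketch of the standard textbook expectation, not figures from the C&EN article): each cross back to the elite parent recovers, on average, half of the remaining mutant-donor genome.

```python
# Expected share of the elite (recurrent) parent's genome during
# backcrossing -- each generation halves the remaining donor genome.
# An illustrative sketch of the textbook expectation, not C&EN's data.

def elite_fraction(n_backcrosses: int) -> float:
    """F1 hybrid = 50% elite; each backcross halves what's left."""
    return 1 - 0.5 ** (n_backcrosses + 1)

for n in range(7):
    label = "F1" if n == 0 else f"BC{n}"
    print(f"{label}: {elite_fraction(n):.1%} elite genome on average")
# F1: 50.0%, BC1: 75.0%, ... BC6: 99.2% -- and genes physically linked
# to the desired trait still tend to tag along
```
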

But it’s tough to grab only the desired trait from a mutant and make a clean getaway. DuPont’s plant scientists found that the waxy trait came with some genetic baggage; even after backcrossing, the waxy corn plant did not offer the same yield as elite versions without the trait. The disappointing outcome is common enough that it has its own term: yield drag.

Because the waxy trait is native to certain corn plants, DuPont did not have to rely on the genetic engineering techniques that breeders have used to make herbicide-tolerant and insect-resistant corn plants. Those commonly planted crops contain DNA from other species.

In addition to giving some consumers pause, that process does not precisely place the DNA into the host plant. So researchers must raise hundreds or thousands of modified plants to find the best ones with the desired trait and work to get that trait into each elite variety. Finally, plants modified with traditional genetic engineering need regulatory approval in the U.S. and other countries before they can be marketed.

Instead, DuPont plant scientists used CRISPR to zero in on, and partially knock out, a gene for an enzyme that produces amylose. By editing the gene directly, they created a waxy version of the elite corn without yield drag or foreign DNA.

Plant scientists who adopt gene editing may still need to breed, measure, and observe because traits might not work well together or bring a meaningful benefit. “It’s not a panacea,” Lippman says, “but it is one of the most powerful tools to come around, ever.”

It’s an interesting piece which answers the question of why tomatoes from the grocery store don’t taste good.

Emerging technology and the law

I have three news bits about legal issues that are arising as a consequence of emerging technologies.

Deep neural networks, art, and copyright

Caption: The rise of automated art opens new creative avenues, coupled with new problems for copyright protection. Credit: Alexander Mordvintsev, Christopher Olah and Mike Tyka

Presumably this artwork is a demonstration of automated art although they never really do explain how in the news item/news release. An April 26, 2017 news item on ScienceDaily announces research into copyright and the latest in using neural networks to create art,

In 1968, sociologist Jean Baudrillard wrote on automatism that “contained within it is the dream of a dominated world […] that serves an inert and dreamy humanity.”

With the growing popularity of Deep Neural Networks (DNNs), this dream is fast becoming a reality.

Dr. Jean-Marc Deltorn, researcher at the Centre d’études internationales de la propriété intellectuelle in Strasbourg, argues that we must remain a responsive and responsible force in this process of automation — not inert dominators. As he demonstrates in a recent Frontiers in Digital Humanities paper, the dream of automation demands a careful study of the legal problems linked to copyright.

An April 26, 2017 Frontiers (publishing) news release on EurekAlert, which originated the news item, describes the research in more detail,

For more than half a century, artists have looked to computational processes as a way of expanding their vision. DNNs are the culmination of this cross-pollination: by learning to identify complex patterns, they can generate new creations.

These systems are made up of complex algorithms modeled on the transmission of signals between neurons in the brain.

DNN creations rely in equal measure on human inputs and the non-human algorithmic networks that process them.

Inputs are fed into the system, which is layered. Each layer provides an opportunity for a more refined knowledge of the inputs (shape, color, lines). Neural networks compare actual outputs to expected ones, and correct the predictive error through repetition and optimization. They train their own pattern recognition, thereby optimizing their learning curve and producing increasingly accurate outputs.

The deeper the layers are, the higher the level of abstraction. The highest layers are able to identify the contents of a given input with reasonable accuracy, after extended periods of training.
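
For readers who want to see that training loop in miniature, here is a small, self-contained Python/NumPy sketch of the process described above: inputs pass through layers, actual outputs are compared with expected ones, and repeated corrections shrink the predictive error. The XOR task, network size and learning rate are illustrative choices of mine, not anything from the paper.

```python
# A tiny layered network trained by comparing actual outputs to expected
# ones and correcting the predictive error through repetition -- an
# illustrative sketch, not the systems discussed in the paper.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)  # expected outputs (XOR)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)    # hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)    # output layer
lr = 0.5

for step in range(5000):
    h = np.tanh(X @ W1 + b1)                 # layer 1 refines the raw input
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))   # layer 2 produces the prediction
    err = out - y                            # compare actual vs expected
    # propagate the predictive error backwards and nudge every weight
    gW2, gb2 = h.T @ err, err.sum(0)
    gh = (err @ W2.T) * (1 - h ** 2)
    gW1, gb1 = X.T @ gh, gh.sum(0)
    for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        p -= lr * g / len(X)

print(out.round(2).ravel())  # should approach [0, 1, 1, 0] after training
```
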

Creation thus becomes increasingly automated through what Deltorn calls “the arcane traceries of deep architecture”. The results are sufficiently abstracted from their sources to produce original creations that have been exhibited in galleries, sold at auction and performed at concerts.

The originality of DNNs is a combined product of technological automation on the one hand and human inputs and decisions on the other.

DNNs are gaining popularity. Various platforms (such as DeepDream) now allow internet users to generate their very own new creations. This popularization of the automation process calls for a comprehensive legal framework that ensures a creator’s economic and moral rights with regard to his work – copyright protection.

Form, originality and attribution are the three requirements for copyright. And while DNN creations satisfy the first of these three, the claim to originality and attribution will depend largely on a given country’s legislation and on the traceability of the human creator.

Legislation usually sets a low threshold for originality. As DNNs could in theory create an endless number of riffs on source materials, the uncurbed creation of original works could inflate the existing number of copyright protections.

Additionally, a small number of national copyright laws confer attribution to what UK legislation defines loosely as “the person by whom the arrangements necessary for the creation of the work are undertaken.” In the case of DNNs, this could mean anybody from the programmer to the user of a DNN interface.

Combined with an overly supple take on originality, this view on attribution would further increase the number of copyrightable works.

The risk, in both cases, is that artists will be less willing to publish their own works, for fear of infringement of DNN copyright protections.

In order to promote creativity – one seminal aim of copyright protection – the issue must be limited to creations that manifest a personal voice “and not just the electric glint of a computational engine,” to quote Deltorn. A delicate act of discernment.

DNNs promise new avenues of creative expression for artists – with potential caveats. Copyright protection – a “catalyst to creativity” – must be contained. Many of us gently bask in the glow of an increasingly automated form of technology. But if we want to safeguard the ineffable quality that defines much art, it might be a good idea to home in more closely on the differences between the electric and the creative spark.

This research is, and will be, part of a broader Frontiers Research Topic collection of articles on Deep Learning and Digital Humanities.

Here’s a link to and a citation for the paper,

Deep Creations: Intellectual Property and the Automata by Jean-Marc Deltorn. Front. Digit. Humanit., 01 February 2017 | https://doi.org/10.3389/fdigh.2017.00003

This paper is open access.

Conference on governance of emerging technologies

I received an April 17, 2017 notice via email about this upcoming conference. Here’s more from the Fifth Annual Conference on Governance of Emerging Technologies: Law, Policy and Ethics webpage,

The Fifth Annual Conference on Governance of Emerging Technologies:

Law, Policy and Ethics held at the new

Beus Center for Law & Society in Phoenix, AZ

May 17-19, 2017!

Call for Abstracts – Now Closed

The conference will consist of plenary and session presentations and discussions on regulatory, governance, legal, policy, social and ethical aspects of emerging technologies, including (but not limited to) nanotechnology, synthetic biology, gene editing, biotechnology, genomics, personalized medicine, human enhancement technologies, telecommunications, information technologies, surveillance technologies, geoengineering, neuroscience, artificial intelligence, and robotics. The conference is premised on the belief that there is much to be learned and shared from and across the governance experience and proposals for these various emerging technologies.

Keynote Speakers:

Gillian Hadfield, Richard L. and Antoinette Schamoi Kirtland Professor of Law and Professor of Economics, USC [University of Southern California] Gould School of Law

Shobita Parthasarathy, Associate Professor of Public Policy and Women’s Studies, Director, Science, Technology, and Public Policy Program University of Michigan

Stuart Russell, Professor at [University of California] Berkeley, is a computer scientist known for his contributions to artificial intelligence

Craig Shank, Vice President for Corporate Standards Group in Microsoft’s Corporate, External and Legal Affairs (CELA)

Plenary Panels:

Innovation – Responsible and/or Permissionless

Ellen-Marie Forsberg, Senior Researcher/Research Manager at Oslo and Akershus University College of Applied Sciences

Adam Thierer, Senior Research Fellow with the Technology Policy Program at the Mercatus Center at George Mason University

Wendell Wallach, Consultant, ethicist, and scholar at Yale University’s Interdisciplinary Center for Bioethics

 Gene Drives, Trade and International Regulations

Greg Kaebnick, Director, Editorial Department; Editor, Hastings Center Report; Research Scholar, Hastings Center

Jennifer Kuzma, Goodnight-North Carolina GlaxoSmithKline Foundation Distinguished Professor in Social Sciences in the School of Public and International Affairs (SPIA) and co-director of the Genetic Engineering and Society (GES) Center at North Carolina State University

Andrew Maynard, Senior Sustainability Scholar, Julie Ann Wrigley Global Institute of Sustainability; Director, Risk Innovation Lab; Professor, School for the Future of Innovation in Society, Arizona State University

Gary Marchant, Regents’ Professor of Law; Faculty Director and Faculty Fellow, Center for Law, Science & Innovation, Arizona State University

Marc Saner, Inaugural Director of the Institute for Science, Society and Policy, and Associate Professor, University of Ottawa Department of Geography

Big Data

Anupam Chander, Martin Luther King, Jr. Professor of Law and Director, California International Law Center, UC Davis School of Law

Pilar Ossorio, Professor of Law and Bioethics, University of Wisconsin, School of Law and School of Medicine and Public Health; Morgridge Institute for Research, Ethics Scholar-in-Residence

George Poste, Chief Scientist, Complex Adaptive Systems Initiative (CASI) (http://www.casi.asu.edu/), Regents’ Professor and Del E. Webb Chair in Health Innovation, Arizona State University

Emily Shuckburgh, climate scientist and deputy head of the Polar Oceans Team at the British Antarctic Survey, University of Cambridge

 Responsible Development of AI

Spring Berman, Ira A. Fulton Schools of Engineering, Arizona State University

John Havens, The IEEE [Institute of Electrical and Electronics Engineers] Global Initiative for Ethical Considerations in Artificial Intelligence and Autonomous Systems

Subbarao Kambhampati, Senior Sustainability Scientist, Julie Ann Wrigley Global Institute of Sustainability, Professor, School of Computing, Informatics and Decision Systems Engineering, Ira A. Fulton Schools of Engineering, Arizona State University

Wendell Wallach, Consultant, Ethicist, and Scholar at Yale University’s Interdisciplinary Center for Bioethics

Existential and Catastrophic Ricks [sic]

Tony Barrett, Co-Founder and Director of Research of the Global Catastrophic Risk Institute

Haydn Belfield, Academic Project Administrator, Centre for the Study of Existential Risk at the University of Cambridge

Margaret E. Kosal, Associate Director, Sam Nunn School of International Affairs, Georgia Institute of Technology

Catherine Rhodes, Academic Project Manager, Centre for the Study of Existential Risk (CSER), University of Cambridge

Those are the panels of interest to me; there are others on the conference homepage.

Here’s some information from the Conference registration webpage,

Early Bird Registration – $50 off until May 1! Enter discount code: earlybirdGETs50

New: Group Discount – Register 2+ attendees together and receive an additional 20% off for all group members!

Click Here to Register!

Conference registration fees are as follows:

  • General (non-CLE) Registration: $150.00
  • CLE Registration: $350.00
  • *Current Student / ASU Law Alumni Registration: $50.00
  • ^Cybersecurity sessions only (May 19): $100 CLE / $50 General / Free for students (registration info coming soon)

There you have it.

Neuro-techno future laws

I’m pretty sure this isn’t the first exploration of potential legal issues arising from research into neuroscience although it’s the first one I’ve stumbled across. From an April 25, 2017 news item on phys.org,

New human rights laws to prepare for advances in neurotechnology that put the ‘freedom of the mind’ at risk have been proposed today in the open access journal Life Sciences, Society and Policy.

The authors of the study suggest four new human rights laws could emerge in the near future to protect against exploitation and loss of privacy. The four laws are: the right to cognitive liberty, the right to mental privacy, the right to mental integrity and the right to psychological continuity.

An April 25, 2017 Biomed Central news release on EurekAlert, which originated the news item, describes the work in more detail,

Marcello Ienca, lead author and PhD student at the Institute for Biomedical Ethics at the University of Basel, said: “The mind is considered to be the last refuge of personal freedom and self-determination, but advances in neural engineering, brain imaging and neurotechnology put the freedom of the mind at risk. Our proposed laws would give people the right to refuse coercive and invasive neurotechnology, protect the privacy of data collected by neurotechnology, and protect the physical and psychological aspects of the mind from damage by the misuse of neurotechnology.”

Advances in neurotechnology, such as sophisticated brain imaging and the development of brain-computer interfaces, have led to these technologies moving away from a clinical setting and into the consumer domain. While these advances may be beneficial for individuals and society, there is a risk that the technology could be misused and create unprecedented threats to personal freedom.

Professor Roberto Andorno, co-author of the research, explained: “Brain imaging technology has already reached a point where there is discussion over its legitimacy in criminal court, for example as a tool for assessing criminal responsibility or even the risk of reoffending. Consumer companies are using brain imaging for ‘neuromarketing’, to understand consumer behaviour and elicit desired responses from customers. There are also tools such as ‘brain decoders’ which can turn brain imaging data into images, text or sound. All of these could pose a threat to personal freedom which we sought to address with the development of four new human rights laws.”

The authors explain that as neurotechnology improves and becomes commonplace, there is a risk that the technology could be hacked, allowing a third-party to ‘eavesdrop’ on someone’s mind. In the future, a brain-computer interface used to control consumer technology could put the user at risk of physical and psychological damage caused by a third-party attack on the technology. There are also ethical and legal concerns over the protection of data generated by these devices that need to be considered.

International human rights laws make no specific mention of neuroscience, although advances in biomedicine have become intertwined with laws, such as those concerning human genetic data. Similar to the historical trajectory of the genetic revolution, the authors state that the on-going neurorevolution will force a reconceptualization of human rights laws and even the creation of new ones.

Marcello Ienca added: “Science-fiction can teach us a lot about the potential threat of technology. Neurotechnology featured in famous stories has in some cases already become a reality, while others are inching ever closer, or exist as military and commercial prototypes. We need to be prepared to deal with the impact these technologies will have on our personal freedom.”

Here’s a link to and a citation for the paper,

Towards new human rights in the age of neuroscience and neurotechnology by Marcello Ienca and Roberto Andorno. Life Sciences, Society and Policy 2017, 13:5. DOI: 10.1186/s40504-017-0050-1. Published: 26 April 2017

©  The Author(s). 2017

This paper is open access.

New principles for AI (artificial intelligence) research along with some history and a plea for a democratic discussion

For almost a month I’ve been meaning to get to this Feb. 1, 2017 essay by Andrew Maynard (director of Risk Innovation Lab at Arizona State University) and Jack Stilgoe (science policy lecturer at University College London [UCL]) on the topic of artificial intelligence and principles (Note: Links have been removed). First, a walk down memory lane,

Today [Feb. 1, 2017] in Washington DC, leading US and UK scientists are meeting to share dispatches from the frontiers of machine learning – an area of research that is creating new breakthroughs in artificial intelligence (AI). Their meeting follows the publication of a set of principles for beneficial AI that emerged from a conference earlier this year at a place with an important history.

In February 1975, 140 people – mostly scientists, with a few assorted lawyers, journalists and others – gathered at a conference centre on the California coast. A magazine article from the time by Michael Rogers, one of the few journalists allowed in, reported that most of the four days’ discussion was about the scientific possibilities of genetic modification. Two years earlier, scientists had begun using recombinant DNA to genetically modify viruses. The Promethean nature of this new tool prompted scientists to impose a moratorium on such experiments until they had worked out the risks. By the time of the Asilomar conference, the pent-up excitement was ready to burst. It was only towards the end of the conference when a lawyer stood up to raise the possibility of a multimillion-dollar lawsuit that the scientists focussed on the task at hand – creating a set of principles to govern their experiments.

The 1975 Asilomar meeting is still held up as a beacon of scientific responsibility. However, the story told by Rogers, and subsequently by historians, is of scientists motivated by a desire to head off top-down regulation with a promise of self-governance. Geneticist Stanley Cohen said at the time, ‘If the collected wisdom of this group doesn’t result in recommendations, the recommendations may come from other groups less well qualified’. The mayor of Cambridge, Massachusetts was a prominent critic of the biotechnology experiments then taking place in his city. He said, ‘I don’t think these scientists are thinking about mankind at all. I think that they’re getting the thrills and the excitement and the passion to dig in and keep digging to see what the hell they can do’.

The concern in 1975 was with safety and containment in research, not with the futures that biotechnology might bring about. A year after Asilomar, Cohen’s colleague Herbert Boyer founded Genentech, one of the first biotechnology companies. Corporate interests barely figured in the conversations of the mainly university scientists.

Fast-forward 42 years and it is clear that machine learning, natural language processing and other technologies that come under the AI umbrella are becoming big business. The cast list of the 2017 Asilomar meeting included corporate wunderkinds from Google, Facebook and Tesla as well as researchers, philosophers, and other academics. The group was more intellectually diverse than their 1975 equivalents, but there were some notable absences – no public and their concerns, no journalists, and few experts in the responsible development of new technologies.

Maynard and Stilgoe offer a critique of the latest principles,

The principles that came out of the meeting are, at least at first glance, a comforting affirmation that AI should be ‘for the people’, and not to be developed in ways that could cause harm. They promote the idea of beneficial and secure AI, development for the common good, and the importance of upholding human values and shared prosperity.

This is good stuff. But it’s all rather Motherhood and Apple Pie: comforting and hard to argue against, but lacking substance. The principles are short on accountability, and there are notable absences, including the need to engage with a broader set of stakeholders and the public. At the early stages of developing new technologies, public concerns are often seen as an inconvenience. In a world in which populism appears to be trampling expertise into the dirt, it is easy to understand why scientists may be defensive.

I encourage you to read this thoughtful essay in its entirety although I do have one nit to pick: why only US and UK scientists? I imagine the answer may lie in funding and logistics issues, but I find it surprising that the critique makes no mention of the international community as a nod to inclusion.

For anyone interested in the Asilomar AI principles (2017), you can find them here. You can also find videos of the two-day workshop titled The Frontiers of Machine Learning (a Jan. 31 – Feb. 1, 2017 Raymond and Beverly Sackler USA-UK Scientific Forum [US National Academy of Sciences]) here; videos for each session are available on YouTube.

New Wave and its non-shrimp shrimp

I received a news release from a start-up company, New Wave Foods, which specializes in creating plant-based seafood. The concept looks very interesting and very sci-fi (Lois McMaster Bujold, and I’m sure others, has featured vat-grown meat and fish in her novels). Apparently, Google has already started using some of the New Wave product in its employee cafeteria. Here’s more from the July 19, 2016 New Wave Foods news release,

New Wave Foods announced today that it has successfully opened a seed round aimed at developing seafood that is healthier for humans and the planet. Efficient Capacity kicked off the round and New Crop Capital provided additional funding.

New Wave Foods uses plant-based ingredients, such as red algae, to engineer new edible materials that replicate the taste and texture of fish and shellfish while improving their nutritional profiles. Its first product, which has already been served in Google’s cafeterias, will be a truly sustainable shrimp. Shrimp is the nation’s most popular seafood, currently representing more than a quarter of the four billion pounds of fish and shellfish consumed by Americans annually. For each pound of shrimp caught, up to 15 pounds of other animals, including endangered dolphins, turtles, and sharks, die.

The market for meat analogs is expected to surpass $5 billion by 2020, and savvy investors are increasingly taking notice. In recent years, millions in venture capital has flowed into plant-based alternatives to animal foods from large food processors and investors like Bill Gates and Li Ka-shing, Asia’s richest businessman.

“The astounding scale of our consumption of sea animals is decimating ocean ecosystems through overfishing, massive death through bycatch, water pollution, carbon emissions, derelict fishing gear, mangrove deforestation, and more,” said New Wave Foods co-founder and CEO Dominique Barnes. “Shrimping is also fraught with human rights abuses and slave labor, so we’re pleased to introduce a product that is better for people, the planet, and animals.”

Efficient Capacity is an investment fund that advises and invests in companies worldwide. Efficient Capacity partners have founded or co-founded more than ten companies and served as advisors or directors to dozens of others.

New Crop Capital is a specialized private venture capital fund that provides early-stage investments to companies that develop “clean,” (i.e., cultured) and plant-based meat, dairy, and egg products or facilitate the promotion and sale of such products.

The current round of investments follows investments from SOS Ventures via IndieBio, an accelerator group funding and building biotech startups. IndieBio companies use technology to solve our culture’s most challenging problems, such as feeding a growing population sustainably. Along with investment, IndieBio offers its startups resources such as lab space and mentorship to help take an idea to a product.

Along with its funding round, New Wave Foods announced the appointment of John Wiest as COO. Wiest brings more than 15 years of senior management experience in food and consumer products, including animal-based seafood companies, to the company. As an executive and consultant, Wiest has helped dozens of food ventures develop new products, expand distribution channels, and create strategic partnerships.

New Wave Foods, founded in 2015, is a leader in plant-based seafood that is healthier and better for the environment. New Wave products are high in clean nutrients and deliver a culinary experience consumers expect without the devastating environmental impact of commercial fishing. Co-founder and CEO Dominique Barnes holds a master’s in marine biodiversity and conservation from Scripps Institution of Oceanography, and co-founder and CTO Michelle Wolf holds a bachelor’s in materials science and engineering and a master’s in biomedical engineering. New Wave Foods’ first products will reach consumers as early as Q4 2016.

I found a February 5, 2016 review article about the plant-based shrimp written by Ariel Schwartz for Tech Insider (Note: A link has been removed),

… after trying a lab-made “shrimp” made of plant proteins and algae, I’d consider giving up the real thing. Maybe others will too.

The shrimp I ate came from New Wave Foods, a startup that just graduated from biotech startup accelerator IndieBio. When I first met New Wave’s founders in the fall of 2015, they had been working for eight weeks at IndieBio’s San Francisco lab. …

Barnes and Wolf [marine conservationist Dominique Barnes and materials scientist Michelle Wolf] ultimately figured out a way to use plant proteins, along with the same algae that shrimp eat — the stuff that helps give the crustaceans their color and flavor — to come up with a substitute that has a similar texture, taste, color, and nutritional value.

The fact that New Wave’s product has the same high protein, low fat content as real shrimp is a big source of differentiation from other shrimp substitutes, according to Barnes.

In early February, I finally tried a breaded version of New Wave’s shrimp. Here’s what it looked like:

[Photo: a breaded New Wave Foods “shrimp.” Credit: Ariel Schwartz/Tech Insider]

It was a little hard to judge the taste because of the breading, but the texture was almost perfect. The lab-made shrimp had that springiness and mixture of crunch and chew that you’d expect from the real thing. I could see myself replacing real shrimp with this in some situations.

Whether it could replace shrimp all the time depends on how the product tastes without the breading. “Our ultimate goal is to get to the cocktail shrimp level,” says Barnes.

I’m glad to have stumbled across Ariel Schwartz again as I’ve always enjoyed her writing and it has been a few years.

For the curious, you can check out more of Ariel Schwartz’s work here and find out more about Efficient Capacity in a listing on CrunchBase, New Crop Capital here, SOS Ventures here, IndieBio here, and, of course, New Wave Foods here.

One final comment, I am not endorsing this company or its products. This is presented as interesting information and, hopefully, I will be hearing more about the company and its products in the future.

Korea Advanced Institute of Science and Technology (KAIST) at summer 2016 World Economic Forum in China

From the Ideas Lab at the 2016 World Economic Forum in Davos to offering expertise at the 2016 World Economic Forum in Tianjin, China, which takes place from June 26 – 28, 2016.

Here’s more from a June 24, 2016 KAIST news release on EurekAlert,

Scientific and technological breakthroughs are more important than ever as a key agent to drive social, economic, and political changes and advancements in today’s world. The World Economic Forum (WEF), an international organization that provides one of the broadest engagement platforms to address issues of major concern to the global community, will discuss the effects of these breakthroughs at its 10th Annual Meeting of the New Champions, a.k.a., the Summer Davos Forum, in Tianjin, China, June 26-28, 2016.

Three professors from the Korea Advanced Institute of Science and Technology (KAIST) will join the Annual Meeting and offer their expertise in the fields of biotechnology, artificial intelligence, and robotics to explore the conference theme, “The Fourth Industrial Revolution and Its Transformational Impact.” The Fourth Industrial Revolution, a term coined by WEF founder, Klaus Schwab, is characterized by a range of new technologies that fuse the physical, digital, and biological worlds, such as the Internet of Things, cloud computing, and automation.

Distinguished Professor Sang Yup Lee of the Chemical and Biomolecular Engineering Department will speak at the Experts Reception to be held on June 25, 2016 on the topic of “The Summer Davos Forum and Science and Technology in Asia.” On June 27, 2016, he will participate in two separate discussion sessions.

In the first session, entitled “What If Drugs Are Printed from the Internet?” Professor Lee will discuss the future of medicine being impacted by advancements in biotechnology and 3D printing technology with Nita A. Farahany, a Duke University professor, under the moderation of Clare Matterson, the Director of Strategy at Wellcome Trust in the United Kingdom. The discussants will note recent developments in the way patients receive their medicine, for example, downloading drugs directly from the internet and the production of yeast strains to make opioids for pain treatment through systems metabolic engineering, and predict how these emerging technologies will transform the landscape of the pharmaceutical industry in the years to come.

In the second session, “Lessons for Life,” Professor Lee will talk about how to nurture life-long learning and creativity to support personal and professional growth necessary in an era of the new industrial revolution.

During the Annual Meeting, Professors Jong-Hwan Kim of the Electrical Engineering School and David Hyunchul Shim of the Aerospace Department will host, together with researchers from Carnegie Mellon University and AnthroTronix, an engineering research and development company, a technological exhibition on robotics. Professor Kim, the founder of the internationally renowned Robot World Cup, will showcase his humanoid micro-robots that play soccer, displaying their various cutting-edge technologies such as image processing, artificial intelligence, walking, and balancing. Professor Shim will present a human-like robotic piloting system, PIBOT, which autonomously operates a simulated flight program, grabbing control sticks and guiding an airplane from takeoffs to landings.

In addition, the two professors will join Professor Lee, who is also a moderator, to host a KAIST-led session on June 26, 2016, entitled “Science in Depth: From Deep Learning to Autonomous Machines.” Professors Kim and Shim will explore new opportunities and challenges in their fields from machine learning to autonomous robotics including unmanned vehicles and drones.

Since 2011, KAIST has been participating in the World Economic Forum’s two flagship conferences, the January and June Davos Forums, to introduce outstanding talents, share their latest research achievements, and interact with global leaders.

KAIST President Steve Kang said, “It is important for KAIST to be involved in global talks that identify issues critical to humanity and seek answers to solve them, where our skills and knowledge in science and technology could play a meaningful role. The Annual Meeting in China will become another venue to accomplish this.”

I mentioned KAIST and the Ideas Lab at the 2016 Davos meeting in this Nov. 20, 2015 posting and was able to clear up my (and possible other people’s) confusion as to what the Fourth Industrial revolution might be in my Dec. 3, 2015 posting.

AquAdvantage salmon (genetically modified) approved for consumption in Canada

This is an update of the AquAdvantage salmon story covered in my Dec. 4, 2015 post (scroll down about 40% of the way). At the time, the US Food and Drug Administration (FDA) had just given approval for consumption of the fish. There was speculation there would be a long hard fight over approval in Canada. This does not seem to have been the case, according to a May 10, 2016 news item on phys.org announcing Health Canada’s approval,

Canada’s health ministry on Thursday [May 19, 2016] approved a type of genetically modified salmon as safe to eat, making it the first transgenic animal destined for Canadian dinner tables.

This comes six months after US authorities gave the green light to sell the fish in American grocery stores.

The decisions by Health Canada and the US Food and Drug Administration follow two decades of controversy over the fish, which is an Atlantic salmon injected with genes from Pacific Chinook salmon and a fish known as the ocean pout to make it grow faster.

The resulting fish, called AquAdvantage Salmon, is made by AquaBounty Technologies in Massachusetts, and can reach adult size in 16 to 18 months instead of 30 months for normal Atlantic salmon.

A May 19, 2016 BIOTECanada news release on businesswire provides more detail about one of the salmon’s Canadian connections,

Canadian technology emanating from Memorial University developed the AquAdvantage salmon by introducing a growth hormone gene from Chinook salmon into the genome of Atlantic salmon. This results in a salmon which grows faster and reaches market size quicker, and AquAdvantage salmon is otherwise identical to other farmed salmon. The AquAdvantage salmon also received US FDA approval in November 2015. With the growing world population, AquaBounty is one of many biotechnology companies offering safe and sustainable means to enhance the security and supply of food in the world. AquaBounty has improved the productivity of aquaculture through its use of biotechnology and modern breeding techniques that have led to the development of AquAdvantage salmon.

“Importantly, today’s approval is a result of a four year science-based regulatory approval process which involved four federal government departments including Agriculture and AgriFood, Canada Food Inspection Agency, Environment and Climate Change, Fisheries and Oceans and Health which demonstrates the rigour and scope of science based regulatory approvals in Canada. Coupled with the report from the [US] National Academy of Sciences today’s [May 19, 2016] approval clearly demonstrates that genetic engineering of food is not only necessary but also extremely safe,” concluded Casey [Andrew Casey, President and CEO BIOTECanada].

There’s another connection, the salmon hatcheries are based in Prince Edward Island.

While BIOTECanada’s Andrew Casey is crowing about this approval, it should be noted that there was a losing court battle with British Columbia’s Living Oceans Society and Nova Scotia’s Ecology Action Centre both challenging the federal government’s approval. They may have lost *the* battle but, as the cliché goes, ‘the war is not over yet’. There’s an issue about the lack of labeling and there’s always the possibility that retailers and/or consumers may decide to boycott the fish.

As for BIOTECanada, there’s this description from the news release,

BIOTECanada is the national industry association with more than 230 members reflecting the diverse nature of Canada’s health, industrial and agricultural biotechnology sectors. In addition to providing significant health benefits for Canadians, the biotechnology industry has quickly become an essential part of the transformation of many traditional cornerstones of the Canadian economy including manufacturing, automotive, energy, aerospace and forestry industries. Biotechnology in all of its applications from health, agriculture and industrial is offering solutions for the collective population.

You can find the BIOTECanada website here.

Personally, I’m a bit ambivalent about it all. I understand the necessity for changing our food production processes but I do think more attention should be paid to consumers’ concerns and that organizations such as BIOTECanada could do a better job of communicating.

*’the’ added on Aug. 4, 2016.

Bacteria, pyramids, cancer, and Sylvain Martel

Canada’s national newspaper (as they like to bill themselves), the Globe and Mail, featured Québec researcher Sylvain Martel’s work in a Dec. 13, 2011 article by Bertrand Marotte. From the news article,

Professor Sylvain Martel is already a world leader in the field of nano-robotics, but now he’s working to make a medical dream reality: To deliver toxic drug treatments directly to cancerous cells without damaging the body’s healthy tissue.

I have profiled Martel’s work before in an April 6, 2010 posting about bacterial nanobots (amongst other subjects) and in a March 16, 2011 posting about his work with remote-controlled microcarriers.

It seems that his next project will combine the work on bacteria and microcarriers (from the Globe and Mail article),

Bolstered by his recent success in guiding micro-carriers loaded with cancer-fighting medications into a rabbit’s liver, he and his team of up to 20 researchers from several disciplines are working to transfer the method to the treatment of colorectal cancer in humans within four years.

This time around he is not using micro-carriers to deliver the drug to the tumour, but rather bacteria.

Here’s a video of the bacteria which illustrates Martel’s earlier success with ‘training’ them to build a pyramid.

The latest breakthrough reported in March 2011 (from my posting) implemented an MRI (magnetic resonance imaging) machine,

Known for being the world’s first researcher to have guided a magnetic sphere through a living artery, Professor Martel is announcing a spectacular new breakthrough in the field of nanomedicine. Using a magnetic resonance imaging (MRI) system, his team successfully guided microcarriers loaded with a dose of anti-cancer drug through the bloodstream of a living rabbit, right up to a targeted area in the liver, where the drug was successfully administered. This is a medical first that will help improve chemoembolization, a current treatment for liver cancer.

Here’s what Martel is trying to accomplish now (from the Globe and Mail article),

The MRI machine’s magnetic field is manipulated by [a] sophisticated software program that helps guide the magnetically sensitive bacteria to the tumour mass.

Attached to the bacteria is a capsule containing the cancer-fighting drug. The bacteria are tricked into swimming to an artificially created “magnetic north” at the centre of the tumour, where they will die off after 30 to 40 minutes. The micro-mules, however, have left their precious cargo: the capsule, whose envelope breaks and releases the drug.

I’m not entirely sure why the drug won’t destroy healthy tissue after it’s finished with the tumour, but that detail is not offered in Marotte’s story which, in the last few paragraphs, switches focus from medical breakthroughs to the importance of venture capital funding for Canadian biotech research.

I wish Martel and his team great success.