Tag Archives: University of Toronto

D-PLACE: an open access database of places, language, culture, and environment

In an attempt to be a bit broader in my interpretation of the ‘society’ part of my commentary, I’m including this July 8, 2016 news item on ScienceDaily (Note: A link has been removed),

An international team of researchers has developed a website at d-place.org to help answer long-standing questions about the forces that shaped human cultural diversity.

D-PLACE — the Database of Places, Language, Culture and Environment — is an expandable, open access database that brings together a dispersed body of information on the language, geography, culture and environment of more than 1,400 human societies. It comprises information mainly on pre-industrial societies that were described by ethnographers in the 19th and early 20th centuries.

A July 8, 2016 University of Toronto news release (also on EurekAlert), which originated the news item, expands on the theme,

“Human cultural diversity is expressed in numerous ways: from the foods we eat and the houses we build, to our religious practices and political organisation, to who we marry and the types of games we teach our children,” said Kathryn Kirby, a postdoctoral fellow in the Departments of Ecology & Evolutionary Biology and Geography at the University of Toronto and lead author of the study. “Cultural practices vary across space and time, but the factors and processes that drive cultural change and shape patterns of diversity remain largely unknown.

“D-PLACE will enable a whole new generation of scholars to answer these long-standing questions about the forces that have shaped human cultural diversity.”

Co-author Fiona Jordan, senior lecturer in anthropology at the University of Bristol and one of the project leads, said, “Comparative research is critical for understanding the processes behind cultural diversity. Over a century of anthropological research around the globe has given us a rich resource for understanding the diversity of humanity – but bringing different resources and datasets together has been a huge challenge in the past.

“We’ve drawn on the emerging big data sets from ecology, and combined these with cultural and linguistic data so researchers can visualise diversity at a glance, and download data to analyse in their own projects.”

D-PLACE allows users to search by cultural practice (e.g., monogamy vs. polygamy), environmental variable (e.g., elevation, mean annual temperature), language family (e.g., Indo-European, Austronesian), or region (e.g., Siberia). The search results can be displayed on a map, a language tree or in a table, and can also be downloaded for further analysis.

It aims to enable researchers to investigate the extent to which patterns in cultural diversity are shaped by different forces, including shared history, demographics, migration/diffusion, cultural innovations, and environmental and ecological conditions.

D-PLACE was developed by an international team of scientists interested in cross-cultural research. It includes researchers from the Max Planck Institute for the Science of Human History in Jena, Germany, University of Auckland, Colorado State University, University of Toronto, University of Bristol, Yale, Human Relations Area Files, Washington University in Saint Louis, University of Michigan, American Museum of Natural History, and City University of New York.

The diverse team included linguists, anthropologists, biogeographers, data scientists, ethnobiologists and evolutionary ecologists, who employ a variety of research methods, including field-based primary data collection, compilation of cross-cultural data sources, and analyses of existing cross-cultural datasets.

“The team’s diversity is reflected in D-PLACE, which is designed to appeal to a broad user base,” said Kirby. “Envisioned users range from members of the public world-wide interested in comparing their cultural practices with those of other groups, to cross-cultural researchers interested in pushing the boundaries of existing research into the drivers of cultural change.”

Here’s a link to and a citation for the paper,

D-PLACE: A Global Database of Cultural, Linguistic and Environmental Diversity by Kathryn R. Kirby, Russell D. Gray, Simon J. Greenhill, Fiona M. Jordan, Stephanie Gomes-Ng, Hans-Jörg Bibiko, Damián E. Blasi, Carlos A. Botero, Claire Bowern, Carol R. Ember, Dan Leehr, Bobbi S. Low, Joe McCarter, William Divale, Michael C. Gavin. PLOS ONE, 2016; 11(7): e0158391. DOI: 10.1371/journal.pone.0158391. Published July 8, 2016.

This paper is open access.

You can find D-PLACE here.
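For anyone who wants to go beyond the website’s search boxes, the D-PLACE team also makes its data available for download. Here’s a minimal sketch (in Python, using the pandas library) of the kind of filtering the site performs; note that the file name and column names below are my own invented placeholders, not D-PLACE’s actual export format,

# Hypothetical sketch: filtering a downloaded D-PLACE table with pandas.
# The file name and column names are invented for illustration and are
# not the actual D-PLACE export format.
import pandas as pd

df = pd.read_csv("dplace_export.csv")  # assumed society-level export

# Keep Austronesian-speaking societies at low elevation, then inspect
# how their marriage organization was coded by ethnographers.
subset = df[
    (df["language_family"] == "Austronesian")
    & (df["elevation_m"] < 500)
]
print(subset[["society_name", "region", "marriage_organization"]])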

While it might not seem that there would be a close link between anthropology and physics, ethnographic information from the 19th and early 20th centuries can be mined for more contemporary applications. For example, someone who wants to make a case for a more diverse scientific community may want to develop a social science approach to the discussion. The situation described in my June 16, 2016 post titled: Science literacy, science advice, the US Supreme Court, and Britain’s House of Commons, could be extended into a discussion and educational process using data from D-PLACE and other sources to make the point,

Science literacy may not be just for the public; it would seem that US Supreme Court judges may not have a basic understanding of how science works. David Bruggeman’s March 24, 2016 posting (on his Pasco Phronesis blog) describes a then-current case before the Supreme Court (Justice Antonin Scalia has since died), Note: Links have been removed,

It’s a case concerning aspects of the University of Texas admissions process for undergraduates and the case is seen as a possible means of restricting race-based considerations for admission. While I think the arguments in the case will likely revolve around factors far removed from science and/or technology, there were comments raised by two Justices that struck a nerve with many scientists and engineers.

Both Justice Antonin Scalia and Chief Justice John Roberts raised questions about the validity of having diversity where science and scientists are concerned [emphasis mine]. Justice Scalia seemed to imply that diversity wasn’t essential for the University of Texas as most African-American scientists didn’t come from schools at the level of the University of Texas (considered the best university in Texas). Chief Justice Roberts was a bit more plain about not understanding the benefits of diversity. He stated, “What unique perspective does a black student bring to a class in physics?”

To that end, Dr. S. James Gates, theoretical physicist at the University of Maryland, and member of the President’s Council of Advisers on Science and Technology (and commercial actor) has an editorial in the March 25 [2016] issue of Science explaining that the value of having diversity in science does not accrue *just* to those who are underrepresented.

Dr. Gates relates his personal experience as a researcher and teacher of how people’s backgrounds inform their practice of science, and how two different people may use the same scientific method but think about the problem differently.

I’m guessing that both Scalia and Roberts, and possibly others, believe that science is the discovery and accumulation of facts. In this worldview, science facts such as gravity are waiting for discovery and formulation into a ‘law’. They do not recognize that most science is a collection of beliefs and may be influenced by personal beliefs. For example, we believe we’ve proved the existence of the Higgs boson, but no one associated with the research has ever stated unequivocally that it exists.

More generally, with D-PLACE and the recently announced Trans-Atlantic Platform (see my July 15, 2016 post about it), it seems Canada’s humanities and social sciences communities are taking strides toward greater international collaboration and a more profound investment in digital scholarship.

Taking DNA beyond genetics with living computers and nanobots

You might want to keep a salt shaker with you while reading a June 7, 2016 essay by Matteo Palma (Queen Mary University of London) about nanotechnology and DNA on The Conversation website (h/t June 7, 2016 news item on Nanowerk).

This is not a ‘hype’ piece, as Palma backs every claim with links to the research while providing a good overview of some very exciting work, but the mood is a bit euphoric, so you may want to keep the earlier-mentioned salt shaker nearby.

Palma offers a very nice beginner’s introduction, especially helpful for someone who only half-remembers their high school biology (from the June 7, 2016 essay),

DNA is one of the most amazing molecules in nature, providing a way to carry the instructions needed to create almost any lifeform on Earth in a microscopic package. Now scientists are finding ways to push DNA even further, using it not just to store information but to create physical components in a range of biological machines.

Deoxyribonucleic acid or “DNA” carries the genetic information that we, and all living organisms, use to function. It typically comes in the form of the famous double-helix shape, made up of two single-stranded DNA molecules folded into a spiral. Each of these is made up of a series of four different types of molecular component: adenine (A), guanine (G), thymine (T), and cytosine (C).

Genes are made up from different sequences of these building block components, and the order in which they appear in a strand of DNA is what encodes genetic information. But by precisely designing different A,G,T and C sequences, scientists have recently been able to develop new ways of folding DNA into different origami shapes, beyond the conventional double helix.

This approach has opened up new possibilities of using DNA beyond its genetic and biological purpose, turning it into a Lego-like material for building objects that are just a few billionths of a metre in diameter (nanoscale). DNA-based materials are now being used for a variety of applications, ranging from templates for electronic nano-devices, to ways of precisely carrying drugs to diseased cells.
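A small aside of my own: the ‘Lego-like’ designability rests on Watson-Crick base pairing, where A binds T and G binds C, so writing down one strand fixes the strand that will stick to it. An illustrative Python snippet (mine, not Palma’s) that computes the pairing strand for a designed sequence,

# Illustrative only: find the strand that will bind a designed DNA
# sequence, using Watson-Crick base pairing (A-T, G-C).
PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    # The binding strand runs antiparallel, hence the reversal.
    return "".join(PAIRS[base] for base in reversed(seq))

print(reverse_complement("ATGCGT"))  # -> ACGCAT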

He highlights some Canadian work,

Designing electronic devices that are just nanometres in size opens up all sorts of possible applications but makes it harder to spot defects. As a way of dealing with this, researchers at the University of Montreal have used DNA to create ultrasensitive nanoscale thermometers that could help find minuscule hotspots in nanodevices (which would indicate a defect). They could also be used to monitor the temperature inside living cells.

The nanothermometers are made using loops of DNA that act as switches, folding or unfolding in response to temperature changes. This movement can be detected by attaching optical probes to the DNA. The researchers now want to build these nanothermometers into larger DNA devices that can work inside the human body.

He also mentions the nanobots that will heal your body (according to many works of fiction),

Researchers at Harvard Medical School have used DNA to design and build a nanosized robot that acts as a drug delivery vehicle to target specific cells. The nanorobot comes in the form of an open barrel made of DNA, whose two halves are connected by a hinge held shut by special DNA handles. These handles can recognise combinations of specific proteins present on the surface of cells, including ones associated with diseases.

When the robot comes into contact with the right cells, it opens the container and delivers its cargo. When applied to a mixture of healthy and cancerous human blood cells, these robots showed the ability to target and kill half of the cancer cells, while the healthy cells were left unharmed.

Palma is describing a very exciting development and there are many teams worldwide working on ways to make drugs more effective and less side-effect-ridden. However, there does seem to be a bit of a problem with targeted drug delivery, as noted in my April 27, 2016 posting,

According to an April 27, 2016 news item on Nanowerk, researchers at the University of Toronto (Canada) along with their collaborators in the US (Harvard Medical School) and Japan (University of Tokyo) have determined that less than 1% of nanoparticle-based drugs reach their intended destination …

Less than 1%? Admittedly, nanoparticles are not the same as nanobots, but the problem is in the delivery. From my April 27, 2016 posting,

… the authors argue that, in order to increase nanoparticle delivery efficiency, a systematic and coordinated long-term strategy is necessary. To build a strong foundation for the field of cancer nanomedicine, researchers will need to understand a lot more about the interactions between nanoparticles and the body’s various organs than they do today. …

I imagine nanobots will suffer a similar fate since the actual delivery mechanism to a targeted cell is still a mystery.

I quite enjoyed Palma’s essay and appreciated the links he provided. My only proviso: keep a salt shaker nearby. That rosy future is going to take a while to get here.

Encapsulation of proteins in nanoparticles no longer necessary for time release?

A team of researchers at the University of Toronto (Canada) has developed a technique for the therapeutic use of proteins that doesn’t require ‘nanoencapsulation’, although nanoparticles are still used, according to a May 27, 2016 news item on ScienceDaily,

A U of T [University of Toronto] Engineering team has designed a simpler way to keep therapeutic proteins where they are needed for long periods of time. The discovery is a potential game-changer for the treatment of chronic illnesses or injuries that often require multiple injections or daily pills.

For decades, biomedical engineers have been painstakingly encapsulating proteins in nanoparticles to control their release. Now, a research team led by University Professor Molly Shoichet has shown that proteins can be released over several weeks, even months, without ever being encapsulated. In this case the team looked specifically at therapeutic proteins relevant to tissue regeneration after stroke and spinal cord injury.

“It was such a surprising and unexpected discovery,” said co-lead author Dr. Irja Elliott Donaghue, who first found that the therapeutic protein NT3, a factor that promotes the growth of nerve cells, was slowly released when just mixed into a Jello-like substance that also contained nanoparticles. “Our first thought was, ‘What could be happening to cause this?'”

A May 27, 2016 University of Toronto news release (also on EurekAlert) by Marit Mitchell, which originated the news item, provides a more in-depth explanation,

Proteins hold enormous promise to treat chronic conditions and irreversible injuries — for example, human growth hormone is encapsulated in these tiny polymeric particles, and used to treat children with stunted growth. In order to avoid repeated injections or daily pills, researchers use complicated strategies both to deliver proteins to their site of action, and to ensure they’re released over a long enough period of time to have a beneficial effect.

This has long been a major challenge for protein-based therapies, especially because proteins are large and often fragile molecules. Until now, investigators have been treating proteins the same way as small drug molecules and encapsulating them in polymeric nanoparticles, often made of a material called poly(lactic-co-glycolic acid) or PLGA.

As the nanoparticles break down, the drug molecules escape. The same process is true for proteins; however, the encapsulating process itself often damages or denatures some of the encapsulated proteins, rendering them useless for treatment. Skipping encapsulation altogether means fewer denatured proteins, making for more consistent protein therapeutics that are easier to make and store.

“This is really exciting from a translational perspective,” said PhD candidate Jaclyn Obermeyer. “Having a simpler, more reliable fabrication process leaves less room for complications with scale-up for clinical use.”

The three lead authors, Elliott Donaghue, Obermeyer and Dr. Malgosia Pakulska, have shown that to get the desired controlled release, proteins only need to be alongside the PLGA nanoparticles, not inside them. …

“We think that this could speed up the path for protein-based drugs to get to the clinic,” said Elliott Donaghue.

The mechanism for this encapsulation-free controlled release is surprisingly elegant. Shoichet’s group mixes the proteins and nanoparticles in a Jello-like substance called a hydrogel, which keeps them localized when injected at the site of injury. The positively charged proteins and negatively charged nanoparticles naturally stick together. As the nanoparticles break down they make the solution more acidic, weakening the attraction and letting the proteins break free.

“We are particularly excited to show long-term, controlled protein release by simply controlling the electrostatic interactions between proteins and polymeric nanobeads,” said Shoichet. “By manipulating the pH of the solution, the size and number of nanoparticles, we can control release of bioactive proteins. This has already changed and simplified the protein release strategies that we are pursuing in pre-clinical models of disease in the brain and spinal cord.”

“We’ve learned how to control this simple phenomena,” Pakulska said. “Our next question is whether we can do the opposite—design a similar release system for positively charged nanoparticles and negatively charged proteins.”
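To make the mechanism a little more concrete, here’s a toy numerical sketch of my own (emphatically not the Shoichet group’s model): PLGA degradation slowly acidifies the gel, and as the pH drops the electrostatic attraction weakens and protein is freed. Every number and the functional form are invented for illustration,

# Toy illustration (not the authors' model): PLGA hydrolysis acidifies
# the hydrogel; lower pH weakens the electrostatic attraction between
# positively charged proteins and negatively charged nanoparticles,
# releasing protein over time. All parameters are invented.
import math

def local_ph(day, ph0=7.4, rate=0.02):
    # Assumed linear acidification as the nanoparticles break down.
    return max(ph0 - rate * day, 4.0)

def fraction_bound(ph, ph_half=5.5, steepness=2.0):
    # Assumed sigmoid: binding weakens as the solution acidifies.
    return 1.0 / (1.0 + math.exp(-steepness * (ph - ph_half)))

for day in range(0, 121, 20):
    released = 1.0 - fraction_bound(local_ph(day))
    print(f"day {day:3d}: fraction released ~{released:.2f}")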

Here’s a link to and a citation for the paper,

Encapsulation-free controlled release: Electrostatic adsorption eliminates the need for protein encapsulation in PLGA nanoparticles by Malgosia M. Pakulska, Irja Elliott Donaghue, Jaclyn M. Obermeyer, Anup Tuladhar, Christopher K. McLaughlin, Tyler N. Shendruk, and Molly S. Shoichet. Science Advances 27 May 2016: Vol. 2, No. 5, e1600519. DOI: 10.1126/sciadv.1600519

This paper appears to be open access.

Dr. Molly Shoichet was featured here in a May 11, 2015 posting about the launch of her Canada-wide science communication project Research2Reality.

Split some water molecules and save solar and wind (energy) for a future day

Professor Ted Sargent’s research team at the University of Toronto has developed a new technique for saving the energy harvested by solar and wind farms, according to a March 28, 2016 news item on Nanotechnology Now,

We can’t control when the wind blows and when the sun shines, so finding efficient ways to store energy from alternative sources remains an urgent research problem. Now, a group of researchers led by Professor Ted Sargent at the University of Toronto’s Faculty of Applied Science & Engineering may have a solution inspired by nature.

The team has designed the most efficient catalyst for storing energy in chemical form, by splitting water into hydrogen and oxygen, just like plants do during photosynthesis. Oxygen is released harmlessly into the atmosphere, and hydrogen, as H2, can be converted back into energy using hydrogen fuel cells.

Discovering a better way of storing energy from solar and wind farms is “one of the grand challenges in this field,” Ted Sargent says (photo above by Megan Rosenbloom via flickr) Courtesy: University of Toronto

A March 24, 2016 University of Toronto news release by Marit Mitchell, which originated the news item, expands on the theme,

“Today on a solar farm or a wind farm, storage is typically provided with batteries. But batteries are expensive, and can typically only store a fixed amount of energy,” says Sargent. “That’s why discovering a more efficient and highly scalable means of storing energy generated by renewables is one of the grand challenges in this field.”

You may have seen the popular high-school science demonstration where the teacher splits water into its component elements, hydrogen and oxygen, by running electricity through it. Today this requires so much electrical input that it’s impractical to store energy this way — too great a proportion of the energy generated is lost in the process of storing it.

This new catalyst facilitates the oxygen-evolution portion of the chemical reaction, making the conversion from H2O into O2 and H2 more energy-efficient than ever before. The intrinsic efficiency of the new catalyst material is over three times higher than that of the best state-of-the-art catalyst.
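For readers who want the underlying chemistry, the half-reactions involved are standard textbook material (not specific to this paper). In alkaline conditions,

\begin{align*}
\text{oxygen evolution:}\quad & 4\,\mathrm{OH^-} \rightarrow \mathrm{O_2} + 2\,\mathrm{H_2O} + 4e^- \\
\text{hydrogen evolution:}\quad & 4\,\mathrm{H_2O} + 4e^- \rightarrow 2\,\mathrm{H_2} + 4\,\mathrm{OH^-} \\
\text{overall:}\quad & 2\,\mathrm{H_2O} \rightarrow 2\,\mathrm{H_2} + \mathrm{O_2}, \qquad E^{0} = 1.23\ \mathrm{V}
\end{align*}

The oxygen-evolution step is the sluggish one, and a catalyst is judged by its overpotential, \( \eta = E_{\text{applied}} - E^{0} \), the extra voltage needed beyond the 1.23 V thermodynamic minimum; the smaller the overpotential, the less energy is wasted in storage.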

Details are offered in the news release,

The new catalyst is made of abundant and low-cost metals tungsten, iron and cobalt, which are much less expensive than state-of-the-art catalysts based on precious metals. It showed no signs of degradation over more than 500 hours of continuous activity, unlike other efficient but short-lived catalysts. …

“With the aid of theoretical predictions, we became convinced that including tungsten could lead to a better oxygen-evolving catalyst. Unfortunately, prior work did not show how to mix tungsten homogeneously with the active metals such as iron and cobalt,” says one of the study’s lead authors, Dr. Bo Zhang … .

“We invented a new way to distribute the catalyst homogenously in a gel, and as a result built a device that works incredibly efficiently and robustly.”

This research united engineers, chemists, materials scientists, mathematicians, physicists, and computer scientists across three countries. A chief partner in this joint theoretical-experimental study was a leading team of theorists at Stanford University and the SLAC National Accelerator Laboratory under the leadership of Dr. Aleksandra Vojvodic. The international collaboration included researchers at East China University of Science & Technology, Tianjin University, Brookhaven National Laboratory, the Canadian Light Source and the Beijing Synchrotron Radiation Facility.

“The team developed a new materials synthesis strategy to mix multiple metals homogeneously — thereby overcoming the propensity of multi-metal mixtures to separate into distinct phases,” said Jeffrey C. Grossman, the Morton and Claire Goulder and Family Professor in Environmental Systems at Massachusetts Institute of Technology. “This work impressively highlights the power of tightly coupled computational materials science with advanced experimental techniques, and sets a high bar for such a combined approach. It opens new avenues to speed progress in efficient materials for energy conversion and storage.”

“This work demonstrates the utility of using theory to guide the development of improved water-oxidation catalysts for further advances in the field of solar fuels,” said Gary Brudvig, a professor in the Department of Chemistry at Yale University and director of the Yale Energy Sciences Institute.

“The intensive research by the Sargent group in the University of Toronto led to the discovery of oxy-hydroxide materials that exhibit electrochemically induced oxygen evolution at the lowest overpotential and show no degradation,” said University Professor Gabor A. Somorjai of the University of California, Berkeley, a leader in this field. “The authors should be complimented on the combined experimental and theoretical studies that led to this very important finding.”

Here’s a link to and a citation for the paper,

Homogeneously dispersed, multimetal oxygen-evolving catalysts by Bo Zhang, Xueli Zheng, Oleksandr Voznyy, Riccardo Comin, Michal Bajdich, Max García-Melchor, Lili Han, Jixian Xu, Min Liu, Lirong Zheng, F. Pelayo García de Arquer, Cao Thang Dinh, Fengjia Fan, Mingjian Yuan, Emre Yassitepe, Ning Chen, Tom Regier, Pengfei Liu, Yuhang Li, Phil De Luna, Alyf Janmohamed, Huolin L. Xin, Huagui Yang, Aleksandra Vojvodic, Edward H. Sargent. Science 24 Mar 2016. DOI: 10.1126/science.aaf1525

This paper is behind a paywall.

University of Toronto (Canada) researchers and lab-grown heart and liver tissue (person-on-a-chip)

Such devices are usually called ‘human-on-a-chip’, but a team at the University of Toronto has developed a two-organ ‘person-on-a-chip’, according to a March 7, 2016 news item on phys.org (Note: Links have been removed),

Researchers at U of T [University of Toronto] Engineering have developed a new way of growing realistic human tissues outside the body. Their “person-on-a-chip” technology, called AngioChip, is a powerful platform for discovering and testing new drugs, and could eventually be used to repair or replace damaged organs.

Professor Milica Radisic (IBBME, ChemE), graduate student Boyang Zhang and the rest of the team are among those research groups around the world racing to find ways to grow human tissues in the lab, under conditions that mimic a real person’s body. They have developed unique methods for manufacturing small, intricate scaffolds for individual cells to grow on. These artificial environments produce cells and tissues that resemble the real thing more closely than those grown lying flat in a petri dish.

The team’s recent creations have included Biowire™—an innovative method of growing heart cells around a silk suture—as well as a scaffold for heart cells that snaps together like sheets of Velcro. But AngioChip takes tissue engineering to a whole new level. “It’s a fully three-dimensional structure complete with internal blood vessels,” says Radisic. “It behaves just like vasculature, and around it there is a lattice for other cells to attach and grow.” …

A March 7, 2016 University of Toronto news release (also on EurekAlert), which originated the news item, provides more detail about the AngioChip,

Zhang built the scaffold out of POMaC, a polymer that is both biodegradable and biocompatible. The scaffold is built out of a series of thin layers, stamped with a pattern of channels that are each about 50 to 100 micrometres wide. The layers, which resemble computer microchips, are then stacked into a 3D structure of synthetic blood vessels. As each layer is added, UV light is used to cross-link the polymer and bond it to the layer below.

When the structure is finished, it is bathed in a liquid containing living cells. The cells quickly attach to the inside and outside of the channels and begin growing just as they would in the human body.

“Previously, people could only do this using devices that squish the cells between sheets of silicone and glass,” says Radisic. “You needed several pumps and vacuum lines to run just one chip. Our system runs in a normal cell culture dish, and there are no pumps; we use pressure heads to perfuse media through the vasculature. The wells are open, so you can easily access the tissue.”

Using the platform, the team has built model versions of both heart and liver tissues that function like the real thing. “Our liver actually produced urea and metabolized drugs,” says Radisic. They can connect the blood vessels of the two artificial organs, thereby modelling not just the organs themselves, but the interactions between them. They’ve even injected white blood cells into the vessels and watched as they squeezed through gaps in the vessel wall to reach the tissue on the other side, just as they do in the human body.

The news release also mentions potential markets and the work that needs to be accomplished before AngioChip is available for purchase,

AngioChip has great potential in the field of pharmaceutical testing. Current drug-testing methods, such as animal testing and controlled clinical trials, are costly and fraught with ethical concerns. Testing on lab-grown human tissues would provide a realistic model at a fraction of the cost, but this area of research is still in its infancy. “In the last few years, it has become possible to order cultures of human cells for testing, but they’re grown on a plate, a two-dimensional environment,” says Radisic. “They don’t capture all the functional hallmarks of a real heart muscle, for example.”

A more realistic platform like AngioChip could enable drug companies to detect dangerous side effects and interactions between organ compartments long before their products reach the market, saving countless lives. It could also be used to understand and validate the effectiveness of current drugs and even to screen libraries of chemical compounds to discover new drugs. Through TARA Biosystems Inc., a spin-off company co-founded by Radisic, the team is already working on commercializing the technology.

In future, Radisic envisions her lab-grown tissues being implanted into the body to repair organs damaged by disease. Because the cells used to seed the platform can come from anyone, the new tissues could be genetically identical to the intended host, reducing the risk of organ rejection. Even in its current form, the team has shown that the AngioChip can be implanted into a living animal, its artificial blood vessels connected to a real circulatory system. The polymer scaffolding itself simply biodegrades after several months.

The team still has much work to do. Each AngioChip is currently made by hand; if the platform is to be used industrially, the team will need to develop high-throughput manufacturing methods to create many copies at once. Still, the potential is obvious. “It really is multifunctional, and solves many problems in the tissue engineering space,” says Radisic. “It’s truly next-generation.”

Here’s a link to and a citation for the paper,

Biodegradable scaffold with built-in vasculature for organ-on-a-chip engineering and direct surgical anastomosis by Boyang Zhang, Miles Montgomery, M. Dean Chamberlain, Shinichiro Ogawa, Anastasia Korolj, Aric Pahnke, Laura A. Wells, Stéphane Massé, Jihye Kim, Lewis Reis, Abdul Momen, Sara S. Nunes, Aaron R. Wheeler, Kumaraswamy Nanthakumar, Gordon Keller, Michael V. Sefton, & Milica Radisic. Nature Materials (2016). DOI: 10.1038/nmat4570. Published online 07 March 2016

This paper is behind a paywall.

The researchers have made two images illustrating their work available. There’s this still image,

These tiny polymer scaffolds contain channels that are about 100 micrometres wide, about the same diameter as a human hair. When seeded with cells, the channels act as artificial blood vessels. By mimicking tissues in the human heart and other organs, these scaffolds provide a new way to test drugs for potentially dangerous side effects. (Image: Tyler Irving/Boyang Zhang/Kevin Soobrian)

Perhaps more intriguing is this one,

When seeded with heart cells, the flexible polymer scaffold contracts with a regular rhythm, just like real heart tissue. (Image: Boyang Zhang)

I have mentioned ‘human-on-a-chip’ projects many times here and as the news release writer notes, there is an international race. My July 1, 2015 posting (cross-posted from the June 30, 2015 posting [Testing times: the future of animal alternatives] on the International Innovation blog [a CORDIS-listed project dissemination partner for FP7 and H2020 projects]) notes a couple of those projects,

Organ-on-a-chip projects use stem cells to create human tissues that replicate the functions of human organs. Discussions about human-on-a-chip activities – a phrase used to describe 10 interlinked organ chips – were a highlight of the 9th World Congress on Alternatives to Animal Testing held in Prague, Czech Republic, last year. One project highlighted at the event was a joint US National Institutes of Health (NIH), US Food and Drug Administration (FDA) and US Defense Advanced Research Projects Agency (DARPA) project led by Dan Tagle that claimed it would develop functioning human-on-a-chip by 2017. However, he and his team were surprisingly close-mouthed and provided few details making it difficult to assess how close they are to achieving their goal.

By contrast, Uwe Marx – Leader of the ‘Multi-Organ-Chip’ programme in the Institute of Biotechnology at the Technical University of Berlin and Scientific Founder of TissUse, a human-on-a-chip start-up company – claims to have sold two-organ chips. He also claims to have successfully developed a four-organ chip and says he is on his way to building a human-on-a-chip. Though these chips remain to be seen, if they do exist they will integrate microfluidics, cultured cells and materials patterned at the nanoscale to mimic various organs, and will allow chemical testing in an environment that somewhat mirrors a human.

As for where the University of Toronto efforts fit into the race, I don’t know for sure. It’s the first time I’ve come across a reference to liver tissue producing urea, but I believe there’s at least one other team, in China, that has achieved a three-dimensional, more lifelike liver tissue, as described in my Jan. 29, 2016 posting ‘Constructing a liver’.

A demonstration of quantum surrealism

The Canadian Institute for Advanced Research (CIFAR) has announced some intriguing new research results. A Feb. 19, 2016 news item on ScienceDaily gets the ball rolling,

New research demonstrates that particles at the quantum level can in fact be seen as behaving something like billiard balls rolling along a table, and not merely as the probabilistic smears that the standard interpretation of quantum mechanics suggests. But there’s a catch — the tracks the particles follow do not always behave as one would expect from “realistic” trajectories, but often in a fashion that has been termed “surrealistic.”

A Feb. 19, 2016 CIFAR news release by Kurt Kleiner, which originated the news item, offers the kind of explanation that allows an amateur such as myself to understand the principles (while I’m reading it). Thank you, Kurt Kleiner,

In a new version of an old experiment, CIFAR Senior Fellow Aephraim Steinberg (University of Toronto) and colleagues tracked the trajectories of photons as the particles traced a path through one of two slits and onto a screen. But the researchers went further, and observed the “nonlocal” influence of another photon that the first photon had been entangled with.

The results counter a long-standing criticism of an interpretation of quantum mechanics called the De Broglie-Bohm theory. Detractors of this interpretation had faulted it for failing to explain the behaviour of entangled photons realistically. For Steinberg, the results are important because they give us a way of visualizing quantum mechanics that’s just as valid as the standard interpretation, and perhaps more intuitive.

“I’m less interested in focusing on the philosophical question of what’s ‘really’ out there. I think the fruitful question is more down to earth. Rather than thinking about different metaphysical interpretations, I would phrase it in terms of having different pictures. Different pictures can be useful. They can help shape better intuitions.”

At stake is what is “really” happening at the quantum level. The uncertainty principle tells us that we can never know both a particle’s position and momentum with complete certainty. And when we do interact with a quantum system, for instance by measuring it, we disturb the system. So if we fire a photon at a screen and want to know where it will hit, we’ll never know for sure exactly where it will hit or what path it will take to get there.

The standard interpretation of quantum mechanics holds that this uncertainty means that there is no “real” trajectory between the light source and the screen. The best we can do is to calculate a “wave function” that shows the odds of the photon being in any one place at any time, but won’t tell us where it is until we make a measurement.
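For those who like the formulas, the two statements above are, in standard quantum mechanics,

\[
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad P(x) = |\psi(x)|^{2},
\]

that is, the spreads in position and momentum cannot both be made arbitrarily small, and the wave function ψ yields only the probability of finding the photon at a given position x.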

Yet another interpretation, called the De Broglie-Bohm theory, says that the photons do have real trajectories that are guided by a “pilot wave” that accompanies the particle. The wave is still probabilistic, but the particle takes a real trajectory from source to target. It doesn’t simply “collapse” into a particular location once it’s measured.

In 2011 Steinberg and his colleagues showed that they could follow trajectories for photons by subjecting many identical particles to measurements so weak that the particles were barely disturbed, and then averaging out the information. This method showed trajectories that looked similar to classical ones — say, those of balls flying through the air.
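The quantity reconstructed in such experiments is, in the de Broglie-Bohm picture, a local velocity field. For a massive particle the standard weak-measurement expression (from the research literature; it isn’t spelled out in the news release) is

\[
v(x) \;=\; \frac{1}{m}\,\mathrm{Re}\!\left[\frac{\langle x|\hat{p}|\psi\rangle}{\langle x|\psi\rangle}\right],
\]

with the photon experiments measuring the analogous local transverse momentum, averaged over many barely disturbed particles detected near each position x.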

But critics had pointed out a problem with this viewpoint. Quantum mechanics also tells us that two particles can be entangled, so that a measurement of one particle affects the other. The critics complained that in some cases, a measurement of one particle would lead to an incorrect prediction of the trajectory of the entangled particle. They coined the term “surreal trajectories” to describe them.

In the most recent experiment, Steinberg and colleagues showed that the surrealism was a consequence of non-locality — the fact that the particles were able to influence one another instantaneously at a distance. In fact, the “incorrect” predictions of trajectories by the entangled photon were actually a consequence of where in their course the entangled particles were measured. Considering both particles together, the measurements made sense and were consistent with real trajectories.

Steinberg points out that both the standard interpretation of quantum mechanics and the De Broglie-Bohm interpretation are consistent with experimental evidence, and are mathematically equivalent. But it is helpful in some circumstances to visualize real trajectories, rather than wave function collapses, he says.

An image illustrating the work has been provided,

On the left, a still image from an animation of reconstructed trajectories for photons going through a double-slit. A second photon “measures” which slit each photon traversed, so no interference results on the screen. The image on the right shows the polarisation of this second, “probe.” Credit: Dylan Mahler Courtesy: CIFAR

Here’s a link to and a citation for the paper,

Experimental nonlocal and surreal Bohmian trajectories by Dylan H. Mahler, Lee Rozema, Kent Fisher, Lydia Vermeyden, Kevin J. Resch, Howard M. Wiseman, and Aephraim Steinberg. Science Advances 19 Feb 2016: Vol. 2, No. 2, e1501466. DOI: 10.1126/sciadv.1501466

This article appears to be open access.

Shape-shifting nanoparticles for better chemotherapy from the University of Toronto (Canada)

A research team from the University of Toronto and its shape-shifting nanoparticles are being touted in a Feb. 19, 2016 news item on Nanowerk,

Chemotherapy isn’t supposed to make your hair fall out — it’s supposed to kill cancer cells. A new molecular delivery system created at U of T [University of Toronto] Engineering could help ensure that chemotherapy drugs get to their target while minimizing collateral damage.

Many cancer drugs target fast-growing cells. Injected into a patient, they swirl around in the bloodstream acting on fast-growing cells wherever they find them. That includes tumours, but unfortunately also hair follicles, the lining of your digestive system, and your skin.

U of T Engineering Professor Warren Chan has spent the last decade figuring out how to deliver chemotherapy drugs into tumours — and nowhere else. Now his lab has designed a set of nanoparticles attached to strands of DNA that can change shape to gain access to diseased tissue.

A Feb. 18, 2016 University of Toronto news release (also on EurekAlert), which originated the news item, expands on the theme,

“Your body is basically a series of compartments,” says Chan. “Think of it as a giant house with rooms inside. We’re trying to figure out how to get something that’s outside, into one specific room. One has to develop a map and a system that can move through the house where each path to the final room may have different restrictions such as height and width.”

One thing we know about cancer: no two tumours are identical. Early-stage breast cancer, for example, may react differently to a given treatment than pancreatic cancer, or even breast cancer at a more advanced stage. Which particles can get inside which tumours depends on multiple factors such as the particle’s size, shape and surface chemistry.

Chan and his research group have studied how these factors dictate the delivery of small molecules and nanotechnologies to tumours, and have now designed a targeted molecular delivery system that uses modular nanoparticles whose shape, size and chemistry can be altered by the presence of specific DNA sequences.

“We’re making shape-changing nanoparticles,” says Chan. “They’re a series of building blocks, kind of like a LEGO set.” The component pieces can be built into many shapes, with binding sites exposed or hidden. They are designed to respond to biological molecules by changing shape, like a key fitting into a lock.

These shape-shifters are made of minuscule chunks of metal with strands of DNA attached to them. Chan envisions that the nanoparticles will float around harmlessly in the blood stream, until a DNA strand binds to a sequence of DNA known to be a marker for cancer. When this happens, the particle changes shape, then carries out its function: it can target the cancer cells, expose a drug molecule to the cancerous cell, tag the cancerous cells with a signal molecule, or whatever task Chan’s team has designed the nanoparticle to carry out.

“We were inspired by the ability of proteins to alter their conformation — they somehow figure out how to alleviate all these delivery issues inside the body,” says Chan. “Using this idea, we thought, ‘Can we engineer a nanoparticle to function like a protein, but one that can be programmed outside the body with medical capabilities?’”

Applying nanotechnology and materials science to medicine, and particularly to targeted drug delivery, is still a relatively new concept, but one Chan sees as full of promise. The real problem is how to deliver enough of the nanoparticles directly to the cancer to produce an effective treatment.

“Here’s how we look at these problems: it’s like you’re going to Vancouver from Toronto, but no one tells you how to get there, no one gives you a map, or a plane ticket, or a car — that’s where we are in this field,” he says. “The idea of targeting drugs to tumours is like figuring out how to go to Vancouver. It’s a simple concept, but to get there isn’t simple if not enough information is provided.”

“We’ve only scratched the surface of how nanotechnology ‘delivery’ works in the body, so now we’re continuing to explore different details of why and how tumours and other organs allow or block certain things from getting in,” adds Chan.

He and his group plan to apply the delivery system they’ve designed toward personalized nanomedicine — further tailoring their particles to deliver drugs to your precise type of tumour, and nowhere else.

Here are links to and citations for the team’s two published papers,

DNA-controlled dynamic colloidal nanoparticle systems for mediating cellular interaction by Seiichi Ohta, Dylan Glancy, Warren C. W. Chan. Science 19 Feb 2016: Vol. 351, Issue 6275, pp. 841-845. DOI: 10.1126/science.aad4925

Tailoring nanoparticle designs to target cancer based on tumor pathophysiology by Edward A. Sykes, Qin Dai, Christopher D. Sarsons, Juan Chen, Jonathan V. Rocheleau, David M. Hwang, Gang Zheng, David T. Cramb, Kristina D. Rinker, and Warren C. W. Chan. PNAS, published online Feb. 16, 2016. DOI: 10.1073/pnas.1521265113

Both papers are behind paywalls.

University of New Brunswick (Canada), ‘sun in a can’, and buckyballs

Cutting the cost of making solar cells could be a step in the right direction for more widespread adoption. At any rate, that seems to be the motivation for Dr. Felipe Chibante of the University of New Brunswick and his team as they’ve worked for the past three years or so on cutting production costs for fullerenes (also known as buckminsterfullerenes, C60, and buckyballs). From a Dec. 23, 2015 article by Michael Tutton for the Canadian Press,

A heating system so powerful it gave its creator a sunburn from three metres away is being developed by a New Brunswick engineering professor as a method to sharply reduce the costs of making the carbon used in some solar cells.

Felipe Chibante says his “sun in a can” method of warming carbon at more than 5,000 degrees Celsius helps create the stable carbon 60 needed in more flexible forms of photovoltaic panels.

Tutton includes some technical explanations in his article,

Chibante and senior students at the University of New Brunswick created the system to heat baseball-sized lumps of plasma — a form of matter composed of positively charged gas particles and free-floating negatively charged electrons — at his home and later in a campus lab.

According to a May 22, 2012 University of New Brunswick news release, Chibante received funding of almost $1.5M from the Atlantic Canada Opportunities Agency for his work with fullerenes,

Dr. Felipe Chibante, associate professor in UNB’s department of chemical engineering, and his team at the Applied Nanotechnology Lab received nearly $1.5 million to lower the cost of fullerenes, which is the molecular form of pure carbon and is a critical ingredient for the plastic solar cell market.

Dr. Chibante and the collaborators on the project have developed fundamental synthesis methods that will be integrated in a unique plasma reactor to result in a price reduction of 50-75 per cent.

Dr. Chibante and his work were also featured in a June 10, 2013 news item on CBC (Canadian Broadcasting Corporation) news online,

Judges with the New Brunswick Innovation Fund like the idea and recently awarded Chibante $460,000 to continue his research at the university’s Fredericton campus.

Chibante has a long history of working with fullerenes — carbon molecules that can store the sun’s energy. He was part of the research team that discovered fullerenes in 1985 [three of the researchers, working at Rice University, Texas, received the 1996 Nobel Prize in Chemistry for the work].

He says they can be added to liquid, spread over plastic and shingles and marketed as a cheaper way to convert sunlight into electricity.

“What we’re trying to do in New Brunswick with the science research and innovation is we’re really trying to get the maximum bang for the buck,” said Chibante.

As it stands, fullerenes cost about $15,000 per kilogram. Chibante hopes to lower the cost by a factor of 10.

The foundation investment brings Chibante’s research funding to about $6.2 million.

Not everyone is entirely sold on this approach to encouraging solar energy adoption (from the CBC news item),

The owner of Urban Pioneer, a Fredericton [New Brunswick] company that sells alternative energy products, likes the concept, but doubts there’s much of a market in New Brunswick.

“We have conventional solar panels right now and they’re not that popular,” said Tony Craft.

“So I can’t imagine, like, when you throw something completely brand new into it, I don’t know how people are going to respond to that even, so it may be a very tough sell,” he said.

Getting back to Chibante’s breakthrough (from Tutton’s Dec. 23, 2015 article),

The 52-year-old researcher says he first set up the system to operate in his garage.

He installed optical filters to watch the melting process but said the light from the plasma was so intense that he later noticed a sunburn on his neck.

The plasma is placed inside a container that can contain and cool the extremely hot material without exposing it to the air.

The conversion technology has the advantage of not using solvents and doesn’t produce the carbon dioxide that other baking systems use, says Chibante.

He says the next stage is finding commercial partners who can help his team further develop the system, which was originally designed and patented by French researcher Laurent Fulcheri.

Chibante said he doesn’t believe the carbon-based, thin-film solar cells will displace the silicon-based cells because they capture less energy.

But he nonetheless sees a future for the more flexible sheets of solar cells.

“You can make fibres, you can make photovoltaic threads and you get into wearable, portable forms of power that makes it more ubiquitous rather than having to carry a big, rigid structure,” he said.

The researcher says the agreement earlier this month [Nov. 30 – Dec. 12, 2015] in Paris among 200 countries to begin reducing the use of fossil fuels and slow global warming may help his work.

By the way, Chibante estimates production costs for fullerenes, when using his system, would be less than $50/kilogram for what is now the highest-priced component of carbon-based solar cells.

There is another researcher in Canada who works in the field of solar energy, Dr. Ted Sargent at the University of Toronto (Ontario). He largely focuses on harvesting solar energy by using quantum dots. I last featured Sargent’s quantum dot work in a Dec. 9, 2014 posting.

Quantum teleportation

It’s been two years (my Aug. 16, 2013 posting features a German-Japanese collaboration) since the last quantum teleportation posting here. First, a little visual stimulation,

Captain James T Kirk (credit: http://www.comicvine.com/james-t-kirk/4005-20078/)

Captain Kirk, also known as William Shatner, is from Montréal, Canada, and that’s not the only Canadian connection to this story, which is really about some research at the University of York (UK). From an Oct. 1, 2015 news item on Nanotechnology Now,

Mention the word ‘teleportation’ and for many people it conjures up “Beam me up, Scottie” images of Captain James T Kirk.

But in the last two decades quantum teleportation – transferring the quantum structure of an object from one place to another without physical transmission – has moved from the realms of Star Trek fantasy to tangible reality.

A Sept. 30, 2015 University of York press release, which originated the news item, describes the quantum teleportation research problem and solution,

Quantum teleportation is an important building block for quantum computing, quantum communication and quantum network and, eventually, a quantum Internet. While theoretical proposals for a quantum Internet already exist, the problem for scientists is that there is still debate over which of various technologies provides the most efficient and reliable teleportation system. This is the dilemma which an international team of researchers, led by Dr Stefano Pirandola of the Department of Computer Science at the University of York, set out to resolve.

In a paper published in Nature Photonics, the team, which included scientists from the Freie Universität Berlin and the Universities of Tokyo and Toronto [emphasis mine], reviewed the theoretical ideas around quantum teleportation focusing on the main experimental approaches and their attendant advantages and disadvantages.

None of the technologies alone provide a perfect solution, so the scientists concluded that a hybridisation of the various protocols and underlying structures would offer the most fruitful approach.

For instance, systems using photonic qubits work over distances up to 143 kilometres, but they are probabilistic in that only 50 per cent of the information can be transported. To resolve this, such photon systems may be used in conjunction with continuous variable systems, which are 100 per cent effective but currently limited to short distances.
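A back-of-the-envelope illustration of my own (not from the press release) shows why this matters: if each link in a chain succeeds with probability p and there is no way to store a success while the other links are retried, the end-to-end success probability is

\[
P_{\text{end-to-end}} = p^{\,n}, \qquad \text{e.g. } (0.5)^{10} \approx 0.001,
\]

so a ten-link relay built from 50-per-cent links would succeed roughly once in a thousand attempts. Quantum memories remove that exponential penalty by holding each successful link until the rest catch up, which is why the hybrid approach leans on them.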

Most importantly, teleportation-based optical communication needs an interface with suitable matter-based quantum memories where quantum information can be stored and further processed.

Dr Pirandola, who is also a member of the York Centre for Quantum Technologies, said: “We don’t have an ideal or universal technology for quantum teleportation. The field has developed a lot but we seem to need to rely on a hybrid approach to get the best from each available technology.

“The use of quantum teleportation as a building block for a quantum network depends on its integration with quantum memories. The development of good quantum memories would allow us to build quantum repeaters, therefore extending the range of teleportation. They would also give us the ability to store and process the transmitted quantum information at local quantum computers.

“This could ultimately form the backbone of a quantum Internet. The revised hybrid architecture will likely rely on teleportation-based long-distance quantum optical communication, interfaced with solid state devices for quantum information processing.”

Here’s a link to and a citation for the paper,

Advances in quantum teleportation by S. Pirandola, J. Eisert, C. Weedbrook, A. Furusawa, & S. L. Braunstein. Nature Photonics 9, 641–652 (2015). DOI: 10.1038/nphoton.2015.154. Published online 29 September 2015

This paper is behind a paywall.

Kavli Foundation roundtable on artificial synthesis as a means to produce clean fuel

A Sept. 9, 2015 news item on Azonano features a recent roundtable discussion about artificial photosynthesis and clean fuel held by the Kavli Foundation,

Imagine creating artificial plants that make gasoline and natural gas using only sunlight. And imagine using those fuels to heat our homes or run our cars without adding any greenhouse gases to the atmosphere. By combining nanoscience and biology, researchers led by scientists at University of California, Berkeley, have taken a big step in that direction.

Peidong Yang, a professor of chemistry at Berkeley and co-director of the school’s Kavli Energy NanoSciences Institute, leads a team that has created an artificial leaf that produces methane, the primary component of natural gas, using a combination of semiconducting nanowires and bacteria. The research, detailed in the online edition of Proceedings of the National Academy of Sciences in August, builds on a similar hybrid system, also recently devised by Yang and his colleagues, that yielded butanol, a component in gasoline, and a variety of biochemical building blocks.

The research is a major advance toward synthetic photosynthesis, a type of solar power based on the ability of plants to transform sunlight, carbon dioxide and water into sugars. Instead of sugars, however, synthetic photosynthesis seeks to produce liquid fuels that can be stored for months or years and distributed through existing energy infrastructure.
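For scale, the overall stoichiometry being chased here is basic chemistry (not specific to Yang’s system): the light-driven reverse of methane combustion,

\[
\mathrm{CO_2} + 2\,\mathrm{H_2O} \;\xrightarrow{\;h\nu\;}\; \mathrm{CH_4} + 2\,\mathrm{O_2},
\]

so energy captured from sunlight is banked in the fuel’s chemical bonds and returned, without adding net carbon to the atmosphere, when the methane is burned back to carbon dioxide and water.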

In a [Kavli Foundation] roundtable discussion on his recent breakthroughs and the future of synthetic photosynthesis, Yang said his hybrid inorganic/biological systems give researchers new tools to study photosynthesis — and learn its secrets.

There is a list of the participants and an edited transcript of the roundtable, which took place sometime during summer 2015, on the Kavli Foundation’s ‘Fueling up: How nanoscience is creating a new type of solar power’ webpage (Note: Links have been removed),

The participants were:

PEIDONG YANG – is professor of chemistry and Chan Distinguished Professor of Energy at University of California, Berkeley, and co-director of the Kavli Energy NanoScience Institute at Berkeley National Laboratory and UC Berkeley. He serves as director of the California Research Alliance by BASF, and was a founding member of the U.S. Department of Energy (DOE) Joint Center for Artificial Photosynthesis (JCAP).
THOMAS MOORE – is Regents’ Professor of Chemistry and Biochemistry and past director of the Center for Bioenergy & Photosynthesis at Arizona State University. He is a past president of the American Society for Photobiology, and a team leader at the Center for Bio-Inspired Solar Fuel Production.
TED SARGENT – is a University Professor of Electrical and Computer Engineering at the University of Toronto where he is vice-dean for research for the Faculty of Applied Science and Engineering. He holds the Canada Research Chair in Nanotechnology and is a founder of two companies, InVisage Technologies and Xagenic.

THE KAVLI FOUNDATION (TKF): Solar cells do a good job of converting sunlight into electricity. Converting light into fuel seems far more complicated. Why go through the bother?

THOMAS MOORE: That’s a good question. In order to create sustainable, solar-driven societies, we need a way to store solar energy. With solar cells, we can make electricity efficiently, but we cannot conveniently store that electricity to use when it is cloudy or at night. If we want to stockpile large quantities of energy, we have to store it as chemical energy, the way it is locked up in coal, oil, natural gas, hydrogen and biomass.

PEIDONG YANG: I agree. Perhaps, one day, researchers will come up with an effective battery to store photoelectric energy produced by solar cells. But photosynthesis can solve the energy conversion and storage problem in one step. It converts and stores solar energy in the chemical bonds of organic molecules.

TED SARGENT: Much of the globe’s power infrastructure, from automobiles, trucks and planes to gas-fired electrical generators, is built upon carbon-based fossil fuels. So creating a new technology that can generate liquid fuels that can use this infrastructure is a very powerful competitive advantage for a renewable energy technology.

For someone who’s interested in solar energy and fuel issues, this discussion provides a good introduction to some of what’s driving the research and, happily, none of these scientists are proselytizing.

One final comment. Ted Sargent has been mentioned here several times in connection with his work on solar cells and/or quantum dots.