Tag Archives: University of Tokyo

Happy Thanksgiving! Oct. 12, 2015, my last mention of science debates in the Canadian 2015 federal election, and my 4001st posting

Two things for me to celebrate today: Thanksgiving (in Canada, we celebrate on the 2nd Monday of October) and my 4001st posting (this one).

Science for the people

Plus, there’s much to celebrate about science discussion during the 2015 Canadian federal election. I stumbled across Science for the People, which is a weekly radio show based in Canada (from the About page),

Science for the People is a syndicated radio show and podcast that broadcasts weekly across North America. We are a long-format interview show that explores the connections between science, popular culture, history, and public policy, to help listeners understand the evidence and arguments behind what’s in the news and on the shelves.

Every week, our hosts sit down with science researchers, writers, authors, journalists, and experts to discuss science from the past, the science that affects our lives today, and how science might change our future.


If you have comments, show ideas, or questions about Science for the People, email feedback@scienceforthepeople.ca.

Theme Song

Our theme song music comes from the song “Binary Consequence” by the band Fractal Pattern. You can find the full version of it on their album No Hope But Mt. Hope.

License & Copyright

All Science for the People episodes are under the Creative Commons license. You are free to distribute unedited versions of the episodes for non-commercial purposes. If you would like to edit the episode please contact us.

Episode #338 (2015 Canadian federal election and science) was originally broadcast on Oct. 9, 2015 and features,

This week, we’re talking about politics, and the prospects for pro-science politicians, parties and voters in Canada. We’ll spend the hour with panelists Katie Gibbs, Executive Director of Evidence for Democracy, science librarian John Dupuis, journalist Mike De Souza, and former Canadian government scientist Steven Campana, for an in-depth discussion about the treatment of science by the current Canadian government, and what’s at stake for science in the upcoming federal election.

The podcast is approximately one hour long and Désirée Schell (sp?) hosts/moderates an interesting discussion in which one of the participants notes that issues about science and the muzzling of scientists predate Harper, dating them back to the Chrétien/Martin years. Note: Jean Chrétien was Prime Minister from 1993 to 2003 and Paul Martin, his successor, was Prime Minister from 2003 to 2006, when he was succeeded by the current Prime Minister, Stephen Harper. (I attended a Philosophers’ Cafe event on Oct. 1, 2015 where the moderator dated the issues back to the Mulroney years. Note: Brian Mulroney was Prime Minister from 1984 to 1993.) So, it’s been 10, 20, or 30 years depending on your viewpoint and when you started noticing (assuming you’re of an age to have noticed something happening 30 years ago).

The participants also spent some time discussing why Canadians would care about science. Interestingly, one of the speakers claimed the current Syrian refugee crisis has its roots in climate change, a science issue, and he noted the US Dept. of Defense views climate change as a threat multiplier. For anyone who doesn’t know, the US Dept. of Defense funds a lot of science research.

It’s a far-ranging discussion, which doesn’t really touch on science as an election issue until some 40 minutes into the podcast.

One day later, on Oct. 10, 2015, the Canadian Broadcasting Corporation’s Quirks & Quarks radio programme broadcast, and made available as a podcast, a 2015 Canadian election science debate/panel,

There is just over a week to go before Canadians head to the polls to elect a new government. But one topic that hasn’t received much attention on the campaign trail is science.

So we thought we’d gather together candidates from each of the major federal parties to talk about science and environmental issues in this election.

We asked each of them where they and their parties stood on federal funding of science; basic vs. applied research; the controversy around federal scientists being permitted to speak about their research, and how to cut greenhouse gas emissions while protecting jobs and the economy.

Our panel of candidates were:

– Lynne Quarmby, The Green Party candidate [and Green Party Science critic] in Burnaby North-Seymour, and professor and Chair of the Department of Molecular Biology and Biochemistry at Simon Fraser University

– Gary Goodyear, Conservative Party candidate in Cambridge, Ontario, and former Minister of State for Science and Technology

– Marc Garneau, Liberal Party candidate in NDG-Westmount, and a former Canadian astronaut

– Megan Leslie, NDP candidate in Halifax and her party’s environment critic

It was a crackling debate. Gary Goodyear was the biggest surprise in that he was quite vigorous and informed in his defence of the government’s track record. Unfortunately, he was also quite patronizing.

The others didn’t seem to have as much information and data at their fingertips. Goodyear quoted OECD reports of Canada doing well in the sciences, and they had no statistics of their own to offer as a counterargument. Quarmby, Garneau, and Leslie did at one time or another come back strongly on one point or another, but none of them seriously damaged Goodyear’s defence. I can’t help wondering if Kennedy Stewart (NDP science critic), Laurin Liu (NDP deputy science critic), or Ted Hsu (Liberal science critic) might have been better choices for this debate.

The Quirks & Quarks debate ran approximately 40 or 45 minutes, with the remainder of the broadcast devoted to Arthur B. McDonald, Canadian co-winner of the 2015 Nobel Prize in Physics (shared with Takaaki Kajita of the University of Tokyo) for the discovery of neutrino oscillations, which show that neutrinos have mass.

Kate Allen writing an Oct. 9, 2015 article for thestar.com got a preview of the pretaped debate and excerpted a few of the exchanges,

On science funding

Gary Goodyear: Currently, we spend more than twice what the Liberals spent in their last year. We have not cut science, and in fact our science budget this year is over $10 billion. But the strategy is rather simple. We are very strong in Canada on basic research. Where we fall down sometimes as compared to other countries is moving the knowledge that we discover in our laboratories out of the laboratory onto our factory floors where we can create jobs, and then off to the hospitals and living rooms of the world — which is how we make that home run. No longer is publishing an article the home run, as it once was.

Lynne Quarmby: I would take issue with the statement that science funding is robust in this country … The fact is that basic scientific research is at starvation levels. Truly fundamental research, without an obvious immediate application, is starving. And that is the research that is feeding the creativity — it’s the source of new ideas, and new understanding about the world, that ultimately feeds innovation.

If you’re looking for a good representation of the discussion and you don’t have time to listen to the podcast, Allen’s article is a good choice.

Finally, Research2Reality, a science outreach and communication project I profiled earlier in 2015, has produced an Oct. 9, 2015 election blog posting by Karyn Ho, which, in addition to the usual ‘science is dying in Canada’ talk, includes links to more information and to the official party platforms, as well as an exhortation to get out there and vote.

Something seems to be in the air as voter turnout for the advance polls is somewhere from 24% to 34% higher than usual.

Happy Thanksgiving!

ETA Oct. 14, 2015:  There’s been some commentary about the Quirks & Quarks debate elsewhere. First, there’s David Bruggeman’s Oct. 13, 2015 post on his Pasco Phronesis blog (Note: Links have been removed),

Chalk it up to being a Yank who doesn’t give Canadian science policy his full attention, but one thing (among several) I learned from the recent Canadian cross-party science debate concerns open access policy.

As I haven’t posted anything on Canadian open access policies since 2010, clearly I need to catch up.  I am assuming Goodyear is referring to the Tri-Agency Open Access Policy, introduced in February by his successor as Minister of State for Science and Technology.  It applies to all grants issued from May 1, 2015 and forward (unless the work was already applicable to preexisting government open access policy), and applies most of the open access policy of the Canadian Institutes for Health Research (CIHR) to the other major granting agencies (the Natural Sciences and Engineering Research Council of Canada and the Social Sciences and Humanities Research Council of Canada).

The policy establishes that grantees must make research articles coming from their grants available free to the public within 12 months of publication. …
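The embargo rule quoted above lends itself to a simple date calculation. Here is a toy sketch of it (my own illustration, not official guidance; the function name is hypothetical, and reading “12 months” as one calendar year is an assumption):

```python
from datetime import date

# The Tri-Agency policy, as quoted above, applies to grants issued
# from May 1, 2015 onward and requires free public access within
# 12 months of publication.
POLICY_START = date(2015, 5, 1)

def open_access_deadline(grant_issued: date, published: date):
    """Return the latest date the article may remain paywalled,
    or None if this particular policy does not apply to the grant."""
    if grant_issued < POLICY_START:
        return None  # possibly covered by pre-existing policies instead
    # "within 12 months" read here as the same calendar day one year later
    try:
        return published.replace(year=published.year + 1)
    except ValueError:  # a Feb 29 publication date maps to Feb 28
        return published.replace(year=published.year + 1, day=28)

print(open_access_deadline(date(2015, 6, 1), date(2016, 3, 15)))  # 2017-03-15
```
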

Then, there’s Michael Rennie, an Assistant Professor at Lakehead University and a former Canadian government scientist whose Oct. 14, 2015 posting on his unmuzzled science blog notes this,

This [Gary Goodyear’s debate presentation] pissed me off so much it made me come out of retirement on this blog.

Listening to Gary Goodyear (Conservative representative, and MP in Cambridge and former Minister of State for Science and Technology), I became furious with the level of misinformation given. …

Rennie went ahead and Storified the Twitter responses to Goodyear’s comments (Note: Links have been removed),

Here’s my Storify of tweets that help clarify a good deal of the misinformation Gary Goodyear presented during the debate, as well as some rebuttals from folks who are in the know: I was a Canadian Government Scientist with DFO [Department of Fisheries and Oceans] from 2010-2014, and was a Research Scientist at the Experimental Lakes Area [ELA], who heard about the announcement regarding the intention of the government to close the facility first-hand on the telephone at ELA.

Goodyear: “I was involved in that decision. With respect to the Experimental Lakes, we never said we would shut it down. We said that we wanted to transfer it to a facility that was better suited to operate it. And that’s exactly what we’ve done. Right now, DFO is up there undertaking some significant remediation effects to clean up those lakes that are contaminated by the science that’s been going on up there. We all hope these lakes will recover soon so that science and experimentation can continue but not under the federal envelope. So it’s secure and it’s misleading to suggest that we were trying to stop science there.”
There are so many inaccuracies in here, it’s hard to know where to start. First, Goodyear’s assertion that there are “contaminated lakes” at ELA is nonsense. Experiments conducted there are done using environmentally-relevant exposures; in other words, what you’d see going on somewhere else on earth, and in every case, each lake has recovered to its natural state, simply by stopping the experiment.

Second, there ARE experiments going on at ELA currently, many of which I am involved in; the many tours, classes and researchers on site this year can attest to this.

Third, this “cleanup” that is ongoing is to clean up all the crap that was left behind by DFO staff during 40 years of experiments – wood debris, old gear, concrete, basically junk that was left on the shorelines of lakes. No “lake remediation” to speak of.

Fourth, the conservative government DID stop science at ELA – no new experiments were permitted to begin, even ones that were already funded and on the books, like the nanosilver experiment, which was halted until 2014, jeopardizing the futures of many students involved. Only basic monitoring occurred between 2012-2014.

Last, the current government deserves very little credit for the transfer of ELA to another operator; the successful move was conceived and implemented largely by other people and organizations, and the attempts made by the government to try and move the facility to a university were met with incredulity by the deans and vice presidents invited to the discussion.

There’s a lot more and I strongly recommend reading Rennie’s Storify piece.

It was unfortunate that the representatives from the other parties were not able to seriously question Goodyear’s points.

Perhaps next time (fingers crossed), the representatives from the various parties will be better prepared. I’d also like to suggest that there be some commentary from experts afterwards in the same way the leaders’ debates are followed by commentary. And while I’m dreaming, maybe there could be an opportunity for phone-in or Twitter questions.

Quantum teleportation

It’s been two years (my Aug. 16, 2013 posting features a German-Japanese collaboration) since the last quantum teleportation posting here. First, a little visual stimulation,

Captain James T Kirk (credit: http://www.comicvine.com/james-t-kirk/4005-20078/)


Captain Kirk, also known as William Shatner, is from Montréal, Canada, and that’s not the only Canadian connection to this story, which is really about some research at the University of York (UK). From an Oct. 1, 2015 news item on Nanotechnology Now,

Mention the word ‘teleportation’ and for many people it conjures up “Beam me up, Scottie” images of Captain James T Kirk.

But in the last two decades quantum teleportation – transferring the quantum structure of an object from one place to another without physical transmission — has moved from the realms of Star Trek fantasy to tangible reality.

A Sept. 30, 2015 University of York (UK) press release, which originated the news item, describes the quantum teleportation research problem and solution,

Quantum teleportation is an important building block for quantum computing, quantum communication and quantum network and, eventually, a quantum Internet. While theoretical proposals for a quantum Internet already exist, the problem for scientists is that there is still debate over which of various technologies provides the most efficient and reliable teleportation system. This is the dilemma which an international team of researchers, led by Dr Stefano Pirandola of the Department of Computer Science at the University of York, set out to resolve.

In a paper published in Nature Photonics, the team, which included scientists from the Freie Universität Berlin and the Universities of Tokyo and Toronto [emphasis mine], reviewed the theoretical ideas around quantum teleportation focusing on the main experimental approaches and their attendant advantages and disadvantages.

None of the technologies alone provide a perfect solution, so the scientists concluded that a hybridisation of the various protocols and underlying structures would offer the most fruitful approach.

For instance, systems using photonic qubits work over distances up to 143 kilometres, but they are probabilistic in that only 50 per cent of the information can be transported. To resolve this, such photon systems may be used in conjunction with continuous variable systems, which are 100 per cent effective but currently limited to short distances.

Most importantly, teleportation-based optical communication needs an interface with suitable matter-based quantum memories where quantum information can be stored and further processed.

Dr Pirandola, who is also a member of the York Centre for Quantum Technologies, said: “We don’t have an ideal or universal technology for quantum teleportation. The field has developed a lot but we seem to need to rely on a hybrid approach to get the best from each available technology.

“The use of quantum teleportation as a building block for a quantum network depends on its integration with quantum memories. The development of good quantum memories would allow us to build quantum repeaters, therefore extending the range of teleportation. They would also give us the ability to store and process the transmitted quantum information at local quantum computers.

“This could ultimately form the backbone of a quantum Internet. The revised hybrid architecture will likely rely on teleportation-based long-distance quantum optical communication, interfaced with solid state devices for quantum information processing.”
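To make the “50 per cent” figure in the press release concrete: with linear optics alone, a Bell-state measurement can unambiguously identify only two of the four Bell states, so on average only half of teleportation attempts are heralded as successful. A toy Monte Carlo sketch of that bookkeeping (my own illustration, not code from the paper):

```python
import random

# Each teleportation attempt projects onto one of the four Bell states
# with equal probability. A standard beam-splitter Bell-state analyzer
# can unambiguously identify only the psi+ and psi- pair; the other
# outcomes are ambiguous and the attempt is discarded.
def teleport_attempt(rng):
    bell_outcome = rng.choice(["phi+", "phi-", "psi+", "psi-"])
    return bell_outcome in ("psi+", "psi-")  # heralded success

rng = random.Random(0)
trials = 100_000
success_rate = sum(teleport_attempt(rng) for _ in range(trials)) / trials
print(f"heralded success rate ≈ {success_rate:.3f}")  # hovers around 0.5
```
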

Here’s a link to and a citation for the paper,

Advances in quantum teleportation by S. Pirandola, J. Eisert, C. Weedbrook, A. Furusawa, & S. L. Braunstein. Nature Photonics 9, 641–652 (2015). DOI: 10.1038/nphoton.2015.154. Published online 29 September 2015.

This paper is behind a paywall.


Japanese researchers note the emergence of the ‘Devil’s staircase’

I wanted to know why it’s called the ‘Devil’s staircase’ and this is what I found. According to Wikipedia, several things go by that name; I gather the scientists are referring to the Cantor function (mathematics). From the Wikipedia entry (Note: Links have been removed),

In mathematics, the Cantor function is an example of a function that is continuous, but not absolutely continuous. It is also referred to as the Cantor ternary function, the Lebesgue function, Lebesgue’s singular function, the Cantor-Vitali function, the Devil’s staircase,[1] the Cantor staircase function,[2] and the Cantor-Lebesgue function.[3]

Here’s a diagram illustrating the Cantor function (from the Wikipedia entry),

CC BY-SA 3.0 File:CantorEscalier.svg Uploaded by Theon Created: January 24, 2009
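For readers who want to see the staircase numerically rather than in a diagram: the Cantor function can be evaluated from the ternary expansion of its argument – read the base-3 digits up to the first 1, map every 2 to 1, and reinterpret the digits in base 2. A minimal sketch (my own illustration, not connected to the physics research below):

```python
def cantor(x, depth=40):
    """Cantor ('Devil's staircase') function on [0, 1], computed from
    the ternary expansion of x truncated at the first digit 1."""
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    value, scale = 0.0, 0.5
    for _ in range(depth):
        x *= 3
        digit = int(x)
        x -= digit
        if digit == 1:                    # a ternary 1 ends the expansion
            return value + scale
        value += scale * (digit // 2)     # ternary 0 -> bit 0, ternary 2 -> bit 1
        scale /= 2
    return value

# Constant on every middle-third gap: the "stairs" of the staircase.
print(cantor(0.4), cantor(0.5), cantor(0.6))  # all 0.5
```

The function climbs from 0 to 1 while being flat almost everywhere, which is what makes the staircase “devilish.”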

As for this latest ‘Devil’s staircase’, a June 17, 2015 news item on Nanowerk announces the research (Note: A link has been removed),

Researchers at the University of Tokyo have revealed a novel magnetic structure named the “Devil’s staircase” in cobalt oxides using soft X-rays (“Observation of a Devil’s Staircase in the Novel Spin-Valve System SrCo6O11“). This is an important result since the researchers succeeded in determining the detailed magnetic structure of a very small single crystal invisible to the human eye.

A June 17, 2015 University of Tokyo press release, which originated the news item on Nanowerk, describes why this research is now possible and the impact it could have,

Recent remarkable progress in resonant soft x-ray diffraction performed in synchrotron facilities has made it possible to determine spin ordering (magnetic structure) in small-volume samples including thin films and nanostructures, and thus is expected to lead not only to advances in materials science but also application to spintronics, a technology which is expected to form the basis of future electronic devices. Cobalt oxide is known as one material that is suitable for spintronics applications, but its magnetic structure was not fully understood.

The research group of Associate Professor Hiroki Wadati at the University of Tokyo Institute for Solid State Physics, together with researchers at Kyoto University and in Germany, performed a resonant soft X-ray diffraction study of cobalt (Co) oxides at the synchrotron facility BESSY II in Germany. They observed all the spin orderings that are theoretically possible and determined how these orderings change with the application of magnetic fields. The plateau-like behavior of the magnetic structure as a function of magnetic field is called the “Devil’s staircase,” and this is the first such discovery in spin systems in 3d transition metal oxides, which include the oxides of cobalt, iron, and manganese.

Further resonant soft X-ray diffraction studies can be expected to find similar “Devil’s staircase” behavior in other materials, and increasing the spatial resolution of microscopic observation of the “Devil’s staircase” may lead to the development of novel types of spintronics materials.

Here’s an example of the ‘cobalt’ Devil’s staircase,


The magnetic structure that gives rise to the Devil’s Staircase
Magnetization (vertical axis) of cobalt oxide shows plateau like behaviors as a function of the externally-applied magnetic field (horizontal axis). The researchers succeeded in determining the magnetic structures which create such plateaus. Red and blue arrows indicate spin direction.
© 2015 Hiroki Wadati.

Here’s a link to and a citation for the paper,

Observation of a Devil’s Staircase in the Novel Spin-Valve System SrCo6O11 by T. Matsuda, S. Partzsch, T. Tsuyama, E. Schierle, E. Weschke, J. Geck, T. Saito, S. Ishiwata, Y. Tokura, and H. Wadati. Phys. Rev. Lett. 114, 236403. Published 11 June 2015 (issue: Vol. 114, Iss. 23, 12 June 2015). DOI: 10.1103/PhysRevLett.114.236403

This paper is behind a paywall.

Magnetic sensitivity under the microscope

Humans do not have the sense of magnetoreception (the ability to detect magnetic fields) unless they’ve been enhanced. Species of fish, insects, birds, and some mammals (other than humans), on the other hand, possess the sense naturally. Scientists at the University of Tokyo (Japan) have developed a microscope capable of observing magnetoreception, according to a June 4, 2015 news item on Nanowerk (Note: A link has been removed),

Researchers at the University of Tokyo have succeeded in developing a new microscope capable of observing the magnetic sensitivity of photochemical reactions believed to be responsible for the ability of some animals to navigate in the Earth’s magnetic field, on a scale small enough to follow these reactions taking place inside sub-cellular structures (Angewandte Chemie International Edition, “Optical Absorption and Magnetic Field Effect Based Imaging of Transient Radicals”).

A June 4, 2015 University of Tokyo news release on EurekAlert, which originated the news item, describes the research in more detail,

Several species of insects, fish, birds and mammals are believed to be able to detect magnetic fields – an ability known as magnetoreception. For example, birds are able to sense the Earth’s magnetic field and use it to help navigate when migrating. Recent research suggests that a group of proteins called cryptochromes and particularly the molecule flavin adenine dinucleotide (FAD) that forms part of the cryptochrome, are implicated in magnetoreception. When cryptochromes absorb blue light, they can form what are known as radical pairs. The magnetic field around the cryptochromes determines the spins of these radical pairs, altering their reactivity. However, to date there has been no way to measure the effect of magnetic fields on radical pairs in living cells.

The research group of Associate Professor Jonathan Woodward at the Graduate School of Arts and Sciences are specialists in radical pair chemistry and investigating the magnetic sensitivity of biological systems. In this latest research, PhD student Lewis Antill made measurements using a special microscope to detect radical pairs formed from FAD, and the influence of very weak magnetic fields on their reactivity, in volumes less than 4 millionths of a billionth of a liter (4 femtoliters). This was possible using a technique the group developed called TOAD (transient optical absorption detection) imaging, employing a microscope built by postdoctoral research associate Dr. Joshua Beardmore based on a design by Beardmore and Woodward.

“In the future, using another mode of the new microscope called MIM (magnetic intensity modulation), also introduced in this work, it may be possible to directly image only the magnetically sensitive regions of living cells,” says Woodward. “The new imaging microscope developed in this research will enable the study of the magnetic sensitivity of photochemical reactions in a variety of important biological and other contexts, and hopefully help to unlock the secrets of animals’ miraculous magnetic sense.”

Here’s a link to and a citation for the paper,

Optical Absorption and Magnetic Field Effect Based Imaging of Transient Radicals by Dr. Joshua P. Beardmore, Lewis M. Antill, and Prof. Jonathan R. Woodward. Angewandte Chemie International Edition DOI: 10.1002/anie.201502591 Article first published online: 3 JUN 2015

© 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.

I mentioned human enhancement earlier with regard to magnetoreception. There are people (body hackers) who’ve had implants that give them this extra sense. Dann Berg in a March 21, 2012 post on his website blog (iamdann.com) describes why he implanted a magnet into his finger and his experience with it (at that time, three years and counting),

I quickly learned that magnetic surfaces provided almost no sensation at all. Rather, it was movement that caused my finger to perk up. Things like power cord transformers, microwaves, and laptop fans became interactive in a whole new way. Each object has its own unique field, with different strength and “texture.” I started holding my finger over almost everything that I could, getting a feeling for each object’s invisible reach.

Portable electronics proved to be an experience as well. There were two fairly large electronic items that hit the shelves around the same time as I got my implant: the first iPad and the Kindle 2.

Something to consider,

Courtesy: iamdann.com (Dann Berg)


A city of science in Japan: Kawasaki (Kanagawa)

Happily, I’m getting more information from Japan, most of it about nanotechnology. Given Japan’s prominence in this field of endeavour, I’ve long felt FrogHeart has not adequately represented Japanese contributions. Now that I’m receiving English-language translations, I hope to better address the situation.

This morning (March 26, 2015), there were two news releases in my mailbox from Kawasaki INnovation Gateway at SKYFRONT (KING SKYFRONT), Coastal Area International Strategy Office, Kawasaki City, Japan. Before getting on to the news releases, here’s a little about the city of Kawasaki and its innovation gateway. From the Kawasaki, Kanagawa entry in Wikipedia (Note: Links have been removed),

Kawasaki (川崎市 Kawasaki-shi?) is a city in Kanagawa Prefecture, Japan, located between Tokyo and Yokohama. It is the 9th most populated city in Japan and one of the main cities forming the Greater Tokyo Area and Keihin Industrial Area.

Kawasaki occupies a belt of land stretching about 30 kilometres (19 mi) along the south bank of the Tama River, which divides it from Tokyo. The eastern end of the belt, centered on JR Kawasaki Station, is flat and largely consists of industrial zones and densely built working-class housing, while the western end is mountainous and more suburban. The coastline of Tokyo Bay is occupied by vast heavy industrial complexes built on reclaimed land.

There is a 2014 video about Kawasaki’s innovation gateway, which despite its 14 mins. 39 secs. running time I am embedding here. (Caution: They highlight their animal testing facility at some length.)

Now on to the two news releases. The first concerns research on gold nanoparticles that was published in 2014. From a March 26, 2015 Kawasaki INnovation Gateway news release,

Gold nanoparticles size up to cancer treatment

Incorporating gold nanoparticles helps optimise treatment carrier size and stability to improve delivery of cancer treatment to cells.

Treatments that attack cancer cells through the targeted silencing of cancer genes could be developed using small interfering RNA molecules (siRNA). However, delivering the siRNA into the cells intact is a challenge, as it is readily degraded by enzymes in the blood and small enough to be eliminated from the bloodstream by kidney filtration. Now Kazunori Kataoka at the University of Tokyo and colleagues at the Tokyo Institute of Technology have designed a protective treatment delivery vehicle with optimum stability and size for delivering siRNA to cells.

The researchers formed a polymer complex with a single siRNA molecule. The siRNA-loaded complex was then bonded to a 20 nm gold nanoparticle, which, thanks to advances in synthesis techniques, can be produced with a reliably low size distribution. The resulting nanoarchitecture had the optimum overall size – small enough to infiltrate cells while large enough to escape kidney filtration and accumulate in tumors.

In an assay containing heparin – a biological anti-coagulant with a high negative charge density – the complex alone was found to release the siRNA due to electrostatic interactions. However, when the gold nanoparticle was incorporated, the complex remained stable. Instead, release of the siRNA from the complex with the gold nanoparticle could be triggered once inside the cell by the presence of glutathione, which occurs in high concentrations in intracellular fluid. The glutathione bonded with the gold nanoparticles and the complex, detaching them from each other and leaving the siRNA prone to release.

The researchers further tested their carrier in a subcutaneous tumour model. The authors concluded that the complex bonded to the gold nanoparticle “enabled the efficient tumor accumulation of siRNA and significant in vivo gene silencing effect in the tumor, demonstrating the potential for siRNA-based cancer therapies.”

The news release provides links to the March 2015 newsletter which highlights this research and to the specific article and video,

March 2015 Issue of Kawasaki SkyFront iNewsletter: http://inewsletter-king-skyfront.jp/en/


Feature video on Professor Kataoka’s research : http://inewsletter-king-skyfront.jp/en/video_feature/vol_3/feature01/

Research highlights: http://inewsletter-king-skyfront.jp/en/research_highlights/vol_3/research01/

Here’s a link to and a citation for the paper,

Precise Engineering of siRNA Delivery Vehicles to Tumors Using Polyion Complexes and Gold Nanoparticles by Hyun Jin Kim, Hiroyasu Takemoto, Yu Yi, Meng Zheng, Yoshinori Maeda, Hiroyuki Chaya, Kotaro Hayashi, Peng Mi, Frederico Pittella, R. James Christie, Kazuko Toh, Yu Matsumoto, Nobuhiro Nishiyama, Kanjiro Miyata, and Kazunori Kataoka. ACS Nano, 2014, 8 (9), pp 8979–8991 DOI: 10.1021/nn502125h Publication Date (Web): August 18, 2014
Copyright © 2014 American Chemical Society

This article is behind a paywall.

The second March 26, 2015 Kawasaki INnovation Gateway news release concerns a DNA chip and food-borne pathogens,

Rapid and efficient DNA chip technology for testing 14 major types of food borne pathogens

Conventional methods for testing food-borne pathogens are based on the cultivation of pathogens, a process that is complicated and time consuming. So there is demand for alternative methods of testing for food-borne pathogens that are simpler, quicker, and applicable to a wide range of potential applications.

Now Toshiba Ltd and the Kawasaki City Institute for Public Health have collaborated in the development of a rapid and efficient automatic abbreviated DNA detection technology that can test for 14 major types of food-borne pathogens. The so-called ‘DNA chip card’ employs electrochemical DNA chips and overcomes the complicated procedures associated with conventional genetic testing methods. The ‘DNA chip card’ is expected to find applications in hygiene management in food manufacture, pharmaceuticals, and cosmetics.


The automatic abbreviated DNA detection technology, the ‘DNA chip card’, was developed by Toshiba Ltd in collaboration with the Kawasaki City Institute for Public Health and used to simultaneously detect 14 different types of food-borne pathogens in less than 90 minutes. The detection sensitivity depends on the target pathogen and ranges from 10¹ to 10⁵ cfu/mL (colony-forming units per millilitre).

Notably, such tests would usually take 4-5 days using conventional methods based on pathogen cultivation. Furthermore, in contrast to conventional DNA protocols that require high levels of skill and expertise, the ‘DNA chip card’ only requires the operator to inject nucleic acid, making the procedure easier to use and eliminating the need for specialized operating skills.

Examples of pathogens associated with food poisoning that were tested with the “DNA chip card”:

Enterohemorrhagic Escherichia coli
Vibrio parahaemolyticus
Staphylococcus aureus
Enterotoxigenic Escherichia coli
Enteroaggregative Escherichia coli
Enteropathogenic Escherichia coli
Clostridium perfringens
Bacillus cereus
Vibrio cholerae

I think 14 is the highest number of tests I’ve seen for one of these chips. This chip is quite an achievement.

One final bit from the news release about the DNA chip provides a brief description of the gateway and something they call King SkyFront,


The Kawasaki INnovation Gateway (KING) SKYFRONT is the flagship science and technology innovation hub of Kawasaki City. KING SKYFRONT is a 40-hectare area located in the Tonomachi area of the Keihin Industrial Region, which spans Tokyo and Kanagawa Prefecture, close to Tokyo International Airport (often referred to as Haneda Airport).

KING SKYFRONT was launched in 2013 as a base for scholars, industrialists and government administrators to work together to devise real life solutions to global issues in the life sciences and environment.

I find this emphasis on the city interesting. It seems that cities are becoming increasingly important and active where science research and development are concerned. Europe has adopted a biennial event in which a city is declared a European City of Science in conjunction with the EuroScience Open Forum (ESOF) conferences. The first such city was Dublin in 2012 (I believe the Irish came up with the concept themselves); the banner was later adopted by Copenhagen for 2014, and the latest city to embrace it will be Manchester in 2016.

Quantum teleportation from a Japan-Germany collaboration

An Aug. 15, 2013 Johannes Gutenberg University Mainz press release (also on EurekAlert) has somewhat gobsmacked me with its talk of teleportation,

By means of the quantum-mechanical entanglement of spatially separated light fields, researchers in Tokyo and Mainz have managed to teleport photonic qubits with extreme reliability. This means that a decisive breakthrough has been achieved some 15 years after the first experiments in the field of optical teleportation. The success of the experiment conducted in Tokyo is attributable to the use of a hybrid technique in which two conceptually different and previously incompatible approaches were combined. “Discrete digital optical quantum information can now be transmitted continuously – at the touch of a button, if you will,” explained Professor Peter van Loock of Johannes Gutenberg University Mainz (JGU). As a theoretical physicist, van Loock advised the experimental physicists in the research team headed by Professor Akira Furusawa of the University of Tokyo on how they could most efficiently perform the teleportation experiment to ultimately verify the success of quantum teleportation.

The press release goes on to describe quantum teleportation,

Quantum teleportation involves the transfer of arbitrary quantum states from a sender, dubbed Alice, to a spatially distant receiver, named Bob. This requires that Alice and Bob initially share an entangled quantum state across the space in question, e.g., in the form of entangled photons. Quantum teleportation is of fundamental importance to the processing of quantum information (quantum computing) and quantum communication. Photons are especially valued as ideal information carriers for quantum communication since they can be used to transmit signals at the speed of light. A photon can represent a quantum bit or qubit analogous to a binary digit (bit) in standard classical information processing. Such photons are known as ‘flying quantum bits’.

Before explaining the new technique, there’s an overview of previous efforts,

The first attempts to teleport single photons or light particles were made by the Austrian physicist Anton Zeilinger. Various other related experiments have been performed in the meantime. However, teleportation of photonic quantum bits using conventional methods proved to have its limitations because of experimental deficiencies and difficulties with fundamental principles.

What makes the experiment in Tokyo so different is the use of a hybrid technique. With its help, a completely deterministic and highly reliable quantum teleportation of photonic qubits has been achieved. The accuracy of the transfer was 79 to 82 percent for four different qubits. In addition, the qubits were teleported much more efficiently than in previous experiments, even at a low degree of entanglement.

The concept of entanglement was first formulated by Erwin Schrödinger and involves a situation in which two quantum systems, such as two light particles for example, are in a joint state, so that their behavior is mutually dependent to a greater extent than is normally (classically) possible. In the Tokyo experiment, continuous entanglement was achieved by means of entangling many photons with many other photons. This meant that the complete amplitudes and phases of two light fields were quantum correlated. Previous experiments only had a single photon entangled with another single photon – a less efficient solution. “The entanglement of photons functioned very well in the Tokyo experiment – practically at the press of a button, as soon as the laser was switched on,” said van Loock, Professor for Theory of Quantum Optics and Quantum Information at Mainz University. This continuous entanglement was accomplished with the aid of so-called ‘squeezed light’, which takes the form of an ellipse in the phase space of the light field. Once entanglement has been achieved, a third light field can be attached to the transmitter. From there, in principle, any state and any number of states can be transmitted to the receiver. “In our experiment, there were precisely four sufficiently representative test states that were transferred from Alice to Bob using entanglement. Thanks to continuous entanglement, it was possible to transmit the photonic qubits in a deterministic fashion to Bob, in other words, in each run,” added van Loock.

Earlier attempts to achieve optical teleportation were performed differently and, before now, the concepts used have proved to be incompatible. Although in theory it had already been assumed that the two different strategies, from the discrete and the continuous world, needed to be combined, it represents a technological breakthrough that this has actually now been experimentally demonstrated with the help of the hybrid technique. “The two separate worlds, the discrete and the continuous, are starting to converge,” concluded van Loock.

The researchers have provided an image illustrating quantum teleportation,

Deterministic quantum teleportation of a photonic quantum bit. Each qubit that flies from the left into the teleporter leaves the teleporter on the right with a loss of quality of only around 20 percent, a value not achievable without entanglement. Courtesy University of Tokyo

Here’s a citation for and a link to the published paper,

Deterministic quantum teleportation of photonic quantum bits by a hybrid technique by Shuntaro Takeda, Takahiro Mizuta, Maria Fuwa, Peter van Loock & Akira Furusawa. Nature 500, 315–318 (15 August 2013) doi:10.1038/nature12366 Published online 14 August 2013

This article is behind a paywall although there is a preview capability (ReadCube Access) available.
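For readers who want to see the logic of the protocol concretely, here is a small simulation. It implements the textbook discrete-variable teleportation scheme described in the press release (Alice and Bob share an entangled pair, Alice performs a Bell measurement and sends two classical bits, Bob applies a correction), not the hybrid continuous-variable technique Furusawa's group actually used; all names and values are illustrative.

```python
import numpy as np

# Single-qubit basis states and Bob's Pauli corrections
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip correction
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase-flip correction

def teleport(psi):
    """Return Bob's corrected state for each of Alice's four possible
    Bell-measurement outcomes; every one should equal psi."""
    # Shared entangled pair (|00> + |11>)/sqrt(2): qubit 2 is Alice's, qubit 3 Bob's
    pair = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
    state = np.kron(psi, pair)  # full 3-qubit state; Alice's input is qubit 1

    # The four Bell states Alice can find qubits 1 and 2 in,
    # keyed by the two classical bits she sends to Bob
    bell = {
        (0, 0): (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2),
        (0, 1): (np.kron(ket0, ket1) + np.kron(ket1, ket0)) / np.sqrt(2),
        (1, 0): (np.kron(ket0, ket0) - np.kron(ket1, ket1)) / np.sqrt(2),
        (1, 1): (np.kron(ket0, ket1) - np.kron(ket1, ket0)) / np.sqrt(2),
    }
    outcomes = []
    for (b1, b2), b in bell.items():
        # Project qubits 1 and 2 onto this Bell state, leaving Bob's qubit
        proj = np.kron(b.conj().reshape(1, -1), I)  # 2x8 operator
        bob = proj @ state
        bob = bob / np.linalg.norm(bob)
        # Bob applies Z^b1 X^b2 based on Alice's two classical bits
        corrected = np.linalg.matrix_power(Z, b1) @ np.linalg.matrix_power(X, b2) @ bob
        outcomes.append(corrected)
    return outcomes

psi = np.array([0.6, 0.8j])  # an arbitrary normalized qubit
for out in teleport(psi):
    print(abs(np.vdot(psi, out)) ** 2)  # fidelity, approximately 1 for every outcome
```

In this idealized simulation the fidelity is 1 for all four outcomes; the 79 to 82 percent figures in the experiment reflect real-world losses the simulation ignores.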

Special coating eliminates need to de-ice airplanes

There was a big airplane accident years ago in which the pilot failed to have the wings de-iced a second time just before takeoff. The plane took off from Washington National Airport (Washington, DC) and crashed minutes later, killing the crew and most of the passengers.

I read the story in a book about sociolinguistics and work. When the ‘black box’ (a recorder that’s in all airplanes) was recovered, sociolinguists were included in the team tasked with trying to establish the cause(s). From the sociolinguists’ perspective, it came down to this. The chief pilot hadn’t flown from Washington, DC very often and was unaware that icing could be as prevalent there as it is at more northern airports. He did de-ice the wings, but the plane did not take off in its assigned time slot (busy airport). After several minutes and just prior to takeoff, the chief pilot’s second-in-command, who was more familiar with Washington’s weather conditions, gently suggested de-icing the wings a second time and was ignored. (They reproduced some of the dialogue in the text I was reading.) The story made quite an impact on me since I’m very familiar with the phenomenon of comments in the workplace being ignored (confession: I’ve been on both sides of the equation), although not with such devastating consequences. Predictably, the sociolinguists suggested changing the crew’s communication habits (always a good idea), but it never occurred to them (or to me at the time of reading the text) that technology might help provide an answer.

A Japanese research team (Riho Kamada, Chuo University;  Katsuaki Morita, The University of Tokyo; Koji Okamoto, The University of Tokyo; Akihito Aoki, Kanagawa Institute of Technology; Shigeo Kimura, Kanagawa Institute of Technology; Hirotaka Sakaue, Japan Aerospace Exploration Agency [JAXA]) presented an anti-icing (or de-icing) solution for airplanes at the 65th Annual Meeting of the APS* Division of Fluid Dynamics, November 18–20, 2012 in San Diego, California, from the Nov. 16, 2012 news release on EurekAlert,

To help planes fly safely through cold, wet, and icy conditions, a team of Japanese scientists has developed a new super water-repellent surface that can prevent ice from forming in these harsh atmospheric conditions. Unlike current inflight anti-icing techniques, the researchers envision applying this new anti-icing method to an entire aircraft like a coat of paint.

As airplanes fly through clouds of super-cooled water droplets, areas around the nose, the leading edges of the wings, and the engine cones experience low airflow, says Hirotaka Sakaue, a researcher in the fluid dynamics group at the Japan Aerospace Exploration Agency (JAXA). This enables water droplets to contact the aircraft and form an icy layer. If ice builds up on the wings it can change the way air flows over them, hindering control and potentially making the airplane stall. Other members of the research team are with the University of Tokyo, the Kanagawa Institute of Technology, and Chuo University.

Current anti-icing techniques include diverting hot air from the engines to the wings, preventing ice from forming in the first place, and inflatable membranes known as pneumatic boots, which crack ice off the leading edge of an aircraft’s wings. The super-hydrophobic, or water repelling, coating being developed by Sakaue, Katsuaki Morita – a graduate student at the University of Tokyo – and their colleagues works differently, by preventing the water from sticking to the airplane’s surface in the first place.

The researchers developed a coating containing microscopic particles of a Teflon-based material called polytetrafluoroethylene (PTFE), which reduces the energy needed to detach a drop of water from a surface. “If this energy is small, the droplet is easy to remove,” says Sakaue. “In other words, it’s repelled,” he adds.

The PTFE microscale particles created a rough surface, and the rougher a surface is, on a microscopic scale, the less energy it takes to detach water from it. The researchers varied the size of the PTFE particles in their coatings, from 5 to 30 micrometers, in order to find the most water-repellent size. By measuring the contact angle – the angle between the coating and the drop of water – they could determine how well a surface repelled water.

While this work isn’t occurring at the nanoscale, I thought I’d make an exception due to my interest in the subject.

*APS is the American Physical Society
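The link the researchers draw between contact angle and the energy needed to detach a droplet can be made concrete with the Young–Dupré relation, which gives the work of adhesion per unit area as W = γ(1 + cos θ). The sketch below uses water's surface tension and a few hypothetical contact angles for illustration; these are not measurements from the JAXA study.

```python
import math

# Work of adhesion per unit area from the Young-Dupre relation:
#   W = gamma * (1 + cos(theta))
# gamma: liquid surface tension (J/m^2); theta: contact angle (degrees).
GAMMA_WATER = 0.072  # surface tension of water near room temperature, J/m^2

def work_of_adhesion(contact_angle_deg, gamma=GAMMA_WATER):
    """Energy per unit area needed to detach a droplet (J/m^2)."""
    return gamma * (1 + math.cos(math.radians(contact_angle_deg)))

# A higher contact angle (a more hydrophobic surface) means
# less energy is needed to remove the drop:
for angle in (90, 120, 150, 160):
    print(f"{angle:3d} deg -> {work_of_adhesion(angle) * 1000:.1f} mJ/m^2")
```

At 180 degrees (perfect non-wetting) the work of adhesion goes to zero, which is why the team pushed the coating's roughness toward higher contact angles.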

Sometimes when we touch: Touché, a sensing project from Disney Research and Carnegie Mellon

Researchers at Carnegie Mellon University and Disney Research, Pittsburgh (Pennsylvania, US) have taken capacitive sensing, the technology used for touchscreens such as those on smartphones, and added new capabilities. From the May 4, 2012 news item on Nanowerk,

A doorknob that knows whether to lock or unlock based on how it is grasped, a smartphone that silences itself if the user holds a finger to her lips and a chair that adjusts room lighting based on recognizing if a user is reclining or leaning forward are among the many possible applications of Touché, a new sensing technique developed by a team at Disney Research, Pittsburgh, and Carnegie Mellon University.

Touché is a form of capacitive touch sensing, the same principle underlying the types of touchscreens used in most smartphones. But instead of sensing electrical signals at a single frequency, like the typical touchscreen, Touché monitors capacitive signals across a broad range of frequencies.

This Swept Frequency Capacitive Sensing (SFCS) makes it possible to not only detect a “touch event,” but to recognize complex configurations of the hand or body that is doing the touching. An object thus could sense how it is being touched, or might sense the body configuration of the person doing the touching.

Disney Research, Pittsburgh made this video describing the technology and speculating on some of the possible applications (this is a research-oriented video, not your standard Disney fare),

Here’s a bit more about the technology (from the May 4, 2012 news item),

Both Touché and smartphone touchscreens are based on the phenomenon known as capacitive coupling. In a capacitive touchscreen, the surface is coated with a transparent conductor that carries an electrical signal. That signal is altered when a person’s finger touches it, providing an alternative path for the electrical charge.

By monitoring the change in the signal, the device can determine if a touch occurs. By monitoring a range of signal frequencies, however, Touché can derive much more information. Different body tissues have different capacitive properties, so monitoring a range of frequencies can detect a number of different paths that the electrical charge takes through the body.

Making sense of all of that SFCS information, however, requires analyzing hundreds of data points. As microprocessors have become steadily faster and less expensive, it now is feasible to use SFCS in touch interfaces, the researchers said.

“Devices keep getting smaller and increasingly are embedded throughout the environment, which has made it necessary for us to find ways to control or interact with them, and that is where Touché could really shine,” Harrison [Chris Harrison, a Ph.D. student in Carnegie Mellon’s Human-Computer Interaction Institute] said. Sato [Munehiko Sato, a Disney intern and a Ph.D. student in engineering at the University of Tokyo] said Touché could make computer interfaces as invisible to users as the embedded computers themselves. “This might enable us to one day do away with keyboards, mice and perhaps even conventional touchscreens for many applications,” he said.
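To make the classification step concrete: each swept-frequency measurement yields a vector of responses, one per frequency, and recognizing a grip or gesture means matching that vector against labeled examples. The sketch below uses a simple nearest-neighbor match on synthetic profiles; the profiles, gesture labels, and classifier choice are all illustrative (the researchers' actual system applied machine-learning classification to real sensor data).

```python
import numpy as np

def classify(profile, templates):
    """Match a measured frequency-response profile against labeled
    template profiles and return the closest gesture label."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        dist = np.linalg.norm(profile - template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hypothetical templates: mean response at each of 8 swept frequencies
templates = {
    "no_touch":   np.array([0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]),
    "one_finger": np.array([0.3, 0.5, 0.9, 0.7, 0.4, 0.3, 0.2, 0.2]),
    "full_grasp": np.array([0.8, 0.9, 1.0, 0.9, 0.8, 0.7, 0.6, 0.5]),
}

# A new measurement whose shape is close to the one-finger profile
measurement = np.array([0.28, 0.52, 0.85, 0.72, 0.41, 0.30, 0.22, 0.19])
print(classify(measurement, templates))  # -> one_finger
```

The point of sweeping many frequencies is visible in the template vectors themselves: a single-frequency reading could not distinguish profiles that happen to cross at that frequency, but the full curves are easy to tell apart.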

We’re seeing more of these automatic responses to a gesture or movement. For example, common spelling errors are corrected as you key in (type) text in word-processing packages and in search engines. In fact, there are times when an application insists on its own correction and I have to insist on something nonstandard (and I don’t always manage to override the system). As I watch these videos and read about these new technical possibilities, I keep asking myself, where is the override?