Tag Archives: Austrian Academy of Sciences

Entanglement at 50 km

An August 29, 2019 news item on phys.org announced a new record for transferring quantum entanglement between matter and light,

The quantum internet promises absolutely tap-proof communication and powerful distributed sensor networks for new science and technology. However, because quantum information cannot be copied, it is not possible to send this information over a classical network. Quantum information must be transmitted by quantum particles, and special interfaces are required for this. The Innsbruck-based experimental physicist Ben Lanyon, who was awarded the Austrian START Prize in 2015 for his research, is investigating these important intersections of a future quantum Internet.

Now his team at the Department of Experimental Physics at the University of Innsbruck and at the Institute of Quantum Optics and Quantum Information of the Austrian Academy of Sciences has achieved a record for the transfer of quantum entanglement between matter and light. For the first time, a distance of 50 kilometers was covered using fiber optic cables. “This is two orders of magnitude further than was previously possible and is a practical distance to start building inter-city quantum networks,” says Ben Lanyon.

An August 29, 2019 University of Innsbruck press release (also on EurekAlert), which originated the news item, provides more detail,

Converted photon for transmission

Lanyon’s team started the experiment with a calcium atom trapped in an ion trap. Using laser beams, the researchers write a quantum state onto the ion and simultaneously excite it to emit a photon in which quantum information is stored. As a result, the quantum states of the atom and the light particle are entangled. But the challenge is to transmit the photon over fiber optic cables. “The photon emitted by the calcium ion has a wavelength of 854 nanometers and is quickly absorbed by the optical fiber”, says Ben Lanyon. His team therefore initially sends the light particle through a nonlinear crystal illuminated by a strong laser. Thereby the photon wavelength is converted to the optimal value for long-distance travel: the current telecommunications standard wavelength of 1550 nanometers. The researchers from Innsbruck then send this photon through a 50-kilometer-long optical fiber line. Their measurements show that atom and light particle are still entangled even after the wavelength conversion and this long journey.
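The numbers in this paragraph can be checked with a little arithmetic. Here is a minimal sketch, assuming difference-frequency conversion (photon energy conservation in the nonlinear crystal) and typical silica-fiber attenuation values of roughly 3 dB/km at 854 nm and 0.2 dB/km at 1550 nm; the attenuation figures are illustrative assumptions, not taken from the paper:

```python
# Back-of-the-envelope numbers for the wavelength-conversion step.
# Attenuation values below are typical for silica fiber (assumed, illustrative).

# Difference-frequency generation conserves photon energy:
# 1/lambda_in = 1/lambda_out + 1/lambda_pump
lam_in, lam_out = 854.0, 1550.0  # nm
lam_pump = 1.0 / (1.0 / lam_in - 1.0 / lam_out)
print(f"pump wavelength ~ {lam_pump:.0f} nm")  # ~1902 nm

# Fiber transmission over 50 km: T = 10^(-alpha * L / 10)
L = 50.0  # km
for lam, alpha in [(854, 3.0), (1550, 0.2)]:  # alpha in dB/km (assumed)
    T = 10 ** (-alpha * L / 10)
    print(f"{lam} nm over {L:.0f} km: transmission ~ {T:.0e}")
```

The arithmetic makes the motivation plain: at 854 nm essentially nothing survives 50 km of fiber, while at the telecom wavelength roughly a tenth of the photons get through.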

Even greater distances in sight

As a next step, Lanyon and his team show that their methods would enable entanglement to be generated between ions 100 kilometers apart or more. Two nodes would each send an entangled photon over a distance of 50 kilometers to an intersection, where the light particles are measured in such a way that they lose their entanglement with the ions, which would in turn become entangled with each other. With 100-kilometer node spacing now a possibility, one could envisage building the world’s first intercity light-matter quantum network in the coming years: only a handful of trapped-ion systems would be required to establish a quantum internet between Innsbruck and Vienna, for example.
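The swapping scheme described above can be sketched as a toy state-vector calculation. This is a simplification that ignores photon loss, wavelength conversion, and heralding; it only shows why a Bell-state measurement on the two photons at the midpoint leaves the two distant ions entangled:

```python
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)           # |Phi+> = (|00>+|11>)/sqrt(2)

# Two independent ion-photon Bell pairs: (ion A, photon a) and (photon b, ion B)
state = np.kron(bell, bell).reshape(2, 2, 2, 2)      # axes: A, a, b, B

# Bell-state measurement on the two photons at the midpoint:
# project (a, b) onto |Phi+> and see what is left on the ions.
proj = bell.reshape(2, 2)
ions = np.einsum('ab,xaby->xy', proj.conj(), state)  # unnormalized ion state

p = np.vdot(ions, ions).real                         # probability of this outcome
ions = ions / np.sqrt(p)

print("outcome probability:", p)                     # ~0.25 (one of four Bell outcomes)
print("ion-ion state:", ions.flatten())              # ~(|00>+|11>)/sqrt(2)
```

Although neither ion ever interacted with the other, projecting the photons onto a Bell state leaves the ions themselves in a Bell state, which is the mechanism behind the proposed 100-kilometer node spacing.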

Lanyon’s team is part of the Quantum Internet Alliance, an international project within the Quantum Flagship framework of the European Union. The current results have been published in the journal npj Quantum Information. The research was financially supported by, among others, the Austrian Science Fund FWF and the European Union.

Here’s a link to and a citation for the paper,

Light-matter entanglement over 50 km of optical fibre by V. Krutyanskiy, M. Meraner, J. Schupp, V. Krcmarsky, H. Hainzer & B. P. Lanyon. npj Quantum Information volume 5, Article number: 72 (2019) DOI: https://doi.org/10.1038/s41534-019-0186-3 Published: 27 August 2019

This paper is open access.

Simulating elementary physics in a quantum simulation (particle zoo in a quantum computer?)

Whoever wrote the news release used a very catchy title “Particle zoo in a quantum computer”; I just wish they’d explained it. Looking up the definition for a ‘particle zoo’ didn’t help as much as I’d hoped. From the particle zoo entry on Wikipedia (Note: Links have been removed),

In particle physics, the term particle zoo[1][2] is used colloquially to describe a relatively extensive list of the then known “elementary particles” that almost look like hundreds of species in the zoo.

In the history of particle physics, the situation was particularly confusing in the late 1960s. Before the discovery of quarks, hundreds of strongly interacting particles (hadrons) were known, and believed to be distinct elementary particles in their own right. It was later discovered that they were not elementary particles, but rather composites of the quarks. The set of particles believed today to be elementary is known as the Standard Model, and includes quarks, bosons and leptons.

I believe the writer used the term to indicate that the simulation undertaken involved elementary particles. If you have a better explanation, please feel free to add it to the comments for this post.

Here’s the news from a June 22, 2016 news item on ScienceDaily,

Elementary particles are the fundamental building blocks of matter, and their properties are described by the Standard Model of particle physics. The discovery of the Higgs boson at CERN in 2012 constitutes a further step towards the confirmation of the Standard Model. However, many aspects of this theory are still not understood because their complexity makes it hard to investigate them with classical computers. Quantum computers may provide a way to overcome this obstacle as they can simulate certain aspects of elementary particle physics in a well-controlled quantum system. Physicists from the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI) at the Austrian Academy of Sciences have now done exactly that: In an international first, Rainer Blatt’s and Peter Zoller’s research teams have simulated lattice gauge theories in a quantum computer. …

A June 23, 2016 University of Innsbruck (Universität Innsbruck) press release, which seems to have originated the news item, provides more detail,

Gauge theories describe the interaction between elementary particles, such as quarks and gluons, and they are the basis for our understanding of fundamental processes. “Dynamical processes, for example, the collision of elementary particles or the spontaneous creation of particle-antiparticle pairs, are extremely difficult to investigate,” explains Christine Muschik, theoretical physicist at the IQOQI. “However, scientists quickly reach a limit when processing numerical calculations on classical computers. For this reason, it has been proposed to simulate these processes by using a programmable quantum system.” In recent years, many interesting concepts have been proposed, but until now it was impossible to realize them. “We have now developed a new concept that allows us to simulate the spontaneous creation of electron-positron pairs out of the vacuum by using a quantum computer,” says Muschik. The quantum system consists of four electromagnetically trapped calcium ions that are controlled by laser pulses. “Each pair of ions represents a pair of a particle and an antiparticle,” explains experimental physicist Esteban A. Martinez. “We use laser pulses to simulate the electromagnetic field in a vacuum. Then we are able to observe how particle pairs are created by quantum fluctuations from the energy of this field. By looking at the ions’ fluorescence, we see whether particles and antiparticles were created. We are able to modify the parameters of the quantum system, which allows us to observe and study the dynamic process of pair creation.”
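For readers who want a feel for what "simulating pair creation" means numerically, here is a rough classical sketch of the kind of model involved. It follows the standard Kogut-Susskind/Jordan-Wigner treatment of the lattice Schwinger model with the gauge field eliminated; the parameter values are illustrative and the conventions are not guaranteed to match the paper's exactly. Starting from the bare vacuum, the particle-plus-antiparticle number grows under time evolution:

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices and helpers
I2 = np.eye(2)
sz = np.diag([1.0, -1.0])
sp = np.array([[0, 1], [0, 0]], float)   # sigma+
sm = sp.T                                 # sigma-

N = 4  # number of spins (lattice sites), as in the trapped-ion experiment

def op(single, site):
    """Embed a single-qubit operator at `site` (1-indexed) in the N-spin space."""
    mats = [single if k == site else I2 for k in range(1, N + 1)]
    out = mats[0]
    for m_ in mats[1:]:
        out = np.kron(out, m_)
    return out

w, m, J = 1.0, 0.5, 1.0  # hopping, mass, gauge coupling (illustrative values)

# Gauge-eliminated lattice Schwinger Hamiltonian (standard textbook form)
H = np.zeros((2**N, 2**N))
for n in range(1, N):
    H += w * (op(sp, n) @ op(sm, n + 1) + op(sm, n) @ op(sp, n + 1))
for n in range(1, N + 1):
    H += (m / 2) * (-1)**n * op(sz, n)
for n in range(1, N):                      # electric-field energy, L_n^2
    Ln = sum((op(sz, l) + (-1)**l * np.eye(2**N)) / 2 for l in range(1, n + 1))
    H += J * (Ln @ Ln)

# Bare vacuum |up, down, up, down> and the particle-number observable
vac = np.zeros(2**N); vac[0b0101] = 1.0   # spin-up = index 0 per site
Npart = sum((np.eye(2**N) + (-1)**n * op(sz, n)) / 2 for n in range(1, N + 1))

for t in [0.0, 0.5, 1.0]:
    psi = expm(-1j * H * t) @ vac
    val = (psi.conj() @ Npart @ psi).real
    print(f"t={t}: particles + antiparticles = {val:.3f}")
```

At t = 0 the number is zero by construction; at later times it rises as "pairs" are created out of the vacuum by the hopping term, which is the qualitative behaviour the ion experiment observes via fluorescence.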

Combining different fields of physics

With this experiment, the physicists in Innsbruck have built a bridge between two different fields in physics: They have used atomic physics experiments to study questions in high-energy physics. While hundreds of theoretical physicists work on the highly complex theories of the Standard Model and experiments are carried out at extremely expensive facilities, such as the Large Hadron Collider at CERN, quantum simulations may be carried out by small groups in tabletop experiments. “These two approaches complement one another perfectly,” says theoretical physicist Peter Zoller. “We cannot replace the experiments that are done with particle colliders. However, by developing quantum simulators, we may be able to understand these experiments better one day.” Experimental physicist Rainer Blatt adds: “Moreover, we can study new processes by using quantum simulation. For example, in our experiment we also investigated particle entanglement produced during pair creation, which is not possible in a particle collider.” The physicists are convinced that future quantum simulators will potentially be able to solve important questions in high-energy physics that cannot be tackled by conventional methods.

Foundation for a new research field

It was only a few years ago that the idea to combine high-energy and atomic physics was proposed. With this work it has been implemented experimentally for the first time. “This approach is conceptually very different from previous quantum simulation experiments studying many-body physics or quantum chemistry. The simulation of elementary particle processes is theoretically very complex and, therefore, has to satisfy very specific requirements. For this reason it is difficult to develop a suitable protocol,” underlines Zoller. The conditions for the experimental physicists were equally demanding: “This is one of the most complex experiments that has ever been carried out in a trapped-ion quantum computer,” says Blatt. “We are still figuring out how these quantum simulations work and will only gradually be able to apply them to more challenging phenomena.” The great theoretical as well as experimental expertise of the physicists in Innsbruck was crucial for the breakthrough. Both Blatt and Zoller emphasize that they have been doing research on quantum computers for many years now and have gained a lot of experience in their implementation. Innsbruck has become one of the leading centers for research in quantum physics; here, the theoretical and experimental branches work together at an extremely high level, which enables them to gain novel insights into fundamental phenomena.

Here’s a link to and a citation for the paper,

Real-time dynamics of lattice gauge theories with a few-qubit quantum computer by Esteban A. Martinez, Christine A. Muschik, Philipp Schindler, Daniel Nigg, Alexander Erhard, Markus Heyl, Philipp Hauke, Marcello Dalmonte, Thomas Monz, Peter Zoller, & Rainer Blatt. Nature 534, 516–519 (23 June 2016) doi:10.1038/nature18318 Published online 22 June 2016

This paper is behind a paywall.

There is a SoundCloud audio file featuring an explanation of the work from the lead author, Esteban A. Martinez,

Quantum physics experiments designed by an algorithm

A Feb. 22, 2016 news item on Nanotechnology Now describes quantum physics experiments designed by an algorithm,

Quantum physicist Mario Krenn and his colleagues in the group of Anton Zeilinger from the Faculty of Physics at the University of Vienna and the Austrian Academy of Sciences have developed an algorithm which designs new useful quantum experiments. As the computer does not rely on human intuition, it finds novel unfamiliar solutions.

The researchers have provided an image illustrating their work,

Caption: The algorithm Melvin found out that the most simple realization can be asymmetric and therefore counterintuitive. Credit: Copyright: Robert Fickler, Universität Wien (University of Vienna)

A Feb. 22, 2016 University of Vienna press release (also on EurekAlert), which originated the news item, expands on the theme,

The idea was developed when the physicists wanted to create new quantum states in the laboratory, but were unable to conceive of methods to do so. “After many unsuccessful attempts to come up with an experimental implementation, we came to the conclusion that our intuition about these phenomena seems to be wrong. We realized that in the end we were just trying random arrangements of quantum building blocks. And that is what a computer can do as well – but thousands of times faster”, explains Mario Krenn, PhD student in Anton Zeilinger’s group and first author of the research.

After a few hours of calculation, their algorithm – which they call Melvin – found the recipe to the question they were unable to solve, and its structure surprised them. Zeilinger says: “Suppose I want to build an experiment realizing a specific quantum state I am interested in. Then humans intuitively consider setups reflecting the symmetries of the state. Yet Melvin found out that the most simple realization can be asymmetric and therefore counterintuitive. A human would probably never come up with that solution.”

The physicists applied the idea to several other questions and got dozens of new and surprising answers. “The solutions are difficult to understand, but we were able to extract some new experimental tricks we have not thought of before. Some of these computer-designed experiments are being built at the moment in our laboratories”, says Krenn.

Melvin not only tries random arrangements of experimental components, but also learns from previous successful attempts, which significantly speeds up the discovery rate for more complex solutions. In the future, the authors want to apply their algorithm to even more general questions in quantum physics, and hope it helps to investigate new phenomena in laboratories.
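Melvin itself searches over optical components, but the search-plus-learning loop described above can be illustrated with a deliberately simple toy problem. In this sketch (none of this is the actual Melvin code; integer arithmetic stands in for optical setups), random chains of components are tried against a goal, and every successful chain is promoted into the toolbox so later searches can reuse it as a single building block:

```python
import random

# Toy "Melvin-style" search: randomly chain components from a toolbox,
# and promote every successful chain into the toolbox ("learning").

def search(goal, trials=50000, max_len=6, seed=0):
    rng = random.Random(seed)
    toolbox = [("+3", lambda x: x + 3),
               ("*2", lambda x: x * 2),
               ("-1", lambda x: x - 1)]
    solutions = []
    for _ in range(trials):
        chain = [rng.choice(toolbox) for _ in range(rng.randint(1, max_len))]
        x = 0
        for _, f in chain:
            x = f(x)
        if x == goal:
            names = "".join(name for name, _ in chain)
            fns = [f for _, f in chain]
            def composite(x, fns=fns):   # reusable block built from the chain
                for f in fns:
                    x = f(x)
                return x
            toolbox.append((names, composite))   # the "learning" step
            solutions.append(names)
    return solutions

sols = search(42)
print(f"found {len(sols)} recipes; first: {sols[0] if sols else None}")
```

Once one successful chain enters the toolbox, later trials can hit the goal in a single draw, which is the qualitative point: reusing discovered sub-solutions speeds up the search for more complex ones.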

Here’s a link to and a citation for the paper,

Automated Search for new Quantum Experiments by Mario Krenn, Mehul Malik, Robert Fickler, Radek Lapkiewicz, Anton Zeilinger. arXiv (Submitted on 9 Sep 2015 (v1), last revised 20 Feb 2016 (this version, v2))

The version of the paper on arXiv.org is open access. The paper has also been accepted by Physical Review Letters but does not seem to have been published online or in print yet,

Automated search for new quantum experiments by Mario Krenn, Mehul Malik, Robert Fickler, Radek Lapkiewicz, and Anton Zeilinger. Phys. Rev. Lett. Accepted 27 January 2016

There is a copy of the abstract available on the Physical Review Letters site.

So, you found a quantum trimer. What is it, again?

The University of Innsbruck produced a rather intriguing May 13, 2014 news release (also a May 13, 2014 news item on Nanowerk and on EurekAlert),

Eight years ago Rudolf Grimm’s research group was the first to observe an Efimov state in an ultracold quantum gas. The Russian physicist Vitali Efimov theoretically predicted this exotic bound state of three particles in the 1970s. He forecast that three particles would form a bound state due to their quantum mechanical properties, under conditions when a two-body bound state would be absent. What is even more astounding: When the distance between the particles is increased by a factor of 22.7, another Efimov state appears, leading to an infinite series of these states. Until now this essential ingredient of the famous scenario has remained elusive and experimentally proving the periodicity of the famous scenario has presented a challenge. “There have been some indications that particles continuously create three-body states if the distance is increased by this factor,” says Rudolf Grimm from the Institute of Experimental Physics of the University of Innsbruck and the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences. “Proving the scenario was very difficult but we have finally been successful.”

I think the second Efimov state is the quantum trimer but the news release provides no explanation, mentioning the trimer in its headline only.  On the plus side, there’s a very ‘cool’ explanation of quantum gases,

Ultracold quantum gases are highly suited for studying and observing quantum phenomena of particle systems experimentally as the interaction between atoms is well tunable by a magnetic field. However, Rudolf Grimm’s research group got very close to the limits of what is possible experimentally when they had to increase the distance between the particles to one micrometer to be able to observe the second Efimov state. “This corresponds to 20,000 times the radius of a hydrogen atom,” explains Grimm. “Compared to a molecule, this is a gigantic structure.” This meant that the physicists had to be particularly precise with their work. What greatly helped the researchers in Innsbruck was their extensive experience with ultracold quantum gases and their great technical expertise. Their final result shows that the second Efimov state is larger than the first one by a factor of 21.0 with a measurement uncertainty of 1.3. “This small deviation from the factor 22.7 may be attributed to the physics beyond the ideal Efimov state, which is also an exciting topic,” explains Rudolf Grimm.
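The factor of 22.7 quoted here, and the measured 21.0 ± 1.3, can be checked with quick arithmetic. A sketch, using the standard theoretical constant s0 ≈ 1.00624 for three identical bosons (an assumption added here; the press release only quotes the resulting factor):

```python
import math

# Universal Efimov scaling factor for three identical bosons:
# lambda = exp(pi / s0), with s0 ~ 1.00624 from Efimov's transcendental equation.
s0 = 1.00624
factor = math.exp(math.pi / s0)
print(f"universal scaling factor ~ {factor:.1f}")   # ~22.7

# Measured ratio of the second to the first Efimov resonance (Huang et al. 2014)
measured, err = 21.0, 1.3
deviation = (factor - measured) / err
print(f"deviation from measurement: {deviation:.1f} sigma")
```

So the measured ratio sits a little over one standard deviation below the ideal universal value, which is the "small deviation" Grimm attributes to physics beyond the ideal Efimov scenario.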

As for why this Efimov state holds such interest, I found the explanation perplexing but remain intrigued,

The scientific community’s interest in this phenomenon lies in its universal character. The law is equally applicable to nuclear physics, where strong interaction is responsible for the binding of particles in the atomic nucleus, and to molecular interactions that are based on electromagnetic forces. “Interaction between two particles and between many particles is well studied,” says Grimm. “But we still need to investigate and learn about phenomena that arise from the interaction between only a few particles. The Efimov states are the basic example for this.”

Here’s a link to and a citation for the paper,

Observation of the Second Triatomic Resonance in Efimov’s Scenario by Bo Huang, Leonid A. Sidorenkov, Rudolf Grimm, and Jeremy M. Hutson. Phys. Rev. Lett. 112, 190401 – Published 12 May 2014 DOI: http://dx.doi.org/10.1103/PhysRevLett.112.190401
© 2014 American Physical Society

This article is behind a paywall.

The university provided an illustration of an Efimov state,

The mysterious Efimov scenario (Illustration: IQOQI/Harald Ritsch)

Beautiful, isn’t it?

European Union’s NanoCode to be extended to all European science?

The Feb. 5, 2013 Nanowerk Spotlight article is given over to a description of a report on the European Union’s NanoCode Project and recommendations from NanoTrust, a project of the Austrian Academy of Sciences. From the Spotlight article (Note: Footnotes have been removed),

The [European] Commission recommendation for a code of conduct for responsible nanosciences and nanotechnologies research (code of conduct) dates from 2008. Nevertheless, it continues to be a subject of discussion.

Thus the 2012 final report on the NanoCode research project, which has been monitoring the development and implementation of the nanotechnology code over two years, recommends inter alia that the principles and guidelines of the code be extended to all new technologies and to science as a whole. The initiative for a Commission recommendation on “Responsible research and innovation”, launched by the EU Commission in March 2012 adopts the same approach: The principles and guidelines of the code of conduct should be extended to all technologies and also include production and application.

There are difficulties associated with implementing the NanoCode, which should be obvious from a glance at the responsibilities and obligations listed in the NanoTrust dossier no. 36en, December 2012, The EU code of conduct for nanosciences and nanotechnologies research (PDF, 4 pp.),

“Obligations” on the basis of the code

Researchers

• Research in the public interest

• Consideration of fundamental ethical principles and fundamental rights

• Risk research as an element of all applications for funds

• Responsibility for the consequences of research

Research funding bodies

• Research priorities with respect to socially useful research, risk assessment, metrology and standardisation

• Uniformity of standardisation and metrology

• Accountability in the light of research priorities

• Publication of the cost-benefits assessment of funded projects

Member States

• Collaboration between Member States and the Commission

• Monitoring and control systems

• Dissemination

• Encouragement of research in accordance with the code

• Annual report on application and measures within the framework of the code

EU Commission

• Compliance with the code when granting research funding

• Collaboration with the Member States

• Review of the code every two years

• Dissemination (p. 3)

In addition to implementation, there are issues about authority, compatibility within various legal frameworks, and language, from the spotlight article,

The code is the subject matter of discussions in the legal world. Specifically, the discussion addresses (1) whether the Commission has any jurisdiction to issue such a recommendation; (2) in what manner it could take effect de facto and de jure; (3) whether the principles of the code are sufficiently specific; and (4) whether individual guidelines are compatible with the fundamental rights of the freedom of science.

There is also a need to construe the principal responsibility laid down under accountability. In the German version of the code, it is not clear whether this accountability (“Rechenschaftspflicht”) is a legal responsibility or is intended to encourage a “culture of responsibility” (4.1). The term “accountability” in the English version tends not to suggest a legal obligation to render accounts. [emphases mine]

While prospects for implementing the NanoCode are not good, this dossier from NanoTrust provides good insight into the complexities of arriving at agreements of any kind. Documents for the NanoCode project can be found here.

Analysis of German language media coverage of nanotechnology

Austria’s NanoTrust project published, in October 2012, a dossier titled Nanotechnology in the media: On the reporting in representative daily newspapers in Austria, Germany and Switzerland, which was highlighted in a Jan. 21, 2013 Nanowerk Spotlight article (Note: Footnotes have been removed),

The media can have a significant influence on the public image of science and technology, in the specific case nanotechnology. This is true in particular if only a small percentage of the population comes directly into contact with such fields of research. Mass media reporting serves to increase awareness of selected topics, informs about current debates involving a wide variety of actors who need to be heard and thus also prepares a basis for future social debates. The population is introduced to central aspects of technical applications, which also include the opportunities and risks associated with the new technologies.

A media analysis has been conducted of selected quality newspapers within the framework of the “NanoPol” project [cooperation between the Institute for Technology Assessment and Systems Analysis (ITAS) at the Karlsruhe Institute for Technology (KIT), the Institute for Technology Assessment (ITA) at the Austrian Academy of Sciences (OeAW), TA-Swiss in Berne and the Programme for Science Research of the University of Basel], which analyses the nanotechnology policies of Austria, Germany and Switzerland.

Quality newspapers are characterised by their target group, comprising persons who have a specific interest in national events and information and who are of significance as multipliers for opinion formation amongst the national public. At the same time, mass media as an ongoing observer in the public can contribute to determining the significance of the topic for the public discussion. For each country, two print media were investigated, the investigation period extending over ten years (2000-2009):

– Der Standard and Die Presse (A);

– Frankfurter Allgemeine Zeitung and die Süddeutsche Zeitung (D);

– Neue Zürcher Zeitung and der Tagesanzeiger (CH).

The media analysis covered almost 2000 articles produced between 2000 and 2009,

Roughly 44 % of all articles were accounted for by the two German print media, while Switzerland and Austria had a share of 29 % and 27 % respectively, with in each case one national newspaper having published significantly more articles with nanotechnology topics. At the beginning of the investigation period, the frequency of articles still varied considerably in the different countries, but converged towards the end of the period.

The reports on nanotechnology are overwhelmingly (88 %) to be found in fact-focused report formats such as news reports or background coverage, while a small percentage of the contributions are drawn up in the form of interviews, comments and essays.

There’s a bit of a surprise (to me) concerning popular topics in that medical applications don’t place first in terms of interest,

Topics related to basic research, which for instance include toxicology and risk research, constituted an in part clear majority in all three countries. Applications in the field of information and communication technology, extending from data media to sensors, were the second most frequently referred to topic. Medical applications, from diagnostics to specific therapies, occupied third place in all three countries, although relatively speaking there were somewhat more reports about medical topics in Austria than in the other two countries.  [emphases mine] Reports from the field of business and politics, dealing above all with companies, research subsidies, environment and economic policies, occupied places four and five.

The conclusion of this Spotlight article seems to hint at a little disappointment,

The reporting on nanotechnology in the media in the three German-speaking countries is largely science-centred and attracts a generally low level of attention amongst the broad public thanks to its less emphasised placing. There is hardly any opinion-focused reporting, with classical news reports and reports relating to current research activities or events predominating. In all three countries, the newspapers’ science departments play a dominant role, and scientists also play a central role as actors.

An event-focused positive representation predominates. A focus on risks and controversial reporting, a concern raised regularly in expert circles, was not proven in the present study. Risk topics play a role in fewer than 20 % of articles; the benefits and opportunities of nanotechnology, on the other hand, are mentioned in 80 % of all articles.

Benefits are seen above all for science. Scientific actors are likewise mentioned relatively frequently, which indicates the close connections between science and business, and the economic expectations of nanotechnology. One would have to examine the extent to which the absence of controversies can be attributed to the hitherto lack of evidence of possible dangers and risks or to well-functioning strategic scientific PR work. [emphasis mine]

Why mention “well-functioning strategic scientific PR work” in the conclusion when there has been no mention of public relations (PR) anywhere else in this dossier? As well, if strategic scientific PR work were that effective, then nuclear power might not be quite so controversial.

Overall, this study doesn’t break any new ground, but it does confirm a growing consensus of opinion: the public, regardless of which country we are discussing (with the possible exception of France), tends not to be all that interested in nanotechnology.

For those curious about the French controversies, there’s a mention in my March 10, 2010 posting (scroll down about 1/4 of the way) about an Agence Science-Presse radio interview with Celine Lafontaine, a Quebec-based academic who studies the social impact of nanotechnology and was in France during a very contentious series of public debates on the subject.

For anyone who found the reference to ‘actors’ in this research a little unexpected, the term is being used by researchers who are using ‘actor-network theory’ as an analytical tool. You can find out more about actor-network theory in this Wikipedia essay.

A couple of starter articles on nanotechnology and its good/bad possibilities

It’s been a long time since I’ve featured any explanations of nanotechnology. Julie Deardorff in her July 10, 2012 article for the Chicago Tribune offers an excellent introduction to nanotechnology and the benefits and risks associated with it,

Improved sunscreens are just one of the many innovative uses of nanotechnology, which involves drastically shrinking and fundamentally changing the structure of chemical compounds. But products made with nanomaterials also raise largely unanswered safety questions — such as whether the particles that make them effective can be absorbed into the bloodstream and are toxic to living cells.

Less than two decades old, the nanotech industry is booming. Nanoparticles — measured in billionths of a meter — are already found in thousands of consumer products, including cosmetics, pharmaceuticals, anti-microbial infant toys, sports equipment, food packaging and electronics. In addition to producing transparent sunscreens, nanomaterials help make light and sturdy tennis rackets, clothes that don’t stain and stink-free socks.

The particles can alter how products look or function because matter behaves differently at the nanoscale, taking on unique and mysterious chemical and physical properties. Materials made of nanoparticles may be more conductive, stronger or more chemically reactive than those containing larger particles of the same compound.

If you would like more information and another perspective (Deardorff’s article is US-focussed), you can read the July 11, 2012 Nanowerk Spotlight article submitted by NanoTrust, Austrian Academy of Sciences (Note: I have removed footnotes),

Nanotechnology is often referred to as being a “key technology” of the 21st century, and the expectations for innovative products and new market potentials are high. The prediction is that novel products with new or improved functionality, or revolutionary developments in the field of medicine, will improve our lives in the future. Importantly, these technical innovations have also raised great hopes in the environmental sector.

Rising prices for raw materials and energy, coupled with the increasing environmental awareness of consumers, are responsible for a flood of products on the market that promise certain advantages for environmental and climate protection. Nanomaterials exhibit special physical and chemical properties that make them interesting for novel, environmentally friendly products.

Emphasis is often placed on the sustainable potential of nanotechnology. Nonetheless, this usually reflects unsubstantiated expectations. Determining the actual effects of a product on the environment – both positive and negative – requires examining the entire life cycle from production of the raw material to disposal at the end of the life cycle. As a rule, the descriptions of environmental benefits fail to consider the amount of resources and energy consumed in producing the products.

While it’s not as friendly as Deardorff’s, this is a good companion piece as it offers a broader range of nanotechnology topics and issues and a healthy selection of resources. In addition, NanoTrust has a number of dossiers available for more nanotechnology reading.