Simulating elementary particle physics in a quantum simulation (particle zoo in a quantum computer?)

Whoever wrote the news release used a very catchy title “Particle zoo in a quantum computer”; I just wish they’d explained it. Looking up the definition for a ‘particle zoo’ didn’t help as much as I’d hoped. From the particle zoo entry on Wikipedia (Note: Links have been removed),

In particle physics, the term particle zoo[1][2] is used colloquially to describe a relatively extensive list of the then known “elementary particles” that almost look like hundreds of species in the zoo.

In the history of particle physics, the situation was particularly confusing in the late 1960s. Before the discovery of quarks, hundreds of strongly interacting particles (hadrons) were known, and believed to be distinct elementary particles in their own right. It was later discovered that they were not elementary particles, but rather composites of the quarks. The set of particles believed today to be elementary is known as the Standard Model, and includes quarks, bosons and leptons.

I believe the writer used the term to indicate that the simulation undertaken involved elementary particles. If you have a better explanation, please feel free to add it to the comments for this post.

Here’s the news from a June 22, 2016 news item on ScienceDaily,

Elementary particles are the fundamental building blocks of matter, and their properties are described by the Standard Model of particle physics. The discovery of the Higgs boson at CERN in 2012 constitutes a further step towards the confirmation of the Standard Model. However, many aspects of this theory are still not understood because their complexity makes it hard to investigate them with classical computers. Quantum computers may provide a way to overcome this obstacle as they can simulate certain aspects of elementary particle physics in a well-controlled quantum system. Physicists from the University of Innsbruck and the Institute for Quantum Optics and Quantum Information (IQOQI) at the Austrian Academy of Sciences have now done exactly that: In an international first, Rainer Blatt’s and Peter Zoller’s research teams have simulated lattice gauge theories in a quantum computer. …

A June 23, 2016 University of Innsbruck (Universität Innsbruck) press release, which seems to have originated the news item, provides more detail,

Gauge theories describe the interaction between elementary particles, such as quarks and gluons, and they are the basis for our understanding of fundamental processes. “Dynamical processes, for example, the collision of elementary particles or the spontaneous creation of particle-antiparticle pairs, are extremely difficult to investigate,” explains Christine Muschik, theoretical physicist at the IQOQI. “However, scientists quickly reach a limit when processing numerical calculations on classical computers. For this reason, it has been proposed to simulate these processes by using a programmable quantum system.” In recent years, many interesting concepts have been proposed, but until now it was impossible to realize them. “We have now developed a new concept that allows us to simulate the spontaneous creation of electron-positron pairs out of the vacuum by using a quantum computer,” says Muschik. The quantum system consists of four electromagnetically trapped calcium ions that are controlled by laser pulses. “Each pair of ions represents a particle-antiparticle pair,” explains experimental physicist Esteban A. Martinez. “We use laser pulses to simulate the electromagnetic field in a vacuum. Then we are able to observe how particle pairs are created by quantum fluctuations from the energy of this field. By looking at the ions’ fluorescence, we see whether particles and antiparticles were created. We are able to modify the parameters of the quantum system, which allows us to observe and study the dynamic process of pair creation.”
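The pair-creation dynamics described above can be sketched numerically. What follows is a toy exact-diagonalization of a spin-encoded lattice Schwinger model on four spins; for brevity it keeps only the hopping (pair creation/annihilation) and staggered-mass terms and drops the gauge-field energy, and the couplings w and m are illustrative values I chose, not the experiment's parameters. Starting from the bare vacuum (the Dirac sea), the particle-number density grows as quantum fluctuations create particle-antiparticle pairs.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
id2 = np.eye(2)

def op(single, site, N):
    """Embed a single-site operator at `site` in an N-spin Hilbert space."""
    mats = [id2] * N
    mats[site] = single
    out = mats[0]
    for m_ in mats[1:]:
        out = np.kron(out, m_)
    return out

N, w, m = 4, 1.0, 0.5          # 4 spins; illustrative hopping and mass
sp = (sx + 1j * sy) / 2        # sigma+ (raises a spin)
sm = (sx - 1j * sy) / 2        # sigma- (lowers a spin)

# Hopping (pair creation/annihilation) plus staggered mass term
H = np.zeros((2**N, 2**N), dtype=complex)
for n in range(N - 1):
    H += w * (op(sp, n, N) @ op(sm, n + 1, N) + op(sm, n, N) @ op(sp, n + 1, N))
for n in range(N):
    H += (m / 2) * (-1)**n * op(sz, n, N)

# Bare vacuum (Dirac sea): sigma_z on site n equals -(-1)^n, i.e. down-up-down-up
ket = {1: np.array([1, 0], dtype=complex), -1: np.array([0, 1], dtype=complex)}
psi0 = ket[-(-1)**0]
for n in range(1, N):
    psi0 = np.kron(psi0, ket[-(-1)**n])

# Particle-number density: nu = (1/N) sum_n (1 + (-1)^n <sigma_z_n>) / 2
num_op = sum(op(id2 + (-1)**n * sz, n, N) for n in range(N)) / (2 * N)

evals, evecs = np.linalg.eigh(H)
def nu(t):
    """Particle-number density after evolving the vacuum for time t."""
    psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
    return float(np.real(psi_t.conj() @ num_op @ psi_t))

print(f"vacuum density: {nu(0.0):.6f}")   # ~0: no particles initially
print(f"density at t=1: {nu(1.0):.6f}")   # >0: pairs created from the vacuum
```

In the experiment each such spin is one trapped ion; here the same dynamics is just diagonalized classically, which is only feasible because the Hilbert space is tiny.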

Combining different fields of physics

With this experiment, the physicists in Innsbruck have built a bridge between two different fields in physics: They have used atomic physics experiments to study questions in high-energy physics. While hundreds of theoretical physicists work on the highly complex theories of the Standard Model and experiments are carried out at extremely expensive facilities, such as the Large Hadron Collider at CERN, quantum simulations may be carried out by small groups in tabletop experiments. “These two approaches complement one another perfectly,” says theoretical physicist Peter Zoller. “We cannot replace the experiments that are done with particle colliders. However, by developing quantum simulators, we may be able to understand these experiments better one day.” Experimental physicist Rainer Blatt adds: “Moreover, we can study new processes by using quantum simulation. For example, in our experiment we also investigated particle entanglement produced during pair creation, which is not possible in a particle collider.” The physicists are convinced that future quantum simulators will potentially be able to solve important questions in high-energy physics that cannot be tackled by conventional methods.

Foundation for a new research field

It was only a few years ago that the idea to combine high-energy and atomic physics was proposed. With this work it has been implemented experimentally for the first time. “This approach is conceptually very different from previous quantum simulation experiments studying many-body physics or quantum chemistry. The simulation of elementary particle processes is theoretically very complex and, therefore, has to satisfy very specific requirements. For this reason it is difficult to develop a suitable protocol,” underlines Zoller. The conditions for the experimental physicists were equally demanding: “This is one of the most complex experiments that has ever been carried out in a trapped-ion quantum computer,” says Blatt. “We are still figuring out how these quantum simulations work and will only gradually be able to apply them to more challenging phenomena.” The great theoretical as well as experimental expertise of the physicists in Innsbruck was crucial for the breakthrough. Both Blatt and Zoller emphasize that they have been doing research on quantum computers for many years now and have gained a lot of experience in their implementation. Innsbruck has become one of the leading centers for research in quantum physics; here, the theoretical and experimental branches work together at an extremely high level, which enables them to gain novel insights into fundamental phenomena.

Here’s a link to and a citation for the paper,

Real-time dynamics of lattice gauge theories with a few-qubit quantum computer by Esteban A. Martinez, Christine A. Muschik, Philipp Schindler, Daniel Nigg, Alexander Erhard, Markus Heyl, Philipp Hauke, Marcello Dalmonte, Thomas Monz, Peter Zoller, & Rainer Blatt. Nature 534, 516–519 (23 June 2016). doi:10.1038/nature18318 Published online 22 June 2016

This paper is behind a paywall.

There is a SoundCloud audio file featuring an explanation of the work from the lead author, Esteban A. Martinez.

Data transmission at 1.44 terabits per second

It’s not only the amount of data we have that is increasing but also the amount we want to transmit from one place to another. An April 14, 2014 news item on ScienceDaily describes a new technique designed to increase data transmission rates,

Miniaturized optical frequency comb sources allow for transmission of data streams of several terabits per second over hundreds of kilometers — this has now been demonstrated by researchers of Karlsruhe Institute of Technology (KIT) and the Swiss École Polytechnique Fédérale de Lausanne (EPFL) in an experiment presented in the journal Nature Photonics. The results may contribute to accelerating data transmission in large computing centers and worldwide communication networks.

In the study presented in Nature Photonics, the scientists of KIT, together with their EPFL colleagues, applied a miniaturized frequency comb as an optical source. They reached a data rate of 1.44 terabits per second, with the data transmitted over a distance of 300 km. This corresponds to a data volume of more than 100 million telephone calls or up to 500,000 high-definition (HD) videos. For the first time, the study shows that miniaturized optical frequency comb sources are suited for coherent data transmission in the terabit range.
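A quick back-of-envelope check of those comparisons (the per-call and per-stream bitrates below are simply back-calculated from the press release's own figures, not stated in it):

```python
rate_bps = 1.44e12  # demonstrated aggregate rate: 1.44 Tbit/s

# Implied per-channel rates if the link were split evenly
per_call_kbps = rate_bps / 100e6 / 1e3      # 100 million simultaneous calls
per_video_mbps = rate_bps / 500_000 / 1e6   # 500,000 simultaneous HD streams

print(per_call_kbps)   # 14.4 -> roughly compressed-voice territory
print(per_video_mbps)  # 2.88 -> a plausible HD streaming bitrate
```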

The April (?) 2014 KIT news release, which originated the news item, describes some of the current transmission technology’s constraints,

The amount of data generated and transmitted worldwide is growing continuously. With the help of light, data can be transmitted rapidly and efficiently. Optical communication is based on glass fibers, through which optical signals can be transmitted over large distances with hardly any losses. So-called wavelength division multiplexing (WDM) techniques allow for the transmission of several data channels independently of each other on a single optical fiber, thereby enabling extremely high data rates. For this purpose, the information is encoded on laser light of different wavelengths, i.e. different colors. However, scalability of such systems is limited, as presently an individual laser is required for each transmission channel. In addition, it is difficult to stabilize the wavelengths of these lasers, which requires additional spectral guard bands between the data channels to prevent crosstalk.

The news release goes on to further describe the new technology using ‘combs’,

Optical frequency combs, for the development of which John Hall and Theodor W. Hänsch received the 2005 Nobel Prize in Physics, consist of many densely spaced spectral lines whose spacings are identical and exactly known. So far, frequency combs have been used mainly for highly precise optical atomic clocks or as optical rulers measuring optical frequencies with utmost precision. However, conventional frequency comb sources are bulky and costly devices and hence not very well suited for use in data transmission. Moreover, the spacing of the spectral lines in conventional frequency combs is often too small and does not correspond to the channel spacing used in optical communications, which is typically larger than 20 GHz.
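To make the line-spacing point concrete, here is a small sketch of how comb lines map onto a WDM channel grid. The 25 GHz spacing and the 193.1 THz anchor frequency are illustrative assumptions on my part (consistent with the "larger than 20 GHz" channel spacing mentioned above), not parameters from the paper:

```python
C = 299_792_458.0     # speed of light, m/s

f_center = 193.1e12   # assumed grid anchor frequency, Hz
f_rep = 25e9          # assumed comb line spacing, Hz (> 20 GHz channel spacing)

# Each comb line sits at an exactly known frequency: f_n = f_center + n * f_rep
lines = [f_center + n * f_rep for n in range(-3, 4)]
for f in lines:
    print(f"{f / 1e12:.4f} THz  ->  {C / f * 1e9:.3f} nm")

# Adjacent lines are exactly f_rep apart, so a single comb source replaces
# one individually stabilized laser per channel
spacings = {round(b - a) for a, b in zip(lines, lines[1:])}
print(spacings)  # {25000000000}
```

This uniform, exactly known spacing is what removes the need to stabilize a separate laser for every channel.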

In their joint experiment, the researchers of KIT and the EPFL have now demonstrated that integrated optical frequency comb sources with large line spacings can be realized on photonic chips and applied for the transmission of large data volumes. For this purpose, they use an optical microresonator made of silicon nitride, into which laser light is coupled via a waveguide and stored for a long time. “Due to the high light intensity in the resonator, the so-called Kerr effect can be exploited to produce a multitude of spectral lines from a single continuous-wave laser beam, hence forming a frequency comb,” explains Jörg Pfeifle, who performed the transmission experiment at KIT. This method to generate these so-called Kerr frequency combs was discovered by Tobias Kippenberg, EPFL, in 2007. Kerr combs are characterized by a large optical bandwidth and can feature line spacings that perfectly meet the requirements of data transmission. The underlying microresonators are produced with the help of complex nanofabrication methods by the EPFL Center of Micronanotechnology. “We are among the few university research groups that are able to produce such samples,” comments Kippenberg. Work at EPFL was funded by the Swiss program “NCCR Nanotera” and the European Space Agency [ESA].

Scientists of KIT’s Institute of Photonics and Quantum Electronics (IPQ) and Institute of Microstructure Technology (IMT) are the first to use such Kerr frequency combs for high-speed data transmission. “The use of Kerr combs might revolutionize communication within data centers, where highly compact transmission systems of high capacity are required most urgently,” Christian Koos says.

Here’s a link to and a citation for the paper,

Coherent terabit communications with microresonator Kerr frequency combs by Joerg Pfeifle, Victor Brasch, Matthias Lauermann, Yimin Yu, Daniel Wegner, Tobias Herr, Klaus Hartinger, Philipp Schindler, Jingshi Li, David Hillerkuss, Rene Schmogrow, Claudius Weimann, Ronald Holzwarth, Wolfgang Freude, Juerg Leuthold, Tobias J. Kippenberg, & Christian Koos. Nature Photonics (2014). doi:10.1038/nphoton.2014.57 Published online 13 April 2014

This paper is behind a paywall.