Tag Archives: quantum computing

D-Wave’s new Advantage quantum computer

Thanks to Bob Yirka’s September 30, 2020 article for phys.org, there’s an announcement of D-Wave Systems’ latest quantum computer and an explanation of how it differs from other quantum computers. Here’s the explanation (Note: Links have been removed),

Over the past several years, several companies have dedicated resources to the development of a true quantum computer that can tackle problems conventional computers cannot handle. Progress on developing such computers has been slow, however, especially when compared with the early development of the conventional computer. As part of the research effort, companies have taken different approaches. Google and IBM, for example, are working on gate-model quantum computer technology, in which qubits are modified as an algorithm is executed. D-Wave, in sharp contrast, has been focused on developing so-called annealer technology, in which qubits are cooled during execution of an algorithm, which allows for passively changing their value.

Comparing the two is next to impossible because of their functional differences. Thus, using 5,000 qubits in the Advantage system does not necessarily mean that it is any more useful than the 100-qubit systems currently being tested by IBM or Google. Still, the announcement suggests that businesses are ready to start taking advantage of the increased capabilities of quantum systems. D-Wave notes that several customers are already using their system for a wide range of applications. Menten AI, for example, has used the system to design new proteins; grocery chain Save-On-Foods has been using it to optimize business operations; Accenture has been using it to develop business applications; Volkswagen has used the system to develop a more efficient car painting system.

Here’s the company’s Sept. 29, 2020 video announcement,

For those who might like some text, there’s a Sept. 29, 2020 D-Wave Systems press release (Note: Links have been removed; this is long),

D-Wave Systems Inc., the leader in quantum computing systems, software, and services, today [Sept. 29, 2020] announced the general availability of its next-generation quantum computing platform, incorporating new hardware, software, and tools to enable and accelerate the delivery of in-production quantum computing applications. Available today in the Leap™ quantum cloud service, the platform includes the Advantage™ quantum system, with more than 5000 qubits and 15-way qubit connectivity, in addition to an expanded hybrid solver service that can run problems with up to one million variables. The combination of the computing power of Advantage and the scale to address real-world problems with the hybrid solver service in Leap enables businesses to run performant, real-time, hybrid quantum applications for the first time.

As part of its commitment to enabling businesses to build in-production quantum applications, the company announced D-Wave Launch™, a jump-start program for businesses who want to get started building hybrid quantum applications today but may need additional support. Bringing together a team of applications experts and a robust partner community, the D-Wave Launch program provides support to help identify the best applications and to translate businesses’ problems into hybrid quantum applications. The extra support helps customers accelerate designing, building, and running their most important and complex applications, while delivering quantum acceleration and performance.

The company also announced a new hybrid solver. The discrete quadratic model (DQM) solver gives developers and businesses the ability to apply the benefits of hybrid quantum computing to new problem classes. Instead of accepting problems with only binary variables (0 or 1), the DQM solver uses other variable sets (e.g. integers from 1 to 500, or red, yellow, and blue), expanding the types of problems that can run on the quantum computer. The DQM solver will be generally available on October 8 [2020].
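
For anyone who would like to see what a “discrete” (non-binary) problem looks like in code, here’s a minimal sketch written against D-Wave’s open-source Ocean SDK (the dimod package), assuming a reasonably recent dimod release. The three-colour example, the variable names, and the solver calls are my own illustration rather than anything from the press release; the hybrid DQM solver itself needs a Leap account, while the brute-force reference solver at the end runs locally.

```python
import dimod

# Toy discrete problem: colour three neighbouring regions red/yellow/blue
# so that neighbours never share a colour (the kind of non-binary variable
# the DQM solver accepts).
colours = ["red", "yellow", "blue"]
regions = ["A", "B", "C"]
neighbours = [("A", "B"), ("B", "C"), ("A", "C")]

dqm = dimod.DiscreteQuadraticModel()
for r in regions:
    dqm.add_variable(len(colours), label=r)   # each variable takes one of 3 "cases"

# Penalise neighbouring regions that choose the same colour.
for u, v in neighbours:
    for c in range(len(colours)):
        dqm.set_quadratic_case(u, c, v, c, 1.0)

# With a Leap account, this would run on the hybrid DQM solver described above:
# from dwave.system import LeapHybridDQMSampler
# sampleset = LeapHybridDQMSampler().sample_dqm(dqm)

# Without one, dimod's brute-force reference solver handles a toy problem like this:
sampleset = dimod.ExactDQMSolver().sample_dqm(dqm)
best = sampleset.first.sample
print({region: colours[case] for region, case in best.items()})
```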

With support for new solvers and larger problem sizes backed by the Advantage system, customers and partners like Menten AI, Save-On-Foods, Accenture, and Volkswagen are building and running hybrid quantum applications that create solutions with business value today.

  • Protein design pioneer Menten AI has developed the first process using hybrid quantum programs to determine protein structure for de novo protein design with very encouraging results often outperforming classical solvers. Menten AI’s unique protein designs have been computationally validated, chemically synthesized, and are being advanced to live-virus testing against COVID-19.
  • Western Canadian grocery retailer Save-On-Foods is using hybrid quantum algorithms to bring grocery optimization solutions to their business, with pilot tests underway in-store. The company has been able to reduce the time an important optimization task takes from 25 hours to a mere 2 minutes of calculations each week. Even more important than the reduction in time is the ability to optimize performance across and between a significant number of business parameters in a way that is challenging using traditional methods.
  • Accenture, a leading global professional services company, is exploring quantum, quantum-inspired, and hybrid solutions to develop applications across industries. Accenture recently conducted a series of business experiments with a banking client to pilot quantum applications for currency arbitrage, credit scoring, and trading optimization, successfully mapping computationally challenging business problems to quantum formulations, enabling quantum readiness.
  • Volkswagen, an early adopter of D-Wave’s annealing quantum computer, has expanded its quantum use cases with the hybrid solver service to build a paint shop scheduling application. The algorithm is designed to optimize the order in which cars are being painted. By using the hybrid solver service, the number of color switches will be reduced significantly, leading to performance improvements.

The Advantage quantum computer and the Leap quantum cloud service include:

  • New Topology: The topology in Advantage makes it the most connected of any commercial quantum system in the world. In the D-Wave 2000Q™ system, qubits may connect to 6 other qubits. In the new Advantage system, each qubit may connect to 15 other qubits. With two-and-a-half times more connectivity, Advantage enables the embedding of larger problems with fewer physical qubits compared to using the D-Wave 2000Q system. The D-Wave Ocean™ software development kit (SDK) includes tools for using the new topology. Information on the topology in Advantage can be found in this white paper, and a getting started video on how to use the new topology can be found here. [A short code sketch comparing the two topologies follows this list.]
  • Increased Qubit Count: With more than 5000 qubits, Advantage more than doubles the qubit count of the D-Wave 2000Q system. More qubits and richer connectivity provide quantum programmers access to a larger, denser, and more powerful graph for building commercial quantum applications.
  • Greater Performance & Problem Size: With up to one million variables, the hybrid solver service in Leap allows businesses to run large-scale, business-critical problems. This, coupled with the new topology and more than 5000 qubits in the Advantage system, expands the complexity and more than doubles the size of problems that can run directly on the quantum processing unit (QPU). In fact, the hybrid solver outperformed or matched the best of 27 classical optimization solvers on 87% of 45 application-relevant inputs tested in MQLib. Additionally, greater connectivity of the QPU allows for more compact embeddings of complex problems. Advantage can find optimal solutions 10 to 30 times faster in some cases, and can find better quality solutions up to 64% of the time, when compared to the D-Wave 2000Q LN QPU.
  • Expansion of Hybrid Software & Tools in Leap: Further investments in the hybrid solver service, new solver classes, ease-of-use, automation, and new tools provide an even more powerful hybrid rapid development environment in Python for business-scale problems.
  • Flexible Access: Advantage, the expanded hybrid solver service, and the upcoming DQM solver are available in the Leap quantum cloud service. All current Leap customers get immediate access with no additional charge, and new customers will benefit from all the new and existing capabilities in Leap. This means that developers and businesses can get started today building in-production hybrid quantum applications. Flexible purchase plans allow developers and forward-thinking businesses to access the D-Wave quantum system in the way that works for them and their business. 
  • Ongoing Releases: D-Wave continues to bring innovations to market with additional hybrid solvers, QPUs, and software updates through the cloud. Interested users and customers can get started today with Advantage and the hybrid solver service, and will benefit from new components of the platform through Leap as they become available.
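
As flagged in the topology bullet above, here’s a rough sketch (my own, using D-Wave’s open-source dwave_networkx package, not anything from the press release) comparing the older Chimera topology in the D-Wave 2000Q with the Pegasus topology in Advantage. These are the idealized, defect-free graphs, so the node counts differ slightly from the working graph of any shipped processor.

```python
import dwave_networkx as dnx

# Ideal (defect-free) topology graphs; real chips have a slightly smaller working graph.
chimera = dnx.chimera_graph(16)   # D-Wave 2000Q-scale Chimera: 2,048 nodes, degree up to 6
pegasus = dnx.pegasus_graph(16)   # Advantage-scale Pegasus: ~5,600 nodes, degree up to 15

for name, graph in [("Chimera C16", chimera), ("Pegasus P16", pegasus)]:
    degrees = [degree for _, degree in graph.degree()]
    print(f"{name}: {graph.number_of_nodes()} qubits, maximum degree {max(degrees)}")
```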

“Today’s general availability of Advantage delivers the first quantum system built specifically for business, and marks the expansion into production scale commercial applications and new problem types with our hybrid solver services. In combination with our new jump-start program to get customers started, this launch continues what we’ve known at D-Wave for a long time: it’s not about hype, it’s about scaling, and delivering systems that provide real business value on real business applications,” said Alan Baratz, CEO, D-Wave. “We also continue to invest in the science of building quantum systems. Advantage was completely re-engineered from the ground up. We’ll take what we’ve learned about connectivity and scale and continue to push the limits of innovation for the next generations of our quantum computers. I’m incredibly proud of the team that has brought us here and the customers and partners who have collaborated with us to build hundreds of early applications and who now are putting applications into production.”

“We are using quantum to design proteins today. Using hybrid quantum applications, we’re able to solve astronomical protein design problems that help us create new protein structures,” said Hans Melo, Co-founder and CEO, Menten AI. “We’ve seen extremely encouraging results with hybrid quantum procedures often finding better solutions than competing classical solvers for de novo protein design. This means we can create better proteins and ultimately enable new drug discoveries.”

“At Save-On-Foods, we have been committed to bringing innovation to our customers for more than 105 years. To that end, we are always looking for new and creative ways to solve problems, especially in an environment that has gotten increasingly complex,” said Andrew Donaher, Vice President, Digital & Analytics at Save-On-Foods. “We’re new to quantum computing, and in a short period of time, we have seen excellent early results. In fact, the early results we see with Advantage and the hybrid solver service from D-Wave are encouraging enough that our goal is to turn our pilot into an in-production business application. Quantum is emerging as a potential competitive edge for our business.”

“Accenture is committed to helping our clients prepare for the arrival of mainstream quantum computing by exploring relevant use cases and conducting business experiments now,” said Marc Carrel-Billiard, Senior Managing Director and Technology Innovation Lead at Accenture. “We’ve been collaborating with D-Wave for several years and with early access to the Advantage system and hybrid solver service we’ve seen performance improvements and advancements in the platform that are important steps for helping to make quantum a reality for clients across industries, creating new sources of competitive advantage.”

“Embracing quantum computing is nothing new for Volkswagen. We were the first to run a hybrid quantum application in production in Lisbon last November with our bus routing application,” said Florian Neukart, Director of Advanced Technologies at Volkswagen Group of America. “At Volkswagen, we are focusing on building up a deep understanding of meaningful applications of quantum computing in a corporate context. The D-Wave system gives us the opportunity to address optimization tasks with a large number of variables at an impressive speed. With this we are taking a step further towards quantum applications that will be suitable for everyday business use.”

I found the description of D-Wave’s customers and how they’re using quantum computing to be quite interesting. For anyone curious about D-Wave Systems, you can find out more here. BTW, the company is located in metro Vancouver (Canada).

Live music by teleportation? Catch up. It’s already happened.

Dr. Alexis Kirke first graced this blog about four years ago, in a July 8, 2016 posting titled, Cornwall (UK) connects with University of Southern California for performance by a quantum computer (D-Wave) and mezzo soprano Juliette Pochin.

Kirke now returns with a study showing how teleportation helped to create a live performance piece, from a July 2, 2020 news item on ScienceDaily,

Teleportation is most commonly the stuff of science fiction and, for many, would conjure up the immortal phrase “Beam me up, Scotty.”

However, a new study has described how its status in science fact could actually be employed as another, and perhaps unlikely, form of entertainment — live music.

Dr Alexis Kirke, Senior Research Fellow in the Interdisciplinary Centre for Computer Music Research at the University of Plymouth (UK), has for the first time shown that a human musician can communicate directly with a quantum computer via teleportation.

The result is a high-tech jamming session, through which a blend of live human and computer-generated sounds come together to create a unique performance piece.

A July 2, 2020 Plymouth University press release (also on EurekAlert), which originated the news item, offers more detail about this latest work along with some information about the 2016 performance and how it all provides insight into how quantum computing might function in the future,

Speaking about the study, published in the current issue of the Journal of New Music Research, Dr Kirke said: “The world is racing to build the first practical and powerful quantum computers, and whoever succeeds first will have a scientific and military advantage because of the extreme computing power of these machines. This research shows for the first time that this much-vaunted advantage can also be helpful in the world of making and performing music. No other work has shown this previously in the arts, and it demonstrates that quantum power is something everyone can appreciate and enjoy.”

Quantum teleportation is the ability to instantaneously transmit quantum information over vast distances, with scientists having previously used it to send information from Earth to an orbiting satellite over 870 miles away.
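
For readers who want to see what teleportation involves at the circuit level, here’s a textbook sketch in Qiskit. It is my own illustration, not the system used in Kirke’s performance, and it assumes a recent Qiskit release for the classically conditioned corrections (if_test). Note that the protocol needs a shared entangled pair plus two ordinary classical bits, which is why, as the press release itself notes further down, it cannot move information faster than light.

```python
from qiskit import ClassicalRegister, QuantumCircuit, QuantumRegister

q = QuantumRegister(3, "q")
c = ClassicalRegister(2, "c")
qc = QuantumCircuit(q, c)

qc.ry(0.8, q[0])       # prepare an arbitrary state on qubit 0 (the one to teleport)
qc.h(q[1])             # create a Bell pair shared between qubits 1 and 2
qc.cx(q[1], q[2])

qc.cx(q[0], q[1])      # Bell-basis measurement of qubits 0 and 1
qc.h(q[0])
qc.measure(q[0], c[0])
qc.measure(q[1], c[1])

# The two classical measurement bits tell the receiver which corrections to apply,
# after which qubit 2 holds the original state.
with qc.if_test((c[1], 1)):
    qc.x(q[2])
with qc.if_test((c[0], 1)):
    qc.z(q[2])

print(qc.draw())
```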

In the current study, Dr Kirke describes how he used a system called MIq (Multi-Agent Interactive qgMuse), in which an IBM quantum computer executes a methodology called Grover’s Algorithm.

Discovered by Lov Grover at Bell Labs in 1996, it was the second main quantum algorithm (after Shor’s algorithm) and gave a huge advantage over traditional computing.

In this instance, it allows the dynamic solving of musical logical rules which, for example, could prevent dissonance or keep to ¾ instead of common time.
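
To give a flavour of how Grover’s algorithm “finds” an option that satisfies a rule, here’s a toy two-qubit sketch in Qiskit. It is my own illustration, not Dr Kirke’s MIq system: the oracle marks the single allowed answer out of four, and one Grover iteration amplifies it so that a measurement returns it with certainty in the ideal case.

```python
from qiskit import QuantumCircuit

grover = QuantumCircuit(2, 2)
grover.h([0, 1])        # start in an even superposition of all four options

grover.cz(0, 1)         # oracle: flip the phase of the one "allowed" option |11>

grover.h([0, 1])        # diffusion operator: reflect amplitudes about their mean
grover.x([0, 1])
grover.cz(0, 1)
grover.x([0, 1])
grover.h([0, 1])

grover.measure([0, 1], [0, 1])
print(grover.draw())    # ideally, measuring now yields '11' every time
```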

It is significantly faster than any classical computer algorithm, and Dr Kirke said that speed was essential because there is actually no way to transmit quantum information other than through teleportation.

The result was that when Dr Kirke played the theme from Game of Thrones on the piano, the computer – a 14-qubit machine housed at IBM in Melbourne – rapidly generated accompanying music that was transmitted back in response.

Dr Kirke, who in 2016 staged the first ever duet between a live singer and a quantum supercomputer, said: “At the moment there are limits to how complex a real-time computer jamming system can be. The number of musical rules that a human improviser knows intuitively would simply take a computer too long to solve to real-time music. Shortcuts have been invented to speed up this process in rule-based AI music, but using the quantum computer speed-up has not been tried before. So while teleportation cannot move information faster than the speed of light, if remote collaborators want to connect up their quantum computers – which they are using to increase the speed of their musical AIs – it is 100% necessary. Quantum information simply cannot be transmitted using normal digital transmission systems.”

Caption: Dr Alexis Kirke (right) and soprano Juliette Pochin during the first duet between a live singer and a quantum supercomputer. Credit: University of Plymouth

Here’s a link to and a citation for the latest research,

Testing a hybrid hardware quantum multi-agent system architecture that utilizes the quantum speed advantage for interactive computer music by Alexis Kirke. Journal of New Music Research Volume 49, 2020 – Issue 3 Pages 209-230 DOI: https://doi.org/10.1080/09298215.2020.1749672 Published online: 13 Apr 2020

This paper appears to be open access.

Fourth Industrial Revolution and its impact on charity organizations

Andy Levy-Ajzenkopf’s February 21, 2020 article (Technology and innovation: How the Fourth Industrial Revolution is impacting the charitable sector) for Charity Village has an ebullient approach to adoption of new and emerging technologies in the charitable sector (Note: A link has been removed),

Almost daily, new technologies are being developed to help innovate the way people give or the way organizations offer opportunities to advance their causes. There is no going back.

The charitable sector – along with society at large – is now fully in the midst of what is being called the Fourth Industrial Revolution, a term first brought to prominence among CEOs, thought leaders and policy makers at the 2016 World Economic Forum. And if you haven’t heard the phrase yet, get ready to hear it tons more as economies around the world embrace it.

To be clear, the Fourth Industrial Revolution is the newest disruption in the way our world works. When you hear someone talk about it, what they’re describing is the massive technological shift in our business and personal ecosystems that now rely heavily on things like artificial intelligence, quantum computing, 3D printing and the general “Internet of things.”

Still, now more than ever, charitable business is getting done and being advanced by sector pioneers who aren’t afraid to make use of new technologies on offer to help civil society.

It seems like everywhere one turns, the topic of artificial intelligence (A.I.) is increasingly becoming the subject of choice.

This is no different in the charitable sector, and particularly so for a new company called Fundraise Wisely (aka Wisely). Its co-founder and CEO, Artiom Komarov, explains a bit about what exactly his tech is doing for the sector.

“We help accelerate fundraising, with A.I. At a product level, we connect to your CRM (customer relationship management system) and predict the next gift and next gift date for every donor. We then use that information to help you populate and prioritize donor portfolios,” Komarov states.
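
To make the idea concrete, here’s a hand-wavy sketch of that kind of “next gift” prediction using scikit-learn on made-up donor features. Wisely’s actual models, features, and CRM integration aren’t public, so everything below (the feature choices, the numbers, the regression model) is purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy features per donor: [days since last gift, number of past gifts, average past gift ($)]
X = np.array([
    [30, 12, 50.0],
    [400, 2, 25.0],
    [90, 6, 100.0],
    [15, 30, 20.0],
])
y = np.array([55.0, 0.0, 110.0, 20.0])   # next gift amount actually received ($)

model = LinearRegression().fit(X, y)     # fit on historical CRM records

new_donor = np.array([[60, 8, 75.0]])    # a donor we want a prediction for
print(f"Predicted next gift: ${model.predict(new_donor)[0]:.2f}")
```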

He notes that his company is seeing increased demand for innovative technologies from charities over the last while.

“What we’re hearing is that… A.I. tech is compelling because at the end of the day it’s meant to move the bottom line, helping nonprofits grow their revenue. We’ve also found that internally [at a charitable organization] there’s always a champion that sees the potential impact of technology; and that’s a great place to start with change,” Komarov says. “If it’s done right, tech can be an enabler of better work for organizations. From both research and experience, we know that tech adoption usually fails because of culture rather than the underlying technology. We’re here to work with the client closely to help that transition.”

I would like to have seen some numbers. For example, Komarov says that AI is having a positive impact on a charity’s bottom line. So, how much money did one of these charities raise? Was it more money than they would have made without AI? Assuming they did manage to raise greater funds, could another technology have been more cost-effective?

For another perspective (equally positive) on technology and charity, there’s a November 29, 2012 posting (Why technology and innovation are key to increasing charity donations) on the Guardian blogs by Henna Butt and Renita Shah (Note: Links have been removed),

At the beginning of this year the [UK] Cabinet Office and Nesta [formerly National Endowment for Science, Technology and the Arts {NESTA}] announced a £10m fund to invest in innovation in giving. The first tranche of this money has already been invested in promising initiatives such as Timto which allows you to create a gift list that includes a charity donation and Pennies, whose electronic money box allows customers to donate when paying for something in a shop using a credit card. Small and sizeable organisations alike are now using web and mobile technologies to make giving more convenient, more social and more compelling.

Butt’s and Shah’s focus was on mobile technologies and social networks. Like Levy-Ajzenkopf’s article, there’s no discussion of any possible downside to these technologies, e.g., privacy issues. As well, the inevitability of this move toward more technology for charity is explicitly stated by Levy-Ajzenkopf “There is no going back” and noted less starkly by Butt and Shah “… innovation is becoming increasingly important for the success of charities.” To rephrase my concern, are we utilizing technology in our work or are we serving the needs of our technology?

Finally, for anyone who’s curious about the Fourth Industrial Revolution, I have a December 3, 2015 posting about it.

Cryonaut LEGO ®, quantum computing, and Season’s Greetings for 2019!

Caption: For the first time, LEGO ® has been cooled to the lowest temperature possible in an experiment which reveals a new use for the popular toy. Credit: Josh Chawner

Pretty interesting science and seasonally appropriate for large numbers of people, this video was posted on December 23, 2019 (from YouTube’s The World’s Coolest LEGO Set! webpage),

Hamster Productions: Our LEGO insulator paper: https://nature.com/articles/s41598-01… A world leading team of ultra-low temperature physicists at Lancaster University decided to place a LEGO figure and four LEGO blocks inside their record-breaking dilution refrigerator. This machine – specially made at the University – is the most effective refrigerator in the world, capable of reaching 1.6 millidegrees above absolute zero (minus 273.15 Centigrade), which is about 200,000 times colder than room temperature and 2,000 times colder than deep space. This research was led by Low Temperature Physicist Dr. Dmitry Zmeev (https://twitter.com/dmitry_zmeev).

From a December 23, 2019 news item on ScienceDaily,

For the first time, LEGO ® has been cooled to the lowest temperature possible in an experiment which reveals a new use for the popular toy.

Its special properties mean it could be useful in the development of quantum computing.

A world leading team of ultra-low temperature physicists at Lancaster University decided to place a LEGO ® figure and four LEGO ® blocks inside their record-breaking dilution refrigerator.

This machine — specially made at the University — is the most effective refrigerator in the world, capable of reaching 1.6 millidegrees above absolute zero (minus 273.15 Centigrade), which is about 200,000 times colder than room temperature and 2,000 times colder than deep space.
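
A quick back-of-envelope check of those comparisons (my own arithmetic, taking room temperature as roughly 293 K and deep space as the 2.7 K cosmic microwave background):

```python
# Sanity-check the "200,000 times colder than room temperature" and
# "2,000 times colder than deep space" figures quoted above.
room_temperature_k = 293.0   # roughly 20 degrees Celsius
deep_space_k = 2.7           # cosmic microwave background temperature
fridge_k = 0.0016            # 1.6 millikelvin

print(f"vs room temperature: {room_temperature_k / fridge_k:,.0f} times colder")  # ~183,000
print(f"vs deep space:       {deep_space_k / fridge_k:,.0f} times colder")        # ~1,700
```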

The results — published in the journal Scientific Reports — were surprising.

A December 23, 2019 Lancaster University press release (also on EurekAlert), which originated the news item, expands on the theme,

Dr Dmitry Zmeev, who led the research team, said: “Our results are significant because we found that the clamping arrangement between the LEGO ® blocks causes the LEGO ® structures to behave as an extremely good thermal insulator at cryogenic temperatures.

“This is very desirable for construction materials used for the design of future scientific equipment like dilution refrigerators.”

Invented 50 years ago, the dilution refrigerator is at the centre of a global multi-billion dollar industry and is crucial to the work of modern experimental physics and engineering, including the development of quantum computers.

The use of ABS plastic structures, such as LEGO ®, instead of the solid materials currently in use, means that any future thermal insulator could be produced at a significantly reduced cost.

Researchers say the next step is to design and 3D print a new thermal insulator for the next generation of dilution refrigerators.

Here’s a link to and a citation for the paper,

LEGO® Block Structures as a Sub-Kelvin Thermal Insulator by J. M. A. Chawner, A. T. Jones, M. T. Noble, G. R. Pickett, V. Tsepelin & D. E. Zmeev. Scientific Reports volume 9, Article number: 19642 (2019) doi:10.1038/s41598-019-55616-7 Published 23 December 2019

This paper is open access.

Finally, Joyeux Noël et Bonne année 2020!

‘Superconductivity: The Musical!’ wins the 2018 Dance Your Ph.D. competition

I can’t believe that October 24, 2011 was the last time the Dance Your Ph.D. competition was featured here. Time flies, eh? Here’s the 2018 contest winner’s submission, Superconductivity: The Musical!, (Note: This video is over 11 mins. long),

A February 17, 2019 CBC (Canadian Broadcasting Corporation) news item introduces the video’s writer, producer, musician, and scientist,

Swing dancing. Songwriting. And theoretical condensed matter physics.

It’s a unique person who can master all three, but a University of Alberta PhD student has done all that and taken it one step further by making a rollicking music video about his academic pursuits — and winning an international competition for his efforts.

Pramodh Senarath Yapa is the winner of the 2018 Dance Your PhD contest, which challenges scientists around the world to explain their research through a jargon-free medium: dance.

The prize is $1,000 and “immortal geek fame.”

Yapa’s video features his friends twirling, swinging and touch-stepping their way through an explanation of his graduate research, called “Non-Local Electrodynamics of Superconducting Wires: Implications for Flux Noise and Inductance.”

Jennifer Ouellette’s February 17, 2019 posting for the Ars Technica blog offers more detail (Note: A link has been removed),

Yapa’s research deals with how matter behaves when it’s cooled to very low temperatures, when quantum effects kick in—such as certain metals becoming superconductive, or capable of conducting electricity with zero resistance. That’s useful for any number of practical applications. D-Wave Systems [a company located in metro Vancouver {Canada}], for example, is building quantum computers using loops of superconducting wire. For his thesis, “I had to use the theory of superconductivity to figure out how to build a better quantum computer,” said Yapa.

Condensed matter theory (the precise description of Yapa’s field of research) is a notoriously tricky subfield to make palatable for a non-expert audience. “There isn’t one unifying theory or a single tool that we use,” he said. “Condensed matter theorists study a million different things using a million different techniques.”

His conceptual breakthrough came about when he realized electrons were a bit like “unsociable people” who find joy when they pair up with other electrons. “You can imagine electrons as a free gas, which means they don’t interact with each other,” he said. “The theory of superconductivity says they actually form pairs when cooled below a certain temperature. That was the ‘Eureka!’ moment, when I realized I could totally use swing dancing.”

John Bohannon’s Feb. 15, 2019 article for Science (magazine) offers an update on Yapa’s research interests (it seems that Yapa was dancing his master’s degree) and more information about the contest itself,

..

“I remember hearing about Dance Your Ph.D. many years ago and being amazed at all the entries,” Yapa says. “This is definitely a longtime dream come true.” His research, meanwhile, has evolved from superconductivity—which he pursued at the University of Victoria in Canada, where he completed a master’s degree—to the physics of superfluids, the focus of his Ph.D. research at the University of Alberta.

This is the 11th year of Dance Your Ph.D. hosted by Science and AAAS. The contest challenges scientists around the world to explain their research through the most jargon-free medium available: interpretive dance.

“Most people would not normally think of interpretive dance as a tool for scientific communication,” says artist Alexa Meade, one of the judges of the contest. “However, the body can express conceptual thoughts through movement in ways that words and data tables cannot. The results are both artfully poetic and scientifically profound.”

Getting back to the February 17, 2019 CBC news item,

Yapa describes his video, filmed in Victoria where he earned his master’s degree, as a “three act, mini-musical.”

“I envisioned it as talking about the social lives of electrons,” he said. “The electrons start out in a normal metal, at normal temperatures…. We say these electrons are non-interacting. They don’t talk to each other. Electrons ignore each other and are very unsociable.”

The electrons — represented by dancers wearing saddle oxfords, poodle skirts, vests and suspenders — shuffle up the dance floor by themselves.

In the second act, the metal is cooled.

“The electrons become very unhappy about being alone. They want to find a partner, some companionship for the cold times,” he said.

That’s when the electrons join up into something called Cooper pairs.

The dancers join together, moving to lyrics like, “If we peek/the Coopers are cheek-to-cheek.”

In the final act, Yapa gets his dancers to demonstrate what happens when the Cooper pairs meet the impurities of the materials they’re moving in. All of a sudden, a group of black-leather-clad thugs move onto the dance floor.

“The Cooper pairs come dancing near these impurities and they’re like these crotchety old people yelling and shaking their fists at these young dancers,” Yapa explained.

Yapa’s entry to the annual contest swept past 49 other contestants to earn him the win. The competition is sponsored by Science magazine and the American Association for the Advancement of Science.

Congratulations to Pramodh Senarath Yapa.

D-Wave and the first large-scale quantum simulation of topological state of matter

This is all about a local (Burnaby is one of the metro Vancouver municipalities) quantum computing company, D-Wave Systems. The company has been featured here from time to time, usually for its quantum technology (it is considered a technology star in local and [I think] other circles), but my March 9, 2018 posting about the SXSW (South by Southwest) festival noted that Bo Ewald, President, D-Wave Systems US, was a member of the ‘Quantum Computing: Science Fiction to Science Fact’ panel.

Now, they’re back making technology announcements like this August 22, 2018 news item on phys.org (Note: Links have been removed),

D-Wave Systems today [August 22, 2018] published a milestone study demonstrating a topological phase transition using its 2048-qubit annealing quantum computer. This complex quantum simulation of materials is a major step toward reducing the need for time-consuming and expensive physical research and development.

The paper, entitled “Observation of topological phenomena in a programmable lattice of 1,800 qubits”, was published in the peer-reviewed journal Nature. This work marks an important advancement in the field and demonstrates again that the fully programmable D-Wave quantum computer can be used as an accurate simulator of quantum systems at a large scale. The methods used in this work could have broad implications in the development of novel materials, realizing Richard Feynman’s original vision of a quantum simulator. This new research comes on the heels of D-Wave’s recent Science paper demonstrating a different type of phase transition in a quantum spin-glass simulation. The two papers together signify the flexibility and versatility of the D-Wave quantum computer in quantum simulation of materials, in addition to other tasks such as optimization and machine learning.

An August 22, 2018 D-Wave Systems news release (also on EurekAlert), which originated the news item, delves further (Note: A link has been removed),

In the early 1970s, theoretical physicists Vadim Berezinskii, J. Michael Kosterlitz and David Thouless predicted a new state of matter characterized by nontrivial topological properties. The work was awarded the Nobel Prize in Physics in 2016. D-Wave researchers demonstrated this phenomenon by programming the D-Wave 2000Q™ system to form a two-dimensional frustrated lattice of artificial spins. The observed topological properties in the simulated system cannot exist without quantum effects and closely agree with theoretical predictions.

“This paper represents a breakthrough in the simulation of physical systems which are otherwise essentially impossible,” said 2016 Nobel laureate Dr. J. Michael Kosterlitz. “The test reproduces most of the expected results, which is a remarkable achievement. This gives hope that future quantum simulators will be able to explore more complex and poorly understood systems so that one can trust the simulation results in quantitative detail as a model of a physical system. I look forward to seeing future applications of this simulation method.”

“The work described in the Nature paper represents a landmark in the field of quantum computation: for the first time, a theoretically predicted state of matter was realized in quantum simulation before being demonstrated in a real magnetic material,” said Dr. Mohammad Amin, chief scientist at D-Wave. “This is a significant step toward reaching the goal of quantum simulation, enabling the study of material properties before making them in the lab, a process that today can be very costly and time consuming.”

“Successfully demonstrating physics of Nobel Prize-winning importance on a D-Wave quantum computer is a significant achievement in and of itself. But in combination with D-Wave’s recent quantum simulation work published in Science, this new research demonstrates the flexibility and programmability of our system to tackle recognized, difficult problems in a variety of areas,” said Vern Brownell, D-Wave CEO.

“D-Wave’s quantum simulation of the Kosterlitz-Thouless transition is an exciting and impactful result. It not only contributes to our understanding of important problems in quantum magnetism, but also demonstrates solving a computationally hard problem with a novel and efficient mapping of the spin system, requiring only a limited number of qubits and opening new possibilities for solving a broader range of applications,” said Dr. John Sarrao, principal associate director for science, technology, and engineering at Los Alamos National Laboratory.

“The ability to demonstrate two very different quantum simulations, as we reported in Science and Nature, using the same quantum processor, illustrates the programmability and flexibility of D-Wave’s quantum computer,” said Dr. Andrew King, principal investigator for this work at D-Wave. “This programmability and flexibility were two key ingredients in Richard Feynman’s original vision of a quantum simulator and open up the possibility of predicting the behavior of more complex engineered quantum systems in the future.”

The achievements presented in Nature and Science join D-Wave’s continued work with world-class customers and partners on real-world prototype applications (“proto-apps”) across a variety of fields. The 70+ proto-apps developed by customers span optimization, machine learning, quantum material science, cybersecurity, and more. Many of the proto-apps’ results show that D-Wave systems are approaching, and sometimes surpassing, conventional computing in terms of performance or solution quality on real problems, at pre-commercial scale. As the power of D-Wave systems and software expands, these proto-apps point to the potential for scaled customer application advantage on quantum computers.

The company has prepared a video describing Richard Feynman’s proposal about quantum computing and celebrating their latest achievement,

Here’s the company’s Youtube video description,

In 1982, Richard Feynman proposed the idea of simulating the quantum physics of complex systems with a programmable quantum computer. In August 2018, his vision was realized when researchers from D-Wave Systems and the Vector Institute demonstrated the simulation of a topological phase transition—the subject of the 2016 Nobel Prize in Physics—in a fully programmable D-Wave 2000Q™ annealing quantum computer. This complex quantum simulation of materials is a major step toward reducing the need for time-consuming and expensive physical research and development.

You may want to check out the comments in response to the video.

Here’s a link to and a citation for the Nature paper,

Observation of topological phenomena in a programmable lattice of 1,800 qubits by Andrew D. King, Juan Carrasquilla, Jack Raymond, Isil Ozfidan, Evgeny Andriyash, Andrew Berkley, Mauricio Reis, Trevor Lanting, Richard Harris, Fabio Altomare, Kelly Boothby, Paul I. Bunyk, Colin Enderud, Alexandre Fréchette, Emile Hoskinson, Nicolas Ladizinsky, Travis Oh, Gabriel Poulin-Lamarre, Christopher Rich, Yuki Sato, Anatoly Yu. Smirnov, Loren J. Swenson, Mark H. Volkmann, Jed Whittaker, Jason Yao, Eric Ladizinsky, Mark W. Johnson, Jeremy Hilton, & Mohammad H. Amin. Nature volume 560, pages 456–460 (2018) DOI: https://doi.org/10.1038/s41586-018-0410-x Published 22 August 2018

This paper is behind a paywall but, for those who don’t have access, there is a synopsis here.

For anyone curious about the earlier paper published in July 2018, here’s a link and a citation,

Phase transitions in a programmable quantum spin glass simulator by R. Harris, Y. Sato, A. J. Berkley, M. Reis, F. Altomare, M. H. Amin, K. Boothby, P. Bunyk, C. Deng, C. Enderud, S. Huang, E. Hoskinson, M. W. Johnson, E. Ladizinsky, N. Ladizinsky, T. Lanting, R. Li, T. Medina, R. Molavi, R. Neufeld, T. Oh, I. Pavlov, I. Perminov, G. Poulin-Lamarre, C. Rich, A. Smirnov, L. Swenson, N. Tsai, M. Volkmann, J. Whittaker, J. Yao. Science 13 Jul 2018: Vol. 361, Issue 6398, pp. 162-165 DOI: 10.1126/science.aat2025

This paper too is behind a paywall.

You can find out more about D-Wave here.

More memory, less space and a walk down the cryptocurrency road

Libraries, archives, records management, oral history, etc.: there are many institutions and names for how we manage collective and personal memory. You might call it a peculiarly human obsession stretching back into antiquity. For example, there’s the Library of Alexandria (Wikipedia entry), founded in the third, or possibly second, century BCE (before the common era) and reputed to store all the knowledge in the world. It was destroyed, although accounts differ as to when and how, but its loss remains a potent reminder of memory’s fragility.

These days, the technology community is terribly concerned with storing ever more bits of data on materials that are reaching their limits for storage. I have news of a possible solution, an interview of sorts with the researchers working on this new technology, and some very recent research into policies for cryptocurrency mining and development. That bit about cryptocurrency makes more sense when you read the response to one of the interview questions.

Memory

It seems University of Alberta researchers may have found a way to increase memory exponentially, from a July 23, 2018 news item on ScienceDaily,

The most dense solid-state memory ever created could soon exceed the capabilities of current computer storage devices by 1,000 times, thanks to a new technique scientists at the University of Alberta have perfected.

“Essentially, you can take all 45 million songs on iTunes and store them on the surface of one quarter,” said Roshan Achal, PhD student in Department of Physics and lead author on the new research. “Five years ago, this wasn’t even something we thought possible.”

A July 23, 2018 University of Alberta news release (also on EurekAlert) by Jennifer-Anne Pascoe, which originated the news item, provides more information,

Previous discoveries were stable only at cryogenic conditions, meaning this new finding puts society light years closer to meeting the need for more storage for the current and continued deluge of data. One of the most exciting features of this memory is that it’s road-ready for real-world temperatures, as it can withstand normal use and transportation beyond the lab.

“What is often overlooked in the nanofabrication business is actual transportation to an end user, that simply was not possible until now given temperature restrictions,” continued Achal. “Our memory is stable well above room temperature and precise down to the atom.”

Achal explained that immediate applications will be data archival. Next steps will be increasing readout and writing speeds, meaning even more flexible applications.

More memory, less space

Achal works with University of Alberta physics professor Robert Wolkow, a pioneer in the field of atomic-scale physics. Wolkow perfected the art of the science behind nanotip technology, which, thanks to Wolkow and his team’s continued work, has now reached a tipping point, meaning scaling up atomic-scale manufacturing for commercialization.

“With this last piece of the puzzle now in-hand, atom-scale fabrication will become a commercial reality in the very near future,” said Wolkow. Wolkow’s Spin-off [sic] company, Quantum Silicon Inc., is hard at work on commercializing atom-scale fabrication for use in all areas of the technology sector.

To demonstrate the new discovery, Achal, Wolkow, and their fellow scientists not only fabricated the world’s smallest maple leaf, they also encoded the entire alphabet at a density of 138 terabytes per square inch, roughly equivalent to writing 350,000 letters across a grain of rice. For a playful twist, Achal also encoded music as an atom-sized song, the first 24 notes of which will make any video-game player of the 80s and 90s nostalgic for yesteryear but excited for the future of technology and society.
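
Out of curiosity, here’s a rough sanity check of the “45 million songs on a quarter” claim (my own arithmetic, assuming the 138 terabytes per square inch figure and an average compressed song of about 2 MB):

```python
import math

# Back-of-envelope check: how much data fits on one face of a quarter at 138 TB per square inch?
quarter_diameter_cm = 2.4                                              # a Canadian/US quarter is ~24 mm across
quarter_area_in2 = math.pi * (quarter_diameter_cm / 2) ** 2 / 6.4516   # convert cm^2 to in^2

capacity_tb = 138 * quarter_area_in2        # roughly 97 TB
songs = capacity_tb * 1_000_000 / 2.0       # TB -> MB, at ~2 MB per compressed song

print(f"~{capacity_tb:.0f} TB on a quarter, or roughly {songs / 1e6:.0f} million 2 MB songs")
```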

As noted in the news release, there is an atom-sized song, which is available in this video,

As for the nano-sized maple leaf, I highlighted that bit of whimsy in a June 30, 2017 posting.

Here’s a link to and a citation for the paper,

Lithography for robust and editable atomic-scale silicon devices and memories by Roshan Achal, Mohammad Rashidi, Jeremiah Croshaw, David Churchill, Marco Taucer, Taleana Huff, Martin Cloutier, Jason Pitters, & Robert A. Wolkow. Nature Communications volume 9, Article number: 2778 (2018) DOI: https://doi.org/10.1038/s41467-018-05171-y Published 23 July 2018

This paper is open access.

For interested parties, you can find Quantum Silicon (QSI) here. My Edmonton geography is all but nonexistent; still, it seems to me the company address on Saskatchewan Drive is a University of Alberta address. It’s also the address for the National Research Council of Canada. Perhaps this is a university/government spin-off company?

The ‘interview’

I sent some questions to the researchers at the University of Alberta who very kindly provided me with the following answers. Roshan Achal passed on one of the questions to his colleague Taleana Huff for her response. Both Achal and Huff are associated with QSI.

Unfortunately I could not find any pictures of all three researchers (Achal, Huff, and Wolkow) together.

Roshan Achal (left) used nanotechnology perfected by his PhD supervisor, Robert Wolkow (right) to create atomic-scale computer memory that could exceed the capacity of today’s solid-state storage drives by 1,000 times. (Photo: Faculty of Science)

(1) SHRINKING THE MANUFACTURING PROCESS TO THE ATOMIC SCALE HAS
ATTRACTED A LOT OF ATTENTION OVER THE YEARS, STARTING WITH SCIENCE
FICTION OR RICHARD FEYNMAN OR K. ERIC DREXLER, ETC. IN ANY EVENT, THE
ORIGINS ARE CONTESTED, SO I WON’T PUT YOU ON THE SPOT BY ASKING WHO
STARTED IT ALL; INSTEAD, HOW DID YOU GET STARTED?

I got started in this field about 6 years ago, when I undertook an MSc
with Dr. Wolkow here at the University of Alberta. Before that point, I
had only ever heard of a scanning tunneling microscope from what was
taught in my classes. I was aware of the famous IBM logo made up from
just a handful of atoms using this machine, but I didn’t know what
else could be done. Here, Dr. Wolkow introduced me to his line of
research, and I saw the immense potential for growth in this area and
decided to pursue it further. I had the chance to interact with and
learn from nanofabrication experts and gain the skills necessary to
begin playing around with my own techniques and ideas during my PhD.

(2) AS I UNDERSTAND IT, THESE ARE THE PIECES YOU’VE BEEN
WORKING ON: (1) THE TUNGSTEN MICROSCOPE TIP, WHICH MAKE[s] (2) THE SMALLEST
QUANTUM DOTS (SINGLE ATOMS OF SILICON), (3) THE AUTOMATION OF THE
QUANTUM DOT PRODUCTION PROCESS, AND (4) THE “MOST DENSE SOLID-STATE
MEMORY EVER CREATED.” WHAT’S MISSING FROM THE LIST AND IS THAT WHAT
YOU’RE WORKING ON NOW?

One of the things missing from the list, that we are currently working
on, is the ability to easily communicate (electrically) from the
macroscale (our world) to the nanoscale, without the use of a scanning
tunneling microscope. With this, we would be able to then construct
devices using the other pieces we’ve developed up to this point, and
then integrate them with more conventional electronics. This would bring
us yet another step closer to the realization of atomic-scale
electronics.

(3) PERHAPS YOU COULD CLARIFY SOMETHING FOR ME. USUALLY WHEN SOLID STATE
MEMORY IS MENTIONED, THERE’S GREAT CONCERN ABOUT MOORE’S LAW. IS
THIS WORK GOING TO CREATE A NEW LAW? AND, WHAT, IF ANYTHING, DOES
YOUR MEMORY DEVICE HAVE TO DO WITH QUANTUM COMPUTING?

That is an interesting question. With the density we’ve achieved,
there are not too many surfaces where atomic sites are more closely
spaced to allow for another factor of two improvement. In that sense, it
would be difficult to improve memory densities further using these
techniques alone. In order to continue Moore’s law, new techniques, or
storage methods would have to be developed to move beyond atomic-scale
storage.

The memory design itself does not have anything to do with quantum
computing, however, the lithographic techniques developed through our
work, may enable the development of certain quantum-dot-based quantum
computing schemes.

(4) THIS MAY BE A LITTLE OUT OF LEFT FIELD (OR FURTHER OUT THAN THE
OTHERS). COULD YOUR MEMORY DEVICE HAVE AN IMPACT ON THE
DEVELOPMENT OF CRYPTOCURRENCY AND BLOCKCHAIN? IF SO, WHAT MIGHT THAT
IMPACT BE?

I am not very familiar with these topics, however, co-author Taleana
Huff has provided some thoughts:

Taleana Huff (downloaded from https://ca.linkedin.com/in/taleana-huff)

“The memory, as we’ve designed it, might not have too much of an
impact in and of itself. Cryptocurrencies fall into two categories.
Proof of Work and Proof of Stake. Proof of Work relies on raw
computational power to solve a difficult math problem. If you solve it,
you get rewarded with a small amount of that coin. The problem is that
it can take a lot of power and energy for your computer to crunch
through that problem. Faster access to memory alone could perhaps
streamline small parts of this slightly, but it would be very slight.
Proof of Stake is already quite power efficient and wouldn’t really
have a drastic advantage from better faster computers.

Now, atomic-scale circuitry built using these new lithographic
techniques that we’ve developed, which could perform computations at
significantly lower energy costs, would be huge for Proof of Work coins.
One of the things holding bitcoin back, for example, is that mining it
is now consuming power on the order of the annual energy consumption
required by small countries. A more efficient way to mine while still
taking the same amount of time to solve the problem would make bitcoin
much more attractive as a currency.”
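
For anyone unfamiliar with “Proof of Work,” here’s a toy sketch of the general idea Huff describes (my own illustration, not Bitcoin’s actual implementation): grind through nonces until a SHA-256 hash falls below a target. The energy cost comes from doing this brute-force search at enormous scale, which is where more efficient hardware could matter.

```python
import hashlib
from itertools import count

def mine(block_data: str, difficulty_bits: int = 20) -> int:
    """Return a nonce whose SHA-256 hash with block_data has `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce   # "proof" that roughly 2**difficulty_bits hashes were tried

print(f"Found nonce: {mine('example block header')}")
```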

Thank you to Roshan Achal and Taleana Huff for helping me to further explore the implications of their work with Dr. Wolkow.

Comments

As usual, after receiving the replies I have more questions but these people have other things to do so I’ll content myself with noting that there is something extraordinary in the fact that we can imagine a near future where atomic scale manufacturing is possible and where, as Achal says, “… storage methods would have to be developed to move beyond atomic-scale [emphasis mine] storage”. In decades past it was the stuff of science fiction or of theorists who didn’t have the tools to turn the idea into a reality. With Wolkow’s, Achal’s, Huff’s, and their colleagues’ work, atomic scale manufacturing is attainable in the foreseeable future.

Hopefully we’ll be wiser than we have been in the past in how we deploy these new manufacturing techniques. Of course, before we need the wisdom, scientists, as  Achal notes,  need to find a new way to communicate between the macroscale and the nanoscale.

As for Huff’s comments about cryptocurrency and blockchain technology, I stumbled across some very recent research, from a July 31, 2018 Elsevier press release (also on EurekAlert),

A study [behind a paywall] published in Energy Research & Social Science warns that failure to lower the energy use by Bitcoin and similar Blockchain designs may prevent nations from reaching their climate change mitigation obligations under the Paris Agreement.

The study, authored by Jon Truby, PhD, Assistant Professor, Director of the Centre for Law & Development, College of Law, Qatar University, Doha, Qatar, evaluates the financial and legal options available to lawmakers to moderate blockchain-related energy consumption and foster a sustainable and innovative technology sector. Based on this rigorous review and analysis of the technologies, ownership models, and jurisdictional case law and practices, the article recommends an approach that imposes new taxes, charges, or restrictions to reduce demand by users, miners, and miner manufacturers who employ polluting technologies, and offers incentives that encourage developers to create less energy-intensive/carbon-neutral Blockchain.

“Digital currency mining is the first major industry developed from Blockchain, because its transactions alone consume more electricity than entire nations,” said Dr. Truby. “It needs to be directed towards sustainability if it is to realize its potential advantages.

“Many developers have taken no account of the environmental impact of their designs, so we must encourage them to adopt consensus protocols that do not result in high emissions. Taking no action means we are subsidizing high energy-consuming technology and causing future Blockchain developers to follow the same harmful path. We need to de-socialize the environmental costs involved while continuing to encourage progress of this important technology to unlock its potential economic, environmental, and social benefits,” explained Dr. Truby.

As a digital ledger that is accessible to, and trusted by all participants, Blockchain technology decentralizes and transforms the exchange of assets through peer-to-peer verification and payments. Blockchain technology has been advocated as being capable of delivering environmental and social benefits under the UN’s Sustainable Development Goals. However, Bitcoin’s system has been built in a way that is reminiscent of physical mining of natural resources – costs and efforts rise as the system reaches the ultimate resource limit and the mining of new resources requires increasing hardware resources, which consume huge amounts of electricity.

Putting this into perspective, Dr. Truby said, “the processes involved in a single Bitcoin transaction could provide electricity to a British home for a month – with the environmental costs socialized for private benefit.

“Bitcoin is here to stay, and so, future models must be designed without reliance on energy consumption so disproportionate on their economic or social benefits.”

The study evaluates various Blockchain technologies by their carbon footprints and recommends how to tax or restrict Blockchain types at different phases of production and use to discourage polluting versions and encourage cleaner alternatives. It also analyzes the legal measures that can be introduced to encourage technology innovators to develop low-emissions Blockchain designs. The specific recommendations include imposing levies to prevent path-dependent inertia from constraining innovation:

  • Registration fees collected by brokers from digital coin buyers.
  • “Bitcoin Sin Tax” surcharge on digital currency ownership.
  • Green taxes and restrictions on machinery purchases/imports (e.g. Bitcoin mining machines).
  • Smart contract transaction charges.

According to Dr. Truby, these findings may lead to new taxes, charges or restrictions, but could also lead to financial rewards for innovators developing carbon-neutral Blockchain.

The press release doesn’t fully reflect Dr. Truby’s thoughtfulness or the incentives he has suggested; it’s not all surcharges, taxes, and fees, since some of his recommendations constitute encouragement. Here’s a sample from the conclusion,

The possibilities of Blockchain are endless and incentivisation can help solve various climate change issues, such as through the development of digital currencies to fund climate finance programmes. This type of public-private finance initiative is envisioned in the Paris Agreement, and fiscal tools can incentivize innovators to design financially rewarding Blockchain technology that also achieves environmental goals. Bitcoin, for example, has various utilitarian intentions in its White Paper, which may or may not turn out to be as envisioned, but it would not have been such a success without investors seeking remarkable returns. Embracing such technology, and promoting a shift in behaviour with such fiscal tools, can turn the industry itself towards achieving innovative solutions for environmental goals.

I realize Wolkow, et. al, are not focused on cryptocurrency and blockchain technology per se but as Huff notes in her reply, “… new lithographic techniques that we’ve developed, which could perform computations at significantly lower energy costs, would be huge for Proof of Work coins.”

Whether or not there are implications for cryptocurrencies, energy needs, climate change, etc., it’s the kind of innovative work being done by scientists at the University of Alberta which may have implications in fields far beyond the researchers’ original intentions such as more efficient computation and data storage.

ETA Aug. 6, 2018: Dexter Johnson weighed in with an August 3, 2018 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website),

Researchers at the University of Alberta in Canada have developed a new approach to rewritable data storage technology by using a scanning tunneling microscope (STM) to remove and replace hydrogen atoms from the surface of a silicon wafer. If this approach realizes its potential, it could lead to a data storage technology capable of storing 1,000 times more data than today’s hard drives, up to 138 terabytes per square inch.

As a bit of background, Gerd Binnig and Heinrich Rohrer developed the first STM in the early 1980s, for which they received the 1986 Nobel Prize in physics. In the over 30 years since an STM first imaged an atom by exploiting a phenomenon known as tunneling—which causes electrons to jump from the surface atoms of a material to the tip of an ultrasharp electrode suspended a few angstroms above—the technology has become the backbone of so-called nanotechnology.

In addition to imaging the world on the atomic scale for the last thirty years, STMs have been experimented with as a potential data storage device. Last year, we reported on how IBM (where Binnig and Rohrer first developed the STM) used an STM in combination with an iron atom to serve as an electron-spin resonance sensor to read the magnetic pole of holmium atoms. The north and south poles of the holmium atoms served as the 0 and 1 of digital logic.

The Canadian researchers have taken a somewhat different approach to making an STM into a data storage device by automating a known technique that uses the ultrasharp tip of the STM to apply a voltage pulse above an atom to remove individual hydrogen atoms from the surface of a silicon wafer. Once the atom has been removed, there is a vacancy on the surface. These vacancies can be patterned on the surface to create devices and memories.

If you have the time, I recommend reading Dexter’s posting as he provides clear explanations, additional insight into the work, and more historical detail.
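For anyone who would like to see the storage scheme in more concrete terms, here is a minimal, purely illustrative Python sketch (my own, not the researchers’ code): a hydrogen atom left in place stands for a 0 and a vacancy created by the STM tip’s voltage pulse stands for a 1, so a message becomes a pattern of present and absent atoms on the silicon surface,

# Toy illustration (not the researchers' code): encode a message as a grid of
# hydrogen sites on a silicon surface, where True means "hydrogen removed"
# (a vacancy created by an STM voltage pulse) and False means "hydrogen present".

def encode_as_vacancies(message: str, row_width: int = 16) -> list[list[bool]]:
    """Turn a text message into rows of hydrogen-vacancy 'bits'."""
    bits = []
    for byte in message.encode("ascii"):
        bits.extend(bool((byte >> i) & 1) for i in reversed(range(8)))
    bits.extend([False] * (-len(bits) % row_width))  # pad the final row
    return [bits[i:i + row_width] for i in range(0, len(bits), row_width)]

def decode_from_vacancies(grid: list[list[bool]]) -> str:
    """Read the vacancy pattern back into text (padding bytes come out as 0)."""
    bits = [b for row in grid for b in row]
    chars = []
    for i in range(0, len(bits) - 7, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | int(b)
        if byte:
            chars.append(chr(byte))
    return "".join(chars)

pattern = encode_as_vacancies("Si")
for row in pattern:
    print("".join("." if vacancy else "H" for vacancy in row))
print(decode_from_vacancies(pattern))

The real achievement, of course, is not the encoding (which is ordinary binary) but the automation that lets the STM tip create and repair those atomic-scale vacancies reliably enough to rewrite them.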

New breed of memristors?

This new ‘breed’ of memristor (a component in brain-like/neuromorphic computing) is a kind of thin film. First, here’s an explanation of neuromorphic computing from the Finnish researchers looking into a new kind of memristor, from a January 10, 2018 news item on Nanowerk,

The internet of things [IOT] is coming, that much we know. But still it won’t; not until we have components and chips that can handle the explosion of data that comes with IoT. In 2020, there will already be 50 billion industrial internet sensors in place all around us. A single autonomous device – a smart watch, a cleaning robot, or a driverless car – can produce gigabytes of data each day, whereas an airbus may have over 10 000 sensors in one wing alone.

Two hurdles need to be overcome. First, current transistors in computer chips must be miniaturized to the size of only few nanometres; the problem is they won’t work anymore then. Second, analysing and storing unprecedented amounts of data will require equally huge amounts of energy. Sayani Majumdar, Academy Fellow at Aalto University, along with her colleagues, is designing technology to tackle both issues.

Majumdar has with her colleagues designed and fabricated the basic building blocks of future components in what are called “neuromorphic” computers inspired by the human brain. It’s a field of research on which the largest ICT companies in the world and also the EU are investing heavily. Still, no one has yet come up with a nano-scale hardware architecture that could be scaled to industrial manufacture and use.

An Aalto University January 10, 2018 press release, which originated the news item, provides more detail about the work,

“The technology and design of neuromorphic computing is advancing more rapidly than its rival revolution, quantum computing. There is already wide speculation both in academia and company R&D about ways to inscribe heavy computing capabilities in the hardware of smart phones, tablets and laptops. The key is to achieve the extreme energy-efficiency of a biological brain and mimic the way neural networks process information through electric impulses,” explains Majumdar.

Basic components for computers that work like the brain

In their recent article in Advanced Functional Materials, Majumdar and her team show how they have fabricated a new breed of “ferroelectric tunnel junctions”, that is, few-nanometre-thick ferroelectric thin films sandwiched between two electrodes. They have abilities beyond existing technologies and bode well for energy-efficient and stable neuromorphic computing.

The junctions work in low voltages of less than five volts and with a variety of electrode materials – including silicon used in chips in most of our electronics. They also can retain data for more than 10 years without power and be manufactured in normal conditions.

Tunnel junctions have up to this point mostly been made of metal oxides and require 700 degree Celsius temperatures and high vacuums to manufacture. Ferroelectric materials also contain lead which makes them – and all our computers – a serious environmental hazard.

“Our junctions are made out of organic hydro-carbon materials and they would reduce the amount of toxic heavy metal waste in electronics. We can also make thousands of junctions a day in room temperature without them suffering from the water or oxygen in the air”, explains Majumdar.

What makes ferroelectric thin film components great for neuromorphic computers is their ability to switch between not only binary states – 0 and 1 – but a large number of intermediate states as well. This allows them to ‘memorise’ information not unlike the brain: to store it for a long time with minute amounts of energy and to retain the information they have once received – even after being switched off and on again.

We are no longer talking of transistors, but ‘memristors’. They are ideal for computation similar to that in biological brains.  Take for example the Mars 2020 Rover about to go chart the composition of another planet. For the Rover to work and process data on its own using only a single solar panel as an energy source, the unsupervised algorithms in it will need to use an artificial brain in the hardware.

“What we are striving for now, is to integrate millions of our tunnel junction memristors into a network on a one square centimetre area. We can expect to pack so many in such a small space because we have now achieved a record-high difference in the current between on and off-states in the junctions and that provides functional stability. The memristors could then perform complex tasks like image and pattern recognition and make decisions autonomously,” says Majumdar.
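To make the ‘more than binary’ point a little more concrete, here is a toy Python model, entirely my own illustration rather than the Aalto group’s device physics, of a memory element that can sit at many intermediate conductance levels between ‘off’ and ‘on’ and that keeps whichever level it has reached when no pulses arrive,

# Toy model (my own illustration, not the Aalto group's device physics):
# a memristor-like element whose conductance can sit at many intermediate
# levels between an "off" and an "on" state, and which keeps its state
# when no pulses arrive (non-volatility).

class ToyMemristor:
    def __init__(self, g_off: float = 1e-9, g_on: float = 1e-6, levels: int = 32):
        self.g_off = g_off      # conductance in the fully "off" state (siemens)
        self.g_on = g_on        # conductance in the fully "on" state (siemens)
        self.levels = levels    # number of distinguishable states
        self.state = 0          # start fully "off"

    def pulse(self, polarity: int) -> None:
        """A positive pulse nudges the state up one level, a negative pulse down."""
        self.state = min(max(self.state + polarity, 0), self.levels - 1)

    @property
    def conductance(self) -> float:
        """Interpolate between the off and on conductances according to the state."""
        return self.g_off + (self.state / (self.levels - 1)) * (self.g_on - self.g_off)

# 'Training' the element: repeated pulses gradually strengthen it, the way a
# synaptic weight is strengthened in a neuromorphic network.
m = ToyMemristor()
for _ in range(10):
    m.pulse(+1)
print(f"after 10 positive pulses: {m.conductance:.2e} S")
for _ in range(3):
    m.pulse(-1)
print(f"after 3 negative pulses:  {m.conductance:.2e} S")

The numbers and the linear interpolation are invented for the sake of the example; the point is only that an element with many stable, retained levels behaves much more like a synaptic weight than a transistor does.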

The probe-station device (the full instrument, left, and a closer view of the device connection, right) which measures the electrical responses of the basic components for computers mimicking the human brain. The tunnel junctions are on a thin film on the substrate plate. Photo: Tapio Reinekoski

Here’s a link to and a citation for the paper,

Electrode Dependence of Tunneling Electroresistance and Switching Stability in Organic Ferroelectric P(VDF-TrFE)-Based Tunnel Junctions by Sayani Majumdar, Binbin Chen, Qi Hang Qin, Himadri S. Majumdar, and Sebastiaan van Dijken. Advanced Functional Materials Vol. 28 Issue 2 DOI: 10.1002/adfm.201703273 Version of Record online: 27 NOV 2017

© 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.

Quantum computing and more at SXSW (South by Southwest) 2018

It’s that time of year again. The annual entertainment conference South by Southwest (SXSW) is being held from March 9-18, 2018. The science portion of the conference can be found in the Intelligent Future sessions, from the description,

AI and new technologies embody the realm of possibilities where intelligence empowers and enables technology while sparking legitimate concerns about its uses. Highlighted Intelligent Future sessions include New Mobility and the Future of Our Cities, Mental Work: Moving Beyond Our Carbon Based Minds, Can We Create Consciousness in a Machine?, and more.

Intelligent Future Track sessions are held March 9-15 at the Fairmont.

Last year I focused on the conference sessions on robots, Hiroshi Ishiguro’s work, and artificial intelligence in a March 27, 2017 posting. This year I’m featuring one of the conference’s quantum computing sessions, from a March 9, 2018 University of Texas at Austin news release (also on EurekAlert),

Imagine a new kind of computer that can quickly solve problems that would stump even the world’s most powerful supercomputers. Quantum computers are fundamentally different. They can store information as not only just ones and zeros, but in all the shades of gray in-between. Several companies and government agencies are investing billions of dollars in the field of quantum information. But what will quantum computers be used for?

South by Southwest 2018 hosts a panel on March 10th [2018] called Quantum Computing: Science Fiction to Science Fact. Experts on quantum computing make up the panel, including Jerry Chow of IBM; Bo Ewald of D-Wave Systems; Andrew Fursman of 1QBit; and Antia Lamas-Linares of the Texas Advanced Computing Center at UT Austin.

Antia Lamas-Linares is a Research Associate in the High Performance Computing group at TACC. Her background is as an experimentalist with quantum computing systems, including work done with them at the Centre for Quantum Technologies in Singapore. She joins podcast host Jorge Salazar to talk about her South by Southwest panel and about some of her latest research on quantum information.

Lamas-Linares co-authored a study (doi: 10.1117/12.2290561) in the Proceedings of the SPIE, The International Society for Optical Engineering, that published in February of 2018. The study, “Secure Quantum Clock Synchronization,” proposed a protocol to verify and secure time synchronization of distant atomic clocks, such as those used for GPS signals in cell phone towers and other places. “It’s important work,” explained Lamas-Linares, “because people are worried about malicious parties messing with the channels of GPS. What James Troupe (Applied Research Laboratories, UT Austin) and I looked at was whether we can use techniques from quantum cryptography and quantum information to make something that is inherently unspoofable.”

Antia Lamas-Linares: The most important thing is that quantum technologies is a really exciting field. And it’s exciting in a fundamental sense. We don’t quite know what we’re going to get out of it. We know a few things, and that’s good enough to drive research. But the things we don’t know are much broader than the things we know, and it’s going to be really interesting. Keep your eyes open for this.
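As an aside, for anyone wondering what ‘all the shades of gray in-between’ means in practice, here is a minimal Python/numpy illustration (mine, not the news release’s): a single qubit is described by two amplitudes that can be tuned continuously, and the familiar 0s and 1s only appear, with probabilities set by those amplitudes, when the qubit is measured,

# A minimal illustration (mine, not from the news release) of the "shades of
# gray" idea: a single qubit is described by two amplitudes, and the
# probabilities of reading 0 or 1 come from their squared magnitudes.
import numpy as np

theta = np.pi / 3                       # pick any angle: a "shade of gray"
state = np.array([np.cos(theta / 2),    # amplitude of |0>
                  np.sin(theta / 2)])   # amplitude of |1>

probabilities = np.abs(state) ** 2
print("P(0) =", round(probabilities[0], 3), " P(1) =", round(probabilities[1], 3))

# Simulate repeated measurements: each shot gives a definite 0 or 1, but the
# statistics reproduce the continuously adjustable amplitudes.
samples = np.random.choice([0, 1], size=10_000, p=probabilities)
print("fraction of 1s over 10,000 shots:", samples.mean())

None of this captures what makes quantum computers powerful (that takes many qubits, entanglement, and interference), but it does show why ‘shades of gray’ is a fair, if loose, description of a single qubit’s state.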

Quantum Computing: Science Fiction to Science Fact, March 10, 2018 | 11:00AM – 12:00PM, Fairmont Manchester EFG, SXSW 2018, Austin, TX.

If you look up the session, you will find,

Quantum Computing: Science Fiction to Science Fact

Speakers

Bo Ewald

D-Wave Systems

Antia Lamas-Linares

Texas Advanced Computing Center at University of Texas

Startups and established players have sold 2000 Qubit systems, made freely available cloud access to quantum computer processors, and created large scale open source initiatives, all taking quantum computing from science fiction to science fact. Government labs and others like IBM, Microsoft, Google are developing software for quantum computers. What problems will be solved with this quantum leap in computing power that cannot be solved today with the world’s most powerful supercomputers?

[Programming descriptions are generated by participants and do not necessarily reflect the opinions of SXSW.]

Primary Entry: Platinum Badge, Interactive Badge

Secondary Entry: Music Badge, Film Badge

Format: Panel

Event Type: Session

Track: Intelligent Future

Level: Intermediate

I wonder what ‘level’ means? I was not able to find an answer (quickly).

It was a bit surprising to find someone from D-Wave Systems (a Vancouver-based quantum computing enterprise) at an entertainment conference. Still, it shouldn’t have been. Two other examples immediately come to mind: the TED (technology, entertainment, and design) conferences have been melding technology, if not science, with creative activities of all kinds for many years (TED 2018: The Age of Amazement, April 10 – 14, 2018 in Vancouver [Canada]), and Beakerhead (2018 dates: Sept. 19 – 23) has been melding art, science, and engineering in a festival held in Calgary (Canada) since 2013. One comment about TED: it was held for several years in California (1984, 1990 – 2013) before moving to Vancouver in 2014.

For anyone wanting to browse the 2018 SXSW Intelligent Future sessions online, go here. For anyone wanting to hear Antia Lamas-Linares talk about quantum computing, there’s the interview with Jorge Salazar (mentioned in the news release),

Alberta adds a newish quantum nanotechnology research hub to Canada’s quantum computing research scene

One of the winners in Canada’s 2017 federal budget announcement of the Pan-Canadian Artificial Intelligence Strategy was Edmonton, Alberta. It’s a fact which sometimes goes unnoticed while Canadians marvel at the wonderfulness found in Toronto and Montréal where it seems new initiatives and monies are being announced on a weekly basis (I exaggerate) for their AI (artificial intelligence) efforts.

Alberta’s quantum nanotechnology hub (graduate programme)

Intriguingly, it seems that Edmonton has higher aims than (an almost unnoticed) leadership in AI. Physicists at the University of Alberta have announced hopes to be just as successful as their AI brethren in a Nov. 27, 2017 article by Juris Graney for the Edmonton Journal,

Physicists at the University of Alberta [U of A] are hoping to emulate the success of their artificial intelligence studying counterparts in establishing the city and the province as the nucleus of quantum nanotechnology research in Canada and North America.

Google’s artificial intelligence research division DeepMind announced in July [2017] it had chosen Edmonton as its first international AI research lab, based on a long-running partnership with the U of A’s 10-person AI lab.

Retaining the brightest minds in the AI and machine-learning fields while enticing a global tech leader to Alberta was heralded as a coup for the province and the university.

It is something U of A physics professor John Davis believes the university’s new graduate program, Quanta, can help achieve in the world of quantum nanotechnology.

The field of quantum mechanics had long been a realm of theoretical science based on the theory that atomic and subatomic material like photons or electrons behave both as particles and waves.

“When you get right down to it, everything has both behaviours (particle and wave) and we can pick and choose certain scenarios which one of those properties we want to use,” he said.

But, Davis said, physicists and scientists are “now at the point where we understand quantum physics and are developing quantum technology to take to the marketplace.”

“Quantum computing used to be realm of science fiction, but now we’ve figured it out, it’s now a matter of engineering,” he said.

Quantum computing labs are being bought by large tech companies such as Google, IBM and Microsoft because they realize they are only a few years away from having this power, he said.

Those making the groundbreaking developments may want to commercialize their finds and take the technology to market and that is where Quanta comes in.

East vs. West—Again?

Ivan Semeniuk in his article, Quantum Supremacy, ignores any quantum research effort not located in either Waterloo, Ontario or metro Vancouver, British Columbia to describe a struggle between the East and the West (a standard Canadian trope). From Semeniuk’s Oct. 17, 2017 quantum article [link follows the excerpts] for the Globe and Mail’s October 2017 issue of the Report on Business (ROB),

 Lazaridis [Mike], of course, has experienced lost advantage first-hand. As co-founder and former co-CEO of Research in Motion (RIM, now called Blackberry), he made the smartphone an indispensable feature of the modern world, only to watch rivals such as Apple and Samsung wrest away Blackberry’s dominance. Now, at 56, he is engaged in a high-stakes race that will determine who will lead the next technology revolution. In the rolling heartland of southwestern Ontario, he is laying the foundation for what he envisions as a new Silicon Valley—a commercial hub based on the promise of quantum technology.

Semeniuk skips over the story of how Blackberry lost its advantage. I came onto that story late in the game when Blackberry was already in serious trouble due to a failure to recognize that the field it helped to create was moving in a new direction. If memory serves, the company was trying to keep its technology wholly proprietary, which meant that developers couldn’t easily create apps to extend the phone’s features. Blackberry also fought a legal battle in the US with a patent troll, draining company resources and energy in what proved to be a futile effort.

Since then Lazaridis has invested heavily in quantum research. He gave the University of Waterloo a serious chunk of money as they named their Quantum Nano Centre (QNC) after him and his wife, Ophelia (you can read all about it in my Sept. 25, 2012 posting about the then new centre). The best details for Lazaridis’ investments in Canada’s quantum technology are to be found on the Quantum Valley Investments, About QVI, History webpage,

History has repeatedly demonstrated the power of research in physics to transform society.  As a student of history and a believer in the power of physics, Mike Lazaridis set out in 2000 to make real his bold vision to establish the Region of Waterloo as a world leading centre for physics research.  That is, a place where the best researchers in the world would come to do cutting-edge research and to collaborate with each other and in so doing, achieve transformative discoveries that would lead to the commercialization of breakthrough technologies.

Establishing a World Class Centre in Quantum Research:

The first step in this regard was the establishment of the Perimeter Institute for Theoretical Physics.  Perimeter was established in 2000 as an independent theoretical physics research institute.  Mike started Perimeter with an initial pledge of $100 million (which at the time was approximately one third of his net worth).  Since that time, Mike and his family have donated a total of more than $170 million to the Perimeter Institute.  In addition to this unprecedented monetary support, Mike also devotes his time and influence to help lead and support the organization in everything from the raising of funds with government and private donors to helping to attract the top researchers from around the globe to it.  Mike’s efforts helped Perimeter achieve and grow its position as one of a handful of leading centres globally for theoretical research in fundamental physics.

Perimeter is located in a Governor-General award winning designed building in Waterloo.  Success in recruiting and resulting space requirements led to an expansion of the Perimeter facility.  A uniquely designed addition, which has been described as space-ship-like, was opened in 2011 as the Stephen Hawking Centre in recognition of one of the most famous physicists alive today who holds the position of Distinguished Visiting Research Chair at Perimeter and is a strong friend and supporter of the organization.

Recognizing the need for collaboration between theorists and experimentalists, in 2002, Mike applied his passion and his financial resources toward the establishment of The Institute for Quantum Computing at the University of Waterloo.  IQC was established as an experimental research institute focusing on quantum information.  Mike established IQC with an initial donation of $33.3 million.  Since that time, Mike and his family have donated a total of more than $120 million to the University of Waterloo for IQC and other related science initiatives.  As in the case of the Perimeter Institute, Mike devotes considerable time and influence to help lead and support IQC in fundraising and recruiting efforts.  Mike’s efforts have helped IQC become one of the top experimental physics research institutes in the world.

Mike and Doug Fregin have been close friends since grade 5.  They are also co-founders of BlackBerry (formerly Research In Motion Limited).  Doug shares Mike’s passion for physics and supported Mike’s efforts at the Perimeter Institute with an initial gift of $10 million.  Since that time Doug has donated a total of $30 million to Perimeter Institute.  Separately, Doug helped establish the Waterloo Institute for Nanotechnology at the University of Waterloo with total gifts of $29 million.  As suggested by its name, WIN is devoted to research in the area of nanotechnology.  It has established as an area of primary focus the intersection of nanotechnology and quantum physics.

With a donation of $50 million from Mike which was matched by both the Government of Canada and the province of Ontario as well as a donation of $10 million from Doug, the University of Waterloo built the Mike & Ophelia Lazaridis Quantum-Nano Centre, a state of the art laboratory located on the main campus of the University of Waterloo that rivals the best facilities in the world.  QNC was opened in September 2012 and houses researchers from both IQC and WIN.

Leading the Establishment of Commercialization Culture for Quantum Technologies in Canada:

For many years, theorists have been able to demonstrate the transformative powers of quantum mechanics on paper.  That said, converting these theories to experimentally demonstrable discoveries has, putting it mildly, been a challenge.  Many naysayers have suggested that achieving these discoveries was not possible and even the believers suggested that it could likely take decades to achieve these discoveries.  Recently, a buzz has been developing globally as experimentalists have been able to achieve demonstrable success with respect to Quantum Information based discoveries.  Local experimentalists are very much playing a leading role in this regard.  It is believed by many that breakthrough discoveries that will lead to commercialization opportunities may be achieved in the next few years and certainly within the next decade.

Recognizing the unique challenges for the commercialization of quantum technologies (including risk associated with uncertainty of success, complexity of the underlying science and high capital / equipment costs) Mike and Doug have chosen to once again lead by example.  The Quantum Valley Investment Fund will provide commercialization funding, expertise and support for researchers that develop breakthroughs in Quantum Information Science that can reasonably lead to new commercializable technologies and applications.  Their goal in establishing this Fund is to lead in the development of a commercialization infrastructure and culture for Quantum discoveries in Canada and thereby enable such discoveries to remain here.

Semeniuk goes on to set the stage for Waterloo/Lazaridis vs. Vancouver (from Semeniuk’s 2017 ROB article),

… as happened with Blackberry, the world is once again catching up. While Canada’s funding of quantum technology ranks among the top five in the world, the European Union, China, and the US are all accelerating their investments in the field. Tech giants such as Google [also known as Alphabet], Microsoft and IBM are ramping up programs to develop companies and other technologies based on quantum principles. Meanwhile, even as Lazaridis works to establish Waterloo as the country’s quantum hub, a Vancouver-area company has emerged to challenge that claim. The two camps—one methodically focused on the long game, the other keen to stake an early commercial lead—have sparked an East-West rivalry that many observers of the Canadian quantum scene are at a loss to explain.

Is it possible that some of the rivalry might be due to an influential individual who has invested heavily in a ‘quantum valley’ and has a history of trying to ‘own’ a technology?

Getting back to D-Wave Systems, the Vancouver company, I have written about them a number of times (particularly in 2015; for the full list: input D-Wave into the blog search engine). This June 26, 2015 posting includes a reference to an article in The Economist magazine about D-Wave’s commercial opportunities while the bulk of the posting is focused on a technical breakthrough.

Semeniuk offers an overview of the D-Wave Systems story,

D-Wave was born in 1999, the same year Lazaridis began to fund quantum science in Waterloo. From the start, D-Wave had a more immediate goal: to develop a new computer technology to bring to market. “We didn’t have money or facilities,” says Geordie Rose, a physics PhD who co-founded the company and served in various executive roles. …

The group soon concluded that the kind of machine most scientists were pursuing based on so-called gate-model architecture was decades away from being realized—if ever. …

Instead, D-Wave pursued another idea, based on a principle dubbed “quantum annealing.” This approach seemed more likely to produce a working system, even if the application that would run on it were more limited. “The only thing we cared about was building the machine,” says Rose. “Nobody else was trying to solve the same problem.”

D-Wave debuted its first prototype at an event in California in February 2007, running it through a few basic problems such as solving a Sudoku puzzle and finding the optimal seating plan for a wedding reception. … “They just assumed we were hucksters,” says Hilton [Jeremy Hilton, D-Wave senior vice-president of systems]. Federico Spedalieri, a computer scientist at the University of Southern California’s [USC] Information Sciences Institute who has worked with D-Wave’s system, says the limited information the company provided about the machine’s operation provoked outright hostility. “I think that played against them a lot in the following years,” he says.

It seems Lazaridis is not the only one who likes to hold company information tightly.
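For readers wondering what a quantum annealer is actually given to chew on: problems are typically phrased as minimizing a quadratic function of binary variables (a QUBO). Here is a tiny classical Python sketch of that formulation, with made-up numbers and a brute-force search standing in for the quantum hardware, using a miniature version of the wedding-seating example above,

# A tiny classical sketch (mine, not D-Wave code) of the kind of problem a
# quantum annealer is given: minimize a quadratic function of binary variables
# (a QUBO). The toy problem is seating feuding wedding guests apart.
from itertools import product

# Q[(i, j)] holds the coefficient for variables i and j both being 1; the
# diagonal entries are per-variable terms. Variables x0..x3 mean "guest g sits
# at table A"; off-diagonal penalties discourage rivals sharing the table.
# All numbers are invented for illustration.
Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0, (3, 3): -1.0,  # everyone would like table A
    (0, 1): 3.0,                                             # guests 0 and 1 are feuding
    (2, 3): 2.0,                                             # guests 2 and 3 mildly clash
}

def energy(x):
    """QUBO energy: the sum of Q[i, j] * x[i] * x[j]."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

# Brute force over all 2^4 assignments; an annealer explores the same energy
# landscape physically instead of enumerating it.
best = min(product([0, 1], repeat=4), key=energy)
print("a lowest-energy assignment:", best, "with energy", energy(best))

The programming effort on an annealer goes almost entirely into choosing those coefficients so that the lowest-energy assignment corresponds to the answer you want; the machine’s job is to find that low-energy state far faster than enumeration would allow.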

Back to Semeniuk and D-Wave,

Today [October 2017], the Los Alamos National Laboratory owns a D-Wave machine, which costs about $15 million. Others pay to access D-Wave systems remotely. This year, for example, Volkswagen fed data from thousands of Beijing taxis into a machine located in Burnaby [one of the municipalities that make up metro Vancouver] to study ways to optimize traffic flow.

But the application for which D-Wave has the highest hopes is artificial intelligence. Any AI program hinges on the “training” through which a computer acquires automated competence, and the 2000Q [a D-Wave computer] appears well suited to this task. …

Yet, for all the buzz D-Wave has generated, with several research teams outside Canada investigating its quantum annealing approach, the company has elicited little interest from the Waterloo hub. As a result, what might seem like a natural development—the Institute for Quantum Computing acquiring access to a D-Wave machine to explore and potentially improve its value—has not occurred. …

I am particularly interested in this comment as it concerns public funding (from Semeniuk’s article),

Vern Brownell, a former Goldman Sachs executive who became CEO of D-Wave in 2009, calls the lack of collaboration with Waterloo’s research community “ridiculous,” adding that his company’s efforts to establish closer ties have proven futile: “I’ll be blunt: I don’t think our relationship is good enough,” he says. Brownell also points out that, while hundreds of millions in public funds have flowed into Waterloo’s ecosystem, little funding is available for Canadian scientists wishing to make the most of D-Wave’s hardware—despite the fact that it remains unclear which core quantum technology will prove the most profitable.

There’s a lot more to Semeniuk’s article but this is the last excerpt,

The world isn’t waiting for Canada’s quantum rivals to forge a united front. Google, Microsoft, IBM, and Intel are racing to develop a gate-model quantum computer—the sector’s ultimate goal. (Google’s researchers have said they will unveil a significant development early next year.) With the U.K., Australia and Japan pouring money into quantum, Canada, an early leader, is under pressure to keep up. The federal government is currently developing a strategy for supporting the country’s evolving quantum sector and, ultimately, getting a return on its approximately $1-billion investment over the past decade [emphasis mine].

I wonder where the “approximately $1-billion … ” figure came from. I ask because some years ago MP Peter Julian asked the government for information about how much Canadian federal money had been invested in nanotechnology. The government replied with sheets of paper (a pile approximately 2 inches high) that had funding disbursements from various ministries. Each ministry had its own method with different categories for listing disbursements and the titles for the research projects were not necessarily informative for anyone outside a narrow specialty. (Peter Julian’s assistant had kindly sent me a copy of the response they had received.) The bottom line is that it would have been close to impossible to determine the amount of federal funding devoted to nanotechnology using that data. So, where did the $1-billion figure come from?

In any event, it will be interesting to see how the Council of Canadian Academies assesses the ‘quantum’ situation in its more academically inclined, “The State of Science and Technology and Industrial Research and Development in Canada,” when it’s released later this year (2018).

Finally, you can find Semeniuk’s October 2017 article here but be aware it’s behind a paywall.

Whither we goest?

Despite any doubts one might have about Lazaridis’ approach to research and technology, his tremendous investment and support cannot be denied. Without him, Canada’s quantum research efforts would be substantially less significant. As for the ‘cowboys’ in Vancouver, it takes a certain temperament to found a start-up company and it seems the D-Wave folks have more in common with Lazaridis than they might like to admit. As for the Quanta graduate programme, it’s early days yet and no one should ever count out Alberta.

Meanwhile, one can continue to hope that a more thoughtful approach to regional collaboration will be adopted so Canada can continue to blaze trails in the field of quantum research.