Tag Archives: University of California at Santa Barbara

Quantum supremacy

Quantum supremacy refers to an engineering milestone, and an October 23, 2019 news item on ScienceDaily announces that the milestone has been reached,

Researchers in UC [University of California] Santa Barbara/Google scientist John Martinis’ group have made good on their claim to quantum supremacy. Using 53 entangled quantum bits (“qubits”), their Sycamore computer has taken on — and solved — a problem considered intractable for classical computers.

An October 23, 2019 UC Santa Barbara news release (also on EurekAlert) by Sonia Fernandez, which originated the news item, delves further into the work,

“A computation that would take 10,000 years on a classical supercomputer took 200 seconds on our quantum computer,” said Brooks Foxen, a graduate student researcher in the Martinis Group. “It is likely that the classical simulation time, currently estimated at 10,000 years, will be reduced by improved classical hardware and algorithms, but, since we are currently 1.5 trillion times faster, we feel comfortable laying claim to this achievement.”

The feat is outlined in a paper in the journal Nature.

The milestone comes after roughly two decades of quantum computing research conducted by Martinis and his group, from the development of a single superconducting qubit to systems with architectures of 72 and, with Sycamore, 54 qubits (one didn’t perform) that take advantage of both the awe-inspiring and bizarre properties of quantum mechanics.

“The algorithm was chosen to emphasize the strengths of the quantum computer by leveraging the natural dynamics of the device,” said Ben Chiaro, another graduate student researcher in the Martinis Group. That is, the researchers wanted to test the computer’s ability to hold and rapidly manipulate a vast amount of complex, unstructured data.

“We basically wanted to produce an entangled state involving all of our qubits as quickly as we can,” Foxen said, “and so we settled on a sequence of operations that produced a complicated superposition state that, when measured, returns a bitstring with a probability determined by the specific sequence of operations used to prepare that particular superposition.” The exercise, which was to verify that the circuit’s output corresponds to the sequence used to prepare the state, sampled the quantum circuit a million times in just a few minutes, exploring all possibilities before the system could lose its quantum coherence.

‘A complex superposition state’

“We performed a fixed set of operations that entangles 53 qubits into a complex superposition state,” Chiaro explained. “This superposition state encodes the probability distribution. For the quantum computer, preparing this superposition state is accomplished by applying a sequence of tens of control pulses to each qubit in a matter of microseconds. We can prepare and then sample from this distribution by measuring the qubits a million times in 200 seconds.”

“For classical computers, it is much more difficult to compute the outcome of these operations because it requires computing the probability of being in any one of the 2^53 possible states, where the 53 comes from the number of qubits — the exponential scaling is why people are interested in quantum computing to begin with,” Foxen said. “This is done by matrix multiplication, which is expensive for classical computers as the matrices become large.”
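Foxen's point about exponential scaling can be made concrete with a small back-of-the-envelope sketch (my own illustration, not anything from the paper): a dense classical simulation stores one complex amplitude per basis state, so the memory needed doubles with every added qubit.

```python
# Back-of-the-envelope sketch (my own illustration, not the paper's code):
# a dense classical simulation stores one complex amplitude per basis
# state, so memory doubles with every qubit added.

def state_vector_bytes(n_qubits: int) -> int:
    """Memory needed for a dense state vector: 2**n complex128 amplitudes."""
    return (2 ** n_qubits) * 16  # complex128 = 16 bytes per amplitude

for n in (20, 30, 53):
    print(n, "qubits:", state_vector_bytes(n) / 2**30, "GiB")
```

At 53 qubits the state vector alone needs 2^57 bytes, about 128 pebibytes, before any of the expensive matrix multiplication even begins.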

According to the new paper, the researchers used a method called cross-entropy benchmarking to compare the quantum circuit’s output (a “bitstring”) to its “corresponding ideal probability computed via simulation on a classical computer” to ascertain that the quantum computer was working correctly.
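For readers curious what cross-entropy benchmarking looks like in practice, here is a toy sketch (my own hedged illustration; the stand-in distribution, sizes, and names below are assumptions, not the paper's actual data or code): the linear XEB fidelity rewards a sampler whose bitstrings land on the ideal distribution's high-probability outputs.

```python
# Toy sketch of linear cross-entropy benchmarking (XEB) -- illustrative
# only, not the Google/UCSB code. A random "speckled" distribution
# stands in for the output of a random quantum circuit.
import random

random.seed(0)
n = 10                 # toy size; the real experiment used 53 qubits
dim = 2 ** n

# Speckled output probabilities, roughly like a random circuit's.
weights = [random.expovariate(1.0) for _ in range(dim)]
total = sum(weights)
p_ideal = [w / total for w in weights]

def xeb_fidelity(samples, p_ideal, dim):
    """Linear XEB: F = dim * mean(p_ideal[x]) - 1 over sampled bitstrings."""
    return dim * sum(p_ideal[x] for x in samples) / len(samples) - 1

# A perfect sampler draws from p_ideal itself -> F near 1.
good = random.choices(range(dim), weights=p_ideal, k=20000)
# A broken (fully depolarized) sampler draws uniformly -> F near 0.
bad = [random.randrange(dim) for _ in range(20000)]

print(xeb_fidelity(good, p_ideal, dim))  # close to 1
print(xeb_fidelity(bad, p_ideal, dim))   # close to 0
```

The fidelity interpolates between 1 (sampling the ideal distribution) and 0 (pure noise), which is how the team could certify the processor was doing something a uniform random-number generator cannot.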

“We made a lot of design choices in the development of our processor that are really advantageous,” said Chiaro. Among these advantages, he said, are the ability to experimentally tune the parameters of the individual qubits as well as their interactions.

While the experiment was chosen as a proof-of-concept for the computer, the research has resulted in a very real and valuable tool: a certified random number generator. Useful in a variety of fields, random numbers can ensure that encrypted keys can’t be guessed, or that a sample from a larger population is truly representative, leading to optimal solutions for complex problems and more robust machine learning applications. The speed with which the quantum circuit can produce its randomized bit string is so great that there is no time to analyze and “cheat” the system.

“Quantum mechanical states do things that go beyond our day-to-day experience and so have the potential to provide capabilities and applications that would otherwise be unattainable,” commented Joe Incandela, UC Santa Barbara’s vice chancellor for research. “The team has demonstrated the ability to reliably create and repeatedly sample complicated quantum states involving 53 entangled elements to carry out an exercise that would take millennia to do with a classical supercomputer. This is a major accomplishment. We are at the threshold of a new era of knowledge acquisition.”

Looking ahead

With an achievement like “quantum supremacy,” it’s tempting to think that the UC Santa Barbara/Google researchers will plant their flag and rest easy. But for Foxen, Chiaro, Martinis and the rest of the UCSB/Google AI Quantum group, this is just the beginning.

“It’s kind of a continuous improvement mindset,” Foxen said. “There are always projects in the works.” In the near term, further improvements to these “noisy” qubits may enable the simulation of interesting phenomena in quantum mechanics, such as thermalization, or the vast amount of possibility in the realms of materials and chemistry.

In the long term, however, the scientists are always looking to improve coherence times or, at the other end, to detect and fix errors, which would take many additional qubits per qubit being checked. These efforts have run in parallel with the design and build of the quantum computer itself, and ensure that the researchers have a lot of work to do before hitting their next milestone.

“It’s been an honor and a pleasure to be associated with this team,” Chiaro said. “It’s a great collection of strong technical contributors with great leadership and the whole team really synergizes well.”

Here’s a link to and a citation for the paper,

Quantum supremacy using a programmable superconducting processor by Frank Arute, Kunal Arya, Ryan Babbush, Dave Bacon, Joseph C. Bardin, Rami Barends, Rupak Biswas, Sergio Boixo, Fernando G. S. L. Brandao, David A. Buell, Brian Burkett, Yu Chen, Zijun Chen, Ben Chiaro, Roberto Collins, William Courtney, Andrew Dunsworth, Edward Farhi, Brooks Foxen, Austin Fowler, Craig Gidney, Marissa Giustina, Rob Graff, Keith Guerin, Steve Habegger, Matthew P. Harrigan, Michael J. Hartmann, Alan Ho, Markus Hoffmann, Trent Huang, Travis S. Humble, Sergei V. Isakov, Evan Jeffrey, Zhang Jiang, Dvir Kafri, Kostyantyn Kechedzhi, Julian Kelly, Paul V. Klimov, Sergey Knysh, Alexander Korotkov, Fedor Kostritsa, David Landhuis, Mike Lindmark, Erik Lucero, Dmitry Lyakh, Salvatore Mandrà, Jarrod R. McClean, Matthew McEwen, Anthony Megrant, Xiao Mi, Kristel Michielsen, Masoud Mohseni, Josh Mutus, Ofer Naaman, Matthew Neeley, Charles Neill, Murphy Yuezhen Niu, Eric Ostby, Andre Petukhov, John C. Platt, Chris Quintana, Eleanor G. Rieffel, Pedram Roushan, Nicholas C. Rubin, Daniel Sank, Kevin J. Satzinger, Vadim Smelyanskiy, Kevin J. Sung, Matthew D. Trevithick, Amit Vainsencher, Benjamin Villalonga, Theodore White, Z. Jamie Yao, Ping Yeh, Adam Zalcman, Hartmut Neven & John M. Martinis. Nature volume 574, pages 505–510 (2019) DOI: https://doi.org/10.1038/s41586-019-1666-5 Issue Date: 24 October 2019

This paper appears to be open access.

Cosmetics breakthrough for Ulsan National Institute of Science and Technology (UNIST)?

Cosmetics would not have been my first thought on reading the title of the paper (“Rates of cavity filling by liquids”) produced by scientists from Ulsan National Institute of Science and Technology (UNIST).

A September 17, 2018 news item on Nanowerk announces the research,

A research team affiliated with Ulsan National Institute of Science and Technology (UNIST) has examined the rates of liquid penetration on rough or patterned surfaces, especially those with pores or cavities. Their findings provide important insights into the development of everyday products, including cosmetics and paints, as well as industrial applications like enhanced oil recovery.

This study was jointly led by Professor Dong Woog Lee and his research team in the School of Energy and Chemical Engineering at UNIST and a research team at the University of California, Santa Barbara. Published online in the July 19th issue of the Proceedings of the National Academy of Sciences (“Rates of cavity filling by liquids”), the study identifies five variables that control the cavity-filling (wetting transition) rates required for liquids to penetrate into the cavities.

A July 26, 2018 UNIST press release (also on EurekAlert but published on September 17, 2018), which originated the news item, delves further into the work,

In the study, Professor Lee’s team fabricated silicon wafers with cylindrical cavities of different geometries. After immersing them in bulk water, they observed the details of, and the rates associated with, water penetration into the cavities from the bulk, using bright-field and confocal fluorescence microscopy. Cylindrical cavities are like skin pores, with a narrow entrance and a spacious interior. Cavity filling generally progresses when bulk water spreads above a hydrophilic, reentrant cavity. As described in “Wetting Transition from the Cassie–Baxter State to Wenzel State,” the liquid droplet that sits on top of the textured surface with trapped air underneath will be completely absorbed by the rough surface cavities.

Their findings revealed that the cavity-filling rates are affected by the following variables: (i) the intrinsic contact angle, (ii) the concentration of dissolved air in the bulk water phase, (iii) the liquid volatility that determines the rate of capillary condensation inside the cavities, (iv) the types of surfactants, and (v) the cavity geometry.

“Our results can be used in the manufacture of special-purpose cosmetic products,” says Professor Lee. “For instance, pore minimizing face primers and facial cleansers that remove sebum need to reduce the amount of dissolved air, so that they can penetrate into the pores quickly.”

On the other hand, beauty products like sunscreens should be designed to protect the skin from harmful sun exposure while preventing pores from clogging. Clogged pores hinder the skin’s exchange of gases and can cause further irritation, pimples, and blemishes. In this case, it is better to reduce volatility and increase the amount of dissolved air in the cosmetic products, as opposed to facial cleansers.

“This knowledge of how cavities under bulk water are filled and what variables control the rate of filling can provide insights into the engineering of temporarily or permanently superhydrophobic surfaces, and the designing and manufacturing of various products that are applied to rough, textured, or patterned surfaces,” says Professor Lee. “Many of the fundamental insights gained can also be applied to other liquids (e.g., oils), contact angles, and cavities or pores of different dimensions or geometries.”

This study has been supported by the National Research Foundation of Korea (NRF) grant, funded by the Ministry of Science and ICT.

Here’s a link to and a citation for the paper,

Rates of cavity filling by liquids by Dongjin Seo, Alex M. Schrader, Szu-Ying Chen, Yair Kaufman, Thomas R. Cristiani, Steven H. Page, Peter H. Koenig, Yonas Gizaw, Dong Woog Lee, and Jacob N. Israelachvili. PNAS August 7, 2018 115 (32) 8070-8075 https://doi.org/10.1073/pnas.1804437115 Published ahead of print July 19, 2018

This paper is behind a paywall.

Iridescent giant clams could point the way to safety, climatologically speaking

Giant clams in Palau (Cynthia Barnett)

These don’t look like any clams I’ve ever seen but that is the point of Cynthia Barnett’s absorbing Sept. 10, 2018 article for The Atlantic (Note: A link has been removed),

Snorkeling amid the tree-tangled rock islands of Ngermid Bay in the western Pacific nation of Palau, Alison Sweeney lingers at a plunging coral ledge, photographing every giant clam she sees along a 50-meter transect. In Palau, as in few other places in the world, this means she is going to be underwater for a skin-wrinkling long time.

At least the clams are making it easy for Sweeney, a biophysicist at the University of Pennsylvania. The animals plump from their shells like painted lips, shimmering in blues, purples, greens, golds, and even electric browns. The largest are a foot across and radiate from the sea floor, but most are the smallest of the giant clams, five-inch Tridacna crocea, living higher up on the reef. Their fleshy Technicolor smiles beam in all directions from the corals and rocks of Ngermid Bay.

… Some of the corals are bleached from the conditions in Ngermid Bay, where naturally high temperatures and acidity mirror the expected effects of climate change on the global oceans. (Ngermid Bay is more commonly known as “Nikko Bay,” but traditional leaders and government officials are working to revive the indigenous name of Ngermid.)

Even those clams living on bleached corals are pulsing color, like wildflowers in a white-hot desert. Sweeney’s ponytail flows out behind her as she nears them with her camera. They startle back into their fluted shells. Like bashful fairytale creatures cursed with irresistible beauty, they cannot help but draw attention with their sparkly glow.

Barnett makes them seem magical and perhaps they are (Note: A link has been removed),

It’s the glow that drew Sweeney’s attention to giant clams, and to Palau, a tiny republic of more than 300 islands between the Philippines and Guam. Its sun-laden waters are home to seven of the world’s dozen giant-clam species, from the storied Tridacna gigas—which can weigh an estimated 550 pounds and measure over four feet across—to the elegantly fluted Tridacna squamosa. Sweeney first came to the archipelago in 2009, while working on animal iridescence as a post-doctoral fellow at the University of California at Santa Barbara. Whether shimmering from a blue morpho butterfly’s wings or a squid’s skin, iridescence is almost always associated with a visual signal—one used to attract mates or confuse predators. Giant clams’ luminosity is not such a signal. So, what is it?

In the years since, Sweeney and her colleagues have discovered that the clams’ iridescence is essentially the outer glow of a solar transformer—optimized over millions of years to run on sunlight and algal biofuel. Giant clams reach their cartoonish proportions thanks to an exceptional ability to grow their own photosynthetic algae in vertical farms spread throughout their flesh. Sweeney and other scientists think this evolved expertise may shed light on alternative fuel technologies and other industrial solutions for a warming world.

Barnett goes on to describe Palau’s relationship to the clams and the clams’ environment,

Palau’s islands have been inhabited for at least 3,400 years, and from the start, giant clams were a staple of diet, daily life, and even deity. Many of the islands’ oldest-surviving tools are crafted of thick giant-clam shell: arched-blade adzes, fishhooks, gougers, heavy taro-root pounders. Giant-clam shell makes up more than three-fourths of some of the oldest shell middens in Palau, a percentage that decreases through the centuries. Archaeologists suggest that the earliest islanders depleted the giant clams that crowded the crystalline shallows, then may have self-corrected. Ancient Palauan conservation law, known as bul, prohibited fishing during critical spawning periods, or when a species showed signs of over-harvesting.

Before the Christianity that now dominates Palauan religion sailed in on eighteenth-century mission ships, the culture’s creation lore began with a giant clam called to life in an empty sea. The clam grew bigger and bigger until it sired Latmikaik, the mother of human children, who birthed them with the help of storms and ocean currents.

The legend evokes giant clams in their larval phase, moving with the currents for their first two weeks of life. Before they can settle, the swimming larvae must find and ingest one or two photosynthetic alga, which later multiply, becoming self-replicating fuel cells. After the larvae down the alga and develop a wee shell and a foot, they kick around like undersea farmers, looking for a sunny spot for their crop. When they’ve chosen a well-lit home in a shallow lagoon or reef, they affix to the rock, their shell gaping to the sky. After the sun hits and photosynthesis begins, the microalgae will multiply to millions, or in the case of T. gigas, billions, and clam and algae will live in symbiosis for life.

Giant clam is a beloved staple in Palau and many other Pacific islands, prepared raw with lemon, simmered into coconut soup, baked into a savory pancake, or sliced and sautéed in a dozen other ways. But luxury demand for their ivory-like shells and their adductor muscle, which is coveted as high-end sashimi and an alleged aphrodisiac, has driven T. gigas extinct in China, Taiwan, and other parts of their native habitat. Some of the toughest marine-protection laws in the world, along with giant-clam aquaculture pioneered here, have helped Palau’s wild clams survive. The Palau Mariculture Demonstration Center raises hundreds of thousands of giant clams a year, supplying local clam farmers who sell to restaurants and the aquarium trade and keeping pressure off the wild population. But as other nations have wiped out their clams, Palau’s 230,000-square-mile ocean territory is an increasing target of illegal foreign fishers.

Barnett delves into how the country of Palau is responding to the voracious appetite for the giant clams and other marine life,

Palau, drawing on its ancient conservation tradition of bul, is fighting back. In 2015, President Tommy Remengesau Jr. signed into law the Palau National Marine Sanctuary Act, which prohibits fishing in 80 percent of Palau’s Exclusive Economic Zone and creates a domestic fishing area in the remaining 20 percent, set aside for local fishers selling to local markets. In 2016, the nation received a $6.6 million grant from Japan to launch a major renovation of the Palau Mariculture Demonstration Center. Now under construction at the waterfront on the southern tip of Malakal Island, the new facility will amp up clam-aquaculture research and increase giant-clam production five-fold, to more than a million seedlings a year.

Last year, Palau amended its immigration policy to require that all visitors sign a pledge to behave in an ecologically responsible manner. The pledge, stamped into passports by an immigration officer who watches you sign, is written to the island’s children:

Children of Palau, I take this pledge, as your guest, to preserve and protect your beautiful and unique island home. I vow to tread lightly, act kindly and explore mindfully. I shall not take what is not given. I shall not harm what does not harm me. The only footprints I shall leave are those that will wash away.

The pledge is winning hearts and public-relations awards. But Palau’s existential challenge is still the collective “we,” the world’s rising carbon emissions and the resulting upturns in global temperatures, sea levels, and destructive storms.

F. Umiich Sengebau, Palau’s Minister for Natural Resources, Environment, and Tourism, grew up on Koror and is full of giant-clam proverbs, wisdom and legends from his youth. He tells me a story I also heard from an elder in the state of Airai: that in old times, giant clams were known as “stormy-weather food,” the fresh staple that was easy to collect and have on hand when it was too stormy to go out fishing.

As Palau faces the storms of climate change, Sengebau sees giant clams becoming another sort of stormy-weather food, serving as a secure source of protein; a fishing livelihood; a glowing icon for tourists; and now, an inspiration for alternative energy and other low-carbon technologies. “In the old days, clams saved us,” Sengebau tells me. “I think there’s a lot of power in that, a great power and meaning in the history of clams as food, and now clams as science.”

I highly recommend Barnett’s article, which is one article in a larger series, from a November 6, 2017 The Atlantic press release,

The Atlantic is expanding the global footprint of its science writing today with a multi-year series to investigate life in all of its multitudes. The series, “Life Up Close,” created with support from Howard Hughes Medical Institute’s Department of Science Education (HHMI), begins today at TheAtlantic.com. In the first piece for the project, “The Zombie Diseases of Climate Change,” The Atlantic’s Robinson Meyer travels to Greenland to report on the potentially dangerous microbes emerging from thawing Arctic permafrost.

The project is ambitious in both scope and geographic reach, and will explore how life is adapting to our changing planet. Journalists will travel the globe to examine these changes as they happen to microbes, plants, and animals in oceans, grasslands, forests, deserts, and the icy poles. The Atlantic will question where humans should look for life next: from the Martian subsurface, to Europa’s oceans, to the atmosphere of nearby stars and beyond. “Life Up Close” will feature at least twenty reported pieces continuing through 2018.

“The Atlantic has been around for 160 years, but that’s a mere pinpoint in history when it comes to questions of life and where it started, and where we’re going,” said Ross Andersen, The Atlantic’s senior editor who oversees science, tech, and health. “The questions that this project will set out to tackle are critical; and this support will allow us to cover new territory in new and more ambitious ways.”

About The Atlantic:
Founded in 1857 and today one of the fastest growing media platforms in the industry, The Atlantic has throughout its history championed the power of big ideas and continues to shape global debate across print, digital, events, and video platforms. With its award-winning digital presence TheAtlantic.com and CityLab.com on cities around the world, The Atlantic is a multimedia forum on the most critical issues of our times—from politics, business, urban affairs, and the economy, to technology, arts, and culture. The Atlantic is celebrating its 160th anniversary this year. Bob Cohn is president of The Atlantic and Jeffrey Goldberg is editor in chief.

About the Howard Hughes Medical Institute (HHMI) Department of Science Education:
HHMI is the leading private nonprofit supporter of scientific research and science education in the United States. The Department of Science Education’s BioInteractive division produces free, high quality educational media for science educators and millions of students around the globe, its HHMI Tangled Bank Studios unit crafts powerful stories of scientific discovery for television and big screens, and its grants program aims to transform science education in universities and colleges. For more information, visit www.hhmi.org.

Getting back to the giant clams, sometimes all you can do is marvel, eh?

Thanks for the memory: the US National Institute of Standards and Technology (NIST) and memristors

In January 2018 it seemed like I was tripping across a lot of memristor stories. This came from a January 19, 2018 news item on Nanowerk,

In the race to build a computer that mimics the massive computational power of the human brain, researchers are increasingly turning to memristors, which can vary their electrical resistance based on the memory of past activity. Scientists at the National Institute of Standards and Technology (NIST) have now unveiled the long-mysterious inner workings of these semiconductor elements, which can act like the short-term memory of nerve cells.

A January 18, 2018 NIST news release (also on EurekAlert), which originated the news item, fills in the details,

Just as the ability of one nerve cell to signal another depends on how often the cells have communicated in the recent past, the resistance of a memristor depends on the amount of current that recently flowed through it. Moreover, a memristor retains that memory even when electrical power is switched off.
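To make that “memory of past current” idea concrete, here is an illustrative simulation of the simple linear dopant-drift memristor model proposed by Strukov and colleagues in 2008; the parameter values are made up for the sketch, and this is not a model of the NIST titanium dioxide device in the study.

```python
# Illustrative sketch only: the linear dopant-drift memristor model of
# Strukov et al. (2008), with made-up parameter values -- NOT a model of
# the NIST titanium dioxide device in the study.
R_ON, R_OFF = 100.0, 16000.0   # resistance when fully doped / undoped (ohms)
MU, D = 1e-14, 1e-8            # dopant mobility (m^2/(V*s)), film thickness (m)

def simulate(currents, dt=1e-3, w=0.5):
    """Advance the normalized doped-region width w under a current drive,
    returning the device resistance after each time step."""
    resistances = []
    for i in currents:
        w += MU * R_ON * i * dt / D ** 2   # dw/dt is proportional to current
        w = min(max(w, 0.0), 1.0)          # w stays within the film
        resistances.append(R_ON * w + R_OFF * (1.0 - w))
    return resistances

# Positive current grows the doped region, lowering the resistance;
# with the current off, the state (and resistance) is retained: memory.
rs = simulate([1e-4] * 50 + [0.0] * 50)
```

Driving current shifts the internal state variable and hence the resistance; once the current stops, the state stays put, which is the nonvolatile, nerve-cell-like memory the release describes.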

But despite the keen interest in memristors, scientists have lacked a detailed understanding of how these devices work and have yet to develop a standard toolset to study them.

Now, NIST scientists have identified such a toolset and used it to more deeply probe how memristors operate. Their findings could lead to more efficient operation of the devices and suggest ways to minimize the leakage of current.

Brian Hoskins of NIST and the University of California, Santa Barbara, along with NIST scientists Nikolai Zhitenev, Andrei Kolmakov, Jabez McClelland and their colleagues from the University of Maryland’s NanoCenter in College Park and the Institute for Research and Development in Microtechnologies in Bucharest, reported the findings in a recent issue of Nature Communications.

To explore the electrical function of memristors, the team aimed a tightly focused beam of electrons at different locations on a titanium dioxide memristor. The beam knocked free some of the device’s electrons, which formed ultrasharp images of those locations. The beam also induced four distinct currents to flow within the device. The team determined that the currents are associated with the multiple interfaces between materials in the memristor, which consists of two metal (conducting) layers separated by an insulator.

“We know exactly where each of the currents are coming from because we are controlling the location of the beam that is inducing those currents,” said Hoskins.

In imaging the device, the team found several dark spots—regions of enhanced conductivity—which indicated places where current might leak out of the memristor during its normal operation. These leakage pathways resided outside the memristor’s core—where it switches between the low and high resistance levels that are useful in an electronic device. The finding suggests that reducing the size of a memristor could minimize or even eliminate some of the unwanted current pathways. Although researchers had suspected that might be the case, they had lacked experimental guidance about just how much to reduce the size of the device.

Because the leakage pathways are tiny, involving distances of only 100 to 300 nanometers, “you’re probably not going to start seeing some really big improvements until you reduce dimensions of the memristor on that scale,” Hoskins said.

To their surprise, the team also found that the current that correlated with the memristor’s switch in resistance didn’t come from the active switching material at all, but the metal layer above it. The most important lesson of the memristor study, Hoskins noted, “is that you can’t just worry about the resistive switch, the switching spot itself, you have to worry about everything around it.” The team’s study, he added, “is a way of generating much stronger intuition about what might be a good way to engineer memristors.”

Here’s a link to and a citation for the paper,

Stateful characterization of resistive switching TiO2 with electron beam induced currents by Brian D. Hoskins, Gina C. Adam, Evgheni Strelcov, Nikolai Zhitenev, Andrei Kolmakov, Dmitri B. Strukov, & Jabez J. McClelland. Nature Communications 8, Article number: 1972 (2017) doi:10.1038/s41467-017-02116-9 Published online: 07 December 2017

This is an open access paper.

It might be my imagination but it seemed like a lot of papers from 2017 were being publicized in early 2018.

Finally, I borrowed much of my headline from the NIST’s headline for its news release, specifically, “Thanks for the memory,” which is a rather old song,

Bob Hope and Shirley Ross in “The Big Broadcast of 1938.”

‘Nano-hashtags’ for Majorana particles?

The ‘nano-hashtags’ are in fact (assuming a minor leap of imagination) nanowires that resemble hashtags.

Scanning electron microscope image of the device wherein clearly a ‘hashtag’ is formed. Credit: Eindhoven University of Technology

An August 23, 2017 news item on ScienceDaily makes the announcement,

In Nature, an international team of researchers from Eindhoven University of Technology [Netherlands], Delft University of Technology [Netherlands] and the University of California — Santa Barbara presents an advanced quantum chip that will be able to provide definitive proof of the mysterious Majorana particles. These particles, first demonstrated in 2012, are their own antiparticle at one and the same time. The chip, which comprises ultrathin networks of nanowires in the shape of ‘hashtags’, has all the qualities to allow Majorana particles to exchange places. This feature is regarded as the smoking gun for proving their existence and is a crucial step towards their use as a building block for future quantum computers.

An August 23, 2017 Eindhoven University press release (also on EurekAlert), which originated the news item, provides some context and information about the work,

In 2012 it was big news: researchers from Delft University of Technology and Eindhoven University of Technology presented the first experimental signatures for the existence of the Majorana fermion. This particle had been predicted in 1937 by the Italian physicist Ettore Majorana and has the distinctive property of also being its own anti-particle. The Majorana particles emerge at the ends of a semiconductor wire, when in contact with a superconductor material.

Smoking gun

While the discovered particles may have properties typical to Majoranas, the most exciting proof could be obtained by allowing two Majorana particles to exchange places, or ‘braid’ as it is scientifically known. “That’s the smoking gun,” suggests Erik Bakkers, one of the researchers from Eindhoven University of Technology. “The behavior we then see could be the most conclusive evidence yet of Majoranas.”

Crossroads

In the Nature paper that is published today [August 23, 2017], Bakkers and his colleagues present a new device that should be able to show this exchanging of Majoranas. In the original experiment in 2012 two Majorana particles were found in a single wire but they were not able to pass each other without immediately destroying the other. Thus the researchers quite literally had to create space. In the presented experiment they formed intersections using the same kinds of nanowire so that four of these intersections form a ‘hashtag’, #, and thus create a closed circuit along which Majoranas are able to move.

Etch and grow

The researchers built their hashtag device starting from scratch. The nanowires are grown from a specially etched substrate such that they form exactly the desired network which they then expose to a stream of aluminium particles, creating layers of aluminium, a superconductor, on specific spots on the wires – the contacts where the Majorana particles emerge. Places that lie ‘in the shadow’ of other wires stay uncovered.

Leap in quality

The entire process happens in a vacuum and at ultra-cold temperatures (around -273 degrees Celsius). “This ensures very clean, pure contacts,” says Bakkers, “and enables us to make a considerable leap in the quality of this kind of quantum device.” The measurements demonstrate, for a number of electronic and magnetic properties, that all the ingredients are present for the Majoranas to braid.

Quantum computers

If the researchers succeed in enabling the Majorana particles to braid, they will at once have killed two birds with one stone. Given their robustness, Majoranas are regarded as the ideal building block for future quantum computers that will be able to perform many calculations simultaneously and thus many times faster than current computers. The braiding of two Majorana particles could form the basis for a qubit, the calculation unit of these computers.

Travel around the world

An interesting detail is that the samples traveled around the world during fabrication, combining the unique and complementary capabilities of each research institution. The process started in Delft with patterning and etching of the substrate, moved to Eindhoven for nanowire growth and then to Santa Barbara for aluminium contact formation, and finally returned to Delft, via Eindhoven, for the measurements.

Here’s a link to and a citation for the paper,

Epitaxy of advanced nanowire quantum devices by Sasa Gazibegovic, Diana Car, Hao Zhang, Stijn C. Balk, John A. Logan, Michiel W. A. de Moor, Maja C. Cassidy, Rudi Schmits, Di Xu, Guanzhong Wang, Peter Krogstrup, Roy L. M. Op het Veld, Kun Zuo, Yoram Vos, Jie Shen, Daniël Bouman, Borzoyeh Shojaei, Daniel Pennachio, Joon Sue Lee, Petrus J. van Veldhoven, Sebastian Koelling, Marcel A. Verheijen, Leo P. Kouwenhoven, Chris J. Palmstrøm, & Erik P. A. M. Bakkers. Nature 548, 434–438 (24 August 2017) doi:10.1038/nature23468 Published online 23 August 2017

This paper is behind a paywall.

Dexter Johnson has some additional insight (an interview with one of the researchers) in an Aug. 29, 2017 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website).

Gamechanging electronics with new ultrafast, flexible, and transparent electronics

There are two news bits about game-changing electronics, one from the UK and the other from the US.

United Kingdom (UK)

An April 3, 2017 news item on Azonano announces the possibility of a future golden age of electronics courtesy of the University of Exeter,

Engineering experts from the University of Exeter have come up with a breakthrough way to create the smallest, quickest, highest-capacity memories for transparent and flexible applications that could lead to a future golden age of electronics.

A March 31, 2017 University of Exeter press release (also on EurekAlert), which originated the news item, expands on the theme (Note: Links have been removed),

Engineering experts from the University of Exeter have developed innovative new memory using a hybrid of graphene oxide and titanium oxide. Their devices are low cost and eco-friendly to produce, are also perfectly suited for use in flexible electronic devices such as ‘bendable’ mobile phone, computer and television screens, and even ‘intelligent’ clothing.

Crucially, these devices may also have the potential to offer a cheaper and more adaptable alternative to ‘flash memory’, which is currently used in many common devices such as memory cards, graphics cards and USB computer drives.

The research team insist that these innovative new devices have the potential to revolutionise not only how data is stored, but also take flexible electronics to a new age in terms of speed, efficiency and power.

Professor David Wright, an Electronic Engineering expert from the University of Exeter and lead author of the paper said: “Using graphene oxide to produce memory devices has been reported before, but they were typically very large, slow, and aimed at the ‘cheap and cheerful’ end of the electronics goods market.

“Our hybrid graphene oxide-titanium oxide memory is, in contrast, just 50 nanometres long and 8 nanometres thick and can be written to and read from in less than five nanoseconds – with one nanometre being one billionth of a metre and one nanosecond a billionth of a second.”

Professor Craciun, a co-author of the work, added: “Being able to improve data storage is the backbone of tomorrow’s knowledge economy, as well as industry on a global scale. Our work offers the opportunity to completely transform graphene-oxide memory technology, and the potential and possibilities it offers.”

Here’s a link to and a citation for the paper,

Multilevel Ultrafast Flexible Nanoscale Nonvolatile Hybrid Graphene Oxide–Titanium Oxide Memories by V. Karthik Nagareddy, Matthew D. Barnes, Federico Zipoli, Khue T. Lai, Arseny M. Alexeev, Monica Felicia Craciun, and C. David Wright. ACS Nano, 2017, 11 (3), pp 3010–3021 DOI: 10.1021/acsnano.6b08668 Publication Date (Web): February 21, 2017

Copyright © 2017 American Chemical Society

This paper appears to be open access.

United States (US)

Researchers from Stanford University have developed flexible, biodegradable electronics.

A newly developed flexible, biodegradable semiconductor developed by Stanford engineers shown on a human hair. (Image credit: Bao lab)

A human hair? That’s amazing and this May 3, 2017 news item on Nanowerk reveals more,

As electronics become increasingly pervasive in our lives – from smart phones to wearable sensors – so too does the ever rising amount of electronic waste they create. A United Nations Environment Program report found that almost 50 million tons of electronic waste would be thrown out in 2017 – more than 20 percent higher than the waste generated in 2015.

Troubled by this mounting waste, Stanford engineer Zhenan Bao and her team are rethinking electronics. “In my group, we have been trying to mimic the function of human skin to think about how to develop future electronic devices,” Bao said. She described how skin is stretchable, self-healable and also biodegradable – an attractive list of characteristics for electronics. “We have achieved the first two [flexible and self-healing], so the biodegradability was something we wanted to tackle.”

The team created a flexible electronic device that can easily degrade just by adding a weak acid like vinegar. The results were published in the Proceedings of the National Academy of Sciences (“Biocompatible and totally disintegrable semiconducting polymer for ultrathin and ultralightweight transient electronics”).

“This is the first example of a semiconductive polymer that can decompose,” said lead author Ting Lei, a postdoctoral fellow working with Bao.

A May 1, 2017 Stanford University news release by Sarah Derouin, which originated the news item, provides more detail,

In addition to the polymer – essentially a flexible, conductive plastic – the team developed a degradable electronic circuit and a new biodegradable substrate material for mounting the electrical components. This substrate supports the electrical components, flexing and molding to rough and smooth surfaces alike. When the electronic device is no longer needed, the whole thing can biodegrade into nontoxic components.

Biodegradable bits

Bao, a professor of chemical engineering and materials science and engineering, had previously created a stretchable electrode modeled on human skin. That material could bend and twist in a way that could allow it to interface with the skin or brain, but it couldn’t degrade. That limited its application for implantable devices and – important to Bao – contributed to waste.

Flexible, biodegradable semiconductor on an avocado

The flexible semiconductor can adhere to smooth or rough surfaces and biodegrade to nontoxic products. (Image credit: Bao lab)

Bao said that creating a robust material that is both a good electrical conductor and biodegradable was a challenge, considering traditional polymer chemistry. “We have been trying to think how we can achieve both great electronic property but also have the biodegradability,” Bao said.

Eventually, the team found that by tweaking the chemical structure of the flexible material it would break apart under mild stressors. “We came up with an idea of making these molecules using a special type of chemical linkage that can retain the ability for the electron to smoothly transport along the molecule,” Bao said. “But also this chemical bond is sensitive to weak acid – even weaker than pure vinegar.” The result was a material that could carry an electronic signal but break down without requiring extreme measures.

In addition to the biodegradable polymer, the team developed a new type of electrical component and a substrate material that attaches to the entire electronic component. Electronic components are usually made of gold. But for this device, the researchers crafted components from iron. Bao noted that iron is a very environmentally friendly product and is nontoxic to humans.

The researchers created the substrate, which carries the electronic circuit and the polymer, from cellulose. Cellulose is the same substance that makes up paper. But unlike paper, the team altered cellulose fibers so the “paper” is transparent and flexible, while still breaking down easily. The thin film substrate allows the electronics to be worn on the skin or even implanted inside the body.

From implants to plants

The combination of a biodegradable conductive polymer and substrate makes the electronic device useful in a plethora of settings – from wearable electronics to large-scale environmental surveys with sensor dusts.

“We envision these soft patches that are very thin and conformable to the skin that can measure blood pressure, glucose value, sweat content,” Bao said. A person could wear a specifically designed patch for a day or week, then download the data. According to Bao, this short-term use of disposable electronics seems a perfect fit for a degradable, flexible design.

And it’s not just for skin surveys: the biodegradable substrate, polymers and iron electrodes make the entire component compatible with insertion into the human body. The polymer breaks down to product concentrations much lower than the published acceptable levels found in drinking water. Although the polymer was found to be biocompatible, Bao said that more studies would need to be done before implants are a regular occurrence.

Biodegradable electronics have the potential to go far beyond collecting heart disease and glucose data. These components could be used in places where surveys cover large areas in remote locations. Lei described a research scenario where biodegradable electronics are dropped by airplane over a forest to survey the landscape. “It’s a very large area and very hard for people to spread the sensors,” he said. “Also, if you spread the sensors, it’s very hard to gather them back. You don’t want to contaminate the environment so we need something that can be decomposed.” Instead of plastic littering the forest floor, the sensors would biodegrade away.

As the number of electronic devices increases, biodegradability will become more important. Lei is excited by their advancements and wants to keep improving the performance of biodegradable electronics. “We currently have computers and cell phones, and we generate millions and billions of cell phones, and it’s hard to decompose,” he said. “We hope we can develop some materials that can be decomposed so there is less waste.”

Other authors on the study include Ming Guan, Jia Liu, Hung-Cheng Lin, Raphael Pfattner, Leo Shaw, Allister McGuire, and Jeffrey Tok of Stanford University; Tsung-Ching Huang of Hewlett Packard Enterprise; and Lei-Lai Shao and Kwang-Ting Cheng of University of California, Santa Barbara.

The research was funded by the Air Force Office for Scientific Research; BASF; Marie Curie Cofund; Beatriu de Pinós fellowship; and the Kodak Graduate Fellowship.

Here’s a link to and a citation for the team’s latest paper,

Biocompatible and totally disintegrable semiconducting polymer for ultrathin and ultralightweight transient electronics by Ting Lei, Ming Guan, Jia Liu, Hung-Cheng Lin, Raphael Pfattner, Leo Shaw, Allister F. McGuire, Tsung-Ching Huang, Leilai Shao, Kwang-Ting Cheng, Jeffrey B.-H. Tok, and Zhenan Bao. PNAS 2017 doi: 10.1073/pnas.1701478114 published ahead of print May 1, 2017

This paper is behind a paywall.

The mention of cellulose in the second item piqued my interest so I checked to see if they’d used nanocellulose. No, they did not. Microcrystalline cellulose powder was used to constitute a cellulose film but they found a way to render this film at the nanoscale. From the Stanford paper (Note: Links have been removed),

… Moreover, cellulose films have been previously used as biodegradable substrates in electronics (28–30). However, these cellulose films are typically made with thicknesses well over 10 μm and thus cannot be used to fabricate ultrathin electronics with substrate thicknesses below 1–2 μm (7, 18, 19). To the best of our knowledge, there have been no reports on ultrathin (1–2 μm) biodegradable substrates for electronics. Thus, to realize them, we subsequently developed a method described herein to obtain ultrathin (800 nm) cellulose films (Fig. 1B and SI Appendix, Fig. S8). First, microcrystalline cellulose powders were dissolved in LiCl/N,N-dimethylacetamide (DMAc) and reacted with hexamethyldisilazane (HMDS) (31, 32), providing trimethylsilyl-functionalized cellulose (TMSC) (Fig. 1B). To fabricate films or devices, TMSC in chlorobenzene (CB) (70 mg/mL) was spin-coated on a thin dextran sacrificial layer. The TMSC film was measured to be 1.2 μm. After hydrolyzing the film in 95% acetic acid vapor for 2 h, the trimethylsilyl groups were removed, giving a 400-nm-thick cellulose film. The film thickness significantly decreased to one-third of the original film thickness, largely due to the removal of the bulky trimethylsilyl groups. The hydrolyzed cellulose film is insoluble in most organic solvents, for example, toluene, THF, chloroform, CB, and water. Thus, we can sequentially repeat the above steps to obtain an 800-nm-thick film, which is robust enough for further device fabrication and peel-off. By soaking the device in water, the dextran layer is dissolved, starting from the edges of the device to the center. This process ultimately releases the ultrathin substrate and leaves it floating on water surface (Fig. 3A, Inset).

Finally, I don’t have any grand thoughts; it’s just interesting to see different approaches to flexible electronics.

The Center for Nanotechnology in Society at the University of California at Santa Barbara offers a ‘swan song’ in three parts

I gather the University of California at Santa Barbara’s (UCSB) Center for Nanotechnology in Society is ‘sunsetting’ as its funding runs out. A Nov. 9, 2016 UCSB news release by Brandon Fastman describes the center’s ‘swan song’,

After more than a decade, the UCSB Center for Nanotechnology in Society research has provided new and deep knowledge of how technological innovation and social change impact one another. Now, as the national center reaches the end of its term, its three primary research groups have published synthesis reports that bring together important findings from their 11 years of activity.

The reports, which include policy recommendations, are available for free download at the CNS web site at

http://www.cns.ucsb.edu/irg-synthesis-reports.

The ever-increasing ability of scientists to manipulate matter on the molecular level brings with it the potential for science fiction-like technologies such as nanoelectronic sensors that would entail “merging tissue with electronics in a way that it becomes difficult to determine where the tissue ends and the electronics begin,” according to a Harvard chemist in a recent CQ Researcher report. While the life-altering ramifications of such technologies are clear, it is less clear how they might impact the larger society to which they are introduced.

CNS research, as detailed in the reports, addresses such gaps in knowledge. For instance, when anthropologist Barbara Herr Harthorn and her collaborators at the UCSB Center for Nanotechnology in Society (CNS-UCSB) convened public deliberations to discuss the promises and perils of health and human enhancement nanotechnologies, they thought that participants might be concerned about medical risks. However, that is not exactly what they found.

Participants were less worried about medical or technological mishaps than about the equitable distribution of the risks and benefits of new technologies and fair procedures for addressing potential problems. That is, they were unconvinced that citizens across the socioeconomic spectrum would share equal access to the benefits of therapies or equal exposure to their pitfalls.

In describing her work, Harthorn explained, “Intuitive assumptions of experts and practitioners about public perceptions and concerns are insufficient to understanding the societal contexts of technologies. Relying on intuition often leads to misunderstandings of social and institutional realities. CNS-UCSB has attempted to fill in the knowledge gaps through methodologically sophisticated empirical and theoretical research.”

In her role as Director of CNS-UCSB, Harthorn has overseen a larger effort to promote the responsible development of sophisticated materials and technologies seen as central to the nation’s economic future. By pursuing this goal, researchers at CNS-UCSB, which closed its doors at the end of the summer, have advanced the role for the social, economic, and behavioral sciences in understanding technological innovation.

Harthorn has spent the past 11 years trying to understand public expectations, values, beliefs, and perceptions regarding nanotechnologies. Along with conducting deliberations, she has worked with toxicologists and engineers to examine the environmental and occupational risks of nanotechnologies, determine gaps in the U.S. regulatory system, and survey nanotechnology experts. Work has also expanded to comparative studies of other emerging technologies such as shale oil and gas extraction (fracking).

Along with Harthorn’s research group on risk perception and social response, CNS-UCSB housed two other main research groups. One, led by sociologist Richard Appelbaum, studied the impacts of nanotechnology on the global economy. The other, led by historian Patrick McCray, studied the technologies, communities, and individuals that have shaped the direction of nanotechnology research.

Appelbaum’s research program included studying how state policies regarding nanotechnology – especially in China and Latin America – have impacted commercialization. Research trips to China yielded a greater understanding of that nation’s research culture and its capacity to produce original intellectual property. He also studied the role of international collaboration in spurring technological innovation. As part of this research, his collaborators surveyed and interviewed international STEM graduate students in the United States in order to understand the factors that influence their choice of whether to remain abroad or return home.

In examining the history of nanotechnology, McCray’s group explained how the microelectronics industry provided a template for what became known as nanotechnology, examined educational policies aimed at training a nano-workforce, and produced a history of the scanning tunneling microscope. They also penned award-winning monographs including McCray’s book, The Visioneers: How a Group of Elite Scientists Pursued Space Colonies, Nanotechnologies, and a Limitless Future.

Reaching the Real World

Funded as a National Center by the US National Science Foundation in 2005, CNS-UCSB was explicitly intended to enhance the understanding of the relationship between new technologies and their societal context. After more than a decade of funding, CNS-UCSB research has provided a deep understanding of the relationship between technological innovation and social change.

New developments in nanotechnology, an area of research that has garnered $24 billion in funding from the U.S. federal government since 2001, impact sectors as far ranging as agriculture, medicine, energy, defense, and construction, posing great challenges for policymakers and regulators who must consider questions of equity, sustainability, occupational and environmental health and safety, economic and educational policy, disruptions to privacy, security and even what it means to be human. (A nanometer is roughly 10,000 times smaller than the diameter of a human hair.) Nanoscale materials are already integrated into food packaging, electronics, solar cells, cosmetics, and pharmaceuticals, and are in development for drugs that can target specific cells, microscopic spying devices, and quantum computers.

Given such real-world applications, it was important to CNS researchers that the results of their work not remain confined within the halls of academia. Therefore, they have delivered testimony to Congress, federal and state agencies (including the National Academies of Science, the Centers for Disease Control and Prevention, the Presidential Council of Advisors on Science and Technology, the U.S. Presidential Bioethics Commission and the National Nanotechnology Initiative), policy outfits (including the Washington Center for Equitable Growth), and international agencies (including the World Bank, European Commission, and World Economic Forum). They’ve collaborated with nongovernmental organizations. They’ve composed policy briefs and op eds, and their work has been covered by numerous news organizations including, recently, NPR, The New Yorker, and Forbes. They have also given many hundreds of lectures to audiences in community groups, schools, and museums.

Policy Options

Most notably, in their final act before the center closed, each of the three primary research groups published synthesis reports that bring together important findings from their 11 years of activity. Their titles are:

Exploring Nanotechnology’s Origins, Institutions, and Communities: A Ten Year Experiment in Large Scale Collaborative STS Research

Globalization and Nanotechnology: The Role of State Policy and International Collaboration

Understanding Nanotechnologies’ Risks and Benefits: Emergence, Expertise and Upstream Participation.

A sampling of key policy recommendations follows:

1.     Public acceptability of nanotechnologies is driven by: benefit perception, the type of application, and the risk messages transmitted from trusted sources and their stability over time; therefore transparent and responsible risk communication is a critical aspect of acceptability.

2.     Social risks, particularly issues of equity and politics, are primary, not secondary, drivers of perception and need to be fully addressed in any new technology development. We have devoted particular attention to studying how gender and race/ethnicity affect both public and expert risk judgments.

3.     State policies aimed at fostering science and technology development should clearly continue to emphasize basic research, but not to the exclusion of supporting promising innovative payoffs. The National Nanotechnology Initiative, with its overwhelming emphasis on basic research, would likely achieve greater success in spawning thriving businesses and commercialization by investing more in capital programs such as the Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs, self-described as “America’s seed fund.”

4.     While nearly half of all international STEM graduate students would like to stay in the U.S. upon graduation, fully 40 percent are undecided — and a main barrier is current U.S. immigration policy.

5.     Although representatives from the nanomaterials industry demonstrate relatively high perceived risk regarding engineered nanomaterials, they likewise demonstrate low sensitivity to variance in risks across type of engineered nanomaterials, and a strong disinclination to regulation. This situation puts workers at significant risk and probably requires regulatory action now (beyond the currently favored voluntary or ‘soft law’ approaches).

6.     The complex nature of technological ecosystems translates into a variety of actors essential for successful innovation. One species is the Visioneer, a person who blends engineering experience with a transformative vision of the technological future and a willingness to promote this vision to the public and policy makers.

Leaving a Legacy

Along with successful outreach efforts, CNS-UCSB also flourished when measured by typical academic metrics, including nearly 400 publications and 1,200 talks.

In addition to producing groundbreaking interdisciplinary research, CNS-UCSB also produced innovative educational programs, reaching 200 professionals-in-training from the undergraduate to postdoctoral levels. The Center’s educational centerpiece was a graduate fellowship program, referred to as “magical” by an NSF reviewer, that integrated doctoral students from disciplines across the UCSB campus into ongoing social science research projects.

For social scientists, working side-by-side with science and engineering students gave them an appreciation for the methods, culture, and ethics of their colleagues in different disciplines. It also led to methodological innovation. For their part, scientists and engineers were able to understand the larger context of their work at the bench.

UCSB graduates who participated in CNS’s educational programs have gone on to work as postdocs and professors at universities (including MIT, Stanford, U Penn), policy experts (at organizations like the Science Technology and Policy Institute and the Canadian Institute for Advanced Research), researchers at government agencies (like the National Institute for Standards and Technology), nonprofits (like the Kauffman Foundation), and NGOs. Others work in industry, and some have become entrepreneurs, starting their own businesses.

CNS has spawned lines of research that will continue at UCSB and the institutions of collaborators around the world, but its most enduring legacy will be the students it trained. They bring a true understanding of the complex interconnections between technology and society — along with an intellectual toolkit for examining them — to every sector of the economy, and they will continue to pursue a world that is as just as it is technologically advanced.

I found the policy recommendations interesting, especially this one:

5.     Although representatives from the nanomaterials industry demonstrate relatively high perceived risk regarding engineered nanomaterials, they likewise demonstrate low sensitivity to variance in risks across type of engineered nanomaterials, and a strong disinclination to regulation. This situation puts workers at significant risk and probably requires regulatory action now (beyond the currently favored voluntary or ‘soft law’ approaches).

Without having read the documents, I’m not sure how to respond but I do have a question.  Just how much regulation are they suggesting?

I offer all of the people associated with the center my thanks for all their hard work and my gratitude for the support I received from the center when I presented at the Society for the Study of Nanoscience and Emerging Technologies (S.NET) conference in 2012. I’m glad to see they’re going out with a bang.

The memristor as computing device

An Oct. 27, 2016 news item on Nanowerk both builds on the Richard Feynman legend/myth and announces some new work with memristors,

In 1959 renowned physicist Richard Feynman, in his talk “[There’s] Plenty of Room at the Bottom,” spoke of a future in which tiny machines could perform huge feats. Like many forward-looking concepts, his molecule and atom-sized world remained for years in the realm of science fiction.

And then, scientists and other creative thinkers began to realize Feynman’s nanotechnological visions.

In the spirit of Feynman’s insight, and in response to the challenges he issued as a way to inspire scientific and engineering creativity, electrical and computer engineers at UC Santa Barbara [University of California at Santa Barbara, UCSB] have developed a design for a functional nanoscale computing device. The concept involves a dense, three-dimensional circuit operating on an unconventional type of logic that could, theoretically, be packed into a block no bigger than 50 nanometers on any side.

A figure depicting the structure of stacked memristors with dimensions that could satisfy the Feynman Grand Challenge. Photo Credit: Courtesy Image

An Oct. 27, 2016 UCSB news release (also on EurekAlert) by Sonia Fernandez, which originated the news item, offers a basic explanation of the work (useful for anyone unfamiliar with memristors) along with more detail,

“Novel computing paradigms are needed to keep up with the demand for faster, smaller and more energy-efficient devices,” said Gina Adam, postdoctoral researcher at UCSB’s Department of Computer Science and lead author of the paper “Optimized stateful material implication logic for three dimensional data manipulation,” published in the journal Nano Research. “In a regular computer, data processing and memory storage are separated, which slows down computation. Processing data directly inside a three-dimensional memory structure would allow more data to be stored and processed much faster.”

While efforts to shrink computing devices have been ongoing for decades — in fact, Feynman’s challenges as he presented them in his 1959 talk have been met — scientists and engineers continue to carve out room at the bottom for even more advanced nanotechnology. A nanoscale 8-bit adder operating in 50-by-50-by-50 nanometer dimension, put forth as part of the current Feynman Grand Prize challenge by the Foresight Institute, has not yet been achieved. However, the continuing development and fabrication of progressively smaller components is bringing this virus-sized computing device closer to reality, said Dmitri Strukov, a UCSB professor of computer science.

“Our contribution is that we improved the specific features of that logic and designed it so it could be built in three dimensions,” he said.

Key to this development is the use of a logic system called material implication logic combined with memristors – circuit elements whose resistance depends on the amount and direction of charge that has previously flowed through them. Unlike the conventional computing logic and circuitry found in our present computers and other devices, in this form of computing, logic operation and information storage happen simultaneously and locally. This greatly reduces the need for components and space typically used to perform logic operations and to move data back and forth between operation and memory storage. The result of the computation is immediately stored in a memory element, which prevents data loss in the event of a power outage – a critical function in autonomous systems such as robotics.
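To make the idea of logic that stores its own result more concrete, here is a minimal, hypothetical sketch of material implication (IMPLY) as a Boolean operation. This is illustrative only, not code from the paper: in a real memristor circuit each bit lives in a device’s resistance state, and the IMPLY step overwrites the second operand in place.

```python
# Sketch (assumed, illustrative): stateful material implication (IMPLY).
# In a memristive circuit, IMPLY(p, q) leaves the result in the q device,
# so computation and storage happen in the same element.

def imply(p: bool, q: bool) -> bool:
    """Material implication: the result is (NOT p) OR q."""
    return (not p) or q

def nand(p: bool, q: bool) -> bool:
    """NAND built from two IMPLY steps plus one cleared (False) memristor:
    NAND(p, q) = p IMPLY (q IMPLY 0)."""
    work = imply(q, False)   # work = NOT q
    return imply(p, work)    # (NOT p) OR (NOT q) = NAND(p, q)

# Print the truth tables.
for p in (False, True):
    for q in (False, True):
        print(p, q, imply(p, q), nand(p, q))
```

Because NAND is functionally complete, the ability to perform IMPLY (with a reset operation) is enough, in principle, to build any logic circuit inside the memory array itself.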

In addition, the researchers reconfigured the traditionally two-dimensional architecture of the memristor into a three-dimensional block, which could then be stacked and packed into the space required to meet the Feynman Grand Prize Challenge.

“Previous groups show that individual blocks can be scaled to very small dimensions, let’s say 10-by-10 nanometers,” said Strukov, who worked at technology company Hewlett-Packard’s labs when they ramped up development of memristors and material implication logic. By applying those results to his group’s developments, he said, the challenge could easily be met.

The tiny memristors are being heavily researched in academia and in industry for their promising uses in memory storage and neuromorphic computing. While implementations of material implication logic are rather exotic and not yet mainstream, uses for it could pop up any time, particularly in energy scarce systems such as robotics and medical implants.

“Since this technology is still new, more research is needed to increase its reliability and lifetime and to demonstrate large scale three-dimensional circuits tightly packed in tens or hundreds of layers,” Adam said.

HP Labs, mentioned in the news release, announced the ‘discovery’ of memristors and subsequent application of engineering control in two papers in 2008.

Here’s a link to and a citation for the UCSB paper,

Optimized stateful material implication logic for three-dimensional data manipulation by Gina C. Adam, Brian D. Hoskins, Mirko Prezioso, & Dmitri B. Strukov. Nano Research (2016) pp. 1–10. DOI: 10.1007/s12274-016-1260-1 First online: 29 September 2016

This paper is behind a paywall.

You can find many articles about memristors here by using either ‘memristor’ or ‘memristors’ as your search term.

Ocean-inspired coatings for organic electronics

An Oct. 19, 2016 news item on phys.org describes the advantages a new coating offers and the specific source of inspiration,

In a development beneficial for both industry and environment, UC Santa Barbara [University of California at Santa Barbara] researchers have created a high-quality coating for organic electronics that promises to decrease processing time as well as energy requirements.

“It’s faster, and it’s nontoxic,” said Kollbe Ahn, a research faculty member at UCSB’s Marine Science Institute and corresponding author of a paper published in Nano Letters.

In the manufacture of polymer (also known as “organic”) electronics—the technology behind flexible displays and solar cells—the material used to direct and move current is of supreme importance. Since defects reduce efficiency and functionality, special attention must be paid to quality, even down to the molecular level.

Often that can mean long processing times, or relatively inefficient processes. It can also mean the use of toxic substances. Alternatively, manufacturers can choose to speed up the process, which could cost energy or quality.

Fortunately, as it turns out, efficiency, performance and sustainability don’t always have to be traded against each other in the manufacture of these electronics. Looking no further than the campus beach, the UCSB researchers have found inspiration in the mollusks that live there. Mussels, which have perfected the art of clinging to virtually any surface in the intertidal zone, serve as the model for a molecularly smooth, self-assembled monolayer for high-mobility polymer field-effect transistors—in essence, a surface coating that can be used in the manufacture and processing of the conductive polymer that maintains its efficiency.

An Oct. 18, 2016 UCSB news release by Sonia Fernandez, which originated the news item, provides greater technical detail,

More specifically, according to Ahn, it was the mussel’s adhesion mechanism that stirred the researchers’ interest. “We’re inspired by the proteins at the interface between the plaque and substrate,” he said.

Before mussels attach themselves to the surfaces of rocks, pilings or other structures found in the inhospitable intertidal zone, they secrete proteins through the ventral groove of their feet in an incremental fashion. In a step that enhances bonding performance, a thin priming layer of protein molecules is first laid down as a bridge between the substrate and the other adhesive proteins in the plaques that tip the byssus threads of their feet, overcoming the barrier posed by water and other impurities.

A zwitterionic molecule of that type — carrying both positive and negative charges — inspired by the mussel’s native proteins (polyampholytes) can self-assemble into a sub-nanometer-thin layer in water at ambient temperature in a few seconds. The defect-free monolayer provides a platform for aligning conductive polymers in the appropriate orientation on various dielectric surfaces.

Current methods of treating silicon surfaces (the most common dielectric surface) for the production of organic field-effect transistors require a batch processing method that is relatively impractical, said Ahn. Although heat can hasten this step, it consumes energy and increases the risk of defects.

With this bio-inspired coating mechanism, a continuous roll-to-roll dip coating method of producing organic electronic devices is possible, according to the researchers. It also avoids the use of toxic chemicals and their disposal, by replacing them with water.

“The environmental significance of this work is that these new bio-inspired primers allow for nanofabrication on silicon dioxide surfaces in the absence of organic solvents, high reaction temperatures and toxic reagents,” said co-author Roscoe Linstadt, a graduate student researcher in UCSB chemistry professor Bruce Lipshutz’s lab. “In order for practitioners to switch to newer, more environmentally benign protocols, they need to be competitive with existing ones, and thankfully device performance is improved by using this ‘greener’ method.”

Here’s a link to and a citation for the research paper,

Molecularly Smooth Self-Assembled Monolayer for High-Mobility Organic Field-Effect Transistors by Saurabh Das, Byoung Hoon Lee, Roscoe T. H. Linstadt, Keila Cunha, Youli Li, Yair Kaufman, Zachary A. Levine, Bruce H. Lipshutz, Roberto D. Lins, Joan-Emma Shea, Alan J. Heeger, and B. Kollbe Ahn. Nano Lett., 2016, 16 (10), pp. 6709–6715. DOI: 10.1021/acs.nanolett.6b03860 Publication date (web): September 27, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall but the scientists have made an illustration available,

An artist’s concept of a zwitterionic molecule of the type secreted by mussels to prime surfaces for adhesion. Photo Credit: Peter Allen

Connecting chaos and entanglement

Researchers seem to have stumbled across a link between classical and quantum physics. A July 12, 2016 University of California at Santa Barbara (UCSB) news release (also on EurekAlert) by Sonia Fernandez provides a description of both classical and quantum physics, as well as the research that connects the two,

Using a small quantum system consisting of three superconducting qubits, researchers at UC Santa Barbara and Google have uncovered a link between aspects of classical and quantum physics thought to be unrelated: classical chaos and quantum entanglement. Their findings suggest that it would be possible to use controllable quantum systems to investigate certain fundamental aspects of nature.

“It’s kind of surprising because chaos is this totally classical concept — there’s no idea of chaos in a quantum system,” said Charles Neill, a researcher in the UCSB Department of Physics and lead author of a paper that appears in Nature Physics. “Similarly, there’s no concept of entanglement within classical systems. And yet it turns out that chaos and entanglement are really very strongly and clearly related.”

Developed over centuries, classical physics generally examines and describes systems larger than atoms and molecules. It encompasses hundreds of years’ worth of study, including Newton’s laws of motion, electrodynamics, relativity and thermodynamics, as well as chaos theory — the field that studies the behavior of highly sensitive and unpredictable systems. One classic example of chaos theory is the weather, in which a relatively small change in one part of the system is enough to foil predictions — and vacation plans — anywhere on the globe.
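
That sensitivity to initial conditions is easy to see in one of the simplest chaotic systems, the logistic map. This is a generic textbook illustration, not part of the UCSB study:

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r*x*(1 - x), a standard toy model of classical chaos (generic
# illustration, not part of the UCSB study). Two trajectories that
# start a billionth apart become completely different within dozens
# of iterations.
r = 3.9                      # parameter value in the chaotic regime
xa, xb = 0.2, 0.2 + 1e-9     # nearly identical starting points
sep = []
for _ in range(100):
    xa = r * xa * (1 - xa)
    xb = r * xb * (1 - xb)
    sep.append(abs(xa - xb))

print(sep[0])       # still tiny: the trajectories look identical at first
print(max(sep))     # order-one separation: prediction has broken down
```

The separation grows roughly exponentially until it is as large as the system itself — the same mechanism that limits weather forecasts to days rather than months.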

At smaller size and length scales in nature, however, such as those involving atoms and photons and their behaviors, classical physics falls short. In the early 20th century quantum physics emerged, with its seemingly counterintuitive and sometimes controversial science, including the notions of superposition (the theory that a particle can be located in several places at once) and entanglement (particles that are deeply linked behave as such despite physical distance from one another).

And so began the continuing search for connections between the two fields.

All systems are fundamentally quantum systems, according to Neill, but the means of describing, in a quantum sense, the chaotic behavior of, say, air molecules in an evacuated room remain limited.

Imagine taking a balloon full of air molecules, somehow tagging them so you could see them, and then releasing them into a room with no air molecules, noted co-author and UCSB/Google researcher Pedram Roushan. One possible outcome is that the air molecules remain clumped together in a little cloud following the same trajectory around the room. And yet, he continued, as we can probably intuit, the molecules will more likely take off in a variety of velocities and directions, bouncing off walls and interacting with each other, coming to rest only once the room is sufficiently saturated with them.

“The underlying physics is chaos, essentially,” he said. The molecules coming to rest — at least on the macroscopic level — is the result of thermalization, or of reaching equilibrium after they have achieved uniform saturation within the system. But in the infinitesimal world of quantum physics, there is still little to describe that behavior. The mathematics of quantum mechanics, Roushan said, do not allow for the chaos described by Newtonian laws of motion.

To investigate, the researchers devised an experiment using three quantum bits, the basic computational units of the quantum computer. Unlike classical computer bits, which utilize a binary system of two possible states (e.g., zero/one), a qubit can also use a superposition of both states (zero and one) as a single state. Additionally, multiple qubits can entangle, or link so closely that their measurements will automatically correlate. By manipulating these qubits with electronic pulses, Neill caused them to interact, rotate and evolve in the quantum analog of a highly sensitive classical system.
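
The correlated measurements that define entanglement can be sketched with two qubits — a simpler, generic textbook case than the paper’s three-qubit experiment:

```python
import numpy as np

# Two-qubit Bell state (a textbook example, simpler than the paper's
# three-qubit experiment). Each qubit alone is in a 50/50 superposition,
# yet the two measurement outcomes always agree: only 00 and 11 occur.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# |Phi+> = (|00> + |11>) / sqrt(2), built with the Kronecker product.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Born rule: probabilities of the four outcomes 00, 01, 10, 11.
probs = np.abs(bell) ** 2
print(probs)   # 00 and 11 each with probability 1/2; 01 and 10 never occur
```

Measuring either qubit by itself gives a random result, but the two results always match — the automatic correlation described above.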

The result is a map of entanglement entropy of a qubit that, over time, comes to strongly resemble that of classical dynamics — the regions of entanglement in the quantum map resemble the regions of chaos on the classical map. The islands of low entanglement in the quantum map are located in the places of low chaos on the classical map.
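
The quantity being mapped — entanglement entropy — can be computed for a toy two-qubit case from the reduced density matrix. The function below is a generic sketch, not the group’s analysis code:

```python
import numpy as np

# Generic sketch (not the group's analysis code): the entanglement
# entropy of one qubit, S = -Tr(rho * ln rho) of its reduced density
# matrix. An unentangled product state gives S = 0; a maximally
# entangled Bell state gives the maximum, S = ln 2.

def one_qubit_entropy(state):
    """Von Neumann entropy of the first qubit of a two-qubit pure state."""
    psi = np.asarray(state, dtype=complex).reshape(2, 2)  # qubit A x qubit B
    rho_a = psi @ psi.conj().T                            # reduced density matrix of A
    evals = np.linalg.eigvalsh(rho_a)
    evals = evals[evals > 1e-12]                          # drop numerical zeros
    return float(-np.sum(evals * np.log(evals)))

product = np.kron([1, 0], [1, 0])                                     # |00>
bell = (np.kron([1, 0], [1, 0]) + np.kron([0, 1], [0, 1])) / np.sqrt(2)

print(one_qubit_entropy(product))   # ~0: no entanglement
print(one_qubit_entropy(bell))      # ~0.693 = ln 2: maximal entanglement
```

In the experiment this entropy is tracked qubit by qubit over time; regions where it stays low correspond to the islands of regular motion on the classical map.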

“There’s a very clear connection between entanglement and chaos in these two pictures,” said Neill. “And it turns out that thermalization is the thing that connects chaos and entanglement. It turns out that they are actually the driving forces behind thermalization.

“What we realize is that in almost any quantum system, including on quantum computers, if you just let it evolve and you start to study what happens as a function of time, it’s going to thermalize,” added Neill, referring to the quantum-level equilibration. “And this really ties together the intuition between classical thermalization and chaos and how it occurs in quantum systems that entangle.”

The study’s findings have fundamental implications for quantum computing. At the level of three qubits, the computation is relatively simple, said Roushan, but as researchers push to build increasingly sophisticated and powerful quantum computers that incorporate more qubits to study highly complex problems that are beyond the ability of classical computing — such as those in the realms of machine learning, artificial intelligence, fluid dynamics or chemistry — a quantum processor optimized for such calculations will be a very powerful tool.

“It means we can study things that are completely impossible to study right now, once we get to bigger systems,” said Neill.

Experimental link between quantum entanglement (left) and classical chaos (right) found using a small quantum computer. Photo Credit: Courtesy Image (Courtesy: UCSB)

Here’s a link to and a citation for the paper,

Ergodic dynamics and thermalization in an isolated quantum system by C. Neill, P. Roushan, M. Fang, Y. Chen, M. Kolodrubetz, Z. Chen, A. Megrant, R. Barends, B. Campbell, B. Chiaro, A. Dunsworth, E. Jeffrey, J. Kelly, J. Mutus, P. J. J. O’Malley, C. Quintana, D. Sank, A. Vainsencher, J. Wenner, T. C. White, A. Polkovnikov, & J. M. Martinis. Nature Physics (2016). DOI: 10.1038/nphys3830 Published online 11 July 2016

This paper is behind a paywall.