Tag Archives: NIST

More of the ‘blackest black’

There’s a very good November 11, 2019 article by Natalie Angier for the New York Times on carbon nanotubes (CNTs) and the colour black,

On a laboratory bench at the National Institute of Standards and Technology was a square tray with two black disks inside, each about the width of the top of a Dixie cup. Both disks were undeniably black, yet they didn’t look quite the same.

Solomon Woods, 49, a trim, dark-haired, soft-spoken physicist, was about to demonstrate how different they were, and how serenely voracious a black could be.

“The human eye is extraordinarily sensitive to light,” Dr. Woods said. Throw a few dozen photons its way, a few dozen quantum-sized packets of light, and the eye can readily track them.

Dr. Woods pulled a laser pointer from his pocket. “This pointer,” he said, “puts out 100 trillion photons per second.” He switched on the laser and began slowly sweeping its bright beam across the surface of the tray.

On hitting the white background, the light bounced back almost unimpeded, as rude as a glaring headlight in a rearview mirror.

The beam moved to the first black disk, a rondel of engineered carbon now more than a decade old. The light dimmed significantly, as a sizable tranche of the incident photons were absorbed by the black pigment, yet the glow remained surprisingly strong.

Finally Dr. Woods trained his pointer on the second black disk, and suddenly the laser’s brilliant beam, its brash photonic probe, simply — disappeared. Trillions of light particles were striking the black disk, and virtually none were winking back up again. It was like watching a circus performer swallow a sword, or a husband “share” your plate of French fries: Hey, where did it all go?

N.I.S.T. disk number two was an example of advanced ultra-black technology: elaborately engineered arrays of tiny carbon cylinders, or nanotubes, designed to capture and muzzle any light they encounter. Blacker is the new black, and researchers here and abroad are working to create ever more efficient light traps, which means fabricating materials that look ever darker, ever flatter, ever more ripped from the void.

The N.I.S.T. ultra-black absorbs at least 99.99 percent of the light that stumbles into its nanotube forest. But scientists at the Massachusetts Institute of Technology reported in September the creation of a carbon nanotube coating that they claim captures better than 99.995 percent of the incident light.

… The more fastidious and reliable the ultra-black, the more broadly useful it will prove to be — in solar power generators, radiometers, industrial baffles and telescopes primed to detect the faintest light fluxes as a distant planet traverses the face of its star.

Psychology and metaphors

It’s not all technical; Angier goes on to mention the psychological and metaphorical aspects,

Psychologists have gathered evidence that black is among the most metaphorically loaded of all colors, and that we absorb our often contradictory impressions about black at a young age.

Reporting earlier this year in the Quarterly Journal of Experimental Psychology, Robin Kramer and Joanne Prior of the University of Lincoln in the United Kingdom compared color associations in a group of 104 children, aged 5 to 10, with those of 100 university students.

The researchers showed subjects drawings in which a lineup of six otherwise identical images differed only in some aspect of color. The T-shirt of a boy taking a test, for example, was switched from black to blue to green to red to white to yellow. The same for a businessman’s necktie, a schoolgirl’s dress, a dog’s collar, a boxer’s gloves.

Participants were asked to link images with traits. Which boy was likeliest to cheat on the test? Which man was likely to be in charge at work? Which girl was the smartest in her class, which dog the scariest?

Again and again, among both children and young adults, black pulled ahead of nearly every color but red. Black was the color of cheating, and black was the color of cleverness. A black tie was the mark of a boss, a black collar the sign of a pit bull. Black was the color of strength and of winning. Black was the color of rage.

Art

Then, there is the world of art,

For artists, black is basal and nonnegotiable, the source of shadow, line, volume, perspective and mood. “There is a black which is old and a black which is fresh,” Ad Reinhardt, the abstract expressionist artist, said. “Lustrous black and dull black, black in sunlight and black in shadow.”

So essential is black to any aesthetic act that, as David Scott Kastan and Stephen Farthing describe in their scholarly yet highly entertaining book, “On Color,” modern artists have long squabbled over who pioneered the ultimate visual distillation: the all-black painting.

Was it the Russian Constructivist Aleksandr Rodchenko, who in 1918 created a series of eight seemingly all-black canvases? No, insisted the American artist Barnett Newman: Those works were very dark brown, not black. He, Mr. Newman, deserved credit for his 1949 opus, “Abraham,” which in 1966 he described as “the first and still the only black painting in history.”

But what about Kazimir Malevich’s “Black Square” of 1915? True, it was a black square against a white background, but the black part was the point. Then again, the English polymath Robert Fludd had engraved a black square in a white border back in 1617.

Clearly, said Alfred H. Barr, Jr., the first director of the Museum of Modern Art, “Each generation must paint its own black square.”

Structural colour

Woods and his NIST colleagues and the MIT scientists are all trying to create materials with structural colour, in this case, black. Angier goes on to discuss structural colour in nature, mentioning bird feathers and spiders as examples of where you might find superblacks. For anyone unfamiliar with structural colour, the colour is not achieved with pigment or dye but with tiny structures, usually measured at the nanoscale, on a bird’s wing, a spider’s belly, a plant leaf, etc. Structural colour does not fade or change. It is possible to destroy the structures, and with them the colour, but light and time will have no effect since it is the tiny structures and their optical properties that produce the colour. (Even after all these years, my favourite structural colour story remains a Feb. 1, 2013 article, Color from Structure, by Cristina Luiggi for The Scientist magazine. For a shorter version, I excerpted parts of Luiggi’s story for my February 7, 2013 posting.)

The examples of structural colour in Angier’s article were new to me. However, there are many, many examples elsewhere. You can find some here by using the terms ‘structural colour’ or ‘structural color’ in the blog’s search engine.

Angier’s is a really good article and I strongly recommend reading it if you have time, but I’m a little surprised she doesn’t mention Vantablack and the artistic feud. More about that in a moment.

Massachusetts Institute of Technology and a ‘blacker black’

According to MIT (the Massachusetts Institute of Technology), it has created the blackest black, also courtesy of carbon nanotubes.

The Redemption of Vanity is a work of art by MIT artist in residence Diemut Strebe that has been realized together with Brian L. Wardle, Professor of Aeronautics and Astronautics and Director of necstlab and Nano-Engineered Composite aerospace STructures (NECST) Consortium and his team Drs. Luiz Acauan and Estelle Cohen. Strebe’s residency at MIT is supported by the Center for Art, Science & Technology (CAST). Image: Diemut Strebe

What you see above, ‘The Redemption of Vanity’, was on show at the New York Stock Exchange (NYSE) from September 13 – November 29, 2019. It’s both an art piece and a demonstration of MIT’s blackest black.

There are two news releases from MIT. The first is the more technical one. From a Sept. 12, 2019 MIT news release,

With apologies to “Spinal Tap,” it appears that black can, indeed, get more black.

MIT engineers report today that they have cooked up a material that is 10 times blacker than anything that has previously been reported. The material is made from vertically aligned carbon nanotubes, or CNTs — microscopic filaments of carbon, like a fuzzy forest of tiny trees, that the team grew on a surface of chlorine-etched aluminum foil. The foil captures at least 99.995 percent* of any incoming light, making it the blackest material on record.

The researchers have published their findings today in the journal ACS-Applied Materials and Interfaces. They are also showcasing the cloak-like material as part of a new exhibit today at the New York Stock Exchange, titled “The Redemption of Vanity.”

The artwork, conceived by Diemut Strebe, an artist-in-residence at the MIT Center for Art, Science, and Technology, in collaboration with Brian Wardle, professor of aeronautics and astronautics at MIT, and his group, features a 16.78-carat natural yellow diamond from LJ West Diamonds, estimated to be worth $2 million, which the team coated with the new, ultrablack CNT material. The effect is arresting: The gem, normally brilliantly faceted, appears as a flat, black void.

Wardle says the CNT material, aside from making an artistic statement, may also be of practical use, for instance in optical blinders that reduce unwanted glare, to help space telescopes spot orbiting exoplanets.

“There are optical and space science applications for very black materials, and of course, artists have been interested in black, going back well before the Renaissance,” Wardle says. “Our material is 10 times blacker than anything that’s ever been reported, but I think the blackest black is a constantly moving target. Someone will find a blacker material, and eventually we’ll understand all the underlying mechanisms, and will be able to properly engineer the ultimate black.”

Wardle’s co-author on the paper is former MIT postdoc Kehang Cui, now a professor at Shanghai Jiao Tong University.

Into the void

Wardle and Cui didn’t intend to engineer an ultrablack material. Instead, they were experimenting with ways to grow carbon nanotubes on electrically conducting materials such as aluminum, to boost their electrical and thermal properties.

But in attempting to grow CNTs on aluminum, Cui ran up against a barrier, literally: an ever-present layer of oxide that coats aluminum when it is exposed to air. This oxide layer acts as an insulator, blocking rather than conducting electricity and heat. As he cast about for ways to remove aluminum’s oxide layer, Cui found a solution in salt, or sodium chloride.

At the time, Wardle’s group was using salt and other pantry products, such as baking soda and detergent, to grow carbon nanotubes. In their tests with salt, Cui noticed that chloride ions were eating away at aluminum’s surface and dissolving its oxide layer.

“This etching process is common for many metals,” Cui says. “For instance, ships suffer from corrosion of chlorine-based ocean water. Now we’re using this process to our advantage.”

Cui found that if he soaked aluminum foil in saltwater, he could remove the oxide layer. He then transferred the foil to an oxygen-free environment to prevent reoxidation, and finally, placed the etched aluminum in an oven, where the group carried out techniques to grow carbon nanotubes via a process called chemical vapor deposition.

By removing the oxide layer, the researchers were able to grow carbon nanotubes on aluminum, at much lower temperatures than they otherwise would, by about 100 degrees Celsius. They also saw that the combination of CNTs on aluminum significantly enhanced the material’s thermal and electrical properties — a finding that they expected.

What surprised them was the material’s color.

“I remember noticing how black it was before growing carbon nanotubes on it, and then after growth, it looked even darker,” Cui recalls. “So I thought I should measure the optical reflectance of the sample.”

“Our group does not usually focus on optical properties of materials, but this work was going on at the same time as our art-science collaborations with Diemut, so art influenced science in this case,” says Wardle.

Wardle and Cui, who have applied for a patent on the technology, are making the new CNT process freely available to any artist to use for a noncommercial art project.

“Built to take abuse”

Cui measured the amount of light reflected by the material, not just from directly overhead, but also from every other possible angle. The results showed that the material absorbed at least 99.995 percent of incoming light, from every angle. In other words, it reflected 10 times less light than all other superblack materials, including Vantablack. If the material contained bumps or ridges, or features of any kind, no matter what angle it was viewed from, these features would be invisible, obscured in a void of black.  

The researchers aren’t entirely sure of the mechanism contributing to the material’s opacity, but they suspect that it may have something to do with the combination of etched aluminum, which is somewhat blackened, with the carbon nanotubes. Scientists believe that forests of carbon nanotubes can trap and convert most incoming light to heat, reflecting very little of it back out as light, thereby giving CNTs a particularly black shade.

“CNT forests of different varieties are known to be extremely black, but there is a lack of mechanistic understanding as to why this material is the blackest. That needs further study,” Wardle says.

The material is already gaining interest in the aerospace community. Astrophysicist and Nobel laureate John Mather, who was not involved in the research, is exploring the possibility of using Wardle’s material as the basis for a star shade — a massive black shade that would shield a space telescope from stray light.

“Optical instruments like cameras and telescopes have to get rid of unwanted glare, so you can see what you want to see,” Mather says. “Would you like to see an Earth orbiting another star? We need something very black. … And this black has to be tough to withstand a rocket launch. Old versions were fragile forests of fur, but these are more like pot scrubbers — built to take abuse.”

[Note] An earlier version of this story stated that the new material captures more than 99.96 percent of incoming light. That number has been updated to be more precise; the material absorbs at least 99.995 percent of incoming light.

Here’s an August 29, 2019 news release from MIT announcing the then-upcoming show. Usually I’d expect to see a research paper associated with this work but this time it seems to be an art exhibit only,

The MIT Center for Art, Science & Technology (CAST) and the New York Stock Exchange (NYSE) will present The Redemption of Vanity, created by artist Diemut Strebe in collaboration with MIT scientist Brian Wardle and his lab, on view at the New York Stock Exchange September 13, 2019 – November 25, 2019. For the work, a 16.78 carat natural yellow diamond valued at $2 million from L.J. West was coated using a new procedure of generating carbon nanotubes (CNTs), recently measured to be the blackest black ever created, which makes the diamond seem to disappear into an invisible void. The patented carbon nanotube technology (CNT) absorbs more than 99.96% of light and was developed by Professor Wardle and his necstlab at MIT.

“Any object covered with this CNT material loses all its plasticity and appears entirely flat, abbreviated/reduced to a black silhouette. In outright contradiction to this we see that a diamond, while made of the very same element (carbon), performs the most intense reflection of light on earth. Because of the extremely high light absorptive qualities of the CNTs, any object, in this case a large diamond coated with CNTs, becomes a kind of black hole absent of shadows,” explains Strebe. “The unification of extreme opposites in one object and the particular aesthetic features of the CNTs caught my imagination for this art project.”

“Strebe’s art-science collaboration caused us to look at the optical properties of our new CNT growth, and we discovered that these particular CNTs are blacker than all other reported materials by an order of magnitude across the visible spectrum,” says Wardle. The MIT team is offering the process for any artist to use. “We do not believe in exclusive ownership of any material or idea for any artwork and have opened our method to any artist,” say Strebe and Wardle.

“The project explores material and immaterial value attached to objects and concepts in reference to luxury, society and to art. We are presenting the literal devaluation of a diamond, which is highly symbolic and of high economic value. It presents a challenge to art market mechanisms on the one hand, while expressing at the same time questions of the value of art in a broader way. In this sense it manifests an inquiry into the significance of the value of objects of art and the art market,” says Strebe. “We are honored to present this work at The New York Stock Exchange, which I believe to be a most fitting location to consider the ideas embedded in The Redemption of Vanity.”

“The New York Stock Exchange, a center of financial and technological innovation for 227 years, is the perfect venue to display Diemut Strebe and Professor Brian Wardle’s collaboration. Their work brings together cutting-edge nanotube technology and a natural diamond, which is a symbol of both value and longevity,” said John Tuttle, NYSE Group Vice Chairman & Chief Commercial Officer.

“We welcome all scientists and artists to venture into the world of natural color diamonds. The Redemption of Vanity exemplifies the bond between art, science, and luxury. The 16-carat vivid yellow diamond in the exhibit spent millions of years in complete darkness, deep below the earth’s surface. It was only recently unearthed — a once-in-a-lifetime discovery of exquisite size and color. Now the diamond will relive its journey to darkness as it is covered in the blackest of materials. Once again, it will become a reminder that something rare and beautiful can exist even in darkness,” said Larry West.

The “disappearing” diamond in The Redemption of Vanity is a $2 Million Fancy Vivid Yellow SI1 (GIA), Radiant shape, from color diamond specialist, L.J. West Diamonds Inc. of New York.

The Redemption of Vanity, conceived by Diemut Strebe, has been realized with Brian L. Wardle, Professor of Aeronautics and Astronautics and Director of necstlab and Nano-Engineered Composite aerospace STructures (NECST) Consortium and his team Drs. Luiz Acauan and Estelle Cohen, in conjunction with Strebe’s residency at MIT supported by the Center for Art, Science & Technology (CAST).

ABOUT THE ARTISTS

Diemut Strebe is a conceptual artist based in Boston, MA and an MIT CAST Visiting Artist. She has collaborated with several MIT faculty, including Noam Chomsky and Robert Langer on Sugababe (2014), Litmus (2014) and Yeast Expression (2015); Seth Lloyd and Dirk Englund on Wigner’s Friends (2014); Alan Guth on Plötzlich! (2018); researchers in William Tisdale’s Lab on The Origin of the Works of Art (2018); Regina Barzilay and Elchanan Mossel on The Prayer (2019); and Ken Kamrin and John Brisson on The Gymnast (2019). Strebe is represented by the Ronald Feldman Gallery.

Brian L. Wardle is a Professor of Aeronautics and Astronautics at MIT and the director of the necstlab research group and MIT’s Nano-Engineered Composite aerospace STructures (NECST) Consortium. Wardle previously worked with CAST Visiting Artist Trevor Paglen on The Last Pictures project (2012).

ABOUT THE MIT CENTER FOR ART, SCIENCE & TECHNOLOGY

A major cross-school initiative, the MIT Center for Art, Science & Technology (CAST) creates new opportunities for art, science and technology to thrive as interrelated, mutually informing modes of exploration, knowledge and discovery. CAST’s multidisciplinary platform presents performing and visual arts programs, supports research projects for artists working with science and engineering labs, and sponsors symposia, classes, workshops, design studios, lectures and publications. The Center is funded in part by a generous grant from the Andrew W. Mellon Foundation. Evan Ziporyn is the Faculty Director and Leila W. Kinney is the Executive Director. Since its inception in 2012, CAST has been the catalyst for more than 150 artist residencies and collaborative projects with MIT faculty and students, including numerous cross-disciplinary courses, workshops, concert series, multimedia projects, lectures and symposia. The visiting artists program is a cornerstone of CAST’s activities, which encourages cross-fertilization among disciplines and intensive interaction with MIT’s faculty and students. More info at https://arts.mit.edu/cast/.

HISTORY OF VISITING ARTISTS AT MIT

Since the late 1960s, MIT has been a leader in integrating the arts and pioneering a model for collaboration among artists, scientists and engineers in a research setting. CAST’s Visiting Artists Program brings internationally acclaimed artists to engage with MIT’s creative community in ways that are mutually enlightening for the artists and for faculty, students and research staff at the Institute. Artists who have worked extensively at MIT include Mel Chin, Olafur Eliasson, Rick Lowe, Vik Muniz, Trevor Paglen, Tomás Saraceno, Maya Beiser, Agnieszka Kurant, and Anicka Yi.

ABOUT L.J. WEST DIAMONDS

L.J. West Diamonds is a three-generation natural color diamond wholesaler founded in the late 1970s by Larry J. West and based in New York City. L.J. West has established itself as one of the world’s prominent houses for some of the most rare and important exotic natural fancy color diamonds to have ever been unearthed. This collection includes a vast color spectrum of rare pink, blue, yellow, green, orange and red diamonds. L.J. West is an expert in every phase of the jewelry process, from sourcing to the cutting, polishing and final design. Each exceptional jewel is carefully set to become a unique work of art. The Redemption of Vanity is on view at the New York Stock Exchange by appointment only.

Press viewing: September 13, 2019 at 3pm. New York Stock Exchange, 11 Wall Street, New York, NY 10005. RSVP required. Please check in at the blue tent at 2 Broad Street (at the corner of Wall and Broad Streets). All guests are required to show a government-issued photo ID and go through airport-like security upon entering the NYSE. NYSE follows a business casual dress code; jeans & sneakers are not permitted.

No word yet if there will be other showings.

An artistic feud (of sorts)

Earlier this year, I updated a story on Vantablack. It was the blackest black, blocking 99.8% of light, when I featured it in a March 14, 2016 posting. The UK company making the announcement, Surrey NanoSystems, then laid the groundwork for an artistic feud when it granted exclusive rights to its carbon nanotube-based coating, Vantablack, to Sir Anish Kapoor, as mentioned here in an April 16, 2016 posting.

This exclusivity outraged some artists, notably Stuart Semple. In his first act of defiance, he created the pinkest pink. Next came a Kickstarter campaign to fund Semple’s blackest black, which would be available to all artists except Anish Kapoor. You can read all about the pinkest pink and blackest black as per Semple in my February 21, 2019 posting. You can also get a bit of an update in an Oct. 17, 2019 Stuart Semple profile by Berenice Baker for Verdict,

… so I managed to hire a scientist, Jemima, to work in the studio with me. She got really close to a super black, and we made our own pigment to this recipe and it was awesome, but we couldn’t afford to put it into manufacture because it cost £25,000.”

Semple launched a Kickstarter campaign and was amazed to raise half a million pounds, making it the second most-supported art Kickstarter of all time.

The ‘race to the blackest’ is well underway, with MIT researchers recently announcing a carbon nanotube-based black whose light absorption they tested by coating a diamond. But Semple is determined that his black should be affordable by all artists and work like a paint, not only perform in laboratory conditions. He’s currently working with Jemima and two chemists to upgrade the recipe for Black 3.2.

I don’t know how Semple arrived at his blackest black. I think it’s unlikely that he achieved the result by working with carbon nanotubes since my understanding is that CNTs aren’t that easy to produce.

Finally

Interesting, eh? In just a few years scientists have progressed from achieving a 99.8% black to a 99.995% black. It doesn’t seem like that big a difference to me, but with Solomon Woods, at the beginning of this post, making the point that our eyes are very sensitive to light, an artistic feud, and a study uncovering deep emotional associations with the colour, getting the blackest black is a much more artistically fraught endeavour than I had imagined.
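To put those percentages in perspective, here is a quick back-of-the-envelope calculation (my own, not from any of the sources quoted above). It uses the 100-trillion-photons-per-second laser pointer figure Dr. Woods mentions and the absorption figures reported for each coating; everything else is a simplification.

```python
# Rough comparison of 'blackest black' coatings, using the laser pointer
# output quoted by Dr. Woods. Assumes all light that is not absorbed
# comes back out, which is a simplification.

PHOTONS_PER_SECOND = 100e12  # the laser pointer figure quoted above

coatings = {
    "Vantablack (99.8% figure from the 2016 coverage)": 0.998,
    "NIST ultra-black (at least 99.99%)": 0.9999,
    "MIT CNT-on-aluminum (at least 99.995%)": 0.99995,
}

for name, absorption in coatings.items():
    reflected = PHOTONS_PER_SECOND * (1 - absorption)
    print(f"{name}: about {reflected:.1e} photons/s reflected")
```

Going from 99.8% to 99.995% absorption cuts the stray light by a factor of 40, from roughly 200 billion to roughly 5 billion reflected photons per second, which is a substantial change for an eye that can register a few dozen photons.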

A solution to the problem of measuring nanoparticles

As you might expect from the US National Institute of Standards and Technology (NIST), this research concerns measurement techniques. From an August 15, 2019 news item on Nanowerk (Note: Links have been removed),

Tiny nanoparticles play a gargantuan role in modern life, even if most consumers are unaware of their presence. They provide essential ingredients in sunscreen lotions, prevent athlete’s foot fungus in socks, and fight microbes on bandages. They enhance the colors of popular candies and keep the powdered sugar on doughnuts powdery. They are even used in advanced drugs that target specific types of cells in cancer treatments.

When chemists analyze a sample, however, it is challenging to measure the sizes and quantities of these particles — which are often 100,000 times smaller than the thickness of a piece of paper. Technology offers many options for assessing nanoparticles, but experts have not reached a consensus on which technique is best.

In a new paper from the National Institute of Standards and Technology (NIST) and collaborating institutions, researchers have concluded that measuring the range of sizes in nanoparticles — instead of just the average particle size — is optimal for most applications.

An August 14, 2019 NIST news release (also received via email and on EurekAlert), which originated the news item, delves further into the research,

“It seems like a simple choice,” said NIST’s Elijah Petersen, the lead author of the paper, which was published today in Environmental Science: Nano. “But it can have a big impact on the outcome of your assessment.”

As with many measurement questions, precision is key. Exposure to a certain amount of some nanoparticles could have adverse effects. Pharmaceutical researchers often need exactitude to maximize a drug’s efficacy. And environmental scientists need to know, for example, how many nanoparticles of gold, silver or titanium could potentially cause a risk to organisms in soil or water.

Using more nanoparticles than needed in a product because of inconsistent measurements could also waste money for manufacturers.

Although they might sound ultramodern, nanoparticles are neither new nor based solely on high-tech manufacturing processes. A nanoparticle is really just a submicroscopic particle that measures less than 100 nanometers on at least one of its dimensions. It would be possible to place hundreds of thousands of them onto the head of a pin. They are exciting to researchers because many materials act differently at the nanometer scale than they do at larger scales, and nanoparticles can be made to do lots of useful things.

Nanoparticles have been in use since the days of ancient Mesopotamia [emphasis mine], when ceramic artists used extremely small bits of metal to decorate vases and other vessels. In fourth-century Rome, glass artisans ground metal into tiny particles to change the color of their wares under different lighting. These techniques were forgotten for a while but rediscovered in the 1600s by resourceful manufacturers for glassmaking [emphasis mine] again. Then, in the 1850s, scientist Michael Faraday extensively researched ways to use various kinds of wash mixes to change the performance of gold particles.

Modern nanoparticle research advanced quickly in the mid-20th century due to technological innovations in optics. Being able to see the individual particles and study their behavior expanded the possibilities for experimentation. The largest advances came, however, after experimental nanotechnology took off in the 1990s. Suddenly, the behavior of single particles of gold and many other substances could be closely examined and manipulated. Discoveries about the ways that small amounts of a substance would reflect light, absorb light, or change in behavior were numerous, leading to the incorporation of nanoparticles into many more products.

Debates have since followed about their measurement. When assessing the response of cells or organisms to nanoparticles, some researchers prefer measuring particle number concentrations (sometimes called PNCs by scientists). Many find PNCs challenging since extra formulas must be employed when determining the final measurement. Others prefer measuring mass or surface area concentrations.

PNCs are often used for characterizing metals in chemistry. The situation for nanoparticles is inherently more complex, however, than it is for dissolved organic or inorganic substances because unlike dissolved chemicals, nanoparticles can come in a wide variety of sizes and sometimes stick together when added to testing materials.

“If you have a dissolved chemical, it’s always going to have the same molecular formula, by definition,” Petersen says. “Nanoparticles don’t just have a certain number of atoms, however. Some will be 9 nanometers, some will be 11, some might be 18, and some might be 3.”

The problem is that each of those particles may be fulfilling an important role. While a simple estimate of particle number is perfectly fine for some industrial applications, therapeutic applications require much more robust measurement. In the case of cancer therapies, for example, each particle, no matter how big or small, may be delivering a needed antidote. And just as with any other kind of dosage, nanoparticle dosage must be exact in order to be safe and effective.

Using the range of particle sizes to calculate the PNC will be the most helpful approach in most cases, said Petersen. The size distribution doesn’t use a mean or an average but notes the complete distribution of sizes of particles so that formulas can be used to effectively discover how many particles are in a sample.

But no matter which approach is used, researchers need to make note of it in their papers, for the sake of comparability with other studies. “Don’t assume that different approaches will give you the same result,” he said.

Petersen adds that he and his colleagues were surprised by how much the coatings on nanoparticles could impact measurement. Some coatings, he noted, can have a positive electrical charge, causing clumping.

Petersen worked in collaboration with researchers from federal laboratories in Switzerland, and with scientists from 3M who have previously made many nanoparticle measurements for use in industrial settings. Researchers from Switzerland, like those in much of the rest of Europe, are keen to learn more about measuring nanoparticles because PNCs are required in many regulatory situations. There hasn’t been much information on which techniques are best or more likely to yield the most precise results across many applications.

“Until now we didn’t even know if we could find agreement among labs about particle number concentrations,” Petersen says. “They are complex. But now we are beginning to see it can be done.”
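To make the distinction between mass concentration and particle number concentration a little more concrete, here is a rough sketch (my own illustration, not taken from the paper) of converting a mass concentration of spherical nanoparticles into a PNC, once using only the mean diameter and once using the full size distribution. The gold density is real; the mass concentration and the lognormal size distribution are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
DENSITY = 19_300.0       # kg/m^3, gold
MASS_CONC = 1e-6         # kg/m^3 of suspension (1 mg/L), made up

# A hypothetical measured size distribution (diameters in metres).
diameters = rng.lognormal(mean=np.log(10e-9), sigma=0.4, size=10_000)

def sphere_volume(d):
    """Volume of a sphere of diameter d."""
    return np.pi * d**3 / 6

# Naive estimate: pretend every particle has the mean diameter.
pnc_from_mean = MASS_CONC / (DENSITY * sphere_volume(diameters.mean()))

# Distribution-aware estimate: use the mean particle volume, which is
# dominated by the larger particles in the tail of the distribution.
pnc_from_distribution = MASS_CONC / (DENSITY * sphere_volume(diameters).mean())

print(f"PNC from mean diameter only:  {pnc_from_mean:.2e} particles/m^3")
print(f"PNC from full distribution:   {pnc_from_distribution:.2e} particles/m^3")
```

For this made-up distribution the two estimates differ by roughly 60 percent, and the gap grows as the size distribution broadens, which is the point about reporting the whole distribution rather than an average.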

I love the reference to glassmaking and ancient Mesopotamia. Getting back to current times, here’s a link to and a citation for the paper,

Determining what really counts: modeling and measuring nanoparticle number concentrations by Elijah J. Petersen, Antonio R. Montoro Bustos, Blaza Toman, Monique E. Johnson, Mark Ellefson, George C. Caceres, Anna Lena Neuer, Qilin Chan, Jonathan W. Kemling, Brian Mader, Karen Murphy and Matthias Roesslein. Environmental Science: Nano. Published August 14, 2019. DOI: 10.1039/c9en00462a

This paper is behind a paywall.

Analyzing a buckyball’s (buckminsterfullerene) quantum structure

The work was done jointly by the US National Institute of Standards and Technology (NIST) and JILA (Joint Institute for Laboratory Astrophysics), which is operated ‘jointly’ by NIST and the University of Colorado. On to buckyballs, a nickname for buckminsterfullerenes or C60.

From a January 28, 2019 news item on ScienceDaily,

JILA researchers have measured hundreds of individual quantum energy levels in the buckyball, a spherical cage of 60 carbon atoms. It’s the largest molecule that has ever been analyzed at this level of experimental detail in the history of quantum mechanics. Fully understanding and controlling this molecule’s quantum details could lead to new scientific fields and applications, such as an entire quantum computer contained in a single buckyball.

Caption: JILA researchers used frequency combs, or “rulers of light,” to observe individual quantum energy transitions in buckyballs. Credit: Steven Burrows/JILA

There are two types of spherical objects in the image: the smooth blue ones, which are not buckyballs, and the ridged ones, which are.

A January 28, 2019 NIST news release (also on EurekAlert), which originated the news item, describes the buckyball molecule and the research in more detail,

The buckyball, formally known as buckminsterfullerene, is extremely complex. Due to its enormous 60-atom size, the overall molecule has a staggeringly high number of ways to vibrate–at least 100,000,000,000,000,000,000,000,000 vibrational quantum states when the molecule is warm. That’s in addition to the many different energy states for the buckyball’s rotation and other properties.

As described in the January 4 [2019] issue of Science, the JILA team used an updated version of their frequency comb spectroscopy and cryogenic buffer gas cooling system to observe isolated, individual energy transitions among rotational and vibrational states in cold, gaseous buckyballs. This is the first time anyone has been able to prepare buckyballs in this form to analyze its rotations and vibrations at the quantum level.

JILA is jointly operated by the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.

Buckyballs, first discovered in 1985, have created great scientific excitement. But high-resolution spectroscopy, which can reveal the details of the molecule’s rotational and vibrational properties, didn’t work at ordinary room temperatures because the signals were too congested, NIST/JILA Fellow Jun Ye said. Low temperatures (about -138 degrees Celsius, which is -216 degrees Fahrenheit) enabled researchers to concentrate the molecules into a single rotational-vibrational quantum state at the lowest energy level and probe them with high-resolution spectroscopy.

The buckyball is the most symmetric molecule known, with a soccer-ball-like shape known as a truncated icosahedron. It is small enough to be fully understood with basic quantum mechanics principles. Yet it is large enough to reveal insights into the extreme quantum complexity that emerges in huge systems.

As an example of practical applications, buckyballs could act as a pristine network of 60 atoms. The core of each atom possesses an identical property known as “nuclear spin,” which enables it to interact magnetically with its environment. Therefore, each spin could act as a magnetically controlled quantum bit or “qubit” in a quantum computer.

“If we had a buckyball made of pure isotopic carbon-13, each atom would have a nuclear spin of 1/2, and each buckyball could serve as a 60-qubit quantum computer,” Ye said. “Of course, we don’t have such capabilities yet; we would need to first capture these buckyballs in traps.”

A key part of the new quantum revolution, a quantum computer using qubits made of atoms or other materials could potentially solve important problems that are intractable using today’s machines. NIST has a major stake in quantum science.

“There are also a lot of astrophysics connections,” Ye continued. “There are abundant buckyball signals coming from remote carbon stars,” so the new data will enable scientists to better understand the universe.

After they measured the quantum energy levels, the JILA researchers collected statistics on buckyballs’ nuclear spin values. They confirmed that all 60 atoms were indistinguishable, or virtually identical. Precise measurements of the buckyball’s transition energies between individual quantum states revealed its atoms interacted strongly with one another, providing insights into the complexities of its molecular structure and the forces between atoms.

For the experiments, an oven converted a solid sample of material into gaseous buckyballs. These hot molecules flowed into a cell (container) anchored to a cryogenic cold apparatus, such that the molecules were cooled by collisions with cold argon gas atoms. Then laser light at precise frequencies was aimed at the cold gas molecules, and researchers measured how much light was absorbed. The observed structure in the infrared spectrum encoded details of the quantum-mechanical energy-level structure.

The laser light was produced by an optical frequency comb, or “ruler of light,” and aimed into an optical cavity surrounding the cold cell to enhance the absorption signals. The comb contained about 1000 “teeth” at optical frequencies spanning the full band of buckyball vibrations. The comb light was generated from a single fiber laser.

Here’s a link to and a citation for the paper,

Rovibrational quantum state resolution of the C60 fullerene by P. Bryan Changala, Marissa L. Weichman, Kevin F. Lee, Martin E. Fermann, Jun Ye. Science 04 Jan 2019: Vol. 363, Issue 6422, pp. 49-54 DOI: 10.1126/science.aav2616

This paper appears to be open access.

Quantum guitar music

The sound quality the physicists at the US National Institute of Standards and Technology (NIST) have achieved is quite good compared to carbon nanotube radio. If you’re curious, the audio file is embedded in both the American Institute of Physics (AIP) June 18, 2019 news release and the copy on EurekAlert,

It sounds like an old-school vinyl record, but the distinctive crackle in the music streamed into Chris Holloway’s laboratory is atomic in origin. The group at the National Institute of Standards and Technology, Boulder, Colorado, spent a long six years finding a way to directly measure electric fields using atoms, so who can blame them for then having a little fun with their new technology?

“My vision is to cut a CD in the lab — our studio — at some point and have the first CD recorded with Rydberg atoms,” said Holloway. While he doesn’t expect the atomic-recording’s lower sound quality to replace digital music recordings, the team of research scientists is considering how this “entertaining” example of atomic sensing could be applied in communication devices of the future.

“Atom-based antennas might give us a better way of picking up audio data in the presence of noise, potentially even the very weak signals transmitted in deep space communications,” said Holloway, who describes his atomic receiver in AIP Advances, from AIP Publishing.

The atoms in question — Rydberg atoms — are atoms excited by lasers into a high energy state that responds in a measurable way to radio waves (an electric field). After figuring out how to measure electric field strength using the Rydberg atoms, Holloway said it was a relatively simple step to apply the same atoms to record and play back music — starting with Holloway’s own guitar improvisations in A minor.

They encoded the music onto radio waves in much the same way cellphone conversations are encoded onto radio waves for transmission. The atoms respond to these radio waves, and in turn, the laser beams shined through the Rydberg atoms are affected. These changes are picked up on a photodetector, which feeds an electric signal into the speaker or computer — and voila! The atomic radio was born.

The team used their quantum system to pick up stereo — with one atomic species recording the instrumental and another the vocal at two different sets of laser frequencies. They selected a Queen track — “Under Pressure” — to test if their system could handle Freddie Mercury’s extensive vocal range.

“One of the reasons for cutting stereo was to show that this one receiver can pick up two channels simultaneously, which is difficult with conventional receivers,” said Holloway, who explained that although it is the early days for atomic communications, there is potential to use this to improve the security of communications.

For now, Holloway’s team are staying tuned into atomic radio as they try to determine how weak a signal the Rydberg atoms can detect, and what data transfer speeds can be achieved.

They are not forgetting the atomic record they want to produce, with which they hope to inspire the next generation of quantum scientists.
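The encoding step Holloway describes is essentially the amplitude modulation familiar from AM radio. Here is a toy sketch of that step (mine, not from the paper), with the Rydberg vapour, lasers and photodetector replaced by a few lines of NumPy; the carrier and sample-rate numbers are scaled way down from the frequencies used in the real experiment so the simulation stays small.

```python
import numpy as np

FS = 1_000_000                     # sample rate, Hz (toy value)
CARRIER_HZ = 100_000               # toy carrier; the real one is far higher in frequency
t = np.arange(0, 0.02, 1 / FS)     # 20 ms of signal

audio = 0.5 * np.sin(2 * np.pi * 440 * t)      # a 440 Hz "guitar" tone
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
modulated = (1 + audio) * carrier               # amplitude modulation

# Envelope detection: rectify, then average over one carrier period.
window = int(FS / CARRIER_HZ)
envelope = np.convolve(np.abs(modulated), np.ones(window) / window, mode="same")
recovered = envelope * np.pi / 2 - 1            # |cos| averages to 2/pi

interior = slice(window, -window)               # ignore edge effects
error = np.max(np.abs(recovered[interior] - audio[interior]))
print(f"max recovery error away from the edges: {error:.3f}")
```

In the NIST receiver the demodulation happens physically rather than numerically: the atoms’ response to the field changes how much laser light gets through the vapour, and the photodetector reads out that envelope directly.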

Here’s a link to and a citation for the paper,

A “real-time” guitar recording using Rydberg atoms and electromagnetically induced transparency: Quantum physics meets music by Christopher L. Holloway, Matthew T. Simons, Abdulaziz H. Haddab, Carl J. Williams, and Maxwell W. Holloway. AIP Advances volume 9 (6), 065110 (2019) DOI: 10.1063/1.5099036 https://doi.org/10.1063/1.5099036 Published online: 18 June 2019

This paper is open access and, if you want to hear the guitar music, click on the AIP news release or EurekAlert links at the top of this posting.

Researchers, manufacturers, and administrators need to consider shared quality control challenges to advance the nanoparticle manufacturing industry

Manufacturing remains a bit of an issue where nanotechnology is concerned due to the difficulties of producing nanoparticles of a consistent size and type,


Electron micrograph showing gallium arsenide nanoparticles of varying shapes and sizes. Such heterogeneity [variation]  can increase costs and limit profits when making nanoparticles into products. A new NIST study recommends that researchers, manufacturers and administrators work together to solve this, and other common problems, in nanoparticle manufacturing. Credit: A. Demotiere, E. Shevchenko/Argonne National Laboratory

The US National Institute of Standards and Technology (NIST) has produced a paper focusing on how nanoparticle manufacturing might become more effective. From an August 22, 2018 news item on ScienceDaily,

Nanoparticle manufacturing, the production of material units less than 100 nanometers in size (100,000 times smaller than a marble), is proving the adage that “good things come in small packages.” Today’s engineered nanoparticles are integral components of everything from the quantum dot nanocrystals coloring the brilliant displays of state-of-the-art televisions to the miniscule bits of silver helping bandages protect against infection. However, commercial ventures seeking to profit from these tiny building blocks face quality control issues that, if unaddressed, can reduce efficiency, increase production costs and limit commercial impact of the products that incorporate them.

To help overcome these obstacles, the National Institute of Standards and Technology (NIST) and the nonprofit World Technology Evaluation Center (WTEC) advocate that nanoparticle researchers, manufacturers and administrators “connect the dots” by considering their shared challenges broadly and tackling them collectively rather than individually. This includes transferring knowledge across disciplines, coordinating actions between organizations and sharing resources to facilitate solutions.

The recommendations are presented in a new paper in the journal ACS Applied Nano Materials.

An August 22, 2018 NIST news release, which originated the news item, describes how the authors of the ACS (American Chemical Society) Applied Nano Materials paper developed their recommendations,

“We looked at the big picture of nanoparticle manufacturing to identify problems that are common for different materials, processes and applications,” said NIST physical scientist Samuel Stavis, lead author of the paper. “Solving these problems could advance the entire enterprise.”

The new paper provides a framework to better understand these issues. It is the culmination of a study initiated by a workshop organized by NIST that focused on the fundamental challenge of reducing or mitigating heterogeneity, the inadvertent variations in nanoparticle size, shape and other characteristics that occur during their manufacture.

“Heterogeneity can have significant consequences in nanoparticle manufacturing,” said NIST chemical engineer and co-author Jeffrey Fagan.

In their paper, the authors noted that the most profitable innovations in nanoparticle manufacturing minimize heterogeneity during the early stages of the operation, reducing the need for subsequent processing. This decreases waste, simplifies characterization and improves the integration of nanoparticles into products, all of which save money.

The authors illustrated the point by comparing the production of gold nanoparticles and carbon nanotubes. For gold, they stated, the initial synthesis costs can be high, but the similarity of the nanoparticles produced requires less purification and characterization. Therefore, they can be made into a variety of products, such as sensors, at relatively low costs.

In contrast, the more heterogeneous carbon nanotubes are less expensive to synthesize but require more processing to yield those with desired properties. The added costs during manufacturing currently make nanotubes only practical for high-value applications such as digital logic devices.

“Although these nanoparticles and their end products are very different, the stakeholders in their manufacture can learn much from each other’s best practices,” said NIST materials scientist and co-author J. Alexander Liddle. “By sharing knowledge, they might be able to improve both seemingly disparate operations.”

Finding ways like this to connect the dots, the authors said, is critically important for new ventures seeking to transfer nanoparticle technologies from laboratory to market.

“Nanoparticle manufacturing can become so costly that funding expires before the end product can be commercialized,” said WTEC nanotechnology consultant and co-author Michael Stopa. “In our paper, we outlined several opportunities for improving the odds that new ventures will survive their journeys through this technology transfer ‘valley of death.’”

Finally, the authors considered how manufacturing challenges and innovations are affecting the ever-growing number of applications for nanoparticles, including those in the areas of electronics, energy, health care and materials.

Here’s a link to and a citation for the paper,

Nanoparticle Manufacturing – Heterogeneity through Processes to Products by Samuel M. Stavis, Jeffrey A. Fagan, Michael Stopa, and J. Alexander Liddle. ACS Appl. Nano Mater., Article ASAP DOI: 10.1021/acsanm.8b01239 Publication Date (Web): August 16, 2018

Copyright © 2018 American Chemical Society

This paper is behind a paywall.

I looked at this paper briefly and found it to give a good overview. The focus is on manufacturing and making money. I imagine any discussion about the life cycle of the materials and possible environmental and health risks would have been considered ‘scope creep’.

I have two postings that provide additional information about manufacturing concerns, my February 10, 2014 posting:  ‘Valley of Death’, ‘Manufacturing Middle’, and other concerns in new government report about the future of nanomanufacturing in the US and my September 5, 2016 posting: An examination of nanomanufacturing and nanofabrication.

Announcing the ‘memtransistor’

Yet another advance toward ‘brainlike’ computing (how many times have I written this or a variation thereof in the last 10 years? See: Dexter Johnson’s take on the situation at the end of this post): Northwestern University announced their latest memristor research in a February 21, 2018 news item on Nanowerk,

Computer algorithms might be performing brain-like functions, such as facial recognition and language translation, but the computers themselves have yet to operate like brains.

“Computers have separate processing and memory storage units, whereas the brain uses neurons to perform both functions,” said Northwestern University’s Mark C. Hersam. “Neural networks can achieve complicated computation with significantly lower energy consumption compared to a digital computer.”

A February 21, 2018 Northwestern University news release (also on EurekAlert), which originated the news item, provides more information about the latest work from this team,

In recent years, researchers have searched for ways to make computers more neuromorphic, or brain-like, in order to perform increasingly complicated tasks with high efficiency. Now Hersam, a Walter P. Murphy Professor of Materials Science and Engineering in Northwestern’s McCormick School of Engineering, and his team are bringing the world closer to realizing this goal.

The research team has developed a novel device called a “memtransistor,” which operates much like a neuron by performing both memory and information processing. With combined characteristics of a memristor and transistor, the memtransistor also encompasses multiple terminals that operate more similarly to a neural network.

Supported by the National Institute of Standards and Technology and the National Science Foundation, the research was published online today, February 22 [2018], in Nature. Vinod K. Sangwan and Hong-Sub Lee, postdoctoral fellows advised by Hersam, served as the paper’s co-first authors.

The memtransistor builds upon work published in 2015, in which Hersam, Sangwan, and their collaborators used single-layer molybdenum disulfide (MoS2) to create a three-terminal, gate-tunable memristor for fast, reliable digital memory storage. Memristors, short for “memory resistors,” are resistors in a circuit that “remember” the voltage previously applied to them. Typical memristors are two-terminal electronic devices, which can only control one voltage channel. By transforming it into a three-terminal device, Hersam paved the way for memristors to be used in more complex electronic circuits and systems, such as neuromorphic computing.

To develop the memtransistor, Hersam’s team again used atomically thin MoS2 with well-defined grain boundaries, which influence the flow of current. Similar to the way fibers are arranged in wood, atoms are arranged into ordered domains – called “grains” – within a material. When a large voltage is applied, the grain boundaries facilitate atomic motion, causing a change in resistance.

“Because molybdenum disulfide is atomically thin, it is easily influenced by applied electric fields,” Hersam explained. “This property allows us to make a transistor. The memristor characteristics come from the fact that the defects in the material are relatively mobile, especially in the presence of grain boundaries.”

But unlike his previous memristor, which used individual, small flakes of MoS2, Hersam’s memtransistor makes use of a continuous film of polycrystalline MoS2 that comprises a large number of smaller flakes. This enabled the research team to scale up the device from one flake to many devices across an entire wafer.

“When length of the device is larger than the individual grain size, you are guaranteed to have grain boundaries in every device across the wafer,” Hersam said. “Thus, we see reproducible, gate-tunable memristive responses across large arrays of devices.”

After fabricating memtransistors uniformly across an entire wafer, Hersam’s team added additional electrical contacts. Typical transistors and Hersam’s previously developed memristor each have three terminals. In their new paper, however, the team realized a seven-terminal device, in which one terminal controls the current among the other six terminals.

“This is even more similar to neurons in the brain,” Hersam said, “because in the brain, we don’t usually have one neuron connected to only one other neuron. Instead, one neuron is connected to multiple other neurons to form a network. Our device structure allows multiple contacts, which is similar to the multiple synapses in neurons.”

Next, Hersam and his team are working to make the memtransistor faster and smaller. Hersam also plans to continue scaling up the device for manufacturing purposes.

“We believe that the memtransistor can be a foundational circuit element for new forms of neuromorphic computing,” he said. “However, making dozens of devices, as we have done in our paper, is different than making a billion, which is done with conventional transistor technology today. Thus far, we do not see any fundamental barriers that will prevent further scale up of our approach.”
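If memristors are new to you, the toy model below may help; it is my own sketch of the generic idea described in the release, not the MoS2 device physics. The element’s conductance depends on an internal state that drifts with the history of applied voltage, which is the ‘memory’ part; the actual memtransistor adds a gate and multiple terminals on top of this kind of behaviour.

```python
# Toy memristor: conductance set by an internal state variable w that
# drifts with the history of applied voltage (all values are made up).

G_MIN, G_MAX = 1e-6, 1e-3   # conductance bounds, siemens
RATE = 2.0                  # how quickly the state responds, 1/(V*s)
DT = 0.01                   # time step per pulse, s

def step(w, voltage):
    """Advance the internal state by one pulse and keep it in [0, 1]."""
    return min(max(w + RATE * voltage * DT, 0.0), 1.0)

def conductance(w):
    return G_MIN + w * (G_MAX - G_MIN)

w = 0.1
pulses = [+5.0] * 10 + [-5.0] * 5   # ten "write" pulses, then five "erase" pulses
for i, v in enumerate(pulses, 1):
    w = step(w, v)
    print(f"after pulse {i:2d} ({v:+.0f} V): conductance = {conductance(w):.2e} S")
```

The final conductance depends on the whole pulse history rather than on the last voltage alone, which is the non-volatile, synapse-like behaviour the release describes; in the memtransistor a gate voltage tunes this response and the extra terminals let one input modulate several outputs at once.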

The researchers have made this illustration available,

Caption: This is the memtransistor symbol overlaid on an artistic rendering of a hypothetical circuit layout in the shape of a brain. Credit: Hersam Research Group

Here’s a link to and a citation for the paper,

Multi-terminal memtransistors from polycrystalline monolayer molybdenum disulfide by Vinod K. Sangwan, Hong-Sub Lee, Hadallia Bergeron, Itamar Balla, Megan E. Beck, Kan-Sheng Chen, & Mark C. Hersam. Nature volume 554, pages 500–504 (22 February 2018) doi:10.1038/nature25747 Published online: 21 February 2018

This paper is behind a paywall.

The team’s earlier work referenced in the news release was featured here in an April 10, 2015 posting.

Dexter Johnson

From a Feb. 23, 2018 posting by Dexter Johnson on the Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website),

While this all seems promising, one of the big shortcomings in neuromorphic computing has been that it doesn’t mimic the brain in a very important way. In the brain, for every neuron there are a thousand synapses—the electrical signal sent between the neurons of the brain. This poses a problem because a transistor only has a single terminal, hardly an accommodating architecture for multiplying signals.

Now researchers at Northwestern University, led by Mark Hersam, have developed a new device that combines memristors—two-terminal non-volatile memory devices based on resistance switching—with transistors to create what Hersam and his colleagues have dubbed a “memtransistor” that performs both memory storage and information processing.

This most recent research builds on work that Hersam and his team conducted back in 2015 in which the researchers developed a three-terminal, gate-tunable memristor that operated like a kind of synapse.

While this work was recognized as mimicking the low-power computing of the human brain, critics didn’t really believe that it was acting like a neuron since it could only transmit a signal from one artificial neuron to another. This was far short of a human brain that is capable of making tens of thousands of such connections.

“Traditional memristors are two-terminal devices, whereas our memtransistors combine the non-volatility of a two-terminal memristor with the gate-tunability of a three-terminal transistor,” said Hersam to IEEE Spectrum. “Our device design accommodates additional terminals, which mimic the multiple synapses in neurons.”

Hersam believes that these unique attributes of these multi-terminal memtransistors are likely to present a range of new opportunities for non-volatile memory and neuromorphic computing.

If you have the time and the interest, Dexter’s post provides more context.

Less is more—a superconducting synapse

It seems the US National Institute of Standards and Technology (NIST) is more deeply invested in developing artificial brains than I had realized (See: April 17, 2018 posting). A January 26, 2018 NIST news release on EurekAlert describes the organization’s latest foray into the field,

Researchers at the National Institute of Standards and Technology (NIST) have built a superconducting switch that “learns” like a biological system and could connect processors and store memories in future computers operating like the human brain.

The NIST switch, described in Science Advances, is called a synapse, like its biological counterpart, and it supplies a missing piece for so-called neuromorphic computers. Envisioned as a new type of artificial intelligence, such computers could boost perception and decision-making for applications such as self-driving cars and cancer diagnosis.

A synapse is a connection or switch between two brain cells. NIST’s artificial synapse–a squat metallic cylinder 10 micrometers in diameter–is like the real thing because it can process incoming electrical spikes to customize spiking output signals. This processing is based on a flexible internal design that can be tuned by experience or its environment. The more firing between cells or processors, the stronger the connection. Both the real and artificial synapses can thus maintain old circuits and create new ones.

Even better than the real thing, the NIST synapse can fire much faster than the human brain–1 billion times per second, compared to a brain cell’s 50 times per second–using just a whiff of energy, about one ten-thousandth as much as a human synapse. In technical terms, the spiking energy is less than 1 attojoule, lower than the background energy at room temperature and on a par with the chemical energy bonding two atoms in a molecule.

“The NIST synapse has lower energy needs than the human synapse, and we don’t know of any other artificial synapse that uses less energy,” NIST physicist Mike Schneider said.
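Those energy figures are easy to sanity-check with back-of-the-envelope arithmetic. In the sketch below, the bond energy is a generic textbook value for a carbon-carbon bond rather than anything from the paper; the other numbers come straight from the news release,

```python
AVOGADRO = 6.022e23

spike_energy_aj = 1.0                            # "less than 1 attojoule" per spike
bond_energy_aj = 347e3 / AVOGADRO * 1e18         # ~347 kJ/mol C-C bond, converted to attojoules
human_synapse_aj = spike_energy_aj * 1e4         # "one ten-thousandth as much" implies this
speed_ratio = 1e9 / 50                           # 1 billion spikes/s vs ~50 spikes/s

print(f"typical single chemical bond : ~{bond_energy_aj:.2f} aJ")
print(f"implied human synapse energy : ~{human_synapse_aj / 1e3:.0f} fJ per event")
print(f"firing-rate advantage        : ~{speed_ratio:,.0f}x")
```

Ten femtojoules per event is in the range usually quoted for biological synapses in the neuromorphic literature, so the ‘one ten-thousandth’ comparison hangs together.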

The new synapse would be used in neuromorphic computers made of superconducting components, which can transmit electricity without resistance, and therefore, would be more efficient than other designs based on semiconductors or software. Data would be transmitted, processed and stored in units of magnetic flux. Superconducting devices mimicking brain cells and transmission lines have been developed, but until now, efficient synapses–a crucial piece–have been missing.

The brain is especially powerful for tasks like context recognition because it processes data both in sequence and simultaneously and stores memories in synapses all over the system. A conventional computer processes data only in sequence and stores memory in a separate unit.

The NIST synapse is a Josephson junction, long used in NIST voltage standards. These junctions are a sandwich of superconducting materials with an insulator as a filling. When an electrical current through the junction exceeds a level called the critical current, voltage spikes are produced. The synapse uses standard niobium electrodes but has a unique filling made of nanoscale clusters of manganese in a silicon matrix.

The nanoclusters–about 20,000 per square micrometer–act like tiny bar magnets with “spins” that can be oriented either randomly or in a coordinated manner.

“These are customized Josephson junctions,” Schneider said. “We can control the number of nanoclusters pointing in the same direction, which affects the superconducting properties of the junction.”

The synapse rests in a superconducting state, except when it’s activated by incoming current and starts producing voltage spikes. Researchers apply current pulses in a magnetic field to boost the magnetic ordering, that is, the number of nanoclusters pointing in the same direction. This magnetic effect progressively reduces the critical current level, making it easier to create a normal conductor and produce voltage spikes.

The critical current is the lowest when all the nanoclusters are aligned. The process is also reversible: Pulses are applied without a magnetic field to reduce the magnetic ordering and raise the critical current. This design, in which different inputs alter the spin alignment and resulting output signals, is similar to how the brain operates.
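That training rule is simple enough to caricature in a few lines of Python. What follows is a behavioural toy with made-up numbers and a linear link between magnetic ordering and critical current, not the device physics,

```python
def apply_pulse(order, field_on, step=0.1):
    """One training pulse: field on aligns more nanoclusters, field off undoes it."""
    order += step if field_on else -step
    return min(1.0, max(0.0, order))

def critical_current(order, ic_disordered=100e-6, ic_ordered=20e-6):
    """Critical current (amperes) falls as the nanocluster spins become more ordered."""
    return ic_disordered - (ic_disordered - ic_ordered) * order

def fires(i_input, order):
    """The junction produces voltage spikes only when the input exceeds the critical current."""
    return i_input > critical_current(order)

order = 0.0                    # start with randomly oriented spins
i_input = 60e-6                # a fixed 60 microampere input
print(fires(i_input, order))   # False: the 'untrained' junction stays superconducting
for _ in range(8):             # training pulses applied in a magnetic field
    order = apply_pulse(order, field_on=True)
print(critical_current(order), fires(i_input, order))   # lower threshold: now it fires
```

Running the same loop with field_on=False walks the ordering back down and raises the threshold again, which is the reversibility described above.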

Synapse behavior can also be tuned by changing how the device is made and its operating temperature. By making the nanoclusters smaller, researchers can reduce the pulse energy needed to raise or lower the magnetic order of the device. Raising the operating temperature slightly from minus 271.15 degrees C (minus 456.07 degrees F) to minus 269.15 degrees C (minus 452.47 degrees F), for example, results in more and higher voltage spikes.

Crucially, the synapses can be stacked in three dimensions (3-D) to make large systems that could be used for computing. NIST researchers created a circuit model to simulate how such a system would operate.

The NIST synapse’s combination of small size, superfast spiking signals, low energy needs and 3-D stacking capability could provide the means for a far more complex neuromorphic system than has been demonstrated with other technologies, according to the paper.

NIST has prepared an animation illustrating the research,

Caption: This is an animation of how NIST’s artificial synapse works. Credit: Sean Kelley/NIST

Here’s a link to and a citation for the paper,

Ultralow power artificial synapses using nanotextured magnetic Josephson junctions by Michael L. Schneider, Christine A. Donnelly, Stephen E. Russek, Burm Baek, Matthew R. Pufall, Peter F. Hopkins, Paul D. Dresselhaus, Samuel P. Benz, and William H. Rippard. Science Advances 26 Jan 2018: Vol. 4, no. 1, e1701329 DOI: 10.1126/sciadv.1701329

This paper is open access.

Samuel K. Moore in a January 26, 2018 posting on the Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) describes the research and adds a few technical explanations such as this about the Josephson junction,

In a magnetic Josephson junction, that “weak link” is magnetic. The higher the magnetic field, the lower the critical current needed to produce voltage spikes. In the device Schneider and his colleagues designed, the magnetic field is caused by 20,000 or so nanometer-scale clusters of manganese embedded in silicon. …

Moore also provides some additional links including this one to his November 29, 2017 posting where he describes four new approaches to computing including quantum computing and neuromorphic (brain-like) computing.

Thanks for the memory: the US National Institute of Standards and Technology (NIST) and memristors

In January 2018 it seemed like I was tripping across a lot of memristor stories. This came from a January 19, 2018 news item on Nanowerk,

In the race to build a computer that mimics the massive computational power of the human brain, researchers are increasingly turning to memristors, which can vary their electrical resistance based on the memory of past activity. Scientists at the National Institute of Standards and Technology (NIST) have now unveiled the long-mysterious inner workings of these semiconductor elements, which can act like the short-term memory of nerve cells.

A January 18, 2018 NIST news release (also on EurekAlert), which originated the news item, fills in the details,

Just as the ability of one nerve cell to signal another depends on how often the cells have communicated in the recent past, the resistance of a memristor depends on the amount of current that recently flowed through it. Moreover, a memristor retains that memory even when electrical power is switched off.
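That ‘memory of past current’ has a standard textbook caricature: the linear ion-drift (HP-style) model of a TiO2 memristor, in which the width of a doped region drifts in proportion to the charge that has flowed. The sketch below uses that model with illustrative parameter values; it is not the model, or the device, from the NIST/UCSB study,

```python
import numpy as np

R_ON, R_OFF = 100.0, 16e3          # resistance when fully doped / fully undoped (ohms)
D = 10e-9                          # film thickness (m)
MU_V = 1e-14                       # dopant mobility (m^2 s^-1 V^-1), illustrative
w = 0.1 * D                        # doped-region width: the device's "memory"

dt = 1e-3
resistances = []
for k in range(2000):                                     # two seconds of a 1 Hz, 1 V sine drive
    v = np.sin(2 * np.pi * 1.0 * k * dt)
    R = R_ON * (w / D) + R_OFF * (1 - w / D)              # resistance set by the current state
    i = v / R
    w = np.clip(w + MU_V * (R_ON / D) * i * dt, 0.0, D)   # state drifts with the charge flow
    resistances.append(R)

# If the drive is removed (i = 0), w stops changing and the last resistance is retained.
print(f"resistance swings between ~{min(resistances):.0f} and ~{max(resistances):.0f} ohms")
```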

But despite the keen interest in memristors, scientists have lacked a detailed understanding of how these devices work and have yet to develop a standard toolset to study them.

Now, NIST scientists have identified such a toolset and used it to more deeply probe how memristors operate. Their findings could lead to more efficient operation of the devices and suggest ways to minimize the leakage of current.

Brian Hoskins of NIST and the University of California, Santa Barbara, along with NIST scientists Nikolai Zhitenev, Andrei Kolmakov, Jabez McClelland and their colleagues from the University of Maryland’s NanoCenter in College Park and the Institute for Research and Development in Microtechnologies in Bucharest, reported the findings in a recent issue of Nature Communications.

To explore the electrical function of memristors, the team aimed a tightly focused beam of electrons at different locations on a titanium dioxide memristor. The beam knocked free some of the device’s electrons, which formed ultrasharp images of those locations. The beam also induced four distinct currents to flow within the device. The team determined that the currents are associated with the multiple interfaces between materials in the memristor, which consists of two metal (conducting) layers separated by an insulator.

“We know exactly where each of the currents are coming from because we are controlling the location of the beam that is inducing those currents,” said Hoskins.

In imaging the device, the team found several dark spots—regions of enhanced conductivity—which indicated places where current might leak out of the memristor during its normal operation. These leakage pathways resided outside the memristor’s core—where it switches between the low and high resistance levels that are useful in an electronic device. The finding suggests that reducing the size of a memristor could minimize or even eliminate some of the unwanted current pathways. Although researchers had suspected that might be the case, they had lacked experimental guidance about just how much to reduce the size of the device.

Because the leakage pathways are tiny, involving distances of only 100 to 300 nanometers, “you’re probably not going to start seeing some really big improvements until you reduce dimensions of the memristor on that scale,” Hoskins said.

To their surprise, the team also found that the current that correlated with the memristor’s switch in resistance didn’t come from the active switching material at all, but the metal layer above it. The most important lesson of the memristor study, Hoskins noted, “is that you can’t just worry about the resistive switch, the switching spot itself, you have to worry about everything around it.” The team’s study, he added, “is a way of generating much stronger intuition about what might be a good way to engineer memristors.”

Here’s a link to and a citation for the paper,

Stateful characterization of resistive switching TiO2 with electron beam induced currents by Brian D. Hoskins, Gina C. Adam, Evgheni Strelcov, Nikolai Zhitenev, Andrei Kolmakov, Dmitri B. Strukov, & Jabez J. McClelland. Nature Communications 8, Article number: 1972 (2017) doi:10.1038/s41467-017-02116-9 Published online: 07 December 2017

This is an open access paper.

It might be my imagination but it seemed like a lot of papers from 2017 were being publicized in early 2018.

Finally, I borrowed much of my headline from the NIST’s headline for its news release, specifically, “Thanks for the memory,” which is a rather old song,

Bob Hope and Shirley Ross in “The Big Broadcast of 1938.”

Watching rust turn into iron

a) Colorized SEM images of iron oxide nanoblades used in the experiment. b) Colorized cross-section of SEM image of the nanoblades. c) Colorized SEM image of nanoblades after 1 hour of reduction reaction at 500 °C in molecular hydrogen, showing the sawtooth shape along the edges (square). d) Colorized SEM image showing the formation of holes after 2 hours of reduction. The scale bar is 1 micrometer. Credit: W. Zhu et al./ACS Nano and K. Irvine/NIST

Here’s more about being able to watch iron transition from one state to the next, according to an April 5, 2017 news item on phys.org,

Using a state-of-the-art microscopy technique, experimenters at the National Institute of Standards and Technology (NIST) and their colleagues have witnessed a slow-motion, atomic-scale transformation of rust—iron oxide—back to pure iron metal, in all of its chemical steps.

An April 4, 2017 NIST news release describes the role iron plays in modern lifestyles and the purpose of this research,

Among the most abundant minerals on Earth, iron oxides play a leading role in magnetic data storage, cosmetics, the pigmentation of paints and drug delivery. These materials also serve as catalysts for several types of chemical reactions, including the production of ammonia for fertilizer.

To fine-tune the properties of these minerals for each application, scientists work with nanometer-scale particles of the oxides. But to do so, researchers need a detailed, atomic-level understanding of reduction, a key chemical reaction that iron oxides undergo. That knowledge, however, is often lacking because reduction—a process that is effectively the opposite of rusting—proceeds too rapidly for many types of probes to explore at such a fine level.

In a new effort to study the microscopic details of metal oxide reduction, researchers used a specially adapted transmission electron microscope (TEM) at NIST’s NanoLab facility to document the step-by-step transformation of nanocrystals of the iron oxide hematite (Fe2O3) to the iron oxide magnetite (Fe3O4), and finally to iron metal.
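For anyone who wants the overall stoichiometry of that two-step route, the textbook balanced equations with hydrogen as the reducing agent look like this (the paper itself tracks the intermediate crystal structures in far more detail),

```latex
\begin{align*}
3\,\mathrm{Fe_2O_3} + \mathrm{H_2} &\longrightarrow 2\,\mathrm{Fe_3O_4} + \mathrm{H_2O} \\
\mathrm{Fe_3O_4} + 4\,\mathrm{H_2} &\longrightarrow 3\,\mathrm{Fe} + 4\,\mathrm{H_2O}
\end{align*}
```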

“Even though people have studied iron oxide for many years, there have been no dynamic studies at the atomic scale,” said Wenhui Zhu of the State University of New York at Binghamton, who worked on her doctorate in the NanoLab in 2015 and 2016. “We are seeing what’s actually happening during the entire reduction process instead of studying just the initial steps.”

That’s critical, added NIST’s Renu Sharma, “if you want to control the composition or properties of iron oxides and understand the relationships between them.”

By lowering the temperature of the reaction and decreasing the pressure of the hydrogen gas that acted as the reducing agent, the scientists slowed down the reduction process so that it could be captured with an environmental TEM—a specially configured TEM that can study both solids and gas. The instrument enables researchers to perform atomic-resolution imaging of a sample under real-life conditions—in this case the gaseous environment necessary for iron oxides to undergo reduction–rather than under the vacuum needed in ordinary TEMs.

“This is the most powerful tool I’ve used in my research and one of the very few in the United States,” said Zhu. She, Sharma and their colleagues describe their findings in a recent issue of ACS Nano.

The team examined the reduction process in a bicrystal of iron oxide, consisting of two identical iron oxide crystals rotated at 21.8 degrees with respect to each other. The bicrystal structure also served to slow down the reduction process, making it easier to follow with the environmental TEM.

In studying the reduction reaction, the researchers identified a previously unknown intermediate state in the transformation from hematite to magnetite. In the middle stage, the iron oxide retained its original chemical composition, Fe2O3, but changed the crystallographic arrangement of its atoms from rhombohedral (a diagonally stretched cube) to cubic.

This intermediate state featured a defect in which oxygen atoms fail to populate some of the sites in the crystal that they normally would. This so-called oxygen vacancy defect is not uncommon and is known to strongly influence the electrical and catalytic properties of oxides. But the researchers were surprised to find that the defects occurred in an ordered pattern, which had never been found before in the reduction of Fe2O3 to Fe3O4, Sharma said.

The significance of the intermediate state remains under study, but it may be important for controlling the reduction rate and other properties of the reduction process, she adds. “The more we understand, the better we can manipulate the microstructure of these oxides,” said Zhu. By manipulating the microstructure, researchers may be able to enhance the catalytic activity of iron oxides.

Even though a link has already been provided for the paper, I will give it again along with a citation,

In Situ Atomic-Scale Probing of the Reduction Dynamics of Two-Dimensional Fe2O3 Nanostructures by Wenhui Zhu, Jonathan P. Winterstein, Wei-Chang David Yang, Lu Yuan, Renu Sharma, and Guangwen Zhou. ACS Nano, 2017, 11 (1), pp 656–664 DOI: 10.1021/acsnano.6b06950 Publication Date (Web): December 13, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

Transparent silver

This March 21, 2017 news item on Nanowerk is the first I’ve heard of transparent silver; it’s usually transparent aluminum (Note: A link has been removed),

The thinnest, smoothest layer of silver that can survive air exposure has been laid down at the University of Michigan, and it could change the way touchscreens and flat or flexible displays are made (Advanced Materials, “High-performance Doped Silver Films: Overcoming Fundamental Material Limits for Nanophotonic Applications”).

It could also help improve computing power, affecting both the transfer of information within a silicon chip and the patterning of the chip itself through metamaterial superlenses.

A March 21, 2017 University of Michigan news release, which originated the news item, provides details about the research and features a mention of aluminum,

By combining the silver with a little bit of aluminum, the U-M researchers found that it was possible to produce exceptionally thin, smooth layers of silver that are resistant to tarnishing. They applied an anti-reflective coating to make one thin metal layer up to 92.4 percent transparent.

The team showed that the silver coating could guide light about 10 times as far as other metal waveguides—a property that could make it useful for faster computing. And they layered the silver films into a metamaterial hyperlens that could be used to create dense patterns with feature sizes a fraction of what is possible with ordinary ultraviolet methods, on silicon chips, for instance.

Screens of all stripes need transparent electrodes to control which pixels are lit up, but touchscreens are particularly dependent on them. A modern touch screen is made of a transparent conductive layer covered with a nonconductive layer. It senses electrical changes where a conductive object—such as a finger—is pressed against the screen.

“The transparent conductor market has been dominated to this day by one single material,” said L. Jay Guo, professor of electrical engineering and computer science.

This material, indium tin oxide, is projected to become expensive as demand for touch screens continues to grow; there are relatively few known sources of indium, Guo said.

“Before, it was very cheap. Now, the price is rising sharply,” he said.

The ultrathin film could make silver a worthy successor.

Usually, it’s impossible to make a continuous layer of silver less than 15 nanometers thick, or roughly 100 silver atoms. Silver has a tendency to cluster together in small islands rather than extend into an even coating, Guo said.

By adding about 6 percent aluminum, the researchers coaxed the metal into a film of less than half that thickness—seven nanometers. What’s more, when they exposed it to air, it didn’t immediately tarnish as pure silver films do. After several months, the film maintained its conductive properties and transparency. And it was firmly stuck on, whereas pure silver comes off glass with Scotch tape.
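Out of curiosity, you can get a feel for why film thickness matters so much using the standard single-layer characteristic-matrix formula for transmittance at normal incidence. The sketch below assumes a bare air/silver/glass stack and a rough literature value for the silver index near 550 nm; it does not model the aluminum doping or the anti-reflection coating behind the 92.4 percent figure, so the numbers are illustrative only,

```python
import numpy as np

def transmittance(n_film, d_nm, wavelength_nm=550.0, n_in=1.0, n_sub=1.52):
    """Transmittance of one absorbing layer between air and glass (normal incidence)."""
    delta = 2 * np.pi * n_film * d_nm / wavelength_nm      # complex phase thickness
    m11, m12 = np.cos(delta), 1j * np.sin(delta) / n_film  # characteristic matrix entries
    m21, m22 = 1j * n_film * np.sin(delta), np.cos(delta)
    B = m11 + m12 * n_sub
    C = m21 + m22 * n_sub
    return 4 * n_in * n_sub / abs(n_in * B + C) ** 2

n_silver = 0.06 + 3.4j                  # rough literature value for silver near 550 nm
for d in (7, 15, 30):
    print(f"{d:2d} nm of silver on glass: T ~ {transmittance(n_silver, d):.2f}")
```

Even in this crude picture the thinner film lets through far more light, which is why getting continuous silver below the usual 15 nanometre limit matters.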

In addition to their potential to serve as transparent conductors for touch screens, the thin silver films offer two more tricks, both having to do with silver’s unparalleled ability to transport visible and infrared light waves along its surface. The light waves shrink and travel as so-called surface plasmon polaritons, showing up as oscillations in the concentration of electrons on the silver’s surface.

Those oscillations encode the frequency of the light, preserving it so that it can emerge on the other side. While optical fibers can’t scale down to the size of copper wires on today’s computer chips, plasmonic waveguides could allow information to travel in optical rather than electronic form for faster data transfer. As a waveguide, the smooth silver film could transport the surface plasmons over a centimeter—enough to get by inside a computer chip.

Here’s a link to and a citation for the paper,

High-Performance Doped Silver Films: Overcoming Fundamental Material Limits for Nanophotonic Applications by Cheng Zhang, Nathaniel Kinsey, Long Chen, Chengang Ji, Mingjie Xu, Marcello Ferrera, Xiaoqing Pan, Vladimir M. Shalaev, Alexandra Boltasseva, and Jay Guo. Advanced Materials DOI: 10.1002/adma.201605177 Version of Record online: 20 MAR 2017

© 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.