Tag Archives: microscopy

Peering into the nanoworld with a microscope that has a resolution of better than five nanometres (five billionths of a metre)

This August 7, 2024 news item on phys.org explains what it means for a microscope to have a resolution of better than five nanometers, Note: A link has been removed,

What does the inside of a cell really look like? In the past, standard microscopes were limited in how well they could answer this question. Now, researchers from the Universities of Göttingen [Germany] and Oxford [UK], in collaboration with the University Medical Center Göttingen (UMG), have succeeded in developing a microscope with resolutions better than five nanometers (five billionths of a meter). This is roughly equivalent to the width of a hair split into 10,000 strands. Their new method was published in Nature Photonics.
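
Both of those scale comparisons hold up to a quick back-of-the-envelope check. The hair and hazelnut figures below are my own approximations (not numbers supplied by the researchers), but the arithmetic lands where the press release says it should,

```python
# Back-of-the-envelope check of the scale comparisons above; the hair,
# hazelnut, and Earth figures are my own approximations, not the researchers'.
hair_width = 50e-6            # a fine human hair, ~50 micrometres, in metres
print(hair_width / 10_000)    # 5e-09 m, i.e. 5 nanometres

hazelnut = 0.013              # ~1.3 cm hazelnut, in metres
earth = 1.2742e7              # Earth's diameter, ~12,742 km, in metres
print(hazelnut / earth)       # ~1.0e-09, the same ratio as 1 nm to 1 m
```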

An August 2, 2024 University of Göttingen press release (also on EurekAlert but published August 7, 2024), which originated the news item, provides more detail,

Many structures in cells are so small that standard microscopes can only produce fragmented images. Their resolution only begins at around 200 nanometres. However, human cells for instance contain a kind of scaffold of fine tubes that are only around seven nanometres wide. The synaptic cleft, meaning the distance between two nerve cells or between a nerve cell and a muscle cell, is just 10 to 50 nanometres – too small for conventional microscopes. The new microscope, which researchers at the University of Göttingen have helped to develop, promises much richer information. It benefits from a resolution better than five nanometres, enabling it to capture even the tiniest cell structures. It is difficult to imagine something so tiny, but if we were to compare one nanometre with one metre, it would be the equivalent of comparing the diameter of a hazelnut with the diameter of the Earth.

This type of microscope is known as a fluorescence microscope. Its function relies on “single-molecule localization microscopy”, in which individual fluorescent molecules in a sample are switched on and off and their individual positions are then determined very precisely. The entire structure of the sample can then be modelled from the positions of these molecules. The current process enables resolutions of around 10 to 20 nanometres. Professor Jörg Enderlein’s research group at the University of Göttingen’s Faculty of Physics has now been able to double this resolution again – with the help of a highly sensitive detector and special data analysis. This means that even the tiniest details of protein organization in the connecting area between two nerve cells can be very precisely revealed.

“This newly developed technology is a milestone in the field of high-resolution microscopy. It not only offers resolutions in the single-digit nanometre range, but it is also particularly cost-effective and easy to use compared to other methods,” explains Enderlein. The scientists also developed an open-source software package for data processing in the course of publishing their findings. This means that this type of microscopy will be available to a wide range of specialists in the future.
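
The localization trick described in the press release is simple enough to sketch in a few lines of code. What follows is my own toy illustration, not the Göttingen group's software, and every number in it (PSF width, photon count, grid size) is an assumption: a single emitter's blurred spot is simulated on a pixel grid and its centre is then estimated to a small fraction of a pixel, which is the sense in which localization beats the diffraction limit,

```python
# Toy illustration of single-molecule localization (not the authors' code;
# PSF width, photon count, and grid size below are assumed values).
import numpy as np

rng = np.random.default_rng(0)

def simulate_spot(x0, y0, sigma=1.3, size=15, photons=2000, background=5):
    """One fluorophore imaged as a Gaussian blur plus Poisson shot noise."""
    y, x = np.mgrid[0:size, 0:size]
    psf = np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))
    expected = photons * psf / psf.sum() + background
    return rng.poisson(expected)

def localize(img):
    """Estimate the emitter position as a background-subtracted centroid."""
    signal = np.clip(img - np.median(img), 0, None)   # crude background removal
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (x * signal).sum() / signal.sum(), (y * signal).sum() / signal.sum()

true_x, true_y = 7.3, 6.8            # hypothetical sub-pixel position of the molecule
errors = []
for _ in range(200):                 # 200 independent "blinks" of the same molecule
    est_x, est_y = localize(simulate_spot(true_x, true_y))
    errors.append(np.hypot(est_x - true_x, est_y - true_y))

# The error is a small fraction of a pixel, which is why localizing single,
# blinking molecules can beat the ~200 nm diffraction limit of the optics.
print(f"mean localization error: {np.mean(errors):.3f} pixels")
```

The real method is, of course, far more sophisticated (and, in this paper, combined with fluorescence-lifetime information and image scanning microscopy), but the centroid toy captures why switching molecules on and off one at a time pays off.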

Here’s a link to and a citation for the paper,

Doubling the resolution of fluorescence-lifetime single-molecule localization microscopy with image scanning microscopy by Niels Radmacher, Oleksii Nevskyi, José Ignacio Gallea, Jan Christoph Thiele, Ingo Gregor, Silvio O. Rizzoli & Jörg Enderlein. Nature Photonics (2024) DOI: https://doi.org/10.1038/s41566-024-01481-4 Published: 02 August 2024

This paper is behind a paywall.

A brainlike (neuromorphic) camera can go beyond diffraction limit of light

Just when I think I’m getting caught up with my backlog along comes something like this. A February 21, 2023 news item on Nanowerk announces research that combines neuromorphic (brainlike) engineering and nanotechnology, Note: A link has been removed,

In a new study, researchers at the Indian Institute of Science (IISc) show how a brain-inspired image sensor can go beyond the diffraction limit of light to detect minuscule objects such as cellular components or nanoparticles invisible to current microscopes. Their novel technique, which combines optical microscopy with a neuromorphic camera and machine learning algorithms, presents a major step forward in pinpointing objects smaller than 50 nanometers in size.

A February 21, 2023 (?) Indian Institute of Science (IISc) press release (also on EurekAlert), which originated the news item, describes the nature of the task and provides some technical details,

Since the invention of optical microscopes, scientists have strived to surpass a barrier called the diffraction limit, which means that the microscope cannot distinguish between two objects if they are smaller than a certain size (typically 200-300 nanometers). Their efforts have largely focused on either modifying the molecules being imaged, or developing better illumination strategies – some of which led to the 2014 Nobel Prize in Chemistry. “But very few have actually tried to use the detector itself to try and surpass this detection limit,” says Deepak Nair, Associate Professor at the Centre for Neuroscience (CNS), IISc, and corresponding author of the study.  

Measuring roughly 40 mm (height) by 60 mm (width) by 25 mm (diameter), and weighing about 100 grams, the neuromorphic camera used in the study mimics the way the human retina converts light into electrical impulses, and has several advantages over conventional cameras. In a typical camera, each pixel captures the intensity of light falling on it for the entire exposure time that the camera focuses on the object, and all these pixels are pooled together to reconstruct an image of the object. In neuromorphic cameras, each pixel operates independently and asynchronously, generating events or spikes only when there is a change in the intensity of light falling on that pixel. This generates sparser data in smaller amounts compared to traditional cameras, which capture every pixel value at a fixed rate, regardless of whether there is any change in the scene. This functioning of a neuromorphic camera is similar to how the human retina works, and allows the camera to “sample” the environment with much higher temporal resolution – because it is not limited by a frame rate like normal cameras – and also perform background suppression.

“Such neuromorphic cameras have a very high dynamic range (>120 dB), which means that you can go from a very low-light environment to very high-light conditions. The combination of the asynchronous nature, high dynamic range, sparse data, and high temporal resolution of neuromorphic cameras make them well-suited for use in neuromorphic microscopy,” explains Chetan Singh Thakur, Assistant Professor at the Department of Electronic Systems Engineering (DESE), IISc, and co-author. 

In the current study, the group used their neuromorphic camera to pinpoint individual fluorescent beads smaller than the limit of diffraction, by shining laser pulses at both high and low intensities, and measuring the variation in the fluorescence levels. As the intensity increases, the camera captures the signal as an “ON” event, while an “OFF” event is reported when the light intensity decreases. The data from these events were pooled together to reconstruct frames. 

To accurately locate the fluorescent particles within the frames, the team used two methods. The first was a deep learning algorithm, trained on about one and a half million image simulations that closely represented the experimental data, to predict where the centroid of the object could be, explains Rohit Mangalwedhekar, former research intern at CNS and first author of the study. A wavelet segmentation algorithm was also used to determine the centroids of the particles separately for the ON and the OFF events. Combining the predictions from both allowed the team to zero in on the object’s precise location with greater accuracy than existing techniques.  

“In biological processes like self-organisation, you have molecules that are alternating between random or directed movement, or that are immobilised,” explains Nair. “Therefore, you need to have the ability to locate the centre of this molecule with the highest precision possible so that we can understand the thumb rules that allow the self-organisation.” The team was able to closely track the movement of a fluorescent bead moving freely in an aqueous solution using this technique. This approach can, therefore, have widespread applications in precisely tracking and understanding stochastic processes in biology, chemistry and physics.  

Caption: Transformation of cumulative probability density of ON and OFF processes allows localisation below the limit of classical single particle detection. Credit: Mangalwedhekar et al
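
For readers who find the "events and spikes" description abstract, here is a minimal sketch of my own of how such a pixel can be modelled; it is an assumption-laden toy rather than IISc's hardware or code. The pixel stays silent until the logarithmic change in intensity since its last event crosses a threshold, then reports a single ON or OFF event,

```python
# Toy model of one event-camera pixel (a generic sketch, not IISc's design):
# an "ON" or "OFF" event is emitted only when the log-intensity change since
# the last event crosses a threshold, instead of reporting every frame.
import numpy as np

def pixel_events(intensity, threshold=0.2):
    """Return (time_index, +1/-1) events for one pixel's intensity trace."""
    events = []
    ref = np.log(intensity[0])
    for t, value in enumerate(intensity[1:], start=1):
        change = np.log(value) - ref
        if change >= threshold:
            events.append((t, +1))      # ON event: intensity rose
            ref = np.log(value)
        elif change <= -threshold:
            events.append((t, -1))      # OFF event: intensity fell
            ref = np.log(value)
    return events

# A fluorescence trace that rises and then falls as the laser is modulated.
trace = np.concatenate([np.full(50, 10.0), np.full(50, 40.0), np.full(50, 12.0)])
print(pixel_events(trace))             # sparse output: only the two changes
```

Run on a trace that steps up and then back down, the function returns just two events, which is the sparseness the press release is talking about.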

Here’s a link to and a citation for the paper,

Achieving nanoscale precision using neuromorphic localization microscopy by Rohit Mangalwedhekar, Nivedita Singh, Chetan Singh Thakur, Chandra Sekhar Seelamantula, Mini Jose & Deepak Nair. Nature Nanotechnology volume 18, pages 380–389 (2023) DOI: https://doi.org/10.1038/s41565-022-01291-1 Published online: 23 January 2023 Issue Date: April 2023

This paper is behind a paywall.

A roly-poly (woodlouse) gold rush

This environmental monitoring story focused on the roly-poly was announced in an April 18, 2023 news item on Statnano,

The woodlouse goes by many names: roly-poly, pill bug, potato bug, tomato bug, butchy boy, cheesy bob, and chiggy pig, to name just a few. It is best known for contracting into a ball when agitated. This crustacean (yes, it’s a crustacean, not an insect) thrives in heavily metal-contaminated areas due to its specialized digestive organ, called a hepatopancreas, that stores and expels unwanted metals.

Metal nanoparticles are common in industrial and research plants. However, they can leach into the surrounding environment. Currently, little is known about the toxicity of metal nanoparticles for nearby organisms because detecting metal nanoparticles, particularly gold, requires microscopic, 3D imaging that cannot be done in the field.

….

Caption: (a) Cartoon of a woodlouse depicting the hepatopancreas (HP) and the hind gut (HG). (b) Transmission overview of a single HP tubule, showing the helical structure. (c) Section from a HP tubule with the nuclei fluorescently labeled in blue. Credit: Iestyn Pope, Nuno G.C. Ferreira, Peter Kille, Wolfgang Langbein, and Paola Borri

An April 11, 2023 American Institute of Physics (AIP) news release (also on EurekAlert), which originated the news item, describes a new approach to detecting gold nanoparticles in roly-polys,

In Applied Physics Letters, by AIP Publishing, researchers from Cardiff University in the U.K. introduce a novel imaging method to detect gold nanoparticles in woodlice. With information about the quantity, location, and impact of gold nanoparticles within the organism, scientists can better understand the potential harm other metals may have on nature.

“Gold nanoparticles are used extensively for biological research applications owing to their biocompatibility and photostability and are available in a large range of shapes and sizes,” said author Wolfgang Langbein. “By using gold nanoparticles, which would not normally be present in the woodlice diet, we can study the journey of nanoparticles inside complex biological systems.”

The researchers developed an imaging method known as four-wave mixing microscopy, which flashes light that the gold nanoparticles absorb. The light flashes again and the subsequent scattering reveals the nanoparticles’ locations. Using this state-of-the-art technique, they locate the individual gold nanoparticles in the 3D cellular environment.

“By precisely pinpointing the fate of individual gold nanoparticles in the hepatopancreas of woodlice, we can gain a better understanding of how these organisms sequester and respond to metals ingested from the environment,” said Langbein. “Tracking this metal within these organisms is the first step enabling further study to determine, for example, if gold is collected within specific cells, or if it can interfere with the metabolisms in high doses.”

The use of gold nanoparticles in medical devices is increasing and with it, their abundance in the environment. This imaging technique will provide clarity into the little-understood mechanisms in the woodlice hepatopancreas and will also provide helpful environmental monitoring.

In the future, background-free four-wave mixing microscopy could be used to detect other metal nanoparticles and may be applied to organisms like fish larvae and even human cell cultures.

Here’s a link to and a citation for the paper,

Background-free four-wave mixing microscopy of small gold nanoparticles inside a multi-cellular organ by Iestyn Pope, Nuno G.C. Ferreira, Peter Kille, Wolfgang Langbein, and Paola Borri. Appl. Phys. Lett. 122, 153701 (2023) DOI: https://doi.org/10.1063/5.0140651 Published online: April 11, 2023

This paper is open access.

Frugal science, foldable microscopes, and curiosity: a talk on June 3, 2019 at Simon Fraser University (Burnaby, Canada) … it’s in Metro Vancouver

This is the second frugal science item* I’m publishing today (May 29, 2019), which means that I’ve gone from complete ignorance on the topic to collecting news items about it. Manu Prakash, the developer behind a usable paper microscope that can be folded and kept in your pocket, is going to be giving a talk locally according to a May 28, 2019 announcement (received via email) from Simon Fraser University’s (SFU) Faculty of Science,

On June 3rd [2019], at 7:30 pm, Manu Prakash from Stanford University will give the Herzberg Public Lecture in conjunction with this year’s Canadian Association of Physicists (CAP) conference that the department is hosting. Dr. Prakash’s lecture is entitled “Frugal Science in the Age of Curiosity”. Tickets are free and can be obtained through Eventbrite: https://t.co/WNrPh9fop5 .

This presentation will be held at the Shrum Science Centre Chemistry C9001 Lecture Theatre, Burnaby campus (instead of the Diamond Family Auditorium).

There’s a synopsis of the talk on the Herzberg Public Lecture: Frugal Science in the Age of Curiosity webpage,

Science faces an accessibility challenge. Although information/knowledge is fast becoming available to everyone around the world, the experience of science is significantly limited. One approach to solving this challenge is to democratize access to scientific tools. Manu Prakash believes this can be achieved via “Frugal science”; a philosophy that inspires design, development, and deployment of ultra-affordable yet powerful scientific tools for the masses. Using examples from his own work (Foldscope: one-dollar origami microscope, Paperfuge: a twenty-cent high-speed centrifuge), Dr. Prakash will describe the process of identifying challenges, designing solutions, and deploying these tools globally to enable open ended scientific curiosity/inquiries in communities around the world. By connecting the dots between science education, global health and environmental monitoring, he will explore the role of “simple” tools in advancing access to better human and planetary health in a resource limited world.

If you’re curious there is a Foldscope website where you can find out more and/or get a Foldscope for yourself.

In addition to the talk, there is a day-long workshop for teachers (as part of the 2019 CAP Congress) with Dr. Donna Strickland, the University of Waterloo researcher who won the 2018 Nobel Prize for physics. If you want to learn how to make a Foldscope, there is also a one-hour session for which you can register separately from the day-long event. (I featured Strickland and her win in an October 3, 2018 posting.)

Getting back to the main event, Dr. Prakash’s evening talk: you can register here.

*ETA May 29, 2019 at 1120 hours PDT: My first posting on frugal science is Frugal science: ancient toys for state-of-the-art science. It’s about a 3D printable centrifuge based on a toy known (in English) as a whirligig.

“Innovation and its enemies” and “Science in Wonderland”: a commentary on two books and a few thoughts about fish (1 of 2)

There’s more than one way to approach the introduction of emerging technologies and sciences to ‘the public’. Calestous Juma, in his 2016 book “Innovation and Its Enemies: Why People Resist New Technologies,” takes a direct approach, as can be seen from the title, while Melanie Keene’s 2015 book, “Science in Wonderland: The Scientific Fairy Tales of Victorian Britain,” presents a more fantastical one. The fish in the headline tie the two books together, thematically and tenuously, with a real-life situation.

Innovation and Its Enemies

Calestous Juma, the author of “Innovation and Its Enemies” has impressive credentials,

  • Professor of the Practice of International Development,
  • Director of the Science, Technology, and Globalization Project at the Harvard Kennedy School’s Belfer Center for Science and International Affairs,
  • Founding Director of the African Centre for Technology Studies in Nairobi (Kenya),
  • Fellow of the Royal Society of London, and
  • Foreign Associate of the US National Academy of Sciences.

Even better, Juma is an excellent storyteller, perhaps too much so for a book which presents a series of science and technology adoption case histories. (Given the range of historical time periods, geography, and the innovations themselves, he always has to stop short.) The breadth is breathtaking and Juma manages with aplomb. For example, the innovations covered include: coffee, electricity, mechanical refrigeration, margarine, recorded sound, farm mechanization, and the printing press. He also covers two recently emerging technologies/innovations: transgenic crops and AquAdvantage salmon (more about the salmon later).

Juma provides an analysis of the various ways in which the public and institutions panic over innovation and goes on to offer solutions. He also injects a subtle note of humour from time to time. Here’s how Juma describes various countries’ response to risks and benefits,

In the United States products are safe until proven risky.

In France products are risky until proven safe.

In the United Kingdom products are risky even when proven safe.

In India products are safe when proven risky.

In Canada products are neither safe nor risky.

In Japan products are either safe or risky.

In Brazil products are both safe and risky.

In sub-Saharan Africa products are risky even if they do not exist. (pp. 4-5)

To Calestous Juma, thank you for mentioning Canada and for so aptly describing the quintessentially Canadian approach to not just products and innovation but to life itself: ‘we just don’t know; it could be this or it could be that or it could be something entirely different; we just don’t know and probably will never know.’

One of the aspects that I most appreciated in this book was the broadening of the geographical perspective on innovation and emerging technologies to include the Middle East, China, and other regions/countries. As I’ve noted in past postings, much of the discussion here in Canada is Eurocentric and/or UScentric. For example, the Council of Canadian Academies, which conducts assessments of various science questions at the request of Canadian and regional governments, routinely fills the ‘international’ slot(s) for their expert panels with academics from Europe (mostly Great Britain) and/or the US (or sometimes from Australia and/or New Zealand).

A good example of Juma’s expanded perspective on emerging technology is offered in Art Carden’s July 7, 2017 book review for Forbes.com (Note: A link has been removed),

In the chapter on coffee, Juma discusses how Middle Eastern and European societies resisted the beverage and, in particular, worked to shut down coffeehouses. Islamic jurists debated whether the kick from coffee is the same as intoxication and therefore something to be prohibited. Appealing to “the principle of original permissibility — al-ibaha, al-asliya — under which products were considered acceptable until expressly outlawed,” the fifteenth-century jurist Muhamad al-Dhabani issued several fatwas in support of keeping coffee legal.

This wasn’t the last word on coffee, which was banned and permitted and banned and permitted and banned and permitted in various places over time. Some rulers were skeptical of coffee because it was brewed and consumed in public coffeehouses — places where people could indulge in vices like gambling and tobacco use or perhaps exchange unorthodox ideas that were a threat to their power. It seems absurd in retrospect, but political control of all things coffee is no laughing matter.

The bans extended to Europe, where coffee threatened beverages like tea, wine, and beer. Predictably, and all in the name of public safety (of course!), European governments with the counsel of experts like brewers, vintners, and the British East India Tea Company regulated coffee importation and consumption. The list of affected interest groups is long, as is the list of meddlesome governments. Charles II of England would issue A Proclamation for the Suppression of Coffee Houses in 1675. Sweden prohibited coffee imports on five separate occasions between 1756 and 1817. In the late seventeenth century, France required that all coffee be imported through Marseilles so that it could be more easily monopolized and taxed.

Carden, who teaches economics at Samford University (Alabama, US), focuses on issues of individual liberty and the rule of law with regards to innovation. I can appreciate the need to focus tightly when you have a limited word count but Carden could have spared a few words to do more justice to Juma’s comprehensive and focused work.

At the risk of being accused of the fault I’ve attributed to Carden, I must mention the printing press chapter. While it was good to see a history of the printing press and its attendant social upheavals, one noting the technology’s impact and discovery in regions other than Europe, it was shocking to someone educated in Canada to find Marshall McLuhan entirely ignored. Even now, I believe it’s virtually impossible to discuss the printing press as a technology, in Canada anyway, without mentioning our ‘communications god’ Marshall McLuhan and his 1962 book, The Gutenberg Galaxy.

Getting back to Juma’s book, his breadth and depth of knowledge, history, and geography is packaged in a relatively succinct 316 pp. As a writer, I admire his ability to distill the salient points and to devote chapters to two emerging technologies. It’s notoriously difficult to write about a currently emerging technology and Juma even managed to include a reference published only months (in early 2016) before “Innovation and Its Enemies” was published in July 2016.

Irrespective of Marshall McLuhan, I feel there are a few flaws. The book is intended for policy makers and industry (lobbyists, anyone?), and it reaffirms a tendency (in academia, industry, and government) toward a top-down approach to eliminating resistance. From Juma’s perspective, there needs to be better science education because no one who is properly informed should have any objections to an emerging/new technology. Juma never considers the possibility that resistance to a new technology might be a reasonable response. As well, while there was some mention of corporate resistance to new technologies which might threaten profits and revenue, Juma had nothing to say about how corporate sovereignty and/or intellectual property issues are used to stifle innovation, and quite successfully, by the way.

My concerns aside, testimony to the book’s worth is Carden’s review almost a year after publication. As well, Sir Peter Gluckman, Chief Science Advisor to the Prime Minister of New Zealand, mentions Juma’s book in his January 16, 2017 talk, Science Advice in a Troubled World, for the Canadian Science Policy Centre.

Science in Wonderland

Melanie Keene’s 2015 book, “Science in Wonderland: The Scientific Fairy Tales of Victorian Britain,” provides an overview of the fashion for writing and reading scientific and mathematical fairy tales and, inadvertently, of a public education programme,

A fairy queen (Victoria) sat on the throne of Victoria’s Britain, and she presided over a fairy tale age. The nineteenth century witnessed an unprecedented interest in fairies and in their tales, as they were used as an enchanted mirror in which to reflect, question, and distort contemporary society.  …  Fairies could be found disporting themselves throughout the century on stage and page, in picture and print, from local haunts to global transports. There were myriad ways in which authors, painters, illustrators, advertisers, pantomime performers, singers, and more, captured this contemporary enthusiasm and engaged with fairyland and folklore; books, exhibitions, and images for children were one of the most significant. (p. 13)

… Anthropologists even made fairies the subject of scientific analysis, as ‘fairyology’ determined whether fairies should be part of natural history or part of supernatural lore; just one aspect of the revival of interest in folklore. Was there a tribe of fairy creatures somewhere out there waiting to be discovered, across the globe or in the fossil record? Were fairies some kind of folk memory of an extinct race? (p. 14)

Scientific engagement with fairyland was widespread, and not just as an attractive means of packaging new facts for Victorian children. … The fairy tales of science had an important role to play in conceiving of new scientific disciplines; in celebrating new discoveries; in criticizing lofty ambitions; in inculcating habits of mind and body; in inspiring wonder; in positing future directions; and in the consideration of what the sciences were, and should be. A close reading of these tales provides a more sophisticated understanding of the content and status of the Victorian sciences; they give insights into what these new scientific disciplines were trying to do; how they were trying to cement a certain place in the world; and how they hoped to recruit and train new participants. (p. 18)

Segue: Should you be inclined to believe that society has moved on from fairies, it is possible to become a certified fairyologist (check out the fairyologist.com website).

“Science in Wonderland,” the title being a reference to Lewis Carroll’s Alice, was marketed quite differently from “Innovation and Its Enemies”. There is no description of the author, as is the protocol in academic tomes, so here’s more from her webpage on the University of Cambridge (Homerton College) website,

Role:
Fellow, Graduate Tutor, Director of Studies for History and Philosophy of Science

Getting back to Keene’s book, she makes the point that the fairy tales were based on science and integrated scientific terminology in imaginative ways, although some books did so with more success than others. Topics ranged from paleontology, botany, and astronomy to microscopy and more.

This book provides a contrast to Juma’s direct focus on policy makers with its overview of the fairy narratives. Keene is primarily interested in children but her book casts a wider net: “… they give insights into what these new scientific disciplines were trying to do; how they were trying to cement a certain place in the world; and how they hoped to recruit and train new participants.”

In a sense both authors are describing how technologies are introduced and integrated into society. Keene provides a view that must seem almost halcyon for many contemporary innovation enthusiasts. As her topic area is children’s literature any resistance she notes is primarily literary invoking a debate about whether or not science was killing imagination and whimsy.

It would probably help if you’d taken a course in 19th-century children’s literature before reading Keene’s book. Even if you haven’t taken a course, it’s still quite accessible, although I was left wondering about ‘Alice in Wonderland’ and its relationship to mathematics (see Melanie Bayley’s December 16, 2009 story for the New Scientist for a detailed rundown).

As an added bonus, fairy tale illustrations are included throughout the book along with a section of higher quality reproductions.

One of the unexpected delights of Keene’s book was the section on L. Frank Baum and his electricity fairy tale, “The Master Key.” She stretches to include “The Wizard of Oz,” which doesn’t really fit, but I can’t see how she could avoid mentioning Baum’s most famous creation. There’s also a surprising (to me) focus on water, which, when paired with the interest in microscopy, makes sense. Keene isn’t the only one who has to stretch to make things fit into a narrative, and so from water I move on to fish, bringing me back to one of Juma’s emerging technologies.

Part 2: Fish and final comments

Getting a more complete picture of aerosol particles at the nanoscale

What is in the air we breathe? In addition to the gases we learned about in school there are particles, not just the dust particles you can see, but micro- and nanoparticles too and scientists would like to know more about them.

An August 23, 2017 news item on Nanowerk features work which may help scientists in their quest,

They may be tiny and invisible, says Xiaoji Xu, but the aerosol particles suspended in gases play a role in cloud formation and environmental pollution and can be detrimental to human health.

Aerosol particles, which are found in haze, dust and vehicle exhaust, measure in the microns. One micron is one-millionth of a meter; a thin human hair is about 30 microns thick.

The particles, says Xu, are among the many materials whose chemical and mechanical properties cannot be fully measured until scientists develop a better method of studying materials at the microscale as well as the much smaller nanoscale (1 nm is one-billionth of a meter).

Xu, an assistant professor of chemistry, has developed such a method and utilized it to perform noninvasive chemical imaging of a variety of materials, as well as mechanical mapping with a spatial resolution of 10 nanometers.

The technique, called peak force infrared (PFIR) microscopy, combines spectroscopy and scanning probe microscopy. In addition to shedding light on aerosol particles, Xu says, PFIR will help scientists study micro- and nanoscale phenomena in a variety of inhomogeneous materials.

The lower portion of this image by Xiaoji Xu’s group shows the operational scheme of peak force infrared (PFIR) microscopy. The upper portion shows the topography of nanoscale PS-b-PMMA polymer islands on a gold substrate. (Image courtesy of Xiaoji Xu)

An August 22, 2017 Lehigh University news release by Kurt Pfitzer (also on EurekAlert), which originated the news item, explains the research in more detail (Note: A link has been removed),

“Materials in nature are rarely homogeneous,” says Xu. “Functional polymer materials often consist of nanoscale domains that have specific tasks. Cellular membranes are embedded with proteins that are nanometers in size. Nanoscale defects of materials exist that affect their mechanical and chemical properties.

“PFIR microscopy represents a fundamental breakthrough that will enable multiple innovations in areas ranging from the study of aerosol particles to the investigation of heterogeneous and biological materials,” says Xu.

Xu and his group recently reported their results in an article titled “Nanoscale simultaneous chemical and mechanical imaging via peak force infrared microscopy.” The article was published in Science Advances, a journal of the American Association for the Advancement of Science, which also publishes Science magazine.

The article’s lead author is Le Wang, a Ph.D. student at Lehigh. Coauthors include Xu and Lehigh Ph.D. students Haomin Wang and Devon S. Jakob, as well as Martin Wagner of Bruker Nano in Santa Barbara, Calif., and Yong Yan of the New Jersey Institute of Technology.

“PFIR microscopy enables reliable chemical imaging, the collection of broadband spectra, and simultaneous mechanical mapping in one simple setup with a spatial resolution of ~10 nm,” the group wrote.

“We have investigated three types of representative materials, namely, soft polymers, perovskite crystals and boron nitride nanotubes, all of which provide a strong PFIR resonance for unambiguous nanochemical identification. Many other materials should be suited as well for the multimodal characterization that PFIR microscopy has to offer.

“In summary, PFIR microscopy will provide a powerful analytical tool for explorations at the nanoscale across wide disciplines.”

Xu and Le Wang also published a recent article about the use of PFIR to study aerosols. Titled “Nanoscale spectroscopic and mechanical characterization of individual aerosol particles using peak force infrared microscopy,” the article appeared in an “Emerging Investigators” issue of Chemical Communications, a journal of the Royal Society of Chemistry. Xu was featured as one of the emerging investigators in the issue. The article was coauthored with researchers from the University of Macau and the City University of Hong Kong, both in China.

PFIR simultaneously obtains chemical and mechanical information, says Xu. It enables researchers to analyze a material at various places, and to determine its chemical compositions and mechanical properties at each of these places, at the nanoscale.

“A material is not often homogeneous,” says Xu. “Its mechanical properties can vary from one region to another. Biological systems such as cell walls are inhomogeneous, and so are materials with defects. The features of a cell wall measure about 100 nanometers in size, placing them well within range of PFIR and its capabilities.”

PFIR has several advantages over scanning near-field optical microscopy (SNOM), the current method of measuring material properties, says Xu. First, PFIR obtains a fuller infrared spectrum and a sharper image—6-nm spatial resolution—of a wider variety of materials than does SNOM. SNOM works well with inorganic materials, but does not obtain as strong an infrared signal as the Lehigh technique does from softer materials such as polymers or biological materials.

“Our technique is more robust,” says Xu. “It works better with soft materials, chemical as well as biological.”

The second advantage of PFIR is that it can perform what Xu calls point spectroscopy.

“If there is something of interest chemically on a surface,” Xu says, “I put an AFM [atomic force microscopy] probe to that location to measure the peak-force infrared response.

“It is very difficult to obtain these spectra with current scattering-type scanning near-field optical microscopy. It can be done, but it requires very expensive light sources. Our method uses a narrow-band infrared laser and costs about $100,000. The existing method uses a broadband light source and costs about $300,000.”

A third advantage, says Xu, is that PFIR obtains a mechanical as well as a chemical response from a material.

“No other spectroscopy method can do this,” says Xu. “Is a material rigid or soft? Is it inhomogeneous—is it soft in one area and rigid in another? How does the composition vary from the soft to the rigid areas? A material can be relatively rigid and have one type of chemical composition in one area, and be relatively soft with another type of composition in another area.

“Our method simultaneously obtains chemical and mechanical information. It will be useful for analyzing a material at various places and determining its compositions and mechanical properties at each of these places, at the nanoscale.”

A fourth advantage of PFIR is its size, says Xu.

“We use a table-top laser to get infrared spectra. Ours is a very compact light source, as opposed to the much larger sizes of competing light sources. Our laser is responsible for gathering information concerning chemical composition. We get mechanical information from the AFM [atomic force microscope]. We integrate the two types of measurements into one device to simultaneously obtain two channels of information.”

Although PFIR does not work with liquid samples, says Xu, it can measure the properties of dried biological samples, including cell walls and protein aggregates, achieving a 10-nm spatial resolution without staining or genetic modification.

This looks like very exciting work.

Here are links and citations for both studies mentioned in the news release (the most recently published being cited first),

Nanoscale simultaneous chemical and mechanical imaging via peak force infrared microscopy by Le Wang, Haomin Wang, Martin Wagner, Yong Yan, Devon S. Jakob, and Xiaoji G. Xu. Science Advances 23 Jun 2017: Vol. 3, no. 6, e1700255 DOI: 10.1126/sciadv.1700255

Nanoscale spectroscopic and mechanical characterization of individual aerosol particles using peak force infrared microscopy by Le Wang, Dandan Huang, Chak K. Chan, Yong Jie Li, and Xiaoji G. Xu. Chem. Commun., 2017, 53, 7397-7400 DOI: 10.1039/C7CC02301D First published on 16 Jun 2017

The June 23, 2017 paper is open access while the June 16, 2017 paper is behind a paywall.

Watching rust turn into iron

a) Colorized SEM images of iron oxide nanoblades used in the experiment. b) Colorized cross-section of SEM image of the nanoblades. c) Colorized SEM image of nanoblades after 1 hour of reduction reaction at 500 °C in molecular hydrogen, showing the sawtooth shape along the edges (square). d) Colorized SEM image showing the formation of holes after 2 hours of reduction. The scale bar is 1 micrometer. Credit: W. Zhu et al./ACS Nano and K. Irvine/NIST

Here’s more about being able to watch iron transition from one state to the next according to an April 5, 2017 news item on phys.org,

Using a state-of-the-art microscopy technique, experimenters at the National Institute of Standards and Technology (NIST) and their colleagues have witnessed a slow-motion, atomic-scale transformation of rust—iron oxide—back to pure iron metal, in all of its chemical steps.

An April 4, 2017 NIST news release describes the role iron plays in modern lifestyles and the purpose of this research,

Among the most abundant minerals on Earth, iron oxides play a leading role in magnetic data storage, cosmetics, the pigmentation of paints and drug delivery. These materials also serve as catalysts for several types of chemical reactions, including the production of ammonia for fertilizer.

To fine-tune the properties of these minerals for each application, scientists work with nanometer-scale particles of the oxides. But to do so, researchers need a detailed, atomic-level understanding of reduction, a key chemical reaction that iron oxides undergo. That knowledge, however, is often lacking because reduction—a process that is effectively the opposite of rusting—proceeds too rapidly for many types of probes to explore at such a fine level.

In a new effort to study the microscopic details of metal oxide reduction, researchers used a specially adapted transmission electron microscope (TEM) at NIST’s NanoLab facility to document the step-by-step transformation of nanocrystals of the iron oxide hematite (Fe2O3) to the iron oxide magnetite (Fe3O4), and finally to iron metal.

“Even though people have studied iron oxide for many years, there have been no dynamic studies at the atomic scale,” said Wenhui Zhu of the State University of New York at Binghamton, who worked on her doctorate in the NanoLab in 2015 and 2016. “We are seeing what’s actually happening during the entire reduction process instead of studying just the initial steps.”

That’s critical, added NIST’s Renu Sharma, “if you want to control the composition or properties of iron oxides and understand the relationships between them.”

By lowering the temperature of the reaction and decreasing the pressure of the hydrogen gas that acted as the reducing agent, the scientists slowed down the reduction process so that it could be captured with an environmental TEM—a specially configured TEM that can study both solids and gas. The instrument enables researchers to perform atomic-resolution imaging of a sample under real-life conditions—in this case the gaseous environment necessary for iron oxides to undergo reduction–rather than under the vacuum needed in ordinary TEMs.

“This is the most powerful tool I’ve used in my research and one of the very few in the United States,” said Zhu. She, Sharma and their colleagues describe their findings in a recent issue of ACS Nano.

The team examined the reduction process in a bicrystal of iron oxide, consisting of two identical iron oxide crystals rotated at 21.8 degrees with respect to each other. The bicrystal structure also served to slow down the reduction process, making it easier to follow with the environmental TEM.

In studying the reduction reaction, the researchers identified a previously unknown intermediate state in the transformation from hematite to magnetite. In the middle stage, the iron oxide retained its original chemical structure, Fe2O3, but changed the crystallographic arrangement of its atoms from rhombohedral (a diagonally stretched cube) to cubic.

This intermediate state featured a defect in which oxygen atoms fail to populate some of the sites in the crystal that they normally would. This so-called oxygen vacancy defect is not uncommon and is known to strongly influence the electrical and catalytic properties of oxides. But the researchers were surprised to find that the defects occurred in an ordered pattern, which had never been found before in the reduction of Fe2O3 to Fe3O4, Sharma said.

The significance of the intermediate state remains under study, but it may be important for controlling the reduction rate and other properties of the reduction process, she adds. “The more we understand, the better we can manipulate the microstructure of these oxides,” said Zhu. By manipulating the microstructure, researchers may be able to enhance the catalytic activity of iron oxides.

Even though a link has already been provided for the paper, I will give it again along with a citation,

In Situ Atomic-Scale Probing of the Reduction Dynamics of Two-Dimensional Fe2O3 Nanostructures by Wenhui Zhu, Jonathan P. Winterstein, Wei-Chang David Yang, Lu Yuan, Renu Sharma, and Guangwen Zhou. ACS Nano, 2017, 11 (1), pp 656–664 DOI: 10.1021/acsnano.6b06950 Publication Date (Web): December 13, 2016

This paper is behind a paywall.

Café Scientifique (Vancouver, Canada) April 25, 2017 talk: No Small Feat: Seeing Atoms and Molecules

I thought I’d been knocked off the list, but finally a notice for an upcoming Café Scientifique talk has arrived, and before the event, at that. From an April 12, 2017 notice (received via email),

Our next café will happen on TUESDAY APRIL 25TH, 7:30PM in the back
room at YAGGER’S DOWNTOWN (433 W Pender). Our speaker for the
evening will be DR. SARAH BURKE, an Assistant Professor in the
Department of Physics and Astronomy/ Department of Chemistry at UBC [University of British Columbia]. The title of her talk is:

NO SMALL FEAT: SEEING ATOMS AND MOLECULES

From solar cells to superconductivity, the properties of materials and
the devices we make from them arise from the atomic scale structure of
the atoms that make up the material, their electrons, and how they all
interact.  Seeing this takes a microscope, but not like the one you may
have had as a kid or used in a university lab, which are limited to
seeing objects on the scale of the wavelength of visible light: still
thousands of times bigger than the size of an atom.  Scanning probe
microscopes operate more like a nanoscale record player, scanning a very
sharp tip over a surface and measuring interactions between the tip and
surface to create atomically resolved images.  These techniques show us
where atoms and electrons live at surfaces, on nanostructures, and in
molecules.  I will describe how these techniques give us a powerful
glimpse into a tiny world.

I have a little more about Sarah Burke from her webpage in the UBC Physics and Astronomy webspace,

Building an understanding of important electronic and optoelectronic processes in nanoscale materials from the atomic scale up will pave the way for next generation materials and technologies.

My research interests broadly encompass the study of electronic processes where nanoscale structure influences or reveals the underlying physics. Using scanning probe microscopy (SPM) techniques, my group investigates materials for organic electronics and optoelectronics, graphene and other carbon-based nanomaterials, and other materials where a nanoscale view offers the potential for new understanding. We also work to expand the SPM toolbox; developing new methods in order to probe different aspects of materials, and working to understand leading edge techniques.

For the really curious, you can find more information about her research group, UBC Laboratory for Atomic Imaging Research (LAIR) here.

Atomic force microscope (AFM) shrunk down to a dime-sized device?

Before getting to the announcement, here’s a little background from Dexter Johnson’s Feb. 21, 2017 posting on his NanoClast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website; Note: Links have been removed),

Ever since the 1980s, when Gerd Binnig of IBM first heard that “beautiful noise” made by the tip of the first scanning tunneling microscope (STM) dragging across the surface of an atom, and he later developed the atomic force microscope (AFM), these microscopy tools have been the bedrock of nanotechnology research and development.

AFMs have continued to evolve over the years, and at one time, IBM even looked into using them as the basis of a memory technology in the company’s Millipede project. Despite all this development, AFMs have remained bulky and expensive devices, costing as much as $50,000 [or more].

Now, here’s the announcement in a Feb. 15, 2017 news item on Nanowerk,

Researchers at The University of Texas at Dallas have created an atomic force microscope on a chip, dramatically shrinking the size — and, hopefully, the price tag — of a high-tech device commonly used to characterize material properties.

“A standard atomic force microscope is a large, bulky instrument, with multiple control loops, electronics and amplifiers,” said Dr. Reza Moheimani, professor of mechanical engineering at UT Dallas. “We have managed to miniaturize all of the electromechanical components down onto a single small chip.”

A Feb. 15, 2017 University of Texas at Dallas news release, which originated the news item, provides more detail,

An atomic force microscope (AFM) is a scientific tool that is used to create detailed three-dimensional images of the surfaces of materials, down to the nanometer scale — that’s roughly on the scale of individual molecules.

The basic AFM design consists of a tiny cantilever, or arm, that has a sharp tip attached to one end. As the apparatus scans back and forth across the surface of a sample, or the sample moves under it, the interactive forces between the sample and the tip cause the cantilever to move up and down as the tip follows the contours of the surface. Those movements are then translated into an image.

“An AFM is a microscope that ‘sees’ a surface kind of the way a visually impaired person might, by touching. You can get a resolution that is well beyond what an optical microscope can achieve,” said Moheimani, who holds the James Von Ehr Distinguished Chair in Science and Technology in the Erik Jonsson School of Engineering and Computer Science. “It can capture features that are very, very small.”

The UT Dallas team created its prototype on-chip AFM using a microelectromechanical systems (MEMS) approach.

“A classic example of MEMS technology are the accelerometers and gyroscopes found in smartphones,” said Dr. Anthony Fowler, a research scientist in Moheimani’s Laboratory for Dynamics and Control of Nanosystems and one of the article’s co-authors. “These used to be big, expensive, mechanical devices, but using MEMS technology, accelerometers have shrunk down onto a single chip, which can be manufactured for just a few dollars apiece.”

The MEMS-based AFM is about 1 square centimeter in size, or a little smaller than a dime. It is attached to a small printed circuit board, about half the size of a credit card, which contains circuitry, sensors and other miniaturized components that control the movement and other aspects of the device.

Conventional AFMs operate in various modes. Some map out a sample’s features by maintaining a constant force as the probe tip drags across the surface, while others do so by maintaining a constant distance between the two.

“The problem with using a constant height approach is that the tip is applying varying forces on a sample all the time, which can damage a sample that is very soft,” Fowler said. “Or, if you are scanning a very hard surface, you could wear down the tip.”

The MEMS-based AFM operates in “tapping mode,” which means the cantilever and tip oscillate up and down perpendicular to the sample, and the tip alternately contacts then lifts off from the surface. As the probe moves back and forth across a sample material, a feedback loop maintains the height of that oscillation, ultimately creating an image.

“In tapping mode, as the oscillating cantilever moves across the surface topography, the amplitude of the oscillation wants to change as it interacts with sample,” said Dr. Mohammad Maroufi, a research associate in mechanical engineering and co-author of the paper. “This device creates an image by maintaining the amplitude of oscillation.”

Because conventional AFMs require lasers and other large components to operate, their use can be limited. They’re also expensive.

“An educational version can cost about $30,000 or $40,000, and a laboratory-level AFM can run $500,000 or more,” Moheimani said. “Our MEMS approach to AFM design has the potential to significantly reduce the complexity and cost of the instrument.

“One of the attractive aspects about MEMS is that you can mass produce them, building hundreds or thousands of them in one shot, so the price of each chip would only be a few dollars. As a result, you might be able to offer the whole miniature AFM system for a few thousand dollars.”

A reduced size and price tag also could expand the AFMs’ utility beyond current scientific applications.

“For example, the semiconductor industry might benefit from these small devices, in particular companies that manufacture the silicon wafers from which computer chips are made,” Moheimani said. “With our technology, you might have an array of AFMs to characterize the wafer’s surface to find micro-faults before the product is shipped out.”

The lab prototype is a first-generation device, Moheimani said, and the group is already working on ways to improve and streamline the fabrication of the device.

“This is one of those technologies where, as they say, ‘If you build it, they will come.’ We anticipate finding many applications as the technology matures,” Moheimani said.

In addition to the UT Dallas researchers, Michael Ruppert, a visiting graduate student from the University of Newcastle in Australia, was a co-author of the journal article. Moheimani was Ruppert’s doctoral advisor.
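
The “tapping mode” feedback loop described in the news release lends itself to a small simulation. What follows is my own highly simplified model, not the UT Dallas MEMS controller, and the amplitudes, gain, and fake topography are all invented: the loop adjusts the tip height until the oscillation amplitude returns to its setpoint, and the height corrections recorded at each pixel become the image,

```python
# Toy sketch of a tapping-mode feedback loop (hypothetical model, not the
# UT Dallas controller): keep the oscillation amplitude at a setpoint by
# raising or lowering the tip, and record the tip height as the image.
import numpy as np

FREE_AMPLITUDE = 10.0   # nm, oscillation amplitude far from the surface (assumed)

def amplitude(tip_height, surface_height):
    """Toy model: the oscillation is clipped by the gap between tip and surface."""
    gap = tip_height - surface_height
    return float(np.clip(gap, 0.0, FREE_AMPLITUDE))

surface = 2.0 * np.sin(np.linspace(0.0, 4.0 * np.pi, 200)) + 5.0  # fake topography, nm
setpoint = 8.0                                                    # nm, target amplitude
tip_z, gain = 20.0, 0.5                                           # integral feedback gain
image = []

for h in surface:                                # let the feedback settle at each pixel
    for _ in range(50):
        error = setpoint - amplitude(tip_z, h)   # amplitude too low => tip too close
        tip_z += gain * error                    # retract (or approach) to correct it
    image.append(tip_z)

# The recorded control signal tracks the surface, offset by the setpoint height.
reconstructed = np.array(image) - setpoint
print(np.allclose(reconstructed, surface, atol=1e-3))   # True
```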

So, an AFM that could cost as much as $500,000 for a laboratory has been shrunk to this size and become far less expensive,

A MEMS-based atomic force microscope developed by engineers at UT Dallas is about 1 square centimeter in size (top center). Here it is attached to a small printed circuit board that contains circuitry, sensors and other miniaturized components that control the movement and other aspects of the device. Courtesy: University of Texas at Dallas

Of course, there’s still more work to be done as you’ll note when reading Dexter’s Feb. 21, 2017 posting where he features answers to questions he directed to the researchers.

Here’s a link to and a citation for the paper,

On-Chip Dynamic Mode Atomic Force Microscopy: A Silicon-on-Insulator MEMS Approach by Michael G. Ruppert, Anthony G. Fowler, Mohammad Maroufi, and S. O. Reza Moheimani. IEEE Journal of Microelectromechanical Systems, Volume 26, Issue 1, Feb. 2017 DOI: 10.1109/JMEMS.2016.2628890 Date of Publication: 06 December 2016

This paper is behind a paywall.

Mapping 23,000 atoms in a nanoparticle

Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel/Berkeley Lab

The image of the iron-platinum nanoparticle (referenced in the headline) reminds me of foetal ultrasound images. A Feb. 1, 2017 news item on ScienceDaily tells us more,

In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA [University of California at Los Angeles] physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.

The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level.

A Feb. 1, 2017 UCLA news release, which originated the news item, provides more detail about the work,

Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.

“No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.

Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.

By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.

“For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.

The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.

“Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”

The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.

“This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.

In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.

Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.

That means that radiation-sensitive objects can be imaged with lower doses of radiation.
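GENFIRE itself is a Fourier-based iterative algorithm from Miao's group, and its details are in the paper; but the basic logic behind iterative tomographic reconstruction (project the current guess, compare with the measured projections, correct, and enforce physical constraints such as positivity) can be shown with a toy example. The sketch below is not GENFIRE; it is a simple SIRT-style loop in two dimensions, assuming a recent scikit-image for the radon and iradon routines.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

# Toy 2-D "object" standing in for one slice through a particle.
obj = rescale(shepp_logan_phantom(), 0.25)             # 100 x 100 pixels
angles = np.linspace(0.0, 180.0, 30, endpoint=False)   # a deliberately sparse tilt series
sino = radon(obj, theta=angles, circle=True)            # simulated "measured" projections

def backproject(s):
    # Plain (unfiltered) backprojection, used here as the transpose of `radon`.
    return iradon(s, theta=angles, filter_name=None, circle=True)

# SIRT normalisation: ray lengths through the reconstruction circle, and the
# backprojection of an all-ones sinogram (how often each pixel is "seen").
n_det = sino.shape[0]
offsets = np.arange(n_det) - (n_det - 1) / 2.0
ray_len = np.maximum(2.0 * np.sqrt(np.clip((n_det / 2.0) ** 2 - offsets ** 2, 0.0, None)),
                     1e-6)[:, None]
weight = backproject(np.ones_like(sino))
weight = np.where(weight > 1e-6, weight, np.inf)

estimate = np.zeros_like(obj)
for _ in range(50):
    residual = (sino - radon(estimate, theta=angles, circle=True)) / ray_len
    estimate += backproject(residual) / weight
    estimate = np.clip(estimate, 0.0, None)             # positivity constraint

print("relative error:", np.linalg.norm(estimate - obj) / np.linalg.norm(obj))
```

The positivity step is what lets a sparse tilt series still pin down the structure in this toy, which is the same broad reason a method like GENFIRE can get by with fewer images, and hence lower radiation doses, than conventional reconstruction.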

The US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory issued their own Feb. 1, 2017 news release (also on EurekAlert) about the work with a focus on how their equipment made this breakthrough possible (it repeats a little of the info. from the UCLA news release),

Scientists used one of the world’s most powerful electron microscopes to map the precise location and chemical type of 23,000 atoms in an extremely small particle made of iron and platinum.

The 3-D reconstruction reveals the arrangement of atoms in unprecedented detail, enabling the scientists to measure chemical order and disorder in individual grains, which sheds light on the material’s properties at the single-atom level. Insights gained from the particle’s structure could lead to new ways to improve its magnetic performance for use in high-density, next-generation hard drives.

What’s more, the technique used to create the reconstruction, atomic electron tomography (which is like an incredibly high-resolution CT scan), lays the foundation for precisely mapping the atomic composition of other useful nanoparticles. This could reveal how to optimize the particles for more efficient catalysts, stronger materials, and disease-detecting fluorescent tags.

Microscopy data was obtained and analyzed by scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) at the Molecular Foundry, in collaboration with Foundry users from UCLA, Oak Ridge National Laboratory, and the United Kingdom’s University of Birmingham. …

Atoms are the building blocks of matter, and the patterns in which they’re arranged dictate a material’s properties. These patterns can also be exploited to greatly improve a material’s function, which is why scientists are eager to determine the 3-D structure of nanoparticles at the smallest scale possible.

“Our research is a big step in this direction. We can now take a snapshot that shows the positions of all the atoms in a nanoparticle at a specific point in its growth. This will help us learn how nanoparticles grow atom by atom, and it sets the stage for a materials-design approach starting from the smallest building blocks,” says Mary Scott, who conducted the research while she was a Foundry user, and who is now a staff scientist. Scott and fellow Foundry scientists Peter Ercius and Colin Ophus developed the method in close collaboration with Jianwei Miao, a UCLA professor of physics and astronomy.

Their nanoparticle reconstruction builds on an achievement they reported last year in which they measured the coordinates of more than 3,000 atoms in a tungsten needle to a precision of 19 trillionths of a meter (19 picometers), which is many times smaller than a hydrogen atom. Now, they’ve taken the same precision, added the ability to distinguish different elements, and scaled up the reconstruction to include tens of thousands of atoms.

Importantly, their method maps the position of each atom in a single, unique nanoparticle. In contrast, X-ray crystallography and cryo-electron microscopy plot the average position of atoms from many identical samples. These methods make assumptions about the arrangement of atoms, which isn’t a good fit for nanoparticles because no two are alike.

“We need to determine the location and type of each atom to truly understand how a nanoparticle functions at the atomic scale,” says Ercius.

A TEAM approach

The scientists’ latest accomplishment hinged on the use of one of the highest-resolution transmission electron microscopes in the world, called TEAM I. It’s located at the National Center for Electron Microscopy, which is a Molecular Foundry facility. The microscope scans a sample with a focused beam of electrons, and then measures how the electrons interact with the atoms in the sample. It also has a piezo-controlled stage that positions samples with unmatched stability and position-control accuracy.

The researchers began growing an iron-platinum nanoparticle from its constituent elements, and then stopped the particle’s growth before it was fully formed. They placed the “partially baked” particle in the TEAM I stage, obtained a 2-D projection of its atomic structure, rotated it a few degrees, obtained another projection, and so on. Each 2-D projection provides a little more information about the full 3-D structure of the nanoparticle.

They sent the projections to Miao at UCLA, who used a sophisticated computer algorithm to convert the 2-D projections into a 3-D reconstruction of the particle. The individual atomic coordinates and chemical types were then traced from the 3-D density based on the knowledge that iron atoms are lighter than platinum atoms. The resulting atomic structure contains 6,569 iron atoms and 16,627 platinum atoms, with each atom’s coordinates precisely plotted to less than the width of a hydrogen atom.
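The species assignment works because platinum (atomic number 78) scatters the electron beam more strongly than iron (atomic number 26), so platinum sites appear as brighter peaks in the reconstructed 3-D density. A minimal sketch of that sorting step is shown below; it is not the authors' tracing pipeline, and the peak intensities are synthetic stand-ins generated purely for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in data: integrated peak intensities for 6,569 dim (iron-like) and
# 16,627 bright (platinum-like) peaks. Real values would come from the traced 3-D density.
rng = np.random.default_rng(0)
intensities = np.concatenate([
    rng.normal(1.0, 0.15, 6569),   # dimmer peaks, standing in for iron
    rng.normal(2.2, 0.25, 16627),  # brighter peaks, standing in for platinum
])

# Two-component clustering on peak intensity splits the peaks into two species.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(intensities.reshape(-1, 1))
bright = np.argmax([intensities[labels == k].mean() for k in (0, 1)])
species = np.where(labels == bright, "Pt", "Fe")

print("assigned Fe:", int(np.sum(species == "Fe")),
      "assigned Pt:", int(np.sum(species == "Pt")))
```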

Translating the data into scientific insights

Interesting features emerged at this extreme scale after Molecular Foundry scientists used code they developed to analyze the atomic structure. For example, the analysis revealed chemical order and disorder in interlocking grains, in which the iron and platinum atoms are arranged in different patterns. This has large implications for how the particle grew and its real-world magnetic properties. The analysis also revealed single-atom defects and the width of disordered boundaries between grains, measurements that had not previously been possible for complex 3-D boundaries.

“The important materials science problem we are tackling is how this material transforms from a highly randomized structure, what we call a chemically-disordered structure, into a regular highly-ordered structure with the desired magnetic properties,” says Ophus.

To explore how the various arrangements of atoms affect the nanoparticle’s magnetic properties, scientists from DOE’s Oak Ridge National Laboratory ran computer calculations on the Titan supercomputer at ORNL–using the coordinates and chemical type of each atom–to simulate the nanoparticle’s behavior in a magnetic field. This allowed the scientists to see patterns of atoms that are very magnetic, which is ideal for hard drives. They also saw patterns with poor magnetic properties that could sap a hard drive’s performance.
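The ORNL calculations themselves were first-principles quantum mechanics (more on that below), but the intuition that per-atom structure adds up to a preferred magnetisation direction can be conveyed with a much cruder classical toy. In the sketch below, every atom gets a local easy axis and an anisotropy strength; the two-grain setup and all the numbers are assumptions for illustration, not anything from the study.

```python
import numpy as np

# Classical toy, NOT the first-principles DFT run on Titan: each atom i has a local
# uniaxial easy axis e_i and strength K_i, and a uniform magnetisation direction m is
# scored by the anisotropy energy E(m) = -sum_i K_i (m . e_i)^2.
rng = np.random.default_rng(1)

# Hypothetical two-grain particle: an ordered grain with a strong, aligned easy axis,
# and a disordered grain with weaker, randomly oriented axes (assumed numbers).
n_ordered, n_disordered = 600, 400
axes_ordered = np.tile([0.0, 0.0, 1.0], (n_ordered, 1))
axes_disordered = rng.normal(size=(n_disordered, 3))
axes_disordered /= np.linalg.norm(axes_disordered, axis=1, keepdims=True)
easy_axes = np.vstack([axes_ordered, axes_disordered])
K = np.concatenate([np.full(n_ordered, 1.0), np.full(n_disordered, 0.2)])

def anisotropy_energy(m):
    m = np.asarray(m, dtype=float)
    m = m / np.linalg.norm(m)
    return -np.sum(K * (easy_axes @ m) ** 2)

# The ordered grain dominates: the energy is lowest with m along its easy axis (z).
for label, m in [("along z", [0, 0, 1]), ("along x", [1, 0, 0]), ("diagonal", [1, 1, 1])]:
    print(f"E({label}) = {anisotropy_energy(m):.1f}")
```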

“This could help scientists learn how to steer the growth of iron-platinum nanoparticles so they develop more highly magnetic patterns of atoms,” says Ercius.

Adds Scott, “More broadly, the imaging technique will shed light on the nucleation and growth of ordered phases within nanoparticles, which isn’t fully theoretically understood but is critically important to several scientific disciplines and technologies.”

The folks at the Berkeley Lab have created a video (notice where the still image from the beginning of this post appears).

The Oak Ridge National Laboratory (ORNL), not wanting to be left out, has been mentioned in a Feb. 3, 2017 news item on ScienceDaily,

… researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE’s Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.

“These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles,” said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.

A Feb. 2, 2017 ORNL news release on EurekAlert, which originated the news item, elaborates on how their team added to the research,

Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL’s Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab’s Molecular Foundry to combine world-class experimental data with world-class computing to do something new–simulate magnetism atom by atom in a real nanoparticle.

Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.

Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.

“In this case, researchers were able to get the precise 3-D structure for a real particle,” Eisenbach said. “The UCLA group has developed a new experimental technique where they can tell where the atoms are–the coordinates–and the chemical resolution, or what they are — iron or platinum.”

The ORNL news release goes on to describe the work from the perspective of the people who ran the supercomputer simulations,

A Supercomputing Milestone

Magnetism at the atomic level is driven by quantum mechanics — a fact that has shaken up classical physics calculations and called for increasingly complex, first-principles calculations, or calculations working forward from fundamental physics equations rather than relying on assumptions that reduce computational workload.

For magnetic recording and storage devices, researchers are particularly interested in magnetic anisotropy, or what direction magnetism favors in an atom.

“If the anisotropy is too weak, a bit written to the nanoparticle might flip at room temperature,” Kent said.
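That remark is about the superparamagnetic limit: thermal energy can flip a grain's magnetisation over its anisotropy energy barrier KV, with a characteristic flip time given by the Néel–Arrhenius relation τ ≈ τ0 exp(KV / kBT). A rough back-of-the-envelope sketch makes the point; the numerical values below are illustrative assumptions, not figures from the paper.

```python
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K
tau_0 = 1e-9         # assumed "attempt time" in seconds, a typical order of magnitude
T = 300.0            # room temperature, K

def neel_flip_time(K, diameter_nm):
    """Néel-Arrhenius estimate of the magnetisation flip time for a spherical grain.

    K is the anisotropy energy density in J/m^3, diameter_nm the grain diameter in nm.
    """
    volume = (4.0 / 3.0) * np.pi * (diameter_nm * 1e-9 / 2.0) ** 3
    return tau_0 * np.exp(K * volume / (k_B * T))

# Illustrative comparison for an ~8 nm grain: a weakly anisotropic, chemically
# disordered alloy versus an anisotropy of the order reported for well-ordered FePt.
for name, K in [("weak anisotropy", 5e4), ("strong, FePt-like anisotropy", 5e6)]:
    print(f"{name}: flip time ~ {neel_flip_time(K, 8.0):.3g} s")
```

With the weak value in this toy, a stored bit survives for only a tiny fraction of a second at room temperature; with the strong value it is stable for, in effect, forever, which is why the degree of chemical order matters so much for recording media.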

To solve for magnetic anisotropy, Eisenbach and Kent used two computational codes to compare and validate results.

To simulate a supercell of about 1,300 atoms from strongly magnetic regions of the 23,000-atom nanoparticle, they used the Linear Scaling Multiple Scattering (LSMS) code, a first-principles density functional theory code developed at ORNL.

“The LSMS code was developed for large magnetic systems and can tackle lots of atoms,” Kent said.

As principal investigator on 2017, 2016, and previous INCITE program awards, Eisenbach has scaled the LSMS code to Titan for a range of magnetic materials projects, and the in-house code has been optimized for Titan’s accelerated architecture, speeding up calculations more than 8 times on the machine’s GPUs. Exceptionally capable of crunching large magnetic systems quickly, the LSMS code received the Association for Computing Machinery’s Gordon Bell Prize for achievement in high-performance computing in 1998 and 2009, and developments continue to enhance the code for new architectures.

Working with Renat Sabirianov at the University of Nebraska at Omaha, the team also ran VASP, a simulation package that is better suited for smaller atom counts, to simulate regions of about 32 atoms.
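For readers unfamiliar with VASP, a density functional theory package, a calculation of this size starts from an explicit atomic model of roughly that many atoms. The sketch below uses the ASE toolkit to build a 32-atom, chemically ordered FePt cell (the L1_0 structure) and attach a spin-polarised VASP calculator; every numerical setting and the lattice parameters are assumptions for illustration, not the settings used in the study, and actually running it requires a licensed VASP installation.

```python
from ase import Atoms
from ase.calculators.vasp import Vasp

# Chemically ordered FePt (alternating Fe and Pt layers) in a tetragonal two-atom cell.
# Lattice parameters are approximate literature values, not taken from the paper.
a, c = 3.86, 3.71  # angstroms
unit = Atoms("FePt",
             scaled_positions=[(0.0, 0.0, 0.0), (0.5, 0.5, 0.5)],
             cell=[a, a, c], pbc=True)
cell32 = unit.repeat((2, 2, 4))  # 32 atoms

# Crude starting magnetic moments: sizeable on Fe, a small induced moment on Pt (assumed).
cell32.set_initial_magnetic_moments(
    [3.0 if symbol == "Fe" else 0.3 for symbol in cell32.get_chemical_symbols()])

# Illustrative spin-polarised settings; a real anisotropy calculation would also need
# spin-orbit coupling and careful k-point and energy-cutoff convergence tests.
cell32.calc = Vasp(xc="pbe", encut=400, ispin=2, kpts=(6, 6, 3), directory="fept_32atom")

# energy = cell32.get_potential_energy()  # only runs where a licensed VASP is installed
```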

“With both approaches, we were able to confirm that the local VASP results were consistent with the LSMS results, so we have a high confidence in the simulations,” Eisenbach said.

Computer simulations revealed that grain boundaries have a strong effect on magnetism. “We found that the magnetic anisotropy energy suddenly transitions at the grain boundaries. These magnetic properties are very important,” Miao said.

In the future, researchers hope that advances in computing and simulation will make a full-particle simulation possible — as first-principles calculations are currently too intensive to solve small-scale magnetism for regions larger than a few thousand atoms.

Also, future simulations like these could show how different fabrication processes, such as the temperature at which nanoparticles are formed, influence magnetism and performance.

“There’s a hope going forward that one would be able to use these techniques to look at nanoparticle growth and understand how to optimize growth for performance,” Kent said.

Finally, here’s a link to and a citation for the paper,

Deciphering chemical order/disorder and material properties at the single-atom level by Yongsoo Yang, Chien-Chun Chen, M. C. Scott, Colin Ophus, Rui Xu, Alan Pryor, Li Wu, Fan Sun, Wolfgang Theis, Jihan Zhou, Markus Eisenbach, Paul R. C. Kent, Renat F. Sabirianov, Hao Zeng, Peter Ercius, & Jianwei Miao. Nature 542, 75–79 (02 February 2017) DOI: https://doi.org/10.1038/nature21042 Published online 01 February 2017

This paper is behind a paywall.