This is the second frugal science item I’m publishing today (May 29, 2019), which means that I’ve gone from complete ignorance of the topic to collecting news items about it. Manu Prakash, the developer behind a usable paper microscope that can be folded and kept in your pocket, is going to be giving a talk locally according to a May 28, 2019 announcement (received via email) from Simon Fraser University’s (SFU) Faculty of Science,
On June 3rd, at 7:30 pm, Manu Prakash from Stanford University will give the Herzberg Public Lecture in conjunction with this year’s Canadian Association of Physicists (CAP) conference that the department is hosting. Dr. Prakash’s lecture is entitled “Frugal Science in the Age of Curiosity”. Tickets are free and can be obtained through Eventbrite: https://t.co/WNrPh9fop5.
This presentation will be held at the Shrum Science Centre Chemistry C9001 Lecture Theatre, Burnaby campus (instead of the Diamond Family Auditorium).
Science faces an accessibility challenge. Although information/knowledge is fast becoming available to everyone around the world, the experience of science is significantly limited. One approach to solving this challenge is to democratize access to scientific tools. Manu Prakash believes this can be achieved via “Frugal science,” a philosophy that inspires the design, development, and deployment of ultra-affordable yet powerful scientific tools for the masses. Using examples from his own work (Foldscope: one-dollar origami microscope, Paperfuge: a twenty-cent high-speed centrifuge), Dr. Prakash will describe the process of identifying challenges, designing solutions, and deploying these tools globally to enable open-ended scientific curiosity/inquiries in communities around the world. By connecting the dots between science education, global health and environmental monitoring, he will explore the role of “simple” tools in advancing access to better human and planetary health in a resource-limited world.
If you’re curious there is a Foldscope website where you can find out more and/or get a Foldscope for yourself.
In addition to the talk, there is a day-long workshop for teachers (as part of the 2019 CAP Congress) with Dr. Donna Strickland, the University of Waterloo researcher who won the 2018 Nobel Prize in physics. If you want to learn how to make a Foldscope, there is also a one-hour session for which you can register separately from the day-long event. (I featured Strickland and her win in an October 3, 2018 posting.)
Getting back to the main event, Dr. Prakash’s evening talk: you can register here.
There’s more than one way to approach the introduction of emerging technologies and sciences to ‘the public’. Calestous Juma, in his 2016 book “Innovation and Its Enemies: Why People Resist New Technologies,” takes a direct approach, as can be seen from the title, while Melanie Keene’s 2015 book, “Science in Wonderland: The Scientific Fairy Tales of Victorian Britain,” presents a more fantastical one. The fish in the headline tie both books together, thematically and tenuously, with a real-life situation.
Innovation and Its Enemies
Calestous Juma, the author of “Innovation and Its Enemies” has impressive credentials,
Professor of the Practice of International Development,
Director of the Science, Technology, and Globalization Project at Harvard Kennedy School’s Belfer Center for Science and International Affairs,
Founding Director of the African Centre for Technology Studies in Nairobi (Kenya),
Fellow of the Royal Society of London, and
Foreign Associate of the US National Academy of Sciences.
Even better, Juma is an excellent storyteller, perhaps too much so for a book which presents a series of science and technology adoption case histories. (Given the range of historical time periods, geography, and the innovations themselves, he always has to stop short.) The breadth is breathtaking, and Juma manages it with aplomb. For example, the innovations covered include: coffee, electricity, mechanical refrigeration, margarine, recorded sound, farm mechanization, and the printing press. He also covers two recently emerging technologies/innovations: transgenic crops and AquAdvantage salmon (more about the salmon later).
Juma provides an analysis of the various ways in which the public and institutions panic over innovation and goes on to offer solutions. He also injects a subtle note of humour from time to time. Here’s how Juma describes various countries’ response to risks and benefits,
In the United States products are safe until proven risky.
In France products are risky until proven safe.
In the United Kingdom products are risky even when proven safe.
In India products are safe when proven risky.
In Canada products are neither safe nor risky.
In Japan products are either safe or risky.
In Brazil products are both safe and risky.
In sub-Saharan Africa products are risky even if they do not exist. (pp. 4-5)
To Calestous Juma, thank you for mentioning Canada and for so aptly describing the quintessentially Canadian approach not just to products and innovation but to life itself: ‘we just don’t know; it could be this or it could be that or it could be something entirely different; we just don’t know and probably never will.’
One of the aspects I most appreciated in this book was the broadening of the geographical perspective on innovation and emerging technologies to include the Middle East, China, and other regions/countries. As I’ve noted in past postings, much of the discussion here in Canada is Eurocentric and/or US-centric. For example, the Council of Canadian Academies, which conducts assessments of various science questions at the request of Canadian and regional governments, routinely fills the ‘international’ slot(s) on its expert panels with academics from Europe (mostly Great Britain) and/or the US (or sometimes from Australia and/or New Zealand).
A good example of Juma’s expanded perspective on emerging technology is offered in Art Carden’s July 7, 2017 book review for Forbes.com (Note: A link has been removed),
In the chapter on coffee, Juma discusses how Middle Eastern and European societies resisted the beverage and, in particular, worked to shut down coffeehouses. Islamic jurists debated whether the kick from coffee is the same as intoxication and therefore something to be prohibited. Appealing to “the principle of original permissibility — al-ibaha, al-asliya — under which products were considered acceptable until expressly outlawed,” the fifteenth-century jurist Muhamad al-Dhabani issued several fatwas in support of keeping coffee legal.
This wasn’t the last word on coffee, which was banned and permitted and banned and permitted and banned and permitted in various places over time. Some rulers were skeptical of coffee because it was brewed and consumed in public coffeehouses — places where people could indulge in vices like gambling and tobacco use or perhaps exchange unorthodox ideas that were a threat to their power. It seems absurd in retrospect, but political control of all things coffee is no laughing matter.
The bans extended to Europe, where coffee threatened beverages like tea, wine, and beer. Predictably, and all in the name of public safety (of course!), European governments with the counsel of experts like brewers, vintners, and the British East India Tea Company regulated coffee importation and consumption. The list of affected interest groups is long, as is the list of meddlesome governments. Charles II of England would issue A Proclamation for the Suppression of Coffee Houses in 1675. Sweden prohibited coffee imports on five separate occasions between 1756 and 1817. In the late seventeenth century, France required that all coffee be imported through Marseilles so that it could be more easily monopolized and taxed.
Carden, who teaches economics at Samford University (Alabama, US), focuses on issues of individual liberty and the rule of law with regard to innovation. I can appreciate the need to focus tightly when you have a limited word count, but Carden could have spared a few words to do more justice to Juma’s comprehensive work.
At the risk of being accused of the fault I’ve attributed to Carden, I must mention the printing press chapter. While it was good to see a history of the printing press and its attendant social upheavals that notes the technology’s impact and discovery in regions other than Europe, it was shocking to someone educated in Canada to find Marshall McLuhan entirely ignored. Even now, I believe it’s virtually impossible to discuss the printing press as a technology, in Canada anyway, without mentioning our ‘communications god’ Marshall McLuhan and his 1962 book, The Gutenberg Galaxy.
Getting back to Juma’s book, his breadth and depth of knowledge, history, and geography is packaged in a relatively succinct 316 pp. As a writer, I admire his ability to distill the salient points and to devote chapters to two emerging technologies. It’s notoriously difficult to write about a currently emerging technology, and Juma even managed to include a reference published only months (in early 2016) before “Innovation and Its Enemies” was published in July 2016.
Irrespective of Marshall McLuhan, I feel there are a few flaws. Because the book is intended for policy makers and industry (lobbyists, anyone?), it reaffirms a tendency (in academia, industry, and government) toward a top-down approach to eliminating resistance. From Juma’s perspective, there needs to be better science education because no one who is properly informed should have any objections to an emerging/new technology. Juma never considers the possibility that resistance to a new technology might be a reasonable response. As well, while there was some mention of corporate resistance to new technologies that might threaten profits and revenue, Juma had nothing to say about how corporate sovereignty and/or intellectual property claims are used, quite successfully by the way, to stifle innovation.
My concerns aside, testimony to the book’s worth is Carden’s review, published almost a year after the book. As well, Sir Peter Gluckman, Chief Science Advisor to the Prime Minister of New Zealand, mentions Juma’s book in his January 16, 2017 talk, Science Advice in a Troubled World, for the Canadian Science Policy Centre.
Science in Wonderland
Melanie Keene’s 2015 book, “Science in Wonderland: The Scientific Fairy Tales of Victorian Britain,” provides an overview of the fashion for writing and reading scientific and mathematical fairy tales and, inadvertently, an overview of a public education programme,
A fairy queen (Victoria) sat on the throne of Victoria’s Britain, and she presided over a fairy tale age. The nineteenth century witnessed an unprecedented interest in fairies and in their tales, as they were used as an enchanted mirror in which to reflect, question, and distort contemporary society. … Fairies could be found disporting themselves throughout the century on stage and page, in picture and print, from local haunts to global transports. There were myriad ways in which authors, painters, illustrators, advertisers, pantomime performers, singers, and more, captured this contemporary enthusiasm and engaged with fairyland and folklore; books, exhibitions, and images for children were one of the most significant. (p. 13)
… Anthropologists even made fairies the subject of scientific analysis, as ‘fairyology’ determined whether fairies should be part of natural history or part of supernatural lore; just one aspect of the revival of interest in folklore. Was there a tribe of fairy creatures somewhere out there waiting to be discovered, across the globe or in the fossil record? Were fairies some kind of folk memory of an extinct race? (p. 14)
Scientific engagement with fairyland was widespread, and not just as an attractive means of packaging new facts for Victorian children. … The fairy tales of science had an important role to play in conceiving of new scientific disciplines; in celebrating new discoveries; in criticizing lofty ambitions; in inculcating habits of mind and body; in inspiring wonder; in positing future directions; and in the consideration of what the sciences were, and should be. A close reading of these tales provides a more sophisticated understanding of the content and status of the Victorian sciences; they give insights into what these new scientific disciplines were trying to do; how they were trying to cement a certain place in the world; and how they hoped to recruit and train new participants. (p. 18)
Segue: should you be inclined to believe that society has moved on from fairies, it is possible to become a certified fairyologist (check out the fairyologist.com website).
“Science in Wonderland,” the title being a reference to Lewis Carroll’s Alice, was marketed quite differently from “Innovation and Its Enemies.” There is no description of the author, as is the protocol in academic tomes, so here’s more from her webpage on the University of Cambridge (Homerton College) website,
Fellow, Graduate Tutor, Director of Studies for History and Philosophy of Science
Getting back to Keene’s book, she makes the point that the fairy tales were based on science and integrated scientific terminology in imaginative ways, although some books did so with more success than others. Topics ranged from paleontology, botany, and astronomy to microscopy and more.
This book provides a contrast to Juma’s direct focus on policy makers with its overview of the fairy narratives. Keene is primarily interested in children but her book casts a wider net “… they give insights into what these new scientific disciplines were trying to do; how they were trying to cement a certain place in the world; and how they hoped to recruit and train new participants.”
In a sense both authors are describing how technologies are introduced and integrated into society. Keene provides a view that must seem almost halcyon to many contemporary innovation enthusiasts. As her topic area is children’s literature, any resistance she notes is primarily literary, invoking a debate about whether or not science was killing imagination and whimsy.
It would probably help if you’d taken a course in children’s literature of the 19th century before reading Keene’s book. Even if you haven’t, it’s still quite accessible, although I was left wondering about ‘Alice in Wonderland’ and its relationship to mathematics (see Melanie Bayley’s December 16, 2009 story for the New Scientist for a detailed rundown).
As an added bonus, fairy tale illustrations are included throughout the book along with a section of higher quality reproductions.
One of the unexpected delights of Keene’s book was the section on L. Frank Baum and his electricity fairy tale, “The Master Key.” She stretches to include “The Wizard of Oz,” which doesn’t really fit, but I can’t see how she could have avoided mentioning Baum’s most famous creation. There’s also a surprising (to me) focus on water, which, when it’s paired with the interest in microscopy, makes sense. Keene isn’t the only one who has to stretch to make things fit into her narrative, and so from water I move on to fish, bringing me back to one of Juma’s emerging technologies.
What is in the air we breathe? In addition to the gases we learned about in school, there are particles: not just the dust you can see, but micro- and nanoparticles too, and scientists would like to know more about them.
They may be tiny and invisible, says Xiaoji Xu, but the aerosol particles suspended in gases play a role in cloud formation and environmental pollution and can be detrimental to human health.
Aerosol particles, which are found in haze, dust and vehicle exhaust, measure in microns. One micron is one-millionth of a meter; a thin human hair is about 30 microns thick.
The particles, says Xu, are among the many materials whose chemical and mechanical properties cannot be fully measured until scientists develop a better method of studying materials at the microscale as well as the much smaller nanoscale (1 nm is one-billionth of a meter).
Xu, an assistant professor of chemistry, has developed such a method and utilized it to perform noninvasive chemical imaging of a variety of materials, as well as mechanical mapping with a spatial resolution of 10 nanometers.
The technique, called peak force infrared (PFIR) microscopy, combines spectroscopy and scanning probe microscopy. In addition to shedding light on aerosol particles, Xu says, PFIR will help scientists study micro- and nanoscale phenomena in a variety of inhomogeneous materials.
The lower portion of this image by Xiaoji Xu’s group shows the operational scheme of peak force infrared (PFIR) microscopy. The upper portion shows the topography of nanoscale PS-b-PMMA polymer islands on a gold substrate. (Image courtesy of Xiaoji Xu)
“Materials in nature are rarely homogeneous,” says Xu. “Functional polymer materials often consist of nanoscale domains that have specific tasks. Cellular membranes are embedded with proteins that are nanometers in size. Nanoscale defects of materials exist that affect their mechanical and chemical properties.
“PFIR microscopy represents a fundamental breakthrough that will enable multiple innovations in areas ranging from the study of aerosol particles to the investigation of heterogeneous and biological materials,” says Xu.
Xu and his group recently reported their results in an article titled “Nanoscale simultaneous chemical and mechanical imaging via peak force infrared microscopy.” The article was published in Science Advances, a journal of the American Association for the Advancement of Science, which also publishes Science magazine.
The article’s lead author is Le Wang, a Ph.D. student at Lehigh. Coauthors include Xu and Lehigh Ph.D. students Haomin Wang and Devon S. Jakob, as well as Martin Wagner of Bruker Nano in Santa Barbara, Calif., and Yong Yan of the New Jersey Institute of Technology.
“PFIR microscopy enables reliable chemical imaging, the collection of broadband spectra, and simultaneous mechanical mapping in one simple setup with a spatial resolution of ~10 nm,” the group wrote.
“We have investigated three types of representative materials, namely, soft polymers, perovskite crystals and boron nitride nanotubes, all of which provide a strong PFIR resonance for unambiguous nanochemical identification. Many other materials should be suited as well for the multimodal characterization that PFIR microscopy has to offer.
“In summary, PFIR microscopy will provide a powerful analytical tool for explorations at the nanoscale across wide disciplines.”
Xu and Le Wang also published a recent article about the use of PFIR to study aerosols. Titled “Nanoscale spectroscopic and mechanical characterization of individual aerosol particles using peak force infrared microscopy,” the article appeared in an “Emerging Investigators” issue of Chemical Communications, a journal of the Royal Society of Chemistry. Xu was featured as one of the emerging investigators in the issue. The article was coauthored with researchers from the University of Macau and the City University of Hong Kong, both in China.
PFIR simultaneously obtains chemical and mechanical information, says Xu. It enables researchers to analyze a material at various places, and to determine its chemical compositions and mechanical properties at each of these places, at the nanoscale.
“A material is not often homogeneous,” says Xu. “Its mechanical properties can vary from one region to another. Biological systems such as cell walls are inhomogeneous, and so are materials with defects. The features of a cell wall measure about 100 nanometers in size, placing them well within range of PFIR and its capabilities.”
PFIR has several advantages over scanning near-field optical microscopy (SNOM), the current method of measuring material properties, says Xu. First, PFIR obtains a fuller infrared spectrum and a sharper image—6-nm spatial resolution—of a wider variety of materials than does SNOM. SNOM works well with inorganic materials, but does not obtain as strong an infrared signal as the Lehigh technique does from softer materials such as polymers or biological materials.
“Our technique is more robust,” says Xu. “It works better with soft materials, chemical as well as biological.”
The second advantage of PFIR is that it can perform what Xu calls point spectroscopy.
“If there is something of interest chemically on a surface,” Xu says, “I put an AFM [atomic force microscopy] probe to that location to measure the peak-force infrared response.
“It is very difficult to obtain these spectra with current scattering-type scanning near-field optical microscopy. It can be done, but it requires very expensive light sources. Our method uses a narrow-band infrared laser and costs about $100,000. The existing method uses a broadband light source and costs about $300,000.”
A third advantage, says Xu, is that PFIR obtains a mechanical as well as a chemical response from a material.
“No other spectroscopy method can do this,” says Xu. “Is a material rigid or soft? Is it inhomogeneous—is it soft in one area and rigid in another? How does the composition vary from the soft to the rigid areas? A material can be relatively rigid and have one type of chemical composition in one area, and be relatively soft with another type of composition in another area.
“Our method simultaneously obtains chemical and mechanical information. It will be useful for analyzing a material at various places and determining its compositions and mechanical properties at each of these places, at the nanoscale.”
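Xu’s point about obtaining two channels at every scan position can be pictured as two maps sampled on one shared grid. The sketch below is purely illustrative: the grid size, values, and thresholds are all invented stand-ins, not PFIR data or software.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy stand-in for a PFIR scan: at every pixel of one shared grid the
# instrument records a chemical channel (infrared response) and a
# mechanical channel (stiffness). All values here are invented.
grid = (64, 64)                 # e.g. a 64 x 64 raster at ~10 nm per pixel
infrared = rng.random(grid)     # chemical contrast map
stiffness = rng.random(grid)    # mechanical contrast map

def probe(y, x):
    """Both properties at one scan position, akin to point spectroscopy."""
    return {"infrared": infrared[y, x], "stiffness": stiffness[y, x]}

# Because the channels share a grid, a question like "where is the
# material soft but chemically distinct?" becomes a joint mask.
soft_and_distinct = (stiffness < 0.3) & (infrared > 0.7)
print(probe(10, 20))
print(int(soft_and_distinct.sum()), "matching pixels")
```

The design point is the co-registration: since both arrays are indexed by the same scan coordinates, chemical and mechanical questions can be asked of the same physical spot.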
A fourth advantage of PFIR is its size, says Xu.
“We use a table-top laser to get infrared spectra. Ours is a very compact light source, as opposed to the much larger sizes of competing light sources. Our laser is responsible for gathering information concerning chemical composition. We get mechanical information from the AFM [atomic force microscope]. We integrate the two types of measurements into one device to simultaneously obtain two channels of information.”
Although PFIR does not work with liquid samples, says Xu, it can measure the properties of dried biological samples, including cell walls and protein aggregates, achieving a 10-nm spatial resolution without staining or genetic modification.
This looks like very exciting work.
Here are links and citations for both studies mentioned in the news release (the most recently published being cited first),
a) Colorized SEM images of iron oxide nanoblades used in the experiment. b) Colorized cross-section of SEM image of the nanoblades. c) Colorized SEM image of nanoblades after 1 hour of reduction reaction at 500 °C in molecular hydrogen, showing the sawtooth shape along the edges (square). d) Colorized SEM image showing the formation of holes after 2 hours of reduction. The scale bar is 1 micrometer. Credit: W. Zhu et al./ACS Nano and K. Irvine/NIST
Here’s more about being able to watch iron transition from one state to the next, according to an April 5, 2017 news item on phys.org,
Using a state-of-the-art microscopy technique, experimenters at the National Institute of Standards and Technology (NIST) and their colleagues have witnessed a slow-motion, atomic-scale transformation of rust—iron oxide—back to pure iron metal, in all of its chemical steps.
Among the most abundant minerals on Earth, iron oxides play a leading role in magnetic data storage, cosmetics, the pigmentation of paints and drug delivery. These materials also serve as catalysts for several types of chemical reactions, including the production of ammonia for fertilizer.
To fine-tune the properties of these minerals for each application, scientists work with nanometer-scale particles of the oxides. But to do so, researchers need a detailed, atomic-level understanding of reduction, a key chemical reaction that iron oxides undergo. That knowledge, however, is often lacking because reduction—a process that is effectively the opposite of rusting—proceeds too rapidly for many types of probes to explore at such a fine level.
In a new effort to study the microscopic details of metal oxide reduction, researchers used a specially adapted transmission electron microscope (TEM) at NIST’s NanoLab facility to document the step-by-step transformation of nanocrystals of the iron oxide hematite (Fe2O3) to the iron oxide magnetite (Fe3O4), and finally to iron metal.
“Even though people have studied iron oxide for many years, there have been no dynamic studies at the atomic scale,” said Wenhui Zhu of the State University of New York at Binghamton, who worked on her doctorate in the NanoLab in 2015 and 2016. “We are seeing what’s actually happening during the entire reduction process instead of studying just the initial steps.”
That’s critical, added NIST’s Renu Sharma, “if you want to control the composition or properties of iron oxides and understand the relationships between them.”
By lowering the temperature of the reaction and decreasing the pressure of the hydrogen gas that acted as the reducing agent, the scientists slowed down the reduction process so that it could be captured with an environmental TEM—a specially configured TEM that can study both solids and gas. The instrument enables researchers to perform atomic-resolution imaging of a sample under real-life conditions—in this case the gaseous environment necessary for iron oxides to undergo reduction—rather than under the vacuum needed in ordinary TEMs.
“This is the most powerful tool I’ve used in my research and one of the very few in the United States,” said Zhu. She, Sharma and their colleagues describe their findings in a recent issue of ACS Nano.
The team examined the reduction process in a bicrystal of iron oxide, consisting of two identical iron oxide crystals rotated at 21.8 degrees with respect to each other. The bicrystal structure also served to slow down the reduction process, making it easier to follow with the environmental TEM.
In studying the reduction reaction, the researchers identified a previously unknown intermediate state in the transformation from hematite to magnetite. In this middle stage, the iron oxide retained its original chemical structure, Fe2O3, but changed the crystallographic arrangement of its atoms from rhombohedral (a diagonally stretched cube) to cubic.
This intermediate state featured a defect in which oxygen atoms fail to populate some of the sites in the crystal that they normally would. This so-called oxygen vacancy defect is not uncommon and is known to strongly influence the electrical and catalytic properties of oxides. But the researchers were surprised to find that the defects occurred in an ordered pattern, which had never been found before in the reduction of Fe2O3 to Fe3O4, Sharma said.
The significance of the intermediate state remains under study, but it may be important for controlling the reduction rate and other properties of the reduction process, she adds. “The more we understand, the better we can manipulate the microstructure of these oxides,” said Zhu. By manipulating the microstructure, researchers may be able to enhance the catalytic activity of iron oxides.
Even though a link has already been provided for the paper, I will give it again along with a citation,
I thought I’d been knocked off the list, but a notice for an upcoming Café Scientifique talk has finally arrived, and before the event, at that. From an April 12, 2017 notice (received via email),
Our next café will happen on TUESDAY APRIL 25TH, 7:30PM in the back
room at YAGGER’S DOWNTOWN (433 W Pender). Our speaker for the
evening will be DR. SARAH BURKE, an Assistant Professor in the
Department of Physics and Astronomy/ Department of Chemistry at UBC [University of British Columbia]. The title of her talk is:
NO SMALL FEAT: SEEING ATOMS AND MOLECULES
From solar cells to superconductivity, the properties of materials and
the devices we make from them arise from the atomic scale structure of
the atoms that make up the material, their electrons, and how they all
interact. Seeing this takes a microscope, but not like the one you may
have had as a kid or used in a university lab, which are limited to
seeing objects on the scale of the wavelength of visible light: still
thousands of times bigger than the size of an atom. Scanning probe
microscopes operate more like a nanoscale record player, scanning a very
sharp tip over a surface and measuring interactions between the tip and
surface to create atomically resolved images. These techniques show us
where atoms and electrons live at surfaces, on nanostructures, and in
molecules. I will describe how these techniques give us a powerful
glimpse into a tiny world.
I have a little more about Sarah Burke from her webpage in the UBC Physics and Astronomy webspace,
Building an understanding of important electronic and optoelectronic processes in nanoscale materials from the atomic scale up will pave the way for next generation materials and technologies.
My research interests broadly encompass the study of electronic processes where nanoscale structure influences or reveals the underlying physics. Using scanning probe microscopy (SPM) techniques, my group investigates materials for organic electronics and optoelectronics, graphene and other carbon-based nanomaterials, and other materials where a nanoscale view offers the potential for new understanding. We also work to expand the SPM toolbox; developing new methods in order to probe different aspects of materials, and working to understand leading edge techniques.
Before getting to the announcement, here’s a little background from Dexter Johnson’s Feb. 21, 2017 posting on his NanoClast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website; Note: Links have been removed),
Ever since the 1980s, when Gerd Binnig of IBM first heard that “beautiful noise” made by the tip of the first scanning tunneling microscope (STM) dragging across the surface of an atom, and he later developed the atomic force microscope (AFM), these microscopy tools have been the bedrock of nanotechnology research and development.
AFMs have continued to evolve over the years, and at one time, IBM even looked into using them as the basis of a memory technology in the company’s Millipede project. Despite all this development, AFMs have remained bulky and expensive devices, costing as much as $50,000 [or more].
Researchers at The University of Texas at Dallas have created an atomic force microscope on a chip, dramatically shrinking the size — and, hopefully, the price tag — of a high-tech device commonly used to characterize material properties.
“A standard atomic force microscope is a large, bulky instrument, with multiple control loops, electronics and amplifiers,” said Dr. Reza Moheimani, professor of mechanical engineering at UT Dallas. “We have managed to miniaturize all of the electromechanical components down onto a single small chip.”
An atomic force microscope (AFM) is a scientific tool that is used to create detailed three-dimensional images of the surfaces of materials, down to the nanometer scale — that’s roughly on the scale of individual molecules.
The basic AFM design consists of a tiny cantilever, or arm, that has a sharp tip attached to one end. As the apparatus scans back and forth across the surface of a sample, or the sample moves under it, the interactive forces between the sample and the tip cause the cantilever to move up and down as the tip follows the contours of the surface. Those movements are then translated into an image.
“An AFM is a microscope that ‘sees’ a surface kind of the way a visually impaired person might, by touching. You can get a resolution that is well beyond what an optical microscope can achieve,” said Moheimani, who holds the James Von Ehr Distinguished Chair in Science and Technology in the Erik Jonsson School of Engineering and Computer Science. “It can capture features that are very, very small.”
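The scanning described above reduces to: raster the tip over a grid, record a height at each point, and assemble those heights into an image. A minimal sketch of that loop, with a made-up surface function standing in for the physical sample:

```python
import math

def sample_height(x, y):
    # Stand-in for the sample surface; a real AFM senses this through
    # cantilever deflection rather than by evaluating a formula.
    return math.sin(x / 3.0) * math.cos(y / 3.0)

def raster_scan(width, height):
    """Sweep the tip across the surface and build a 2-D height image."""
    image = []
    for y in range(height):                  # slow scan axis
        row = []
        for x in range(width):               # fast scan axis
            row.append(sample_height(x, y))  # tip follows the contour
        image.append(row)
    return image

image = raster_scan(32, 32)
print(len(image), "rows of", len(image[0]), "pixels")
```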
The UT Dallas team created its prototype on-chip AFM using a microelectromechanical systems (MEMS) approach.
“Classic examples of MEMS technology are the accelerometers and gyroscopes found in smartphones,” said Dr. Anthony Fowler, a research scientist in Moheimani’s Laboratory for Dynamics and Control of Nanosystems and one of the article’s co-authors. “These used to be big, expensive, mechanical devices, but using MEMS technology, accelerometers have shrunk down onto a single chip, which can be manufactured for just a few dollars apiece.”
The MEMS-based AFM is about 1 square centimeter in size, or a little smaller than a dime. It is attached to a small printed circuit board, about half the size of a credit card, which contains circuitry, sensors and other miniaturized components that control the movement and other aspects of the device.
Conventional AFMs operate in various modes. Some map out a sample’s features by maintaining a constant force as the probe tip drags across the surface, while others do so by maintaining a constant distance between the two.
“The problem with using a constant height approach is that the tip is applying varying forces on a sample all the time, which can damage a sample that is very soft,” Fowler said. “Or, if you are scanning a very hard surface, you could wear down the tip.”
The MEMS-based AFM operates in “tapping mode,” which means the cantilever and tip oscillate up and down perpendicular to the sample, and the tip alternately contacts then lifts off from the surface. As the probe moves back and forth across a sample material, a feedback loop maintains the height of that oscillation, ultimately creating an image.
“In tapping mode, as the oscillating cantilever moves across the surface topography, the amplitude of the oscillation wants to change as it interacts with the sample,” said Dr. Mohammad Maroufi, a research associate in mechanical engineering and co-author of the paper. “This device creates an image by maintaining the amplitude of oscillation.”
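The tapping-mode feedback loop described above can be sketched in a few lines of code. This is a deliberately simplified toy model, not the UT Dallas design: the amplitude model, setpoint, and gain values are my own illustrative assumptions. The idea is that the oscillation amplitude shrinks as the tip nears the surface, a feedback loop adjusts the cantilever height to hold the amplitude at a setpoint, and the recorded height corrections trace out the topography.

```python
# Toy sketch of a tapping-mode AFM feedback loop.
# amplitude_model() and all parameter values are illustrative assumptions,
# not the actual UT Dallas MEMS design.

def amplitude_model(gap_nm, free_amp_nm=10.0):
    """Toy physics: far from the surface the cantilever oscillates at its
    free amplitude; closer in, the surface clips the swing."""
    return min(free_amp_nm, max(0.0, gap_nm))

def scan_line(surface_nm, setpoint_nm=8.0, gain=0.5, steps_per_pixel=20):
    """Integral feedback: at each pixel, nudge the cantilever base height z
    until the measured amplitude matches the setpoint. The sequence of z
    values (minus the constant setpoint) is the inferred topography."""
    z = surface_nm[0] + setpoint_nm          # initial base height
    image = []
    for h in surface_nm:                     # h = true (unknown) surface height
        for _ in range(steps_per_pixel):
            amp = amplitude_model(z - h)     # measured oscillation amplitude
            z -= gain * (amp - setpoint_nm)  # too much swing -> move closer
        image.append(z - setpoint_nm)        # recovered surface height
    return image

surface = [0.0, 0.5, 1.5, 3.0, 3.0, 1.0, 0.0]   # synthetic topography (nm)
print(scan_line(surface))
```

Because the feedback converges at every pixel, the recovered heights match the synthetic surface; real instruments face noise, tip dynamics, and scan-speed limits that this sketch ignores.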
Because conventional AFMs require lasers and other large components to operate, their use can be limited. They’re also expensive.
“An educational version can cost about $30,000 or $40,000, and a laboratory-level AFM can run $500,000 or more,” Moheimani said. “Our MEMS approach to AFM design has the potential to significantly reduce the complexity and cost of the instrument.
“One of the attractive aspects about MEMS is that you can mass produce them, building hundreds or thousands of them in one shot, so the price of each chip would only be a few dollars. As a result, you might be able to offer the whole miniature AFM system for a few thousand dollars.”
A reduced size and price tag also could expand the AFMs’ utility beyond current scientific applications.
“For example, the semiconductor industry might benefit from these small devices, in particular companies that manufacture the silicon wafers from which computer chips are made,” Moheimani said. “With our technology, you might have an array of AFMs to characterize the wafer’s surface to find micro-faults before the product is shipped out.”
The lab prototype is a first-generation device, Moheimani said, and the group is already working on ways to improve and streamline the fabrication of the device.
“This is one of those technologies where, as they say, ‘If you build it, they will come.’ We anticipate finding many applications as the technology matures,” Moheimani said.
In addition to the UT Dallas researchers, Michael Ruppert, a visiting graduate student from the University of Newcastle in Australia, was a co-author of the journal article. Moheimani was Ruppert’s doctoral advisor.
So, an AFM that could cost as much as $500,000 for a laboratory has been shrunk to this size and become far less expensive,
A MEMS-based atomic force microscope developed by engineers at UT Dallas is about 1 square centimeter in size (top center). Here it is attached to a small printed circuit board that contains circuitry, sensors and other miniaturized components that control the movement and other aspects of the device. Courtesy: University of Texas at Dallas
Of course, there’s still more work to be done as you’ll note when reading Dexter’s Feb. 21, 2017 posting where he features answers to questions he directed to the researchers.
Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel/Berkeley Lab
The image of the iron-platinum nanoparticle (referenced in the headline) reminds me of foetal ultrasound images. A Feb. 1, 2017 news item on ScienceDaily tells us more,
In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA [University of California at Los Angeles] physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.
The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level.
Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.
“No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.
Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.
By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.
“For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.
The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.
“Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”
The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.
“This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.
In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.
Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.
That means that radiation-sensitive objects can be imaged with lower doses of radiation.
The US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory issued their own Feb. 1, 2017 news release (also on EurekAlert) about the work with a focus on how their equipment made this breakthrough possible (it repeats a little of the info. from the UCLA news release),
Scientists used one of the world’s most powerful electron microscopes to map the precise location and chemical type of 23,000 atoms in an extremely small particle made of iron and platinum.
The 3-D reconstruction reveals the arrangement of atoms in unprecedented detail, enabling the scientists to measure chemical order and disorder in individual grains, which sheds light on the material’s properties at the single-atom level. Insights gained from the particle’s structure could lead to new ways to improve its magnetic performance for use in high-density, next-generation hard drives.
What’s more, the technique used to create the reconstruction, atomic electron tomography (which is like an incredibly high-resolution CT scan), lays the foundation for precisely mapping the atomic composition of other useful nanoparticles. This could reveal how to optimize the particles for more efficient catalysts, stronger materials, and disease-detecting fluorescent tags.
Microscopy data was obtained and analyzed by scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) at the Molecular Foundry, in collaboration with Foundry users from UCLA, Oak Ridge National Laboratory, and the United Kingdom’s University of Birmingham. …
Atoms are the building blocks of matter, and the patterns in which they’re arranged dictate a material’s properties. These patterns can also be exploited to greatly improve a material’s function, which is why scientists are eager to determine the 3-D structure of nanoparticles at the smallest scale possible.
“Our research is a big step in this direction. We can now take a snapshot that shows the positions of all the atoms in a nanoparticle at a specific point in its growth. This will help us learn how nanoparticles grow atom by atom, and it sets the stage for a materials-design approach starting from the smallest building blocks,” says Mary Scott, who conducted the research while she was a Foundry user, and who is now a staff scientist. Scott and fellow Foundry scientists Peter Ercius and Colin Ophus developed the method in close collaboration with Jianwei Miao, a UCLA professor of physics and astronomy.
Their nanoparticle reconstruction builds on an achievement they reported last year in which they measured the coordinates of more than 3,000 atoms in a tungsten needle to a precision of 19 trillionths of a meter (19 picometers), which is many times smaller than a hydrogen atom. Now, they’ve taken the same precision, added the ability to distinguish different elements, and scaled up the reconstruction to include tens of thousands of atoms.
Importantly, their method maps the position of each atom in a single, unique nanoparticle. In contrast, X-ray crystallography and cryo-electron microscopy plot the average position of atoms from many identical samples. These methods make assumptions about the arrangement of atoms, which isn’t a good fit for nanoparticles because no two are alike.
“We need to determine the location and type of each atom to truly understand how a nanoparticle functions at the atomic scale,” says Ercius.
A TEAM approach
The scientists’ latest accomplishment hinged on the use of one of the highest-resolution transmission electron microscopes in the world, called TEAM I. It’s located at the National Center for Electron Microscopy, which is a Molecular Foundry facility. The microscope scans a sample with a focused beam of electrons, and then measures how the electrons interact with the atoms in the sample. It also has a piezo-controlled stage that positions samples with unmatched stability and position-control accuracy.
The researchers began growing an iron-platinum nanoparticle from its constituent elements, and then stopped the particle’s growth before it was fully formed. They placed the “partially baked” particle in the TEAM I stage, obtained a 2-D projection of its atomic structure, rotated it a few degrees, obtained another projection, and so on. Each 2-D projection provides a little more information about the full 3-D structure of the nanoparticle.
They sent the projections to Miao at UCLA, who used a sophisticated computer algorithm to convert the 2-D projections into a 3-D reconstruction of the particle. The individual atomic coordinates and chemical types were then traced from the 3-D density based on the knowledge that iron atoms are lighter than platinum atoms. The resulting atomic structure contains 6,569 iron atoms and 16,627 platinum atoms, with each atom’s coordinates precisely plotted to less than the width of a hydrogen atom.
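The principle behind this projection-and-reconstruction step can be shown with a toy example. To be clear, this is not the GENFIRE algorithm or the team's actual pipeline; it is a minimal 2×2 illustration of the underlying idea that each tilt angle contributes line sums through the sample, and enough tilts determine the full density uniquely.

```python
# Toy tomography: recover a 2x2 "density" from its line sums.
# This illustrates the principle only; the real work used thousands of
# 2-D projections and the far more sophisticated GENFIRE reconstruction.
import numpy as np

true_density = np.array([[1.0, 3.0],
                         [2.0, 4.0]])

# Each row of A sums a subset of pixels (flattened order: a, b, c, d),
# mimicking projections taken at different tilt angles.
A = np.array([
    [1, 1, 0, 0],   # row 0 sum       -> a + b
    [0, 0, 1, 1],   # row 1 sum       -> c + d
    [1, 0, 1, 0],   # column 0 sum    -> a + c
    [0, 1, 0, 1],   # column 1 sum    -> b + d
    [1, 0, 0, 1],   # main diagonal   -> a + d
    [0, 1, 1, 0],   # anti-diagonal   -> b + c
], dtype=float)

projections = A @ true_density.ravel()      # the "measurements"

# Recover the density from the projections alone.
recovered, *_ = np.linalg.lstsq(A, projections, rcond=None)
print(recovered.reshape(2, 2))
```

Rows and columns alone are not enough (their sums share a constraint, so the system is rank-deficient); adding the diagonal "tilts" makes the solution unique, which is why each extra projection angle "provides a little more information about the full 3-D structure."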
Translating the data into scientific insights
Interesting features emerged at this extreme scale after Molecular Foundry scientists used code they developed to analyze the atomic structure. For example, the analysis revealed chemical order and disorder in interlocking grains, in which the iron and platinum atoms are arranged in different patterns. This has large implications for how the particle grew and its real-world magnetic properties. The analysis also revealed single-atom defects and the width of disordered boundaries between grains, which was not previously possible in complex 3-D boundaries.
“The important materials science problem we are tackling is how this material transforms from a highly randomized structure, what we call a chemically-disordered structure, into a regular highly-ordered structure with the desired magnetic properties,” says Ophus.
To explore how the various arrangements of atoms affect the nanoparticle’s magnetic properties, scientists from DOE’s Oak Ridge National Laboratory ran computer calculations on the Titan supercomputer at ORNL, using the coordinates and chemical type of each atom, to simulate the nanoparticle’s behavior in a magnetic field. This allowed the scientists to see patterns of atoms that are very magnetic, which is ideal for hard drives. They also saw patterns with poor magnetic properties that could sap a hard drive’s performance.
“This could help scientists learn how to steer the growth of iron-platinum nanoparticles so they develop more highly magnetic patterns of atoms,” says Ercius.
Adds Scott, “More broadly, the imaging technique will shed light on the nucleation and growth of ordered phases within nanoparticles, which isn’t fully theoretically understood but is critically important to several scientific disciplines and technologies.”
The folks at the Berkeley Lab have created a video (notice where the still image from the beginning of this post appears),
The Oak Ridge National Laboratory (ORNL), not wanting to be left out, has been mentioned in a Feb. 3, 2017 news item on ScienceDaily,
… researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE’s Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.
“These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles,” said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.
Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL’s Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab’s Molecular Foundry to combine world-class experimental data with world-class computing to do something new–simulate magnetism atom by atom in a real nanoparticle.
Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.
Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.
“In this case, researchers were able to get the precise 3-D structure for a real particle,” Eisenbach said. “The UCLA group has developed a new experimental technique where they can tell where the atoms are–the coordinates–and the chemical resolution, or what they are — iron or platinum.”
The ORNL news release goes on to describe the work from the perspective of the people who ran the supercomputer simulations,
A Supercomputing Milestone
Magnetism at the atomic level is driven by quantum mechanics — a fact that has shaken up classical physics calculations and called for increasingly complex, first-principle calculations, or calculations working forward from fundamental physics equations rather than relying on assumptions that reduce computational workload.
For magnetic recording and storage devices, researchers are particularly interested in magnetic anisotropy, or what direction magnetism favors in an atom.
“If the anisotropy is too weak, a bit written to the nanoparticle might flip at room temperature,” Kent said.
To solve for magnetic anisotropy, Eisenbach and Kent used two computational codes to compare and validate results.
To simulate a supercell of about 1,300 atoms from strongly magnetic regions of the 23,000-atom nanoparticle, they used the Linear Scaling Multiple Scattering (LSMS) code, a first-principles density functional theory code developed at ORNL.
“The LSMS code was developed for large magnetic systems and can tackle lots of atoms,” Kent said.
As principal investigator on 2017, 2016, and previous INCITE program awards, Eisenbach has scaled the LSMS code to Titan for a range of magnetic materials projects, and the in-house code has been optimized for Titan’s accelerated architecture, speeding up calculations more than 8 times on the machine’s GPUs. Exceptionally capable of crunching large magnetic systems quickly, the LSMS code received an Association for Computing Machinery Gordon Bell Prize in high-performance computing achievement in 1998 and 2009, and developments continue to enhance the code for new architectures.
Working with Renat Sabirianov at the University of Nebraska at Omaha, the team also ran VASP, a simulation package that is better suited for smaller atom counts, to simulate regions of about 32 atoms.
“With both approaches, we were able to confirm that the local VASP results were consistent with the LSMS results, so we have a high confidence in the simulations,” Eisenbach said.
Computer simulations revealed that grain boundaries have a strong effect on magnetism. “We found that the magnetic anisotropy energy suddenly transitions at the grain boundaries. These magnetic properties are very important,” Miao said.
In the future, researchers hope that advances in computing and simulation will make a full-particle simulation possible — as first-principles calculations are currently too intensive to solve small-scale magnetism for regions larger than a few thousand atoms.
Also, future simulations like these could show how different fabrication processes, such as the temperature at which nanoparticles are formed, influence magnetism and performance.
“There’s a hope going forward that one would be able to use these techniques to look at nanoparticle growth and understand how to optimize growth for performance,” Kent said.
Finally, here’s a link to and a citation for the paper,
This is exciting news for Canadian science and the second time there has been a breakthrough development from the province of Alberta within the last five months (see Sept. 21, 2016 posting on quantum teleportation). From a Feb. 21, 2017 news item on ScienceDaily,
For the first time ever, scientists have captured images of terahertz electron dynamics of a semiconductor surface on the atomic scale. The successful experiment indicates a bright future for the new and quickly growing sub-field called terahertz scanning tunneling microscopy (THz-STM), pioneered by the University of Alberta in Canada. THz-STM allows researchers to image electron behaviour at extremely fast timescales and explore how that behaviour changes between different atoms.
“We can essentially zoom in to observe very fast processes with atomic precision and over super fast time scales,” says Vedran Jelic, PhD student at the University of Alberta and lead author on the new study. “THz-STM provides us with a new window into the nanoworld, allowing us to explore ultrafast processes on the atomic scale. We’re talking a picosecond, or a millionth millionth of a second. It’s something that’s never been done before.”
Jelic and his collaborators used their scanning tunneling microscope (STM) to capture images of silicon atoms by raster scanning a very sharp tip across the surface and recording the tip height as it follows the atomic corrugations of the surface. While the original STM can measure and manipulate single atoms–for which its creators earned a Nobel Prize in 1986–it does so using wired electronics and is ultimately limited in speed and thus time resolution.
Modern lasers produce very short light pulses that can measure a whole range of ultra-fast processes, but typically over length scales limited by the wavelength of light at hundreds of nanometers. Much effort has been expended to overcome the challenges of combining ultra-fast lasers with ultra-small microscopy. The University of Alberta scientists addressed these challenges by working in a unique terahertz frequency range of the electromagnetic spectrum that allows wireless implementation. Normally the STM needs an applied voltage in order to operate, but Jelic and his collaborators are able to drive their microscope using pulses of light instead. These pulses occur over really fast timescales, which means the microscope is able to see really fast events.
By incorporating the THz-STM into an ultrahigh vacuum chamber, free from any external contamination or vibration, they are able to accurately position their tip and maintain a perfectly clean surface while imaging ultrafast dynamics of atoms on surfaces. Their next step is to collaborate with fellow material scientists and image a variety of new surfaces on the nanoscale that may one day revolutionize the speed and efficiency of current technology, ranging from solar cells to computer processing.
“Terahertz scanning tunneling microscopy is opening the door to an unexplored regime in physics,” concludes Jelic, who is studying in the Ultrafast Nanotools Lab with University of Alberta professor Frank Hegmann, a world expert in ultra-fast terahertz science and nanophysics.
Here are links to and citations for the team’s 2013 paper and their latest,
An ultrafast terahertz scanning tunnelling microscope by Tyler L. Cocker, Vedran Jelic, Manisha Gupta, Sean J. Molesky, Jacob A. J. Burgess, Glenda De Los Reyes, Lyubov V. Titova, Ying Y. Tsui, Mark R. Freeman, & Frank A. Hegmann. Nature Photonics 7, 620–625 (2013) doi:10.1038/nphoton.2013.151 Published online 07 July 2013
I have two nanotech business news bits, one from Turkey and one from Northern Ireland.
A Turkish company has sold one of its microscopes to the US National Aeronautics and Space Administration (NASA), according to a Jan. 20, 2017 news item on dailysabah.com,
Turkish nanotechnology company Nanomanyetik has begun selling a powerful microscope to the U.S. space agency NASA, the company’s general director told Anadolu Agency on Thursday [Jan. 19, 2017].
Dr. Ahmet Oral, who also teaches physics at Middle East Technical University, said Nanomanyetik developed a microscope that is able to map surfaces on the nanometric and atomic levels, or extremely small particles.
Nanomanyetik’s foreign customers are drawn to the microscope because of its higher quality yet cheaper price compared to its competitors.
“There are almost 30 firms doing this work,” according to Oral. “Ten of them are active and we are among these active firms. Our aim is to be in the top three,” he said, adding that Nanomanyetik jumps to the head of the line because of its after-sales service.
In addition to sales to NASA, the Ankara-based firm exports the microscope to Brazil, Chile, France, Iran, Israel, Italy, Japan, Poland, South Korea and Spain.
Electronics giant Samsung is also a customer.
“Where does Samsung use this product? There are pixels in the smartphones’ displays. These pixels are getting smaller each year. Now the smallest pixel is 15X10 microns,” he said. Human hair is between 10 and 100 microns in diameter.
“They are figuring inner sides of pixels so that these pixels can operate much better. These patterns are on the nanometer level. They are using these microscopes to see the results of their works,” Oral said.
Nanomanyetik’s microscopes produce good quality, high-resolution images and can even display an object’s atoms and individual DNA fibers, according to Oral.
A Jan. 22, 2017 news article by Dominic Coyle for The Irish Times (Note: Links have been removed) shares this business news and mention of a world first,
MOF Technologies has raised £1.5 million (€1.73 million) from London-based venture capital group Excelsa Ventures and Queen’s University Belfast’s Qubis research commercialisation group.
MOF Technologies chief executive Paschal McCloskey welcomed the Excelsa investment.
Established in part by Qubis in 2012 in partnership with inventor Prof Stuart James, MOF Technologies began life in a lab at the School of Chemistry and Chemical Engineering at Queen’s.
Its metal organic framework (MOF) technology is seen as having significant potential in areas including gas storage, carbon capture, transport, drug delivery and heat transformation. Though still in its infancy, the market is forecast to grow to £2.2 billion by 2022, the company says.
MOF Technologies last year became the first company worldwide to successfully commercialise MOFs when it agreed a deal with US fruit and vegetable storage provider Decco Worldwide to commercialise MOFs for use in a food application.
TruPick, designed by Decco and using MOF Technologies’ environmentally friendly technology, enables nanomaterials to control the effects of ethylene on fruit produce so it maintains freshness in storage or transport.
MOFs are crystalline, sponge-like materials composed of two components – metal ions and organic molecules known as linkers.
“We very quickly recognised the market potential of MOFs in terms of their unmatched ability for gas storage,” said Moritz Bolle from Excelsa Ventures. “This technology will revolutionise traditional applications and open countless new opportunities for industry. We are confident MOF Technologies is the company that will lead this seismic shift in materials science.”
I’ve tagged this particular field of interest ‘machine/flesh’ because I find it more descriptive than ‘bio-hybrid system’ which was the term used in a Nov. 15, 2016 news item on phys.org,
One of the biggest challenges in cognitive or rehabilitation neurosciences is the ability to design a functional hybrid system that can connect and exchange information between biological systems, like neurons in the brain, and human-made electronic devices. A large multidisciplinary effort of researchers in Italy brought together physicists, chemists, biochemists, engineers, molecular biologists and physiologists to analyze the biocompatibility of the substrate used to connect these biological and human-made components, and investigate the functionality of the adhering cells, creating a living biohybrid system.
In an article appearing this week in AIP Advances, from AIP Publishing, the research team used the interaction between light and matter to investigate the material properties at the molecular level using Raman spectroscopy, a technique that, until now, has been principally applied to material science. Thanks to the coupling of the Raman spectrometer with a microscope, spectroscopy becomes a useful tool for investigating micro-objects such as cells and tissues. Raman spectroscopy presents clear advantages for this type of investigation: The molecular composition and the modification of subcellular compartments can be obtained in label-free conditions with non-invasive methods and under physiological conditions, allowing the investigation of a large variety of biological processes both in vitro and in vivo.
Once the biocompatibility of the substrate was analyzed and the functionality of the adhering cells investigated, the next part of this puzzle is connecting with the electronic component. In this case a memristor was used.
“Its name reveals its peculiarity (MEMory ResISTOR): it has a sort of ‘memory.’ Depending on the amount of voltage that has been applied to it in the past, it is able to vary its resistance, because of a change of its microscopic physical properties,” said Silvia Caponi, a physicist at the Italian National Research Council in Rome. By combining memristors, it is possible to create pathways within electrical circuits that work similarly to natural synapses, which develop variable weights in their connections to reproduce the adaptive/learning mechanism. Layers of organic polymers, like polyaniline (PANI), a semiconducting polymer, also have memristive properties, allowing them to work directly with biological materials in a hybrid bio-electronic system.
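The "memory" Caponi describes can be sketched with a standard linear-drift memristor model. This is a generic textbook-style toy, not a model of the PANI devices in the study, and every parameter value below is an illustrative assumption; the point is simply that the device's resistance depends on the history of voltage applied to it.

```python
# Toy memristor (linear-drift style model). All parameters are
# illustrative assumptions, not measurements of the PANI devices
# discussed in the article.

def simulate(voltages, dt=1e-3, r_on=100.0, r_off=16000.0, k=1e4):
    """Integrate an internal state w in [0, 1]; the resistance
    interpolates between r_on (fully 'doped') and r_off ('undoped').
    The state drifts with the charge that has flowed through the device,
    so resistance encodes the voltage history."""
    w = 0.5                                  # internal state (the "memory")
    resistances = []
    for v in voltages:
        r = r_on * w + r_off * (1.0 - w)     # current resistance
        i = v / r                            # Ohm's law at this instant
        w = min(1.0, max(0.0, w + k * i * dt))  # state drifts with charge
        resistances.append(r)
    return resistances

# Positive bias drives the resistance down; negative bias drives it back up.
down = simulate([1.0] * 200)
up = simulate([-1.0] * 200)
print(round(down[0]), round(down[-1]), round(up[-1]))
```

Used as a synapse analogue, that history-dependent resistance plays the role of a variable connection weight: repeated "activity" (applied voltage) strengthens or weakens the pathway, which is the adaptive mechanism the passage above describes.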
“We applied the analysis on a hybrid bio-inspired device but in a prospective view, this work provides the proof of concept of an integrated study able to analyse the status of living cells in a large variety of applications that merges nanosciences, neurosciences and bioelectronics,” said Caponi. A natural long-term objective of this work would be interfacing machines and nervous systems as seamlessly as possible.
The multidisciplinary team is ready to build on this proof of principle to realize the potential of memristor networks.
“Once assured the biocompatibility of the materials on which neurons grow,” said Caponi, “we want to define the materials and their functionalization procedures to find the best configuration for the neuron-memristor interface to deliver a full working hybrid bio-memristive system.”
Caption: These are immunofluorescence analyses of SH-SY5Y cells treated for 5 days with 10 µM retinoic acid and for the next 3 days with 50 ng/ml BDNF. The DAPI fluorescence stain is blue and beta-tubulin is green. Credit: Caponi, et al.