A 3-D view of a hyperbranched nanoparticle with complex structure, made possible by Tomviz 1.0, a new open-source software platform developed by researchers at the University of Michigan, Cornell University and Kitware Inc. Image credit: Robert Hovden, Michigan Engineering
Now it’s possible for anyone to see and share 3-D nanoscale imagery with a new open-source software platform developed by researchers at the University of Michigan, Cornell University and open-source software company Kitware Inc.
Tomviz 1.0 is the first open-source tool that enables researchers to easily create 3-D images from electron tomography data, then share and manipulate those images in a single platform.
The world of nanoscale materials—things 100 nanometers and smaller—is an important place for scientists and engineers who are designing the stuff of the future: semiconductors, metal alloys and other advanced materials.
Seeing in 3-D how nanoscale flecks of platinum arrange themselves in a car’s catalytic converter, for example, or how spiky dendrites can cause short circuits inside lithium-ion batteries, could spur advances like safer, longer-lasting batteries; lighter, more fuel efficient cars; and more powerful computers.
“3-D nanoscale imagery is useful in a variety of fields, including the auto industry, semiconductors and even geology,” said Robert Hovden, U-M assistant professor of materials science and engineering and one of the creators of the program. “Now you don’t have to be a tomography expert to work with these images in a meaningful way.”
Tomviz solves a key challenge: the difficulty of interpreting data from the electron microscopes that examine nanoscale objects in 3-D. The machines shoot electron beams through nanoparticles from different angles. The beams form projections as they travel through the object, a bit like nanoscale shadow puppets.
Once the machine does its work, it’s up to researchers to piece hundreds of shadows into a single three-dimensional image. It’s as difficult as it sounds—an art as well as a science. Like staining a traditional microscope slide, researchers often add shading or color to 3-D images to highlight certain attributes.
A 3-D view of a particle used in a hydrogen fuel cell powered vehicle. The gray structure is carbon; the red and blue particles are nanoscale flecks of platinum. The image is made possible by Tomviz 1.0. Image credit: Elliot Padget, Cornell University

Traditionally, researchers have had to rely on a hodgepodge of proprietary software to do the heavy lifting. The work is expensive and time-consuming; so much so that even big companies like automakers struggle with it. And once a 3-D image is created, it’s often impossible for other researchers to reproduce it or to share it with others.
Tomviz dramatically simplifies the process and reduces the amount of time and computing power needed to make it happen, its designers say. It also enables researchers to readily collaborate by sharing all the steps that went into creating a given image and enabling them to make tweaks of their own.
“These images are far different from the 3-D graphics you’d see at a movie theater, which are essentially cleverly lit surfaces,” Hovden said. “Tomviz explores both the surface and the interior of a nanoscale object, with detailed information about its density and structure. In some cases, we can see individual atoms.”
Key to making Tomviz happen was getting tomography experts and software developers together to collaborate, Hovden said. Their first challenge was gaining access to a large volume of high-quality tomography data. The team rallied experts at Cornell, Berkeley Lab and UCLA to contribute their data, and also created their own using U-M’s microscopy center. To turn the raw data and algorithms into working software, Hovden’s team worked with open-source software maker Kitware.
With the release of Tomviz 1.0, Hovden is looking toward the next stages of the project, where he hopes to integrate the software directly with microscopes. He believes that U-M’s atom probe tomography facilities and expertise could help him design a version that could ultimately uncover the chemistry of all atoms in 3-D.
“We are unlocking access to see new 3D nanomaterials that will power the next generation of technology,” Hovden said. “I’m very interested in pushing the boundaries of understanding materials in 3-D.”
What is the effect of topical curcumin gel on burns and scalds? In a recent research paper, published in the open access journal BioDiscovery, Dr. Madalene Heng, Clinical Professor of Dermatology at the David Geffen School of Medicine, stresses that use of topical curcumin gel for treating skin problems, like burns and scalds, is very different from taking curcumin tablets by mouth for other conditions, and appears to work more effectively.
“Curcumin gel appears to work much better when used on the skin because the gel preparation allows curcumin to penetrate the skin, inhibit phosphorylase kinase and reduce inflammation,” explains Dr Heng.
In this report, use of curcumin after burns and scalds was found to reduce the severity of the injury, lessen pain and inflammation, and improve healing with less than expected scarring, or even no scarring, of the affected skin. Dr. Heng reports her experience using curcumin gel on such injuries using three examples of patients treated after burns and scalds, and provides a detailed explanation of why topical curcumin may work on such injuries.
Curcumin is an ingredient found in the common spice turmeric. Turmeric has been used as a spice for centuries in many Eastern countries and gives well known dishes, such as curry, their typical yellow-gold color. The spice has also been used for cosmetic and medical purposes for just as long in these countries.
In recent years, the medicinal value of curcumin has been the subject of intense scientific study, with publications numbering in the thousands, looking into the possible beneficial effects of this natural product on many kinds of afflictions in humans.
The published study reports that topical curcumin gel applied soon after mild to moderate burns and scalds appears to be remarkably effective in relieving symptoms and improving healing of the affected skin.
“When taken by mouth, curcumin is very poorly absorbed into the body, and may not work as well,” notes Dr. Heng. “Nonetheless, our tests have shown that when the substance is used in a topical gel, the effect is notable.”
The author of the study believes that the effectiveness of curcumin gel on the skin – or topical curcumin – is related to its potent anti-inflammatory activity. Based on studies that she has done both in the laboratory and in patients over 25 years, the key to curcumin’s effectiveness on burns and scalds is that it is a natural inhibitor of an enzyme called phosphorylase kinase.
This enzyme in humans has many important functions, including its involvement in wound healing. Wound healing is the vital process that enables healing of tissues after injury. The process goes through a sequence of acute and chronic inflammatory events, during which there is redness, swelling, pain and then healing, often with scarring in the case of burns and scalds of the skin. The sequence is started by the release of phosphorylase kinase about five minutes after injury, which activates over 200 genes that are involved in wound healing.
Dr. Heng uses curcumin gel for burns, scalds and other skin conditions as complementary treatment, in addition to standard treatment usually recommended for such conditions.
Caption: These are results five days after application of curcumin gel to burns, and results after six weeks. Credit: Dr. Madalene Heng
Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel/Berkeley Lab
The image of the iron-platinum nanoparticle (referenced in the headline) reminds me of foetal ultrasound images. A Feb. 1, 2017 news item on ScienceDaily tells us more,
In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA [University of California at Los Angeles] physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.
The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level.
Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.
“No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.
Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.
By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.
“For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.
The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.
“Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”
The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.
“This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.
In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.
Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.
That means that radiation-sensitive objects can be imaged with lower doses of radiation.
The US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory issued their own Feb. 1, 2017 news release (also on EurekAlert) about the work with a focus on how their equipment made this breakthrough possible (it repeats a little of the info. from the UCLA news release),
Scientists used one of the world’s most powerful electron microscopes to map the precise location and chemical type of 23,000 atoms in an extremely small particle made of iron and platinum.
The 3-D reconstruction reveals the arrangement of atoms in unprecedented detail, enabling the scientists to measure chemical order and disorder in individual grains, which sheds light on the material’s properties at the single-atom level. Insights gained from the particle’s structure could lead to new ways to improve its magnetic performance for use in high-density, next-generation hard drives.
What’s more, the technique used to create the reconstruction, atomic electron tomography (which is like an incredibly high-resolution CT scan), lays the foundation for precisely mapping the atomic composition of other useful nanoparticles. This could reveal how to optimize the particles for more efficient catalysts, stronger materials, and disease-detecting fluorescent tags.
Microscopy data was obtained and analyzed by scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) at the Molecular Foundry, in collaboration with Foundry users from UCLA, Oak Ridge National Laboratory, and the United Kingdom’s University of Birmingham. …
Atoms are the building blocks of matter, and the patterns in which they’re arranged dictate a material’s properties. These patterns can also be exploited to greatly improve a material’s function, which is why scientists are eager to determine the 3-D structure of nanoparticles at the smallest scale possible.
“Our research is a big step in this direction. We can now take a snapshot that shows the positions of all the atoms in a nanoparticle at a specific point in its growth. This will help us learn how nanoparticles grow atom by atom, and it sets the stage for a materials-design approach starting from the smallest building blocks,” says Mary Scott, who conducted the research while she was a Foundry user, and who is now a staff scientist. Scott and fellow Foundry scientists Peter Ercius and Colin Ophus developed the method in close collaboration with Jianwei Miao, a UCLA professor of physics and astronomy.
Their nanoparticle reconstruction builds on an achievement they reported last year in which they measured the coordinates of more than 3,000 atoms in a tungsten needle to a precision of 19 trillionths of a meter (19 picometers), which is many times smaller than a hydrogen atom. Now, they’ve taken the same precision, added the ability to distinguish different elements, and scaled up the reconstruction to include tens of thousands of atoms.
Importantly, their method maps the position of each atom in a single, unique nanoparticle. In contrast, X-ray crystallography and cryo-electron microscopy plot the average position of atoms from many identical samples. These methods make assumptions about the arrangement of atoms, which isn’t a good fit for nanoparticles because no two are alike.
“We need to determine the location and type of each atom to truly understand how a nanoparticle functions at the atomic scale,” says Ercius.
A TEAM approach
The scientists’ latest accomplishment hinged on the use of one of the highest-resolution transmission electron microscopes in the world, called TEAM I. It’s located at the National Center for Electron Microscopy, which is a Molecular Foundry facility. The microscope scans a sample with a focused beam of electrons, and then measures how the electrons interact with the atoms in the sample. It also has a piezo-controlled stage that positions samples with unmatched stability and position-control accuracy.
The researchers began growing an iron-platinum nanoparticle from its constituent elements, and then stopped the particle’s growth before it was fully formed. They placed the “partially baked” particle in the TEAM I stage, obtained a 2-D projection of its atomic structure, rotated it a few degrees, obtained another projection, and so on. Each 2-D projection provides a little more information about the full 3-D structure of the nanoparticle.
They sent the projections to Miao at UCLA, who used a sophisticated computer algorithm to convert the 2-D projections into a 3-D reconstruction of the particle. The individual atomic coordinates and chemical types were then traced from the 3-D density based on the knowledge that iron atoms are lighter than platinum atoms. The resulting atomic structure contains 6,569 iron atoms and 16,627 platinum atoms, with each atom’s coordinates precisely plotted to less than the width of a hydrogen atom.
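The principle behind turning a stack of 2-D projections back into a 3-D object can be illustrated in miniature. The toy sketch below (my own illustration, not the GENFIRE code) uses algebraic reconstruction, the Kaczmarz method, to recover a tiny 3×3 “density” grid from its line sums along four directions; real electron tomography does the analogous thing in 3-D with hundreds of projections and far more sophisticated algorithms. The phantom values and ray geometry are invented for illustration.

```python
# Toy algebraic reconstruction (ART / Kaczmarz): recover a 3x3 density
# grid from its line sums (projections) along rows, columns, and both
# diagonals, mimicking in miniature the 2-D-projections-to-3-D-object step.

N = 3
phantom = [1.0, 0.0, 2.0,
           0.0, 3.0, 0.0,
           4.0, 0.0, 5.0]   # "true" object (invented values)

# Each ray is the list of pixel indices it passes through.
rays = []
rays += [[r * N + c for c in range(N)] for r in range(N)]      # rows
rays += [[r * N + c for r in range(N)] for c in range(N)]      # columns
for k in range(-(N - 1), N):                                   # diagonals r - c = k
    rays.append([r * N + c for r in range(N) for c in range(N) if r - c == k])
for s in range(2 * N - 1):                                     # diagonals r + c = s
    rays.append([r * N + c for r in range(N) for c in range(N) if r + c == s])

# The "measured" projections: line sums through the phantom.
measured = [sum(phantom[i] for i in ray) for ray in rays]

# Kaczmarz sweeps: project the estimate onto each ray's constraint in turn.
x = [0.0] * (N * N)
for _ in range(1000):
    for ray, p in zip(rays, measured):
        residual = (p - sum(x[i] for i in ray)) / len(ray)
        for i in ray:
            x[i] += residual

print(max(abs(a - b) for a, b in zip(x, phantom)))  # reconstruction error, should be tiny
```

With these four projection directions the 3×3 system is uniquely determined, so the iteration converges to the original grid; with too few directions (as with too few microscope tilt angles) the reconstruction is ambiguous, which is why reducing the number of required projections, as GENFIRE does, is valuable.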
Translating the data into scientific insights
Interesting features emerged at this extreme scale after Molecular Foundry scientists used code they developed to analyze the atomic structure. For example, the analysis revealed chemical order and disorder in interlocking grains, in which the iron and platinum atoms are arranged in different patterns. This has large implications for how the particle grew and its real-world magnetic properties. The analysis also revealed single-atom defects and the width of disordered boundaries between grains, which was not previously possible for complex 3-D boundaries.
“The important materials science problem we are tackling is how this material transforms from a highly randomized structure, what we call a chemically-disordered structure, into a regular highly-ordered structure with the desired magnetic properties,” says Ophus.
To explore how the various arrangements of atoms affect the nanoparticle’s magnetic properties, scientists from DOE’s Oak Ridge National Laboratory ran computer calculations on the Titan supercomputer at ORNL, using the coordinates and chemical type of each atom, to simulate the nanoparticle’s behavior in a magnetic field. This allowed the scientists to see patterns of atoms that are very magnetic, which is ideal for hard drives. They also saw patterns with poor magnetic properties that could sap a hard drive’s performance.
“This could help scientists learn how to steer the growth of iron-platinum nanoparticles so they develop more highly magnetic patterns of atoms,” says Ercius.
Adds Scott, “More broadly, the imaging technique will shed light on the nucleation and growth of ordered phases within nanoparticles, which isn’t fully theoretically understood but is critically important to several scientific disciplines and technologies.”
The folks at the Berkeley Lab have created a video (notice where the still image from the beginning of this post appears),
The Oak Ridge National Laboratory (ORNL), not wanting to be left out, has been mentioned in a Feb. 3, 2017 news item on ScienceDaily,
… researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE’s Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.
“These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles,” said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.
Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL’s Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab’s Molecular Foundry to combine world-class experimental data with world-class computing to do something new: simulate magnetism atom by atom in a real nanoparticle.
Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.
Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.
“In this case, researchers were able to get the precise 3-D structure for a real particle,” Eisenbach said. “The UCLA group has developed a new experimental technique where they can tell where the atoms are–the coordinates–and the chemical resolution, or what they are — iron or platinum.”
The ORNL news release goes on to describe the work from the perspective of the people who ran the supercomputer simulations,
A Supercomputing Milestone
Magnetism at the atomic level is driven by quantum mechanics — a fact that has shaken up classical physics calculations and called for increasingly complex, first-principle calculations, or calculations working forward from fundamental physics equations rather than relying on assumptions that reduce computational workload.
For magnetic recording and storage devices, researchers are particularly interested in magnetic anisotropy, or what direction magnetism favors in an atom.
“If the anisotropy is too weak, a bit written to the nanoparticle might flip at room temperature,” Kent said.
To solve for magnetic anisotropy, Eisenbach and Kent used two computational codes to compare and validate results.
To simulate a supercell of about 1,300 atoms from strongly magnetic regions of the 23,000-atom nanoparticle, they used the Linear Scaling Multiple Scattering (LSMS) code, a first-principles density functional theory code developed at ORNL.
“The LSMS code was developed for large magnetic systems and can tackle lots of atoms,” Kent said.
As principal investigator on 2017, 2016 and previous INCITE program awards, Eisenbach has scaled the LSMS code to Titan for a range of magnetic materials projects, and the in-house code has been optimized for Titan’s accelerated architecture, speeding up calculations more than eightfold on the machine’s GPUs. Exceptionally capable of crunching large magnetic systems quickly, the LSMS code received the Association for Computing Machinery’s Gordon Bell Prize for high-performance computing achievement in 1998 and 2009, and developments continue to enhance the code for new architectures.
Working with Renat Sabirianov at the University of Nebraska at Omaha, the team also ran VASP, a simulation package that is better suited for smaller atom counts, to simulate regions of about 32 atoms.
“With both approaches, we were able to confirm that the local VASP results were consistent with the LSMS results, so we have a high confidence in the simulations,” Eisenbach said.
Computer simulations revealed that grain boundaries have a strong effect on magnetism. “We found that the magnetic anisotropy energy suddenly transitions at the grain boundaries. These magnetic properties are very important,” Miao said.
In the future, researchers hope that advances in computing and simulation will make a full-particle simulation possible — as first-principles calculations are currently too intensive to solve small-scale magnetism for regions larger than a few thousand atoms.
Also, future simulations like these could show how different fabrication processes, such as the temperature at which nanoparticles are formed, influence magnetism and performance.
“There’s a hope going forward that one would be able to use these techniques to look at nanoparticle growth and understand how to optimize growth for performance,” Kent said.
Finally, here’s a link to and a citation for the paper,
The hit Disney movie “Moana” features stunning visual effects, including the animation of water to such a degree that it becomes a distinct character in the film. Courtesy of Walt Disney Animation Studios
Few people think to marvel over the mathematics when watching an animated feature, but without mathematicians the artists would not be able to achieve their artistic goals, as a Jan. 4, 2017 news item on phys.org makes clear (Note: A link has been removed),
UCLA [University of California at Los Angeles] mathematics professor Joseph Teran, a Walt Disney consultant on animated movies since 2007, is under no illusion that artists want lengthy mathematics lessons, but many of them realize that the success of animated movies often depends on advanced mathematics.
“In general, the animators and artists at the studios want as little to do with mathematics and physics as possible, but the demands for realism in animated movies are so high,” Teran said. “Things are going to look fake if you don’t at least start with the correct physics and mathematics for many materials, such as water and snow. If the physics and mathematics are not simulated accurately, it will be very glaring that something is wrong with the animation of the material.”
Teran and his research team have helped infuse realism into several Disney movies, including “Frozen,” where they used science to animate snow scenes. Most recently, they applied their knowledge of math, physics and computer science to enliven the new 3-D computer-animated hit, “Moana,” a tale about an adventurous teenage girl who is drawn to the ocean and is inspired to leave the safety of her island on a daring journey to save her people.
Alexey Stomakhin, a former UCLA doctoral student of Teran’s and Andrea Bertozzi’s, played an important role in the making of “Moana.” After earning his Ph.D. in applied mathematics in 2013, he became a senior software engineer at Walt Disney Animation Studios. Working with Disney’s effects artists, technical directors and software developers, Stomakhin led the development of the code that was used to simulate the movement of water in “Moana,” enabling it to play a role as one of the characters in the film.
“The increased demand for realism and complexity in animated movies makes it preferable to get assistance from computers; this means we have to simulate the movement of the ocean surface and how the water splashes, for example, to make it look believable,” Stomakhin explained. “There is a lot of mathematics, physics and computer science under the hood. That’s what we do.”
“Moana” has been praised for its stunning visual effects in words the mathematicians love hearing. “Everything in the movie looks almost real, so the movement of the water has to look real too, and it does,” Teran said. “’Moana’ has the best water effects I’ve ever seen, by far.”
Stomakhin said his job is fun and “super-interesting, especially when we cheat physics and step beyond physics. It’s almost like building your own universe with your own laws of physics and trying to simulate that universe.
“Disney movies are about magic, so magical things happen which do not exist in the real world,” said the software engineer. “It’s our job to add some extra forces and other tricks to help create those effects. If you have an understanding of how the real physical laws work, you can push parameters beyond physical limits and change equations slightly; we can predict the consequences of that.”
To make animated movies these days, movie studios need to solve, or nearly solve, partial differential equations. Stomakhin, Teran and their colleagues build the code that solves the partial differential equations. More accurately, they write algorithms that closely approximate the partial differential equations because they cannot be solved perfectly. “We try to come up with new algorithms that have the highest-quality metrics in all possible categories, including preserving angular momentum perfectly and preserving energy perfectly. Many algorithms don’t have these properties,” Teran said.
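Teran’s point about conserving energy can be seen in the simplest possible system. The sketch below (my own illustration, not Disney’s production code) integrates a frictionless oscillator two ways: plain explicit Euler steadily injects energy, so a simulated splash would visibly “blow up,” while the semi-implicit (symplectic) variant keeps energy bounded, the kind of structural property a well-designed algorithm preserves.

```python
# Harmonic oscillator x'' = -x, energy E = 0.5*(v**2 + x**2).
# Compare explicit Euler (energy grows every step) with semi-implicit
# "symplectic" Euler (energy error stays bounded).

def energy(x, v):
    return 0.5 * (v * v + x * x)

dt, steps = 0.1, 1000

# Explicit Euler: x and v both updated from the OLD state.
x, v = 1.0, 0.0
for _ in range(steps):
    x, v = x + dt * v, v - dt * x
e_explicit = energy(x, v)

# Semi-implicit (symplectic) Euler: update v first, then x with the NEW v.
x, v = 1.0, 0.0
for _ in range(steps):
    v = v - dt * x
    x = x + dt * v
e_symplectic = energy(x, v)

e0 = energy(1.0, 0.0)
print(e_explicit / e0)    # grows without bound as steps increase
print(e_symplectic / e0)  # stays close to 1; energy merely oscillates
```

The only difference between the two loops is the order of the updates, yet one method injects energy every step while the other has a conserved “shadow” energy close to the true one; production fluid solvers chase the same kind of guarantee for momentum and energy.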
Stomakhin was also involved in creating the ocean’s crashing waves that have to break at a certain place and time. That task required him to get creative with physics and use other tricks. “You don’t allow physics to completely guide it,” he said. “You allow the wave to break only when it needs to break.”
Depicting boats on waves posed additional challenges for the scientists.
“It’s easy to simulate a boat traveling through a static lake, but a boat on waves is much more challenging to simulate,” Stomakhin said. “We simulated the fluid around the boat; the challenge was to blend that fluid with the rest of the ocean. It can’t look like the boat is splashing in a little swimming pool — the blend needs to be seamless.”
Stomakhin spent more than a year developing the code and understanding the physics that allowed him to achieve this effect.
“It’s nice to see the great visual effect, something you couldn’t have achieved if you hadn’t designed the algorithm to solve physics accurately,” said Teran, who has taught an undergraduate course on scientific computing for the visual-effects industry.
While Teran loves spectacular visual effects, he said the research has many other scientific applications as well. It could be used to simulate plasmas, simulate 3-D printing or for surgical simulation, for example. Teran is using a related algorithm to build virtual livers to substitute for the animal livers that surgeons train on. He is also using the algorithm to study traumatic leg injuries.
Teran describes the work with Disney as “bread-and-butter, high-performance computing for simulating materials, as mechanical engineers and physicists at national laboratories would. Simulating water for a movie is not so different, but there are, of course, small tweaks to make the water visually compelling. We don’t have a separate branch of research for computer graphics. We create new algorithms that work for simulating wide ranges of materials.”
Teran, Stomakhin and three other applied mathematicians — Chenfanfu Jiang, Craig Schroeder and Andrew Selle — also developed a state-of-the-art simulation method for fluids in graphics, called APIC, based on months of calculations. It allows for better realism and stunning visual results. Jiang is a UCLA postdoctoral scholar in Teran’s laboratory, who won a 2015 UCLA best dissertation prize. Schroeder is a former UCLA postdoctoral scholar who worked with Teran and is now at UC Riverside. Selle, who worked at Walt Disney Animation Studios, is now at Google.
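The core idea of APIC is that each particle carries not just a velocity but also a local affine (linearly varying) description of the velocity field, so transfers between particles and the simulation grid lose far less information than classic particle-in-cell transfers. The sketch below is my own heavily simplified 1-D illustration of that transfer idea, with one particle, no forces and invented numbers; the published method works in 2-D/3-D with mass-weighted transfers over many particles.

```python
import math

# Minimal 1-D APIC-style transfer sketch: the particle carries velocity
# v_p plus an affine coefficient c_p (the local velocity gradient).
# A particle -> grid -> particle round trip then reproduces a linear
# velocity field exactly, which plain PIC transfers cannot do.

h = 1.0                      # grid spacing; nodes sit at integer positions
x_p = 3.37                   # particle position (arbitrary)
a, b = 0.5, 2.0              # invented linear field v(x) = a + b*x
v_p, c_p = a + b * x_p, b    # particle samples the field and its gradient

# Quadratic B-spline weights over the 3 nearest grid nodes.
base = math.floor(x_p / h - 0.5)
fx = x_p / h - base
w = [0.5 * (1.5 - fx) ** 2,
     0.75 - (fx - 1.0) ** 2,
     0.5 * (fx - 0.5) ** 2]
nodes = [(base + i) * h for i in range(3)]

# Particle -> grid: each node sees the affinely extrapolated velocity.
v_grid = [v_p + c_p * (xi - x_p) for xi in nodes]

# Grid -> particle: weighted velocity, plus the affine coefficient
# recovered from the weighted first moment (D is an inertia-like term).
v_back = sum(wi * vi for wi, vi in zip(w, v_grid))
D = sum(wi * (xi - x_p) ** 2 for wi, xi in zip(w, nodes))
c_back = sum(wi * vi * (xi - x_p)
             for wi, vi, xi in zip(w, v_grid, nodes)) / D

print(v_back, c_back)  # matches v_p and c_p: the linear field survives intact
```

A plain PIC transfer would keep only the weighted average velocity and discard the gradient, smearing out rotation and shear with every step; carrying the affine term is what lets APIC stay stable without that excessive dissipation.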
Their newest version of APIC has been accepted for publication by the peer-reviewed Journal of Computational Physics.
“Alexey is using ideas from high-performance computing to make movies,” Teran said, “and we are contributing to the scientific community by improving the algorithm.”
Unfortunately, the paper does not seem to have been published early online so I cannot offer a link.
Final comment, it would have been interesting to have had a comment from one of the film’s artists or animators included in the article but it may not have been possible due to time or space constraints.
Two Oct. 31, 2016 news items on Nanowerk signal the impending sunset date for the European Union’s Sustainable Nanotechnologies (SUN) project. The first Oct. 31, 2016 news item on Nanowerk describes the project’s latest achievements,
The results from the 3rd SUN annual meeting showed great advancement of the project. The meeting was held in Edinburgh, Scotland, UK on 4-5 October 2016 where the project partners presented the results obtained during the second reporting period of the project.
SUN is a three and a half year EU project, running from 2013 to 2017, with a budget of about €14 million. Its main goal is to evaluate the risks along the supply chain of engineered nanomaterials and incorporate the results into tools and guidelines for sustainable manufacturing.
The ultimate goal of the SUN Project is the development of an online software Decision Support System (SUNDS) aimed at estimating and managing occupational, consumer, environmental and public health risks from nanomaterials in real industrial products along their lifecycles. The SUNDS beta prototype was released in October 2015, and since then the main focus has been on refining the methodologies and testing them on selected case studies, i.e. nano-copper oxide-based wood-preserving paint and nano-sized colourants (organic pigment and carbon black) for plastic car parts. The results obtained and the open issues were discussed during the third annual meeting in order to collect feedback from the consortium, which will inform, over the next months, the implementation of the final version of the SUNDS software system, due by March 2017.
Significant interest has been paid to the results obtained in WP2 (Lifecycle Thinking), whose main objectives are to assess the environmental impacts arising from each life cycle stage of the SUN case studies (i.e. Nano-WC-Cobalt (tungsten carbide-cobalt) sintered ceramics, nano-copper wood preservatives, carbon nanotubes (CNT) in plastics, silicon dioxide (SiO2) as a food additive, nano-titanium dioxide (TiO2) air filter systems, organic pigment in plastics and nanosilver (Ag) in textiles) and compare them to conventional products with similar uses and functionality, in order to develop and validate criteria and guiding principles for green nano-manufacturing. Specifically, the consortium partner COLOROBBIA CONSULTING S.r.l. expressed its willingness to exploit the results obtained from the life cycle assessment analysis of nano-TiO2 in its industrial applications.
On October 6, discussions about the SUNDS advancement continued during a Stakeholder Workshop, where representatives from the industry, regulatory and insurance sectors shared their feedback on the use of the decision support system. The recommendations collected during the workshop will be used to further refine the final version of the software, which will be released by March 2017.
The project has designed its final events to serve as an effective platform to communicate the main results achieved in its course within the Nanosafety community and bridge them to a wider audience addressing the emerging risks of Key Enabling Technologies (KETs).
Jointly organized by the Society for Risk Analysis (SRA) and the SUN Project, the SRA Policy Forum will address current efforts put towards refining the risk governance of emerging technologies through the integration of traditional risk analytic tools alongside considerations of social and economic concerns. The parallel sessions will be organized in 4 tracks: Risk analysis of engineered nanomaterials along product lifecycle, Risks and benefits of emerging technologies used in medical applications, Challenges of governing SynBio and Biotech, and Methods and tools for risk governance.
The SRA Policy Forum has announced its speakers and preliminary Programme. Confirmed speakers include:
Keld Alstrup Jensen (National Research Centre for the Working Environment, Denmark)
Elke Anklam (European Commission, Belgium)
Adam Arkin (University of California, Berkeley, USA)
Phil Demokritou (Harvard University, USA)
Gerard Escher (École polytechnique fédérale de Lausanne, Switzerland)
Lisa Friedersdorf (National Nanotechnology Initiative, USA)
James Lambert (President, Society for Risk Analysis, USA)
Andre Nel (The University of California, Los Angeles, USA)
Bernd Nowack (EMPA, Switzerland)
Ortwin Renn (University of Stuttgart, Germany)
Vicki Stone (Heriot-Watt University, UK)
Theo Vermeire (National Institute for Public Health and the Environment (RIVM), Netherlands)
Tom van Teunenbroek (Ministry of Infrastructure and Environment, The Netherlands)
Wendel Wohlleben (BASF, Germany)
The New Tools and Approaches for Nanomaterial Safety Assessment (NMSA) conference aims at presenting the main results achieved in the course of the organizing projects, fostering a discussion about their impact on the nanosafety field and possibilities for future research programmes. The conference welcomes consortium partners, as well as representatives from other EU projects, industry, government, civil society and the media. Accordingly, the conference topics include: Hazard assessment along the life cycle of nano-enabled products, Exposure assessment along the life cycle of nano-enabled products, Risk assessment & management, Systems biology approaches in nanosafety, Categorization & grouping of nanomaterials, Nanosafety infrastructure, and Safe by design. The NMSA conference keynote speakers include:
Harri Alenius (University of Helsinki, Finland)
Antonio Marcomini (Ca’ Foscari University of Venice, Italy)
Wendel Wohlleben (BASF, Germany)
Danail Hristozov (Ca’ Foscari University of Venice, Italy)
Eva Valsami-Jones (University of Birmingham, UK)
Socorro Vázquez-Campos (LEITAT Technological Center, Spain)
Barry Hardy (Douglas Connect GmbH, Switzerland)
Egon Willighagen (Maastricht University, Netherlands)
Nina Jeliazkova (IDEAconsult Ltd., Bulgaria)
Haralambos Sarimveis (The National Technical University of Athens, Greece)
During the SUN-caLIBRAte Stakeholder workshop, the final version of the SUN user-friendly, software-based Decision Support System (SUNDS) for managing the environmental, economic and social impacts of nanotechnologies will be presented and discussed with its end users: industry, regulatory and insurance sector representatives. The results from the discussion will be used as a foundation for the development of caLIBRAte’s Risk Governance framework for the assessment and management of human and environmental risks of manufactured nanomaterials (MN) and MN-enabled products.
The SRA Policy Forum: Risk Governance for Key Enabling Technologies and the New Tools and Approaches for Nanomaterial Safety Assessment conference are now open for registration. Abstracts for the SRA Policy Forum can be submitted till 15th November 2016.
For further information, go to www.sra.org/riskgovernanceforum2017 or http://www.nmsaconference.eu/
It never occurred to me that someone might want a wearable microscope but, apparently, there is a need. From a Sept. 27, 2016 news item on phys.org,
UCLA [University of California at Los Angeles] researchers working with a team at Verily Life Sciences have designed a mobile microscope that can detect and monitor fluorescent biomarkers inside the skin with a high level of sensitivity, an important tool in tracking various biochemical reactions for medical diagnostics and therapy.
This new system weighs less than one-tenth of a pound, making it small and light enough for a person to wear around their bicep, among other parts of the body. In the future, technology like this could be used for continuous patient monitoring at home or in point-of-care settings.
The research, which was published in the journal ACS Nano, was led by Aydogan Ozcan, UCLA’s Chancellor’s Professor of Electrical Engineering and Bioengineering and associate director of the California NanoSystems Institute and Vasiliki Demas of Verily Life Sciences (formerly Google Life Sciences).
Fluorescent biomarkers are routinely used for cancer detection and drug delivery and release among other medical therapies. Recently, biocompatible fluorescent dyes have emerged, creating new opportunities for noninvasive sensing and measuring of biomarkers through the skin.
However, detecting artificially added fluorescent objects under the skin is challenging. Collagen, melanin and other biological structures emit natural light in a process called autofluorescence. Various methods have been tried to investigate this problem using different sensing systems. Most are quite expensive and difficult to make small and cost-effective enough to be used in a wearable imaging system.
To test the mobile microscope, researchers first designed a tissue phantom — an artificially created material that mimics human skin optical properties, such as autofluorescence, absorption and scattering. The target fluorescent dye solution was injected into a micro-well with a volume of about one-hundredth of a microliter, thinner than a human hair, and subsequently implanted into the tissue phantom half a millimeter to 2 millimeters from the surface — which would be deep enough to reach blood and other tissue fluids in practice.
To measure the fluorescent dye, the wearable microscope created by Ozcan and his team used a laser to hit the skin at an angle. The fluorescent image at the surface of the skin was captured via the wearable microscope. The image was then uploaded to a computer where it was processed using a custom-designed algorithm, digitally separating the target fluorescent signal from the autofluorescence of the skin, at a very sensitive parts-per-billion level of detection.
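The custom algorithm itself isn’t described in the article, but a common way to frame the problem is to model each captured frame as a scaled autofluorescence background plus the target signal, then subtract the best-fitting background. The sketch below is a hypothetical least-squares stand-in for illustration, not the published method:

```python
import numpy as np

def separate_fluorescence(measured, background):
    """Illustrative separation of a target fluorescent signal from a
    known autofluorescence background image.

    Models each pixel as  a * background + target, solves for the single
    scale factor a by least squares, and returns the residual as the
    estimated target signal (clipped at zero, since intensity is
    non-negative).
    """
    b = background.ravel()
    m = measured.ravel()
    a = (b @ m) / (b @ b)  # least-squares background scale
    target = measured - a * background
    return np.clip(target, 0.0, None), a
```

A real system would estimate the background from dye-free reference measurements and work per wavelength channel; this stand-in only shows the general shape of the computation.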
“We can place various tiny bio-sensors inside the skin next to each other, and through our imaging system, we can tell them apart,” Ozcan said. “We can monitor all these embedded sensors inside the skin in parallel, even understand potential misalignments of the wearable imager and correct it to continuously quantify a panel of biomarkers.”
This computational imaging framework might also be used in the future to continuously monitor various chronic diseases through the skin using an implantable or injectable fluorescent dye.
As more nanotechnology-enabled products make their way to the market and concerns rise regarding safety, scientists work to find better ways of assessing and predicting the safety of these materials, from an Aug. 13, 2016 news item on Nanowerk,
UCLA [University of California at Los Angeles] researchers have designed a laboratory test that uses microchip technology to predict how potentially hazardous nanomaterials could be.
According to UCLA professor Huan Meng, certain engineered nanomaterials, such as non-purified carbon nanotubes that are used to strengthen commercial products, could have the potential to injure the lungs if inhaled during the manufacturing process. The new test he helped develop could be used to analyze the extent of the potential hazard.
The same test could also be used to identify biomarkers that can help scientists and doctors detect cancer and infectious diseases. Currently, scientists identify those biomarkers using other tests; one of the most common is called enzyme-linked immunosorbent assay, or ELISA. But the new platform, which is called semiconductor electronic label-free assay, or SELFA, costs less and is faster and more accurate, according to research published in the journal Scientific Reports.
The study was led by Meng, a UCLA assistant adjunct professor of medicine, and Chi On Chui, a UCLA associate professor of electrical engineering and bioengineering.
ELISA has been used by scientists for decades to analyze biological samples — for example, to detect whether epithelial cells in the lungs that have been exposed to nanomaterials are inflamed. But ELISA must be performed in a laboratory setting by skilled technicians, and a single test can cost roughly $700 and take five to seven days to process.
In contrast, SELFA uses microchip technology to analyze samples. The test can take between 30 minutes and two hours and, according to the UCLA researchers, could cost just a few dollars per sample when high-volume production begins.
The SELFA chip contains a T-shaped nanowire that acts as an integrated sensor and amplifier. To analyze a sample, scientists place it on a sensor on the chip. The vertical part of the T-shaped nanowire converts the current from the molecule being analyzed, and the horizontal portion amplifies that signal to distinguish the molecule from others.
The use of the T-shaped nanowires created in Chui’s lab is a new application of a patented UCLA invention developed by Chui and his colleagues. It also marks the first time that “lab-on-a-chip” analysis has been tested in a scenario that mimics a real-life situation.
The UCLA scientists exposed cultured lung cells to different nanomaterials and then compared their results using SELFA with results in a database of previous studies that used other testing methods.
“By measuring biomarker concentrations in the cell culture, we showed that SELFA was 100 times more sensitive than ELISA,” Meng said. “This means that not only can SELFA analyze much smaller sample sizes, but also that it can minimize false-positive test results.”
Chui said, “The results are significant because SELFA measurement allows us to predict the inflammatory potential of a range of nanomaterials inside cells and validate the prediction with cellular imaging and experiments in animals’ lungs.”
To study certain aspects of cells, researchers need the ability to take the innards out, manipulate them, and put them back. Options for this kind of work are limited, but researchers reporting May 10 in Cell Metabolism describe a “nanoblade” that can slice through a cell’s membrane to insert mitochondria. The researchers have previously used this technology to transfer other materials between cells and hope to commercialize the nanoblade for wider use in bioengineering.
Caption: This diagram illustrates the process of transferring mitochondria between cells using the nanoblade technology. Credit: Alexander N. Patananan Courtesy UCLA
“As a new tool for cell engineering, to truly engineer cells for health purposes and research, I think this is very unique,” says Mike Teitell, a pathologist and bioengineer at the University of California, Los Angeles (UCLA). “We haven’t run into anything so far, up to a few microns in size, that we can’t deliver.”
Teitell and Pei-Yu “Eric” Chiou, also a bioengineer at UCLA, first conceived the idea of a nanoblade several years ago to transfer a nucleus from one cell to another. However, they soon delved into the intersection of stem cell biology and energy metabolism, where the technology could be used to manipulate a cell’s mitochondria. Studying the effects of mutations in the mitochondrial genome, which can cause debilitating or fatal diseases in humans, is tricky for a number of reasons.
“There’s a bottleneck in the field for modifying a cell’s mitochondrial DNA,” says Teitell. “So we are working on a two-step process: edit the mitochondrial genome outside of a cell, and then take those manipulated mitochondria and put them back into the cell. We’re still working on the first step, but we’ve solved that second one quite well.”
The nanoblade apparatus consists of a microscope, laser, and titanium-coated micropipette to act as the “blade,” operated using a joystick controller. When a laser pulse strikes the titanium, the metal heats up, vaporizing the surrounding water layers in the culture media and forming a bubble next to a cell. Within a microsecond, the bubble expands, generating a local force that punctures the cell membrane and creates a passageway several microns long that the “cargo”–in this case, mitochondria–can be pushed through. The cell then rapidly repairs the membrane defect.
Teitell, Chiou, and their team used the nanoblade to insert tagged mitochondria from human breast cancer cells and embryonic kidney cells into cells without mitochondrial DNA. When they sequenced the nuclear and mitochondrial DNA afterwards, the researchers saw that the mitochondria had been successfully transferred and replicated by 2% of the cells, with a range of functionality. Other methods of mitochondrial transfer are hard to control, and when they have been reported to work, the success rates have been only 0.0001%-0.5% according to the researchers.
“The success of the mitochondrial transfer was very encouraging,” says Chiou. “The most exciting application for the nanoblade, to me, is in the study of mitochondria and infectious diseases. This technology brings new capabilities to help advance these fields.”
The team’s aspirations also go well beyond mitochondria, and they’ve already scaled up the nanoblade apparatus into an automated high-throughput version. “We want to make a platform that’s easy to use for everyone and allow researchers to devise anything they can think of a few microns or smaller that would be helpful for their research–whether that’s inserting antibodies, pathogens, synthetic materials, or something else that we haven’t imagined,” says Teitell. “It would be very cool to allow people to do something that they can’t do right now.”
The pipette being used is measured at the microscale but it’s called a nanoblade? Well, perhaps the tip or the edge of the pipette is measured at the nanoscale.
Getting back to the research, here’s a link to and a citation for the paper,
Connecting with people at a shopping mall on the topic of science and technology can be surprisingly effective. I once managed to convince the powers-that-be in a technology company where I was employed to participate in a mall event for Canada’s National Science and Technology Week (every October). The initial skepticism evaporated after an hour at the mall and an almost continuous stream of visitors eager to learn about data communications. Sadly, Canada’s National Science and Technology Week programme no longer funds those kinds of events, which I think is a missed opportunity for Canadians.
Californians, on the other hand, have an opportunity to meet University of California at Los Angeles (UCLA) nanoscientists at the mall this April and May (2016) following a successful first event on Feb. 20, 2016, according to a March 8, 2016 news item on Nanotechnology Now,
A precocious 6-year-old, Spencer Reisner already has an ambitious “to do” list for his future: become an astronaut and go to Mars, create new fuel sources and learn more about nanotechnology. Recently, he achieved one of these lifelong objectives at an unlikely venue: an L.A. shopping mall.
Jia Chen, education director at the California NanoSystems Institute at UCLA, explains to a crowd of bystanders at the Promenade mall that atoms, while very small, can form large objects in a variety of shapes. Graduate student Pascal Krotee pours out a solution to demonstrate how water can be purified by using a filter made from nanomaterials.
On Feb. 20, the Promenade at Howard Hughes Center became more than just a shopping bazaar for kids and parents looking to buy the latest cool sneakers. Volunteer scientists, graduate students and staff from the California NanoSystems Institute (CNSI) at UCLA set up a booth there to demystify nanoscience in fun ways. It’s a tough subject that’s not well understood by the general public, isn’t even in the Merriam-Webster Dictionary yet and may even sound a bit scary to them.
But not to Spencer. He seemed to take in every word and eagerly participated in simple tabletop demonstrations of nanoscience in action with Jia Chen, education director at the institute. “My favorite things were how fiber optics work and how things repel water,” said the boy after getting his picture taken wearing a white lab coat and protective goggles like a real scientist. “I want to go to UCLA!”
“This is 10 times greater than we thought it would be,” said Spencer’s mother, Frankie Drayus Reisner, of the event. She was especially impressed by the way the UCLA scientists answered basic questions without making people feel foolish or stupid. During the three hours the booth was open, crowds of adults and children gathered around to assemble mock atomic and molecular structures, experiment with water-repellant surfaces and learn about the ubiquitous impact nanoscience has on their daily lives in ways they never realized before.
Nanoscience at the Mall, a UCLA project funded by the American Physical Society, was an idea that came to Chen and his colleague Sarah Tolbert, professor and faculty director of CNSI outreach, after they found out that the average American visits a shopping mall for four hours weekly. That’s enough time, they figured, to engage shoppers in a fun conversation about nanoscience, the study of materials on an atomic or molecular scale.
Just the experience of meeting a genuine nanoscientist in a neighborhood shopping mall helps make this science seem less remote and less esoteric. “People aren’t expecting to encounter UCLA nanoscientists at the mall,” Chen said. Offering the public a convenient new venue where they can talk to a working scientist and recognize how science is relevant to their personal lives is a prime goal of the program, Chen explained.
He observed how people respond to this. “They are immediately fascinated by the fun atmosphere and become comfortable enough to dive right in with questions and comments,“ he said. “They quickly learn what ‘nano’ means (one-billionth part of something) and how it impacts their lives. And they can have a conversation over coffee with the scientists who make these discoveries.
“We hope this leads to a greater curiosity and greater understanding of nanoscience, its benefits to society and why supporting its advancement is important,” he said.
While CNSI also has educational programs geared to connect with middle and high school students and teachers, reaching the adult population, whose opinions could have a much greater influence on public policy, is more difficult. So to test the effectiveness of the mall booth as a learning tool, shoppers who stopped by, like Stephen Schieneman, were asked to answer questions on an e-tablet.
“I learned something about nanoscience today,” said Schieneman, a Scout master and fifth-grade teacher who brought his Cub Scout troop to the booth after learning about it in a local newspaper. “It was interesting finding out there were so many nanotechnologies already on the market and out in the environment.”
The news release also provides information about upcoming UCLA Nanoscience at the Mall events,
If you’re interested in learning more about the subject, the UCLA nanoscientists are planning to be at the Westfield Culver City Mall on Saturday, April 2, from 1-4 p.m. They will be back at the Promenade April 16 and May 21 from 3-6 p.m.
An Oct. 8, 2015 news item on Nanowerk offers some context for why researchers at the University of California at Los Angeles (UCLA) are studying silver nanoparticles and their entry into the water system,
More than 2,000 consumer products today contain nanoparticles — particles so small that they are measured in billionths of a meter.
Manufacturers use nanoparticles to help sunscreen work better against the sun’s rays and to make athletic apparel better at wicking moisture away from the body, among many other purposes.
Of those products, 462 — ranging from toothpaste to yoga mats — contain nanoparticles made from silver, which are used for their ability to kill bacteria. But that benefit might be coming at a cost to the environment. In many cases, simply using the products as intended causes silver nanoparticles to wind up in rivers and other bodies of water, where they can be ingested by fish and interact with other marine life.
For scientists, a key question has been to what extent organisms retain those particles and what effects they might have.
I’d like to know where they got those numbers “… 2,000 consumer products …” and “… 462 — ranging from toothpaste to yoga mats — contain nanoparticles made from silver… .”
A new study by the University of California Center for Environmental Implications of Nanotechnology has found that smaller silver nanoparticles were more likely to enter fish’s bodies, and that they persisted longer than larger silver nanoparticles or fluid silver nitrate. The study, published online in the journal ACS Nano, was led by UCLA postdoctoral scholars Olivia Osborne and Sijie Lin, and Andre Nel, director of UCLA’s Center for Environmental Implications of Nanotechnology and associate director of the California NanoSystems Institute at UCLA.
Nel said that although it is not yet known whether silver nanoparticles are harmful, the research team wanted to first identify whether they were even being absorbed by fish. CEIN, which is funded by the National Science Foundation, is focused on studying the effects of nanotechnology on the environment.
In the study, researchers placed zebrafish in water that contained fluid silver nitrate and two sizes of silver nanoparticles — some measuring 20 nanometers in diameter and others 110 nanometers. Although the difference in size between these two particles is so minute that it can only be seen using high-powered transmission electron microscopes, the researchers found that the two sizes of particles affected the fish very differently.
The researchers used zebrafish in the study because they have some genetic similarities to humans and because their embryos and larvae are transparent, which makes them easier to observe. In addition, zebrafish tend to absorb chemicals and other substances from water.
Osborne said the team focused its research on the fish’s gills and intestines because they are the organs most susceptible to silver exposure.
“The gills showed a significantly higher silver content for the 20-nanometer than the 110-nanometer particles, while the values were more similar in the intestines,” she said, adding that both sizes of the silver particles were retained in the intestines even after the fish spent seven days in clean water. “The most interesting revelation was that the difference in size of only 90 nanometers made such a striking difference in the particles’ demeanor in the gills and intestines.”
The experiment was one of the most comprehensive in vivo studies to date on silver nanoparticles, as well as the first to compare silver nanoparticle toxicity by extent of organ penetration and duration with different-sized particles, and the first to demonstrate a mechanism for the differences.
Osborne said the results seem to indicate that smaller particles penetrated deeper into the fishes’ organs and stayed there longer because they dissolve faster than the larger particles and are more readily absorbed by the fish.
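That size dependence is consistent with simple geometry: a sphere’s surface-area-to-volume ratio is 3/r, so smaller particles expose proportionally more silver to the surrounding water. A back-of-the-envelope comparison of the two particle sizes in the study (my arithmetic, not the researchers’):

```python
def surface_to_volume_ratio(diameter_nm):
    """Surface-to-volume ratio of a sphere: (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r."""
    radius = diameter_nm / 2.0
    return 3.0 / radius

small = surface_to_volume_ratio(20.0)   # the smaller particles in the study
large = surface_to_volume_ratio(110.0)  # the larger ones
print(f"20 nm particles expose {small / large:.1f}x more surface per unit volume")
```

More exposed surface per unit of silver means faster dissolution, which fits the observation that the 20-nanometer particles were absorbed more readily.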
Lin said the results indicate that companies using silver nanoparticles have to strike a balance that recognizes their benefits and their potential as a pollutant. Using slightly larger nanoparticles might help make them somewhat safer, for example, but it also might make the products in which they’re used less effective.
He added that data from the study could be translated to understand how other nanoparticles could be used in more environmentally sustainable ways.
Nel said the team’s next step is to determine whether silver particles are potentially harmful. “Our research will continue in earnest to determine what the long-term effects of this exposure can be,” he said.