
Data storytelling in libraries

I had no idea there was such enthusiasm for data storytelling, but it seems libraries want a toolkit for the topic. From an August 30, 2022 University of Illinois School of Information Sciences news release (also on EurekAlert), Note: A link has been removed,

A new project led by Associate Professor Kate McDowell and Assistant Professor Matthew Turk of the School of Information Sciences (iSchool) at the University of Illinois Urbana-Champaign will help libraries tell data stories that connect with their audiences. Their project, “Data Storytelling Toolkit for Librarians,” has received a two-year, $99,330 grant from the Institute of Museum and Library Services (IMLS grant RE-250094-OLS-21), under the Laura Bush 21st Century Librarian Program, which supports innovative research by untenured, tenure-track faculty.

“There are thousands of librarians who are skittish about data but love stories,” explained McDowell, who co-teaches a data storytelling course at the iSchool with Turk. “And there are hundreds of librarians who see data as fundamental, but until those librarians have a language through which to connect with the passions of the thousands who love stories, this movement toward strategic data use in the field of libraries will be stifled, along with the potential collaborative creativity of librarians.”

The data storytelling toolkit will provide a set of easy-to-adapt templates, which librarians can use to move quickly from data to story to storytelling. Librarians will be able to use the toolkit to plug in data they already have and generate data visualization and narrative structure options.

“To give an example, public libraries need to communicate employment impact. In this case, the data story will include who has become employed based on library services, how (journey map showing a visual sequence of steps from job seeking to employment), a structure for the story of an individual’s outcomes, and a strong data visualization strategy for communicating this impact,” said McDowell.

According to the researchers, the toolkit will be clearly defined so that librarians understand the potential for communicating with data but also fully adaptable to each librarian’s setting and to the communication needs inside the organization and with the public. The project will focus on community college and public libraries, with initial collaborators to include Ericson Public Library in Boone, Iowa; Oregon City (OR) Public Library; Moraine Valley Community College in Palos Hills, Illinois; Jackson State Community College in Jackson, Tennessee; and The Urbana Free Library.

McDowell’s storytelling research has involved training collaborations with advancement staff both at the University of Illinois Urbana-Champaign and the University of Illinois system; storytelling consulting work for multiple nonprofits including the 50th anniversary of the statewide Prairie Rivers Network that protects Illinois water; and storytelling lectures for the Consortium of Academic and Research Libraries in Illinois (CARLI). McDowell researches and publishes in the areas of storytelling at work, social justice storytelling, and what library storytelling can teach the information sciences about data storytelling. She holds both an MS and PhD in library and information science from Illinois.

Turk also holds an appointment with the Department of Astronomy in the College of Liberal Arts and Sciences at the University of Illinois. His research focuses on how individuals interact with data and how that data is processed and understood. He is a recipient of the prestigious Gordon and Betty Moore Foundation’s Moore Investigator Award in Data-Driven Discovery. Turk holds a PhD in physics from Stanford University.

I found some earlier information about a data storytelling course taught by the two researchers, from a September 25, 2019 University of Illinois School of Information Sciences news release, which provides some additional insight,

Collecting and understanding data is important, but equally important is the ability to tell meaningful stories based on data. Students in the iSchool’s Data Science Storytelling course (IS 590DST) learn data visualization as well as storytelling techniques, a combination that will prove valuable to their employers as they enter the workforce.

The course instructors, Associate Professor and Interim Associate Dean for Academic Affairs Kate McDowell and Assistant Professor Matthew Turk, introduced Data Science Storytelling in fall 2017. The course combines McDowell’s research interests in storytelling practices and applications and Turk’s research interests in data analysis and visualization.

Students in the course learn storytelling concepts, narrative theories, and performance techniques as well as how to develop stories in a collaborative workshop style. They also work with data visualization toolkits, which involves some knowledge of coding.

Ashley Hetrick (MS ’18) took Data Science Storytelling because she wanted “the skills to be able to tell the right story when the time is right for it.” She appreciated the practical approach, which allowed the students to immediately apply the skills they learned, such as developing a story structure and using a pandas DataFrame to support and build a story. Hetrick is using those skills in her current work as assistant director for research data engagement and education at the University of Illinois.

“I combine tools and methods from data science and analytics with storytelling to make sense of my unit’s data and to help researchers make sense of theirs,” she said. “In my experience, few researchers like data for its own sake. They collect, care for, and analyze data because they’re after what all storytellers are after: meaning. They want to find the signal in all of this noise. And they want others to find it too, perhaps long after their own careers are complete. Each dataset is a story and raw material for stories waiting to be told.”

According to Turk, the students who have enrolled in the course have been outstanding, “always finding ways to tell meaningful stories from data.” He hopes they leave the class with an understanding that stories permeate their lives and that shaping the stories they tell others and about others is a responsibility they carry with them.

“One reason that this course means a lot to me is because it gives students the opportunity to really bring together the different threads of study at the iSchool,” Turk said. “It’s a way to combine across levels of technicality, and it gives students permission to take a holistic approach to how they present data.”
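Since Hetrick mentions using a pandas DataFrame to support and build a story, here's a minimal sketch of what that might look like. The data and column names are entirely invented (a hypothetical library job-search program, echoing McDowell's employment-impact example), and this is my own illustration, not the toolkit's actual templates.

```python
import pandas as pd

# Hypothetical program data: each row is one patron who used the
# library's job-search services (all names and columns are invented).
df = pd.DataFrame({
    "patron":   ["A", "B", "C", "D", "E", "F"],
    "service":  ["resume help", "resume help", "computer class",
                 "computer class", "interview prep", "resume help"],
    "employed": [True, False, True, True, True, False],
})

# From data to story: a single headline number ...
headline = df["employed"].mean()  # share of job-seeking patrons now employed

# ... and a breakdown suggesting which service to feature in the narrative.
by_service = df.groupby("service")["employed"].mean().sort_values(ascending=False)

print(f"{headline:.0%} of job-seeking patrons found work")
print(by_service)
```

The point of the exercise is the pivot from a table of rows to a headline and a protagonist, which is roughly the data-to-story-to-storytelling move the toolkit description promises.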

I didn’t put much effort into searching but did find three other courses on data storytelling: one at the University of Texas (my favourite), one at the University of Toronto, and one (Data Visualization and Storytelling) at the University of British Columbia. The University of British Columbia course is offered through the business school; the other two are offered through information/library science faculties.

Sonifying the protein folding process

A sonification and animation of a state machine based on a simple lattice model used by Martin Gruebele to teach concepts of protein-folding dynamics. First posted January 25, 2022 on YouTube.

A February 17, 2022 news item on ScienceDaily announces the work featured in the animation above,

Musicians are helping scientists analyze data, teach protein folding and make new discoveries through sound.

A team of researchers at the University of Illinois Urbana-Champaign is using sonification — the use of sound to convey information — to depict biochemical processes and better understand how they happen.

Music professor and composer Stephen Andrew Taylor; chemistry professor and biophysicist Martin Gruebele; and Illinois music and computer science alumna, composer and software designer Carla Scaletti formed the Biophysics Sonification Group, which has been meeting weekly on Zoom since the beginning of the pandemic. The group has experimented with using sonification in Gruebele’s research into the physical mechanisms of protein folding, and its work recently allowed Gruebele to make a new discovery about the ways a protein can fold.

A February 17, 2022 University of Illinois at Urbana-Champaign news release (also on EurekAlert), which originated the news item, describes how the group sonifies and animates the protein folding process (Note: Links have been removed),

Taylor’s musical compositions have long been influenced by science, and recent works represent scientific data and biological processes. Gruebele also is a musician who built his own pipe organ that he plays and uses to compose music. The idea of working together on sonification struck a chord with them, and they’ve been collaborating for several years. Through her company, Symbolic Sound Corp., Scaletti develops a digital audio software and hardware sound design system called Kyma that is used by many musicians and researchers, including Taylor.

Scaletti created an animated visualization paired with sound that illustrated a simplified protein-folding process, and Gruebele and Taylor used it to introduce key concepts of the process to students and gauge whether it helped with their understanding. They found that sonification complemented and reinforced the visualizations and that, even for experts, it helped increase intuition for how proteins fold and misfold over time. The Biophysics Sonification Group – which also includes chemistry professor Taras Pogorelov, former chemistry graduate student (now alumna) Meredith Rickard, composer and pipe organist Franz Danksagmüller of the Lübeck Academy of Music in Germany, and Illinois electrical and computer engineering alumnus Kurt Hebel of Symbolic Sound – described using sonification in teaching in the Journal of Chemical Education.

Gruebele and his research team use supercomputers to run simulations of proteins folding into a specific structure, a process that relies on a complex pattern of many interactions. The simulation reveals the multiple pathways the proteins take as they fold, and also shows when they misfold or get stuck in the wrong shape – something thought to be related to a number of diseases such as Alzheimer’s and Parkinson’s.

The researchers use the simulation data to gain insight into the process. Nearly all data analysis is done visually, Gruebele said, but massive amounts of data generated by the computer simulations – representing hundreds of thousands of variables and millions of moments in time – can be very difficult to visualize.

“In digital audio, everything is a stream of numbers, so actually it’s quite natural to take a stream of numbers and listen to it as if it’s a digital recording,” Scaletti said. “You can hear things that you wouldn’t see if you looked at a list of numbers and you also wouldn’t see if you looked at an animation. There’s so much going on that there could be something that’s hidden, but you could bring it out with sound.”

For example, when the protein folds, it is surrounded by water molecules that are critical to the process. Gruebele said he wants to know when a water molecule touches and solvates a protein, but “there are 50,000 water molecules moving around, and only one or two are doing a critical thing. It’s impossible to see.” However, if a splashy sound occurred every time a water molecule touched a specific amino acid, that would be easy to hear.

Taylor and Scaletti use various audio-mapping techniques to link aspects of proteins to sound parameters such as pitch, timbre, loudness and pan position. For example, Taylor’s work uses different pitches and instruments to represent each unique amino acid, as well as their hydrophobic or hydrophilic qualities.

“I’ve been trying to draw on our instinctive responses to sound as much as possible,” Taylor said. “Beethoven said, ‘The deeper the stream, the deeper the tone.’ We expect an elephant to make a low sound because it’s big, and we expect a sparrow to make a high sound because it’s small. Certain kinds of mappings are built into us. As much as possible, we can take advantage of those and that helps to communicate more effectively.”

The highly developed instincts of musicians help in creating the best tool to use sound to convey information, Taylor said.

“It’s a new way of showing how music and sound can help us understand the world. Musicians have an important role to play,” he said. “It’s helped me become a better musician, in thinking about sound in different ways and thinking how sound can link to the world in different ways, even the world of the very small.”
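To make the parameter-mapping idea concrete, here's a toy sketch: a stream of numbers (say, a folding-coordinate trace) mapped linearly to pitch, with one short sine tone rendered per value. This is my own illustration using only the Python standard library; the group's actual work uses Scaletti's Kyma system, and the mapping choices here are assumptions.

```python
import math
import struct
import wave

RATE = 22050  # samples per second

def sonify(values, lo=220.0, hi=880.0, dur=0.2, path="data.wav"):
    """Map each data value to a pitch between lo and hi Hz and render
    one short sine tone per value (a toy parameter mapping, not the
    Kyma system described in the article)."""
    vmin, vmax = min(values), max(values)
    span = (vmax - vmin) or 1.0
    samples = []
    for v in values:
        freq = lo + (hi - lo) * (v - vmin) / span  # linear value -> pitch map
        for n in range(int(RATE * dur)):
            samples.append(math.sin(2 * math.pi * freq * n / RATE))
    with wave.open(path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))
    return len(samples)

n = sonify([0.1, 0.5, 0.9, 0.3])  # e.g. four points from a simulation trace
```

Event-based mappings, like the "splashy sound" for a water molecule touching an amino acid, would trigger a tone only when a condition fires rather than for every value, but the principle of turning a stream of numbers into audible parameters is the same.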

Here’s a link to and a citation for the paper,

Sonification-Enhanced Lattice Model Animations for Teaching the Protein Folding Reaction by Carla Scaletti, Meredith M. Rickard, Kurt J. Hebel, Taras V. Pogorelov, Stephen A. Taylor, and Martin Gruebele. J. Chem. Educ. 2022, XXXX, XXX, XXX-XXX DOI: https://doi.org/10.1021/acs.jchemed.1c00857 Publication Date:February 16, 2022 © 2022 American Chemical Society and Division of Chemical Education, Inc.

This paper is behind a paywall.

For more about sonification and proteins, there’s my March 31, 2022 posting, Classical music makes protein songs easier listening.

Internet of living things (IoLT)?

It’s not here yet, but there are scientists working on an internet of living things (IoLT). There are some details (see the fourth paragraph from the bottom of the news release excerpt) about how an IoLT would be achieved, but it seems these are early days. From a September 9, 2021 University of Illinois news release (also on EurekAlert), Note: Links have been removed,

The National Science Foundation (NSF) announced today an investment of $25 million to launch the Center for Research on Programmable Plant Systems (CROPPS). The center, a partnership among the University of Illinois at Urbana-Champaign, Cornell University, the Boyce Thompson Institute, and the University of Arizona, aims to develop tools to listen and talk to plants and their associated organisms.

“CROPPS will create systems where plants communicate their hidden biology to sensors, optimizing plant growth to the local environment. This Internet of Living Things (IoLT) will enable breakthrough discoveries, offer new educational opportunities, and open transformative opportunities for productive, sustainable, and profitable management of crops,” says Steve Moose (BSD/CABBI/GEGC), the grant’s principal investigator at Illinois. Moose is a genomics professor in the Department of Crop Sciences, part of the College of Agricultural, Consumer and Environmental Sciences (ACES). 

As an example of what’s possible, CROPPS scientists could deploy armies of autonomous rovers to monitor and modify crop growth in real time. The researchers created leaf sensors to report on belowground processes in roots. This combination of machine and living sensors will enable completely new ways of decoding the language of plants, allowing researchers to teach plants how to better handle environmental challenges. 

“Right now, we’re working to program a circuit that responds to low-nitrogen stress, where the plant growth rate is ‘slowed down’ to give farmers more time to apply fertilizer during the window that is the most efficient at increasing yield,” Moose explains.

With 150+ years of global leadership in crop sciences and agricultural engineering, along with newer transdisciplinary research units such as the National Center for Supercomputing Applications (NCSA) and the Center for Digital Agriculture (CDA), Illinois is uniquely positioned to take on the technical challenges associated with CROPPS.

But U of I scientists aren’t working alone. For years, they’ve collaborated with partner institutions to conceptualize the future of digital agriculture and bring it into reality. For example, researchers at Illinois’ CDA and Cornell’s Initiative for Digital Agriculture jointly proposed the first IoLT for agriculture, laying the foundation for CROPPS.

“CROPPS represents a significant win from having worked closely with our partners at Cornell and other institutions. We’re thrilled to move forward with our colleagues to shift paradigms in agriculture,” says Vikram Adve, Donald B. Gillies Professor in computer science at Illinois and co-director of the CDA.

CROPPS research may sound futuristic, and that’s the point.

The researchers say new tools are needed to make crops productive, flexible, and sustainable enough to feed our growing global population under a changing climate. Many of the tools under development – biotransducers small enough to fit between soil particles, dexterous and highly autonomous field robots, field-applied gene editing nanoparticles, IoLT clouds, and more – have been studied in the proof-of-concept phase, and are ready to be scaled up.

“One of the most exciting goals of CROPPS is to apply recent advances in sensing and data analytics to understand the rules of life, where plants have much to teach us. What we learn will bring a stronger biological dimension to the next phase of digital agriculture,” Moose says. 

CROPPS will also foster innovations in STEM [science, technology, engineering, and mathematics] education through programs that involve students at all levels, and each partner institution will share courses in digital agriculture topics. CROPPS also aims to engage professionals in digital agriculture at any career stage, and to learn how the public views innovations in this emerging technology area.

“Along with cutting-edge research, CROPPS coordinated educational programs will address the future of work in plant sciences and agriculture,” says Germán Bollero, associate dean for research in the College of ACES.

I look forward to hearing more about IoLT.

Creating multiferroic material at room temperature

A Sept. 23, 2016 news item on ScienceDaily describes some research from Cornell University (US),

Multiferroics — materials that exhibit both magnetic and electric order — are of interest for next-generation computing but difficult to create because the conditions conducive to each of those states are usually mutually exclusive. And in most multiferroics found to date, their respective properties emerge only at extremely low temperatures.

Two years ago, researchers in the labs of Darrell Schlom, the Herbert Fisk Johnson Professor of Industrial Chemistry in the Department of Materials Science and Engineering, and Dan Ralph, the F.R. Newman Professor in the College of Arts and Sciences, in collaboration with professor Ramamoorthy Ramesh at UC Berkeley, published a paper announcing a breakthrough in multiferroics involving the only known material in which magnetism can be controlled by applying an electric field at room temperature: the multiferroic bismuth ferrite.

Schlom’s group has partnered with David Muller and Craig Fennie, professors of applied and engineering physics, to take that research a step further: The researchers have combined two non-multiferroic materials, using the best attributes of both to create a new room-temperature multiferroic.

Their paper, “Atomically engineered ferroic layers yield a room-temperature magnetoelectric multiferroic,” was published — along with a companion News & Views piece — Sept. 22 [2016] in Nature. …

A Sept. 22, 2016 Cornell University news release by Tom Fleischman, which originated the news item, details more about the work (Note: A link has been removed),

The group engineered thin films of hexagonal lutetium iron oxide (LuFeO3), a material known to be a robust ferroelectric but not strongly magnetic. The LuFeO3 consists of alternating single monolayers of lutetium oxide and iron oxide, and differs from a strong ferrimagnetic oxide (LuFe2O4), which consists of alternating monolayers of lutetium oxide with double monolayers of iron oxide.

The researchers found, however, that they could combine these two materials at the atomic scale to create a new compound that was not only multiferroic but had better properties than either of the individual constituents. In particular, they found they needed to add just one extra monolayer of iron oxide to every 10 atomic repeats of the LuFeO3 to dramatically change the properties of the system.
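The stacking recipe described above can be sketched as a sequence of monolayers: LuFeO3 is alternating lutetium oxide and iron oxide monolayers, with one extra iron oxide monolayer inserted after every ten unit cells. This is a schematic labeling of my own, not an actual deposition protocol.

```python
def superlattice(repeats=10, blocks=2):
    """Generate a monolayer stacking sequence: `repeats` LuFeO3 unit
    cells (alternating Lu-O / Fe-O monolayers) followed by one extra
    Fe-O monolayer, which creates the LuFe2O4-like double iron oxide
    layer, repeated `blocks` times."""
    layers = []
    for _ in range(blocks):
        for _ in range(repeats):
            layers += ["Lu-O", "Fe-O"]
        layers.append("Fe-O")  # the extra iron oxide monolayer
    return layers

seq = superlattice()
# Every 21st monolayer boundary has two Fe-O layers back to back.
```

In molecular-beam epitaxy, each entry in such a sequence corresponds to one shutter cycle depositing a single atomic layer, which is what makes the "one extra monolayer in ten repeats" precision possible.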

That precision engineering was done via molecular-beam epitaxy (MBE), a specialty of the Schlom lab. A technique Schlom likens to “atomic spray painting,” MBE let the researchers design and assemble the two different materials in layers, a single atom at a time.

The combination of the two materials produced a strongly ferrimagnetic layer near room temperature. They then tested the new material at the Lawrence Berkeley National Laboratory (LBNL) Advanced Light Source in collaboration with co-author Ramesh to show that the ferrimagnetic atoms followed the alignment of their ferroelectric neighbors when switched by an electric field.

“It was when our collaborators at LBNL demonstrated electrical control of magnetism in the material that we made that things got super exciting,” Schlom said. “Room-temperature multiferroics are exceedingly rare and only multiferroics that enable electrical control of magnetism are relevant to applications.”

In electronic devices, the advantages of multiferroics include their reversible polarization in response to low-power electric fields – as opposed to heat-generating and power-sapping electrical currents – and their ability to hold their polarized state without the need for continuous power. High-performance memory chips make use of ferroelectric or ferromagnetic materials.

“Our work shows that an entirely different mechanism is active in this new material,” Schlom said, “giving us hope for even better – higher-temperature and stronger – multiferroics for the future.”

Collaborators hailed from the University of Illinois at Urbana-Champaign, the National Institute of Standards and Technology, the University of Michigan and Penn State University.

Here is a link and a citation to the paper and to a companion piece,

Atomically engineered ferroic layers yield a room-temperature magnetoelectric multiferroic by Julia A. Mundy, Charles M. Brooks, Megan E. Holtz, Jarrett A. Moyer, Hena Das, Alejandro F. Rébola, John T. Heron, James D. Clarkson, Steven M. Disseler, Zhiqi Liu, Alan Farhan, Rainer Held, Robert Hovden, Elliot Padgett, Qingyun Mao, Hanjong Paik, Rajiv Misra, Lena F. Kourkoutis, Elke Arenholz, Andreas Scholl, Julie A. Borchers, William D. Ratcliff, Ramamoorthy Ramesh, Craig J. Fennie, Peter Schiffer et al. Nature 537, 523–527 (22 September 2016) doi:10.1038/nature19343 Published online 21 September 2016

Condensed-matter physics: Multitasking materials from atomic templates by Manfred Fiebig. Nature 537, 499–500  (22 September 2016) doi:10.1038/537499a Published online 21 September 2016

Both the paper and its companion piece are behind a paywall.

Osmotic power: electricity generated with water, salt and a 3-atoms-thick membrane


EPFL researchers have developed a system that generates electricity from osmosis with unparalleled efficiency. Their work, featured in “Nature”, uses seawater, fresh water, and a new type of membrane just three atoms thick.

A July 13, 2016 news item on Nanowerk highlights research on osmotic power at École polytechnique fédérale de Lausanne (EPFL; Switzerland),

Proponents of clean energy will soon have a new source to add to their existing array of solar, wind, and hydropower: osmotic power. Or more specifically, energy generated by a natural phenomenon occurring when fresh water comes into contact with seawater through a membrane.

Researchers at EPFL’s Laboratory of Nanoscale Biology have developed an osmotic power generation system that delivers never-before-seen yields. Their innovation lies in a three-atom-thick membrane used to separate the two fluids. …

A July 14, 2016 EPFL press release (also on EurekAlert but published July 13, 2016), which originated the news item, describes the research,

The concept is fairly simple. A semipermeable membrane separates two fluids with different salt concentrations. Salt ions travel through the membrane until the salt concentrations in the two fluids reach equilibrium. That phenomenon is precisely osmosis.

If the system is used with seawater and fresh water, salt ions in the seawater pass through the membrane into the fresh water until both fluids have the same salt concentration. And since an ion is simply an atom with an electrical charge, the movement of the salt ions can be harnessed to generate electricity.
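As an aside, the voltage arising from a salt-concentration difference across an ion-selective membrane can be estimated with the textbook Nernst equation. This is standard electrochemistry, not a figure from the EPFL work, and the concentrations below are my own round numbers.

```python
import math

R = 8.314     # gas constant, J/(mol*K)
F = 96485.0   # Faraday constant, C/mol
T = 298.0     # room temperature, K

def nernst_potential(c_high, c_low, z=1):
    """Ideal membrane potential (volts) for an ion of charge z across
    a perfectly selective membrane -- a textbook estimate, not EPFL's
    measured value."""
    return (R * T) / (z * F) * math.log(c_high / c_low)

# Roughly 0.6 M seawater against 0.01 M river water:
v = nernst_potential(0.6, 0.01)  # on the order of 0.1 V
```

A tenth of a volt per membrane is small, which is why the current per pore and the number of pores per unit area matter so much for practical power generation.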

A 3-atom-thick selective membrane that does the job

EPFL’s system consists of two liquid-filled compartments separated by a thin membrane made of molybdenum disulfide. The membrane has a tiny hole, or nanopore, through which seawater ions pass into the fresh water until the two fluids’ salt concentrations are equal. As the ions pass through the nanopore, their electrons are transferred to an electrode – which is what is used to generate an electric current.

Thanks to its properties, the membrane allows positively charged ions to pass through, while pushing away most of the negatively charged ones. That creates a voltage between the two liquids as one builds up a positive charge and the other a negative charge. This voltage is what causes the current generated by the transfer of ions to flow.

“We had to first fabricate and then investigate the optimal size of the nanopore. If it’s too big, negative ions can pass through and the resulting voltage would be too low. If it’s too small, not enough ions can pass through and the current would be too weak,” said Jiandong Feng, lead author of the research.

What sets EPFL’s system apart is its membrane. In these types of systems, the current increases with a thinner membrane. And EPFL’s membrane is just a few atoms thick. The material it is made of – molybdenum disulfide – is ideal for generating an osmotic current. “This is the first time a two-dimensional material has been used for this type of application,” said Aleksandra Radenovic, head of the Laboratory of Nanoscale Biology.

Powering 50,000 energy-saving light bulbs with a 1 m² membrane

The potential of the new system is huge. According to their calculations, a 1 m² membrane with 30% of its surface covered by nanopores should be able to produce 1 MW of electricity – or enough to power 50,000 standard energy-saving light bulbs. And since molybdenum disulfide (MoS2) is easily found in nature or can be grown by chemical vapor deposition, the system could feasibly be ramped up for large-scale power generation. The major challenge in scaling up this process is finding out how to make relatively uniform pores.
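The release's numbers are easy to sanity-check with a little arithmetic of my own (not EPFL's):

```python
# 1 MW claimed for a 1 m^2 membrane, shared across 50,000 bulbs
claimed_power_w = 1_000_000.0
bulbs = 50_000
watts_per_bulb = claimed_power_w / bulbs  # 20 W each

# 20 W is a plausible draw for a compact-fluorescent "energy-saving"
# bulb, so the two headline figures are at least mutually consistent.
```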

Until now, researchers have worked on a membrane with a single nanopore, in order to understand precisely what was going on. “From an engineering perspective, a single-nanopore system is ideal for furthering our fundamental understanding of MoS2-based processes and provides useful information for industry-level commercialization,” said Jiandong Feng.

The researchers were able to run a nanotransistor from the current generated by a single nanopore and thus demonstrated a self-powered nanosystem. Low-power single-layer MoS2 transistors were fabricated in collaboration with Andras Kis’ team at EPFL, while molecular dynamics simulations were performed by collaborators at the University of Illinois at Urbana-Champaign.

Harnessing the potential of estuaries

EPFL’s research is part of a growing trend. For the past several years, scientists around the world have been developing systems that leverage osmotic power to create electricity. Pilot projects have sprung up in places such as Norway, the Netherlands, Japan, and the United States to generate energy at estuaries, where rivers flow into the sea. For now, the membranes used in most systems are organic and fragile, and deliver low yields. Some systems use the movement of water, rather than ions, to power turbines that in turn produce electricity.

Once the systems become more robust, osmotic power could play a major role in the generation of renewable energy. While solar panels require adequate sunlight and wind turbines adequate wind, osmotic energy can be produced just about any time of day or night – provided there’s an estuary nearby.

Here’s a link to and a citation for the paper,

Single-layer MoS2 nanopores as nanopower generators by Jiandong Feng, Michael Graf, Ke Liu, Dmitry Ovchinnikov, Dumitru Dumcenco, Mohammad Heiranian, Vishal Nandigana, Narayana R. Aluru, Andras Kis, & Aleksandra Radenovic. Nature (2016)  doi:10.1038/nature18593 Published online 13 July 2016

This paper is behind a paywall.

$1.4B for US National Nanotechnology Initiative (NNI) in 2017 budget

According to an April 1, 2016 news item on Nanowerk, the US National Nanotechnology Initiative (NNI) has released its 2017 budget supplement,

The President’s Budget for Fiscal Year 2017 provides $1.4 billion for the National Nanotechnology Initiative (NNI), affirming the important role that nanotechnology continues to play in the Administration’s innovation agenda.
Cumulatively totaling nearly $24 billion since the inception of the NNI in 2001, the President’s 2017 Budget supports nanoscale science, engineering, and technology R&D at 11 agencies.

Another 9 agencies have nanotechnology-related mission interests or regulatory responsibilities.

An April 1, 2016 NNI news release, which originated the news item, affirms the Obama administration’s commitment to the NNI and notes the supplement serves as an annual report amongst other functions,

Throughout its two terms, the Obama Administration has maintained strong fiscal support for the NNI and has implemented new programs and activities to engage the broader nanotechnology community to support the NNI’s vision that the ability to understand and control matter at the nanoscale will lead to new innovations that will improve our quality of life and benefit society.

This Budget Supplement documents progress of these participating agencies in addressing the goals and objectives of the NNI. It also serves as the Annual Report for the NNI called for under the provisions of the 21st Century Nanotechnology Research and Development Act of 2003 (Public Law 108-153, 15 USC §7501). The report also addresses the requirement for Department of Defense reporting on its nanotechnology investments, per 10 USC §2358.

For additional details and to view the full document, visit www.nano.gov/2017BudgetSupplement.

I don’t seem to have posted about the 2016 NNI budget allotment, but 2017’s $1.4B represents a drop of $100M from 2015’s $1.5B allotment.

The 2017 NNI budget supplement describes the NNI’s main focus,

Over the past year, the NNI participating agencies, the White House Office of Science and Technology Policy (OSTP), and the National Nanotechnology Coordination Office (NNCO) have been charting the future directions of the NNI, including putting greater focus on promoting commercialization and increasing education and outreach efforts to the broader nanotechnology community. As part of this effort, and in keeping with recommendations from the 2014 review of the NNI by the President’s Council of Advisors for Science and Technology, the NNI has been working to establish Nanotechnology-Inspired Grand Challenges, ambitious but achievable goals that will harness nanotechnology to solve National or global problems and that have the potential to capture the public’s imagination. Based upon inputs from NNI agencies and the broader community, the first Nanotechnology-Inspired Grand Challenge (for future computing) was announced by OSTP on October 20, 2015, calling for a collaborative effort to “create a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.” This Grand Challenge has generated broad interest within the nanotechnology community—not only NNI agencies, but also industry, technical societies, and private foundations—and planning is underway to address how the agencies and the community will work together to achieve this goal. Topics for additional Nanotechnology-Inspired Grand Challenges are under review.

Interestingly, it also offers an explanation of the images on its cover (Note: Links have been removed),


About the cover

Each year’s National Nanotechnology Initiative Supplement to the President’s Budget features cover images illustrating recent developments in nanotechnology stemming from NNI activities that have the potential to make major contributions to National priorities. The text below explains the significance of each of the featured images on this year’s cover.


Front cover featured images (above): Images illustrating three novel nanomedicine applications. Center: microneedle array for glucose-responsive insulin delivery imaged using fluorescence microscopy. This “smart insulin patch” is based on painless microneedles loaded with hypoxia-sensitive vesicles ~100 nm in diameter that release insulin in response to high glucose levels. Dr. Zhen Gu and colleagues at the University of North Carolina (UNC) at Chapel Hill and North Carolina State University have demonstrated that this patch effectively regulates the blood glucose of type 1 diabetic mice with faster response than current pH-sensitive formulations. The inset image on the lower right shows the structure of the nanovesicles; each microneedle contains more than 100 million of these vesicles. The research was supported by the American Diabetes Association, the State of North Carolina, the National Institutes of Health (NIH), and the National Science Foundation (NSF). Left: colorized rendering of a candidate universal flu vaccine nanoparticle. The vaccine molecule, developed at the NIH Vaccine Research Center, displays only the conserved part of the viral spike and stimulates the production of antibodies to fight against the ever-changing flu virus. The vaccine is engineered from a ~13 nm ferritin core (blue) combined with a 7 nm influenza antigen (green). Image credit: NIH National Institute of Allergy and Infectious Diseases (NIAID). Right: colorized scanning electron micrograph of Ebola virus particles on an infected VERO E6 cell. Blue represents individual Ebola virus particles. The image was produced by John Bernbaum and Jiro Wada at NIAID. When the Ebola outbreak struck in 2014, the Food and Drug Administration authorized emergency use of lateral flow immunoassays for Ebola detection that use gold nanoparticles for visual interpretation of the tests.


Back cover featured images (above): Images illustrating examples of NNI educational outreach activities. Center: Comic from the NSF/NNI competition Generation Nano: Small Science Superheroes. Illustration by Amina Khan, NSF. Left of Center: Polymer Nanocone Array (biomimetic of antimicrobial insect surface) by Kyle Nowlin, UNC-Greensboro, winner from the first cycle of the NNI’s student image contest, EnvisioNano. Right of Center: Gelatin Nanoparticles in Brain (nasal delivery of stroke medication to the brain) by Elizabeth Sawicki, University of Illinois at Urbana-Champaign, winner from the second cycle of EnvisioNano. Outside right: still photo from the video Chlorination-less (water treatment method using reusable nanodiamond powder) by Abelardo Colon and Jennifer Gill, University of Puerto Rico at Rio Piedras, the winning video from the NNI’s Student Video Contest. Outside left: Society of Emerging NanoTechnologies (SENT) student group at the University of Central Florida, one of the initial nodes in the developing U.S. Nano and Emerging Technologies Student Network; photo by Alexis Vilaboy.

Protecting Disney’s art with an artificial nose

Curators and conservators are acutely aware of how fragile artworks can be (see my Jan. 10, 2013 posting about a show where curators watched helplessly as daguerreotypes deteriorated), so this new technology from Disney is likely to excite a lot of interest. From a March 14, 2016 news item on phys.org,

Original drawings and sketches from Walt Disney Animation Studio’s more than 90-year history—from Steamboat Willie through Frozen—traveled internationally for the first time this summer. This gave conservators the rare opportunity to monitor the artwork with a new state-of-the-art sensor. A team of researchers report today that they developed and used a super-sensitive artificial “nose,” customized specifically to detect pollutants before they could irreversibly damage the artwork.

Here’s a sample of the art work,

Caption: To protect works of art, including this image of Disney’s Steamboat Willie, scientists developed an optoelectronic “nose” to sniff out potentially damaging compounds in pollution. Credit: Steamboat Willie, 1928 Animation cel and background © Disney Enterprises, Inc. Courtesy of Walt Disney Animation Research Library

A March 14, 2016 American Chemical Society (ACS) news release (also on EurekAlert), provides more detail,

The researchers report on their preservation efforts at the 251st National Meeting & Exposition of the American Chemical Society (ACS). ACS, the world’s largest scientific society, is holding the meeting here through Thursday. It features more than 12,500 presentations on a wide range of science topics.

“Many pollutants that are problematic for human beings are also problematic for works of art,” says Kenneth Suslick, Ph.D. For example, pollutants can spur oxidative damage and acid degradation that, in prints or canvases, lead to color changes or decomposition. “The ability to monitor how much pollution a drawing or painting is exposed to is an important element of art preservation,” he says.

However, works of art are susceptible to damage at far lower pollutant levels than what’s considered acceptable for humans. “The high sensitivity of artists’ materials makes a lot of sense for two reasons,” explains Suslick, who is at the University of Illinois at Urbana-Champaign. “Human beings are capable of healing, which, of course, works of art cannot do. Moreover, human beings have finite lifetimes, whereas ideally works of art should last for future generations.”

To protect valuable works of art from these effects, conservators enclose vulnerable pieces in sealed display cases. But even then, some artists’ materials may “exhale” reactive compounds that accumulate in the cases and damage the art. To counter the accumulation of pollutants, conservators often hide sorbent materials inside display cases that scrub potentially damaging compounds from the enclosed environment. But it is difficult to know precisely when to replace the sorbents.

Suslick, a self-proclaimed “museum hound,” figured he might have an answer. He had already invented an optoelectronic nose — an array of dyes that change color when exposed to various compounds. But it is used largely for biomedical purposes, and it can’t sniff out the low concentrations of pollutants that damage works of art. To redesign the nose with the aim of protecting artwork, he approached scientists at the Getty Conservation Institute (GCI), a private non-profit institution in Los Angeles that works internationally to advance art conservation practice. He proposed that his team devise a sensor several hundred times more sensitive than existing devices used for cultural heritage research. The collaboration took off, and the scientists built a keener nose.

At the time, GCI was involved in a research project with the Walt Disney Animation Research Library to investigate the impact of storage environment on their animation cels, which are transparent sheets that artists drew or painted on before computer animation was developed. Such research ultimately could help extend the life of this important collection. The new sensors would monitor levels of acetic acid and other compounds that emanate from these sheets.

Before the exhibit, “Drawn from Life: The Art of Disney Animation Studios,” hit the road on tour, Suslick recommended placing the sensors in discrete places to monitor the pollution levels both inside and outside of the sealed and framed artworks. If the sensors indicated pollution levels inside the sealed frames were rising, conservators traveling with the Disney exhibit would know to replace the sorbents. An initial analysis of sensor data showed that the sorbents were effective. Suslick says he expects to continue expanding the sensors’ applications in the field of cultural heritage.

Collaborators in the project include Maria LaGasse, a graduate student in Suslick’s lab; Kristen McCormick, art exhibitions and conservation manager at the Walt Disney Animation Research Library; Herant Khanjian, assistant scientist; and Michael Schilling, senior scientist at the Getty Conservation Institute.

I was able to find one museum exhibiting “Drawn from Life: The Art of Disney Animation Studios”; it was the Museum of China which hosted the show from June 30 – August 18, 2015. There are pictures of the exhibit at the Museum of China posted by Leon Ingram here on Behance.

4D printing: a hydrogel orchid

In 2013, the 4th dimension for printing was self-assembly according to a March 1, 2013 article by Tuan Nguyen for ZDNET. A Jan. 25, 2016 Wyss Institute for Biologically Inspired Engineering at Harvard University news release (also on EurekAlert) points to time as the fourth dimension in a description of the Wyss Institute’s latest 4D printed object,

A team of scientists at the Wyss Institute for Biologically Inspired Engineering at Harvard University and the Harvard John A. Paulson School of Engineering and Applied Sciences has evolved their microscale 3D printing technology to the fourth dimension, time. Inspired by natural structures like plants, which respond and change their form over time according to environmental stimuli, the team has unveiled 4D-printed hydrogel composite structures that change shape upon immersion in water.

“This work represents an elegant advance in programmable materials assembly, made possible by a multidisciplinary approach,” said Jennifer Lewis, Sc.D., senior author on the new study. “We have now gone beyond integrating form and function to create transformable architectures.”

In nature, flowers and plants have tissue composition and microstructures that result in dynamic morphologies that change according to their environments. Mimicking the variety of shape changes undergone by plant organs such as tendrils, leaves, and flowers in response to environmental stimuli like humidity and/or temperature, the 4D-printed hydrogel composites developed by Lewis and her team are programmed to contain precise, localized swelling behaviors. Importantly, the hydrogel composites contain cellulose fibrils that are derived from wood and are similar to the microstructures that enable shape changes in plants.

By aligning cellulose fibrils (also known as cellulose nanofibrils or nanofibrillated cellulose) during printing, the hydrogel composite ink is encoded with anisotropic swelling and stiffness, which can be patterned to produce intricate shape changes. The anisotropic nature of the cellulose fibrils gives rise to varied directional properties that can be predicted and controlled. Just as wood splits more easily along the grain than across it, the hydrogel-cellulose fibril ink, when immersed in water, undergoes differential swelling behavior along and orthogonal to the printing path. Combined with a proprietary mathematical model developed by the team that predicts how a 4D object must be printed to achieve prescribed transformable shapes, the new method opens up many new and exciting potential applications for 4D printing technology, including smart textiles, soft electronics, biomedical devices, and tissue engineering.
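The team’s actual model is proprietary, but the underlying mechanism (two bonded layers that swell by different amounts, bending the sheet into a curve) can be illustrated with the classic Timoshenko bimetallic-strip formula, with swelling mismatch standing in for thermal-expansion mismatch. The function and the numbers below are my illustrative assumptions, not the paper’s model:

```python
def bilayer_curvature(mismatch_strain, h1, h2, E1=1.0, E2=1.0):
    """Timoshenko bimetallic-strip curvature (1/radius), with differential
    swelling strain playing the role of thermal-expansion mismatch.
    h1, h2 are the two layer thicknesses; E1, E2 their stiffnesses."""
    h = h1 + h2
    m = h1 / h2
    n = E1 / E2
    num = 6.0 * mismatch_strain * (1.0 + m) ** 2
    den = h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
    return num / den

# Equal layers, 2% swelling mismatch, 1 mm total thickness (made-up numbers):
kappa = bilayer_curvature(0.02, 0.5, 0.5)
print(f"curvature = {kappa:.3f} per mm (bend radius = {1 / kappa:.1f} mm)")
```

For equal layers of equal stiffness this reduces to the textbook result, curvature = 3Δε/(2h), so a 2% mismatch across a 1 mm sheet bends it to a radius of about 33 mm; patterning where that mismatch occurs is, in spirit, what the print path does.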

“Using one composite ink printed in a single step, we can achieve shape-changing hydrogel geometries containing more complexity than any other technique, and we can do so simply by modifying the print path,” said Gladman [A. Sydney Gladman, a graduate research assistant at the Wyss Institute]. “What’s more, we can interchange different materials to tune for properties such as conductivity or biocompatibility.”

The composite ink that the team uses flows like liquid through the printhead, yet rapidly solidifies once printed. A variety of hydrogel materials can be used interchangeably resulting in different stimuli-responsive behavior, while the cellulose fibrils can be replaced with other anisotropic fillers of choice, including conductive fillers.

“Our mathematical model prescribes the printing pathways required to achieve the desired shape-transforming response,” said Matsumoto [Elisabetta Matsumoto, Ph.D., a postdoctoral fellow at the Wyss]. “We can control the curvature both discretely and continuously using our entirely tunable and programmable method.”

Specifically, the mathematical modeling solves the “inverse problem”, which is the challenge of being able to predict what the printing toolpath must be in order to encode swelling behaviors toward achieving a specific desired target shape.

“It is wonderful to be able to design and realize, in an engineered structure, some of nature’s solutions,” said Mahadevan [L. Mahadevan, Ph.D., a Wyss Core Faculty member], who has studied phenomena such as how botanical tendrils coil, how flowers bloom, and how pine cones open and close. “By solving the inverse problem, we are now able to reverse-engineer the problem and determine how to vary local inhomogeneity, i.e. the spacing between the printed ink filaments, and the anisotropy, i.e. the direction of these filaments, to control the spatiotemporal response of these shapeshifting sheets.”

“What’s remarkable about this 4D printing advance made by Jennifer and her team is that it enables the design of almost any arbitrary, transformable shape from a wide range of available materials with different properties and potential applications, truly establishing a new platform for printing self-assembling, dynamic microscale structures that could be applied to a broad range of industrial and medical applications,” said Wyss Institute Founding Director Donald Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and the Vascular Biology Program at Boston Children’s Hospital and Professor of Bioengineering at Harvard SEAS [School of Engineering and Applied Sciences].

Here’s an animation from the Wyss Institute illustrating the process,

And, here’s a link to and a citation for the paper,

Biomimetic 4D printing by A. Sydney Gladman, Elisabetta A. Matsumoto, Ralph G. Nuzzo, L. Mahadevan, & Jennifer A. Lewis. Nature Materials (2016) doi:10.1038/nmat4544 Published online 25 January 2016

This paper is behind a paywall.

What do nanocrystals have in common with the earth’s crust?

The deformation properties of nanocrystals resemble those in the earth’s crust according to a Nov. 17, 2015 news item on Nanowerk,

Apparently, size doesn’t always matter. An extensive study by an interdisciplinary research group suggests that the deformation properties of nanocrystals are not much different from those of the Earth’s crust.

“When solid materials such as nanocrystals, bulk metallic glasses, rocks, or granular materials are slowly deformed by compression or shear, they slip intermittently with slip-avalanches similar to earthquakes,” explained Karin Dahmen, a professor of physics at the University of Illinois at Urbana-Champaign. “Typically these systems are studied separately. But we found that the scaling behavior of their slip statistics agree across a surprisingly wide range of different length scales and material structures.”

There’s an illustration accompanying the research,

Caption: When solid materials such as nanocrystals, bulk metallic glasses, rocks, or granular materials are slowly deformed by compression or shear, they slip intermittently with slip-avalanches similar to earthquakes. Credit: University of Illinois

A Nov. 17, 2015 University of Illinois news release (also on EurekAlert) by Rick Kubetz, which originated the news item, provides more detail,

“Identifying agreement in aspects of the slip statistics is important, because it enables us to transfer results from one scale to another, from one material to another, from one stress to another, or from one strain rate to another,” stated Shivesh Pathak, a physics undergraduate at Illinois, and a co-author of the paper, “Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes,” appearing in Scientific Reports. “The study shows how to identify and explain commonalities in the deformation mechanisms of different materials on different scales.

“The results provide new tools and methods to use the slip statistics to predict future materials deformation,” added Michael LeBlanc, a physics graduate student and co-author of the paper. “They also clarify which system parameters significantly affect the deformation behavior on long length scales. We expect the results to be useful for applications in materials testing, failure prediction, and hazard prevention.”

Researchers representing a broad range of disciplines–including physics, geosciences, mechanical engineering, chemical engineering, and materials science–from the United States, Germany, and the Netherlands contributed to the study, comparing five different experimental systems, on several different scales, with model predictions.

As a solid is sheared, each weak spot is stuck until the local shear stress exceeds a random failure threshold. It then slips by a random amount until it re-sticks. The released stress is redistributed to all other weak spots. Thus, a slipping weak spot can trigger other spots to fail in a slip avalanche.

Using tools from the theory of phase transitions, such as the renormalization group, one can show that the slip statistics of the model do not depend on the details of the system.

“Although these systems span 13 decades in length scale, they all show the same scaling behavior for their slip size distributions and other statistical properties,” stated Pathak. “Their size distributions follow the same simple (power law) function, multiplied with the same exponential cutoff.”

The cutoff, which is the largest slip or earthquake size, grows with applied force for materials spanning length scales from nanometers to kilometers. The dependence of the size of the largest slip or quake on stress reflects “tuned critical” behavior, rather than so-called self-organized criticality, which would imply stress-independence.

“The agreement of the scaling properties of the slip statistics across scales does not imply the predictability of individual slips or earthquakes,” LeBlanc said. “Rather, it implies that we can predict the scaling behavior of average properties of the slip statistics and the probability of slips of a certain size, including their dependence on stress and strain-rate.”
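For readers who want to see how such slip statistics arise, the weak-spot picture quoted above can be turned into a toy mean-field simulation. Everything here (the thresholds, the redistribution rule, the parameter values) is an illustrative assumption on my part, not the authors’ model code:

```python
import random

def simulate_avalanches(n_spots=300, n_steps=5000, load_step=0.001, seed=1):
    """Toy mean-field slip model: each weak spot holds stress until a random
    failure threshold, then slips, re-sticks, and sheds stress to all other
    spots, which can trigger further slips (a slip avalanche)."""
    rng = random.Random(seed)
    stress = [0.0] * n_spots
    threshold = [rng.uniform(1.0, 2.0) for _ in range(n_spots)]
    sizes = []  # number of individual slips in each avalanche
    for _ in range(n_steps):
        # slow external loading raises every spot's stress a little
        stress = [s + load_step for s in stress]
        size = 0
        failing = [i for i in range(n_spots) if stress[i] > threshold[i]]
        while failing:
            size += len(failing)
            shed = 0.0
            for i in failing:
                shed += 0.9 * stress[i]               # most stress is redistributed
                stress[i] = rng.uniform(0.0, 0.1)     # the spot re-sticks at low stress
                threshold[i] = rng.uniform(1.0, 2.0)  # new random failure threshold
            per_spot = shed / n_spots
            stress = [s + per_spot for s in stress]
            failing = [i for i in range(n_spots) if stress[i] > threshold[i]]
        if size:
            sizes.append(size)
    return sizes

sizes = simulate_avalanches()
print(f"{len(sizes)} avalanches; largest involved {max(sizes)} slips")
```

Models of exactly this kind produce the statistics the release describes, a power-law size distribution with an exponential cutoff that grows with applied stress; in the sketch above, the 10 percent of stress lost at each slip is what keeps the avalanches finite.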

Here’s a link to and a citation for the paper,

Universal Quake Statistics: From Compressed Nanocrystals to Earthquakes by Jonathan T. Uhl, Shivesh Pathak, Danijel Schorlemmer, Xin Liu, Ryan Swindeman, Braden A. W. Brinkman, Michael LeBlanc, Georgios Tsekenis, Nir Friedman, Robert Behringer, Dmitry Denisov, Peter Schall, Xiaojun Gu, Wendelin J. Wright, Todd Hufnagel, Andrew Jennings, Julia R. Greer, P. K. Liaw, Thorsten Becker, Georg Dresen, & Karin A. Dahmen. Scientific Reports 5, Article number: 16493 (2015) doi:10.1038/srep16493 Published online: 17 November 2015

This is an open access paper.

One final comment, this story reminds me of a few other pieces of research featured here, which focus on repeating patterns in nature. The research was mentioned in an Aug. 27, 2015 posting about white dwarf stars and heartbeats and in an April 14, 2015 posting about gold nanoparticles and their resemblance to the Milky Way. You can also find more in the Wikipedia entry titled ‘Patterns in nature‘.

Nanopores and a new technique for desalination

There’s been more than one piece here about water desalination and purification and/or remediation efforts, and at least one of them claims to have successfully overcome issues, such as the energy demands of reverse osmosis, that are hampering adoption of various technologies. Now, researchers at the University of Illinois at Urbana-Champaign have developed another technique for desalinating water that sidesteps reverse osmosis issues, according to a Nov. 11, 2015 news item on Nanowerk (Note: A link has been removed),

University of Illinois engineers have found an energy-efficient material for removing salt from seawater that could provide a rebuttal to poet Samuel Taylor Coleridge’s lament, “Water, water, every where, nor any drop to drink.”

The material, a nanometer-thick sheet of molybdenum disulfide (MoS2) riddled with tiny holes called nanopores, is specially designed to let high volumes of water through but keep salt and other contaminants out, a process called desalination. In a study published in the journal Nature Communications (“Water desalination with a single-layer MoS2 nanopore”), the Illinois team modeled various thin-film membranes and found that MoS2 showed the greatest efficiency, filtering through up to 70 percent more water than graphene membranes. [emphasis mine]

I’ll get to the professor’s comments about graphene membranes in a minute. Meanwhile, a Nov. 11, 2015 University of Illinois news release (also on EurekAlert), which originated the news item, provides more information about the research,

“Even though we have a lot of water on this planet, there is very little that is drinkable,” said study leader Narayana Aluru, a U. of I. professor of mechanical science and engineering. “If we could find a low-cost, efficient way to purify sea water, we would be making good strides in solving the water crisis.

“Finding materials for efficient desalination has been a big issue, and I think this work lays the foundation for next-generation materials. These materials are efficient in terms of energy usage and fouling, which are issues that have plagued desalination technology for a long time,” said Aluru, who also is affiliated with the Beckman Institute for Advanced Science and Technology at the U. of I.

Most available desalination technologies rely on a process called reverse osmosis to push seawater through a thin plastic membrane to make fresh water. The membrane has holes in it small enough to not let salt or dirt through, but large enough to let water through. They are very good at filtering out salt, but yield only a trickle of fresh water. Although thin to the eye, these membranes are still relatively thick for filtering on the molecular level, so a lot of pressure has to be applied to push the water through.

“Reverse osmosis is a very expensive process,” Aluru said. “It’s very energy intensive. A lot of power is required to do this process, and it’s not very efficient. In addition, the membranes fail because of clogging. So we’d like to make it cheaper and make the membranes more efficient so they don’t fail as often. We also don’t want to have to use a lot of pressure to get a high flow rate of water.”

One way to dramatically increase the water flow is to make the membrane thinner, since the required force is proportional to the membrane thickness. Researchers have been looking at nanometer-thin membranes such as graphene. However, graphene presents its own challenges in the way it interacts with water.
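The thickness argument is easy to put in rough numbers. Taking the release’s statement that the required pressure is proportional to membrane thickness (so flux at fixed pressure scales roughly as 1/thickness), a back-of-the-envelope comparison looks like the sketch below. The layer thicknesses are ballpark figures I’ve supplied for illustration, and note that thickness alone would favor graphene; MoS2’s 70 percent advantage comes instead from its pore chemistry and geometry, as the researchers explain below.

```python
def relative_flux(thickness_nm, reference_nm=100.0):
    """Flux at fixed pressure relative to a reference membrane, under the
    simple assumption that required pressure scales linearly with thickness
    (so flux scales as 1/thickness). Illustrative only."""
    return reference_nm / thickness_nm

# Ballpark thicknesses (my estimates, not figures from the paper):
membranes = [
    ("conventional RO active layer", 100.0),
    ("single-layer MoS2", 0.65),
    ("graphene", 0.34),
]
for name, t in membranes:
    print(f"{name} ({t} nm): ~{relative_flux(t):.0f}x reference flux")
```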

Aluru’s group has previously studied MoS2 nanopores as a platform for DNA sequencing and decided to explore its properties for water desalination. Using the Blue Waters supercomputer at the National Center for Supercomputing Applications at the U. of I., they found that a single-layer sheet of MoS2 outperformed its competitors thanks to a combination of thinness, pore geometry and chemical properties.

A MoS2 molecule has one molybdenum atom sandwiched between two sulfur atoms. A sheet of MoS2, then, has sulfur coating either side with the molybdenum in the center. The researchers found that creating a pore in the sheet that left an exposed ring of molybdenum around the center of the pore created a nozzle-like shape that drew water through the pore.

“MoS2 has inherent advantages in that the molybdenum in the center attracts water, then the sulfur on the other side pushes it away, so we have much higher rate of water going through the pore,” said graduate student Mohammad Heiranian, the first author of the study. “It’s inherent in the chemistry of MoS2 and the geometry of the pore, so we don’t have to functionalize the pore, which is a very complex process with graphene.”

In addition to the chemical properties, the single-layer sheets of MoS2 have the advantages of thinness, requiring much less energy, which in turn dramatically reduces operating costs. MoS2 also is a robust material, so even such a thin sheet is able to withstand the necessary pressures and water volumes.

The Illinois researchers are establishing collaborations to experimentally test MoS2 for water desalination and to test its rate of fouling, or clogging of the pores, a major problem for plastic membranes. MoS2 is a relatively new material, but the researchers believe that manufacturing techniques will improve as its high performance becomes more sought-after for various applications.

“Nanotechnology could play a great role in reducing the cost of desalination plants and making them energy efficient,” said Amir Barati Farimani, who worked on the study as a graduate student at Illinois and is now a postdoctoral fellow at Stanford University. “I’m in California now, and there’s a lot of talk about the drought and how to tackle it. I’m very hopeful that this work can help the designers of desalination plants. This type of thin membrane can increase return on investment because they are much more energy efficient.”

Here’s a link to and a citation for the paper,

Water desalination with a single-layer MoS2 nanopore by Mohammad Heiranian, Amir Barati Farimani, & Narayana R. Aluru. Nature Communications 6, Article number: 8616 doi:10.1038/ncomms9616 Published 14 October 2015

Graphene membranes

In a July 13, 2015 essay on Nanotechnology Now, Tim Harper provides an overview of the research into using graphene for water desalination and purification/remediation, about which he is quite hopeful. There is no mention of an issue with interactions between water and graphene. It should be noted that Tim Harper is the Chief Executive Officer of G20, a company which produces a graphene-based solution (graphene oxide sheets) that can desalinate water and can purify/remediate it. Tim is a scientist, and while you might have some hesitation given his fiscal interests, his essay is worthwhile reading, as he supplies context and explanations of the science.