Tag Archives: Georgia Tech

Council of Canadian Academies and science policy for Alberta

The Council of Canadian Academies (CCA) has expanded its approach from assembling expert panels to report on questions posed by various Canadian government agencies (assessments) to special reports from a three-member panel and, now, to a workshop on the province of Alberta’s science policy ideas. From an Oct. 27, 2016 CCA news release (received via email),

The Council of Canadian Academies (CCA) is pleased to announce that it is undertaking an expert panel workshop on science policy ideas under development in Alberta. The workshop will engage national and international experts to explore various dimensions of sub-national science systems and the role of sub-national science policy.

“We are pleased to undertake this project,” said Eric M. Meslin, PhD, FCAHS, President and CEO of the CCA. “It is an assessment that could discuss strategies that have applications in Alberta, across Canada, and elsewhere.”

A two-day workshop, to be undertaken in November 2016, will bring together a multidisciplinary and multi-sectoral group of leading Canadian and international experts to review, validate, and advance work being done on science policy in Alberta. The workshop will explore the necessary considerations when creating science policy at the sub-national level. Specifically it will:

  • Debate and validate the main outcomes of a sub-national science enterprise, particularly in relation to knowledge, human, and social capital.
  • Identify the key elements and characteristics of a successful science enterprise (e.g., funding, trust, capacity, science culture, supporting interconnections and relationships) with a particular focus at a sub-national level.
  • Explore potential intents of a sub-national science policy, important features of such a policy, and the role of the policy in informing investment decisions.

To lead the design of the workshop, complete the necessary background research, and develop the workshop summary report, the CCA has appointed a five-member Workshop Steering Committee, chaired by Joy Johnson, FCAHS, Vice President, Research, Simon Fraser University. The other Steering Committee members are: Paul Dufour, Adjunct Professor, Institute for Science, Society and Policy, University of Ottawa, and Principal, Paulicy Works; Janet Halliwell, Principal, J.E. Halliwell Associates, Inc.; Kaye Husbands Fealing, Chair and Professor, School of Public Policy, Georgia Tech; and Marc LePage, President and CEO, Genome Canada.

The CCA, under the guidance of its Scientific Advisory Committee, and in collaboration with the Workshop Steering Committee, is now assembling a multidisciplinary, multi-sectoral, group of experts to participate in the two-day workshop. The CCA’s Member Academies – the Royal Society of Canada, the Canadian Academy of Engineering, and the Canadian Academy of Health Sciences – are a key source of membership for expert panels. Many experts are also Fellows of the Academies.

The workshop results will be published in a final summary report in spring 2017. This workshop assessment is supported by a grant from the Government of Alberta.

By comparison with the CCA’s last assessment mentioned here in a July 1, 2016 posting (The State of Science and Technology and Industrial Research and Development in Canada), this workshop has a better balance. The steering committee is chaired by a woman (the first time I’ve seen that in a few years) and has enough female members to bring representation up to 60%. There’s no representation from Québec (perhaps not a surprise given this is an Alberta project), but the western provinces account for 40%, with representation from both BC and Alberta. Business can boast 30% (?), with Paul Dufour doing double duty as both academic and business owner. It’s good to see international representation, and one day I hope to see it come from somewhere other than the US, the UK, and/or the European Union. Maybe Asia?

You can find contact information on the CCA’s Towards a Science Policy in Alberta webpage.

One comment: I find the lack of a specific date for the workshop interesting. It suggests either that they were having difficulty scheduling it or that they wanted to keep the ‘unwashed’ away.

Achieving ultra-low friction without oil

Oiled gears as small parts of large mechanism Courtesy: Georgia Institute of Technology

Those gears are gorgeous, especially in full size; I will be giving a link to a full size version in a bit. Meanwhile, an Oct. 11, 2016 news item on Nanowerk makes an announcement about ultra-low friction without oil,

Researchers at Georgia Institute of Technology [Georgia Tech; US] have developed a new process for treating metal surfaces that has the potential to improve efficiency in piston engines and a range of other equipment.

The method improves the ability of metal surfaces to bond with oil, significantly reducing friction without special oil additives.

“About 50 percent of the mechanical energy losses in an internal combustion engine result from piston assembly friction. So if we can reduce the friction, we can save energy and reduce fuel and oil consumption,” said Michael Varenberg, an assistant professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering.

An Oct. 5, 2016 Georgia Tech news release (also on EurekAlert but dated Oct. 11, 2016), which originated the news item, describes the research in more detail,

In the study, which was published Oct. 5 [2016] in the journal Tribology Letters, the researchers at Georgia Tech and Technion – Israel Institute of Technology tested treating the surface of cast iron blocks by blasting it with a mixture of copper sulfide and aluminum oxide. The shot peening chemically modified the surface, changing how oil molecules bonded with the metal and producing superior surface lubricity.

“We want oil molecules to be connected strongly to the surface. Traditionally this connection is created by putting additives in the oil,” Varenberg said. “In this specific case, we shot peen the surface with a blend of alumina and copper sulfide particles. Making the surface more active chemically by deforming it allows for a replacement reaction to form iron sulfide on top of the iron. And iron sulfides are known for very strong bonds with oil molecules.”

Oil is the primary tool used to reduce the friction that occurs when two surfaces slide in contact. The new surface treatment results in an ultra-low friction coefficient of about 0.01 in a base oil environment, which is about 10 times less than a friction coefficient obtained on a reference untreated surface, the researchers reported.
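To put those numbers in perspective, here is a minimal sketch of the Coulomb friction relation F = μN using the coefficients quoted in the release; the 100 N normal load is invented for illustration and is not from the study.

```python
# Illustrative only: compare sliding friction force F = mu * N for the
# treated and untreated surfaces, using the coefficients quoted above.
MU_TREATED = 0.01    # ultra-low coefficient reported after shot peening
MU_UNTREATED = 0.1   # roughly 10x higher, per the untreated reference

def friction_force(mu, normal_load_n):
    """Coulomb friction: F = mu * N, with N in newtons."""
    return mu * normal_load_n

load = 100.0  # hypothetical 100 N normal load on a sliding contact
f_treated = friction_force(MU_TREATED, load)
f_untreated = friction_force(MU_UNTREATED, load)
ratio = f_untreated / f_treated  # ~10x reduction in friction force
```

Since mechanical energy lost to friction scales with the friction force, a tenfold drop in μ translates directly into a tenfold drop in those losses for the same load and sliding distance.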

“The reported result surpasses the performance of the best current commercial oils and is similar to the performance of lubricants formulated with tungsten disulfide-based nanoparticles, but critically, our process does not use any expensive nanostructured media,” Varenberg said.

The method for reducing surface friction is flexible, and similar results can be achieved using a variety of processes other than shot peening, such as lapping, honing, burnishing, or laser shock peening, the researchers suggest. That would make the process even easier to adapt to a range of uses and industries. The researchers plan to continue examining the fundamental functional principles and physicochemical mechanisms that make the treatment so successful.

“This straightforward, scalable pathway to ultra-low friction opens new horizons for surface engineering, and it could significantly reduce energy losses on an industrial scale,” Varenberg said. “Moreover, our finding may result in a paradigm shift in the art of lubrication and initiate a whole new direction in surface science and engineering due to the generality of the idea and a broad range of potential applications.”

Here’s a link to and a citation for the paper,

Mechano-Chemical Surface Modification with Cu2S: Inducing Superior Lubricity by Michael Varenberg, Grigory Ryk, Alexander Yakhnis, Yuri Kligerman, Neha Kondekar, & Matthew T. McDowell. Tribol Lett (2016) 64: 28. doi:10.1007/s11249-016-0758-8 First online: Oct. 5, 2016

This paper is behind a paywall.

A human user manual—for robots

Researchers from the Georgia Institute of Technology (Georgia Tech), funded by the US Office of Naval Research (ONR), have developed a program that teaches robots to read stories and more in an effort to educate them about humans. From a June 16, 2016 ONR news release by Warren Duffie Jr. (also on EurekAlert),

With support from the Office of Naval Research (ONR), researchers at the Georgia Institute of Technology have created an artificial intelligence software program named Quixote to teach robots to read stories, learn acceptable behavior and understand successful ways to conduct themselves in diverse social situations.

“For years, researchers have debated how to teach robots to act in ways that are appropriate, non-intrusive and trustworthy,” said Marc Steinberg, an ONR program manager who oversees the research. “One important question is how to explain complex concepts such as policies, values or ethics to robots. Humans are really good at using narrative stories to make sense of the world and communicate to other people. This could one day be an effective way to interact with robots.”

The rapid pace of artificial intelligence has stirred fears by some that robots could act unethically or harm humans. Dr. Mark Riedl, an associate professor and director of Georgia Tech’s Entertainment Intelligence Lab, hopes to ease concerns by having Quixote serve as a “human user manual” by teaching robots values through simple stories. After all, stories inform, educate and entertain–reflecting shared cultural knowledge, social mores and protocols.

For example, if a robot is tasked with picking up a pharmacy prescription for a human as quickly as possible, it could: a) take the medicine and leave, b) interact politely with pharmacists, or c) wait in line. Without value alignment and positive reinforcement, the robot might logically deduce robbery is the fastest, cheapest way to accomplish its task. However, with value alignment from Quixote, it would be rewarded for waiting patiently in line and paying for the prescription.

For their research, Riedl and his team crowdsourced stories from the Internet. Each tale needed to highlight daily social interactions–going to a pharmacy or restaurant, for example–as well as socially appropriate behaviors (e.g., paying for meals or medicine) within each setting.

The team plugged the data into Quixote to create a virtual agent–in this case, a video game character placed into various game-like scenarios mirroring the stories. As the virtual agent completed a game, it earned points and positive reinforcement for emulating the actions of protagonists in the stories.

Riedl’s team ran the agent through 500,000 simulations, and it displayed proper social interactions more than 90 percent of the time.
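The trial-and-error training the release describes can be sketched as a simple reinforcement-learning loop: actions matching the protagonist’s behaviour in the stories earn a reward, and the agent’s value estimates converge toward the socially acceptable choices. The states, actions, and rewards below are invented for illustration; the real Quixote system derives them from learned plot graphs.

```python
import random

# Hypothetical sketch of Quixote-style value alignment: reward actions
# that match what story protagonists do, penalize the rest.
ACTIONS = ["rob_pharmacy", "wait_in_line", "pay_for_medicine"]
PROTAGONIST_ACTIONS = {"wait_in_line", "pay_for_medicine"}  # from the stories

def reward(action):
    return 1.0 if action in PROTAGONIST_ACTIONS else -1.0

def train(episodes=5000, alpha=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = {a: 0.0 for a in ACTIONS}            # single-state action values
    for _ in range(episodes):
        if rng.random() < epsilon:           # explore occasionally
            a = rng.choice(ACTIONS)
        else:                                # otherwise exploit best estimate
            a = max(q, key=q.get)
        q[a] += alpha * (reward(a) - q[a])   # incremental value update
    return q

q = train()
best = max(q, key=q.get)  # a protagonist-like action wins out
```

After training, the antisocial action carries a negative value and the agent defaults to protagonist-like behaviour, mirroring the "more than 90 percent" result at toy scale.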

“These games are still fairly simple,” said Riedl, “more like ‘Pac-Man’ instead of ‘Halo.’ However, Quixote enables these artificial intelligence agents to immerse themselves in a story, learn the proper sequence of events and be encoded with acceptable behavior patterns. This type of artificial intelligence can be adapted to robots, offering a variety of applications.”

Within the next six months, Riedl’s team hopes to upgrade Quixote’s games from “old-school” to more modern and complex styles like those found in Minecraft–in which players use blocks to build elaborate structures and societies.

Riedl believes Quixote could one day make it easier for humans to train robots to perform diverse tasks. Steinberg notes that robotic and artificial intelligence systems may one day be a much larger part of military life. This could involve mine detection and deactivation, equipment transport and humanitarian and rescue operations.

“Within a decade, there will be more robots in society, rubbing elbows with us,” said Riedl. “Social conventions grease the wheels of society, and robots will need to understand the nuances of how humans do things. That’s where Quixote can serve as a valuable tool. We’re already seeing it with virtual agents like Siri and Cortana, which are programmed not to say hurtful or insulting things to users.”

This story brought to mind two other projects: RoboEarth (an internet for robots only), mentioned in my Jan. 14, 2014 posting, which was an update on the project featuring its use in hospitals, and RoboBrain, a robot learning project (sourcing the internet, YouTube, and more for information to teach robots), mentioned in my Sept. 2, 2014 posting.

Titanium dioxide nanoparticles have subtle effects on oxidative stress genes?

There’s research from the Georgia Institute of Technology (Georgia Tech; US) suggesting that titanium dioxide nanoparticles may have long term side effects. From a May 10, 2016 news item on ScienceDaily,

A nanoparticle commonly used in food, cosmetics, sunscreen and other products can have subtle effects on the activity of genes expressing enzymes that address oxidative stress inside two types of cells. While the titanium dioxide (TiO2) nanoparticles are considered non-toxic because they don’t kill cells at low concentrations, these cellular effects could add to concerns about long-term exposure to the nanomaterial.

A May 9, 2016 Georgia Tech news release on Newswire (also on EurekAlert), which originated the news item, describes the research in more detail,

Researchers at the Georgia Institute of Technology used high-throughput screening techniques to study the effects of titanium dioxide nanoparticles on the expression of 84 genes related to cellular oxidative stress. Their work found that six genes, four of them from a single gene family, were affected by a 24-hour exposure to the nanoparticles.

The effect was seen in two different kinds of cells exposed to the nanoparticles: human HeLa* cancer cells commonly used in research, and a line of monkey kidney cells. Polystyrene nanoparticles similar in size and surface electrical charge to the titanium dioxide nanoparticles did not produce a similar effect on gene expression.

“This is important because every standard measure of cell health shows that cells are not affected by these titanium dioxide nanoparticles,” said Christine Payne, an associate professor in Georgia Tech’s School of Chemistry and Biochemistry. “Our results show that there is a more subtle change in oxidative stress that could be damaging to cells or lead to long-term changes. This suggests that other nanoparticles should be screened for similar low-level effects.”

The research was reported online May 6 in the Journal of Physical Chemistry C. The work was supported by the National Institutes of Health (NIH) through the HERCULES Center at Emory University, and by a Vasser Woolley Fellowship.

Titanium dioxide nanoparticles help make powdered donuts white, protect skin from the sun’s rays and reflect light in painted surfaces. In concentrations commonly used, they are considered non-toxic, though several other studies have raised concern about potential effects on gene expression that may not directly impact the short-term health of cells.

To determine whether the nanoparticles could affect genes involved in managing oxidative stress in cells, Payne and colleague Melissa Kemp – an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University – designed a study to broadly evaluate the nanoparticle’s impact on the two cell lines.

Working with graduate students Sabiha Runa and Dipesh Khanal, they separately incubated HeLa cells and monkey kidney cells with titanium dioxide at levels 100 times less than the minimum concentration known to initiate effects on cell health. After incubating the cells for 24 hours with the TiO2, the cells were lysed and their contents analyzed using both PCR and western blot techniques to study the expression of 84 genes associated with the cells’ ability to address oxidative processes.

Payne and Kemp were surprised to find changes in the expression of six genes, including four from the peroxiredoxin family of enzymes that helps cells degrade hydrogen peroxide, a byproduct of cellular oxidation processes. Too much hydrogen peroxide can create oxidative stress, which can damage DNA and other molecules.

The effect measured was significant – changes of about 50 percent in enzyme expression compared to cells that had not been incubated with nanoparticles. The tests were conducted in triplicate and produced similar results each time.
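For readers unfamiliar with how qPCR data turn into an expression change, here is a sketch of the standard 2^(−ΔΔCt) calculation applied to triplicate threshold-cycle (Ct) values. The Ct numbers below are invented to produce a roughly 50 percent change; they are not the study’s data, and the release does not specify which quantification method the team used.

```python
from statistics import mean

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Livak 2^(-ddCt) method: normalize the target gene's Ct to a
    reference (housekeeping) gene, then compare treated vs. control."""
    d_ct_treated = mean(ct_target_treated) - mean(ct_ref_treated)
    d_ct_control = mean(ct_target_control) - mean(ct_ref_control)
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical triplicates for a peroxiredoxin gene vs. a reference gene
fc = fold_change(
    ct_target_treated=[25.1, 25.0, 24.9],  # TiO2-exposed cells
    ct_ref_treated=[18.0, 18.1, 17.9],
    ct_target_control=[24.0, 24.1, 23.9],  # untreated cells
    ct_ref_control=[18.0, 18.0, 18.0],
)
# fc comes out near 0.5 here: expression roughly halved, i.e. about a
# 50 percent down-regulation relative to untreated cells
```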

“One thing that was really surprising was that this whole family of proteins was affected, though some were up-regulated and some were down-regulated,” Kemp said. “These were all related proteins, so the question is why they would respond differently to the presence of the nanoparticles.”

The researchers aren’t sure how the nanoparticles bind with the cells, but they suspect it may involve the protein corona that surrounds the particles. The corona is made up of serum proteins that normally serve as food for the cells, but adsorb to the nanoparticles in the culture medium. The corona proteins have a protective effect on the cells, but may also serve as a way for the nanoparticles to bind to cell receptors.

Titanium dioxide is well known for its photo-catalytic effects under ultraviolet light, but the researchers don’t think that’s in play here because their culturing was done in ambient light – or in the dark. The individual nanoparticles had diameters of about 21 nanometers, but in cell culture formed much larger aggregates.

In future work, Payne and Kemp hope to learn more about the interaction, including where the enzyme-producing proteins are located in the cells. For that, they may use HyPer-Tau, a reporter protein they developed to track the location of hydrogen peroxide within cells.

The research suggests a re-evaluation may be necessary for other nanoparticles that could create subtle effects even though they’ve been deemed safe.

“Earlier work had suggested that nanoparticles can lead to oxidative stress, but nobody had really looked at this level and at so many different proteins at the same time,” Payne said. “Our research looked at such low concentrations that it does raise questions about what else might be affected. We looked specifically at oxidative stress, but there may be other genes that are affected, too.”

Those subtle differences may matter when they’re added to other factors.

“Oxidative stress is implicated in all kinds of inflammatory and immune responses,” Kemp noted. “While the titanium dioxide alone may just be modulating the expression levels of this family of proteins, if that is happening at the same time you have other types of oxidative stress for different reasons, then you may have a cumulative effect.”

*HeLa cells are named for Henrietta Lacks, whose cells were taken without her knowledge and became an immortal cell line used in medical research. You can find more about the story on the Oprah Winfrey website, which features an excerpt from the Rebecca Skloot book “The Immortal Life of Henrietta Lacks.” By the way, on May 2, 2016 it was announced that Oprah Winfrey would star in a movie for HBO as Henrietta Lacks’ daughter in an adaptation of the Skloot book. You can read more about the proposed production in a May 3, 2016 article by Benjamin Lee for the Guardian.

Getting back to titanium dioxide nanoparticles and their possible long term effects, here’s a link to and a citation for the Georgia Tech team’s paper,

TiO2 Nanoparticles Alter the Expression of Peroxiredoxin Antioxidant Genes by Sabiha Runa, Dipesh Khanal, Melissa L. Kemp, and Christine K. Payne. J. Phys. Chem. C, Article ASAP DOI: 10.1021/acs.jpcc.6b01939 Publication Date (Web): April 21, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

What robots and humans?

I have two robot news bits for this posting. The first probes the unease currently being expressed (pop culture movies, Stephen Hawking, the Cambridge Centre for Existential Risk, etc.) about robots and their increasing intelligence and increased use in all types of labour formerly and currently performed by humans. The second item is about a research project where ‘artificial agents’ (robots) are being taught human values with stories.

Human labour obsolete?

‘When machines can do any job, what will humans do?’ is the question asked by Rice University computer scientist Moshe Vardi in a presentation at the American Association for the Advancement of Science (AAAS) annual meeting, held in Washington, D.C., Feb. 11 – 15, 2016.

Here’s more about Dr. Vardi’s provocative question from a Feb. 14, 2016 Rice University news release (also on EurekAlert),

Rice University computer scientist Moshe Vardi expects that within 30 years, machines will be capable of doing almost any job that a human can. In anticipation, he is asking his colleagues to consider the societal implications. Can the global economy adapt to greater than 50 percent unemployment? Will those out of work be content to live a life of leisure?

“We are approaching a time when machines will be able to outperform humans at almost any task,” Vardi said. “I believe that society needs to confront this question before it is upon us: If machines are capable of doing almost any work humans can do, what will humans do?”

Vardi addressed this issue Sunday [Feb. 14, 2016] in a presentation titled “Smart Robots and Their Impact on Society” at one of the world’s largest and most prestigious scientific meetings — the annual meeting of the American Association for the Advancement of Science in Washington, D.C.

“The question I want to put forward is, Does the technology we are developing ultimately benefit mankind?” Vardi said. He asked the question after presenting a body of evidence suggesting that the pace of advancement in the field of artificial intelligence (AI) is increasing, even as existing robotic and AI technologies are eliminating a growing number of middle-class jobs and thereby driving up income inequality.

Vardi, a member of both the National Academy of Engineering and the National Academy of Sciences, is a Distinguished Service Professor and the Karen Ostrum George Professor of Computational Engineering at Rice, where he also directs Rice’s Ken Kennedy Institute for Information Technology. Since 2008 he has served as the editor-in-chief of Communications of the ACM, the flagship publication of the Association for Computing Machinery (ACM), one of the world’s largest computational professional societies.

Vardi said some people believe that future advances in automation will ultimately benefit humans, just as automation has benefited society since the dawn of the industrial age.

“A typical answer is that if machines will do all our work, we will be free to pursue leisure activities,” Vardi said. But even if the world economic system could be restructured to enable billions of people to live lives of leisure, Vardi questioned whether it would benefit humanity.

“I do not find this a promising future, as I do not find the prospect of leisure-only life appealing. I believe that work is essential to human well-being,” he said.

“Humanity is about to face perhaps its greatest challenge ever, which is finding meaning in life after the end of ‘In the sweat of thy face shalt thou eat bread,’” Vardi said. “We need to rise to the occasion and meet this challenge” before human labor becomes obsolete, he said.

In addition to dual membership in the National Academies, Vardi is a Guggenheim fellow and a member of the American Academy of Arts and Sciences, the European Academy of Sciences and the Academia Europaea. He is a fellow of the ACM, the American Association for Artificial Intelligence and the Institute of Electrical and Electronics Engineers (IEEE). His numerous honors include the Southeastern Universities Research Association’s 2013 Distinguished Scientist Award, the 2011 IEEE Computer Society Harry H. Goode Award, the 2008 ACM Presidential Award, the 2008 Blaise Pascal Medal for Computer Science from the European Academy of Sciences and the 2000 Gödel Prize for outstanding papers in the area of theoretical computer science.

Vardi joined Rice’s faculty in 1993. His research centers upon the application of logic to computer science, database systems, complexity theory, multi-agent systems and specification and verification of hardware and software. He is the author or co-author of more than 500 technical articles and of two books, “Reasoning About Knowledge” and “Finite Model Theory and Its Applications.”

In a Feb. 5, 2015 post, I rounded up a number of articles about our robot future. It provides a still useful overview of the thinking on the topic.

Teaching human values with stories

A Feb. 12, 2016 Georgia (US) Institute of Technology (Georgia Tech) news release (also on EurekAlert) describes the research,

The rapid pace of artificial intelligence (AI) has raised fears about whether robots could act unethically or soon choose to harm humans. Some are calling for bans on robotics research; others are calling for more research to understand how AI might be constrained. But how can robots learn ethical behavior if there is no “user manual” for being human?

Researchers Mark Riedl and Brent Harrison from the School of Interactive Computing at the Georgia Institute of Technology believe the answer lies in “Quixote” — to be unveiled at the AAAI [Association for the Advancement of Artificial Intelligence]-16 Conference in Phoenix, Ariz. (Feb. 12 – 17, 2016). Quixote teaches “value alignment” to robots by training them to read stories, learn acceptable sequences of events and understand successful ways to behave in human societies.

“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels and other literature,” says Riedl, associate professor and director of the Entertainment Intelligence Lab. “We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”

Quixote is a technique for aligning an AI’s goals with human values by placing rewards on socially appropriate behavior. It builds upon Riedl’s prior research — the Scheherazade system — which demonstrated how artificial intelligence can gather a correct sequence of actions by crowdsourcing story plots from the Internet.

Scheherazade learns what is a normal or “correct” plot graph. It then passes that data structure along to Quixote, which converts it into a “reward signal” that reinforces certain behaviors and punishes other behaviors during trial-and-error learning. In essence, Quixote learns that it will be rewarded whenever it acts like the protagonist in a story instead of randomly or like the antagonist.

For example, if a robot is tasked with picking up a prescription for a human as quickly as possible, the robot could a) rob the pharmacy, take the medicine, and run; b) interact politely with the pharmacists, or c) wait in line. Without value alignment and positive reinforcement, the robot would learn that robbing is the fastest and cheapest way to accomplish its task. With value alignment from Quixote, the robot would be rewarded for waiting patiently in line and paying for the prescription.

Riedl and Harrison demonstrate in their research how a value-aligned reward signal can be produced: the system uncovers all possible steps in a given scenario and maps them into a plot trajectory tree, which is then used by the robotic agent to make “plot choices” (akin to what humans might remember as a Choose Your Own Adventure novel) and receive rewards or punishments based on its choice.
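The conversion from plot graph to reward signal can be sketched in a few lines: transitions between events that the learned story structure allows are rewarded, and deviations are punished. The tiny graph below is invented for illustration; Scheherazade builds its plot graphs from crowdsourced stories rather than by hand.

```python
# Hypothetical plot graph: each event maps to the events the stories
# say may legitimately follow it.
PLOT_GRAPH = {
    "enter_pharmacy": {"wait_in_line"},
    "wait_in_line": {"pay_for_medicine"},
    "pay_for_medicine": {"leave_pharmacy"},
    "leave_pharmacy": set(),
}

def reward_signal(trajectory):
    """+1 for each transition the plot graph allows, -1 otherwise."""
    total = 0
    for current, nxt in zip(trajectory, trajectory[1:]):
        total += 1 if nxt in PLOT_GRAPH.get(current, set()) else -1
    return total

protagonist = ["enter_pharmacy", "wait_in_line",
               "pay_for_medicine", "leave_pharmacy"]
antagonist = ["enter_pharmacy", "pay_for_medicine",
              "leave_pharmacy"]  # skips the line
# The protagonist path scores +3; the antagonist path scores 0,
# because its one disallowed transition cancels one allowed one.
```

During trial-and-error learning, this signal is what makes protagonist-like sequences the rewarding ones.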

The Quixote technique is best for robots that have a limited purpose but need to interact with humans to achieve it, and it is a primitive first step toward general moral reasoning in AI, Riedl says.

“We believe that AI has to be enculturated to adopt the values of a particular society, and in doing so, it will strive to avoid unacceptable behavior,” he adds. “Giving robots the ability to read and understand our stories may be the most expedient means in the absence of a human user manual.”

So there you have it, some food for thought.

When an atom more or less makes a big difference

As scientists continue exploring the nanoscale, it seems that finding that the number of atoms in your particle makes a difference is no longer so surprising. From a Jan. 28, 2016 news item on ScienceDaily,

Combining experimental investigations and theoretical simulations, researchers have explained why platinum nanoclusters of a specific size range facilitate the hydrogenation reaction used to produce ethane from ethylene. The research offers new insights into the role of cluster shapes in catalyzing reactions at the nanoscale, and could help materials scientists optimize nanocatalysts for a broad class of other reactions.

A Jan. 28, 2016 Georgia Institute of Technology (Georgia Tech) news release (also on EurekAlert), which originated the news item, expands on the theme,

At the macro-scale, the conversion of ethylene has long been considered among the reactions insensitive to the structure of the catalyst used. However, by examining reactions catalyzed by platinum clusters containing between 9 and 15 atoms, researchers in Germany and the United States found that at the nanoscale, that’s no longer true. The shape of nanoscale clusters, they found, can dramatically affect reaction efficiency.

While the study investigated only platinum nanoclusters and the ethylene reaction, the fundamental principles may apply to other catalysts and reactions, demonstrating how materials at the very smallest size scales can provide different properties than the same material in bulk quantities. …

“We have re-examined the validity of a very fundamental concept on a very fundamental reaction,” said Uzi Landman, a Regents’ Professor and F.E. Callaway Chair in the School of Physics at the Georgia Institute of Technology. “We found that in the ultra-small catalyst range, on the order of a nanometer in size, old concepts don’t hold. New types of reactivity can occur because of changes in one or two atoms of a cluster at the nanoscale.”

The widely-used conversion process actually involves two separate reactions: (1) dissociation of H2 molecules into single hydrogen atoms, and (2) their addition to the ethylene, which involves conversion of a double bond into a single bond. In addition to producing ethane, the reaction can also take an alternative route that leads to the production of ethylidyne, which poisons the catalyst and prevents further reaction.
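Written out, the two steps described above look like this (a simple sketch, with H* standing for hydrogen adsorbed on the catalyst surface; intermediate surface species are omitted):

```latex
\mathrm{H_2 \;\longrightarrow\; 2\,H^{*}}
\qquad \text{(dissociation on the platinum cluster)}

\mathrm{C_2H_4 \;+\; 2\,H^{*} \;\longrightarrow\; C_2H_6}
\qquad \text{(hydrogen addition across the C=C double bond)}
```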

The project began with Professor Ueli Heiz and researchers in his group at the Technical University of Munich experimentally examining reaction rates for clusters containing 9, 10, 11, 12 or 13 platinum atoms that had been placed atop a magnesium oxide substrate. The 9-atom nanoclusters failed to produce a significant reaction, while larger clusters catalyzed the ethylene hydrogenation reaction with increasingly better efficiency. The best reaction occurred with 13-atom clusters.

Bokwon Yoon, a research scientist in Georgia Tech’s Center for Computational Materials Science, and Landman, the center’s director, then used large-scale first-principles quantum mechanical simulations to understand how the size of the clusters – and their shape – affected the reactivity. Using their simulations, they discovered that the 9-atom cluster resembled a symmetrical “hut,” while the larger clusters had bulges that served to concentrate electrical charges from the substrate.

“That one atom changes the whole activity of the catalyst,” Landman said. “We found that the extra atom operates like a lightning rod. The distribution of the excess charge from the substrate helps facilitate the reaction. Platinum 9 has a compact shape that doesn’t facilitate the reaction, but adding just one atom changes everything.”

Here’s an illustration showing the difference between a 9-atom cluster and a 10-atom cluster,

A single atom makes a difference in the catalytic properties of platinum nanoclusters. Shown are platinum 9 (top) and platinum 10 (bottom). (Credit: Uzi Landman, Georgia Tech)

The news release explains why the larger clusters function as catalysts,

Nanoclusters with 13 atoms provided the maximum reactivity because the additional atoms shift the structure in a phenomenon Landman calls “fluxionality.” This structural adjustment has also been noted in earlier work by these two research groups, in studies of clusters of gold [emphasis mine] which are used in other catalytic reactions.

“Dynamic fluxionality is the ability of the cluster to distort its structure to accommodate the reactants to actually enhance reactivity,” he explained. “Only very small aggregates of metal can show such behavior, which mimics a biochemical enzyme.”

The simulations showed that catalyst poisoning also varies with cluster size – and temperature. The 10-atom clusters can be poisoned at room temperature, while the 13-atom clusters are poisoned only at higher temperatures, helping to account for their improved reactivity.

“Small really is different,” said Landman. “Once you get into this size regime, the old rules of structure sensitivity and structure insensitivity must be assessed for their continued validity. It’s not a question anymore of surface-to-volume ratio because everything is on the surface in these very small clusters.”
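Landman’s point that “everything is on the surface” in these clusters can be illustrated with a standard back-of-envelope estimate (my sketch, not from the paper): for a roughly spherical particle of N atoms, the surface shell holds on the order of 4·N^(2/3) atoms, so the surface fraction approaches 1 for clusters of a dozen atoms.

```python
def surface_fraction(n_atoms):
    """Rough estimate of the fraction of atoms on the surface of a
    roughly spherical cluster of n_atoms: the surface shell holds
    ~4 * n^(2/3) atoms, capped at 1.0 for tiny clusters where every
    atom sits on the surface (the coefficient 4 is a conventional
    rough value, not a measured one)."""
    return min(1.0, 4.0 * n_atoms ** (2.0 / 3.0) / n_atoms)

# A 13-atom platinum cluster: every atom is a surface atom.
print(surface_fraction(13))       # 1.0
# A micron-scale particle (~10^9 atoms): surface atoms are a tiny minority.
print(surface_fraction(10 ** 9))
```

For the 9- to 13-atom clusters in the study the estimate saturates at 1.0, which is why the usual surface-to-volume reasoning stops being the relevant variable.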

While the project examined only one reaction and one type of catalyst, the principles governing nanoscale catalysis – and the importance of re-examining traditional expectations – likely apply to a broad range of reactions catalyzed by nanoclusters at the smallest size scale. Such nanocatalysts are becoming more attractive as a means of conserving supplies of costly platinum.

“It’s a much richer world at the nanoscale than at the macroscopic scale,” added Landman. “These are very important messages for materials scientists and chemists who wish to design catalysts for new purposes, because the capabilities can be very different.”

Along with the experimental surface characterization and reactivity measurements, the first-principles theoretical simulations provide a unique practical means for examining these structural and electronic issues because the clusters are too small to be seen with sufficient resolution using most electron microscopy techniques or traditional crystallography.

“We have looked at how the number of atoms dictates the geometrical structure of the cluster catalysts on the surface and how this geometrical structure is associated with electronic properties that bring about chemical bonding characteristics that enhance the reactions,” Landman added.

I highlighted the news release’s reference to gold nanoclusters as I have noted the number-of-atoms issue in two April 14, 2015 postings, neither of which featured Georgia Tech: Gold atoms: sometimes they’re a metal and sometimes they’re a molecule and Nature’s patterns reflected in gold nanoparticles.

Here’s a link to and a citation for the ‘platinum catalyst’ paper,

Structure sensitivity in the nonscalable regime explored via catalysed ethylene hydrogenation on supported platinum nanoclusters by Andrew S. Crampton, Marian D. Rötzer, Claron J. Ridge, Florian F. Schweinberger, Ueli Heiz, Bokwon Yoon, & Uzi Landman.  Nature Communications 7, Article number: 10389  doi:10.1038/ncomms10389 Published 28 January 2016

This paper is open access.

*’also on EurekAlert’ added Jan. 29, 2016.

Center for Sustainable Nanotechnology or how not to poison and make the planet uninhabitable

I received notice of the Center for Sustainable Nanotechnology’s newest deal with the US National Science Foundation in an August 31, 2015 University of Wisconsin-Madison (UW-Madison) news release received via email,

The Center for Sustainable Nanotechnology, a multi-institutional research center based at the University of Wisconsin-Madison, has inked a new contract with the National Science Foundation (NSF) that will provide nearly $20 million in support over the next five years.

Directed by UW-Madison chemistry Professor Robert Hamers, the center focuses on the molecular mechanisms by which nanoparticles interact with biological systems.

Nanotechnology involves the use of materials at the smallest scale, including the manipulation of individual atoms and molecules. Products that use nanoscale materials range from beer bottles and car wax to solar cells and electric and hybrid car batteries. If you read your books on a Kindle, a semiconducting material manufactured at the nanoscale underpins the high-resolution screen.

While there are already hundreds of products that use nanomaterials in various ways, much remains unknown about how these modern materials and the tiny particles they are composed of interact with the environment and living things.

“The purpose of the center is to explore how we can make sure these nanotechnologies come to fruition with little or no environmental impact,” explains Hamers. “We’re looking at nanoparticles in emerging technologies.”

In addition to UW-Madison, scientists from UW-Milwaukee, the University of Minnesota, the University of Illinois, Northwestern University and the Pacific Northwest National Laboratory have been involved in the center’s first phase of research. Joining the center for the next five-year phase are Tuskegee University, Johns Hopkins University, the University of Iowa, Augsburg College, Georgia Tech and the University of Maryland, Baltimore County.

At UW-Madison, Hamers leads efforts in synthesis and molecular characterization of nanomaterials. Soil science Professor Joel Pedersen and chemistry Professor Qiang Cui lead groups exploring the biological and computational aspects of how nanomaterials affect life.

Much remains to be learned about how nanoparticles affect the environment and the multitude of organisms – from bacteria to plants, animals and people – that may be exposed to them.

“Some of the big questions we’re asking are: How is this going to impact bacteria and other organisms in the environment? What do these particles do? How do they interact with organisms?” says Hamers.

For instance, bacteria, the vast majority of which are beneficial or benign organisms, tend to be “sticky” and nanoparticles might cling to the microorganisms and have unintended biological effects.

“There are many different mechanisms by which these particles can do things,” Hamers adds. “The challenge is we don’t know what these nanoparticles do if they’re released into the environment.”

To get at the challenge, Hamers and his UW-Madison colleagues are drilling down to investigate the molecular-level chemical and physical principles that dictate how nanoparticles interact with living things.

Pedersen’s group, for example, is studying the complexities of how nanoparticles interact with cells and, in particular, their surface membranes.

“To enter a cell, a nanoparticle has to interact with a membrane,” notes Pedersen. “The simplest thing that can happen is the particle sticks to the cell. But it might cause toxicity or make a hole in the membrane.”

Pedersen’s group can make model cell membranes in the lab using the same lipids and proteins that are the building blocks of nature’s cells. By exposing the lab-made membranes to nanomaterials now used commercially, Pedersen and his colleagues can see how the membrane-particle interaction unfolds at the molecular level – the scale necessary to begin to understand the biological effects of the particles.

Such studies, Hamers argues, promise a science-based understanding that can help ensure the technology leaves a minimal environmental footprint by identifying issues before they manifest themselves in the manufacturing, use or recycling of products that contain nanotechnology-inspired materials.

To help fulfill that part of the mission, the center has established working relationships with several companies to conduct research on materials in the very early stages of development.

“We’re taking a look-ahead view. We’re trying to get into the technological design cycle,” Hamers says. “The idea is to use scientific understanding to develop a predictive ability to guide technology and guide people who are designing and using these materials.”

What with this initiative and the LCnano Network at Arizona State University (my April 8, 2014 posting; scroll down about 50% of the way), it seems that environmental and health and safety studies of nanomaterials are kicking into a higher gear as commercialization efforts intensify.

Controlling water with ‘stick-slip surfaces’

Controlling water could lead to better designed microfluidic devices such as ‘lab-on-a-chip’. A July 27, 2015 news item on Nanowerk announces a new technique for controlling water,

Coating the inside of glass microtubes with a polymer hydrogel material dramatically alters the way capillary forces draw water into the tiny structures, researchers have found. The discovery could provide a new way to control microfluidic systems, including popular lab-on-a-chip devices.

Capillary action draws water and other liquids into confined spaces such as tubes, straws, wicks and paper towels, and the flow rate can be predicted using a simple hydrodynamic analysis. But a chance observation by researchers at the Georgia Institute of Technology [US] will cause a recalculation of those predictions for conditions in which hydrogel films line the tubes carrying water-based liquids.

“Rather than moving according to conventional expectations, water-based liquids slip to a new location in the tube, get stuck, then slip again – and the process repeats over and over again,” explained Andrei Fedorov, a professor in the George W. Woodruff School of Mechanical Engineering at Georgia Tech. “Instead of filling the tube with a rate of liquid penetration that slows with time, the water propagates at a nearly constant speed into the hydrogel-coated capillary. This was very different from what we had expected.”

A July 27, 2015 Georgia Institute of Technology (Georgia Tech) news release (also on EurekAlert) by John Toon, which originated the news item, describes the work in more detail,

When the opening of a thin glass tube is exposed to a droplet of water, the liquid begins to flow into the tube, pulled by a combination of surface tension in the liquid and adhesion between the liquid and the walls of the tube. Leading the way is a meniscus, a curved surface of the water at the leading edge of the water column. An ordinary borosilicate glass tube fills by capillary action at a gradually decreasing rate with the speed of meniscus propagation slowing as a square root of time.

But when the inside of a tube is coated with a very thin layer of poly(N-isopropylacrylamide), a so-called “smart” polymer (PNIPAM), everything changes. Water entering a tube coated on the inside with a dry hydrogel film must first wet the film and allow it to swell before it can proceed farther into the tube. The wetting and swelling take place not continuously, but with discrete steps in which the water meniscus first sticks and its motion remains arrested while the polymer layer locally deforms. The meniscus then rapidly slides for a short distance before the process repeats. This “stick-slip” process forces the water to move into the tube in a step-by-step motion.

The flow rate measured by the researchers in the coated tube is three orders of magnitude less than the flow rate in an uncoated tube. A linear equation describes the time dependence of the filling process, instead of the classical quadratic equation that describes filling of an uncoated tube.
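The two filling laws can be sketched as follows. This is a minimal illustration, not the paper’s model: the classical law is the Lucas-Washburn equation with textbook water properties, while the tube radius (5 µm) and the constant coated-tube speed are assumed placeholder values.

```python
import math

def washburn_length(t, gamma=0.072, r=5e-6, mu=1.0e-3, theta=0.0):
    """Classical Lucas-Washburn filling length for an uncoated capillary:
    L(t) = sqrt(gamma * r * cos(theta) * t / (2 * mu)), so the meniscus
    position grows as the square root of time (illustrative water values:
    surface tension 0.072 N/m, viscosity 1 mPa*s, radius 5 microns)."""
    return math.sqrt(gamma * r * math.cos(theta) * t / (2.0 * mu))

def coated_length(t, v=1.0e-5):
    """Hydrogel-coated capillary: stick-slip filling advances at a nearly
    constant average speed v (hypothetical value), so L grows linearly
    and lags the uncoated tube by orders of magnitude."""
    return v * t

for t in (0.01, 1.0, 100.0):
    print(f"t = {t:6.2f} s  uncoated: {washburn_length(t):.4e} m  "
          f"coated: {coated_length(t):.4e} m")
```

The square-root law means quadrupling the elapsed time only doubles the uncoated filling distance, while the coated tube simply advances in proportion to time, matching the “nearly constant speed” Fedorov describes.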

“Instead of filling the capillary in a hundredth of a second, it might take tens of seconds to fill the same capillary,” said Fedorov. “Though there is some swelling of the hydrogel upon contact with water, the change in the tube diameter is negligible due to the small thickness of the hydrogel layer. This is why we were so surprised when we first observed such a dramatic slow-down of the filling process in our experiments.”

The researchers – who included graduate students James Silva, Drew Loney and Ren Geryak and senior research engineer Peter Kottke – tried the experiment again using glycerol, a liquid that is not absorbed by the hydrogel. With glycerol, the capillary action proceeded through the hydrogel-coated microtube as with an uncoated tube in agreement with conventional theory. After using high-resolution optical visualization to study the meniscus propagation while the polymer swelled, the researchers realized they could put this previously-unknown behavior to good use.

Water absorption by the hydrogels occurs only when the materials remain below a specific transition temperature. When heated above that temperature, the materials no longer absorb water, eliminating the “stick-slip” phenomenon in the microtubes and allowing them to behave like ordinary tubes.

This ability to turn the stick-slip behavior on and off with temperature could provide a new way to control the flow of water-based liquid in microfluidic devices, including labs-on-a-chip. The transition temperature can be controlled by varying the chemical composition of the hydrogel.

“By locally heating or cooling the polymer inside a microfluidic chamber, you can either speed up the filling process or slow it down,” Fedorov said. “The time it takes for the liquid to travel the same distance can be varied up to three orders of magnitude. That would allow precise control of fluid flow on demand using external stimuli to change polymer film behavior.”
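That temperature switch can be written as a toy model. The ~32 °C threshold is PNIPAM’s well-known lower critical solution temperature; the two speeds are hypothetical placeholders chosen only to reflect the three-orders-of-magnitude contrast the researchers report.

```python
def fill_speed(temp_c, transition_c=32.0, slow=1.0e-5, fast=1.0e-2):
    """Toy model of temperature-gated capillary filling speed (m/s).
    Below the hydrogel's transition temperature the swelling film imposes
    slow stick-slip filling; above it the film stops absorbing water and
    the tube fills like an ordinary capillary, ~3 orders of magnitude
    faster. Speeds are illustrative, not measured values."""
    return slow if temp_c < transition_c else fast

print(fill_speed(25.0))  # hydrogel active: slow stick-slip filling
print(fill_speed(40.0))  # above transition: ordinary fast filling
```

Locally heating one chamber of a chip past the transition while leaving another below it would, in this picture, route liquid preferentially through the heated path.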

The heating or cooling could be done locally with lasers, tiny heaters, or thermoelectric devices placed at specific locations in the microfluidic devices.

That could allow precise timing of reactions in microfluidic devices by controlling the rate of reactant delivery and product removal, or allow a sequence of fast and slow reactions to occur. Another important application could be controlled drug release in which the desired rate of molecule delivery could be dynamically tuned over time to achieve the optimal therapeutic outcome.

In future work, Fedorov and his team hope to learn more about the physics of the hydrogel-modified capillaries and study capillary flow using partially-transparent microtubes. They also want to explore other “smart” polymers which change the flow rate in response to different stimuli, including the changing pH of the liquid, exposure to electromagnetic radiation, or the induction of mechanical stress – all of which can change the properties of a particular hydrogel designed to be responsive to those triggers.

“These experimental and theoretical results provide a new conceptual framework for liquid motion confined by soft, dynamically evolving polymer interfaces in which the system creates an energy barrier to further motion through elasto-capillary deformation, and then lowers the barrier through diffusive softening,” the paper’s authors wrote. “This insight has implications for optimal design of microfluidic and lab-on-a-chip devices based on stimuli-responsive smart polymers.”

In addition to those already mentioned, the research team included Professor Vladimir Tsukruk from the Georgia Tech School of Materials Science and Engineering and Rajesh Naik, Biotechnology Lead and Tech Advisor of the Nanostructured and Biological Materials Branch of the Air Force Research Laboratory (AFRL).

Here’s a link to and a citation for the paper,

Stick–slip water penetration into capillaries coated with swelling hydrogel by J. E. Silva, R. Geryak, D. A. Loney, P. A. Kottke, R. R. Naik, V. V. Tsukruk, and A. G. Fedorov. Soft Matter, 2015,11, 5933-5939 DOI: 10.1039/C5SM00660K First published online 23 Jun 2015

This paper is behind a paywall.

I sing the body cyber: two projects funded by the US National Science Foundation

Points to anyone who recognized the reference to Walt Whitman’s poem, “I sing the body electric,” from his classic collection, Leaves of Grass (1867 edition; h/t Wikipedia entry). I wonder if the cyber-physical systems (CPS) work being funded by the US National Science Foundation (NSF) will occasion poetry too.

More practically, a May 15, 2015 news item on Nanowerk, describes two cyber physical systems (CPS) research projects newly funded by the NSF,

Today [May 12, 2015] the National Science Foundation (NSF) announced two, five-year, center-scale awards totaling $8.75 million to advance the state-of-the-art in medical and cyber-physical systems (CPS).

One project will develop “Cyberheart”–a platform for virtual, patient-specific human heart models and associated device therapies that can be used to improve and accelerate medical-device development and testing. The other project will combine teams of microrobots with synthetic cells to perform functions that may one day lead to tissue and organ re-generation.

CPS are engineered systems that are built from, and depend upon, the seamless integration of computation and physical components. Often called the “Internet of Things,” CPS enable capabilities that go beyond the embedded systems of today.

“NSF has been a leader in supporting research in cyber-physical systems, which has provided a foundation for putting the ‘smart’ in health, transportation, energy and infrastructure systems,” said Jim Kurose, head of Computer & Information Science & Engineering at NSF. “We look forward to the results of these two new awards, which paint a new and compelling vision for what’s possible for smart health.”

Cyber-physical systems have the potential to benefit many sectors of our society, including healthcare. While advances in sensors and wearable devices have the capacity to improve aspects of medical care, from disease prevention to emergency response, and synthetic biology and robotics hold the promise of regenerating and maintaining the body in radical new ways, little is known about how advances in CPS can integrate these technologies to improve health outcomes.

These new NSF-funded projects will investigate two very different ways that CPS can be used in the biological and medical realms.

A May 12, 2015 NSF news release (also on EurekAlert), which originated the news item, describes the two CPS projects,

Bio-CPS for engineering living cells

A team of leading computer scientists, roboticists and biologists from Boston University, the University of Pennsylvania and MIT have come together to develop a system that combines the capabilities of nano-scale robots with specially designed synthetic organisms. Together, they believe this hybrid “bio-CPS” will be capable of performing heretofore impossible functions, from microscopic assembly to cell sensing within the body.

“We bring together synthetic biology and micron-scale robotics to engineer the emergence of desired behaviors in populations of bacterial and mammalian cells,” said Calin Belta, a professor of mechanical engineering, systems engineering and bioinformatics at Boston University and principal investigator on the project. “This project will impact several application areas ranging from tissue engineering to drug development.”

The project builds on previous research by each team member in diverse disciplines and early proof-of-concept designs of bio-CPS. According to the team, the research is also driven by recent advances in the emerging field of synthetic biology, in particular the ability to rapidly incorporate new capabilities into simple cells. Researchers so far have not been able to control and coordinate the behavior of synthetic cells in isolation, but the introduction of microrobots that can be externally controlled may be transformative.

In this new project, the team will focus on bio-CPS with the ability to sense, transport and work together. As a demonstration of their idea, they will develop teams of synthetic cell/microrobot hybrids capable of constructing a complex, fabric-like surface.

Vijay Kumar (University of Pennsylvania), Ron Weiss (MIT), and Douglas Densmore (BU) are co-investigators of the project.

Medical-CPS and the ‘Cyberheart’

CPS such as wearable sensors and implantable devices are already being used to assess health, improve quality of life, provide cost-effective care and potentially speed up disease diagnosis and prevention. [emphasis mine]

Extending these efforts, researchers from seven leading universities and centers are working together to develop far more realistic cardiac and device models than currently exist. This so-called “Cyberheart” platform can be used to test and validate medical devices faster and at a far lower cost than existing methods. CyberHeart also can be used to design safe, patient-specific device therapies, thereby lowering the risk to the patient.

“Innovative ‘virtual’ design methodologies for implantable cardiac medical devices will speed device development and yield safer, more effective devices and device-based therapies, than is currently possible,” said Scott Smolka, a professor of computer science at Stony Brook University and one of the principal investigators on the award.

The group’s approach combines patient-specific computational models of heart dynamics with advanced mathematical techniques for analyzing how these models interact with medical devices. The analytical techniques can be used to detect potential flaws in device behavior early on during the device-design phase, before animal and human trials begin. They also can be used in a clinical setting to optimize device settings on a patient-by-patient basis before devices are implanted.

“We believe that our coordinated, multi-disciplinary approach, which balances theoretical, experimental and practical concerns, will yield transformational results in medical-device design and foundations of cyber-physical system verification,” Smolka said.

The team will develop virtual device models which can be coupled together with virtual heart models to realize a full virtual development platform that can be subjected to computational analysis and simulation techniques. Moreover, they are working with experimentalists who will study the behavior of virtual and actual devices on animals’ hearts.

Co-investigators on the project include Edmund Clarke (Carnegie Mellon University), Elizabeth Cherry (Rochester Institute of Technology), W. Rance Cleaveland (University of Maryland), Flavio Fenton (Georgia Tech), Rahul Mangharam (University of Pennsylvania), Arnab Ray (Fraunhofer Center for Experimental Software Engineering [Germany]) and James Glimm and Radu Grosu (Stony Brook University). Richard A. Gray of the U.S. Food and Drug Administration is another key contributor.

It is fascinating to observe how terminology is shifting from pacemakers and deep brain stimulators as implants to “CPS such as wearable sensors and implantable devices … .” A new category has been created, CPS, which conjoins medical devices with other sensing devices such as wearable fitness monitors found in the consumer market. I imagine it’s an attempt to quell fears about injecting strange things into or adding strange things to your body—microrobots and nanorobots partially derived from synthetic biology research which are “… capable of performing heretofore impossible functions, from microscopic assembly to cell sensing within the body.” They’ve also sneaked in a reference to synthetic biology, an area of research where some concerns have been expressed, from my March 19, 2013 post about a poll and synthetic biology concerns,

In our latest survey, conducted in January 2013, three-fourths of respondents say they have heard little or nothing about synthetic biology, a level consistent with that measured in 2010. While initial impressions about the science are largely undefined, these feelings do not necessarily become more positive as respondents learn more. The public has mixed reactions to specific synthetic biology applications, and almost one-third of respondents favor a ban “on synthetic biology research until we better understand its implications and risks,” while 61 percent think the science should move forward.

I imagine that for scientists, 61% in favour of more research is not particularly comforting given how easily and quickly public opinion can shift.

Bendable, stretchable, light-weight, and transparent: a new competitor in the competition for ‘thinnest electric generator’

An Oct. 15, 2014 Columbia University (New York, US) press release (also on EurekAlert), describes another contender for the title of the world’s thinnest electric generator,

Researchers from Columbia Engineering and the Georgia Institute of Technology [US] report today [Oct. 15, 2014] that they have made the first experimental observation of piezoelectricity and the piezotronic effect in an atomically thin material, molybdenum disulfide (MoS2), resulting in a unique electric generator and mechanosensation devices that are optically transparent, extremely light, and very bendable and stretchable.

In a paper published online October 15, 2014, in Nature, research groups from the two institutions demonstrate the mechanical generation of electricity from the two-dimensional (2D) MoS2 material. The piezoelectric effect in this material had previously been predicted theoretically.

Here’s a link to and a citation for the paper,

Piezoelectricity of single-atomic-layer MoS2 for energy conversion and piezotronics by Wenzhuo Wu, Lei Wang, Yilei Li, Fan Zhang, Long Lin, Simiao Niu, Daniel Chenet, Xian Zhang, Yufeng Hao, Tony F. Heinz, James Hone, & Zhong Lin Wang. Nature (2014) doi:10.1038/nature13792 Published online 15 October 2014

This paper is behind a paywall. There is a free preview available with ReadCube Access.

Getting back to the Columbia University press release, it offers a general description of piezoelectricity and some insight into this new research on molybdenum disulfide,

Piezoelectricity is a well-known effect in which stretching or compressing a material causes it to generate an electrical voltage (or the reverse, in which an applied voltage causes it to expand or contract). But for materials of only a few atomic thicknesses, no experimental observation of piezoelectricity has been made, until now. The observation reported today provides a new property for two-dimensional materials such as molybdenum disulfide, opening the potential for new types of mechanically controlled electronic devices.
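The direct and converse effects described above can be sketched with the standard one-dimensional strain-charge relations. The coefficient value below is an illustrative assumption for demonstration, not a measured MoS2 number.

```python
def direct_effect_charge(stress_pa, d=3.0e-12):
    """Direct piezoelectric effect: mechanical stress T (Pa) produces a
    surface charge density D = d * T (C/m^2), where d is the
    piezoelectric coefficient in C/N (illustrative value)."""
    return d * stress_pa

def converse_effect_strain(field_v_per_m, d=3.0e-12):
    """Converse piezoelectric effect: an applied electric field E (V/m)
    produces a dimensionless strain S = d * E, with the same coefficient
    coupling the two effects."""
    return d * field_v_per_m

# 1 MPa of stress yields a small but measurable charge density.
print(direct_effect_charge(1.0e6))
```

The same coefficient d appears in both relations, which is why a material that generates charge when stretched also deforms when a voltage is applied.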

“This material—just a single layer of atoms—could be made as a wearable device, perhaps integrated into clothing, to convert energy from your body movement to electricity and power wearable sensors or medical devices, or perhaps supply enough energy to charge your cell phone in your pocket,” says James Hone, professor of mechanical engineering at Columbia and co-leader of the research.

“Proof of the piezoelectric effect and piezotronic effect adds new functionalities to these two-dimensional materials,” says Zhong Lin Wang, Regents’ Professor in Georgia Tech’s School of Materials Science and Engineering and a co-leader of the research. “The materials community is excited about molybdenum disulfide, and demonstrating the piezoelectric effect in it adds a new facet to the material.”

Hone and his research group demonstrated in 2008 that graphene, a 2D form of carbon, is the strongest material. He and Lei Wang, a postdoctoral fellow in Hone’s group, have been actively exploring the novel properties of 2D materials like graphene and MoS2 as they are stretched and compressed.

Zhong Lin Wang and his research group pioneered the field of piezoelectric nanogenerators for converting mechanical energy into electricity. He and postdoctoral fellow Wenzhuo Wu are also developing piezotronic devices, which use piezoelectric charges to control the flow of current through the material just as gate voltages do in conventional three-terminal transistors.

There are two keys to using molybdenum disulfide for generating current: using an odd number of layers and flexing it in the proper direction. The material is highly polar but, Zhong Lin Wang notes, an even number of layers cancels out the piezoelectric effect. The material’s crystalline structure also is piezoelectric in only certain crystalline orientations.

For the Nature study, Hone’s team placed thin flakes of MoS2 on flexible plastic substrates and determined how their crystal lattices were oriented using optical techniques. They then patterned metal electrodes onto the flakes. In research done at Georgia Tech, Wang’s group installed measurement electrodes on samples provided by Hone’s group, then measured current flows as the samples were mechanically deformed. They monitored the conversion of mechanical to electrical energy, and observed voltage and current outputs.

The researchers also noted that the output voltage reversed sign when they changed the direction of applied strain, and that it disappeared in samples with an even number of atomic layers, confirming theoretical predictions published last year. The presence of the piezotronic effect in odd-layer MoS2 was also observed for the first time.

“What’s really interesting is we’ve now found that a material like MoS2, which is not piezoelectric in bulk form, can become piezoelectric when it is thinned down to a single atomic layer,” says Lei Wang.

To be piezoelectric, a material must break central symmetry. A single atomic layer of MoS2 has such a structure, and should be piezoelectric. However, in bulk MoS2, successive layers are oriented in opposite directions, and generate positive and negative voltages that cancel each other out and give zero net piezoelectric effect.
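A minimal sketch of that cancellation, treating each layer as contributing one unit of piezoelectric response with alternating sign (purely illustrative; the real interlayer physics is more subtle than a simple sum):

```python
def net_layer_response(n_layers, per_layer=1.0):
    """Successive MoS2 layers are oriented in opposite directions, so
    their piezoelectric contributions alternate in sign: even stacks
    cancel to zero, odd stacks leave one uncancelled layer's worth."""
    return sum(per_layer * (-1) ** i for i in range(n_layers))

for n in range(1, 5):
    print(n, net_layer_response(n))  # odd n -> 1.0, even n -> 0.0
```

This reproduces the experimental pattern: a single layer (or any odd stack) is piezoelectric, while bilayers and bulk MoS2 are not.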

“This adds another member to the family of piezoelectric materials for functional devices,” says Wenzhuo Wu.

In fact, MoS2 is just one of a group of 2D semiconducting materials known as transition metal dichalcogenides, all of which are predicted to have similar piezoelectric properties. These are part of an even larger family of 2D materials whose piezoelectric properties remain unexplored. Importantly, as has been shown by Hone and his colleagues, 2D materials can be stretched much farther than conventional materials, particularly traditional ceramic piezoelectrics, which are quite brittle.

The research could open the door to development of new applications for the material and its unique properties.

“This is the first experimental work in this area and is an elegant example of how the world becomes different when the size of material shrinks to the scale of a single atom,” Hone adds. “With what we’re learning, we’re eager to build useful devices for all kinds of applications.”

Ultimately, Zhong Lin Wang notes, the research could lead to complete atomic-thick nanosystems that are self-powered by harvesting mechanical energy from the environment. This study also reveals the piezotronic effect in two-dimensional materials for the first time, which greatly expands the application of layered materials for human-machine interfacing, robotics, MEMS, and active flexible electronics.

I see there’s a reference in that last paragraph to “harvesting mechanical energy from the environment.” I’m not sure what they mean by that but I have written a few times about harvesting biomechanical energy. One of my earliest pieces is a July 12, 2010 post which features work by Zhong Lin Wang on harvesting energy from heart beats, blood flow, muscle stretching, or even irregular vibrations. One of my latest pieces is a Sept. 17, 2014 post about some work in Canada on harvesting energy from the jaw as you chew.

A final note, Dexter Johnson discusses this work in an Oct. 16, 2014 post on the Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website).