Tag Archives: Georgia Tech

4D printing, what is that?

According to an April 12, 2017 news item on ScienceDaily, shapeshifting in response to environmental stimuli is the fourth dimension (a link to a posting about 4D printing with another fourth dimension appears at the end of this piece),

A team of researchers from Georgia Institute of Technology and two other institutions has developed a new 3-D printing method to create objects that can permanently transform into a range of different shapes in response to heat.

The team, which included researchers from the Singapore University of Technology and Design (SUTD) and Xi’an Jiaotong University in China, created the objects by printing layers of shape memory polymers with each layer designed to respond differently when exposed to heat.

“This new approach significantly simplifies and increases the potential of 4-D printing by incorporating the mechanical programming post-processing step directly into the 3-D printing process,” said Jerry Qi, a professor in the George W. Woodruff School of Mechanical Engineering at Georgia Tech. “This allows high-resolution 3-D printed components to be designed by computer simulation, 3-D printed, and then directly and rapidly transformed into new permanent configurations by simply heating.”

The research was reported April 12 [2017] in the journal Science Advances, a publication of the American Association for the Advancement of Science. The work is funded by the U.S. Air Force Office of Scientific Research, the U.S. National Science Foundation and the Singapore National Research Foundation through the SUTD DManD Centre.

An April 12, 2017 Singapore University of Technology and Design (SUTD) press release on EurekAlert provides more detail,

4D printing is an emerging technology that allows a 3D-printed component to transform its structure by exposing it to heat, light, humidity, or other environmental stimuli. This technology extends the shape creation process beyond 3D printing, resulting in additional design flexibility that can lead to new types of products that adjust their functionality in response to the environment in a pre-programmed manner. However, 4D printing generally involves complex and time-consuming post-processing steps to mechanically programme the component. Furthermore, the materials are often limited to soft polymers, which limits their applicability in structural scenarios.

A group of researchers from the SUTD, Georgia Institute of Technology, Xi’an Jiaotong University and Zhejiang University has introduced an approach that significantly simplifies and increases the potential of 4D printing by incorporating the mechanical programming post-processing step directly into the 3D printing process. This allows high-resolution 3D-printed components to be designed by computer simulation, 3D printed, and then directly and rapidly transformed into new permanent configurations by using heat. This approach can help save printing time and materials used by up to 90%, while completely eliminating the time-consuming mechanical programming process from the design and manufacturing workflow.

“Our approach involves printing composite materials where at room temperature one material is soft but can be programmed to contain internal stress, and the other material is stiff,” said Dr. Zhen Ding of SUTD. “We use computational simulations to design composite components where the stiff material has a shape and size that prevents the release of the programmed internal stress from the soft material after 3D printing. Upon heating, the stiff material softens and allows the soft material to release its stress. This results in a change – often dramatic – in the product shape.” This new shape is fixed when the product is cooled, with good mechanical stiffness. The research demonstrated many interesting shape changing parts, including a lattice that can expand by almost 8 times when heated.
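To make that constraint-and-release logic concrete, here is a toy Python sketch of the mechanism as described above. It is emphatically not the team’s model: the logistic glassy-to-rubbery modulus drop, all material parameters, and the release criterion are invented for illustration,

import math

# Toy model of the two-material, direct 4D-printing concept: a pre-stressed
# soft component is held in place by a stiff component until heating softens
# the stiff one. All numbers are illustrative, not from the paper.

def stiff_modulus(T, E_glassy=2000.0, E_rubbery=5.0, T_g=60.0, width=3.0):
    """Smooth glassy-to-rubbery modulus drop (MPa) around a transition T_g (deg C)."""
    glassy_fraction = 1.0 / (1.0 + math.exp((T - T_g) / width))
    return E_rubbery + (E_glassy - E_rubbery) * glassy_fraction

E_SOFT = 1.0  # MPa; the pre-stressed soft material, roughly constant over this range

for T in (20, 40, 60, 80):
    E_stiff = stiff_modulus(T)
    # Crude criterion: the stored stress releases once the "stiff" material
    # no longer dominates the composite's stiffness.
    released = E_stiff < 10 * E_SOFT
    print(f"T = {T:2d} deg C  stiff modulus = {E_stiff:7.1f} MPa  shape change: {released}")

Past the transition temperature the ‘stiff’ phase can no longer pin the soft phase’s programmed stress, which is the point at which the printed part snaps into its new shape.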

This new shape becomes permanent and the composite material will not return to its original 3D-printed shape, upon further heating or cooling. “This is because of the shape memory effect,” said Prof. H. Jerry Qi of Georgia Tech. “In the two-material composite design, the stiff material exhibits shape memory, which helps lock the transformed shape into a permanent one. Additionally, the printed structure also exhibits the shape memory effect, i.e. it can then be programmed into further arbitrary shapes that can always be recovered to its new permanent shape, but not its 3D-printed shape.”

Said SUTD’s Prof. Martin Dunn, “The key advance of this work is a 4D printing method that is dramatically simplified and allows the creation of high-resolution complex 3D reprogrammable products; it promises to enable myriad applications across biomedical devices, 3D electronics, and consumer products. It even opens the door to a new paradigm in product design, where components are designed from the outset to inhabit multiple configurations during service.”

Here’s a video, uploaded on April 17, 2017,

A research team led by the Singapore University of Technology and Design’s (SUTD) Associate Provost of Research, Professor Martin Dunn, has come up with a new and simplified 4D printing method that uses a 3D printer to rapidly create 3D objects, which can permanently transform into a range of different shapes in response to heat.

Here’s a link to and a citation for the paper,

Direct 4D printing via active composite materials by Zhen Ding, Chao Yuan, Xirui Peng, Tiejun Wang, H. Jerry Qi, and Martin L. Dunn. Science Advances 12 Apr 2017: Vol. 3, no. 4, e1602890 DOI: 10.1126/sciadv.1602890

This paper is open access.

Here is a link to a post about another 4th dimension, time,

4D printing: a hydrogel orchid (Jan. 28, 2016)

Council of Canadian Academies and science policy for Alberta

The Council of Canadian Academies (CCA) has expanded its approach: from assembling expert panels to report on questions posed by various Canadian government agencies (assessments), to special reports from three-member panels, and now to a workshop on the province of Alberta’s science policy ideas. From an Oct. 27, 2016 CCA news release (received via email),

The Council of Canadian Academies (CCA) is pleased to announce that it is undertaking an expert panel workshop on science policy ideas under development in Alberta. The workshop will engage national and international experts to explore various dimensions of sub-national science systems and the role of sub-national science policy.

“We are pleased to undertake this project,” said Eric M. Meslin, PhD, FCAHS, President and CEO of the CCA. “It is an assessment that could discuss strategies that have applications in Alberta, across Canada, and elsewhere.”

A two-day workshop, to be undertaken in November 2016, will bring together a multidisciplinary and multi-sectoral group of leading Canadian and international experts to review, validate, and advance work being done on science policy in Alberta. The workshop will explore the necessary considerations when creating science policy at the sub-national level. Specifically it will:

  • Debate and validate the main outcomes of a sub-national science enterprise, particularly in relation to knowledge, human, and social capital.
  • Identify the key elements and characteristics of a successful science enterprise (e.g., funding, trust, capacity, science culture, supporting interconnections and relationships) with a particular focus at a sub-national level.
  • Explore potential intents of a sub-national science policy, important features of such a policy, and the role of the policy in informing investment decisions.

To lead the design of the workshop, complete the necessary background research, and develop the workshop summary report, the CCA has appointed a five-member Workshop Steering Committee, chaired by Joy Johnson, FCAHS, Vice President, Research, Simon Fraser University. The other Steering Committee members are: Paul Dufour, Adjunct Professor, Institute for Science, Society and Policy, University of Ottawa, and Principal, Paulicy Works; Janet Halliwell, Principal, J.E. Halliwell Associates, Inc.; Kaye Husbands Fealing, Chair and Professor, School of Public Policy, Georgia Tech; and Marc LePage, President and CEO, Genome Canada.

The CCA, under the guidance of its Scientific Advisory Committee, and in collaboration with the Workshop Steering Committee, is now assembling a multidisciplinary, multi-sectoral, group of experts to participate in the two-day workshop. The CCA’s Member Academies – the Royal Society of Canada, the Canadian Academy of Engineering, and the Canadian Academy of Health Sciences – are a key source of membership for expert panels. Many experts are also Fellows of the Academies.

The workshop results will be published in a final summary report in spring 2017. This workshop assessment is supported by a grant from the Government of Alberta.

By comparison with the CCA’s last assessment mentioned here in a July 1, 2016 posting (The State of Science and Technology and Industrial Research and Development in Canada), this workshop has a better balance. The steering committee is chaired by a woman (the first time I’ve seen that in a few years) and has enough female members to reach 60% representation. There is no representation from Québec (perhaps not a surprise given this is Alberta’s project), but the western provinces account for 40%, with members from both BC and Alberta. Business can boast 30% (?), with Paul Dufour doing double duty as both academic and business owner. It’s good to see international representation, and one day I hope to see it come from somewhere other than the US, the UK, and/or the European Union. Maybe Asia?

You can find contact information on the CCA’s Towards a Science Policy in Alberta webpage.

One comment: I find the lack of a specific date for the workshop interesting. It suggests either that they were having difficulty scheduling it or that they wanted to keep the ‘unwashed’ away.

Achieving ultra-low friction without oil additives

Oiled gears as small parts of large mechanism. Courtesy: Georgia Institute of Technology

Those gears are gorgeous, especially in full size; I will be giving a link to a full-size version in a bit. Meanwhile, an Oct. 11, 2016 news item on Nanowerk announces a means of achieving ultra-low friction without special oil additives,

Researchers at Georgia Institute of Technology [Georgia Tech; US] have developed a new process for treating metal surfaces that has the potential to improve efficiency in piston engines and a range of other equipment.

The method improves the ability of metal surfaces to bond with oil, significantly reducing friction without special oil additives.

“About 50 percent of the mechanical energy losses in an internal combustion engine result from piston assembly friction. So if we can reduce the friction, we can save energy and reduce fuel and oil consumption,” said Michael Varenberg, an assistant professor in Georgia Tech’s George W. Woodruff School of Mechanical Engineering.

An Oct. 5, 2016 Georgia Tech news release (also on EurekAlert but dated Oct. 11, 2016), which originated the news item, describes the research in more detail,

In the study, which was published Oct. 5 [2016] in the journal Tribology Letters, the researchers at Georgia Tech and Technion – Israel Institute of Technology tested treating the surface of cast iron blocks by blasting it with a mixture of copper sulfide and aluminum oxide. The shot peening modified the surface chemically in a way that changed how oil molecules bonded with the metal and led to a superior surface lubricity.

“We want oil molecules to be connected strongly to the surface. Traditionally this connection is created by putting additives in the oil,” Varenberg said. “In this specific case, we shot peen the surface with a blend of alumina and copper sulfide particles. Making the surface more active chemically by deforming it allows for a replacement reaction to form iron sulfide on top of the iron. And iron sulfides are known for very strong bonds with oil molecules.”

Oil is the primary tool used to reduce the friction that occurs when two surfaces slide in contact. The new surface treatment results in an ultra-low friction coefficient of about 0.01 in a base oil environment, which is about 10 times less than a friction coefficient obtained on a reference untreated surface, the researchers reported.

“The reported result surpasses the performance of the best current commercial oils and is similar to the performance of lubricants formulated with tungsten disulfide-based nanoparticles, but critically, our process does not use any expensive nanostructured media,” Varenberg said.

The method for reducing surface friction is flexible, and similar results can be achieved using a variety of processes other than shot peening, such as lapping, honing, burnishing, or laser shock peening, the researchers suggest. That would make the process even easier to adapt to a range of uses and industries. The researchers plan to continue examining the fundamental functional principles and physicochemical mechanisms that made the treatment so successful.

“This straightforward, scalable pathway to ultra-low friction opens new horizons for surface engineering, and it could significantly reduce energy losses on an industrial scale,” Varenberg said. “Moreover, our finding may result in a paradigm shift in the art of lubrication and initiate a whole new direction in surface science and engineering due to the generality of the idea and a broad range of potential applications.”

Here’s a link to and a citation for the paper,

Mechano-Chemical Surface Modification with Cu2S: Inducing Superior Lubricity by Michael Varenberg, Grigory Ryk, Alexander Yakhnis, Yuri Kligerman, Neha Kondekar, & Matthew T. McDowell. Tribol Lett (2016) 64: 28. doi:10.1007/s11249-016-0758-8 First online: Oct. 5, 2016

This paper is behind a paywall.

A human user manual—for robots

Researchers from the Georgia Institute of Technology (Georgia Tech), funded by the US Office of Naval Research (ONR), have developed a program that teaches robots to read stories and more in an effort to educate them about humans. From a June 16, 2016 ONR news release by Warren Duffie Jr. (also on EurekAlert),

With support from the Office of Naval Research (ONR), researchers at the Georgia Institute of Technology have created an artificial intelligence software program named Quixote to teach robots to read stories, learn acceptable behavior and understand successful ways to conduct themselves in diverse social situations.

“For years, researchers have debated how to teach robots to act in ways that are appropriate, non-intrusive and trustworthy,” said Marc Steinberg, an ONR program manager who oversees the research. “One important question is how to explain complex concepts such as policies, values or ethics to robots. Humans are really good at using narrative stories to make sense of the world and communicate to other people. This could one day be an effective way to interact with robots.”

The rapid pace of artificial intelligence has stirred fears by some that robots could act unethically or harm humans. Dr. Mark Riedl, an associate professor and director of Georgia Tech’s Entertainment Intelligence Lab, hopes to ease concerns by having Quixote serve as a “human user manual” by teaching robots values through simple stories. After all, stories inform, educate and entertain–reflecting shared cultural knowledge, social mores and protocols.

For example, if a robot is tasked with picking up a pharmacy prescription for a human as quickly as possible, it could: a) take the medicine and leave, b) interact politely with pharmacists, or c) wait in line. Without value alignment and positive reinforcement, the robot might logically deduce robbery is the fastest, cheapest way to accomplish its task. However, with value alignment from Quixote, it would be rewarded for waiting patiently in line and paying for the prescription.

For their research, Riedl and his team crowdsourced stories from the Internet. Each tale needed to highlight daily social interactions–going to a pharmacy or restaurant, for example–as well as socially appropriate behaviors (e.g., paying for meals or medicine) within each setting.

The team plugged the data into Quixote to create a virtual agent–in this case, a video game character placed into various game-like scenarios mirroring the stories. As the virtual agent completed a game, it earned points and positive reinforcement for emulating the actions of protagonists in the stories.

Riedl’s team ran the agent through 500,000 simulations, and it displayed proper social interactions more than 90 percent of the time.

“These games are still fairly simple,” said Riedl, “more like ‘Pac-Man’ instead of ‘Halo.’ However, Quixote enables these artificial intelligence agents to immerse themselves in a story, learn the proper sequence of events and be encoded with acceptable behavior patterns. This type of artificial intelligence can be adapted to robots, offering a variety of applications.”

Within the next six months, Riedl’s team hopes to upgrade Quixote’s games from “old-school” to more modern and complex styles like those found in Minecraft–in which players use blocks to build elaborate structures and societies.

Riedl believes Quixote could one day make it easier for humans to train robots to perform diverse tasks. Steinberg notes that robotic and artificial intelligence systems may one day be a much larger part of military life. This could involve mine detection and deactivation, equipment transport and humanitarian and rescue operations.

“Within a decade, there will be more robots in society, rubbing elbows with us,” said Riedl. “Social conventions grease the wheels of society, and robots will need to understand the nuances of how humans do things. That’s where Quixote can serve as a valuable tool. We’re already seeing it with virtual agents like Siri and Cortana, which are programmed not to say hurtful or insulting things to users.”

This story brought to mind two other projects: RoboEarth (an internet for robots only), mentioned in my Jan. 14, 2014 posting, which was an update on the project featuring its use in hospitals, and RoboBrain, a robot learning project (sourcing the internet, YouTube, and more for information to teach robots), mentioned in my Sept. 2, 2014 posting.

Titanium dioxide nanoparticles have subtle effects on oxidative stress genes?

There’s research from the Georgia Institute of Technology (Georgia Tech; US) suggesting that titanium dioxide nanoparticles may have long term side effects. From a May 10, 2016 news item on ScienceDaily,

A nanoparticle commonly used in food, cosmetics, sunscreen and other products can have subtle effects on the activity of genes expressing enzymes that address oxidative stress inside two types of cells. While the titanium dioxide (TiO2) nanoparticles are considered non-toxic because they don’t kill cells at low concentrations, these cellular effects could add to concerns about long-term exposure to the nanomaterial.

A May 9, 2016 Georgia Tech news release on Newswire (also on EurekAlert), which originated the news item, describes the research in more detail,

Researchers at the Georgia Institute of Technology used high-throughput screening techniques to study the effects of titanium dioxide nanoparticles on the expression of 84 genes related to cellular oxidative stress. Their work found that six genes, four of them from a single gene family, were affected by a 24-hour exposure to the nanoparticles.

The effect was seen in two different kinds of cells exposed to the nanoparticles: human HeLa* cancer cells commonly used in research, and a line of monkey kidney cells. Polystyrene nanoparticles similar in size and surface electrical charge to the titanium dioxide nanoparticles did not produce a similar effect on gene expression.

“This is important because every standard measure of cell health shows that cells are not affected by these titanium dioxide nanoparticles,” said Christine Payne, an associate professor in Georgia Tech’s School of Chemistry and Biochemistry. “Our results show that there is a more subtle change in oxidative stress that could be damaging to cells or lead to long-term changes. This suggests that other nanoparticles should be screened for similar low-level effects.”

The research was reported online May 6 in the Journal of Physical Chemistry C. The work was supported by the National Institutes of Health (NIH) through the HERCULES Center at Emory University, and by a Vasser Woolley Fellowship.

Titanium dioxide nanoparticles help make powdered donuts white, protect skin from the sun’s rays and reflect light in painted surfaces. In concentrations commonly used, they are considered non-toxic, though several other studies have raised concern about potential effects on gene expression that may not directly impact the short-term health of cells.

To determine whether the nanoparticles could affect genes involved in managing oxidative stress in cells, Payne and colleague Melissa Kemp – an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University – designed a study to broadly evaluate the nanoparticle’s impact on the two cell lines.

Working with graduate students Sabiha Runa and Dipesh Khanal, they separately incubated HeLa cells and monkey kidney cells with titanium dioxide at levels 100 times less than the minimum concentration known to initiate effects on cell health. After incubating the cells for 24 hours with the TiO2, the cells were lysed and their contents analyzed using both PCR and Western blot techniques to study the expression of 84 genes associated with the cells’ ability to address oxidative processes.

Payne and Kemp were surprised to find changes in the expression of six genes, including four from the peroxiredoxin family of enzymes that helps cells degrade hydrogen peroxide, a byproduct of cellular oxidation processes. Too much hydrogen peroxide can create oxidative stress which can damage DNA and other molecules.

The effect measured was significant – changes of about 50 percent in enzyme expression compared to cells that had not been incubated with nanoparticles. The tests were conducted in triplicate and produced similar results each time.
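The release doesn’t say how those numbers were derived, but the standard way to turn qPCR measurements into this kind of fold change is the 2^(−ΔΔCt) (Livak) method. Here’s a minimal Python sketch, with invented Ct values, of what such a calculation looks like,

# Standard 2^(-ddCt) fold-change calculation for qPCR data (Livak method).
# The Ct values below are invented for illustration; the release gives none.

def fold_change(ct_gene_treated, ct_ref_treated, ct_gene_control, ct_ref_control):
    """Relative expression of a target gene in treated vs. control cells,
    normalized to a reference (housekeeping) gene."""
    d_ct_treated = ct_gene_treated - ct_ref_treated
    d_ct_control = ct_gene_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# A hypothetical peroxiredoxin-family gene after 24 h of TiO2 exposure:
fc = fold_change(ct_gene_treated=25.0, ct_ref_treated=18.0,
                 ct_gene_control=24.0, ct_ref_control=18.0)
print(f"fold change: {fc:.2f}")  # 0.50, i.e. expression halved (down-regulated)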

“One thing that was really surprising was that this whole family of proteins was affected, though some were up-regulated and some were down-regulated,” Kemp said. “These were all related proteins, so the question is why they would respond differently to the presence of the nanoparticles.”

The researchers aren’t sure how the nanoparticles bind with the cells, but they suspect it may involve the protein corona that surrounds the particles. The corona is made up of serum proteins that normally serve as food for the cells, but adsorb to the nanoparticles in the culture medium. The corona proteins have a protective effect on the cells, but may also serve as a way for the nanoparticles to bind to cell receptors.

Titanium dioxide is well known for its photo-catalytic effects under ultraviolet light, but the researchers don’t think that’s in play here because their culturing was done in ambient light – or in the dark. The individual nanoparticles had diameters of about 21 nanometers, but in cell culture formed much larger aggregates.

In future work, Payne and Kemp hope to learn more about the interaction, including where the enzyme-producing proteins are located in the cells. For that, they may use HyPer-Tau, a reporter protein they developed to track the location of hydrogen peroxide within cells.

The research suggests a re-evaluation may be necessary for other nanoparticles that could create subtle effects even though they’ve been deemed safe.

“Earlier work had suggested that nanoparticles can lead to oxidative stress, but nobody had really looked at this level and at so many different proteins at the same time,” Payne said. “Our research looked at such low concentrations that it does raise questions about what else might be affected. We looked specifically at oxidative stress, but there may be other genes that are affected, too.”

Those subtle differences may matter when they’re added to other factors.

“Oxidative stress is implicated in all kinds of inflammatory and immune responses,” Kemp noted. “While the titanium dioxide alone may just be modulating the expression levels of this family of proteins, if that is happening at the same time you have other types of oxidative stress for different reasons, then you may have a cumulative effect.”

*HeLa cells are named for Henrietta Lacks, who unknowingly donated her immortal cell line to medical research. You can find more about the story on the Oprah Winfrey website, which features an excerpt from the Rebecca Skloot book “The Immortal Life of Henrietta Lacks.” By the way, on May 2, 2016 it was announced that Oprah Winfrey would star in a movie for HBO as Henrietta Lacks’ daughter in an adaptation of the Rebecca Skloot book. You can read more about the proposed production in a May 3, 2016 article by Benjamin Lee for the Guardian.

Getting back to titanium dioxide nanoparticles and their possible long term effects, here’s a link to and a citation for the Georgia Tech team’s paper,

TiO2 Nanoparticles Alter the Expression of Peroxiredoxin Antioxidant Genes by Sabiha Runa, Dipesh Khanal, Melissa L. Kemp, and Christine K. Payne. J. Phys. Chem. C, Article ASAP DOI: 10.1021/acs.jpcc.6b01939 Publication Date (Web): April 21, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

Robots and humans?

I have two robot news bits for this posting. The first probes the unease currently being expressed (pop culture movies, Stephen Hawking, the Cambridge Centre for Existential Risk, etc.) about robots and their increasing intelligence and increased use in all types of labour formerly and currently performed by humans. The second item is about a research project where ‘artificial agents’ (robots) are being taught human values with stories.

Human labour obsolete?

‘When machines can do any job, what will humans do?’ is the question being asked in a presentation by Rice University computer scientist Moshe Vardi for the American Association for the Advancement of Science (AAAS) annual meeting held in Washington, D.C. from Feb. 11 – 15, 2016.

Here’s more about Dr. Vardi’s provocative question from a Feb. 14, 2016 Rice University news release (also on EurekAlert),

Rice University computer scientist Moshe Vardi expects that within 30 years, machines will be capable of doing almost any job that a human can. In anticipation, he is asking his colleagues to consider the societal implications. Can the global economy adapt to greater than 50 percent unemployment? Will those out of work be content to live a life of leisure?

“We are approaching a time when machines will be able to outperform humans at almost any task,” Vardi said. “I believe that society needs to confront this question before it is upon us: If machines are capable of doing almost any work humans can do, what will humans do?”

Vardi addressed this issue Sunday [Feb. 14, 2016] in a presentation titled “Smart Robots and Their Impact on Society” at one of the world’s largest and most prestigious scientific meetings — the annual meeting of the American Association for the Advancement of Science in Washington, D.C.

“The question I want to put forward is, Does the technology we are developing ultimately benefit mankind?” Vardi said. He asked the question after presenting a body of evidence suggesting that the pace of advancement in the field of artificial intelligence (AI) is increasing, even as existing robotic and AI technologies are eliminating a growing number of middle-class jobs and thereby driving up income inequality.

Vardi, a member of both the National Academy of Engineering and the National Academy of Sciences, is a Distinguished Service Professor and the Karen Ostrum George Professor of Computational Engineering at Rice, where he also directs Rice’s Ken Kennedy Institute for Information Technology. Since 2008 he has served as the editor-in-chief of Communications of the ACM, the flagship publication of the Association for Computing Machinery (ACM), one of the world’s largest computing professional societies.

Vardi said some people believe that future advances in automation will ultimately benefit humans, just as automation has benefited society since the dawn of the industrial age.

“A typical answer is that if machines will do all our work, we will be free to pursue leisure activities,” Vardi said. But even if the world economic system could be restructured to enable billions of people to live lives of leisure, Vardi questioned whether it would benefit humanity.

“I do not find this a promising future, as I do not find the prospect of leisure-only life appealing. I believe that work is essential to human well-being,” he said.

“Humanity is about to face perhaps its greatest challenge ever, which is finding meaning in life after the end of ‘In the sweat of thy face shalt thou eat bread,’” Vardi said. “We need to rise to the occasion and meet this challenge” before human labor becomes obsolete, he said.

In addition to dual membership in the National Academies, Vardi is a Guggenheim fellow and a member of the American Academy of Arts and Sciences, the European Academy of Sciences and the Academia Europaea. He is a fellow of the ACM, the American Association for Artificial Intelligence and the Institute of Electrical and Electronics Engineers (IEEE). His numerous honors include the Southeastern Universities Research Association’s 2013 Distinguished Scientist Award, the 2011 IEEE Computer Society Harry H. Goode Award, the 2008 ACM Presidential Award, the 2008 Blaise Pascal Medal for Computer Science from the European Academy of Sciences and the 2000 Gödel Prize for outstanding papers in the area of theoretical computer science.

Vardi joined Rice’s faculty in 1993. His research centers upon the application of logic to computer science, database systems, complexity theory, multi-agent systems and specification and verification of hardware and software. He is the author or co-author of more than 500 technical articles and of two books, “Reasoning About Knowledge” and “Finite Model Theory and Its Applications.”

In a Feb. 5, 2015 post, I rounded up a number of articles about our robot future. It provides a still useful overview of the thinking on the topic.

Teaching human values with stories

A Feb. 12, 2016 Georgia (US) Institute of Technology (Georgia Tech) news release (also on EurekAlert) describes the research,

The rapid pace of artificial intelligence (AI) has raised fears about whether robots could act unethically or soon choose to harm humans. Some are calling for bans on robotics research; others are calling for more research to understand how AI might be constrained. But how can robots learn ethical behavior if there is no “user manual” for being human?

Researchers Mark Riedl and Brent Harrison from the School of Interactive Computing at the Georgia Institute of Technology believe the answer lies in “Quixote” — to be unveiled at the AAAI [Association for the Advancement of Artificial Intelligence]-16 Conference in Phoenix, Ariz. (Feb. 12 – 17, 2016). Quixote teaches “value alignment” to robots by training them to read stories, learn acceptable sequences of events and understand successful ways to behave in human societies.

“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels and other literature,” says Riedl, associate professor and director of the Entertainment Intelligence Lab. “We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”

Quixote is a technique for aligning an AI’s goals with human values by placing rewards on socially appropriate behavior. It builds upon Riedl’s prior research — the Scheherazade system — which demonstrated how artificial intelligence can gather a correct sequence of actions by crowdsourcing story plots from the Internet.

Scheherazade learns what is a normal or “correct” plot graph. It then passes that data structure along to Quixote, which converts it into a “reward signal” that reinforces certain behaviors and punishes other behaviors during trial-and-error learning. In essence, Quixote learns that it will be rewarded whenever it acts like the protagonist in a story instead of randomly or like the antagonist.

For example, if a robot is tasked with picking up a prescription for a human as quickly as possible, the robot could a) rob the pharmacy, take the medicine, and run; b) interact politely with the pharmacists; or c) wait in line. Without value alignment and positive reinforcement, the robot would learn that robbing is the fastest and cheapest way to accomplish its task. With value alignment from Quixote, the robot would be rewarded for waiting patiently in line and paying for the prescription.

Riedl and Harrison demonstrate in their research how a value-aligned reward signal can be produced: all possible steps in a given scenario are uncovered and mapped into a plot trajectory tree, which is then used by the robotic agent to make “plot choices” (akin to what humans might remember as a Choose-Your-Own-Adventure novel) and receive rewards or punishments based on its choices.
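Here’s a deliberately simplified Python sketch of that loop. It is not the actual Quixote code: the plot graph, action names, reward values, and learning rule are all invented for illustration, but the mechanism (a positive signal for protagonist-like steps, a negative one otherwise, learned by trial and error) follows the description above,

import random

# Hypothetical "plot graph" distilled from crowdsourced stories: the socially
# acceptable action sequence for picking up a prescription (invented example).
PROTAGONIST_PLOT = ["enter pharmacy", "wait in line", "pay", "leave"]
ACTIONS = ["enter pharmacy", "wait in line", "pay", "grab medicine and run", "leave"]

def reward(step, action):
    """Value-aligned reward signal: +1 for matching the protagonist's plot
    at this step of the story, -1 for any other (antagonist-like) action."""
    return 1.0 if action == PROTAGONIST_PLOT[step] else -1.0

# Trial-and-error learning: one action-value table per story step,
# with epsilon-greedy exploration.
values = [{a: 0.0 for a in ACTIONS} for _ in PROTAGONIST_PLOT]
for episode in range(2000):
    for step in range(len(PROTAGONIST_PLOT)):
        if random.random() < 0.1:
            action = random.choice(ACTIONS)                   # explore
        else:
            action = max(values[step], key=values[step].get)  # exploit
        values[step][action] += 0.1 * (reward(step, action) - values[step][action])

print([max(v, key=v.get) for v in values])
# Converges on the protagonist sequence rather than "grab medicine and run".

In the real system, the plot graph comes from Scheherazade’s crowdsourced stories and the agent acts inside a game-like simulation, but the reward logic is the same in spirit.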

The Quixote technique is best for robots that have a limited purpose but need to interact with humans to achieve it, and it is a primitive first step toward general moral reasoning in AI, Riedl says.

“We believe that AI has to be enculturated to adopt the values of a particular society, and in doing so, it will strive to avoid unacceptable behavior,” he adds. “Giving robots the ability to read and understand our stories may be the most expedient means in the absence of a human user manual.”

So there you have it, some food for thought.

When an atom more or less makes a big difference

As scientists continue exploring the nanoscale, it seems that finding that the number of atoms in your particle makes a difference is no longer so surprising. From a Jan. 28, 2016 news item on ScienceDaily,

Combining experimental investigations and theoretical simulations, researchers have explained why platinum nanoclusters of a specific size range facilitate the hydrogenation reaction used to produce ethane from ethylene. The research offers new insights into the role of cluster shapes in catalyzing reactions at the nanoscale, and could help materials scientists optimize nanocatalysts for a broad class of other reactions.

A Jan. 28, 2016 Georgia Institute of Technology (Georgia Tech) news release (*also on EurekAlert*), which originated the news item, expands on the theme,

At the macro-scale, the conversion of ethylene has long been considered among the reactions insensitive to the structure of the catalyst used. However, by examining reactions catalyzed by platinum clusters containing between 9 and 15 atoms, researchers in Germany and the United States found that at the nanoscale, that’s no longer true. The shape of nanoscale clusters, they found, can dramatically affect reaction efficiency.

While the study investigated only platinum nanoclusters and the ethylene reaction, the fundamental principles may apply to other catalysts and reactions, demonstrating how materials at the very smallest size scales can provide different properties than the same material in bulk quantities. …

“We have re-examined the validity of a very fundamental concept on a very fundamental reaction,” said Uzi Landman, a Regents’ Professor and F.E. Callaway Chair in the School of Physics at the Georgia Institute of Technology. “We found that in the ultra-small catalyst range, on the order of a nanometer in size, old concepts don’t hold. New types of reactivity can occur because of changes in one or two atoms of a cluster at the nanoscale.”

The widely-used conversion process actually involves two separate reactions: (1) dissociation of H2 molecules into single hydrogen atoms, and (2) their addition to the ethylene, which involves conversion of a double bond into a single bond. In addition to producing ethane, the reaction can also take an alternative route that leads to the production of ethylidyne, which poisons the catalyst and prevents further reaction.

The project began with Professor Ueli Heiz and researchers in his group at the Technical University of Munich experimentally examining reaction rates for clusters containing 9, 10, 11, 12 or 13 platinum atoms that had been placed atop a magnesium oxide substrate. The 9-atom nanoclusters failed to produce a significant reaction, while larger clusters catalyzed the ethylene hydrogenation reaction with increasingly better efficiency. The best reaction occurred with 13-atom clusters.

Bokwon Yoon, a research scientist in Georgia Tech’s Center for Computational Materials Science, and Landman, the center’s director, then used large-scale first-principles quantum mechanical simulations to understand how the size of the clusters – and their shape – affected the reactivity. Using their simulations, they discovered that the 9-atom cluster resembled a symmetrical “hut,” while the larger clusters had bulges that served to concentrate electrical charges from the substrate.

“That one atom changes the whole activity of the catalyst,” Landman said. “We found that the extra atom operates like a lightning rod. The distribution of the excess charge from the substrate helps facilitate the reaction. Platinum 9 has a compact shape that doesn’t facilitate the reaction, but adding just one atom changes everything.”

Here’s an illustration featuring the difference between a 9 atom cluster and a 10 atom cluster,

A single atom makes a difference in the catalytic properties of platinum nanoclusters. Shown are platinum 9 (top) and platinum 10 (bottom). (Credit: Uzi Landman, Georgia Tech)

The news release explains why the larger clusters function as catalysts,

Nanoclusters with 13 atoms provided the maximum reactivity because the additional atoms shift the structure in a phenomenon Landman calls “fluxionality.” This structural adjustment has also been noted in earlier work of these two research groups, in studies of clusters of gold [emphasis mine], which are used in other catalytic reactions.

“Dynamic fluxionality is the ability of the cluster to distort its structure to accommodate the reactants to actually enhance reactivity,” he explained. “Only very small aggregates of metal can show such behavior, which mimics a biochemical enzyme.”

The simulations showed that catalyst poisoning also varies with cluster size – and temperature. The 10-atom clusters can be poisoned at room temperature, while the 13-atom clusters are poisoned only at higher temperatures, helping to account for their improved reactivity.

“Small really is different,” said Landman. “Once you get into this size regime, the old rules of structure sensitivity and structure insensitivity must be assessed for their continued validity. It’s not a question anymore of surface-to-volume ratio because everything is on the surface in these very small clusters.”

While the project examined only one reaction and one type of catalyst, the principles governing nanoscale catalysis – and the importance of re-examining traditional expectations – likely apply to a broad range of reactions catalyzed by nanoclusters at the smallest size scale. Such nanocatalysts are becoming more attractive as a means of conserving supplies of costly platinum.

“It’s a much richer world at the nanoscale than at the macroscopic scale,” added Landman. “These are very important messages for materials scientists and chemists who wish to design catalysts for new purposes, because the capabilities can be very different.”

Along with the experimental surface characterization and reactivity measurements, the first-principles theoretical simulations provide a unique practical means for examining these structural and electronic issues because the clusters are too small to be seen with sufficient resolution using most electron microscopy techniques or traditional crystallography.

“We have looked at how the number of atoms dictates the geometrical structure of the cluster catalysts on the surface and how this geometrical structure is associated with electronic properties that bring about chemical bonding characteristics that enhance the reactions,” Landman added.

I highlighted the news release’s reference to gold nanoclusters as I have noted the number issue in two April 14, 2015 postings (neither of which featured Georgia Tech): Gold atoms: sometimes they’re a metal and sometimes they’re a molecule and Nature’s patterns reflected in gold nanoparticles.

Here’s a link to and a citation for the ‘platinum catalyst’ paper,

Structure sensitivity in the nonscalable regime explored via catalysed ethylene hydrogenation on supported platinum nanoclusters by Andrew S. Crampton, Marian D. Rötzer, Claron J. Ridge, Florian F. Schweinberger, Ueli Heiz, Bokwon Yoon, & Uzi Landman. Nature Communications 7, Article number: 10389 doi:10.1038/ncomms10389 Published 28 January 2016

This paper is open access.

*’also on EurekAlert’ added Jan. 29, 2016.

Center for Sustainable Nanotechnology or how not to poison the planet and make it uninhabitable

I received notice of the Center for Sustainable Nanotechnology’s newest deal with the US National Science Foundation in an August 31, 2015 University of Wisconsin-Madison news release (received via email),

The Center for Sustainable Nanotechnology, a multi-institutional research center based at the University of Wisconsin-Madison, has inked a new contract with the National Science Foundation (NSF) that will provide nearly $20 million in support over the next five years.

Directed by UW-Madison chemistry Professor Robert Hamers, the center focuses on the molecular mechanisms by which nanoparticles interact with biological systems.

Nanotechnology involves the use of materials at the smallest scale, including the manipulation of individual atoms and molecules. Products that use nanoscale materials range from beer bottles and car wax to solar cells and electric and hybrid car batteries. If you read your books on a Kindle, a semiconducting material manufactured at the nanoscale underpins the high-resolution screen.

While there are already hundreds of products that use nanomaterials in various ways, much remains unknown about how these modern materials and the tiny particles they are composed of interact with the environment and living things.

“The purpose of the center is to explore how we can make sure these nanotechnologies come to fruition with little or no environmental impact,” explains Hamers. “We’re looking at nanoparticles in emerging technologies.”

In addition to UW-Madison, scientists from UW-Milwaukee, the University of Minnesota, the University of Illinois, Northwestern University and the Pacific Northwest National Laboratory have been involved in the center’s first phase of research. Joining the center for the next five-year phase are Tuskegee University, Johns Hopkins University, the University of Iowa, Augsburg College, Georgia Tech and the University of Maryland, Baltimore County.

At UW-Madison, Hamers leads efforts in synthesis and molecular characterization of nanomaterials. Soil science Professor Joel Pedersen and chemistry Professor Qiang Cui lead groups exploring the biological and computational aspects of how nanomaterials affect life.

Much remains to be learned about how nanoparticles affect the environment and the multitude of organisms – from bacteria to plants, animals and people – that may be exposed to them.

“Some of the big questions we’re asking are: How is this going to impact bacteria and other organisms in the environment? What do these particles do? How do they interact with organisms?” says Hamers.

For instance, bacteria, the vast majority of which are beneficial or benign organisms, tend to be “sticky” and nanoparticles might cling to the microorganisms and have unintended biological effects.

“There are many different mechanisms by which these particles can do things,” Hamers adds. “The challenge is we don’t know what these nanoparticles do if they’re released into the environment.”

To get at the challenge, Hamers and his UW-Madison colleagues are drilling down to investigate the molecular-level chemical and physical principles that dictate how nanoparticles interact with living things.

Pedersen’s group, for example, is studying the complexities of how nanoparticles interact with cells and, in particular, their surface membranes.

“To enter a cell, a nanoparticle has to interact with a membrane,” notes Pedersen. “The simplest thing that can happen is the particle sticks to the cell. But it might cause toxicity or make a hole in the membrane.”

Pedersen’s group can make model cell membranes in the lab using the same lipids and proteins that are the building blocks of nature’s cells. By exposing the lab-made membranes to nanomaterials now used commercially, Pedersen and his colleagues can see how the membrane-particle interaction unfolds at the molecular level – the scale necessary to begin to understand the biological effects of the particles.

Such studies, Hamers argues, promise a science-based understanding that can help ensure the technology leaves a minimal environmental footprint by identifying issues before they manifest themselves in the manufacturing, use or recycling of products that contain nanotechnology-inspired materials.

To help fulfill that part of the mission, the center has established working relationships with several companies to conduct research on materials in the very early stages of development.

“We’re taking a look-ahead view. We’re trying to get into the technological design cycle,” Hamers says. “The idea is to use scientific understanding to develop a predictive ability to guide technology and guide people who are designing and using these materials.”

What with this initiative and the LCnano Network at Arizona State University (my April 8, 2014 posting; scroll down about 50% of the way), it seems that environmental and health and safety studies of nanomaterials are kicking into a higher gear as commercialization efforts intensify.

Controlling water with ‘stick-slip surfaces’

Controlling water could lead to better-designed microfluidic devices such as the ‘lab-on-a-chip’. A July 27, 2015 news item on Nanowerk announces a new technique for controlling water,

Coating the inside of glass microtubes with a polymer hydrogel material dramatically alters the way capillary forces draw water into the tiny structures, researchers have found. The discovery could provide a new way to control microfluidic systems, including popular lab-on-a-chip devices.

Capillary action draws water and other liquids into confined spaces such as tubes, straws, wicks and paper towels, and the flow rate can be predicted using a simple hydrodynamic analysis. But a chance observation by researchers at the Georgia Institute of Technology [US] will cause a recalculation of those predictions for conditions in which hydrogel films line the tubes carrying water-based liquids.

“Rather than moving according to conventional expectations, water-based liquids slip to a new location in the tube, get stuck, then slip again – and the process repeats over and over again,” explained Andrei Fedorov, a professor in the George W. Woodruff School of Mechanical Engineering at Georgia Tech. “Instead of filling the tube with a rate of liquid penetration that slows with time, the water propagates at a nearly constant speed into the hydrogel-coated capillary. This was very different from what we had expected.”

A July 27, 2015 Georgia Institute of Technology (Georgia Tech) news release (also on EurekAlert) by John Toon, which originated the news item, describes the work in more detail,

When the opening of a thin glass tube is exposed to a droplet of water, the liquid begins to flow into the tube, pulled by a combination of surface tension in the liquid and adhesion between the liquid and the walls of the tube. Leading the way is a meniscus, a curved surface of the water at the leading edge of the water column. An ordinary borosilicate glass tube fills by capillary action at a gradually decreasing rate with the speed of meniscus propagation slowing as a square root of time.

But when the inside of a tube is coated with a very thin layer of poly(N-isopropylacrylamide), a so-called “smart” polymer (PNIPAM), everything changes. Water entering a tube coated on the inside with a dry hydrogel film must first wet the film and allow it to swell before it can proceed farther into the tube. The wetting and swelling take place not continuously, but with discrete steps in which the water meniscus first sticks and its motion remains arrested while the polymer layer locally deforms. The meniscus then rapidly slides for a short distance before the process repeats. This “stick-slip” process forces the water to move into the tube in a step-by-step motion.

The flow rate measured by the researchers in the coated tube is three orders of magnitude less than the flow rate in an uncoated tube. A linear equation describes the time dependence of the filling process, instead of the classical quadratic equation that describes the filling of an uncoated tube.
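For readers who want the equations behind ‘square root of time’ versus ‘linear’: the classical law is the Lucas-Washburn relation from standard capillary theory (textbook material, not taken from the paper itself), in which the filled length grows as the square root of time, so time grows quadratically with length, while the stick-slip regime yields a roughly constant filling speed,

\[
\text{uncoated (Lucas-Washburn):}\quad L(t) \;=\; \sqrt{\frac{\gamma\, r \cos\theta}{2\mu}\, t} \;\propto\; \sqrt{t},
\qquad
\text{hydrogel-coated (stick-slip):}\quad L(t) \;\approx\; v\,t,
\]

where \(\gamma\) is the liquid’s surface tension, \(r\) the tube radius, \(\theta\) the contact angle, \(\mu\) the viscosity, and \(v\) the roughly constant meniscus speed. A thousandfold drop in flow rate is what turns the hundredth-of-a-second fill quoted below into tens of seconds.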

“Instead of filling the capillary in a hundredth of a second, it might take tens of seconds to fill the same capillary,” said Fedorov. “Though there is some swelling of the hydrogel upon contact with water, the change in the tube diameter is negligible due to the small thickness of the hydrogel layer. This is why we were so surprised when we first observed such a dramatic slow-down of the filling process in our experiments.”

The researchers – who included graduate students James Silva, Drew Loney and Ren Geryak and senior research engineer Peter Kottke – tried the experiment again using glycerol, a liquid that is not absorbed by the hydrogel. With glycerol, the capillary action proceeded through the hydrogel-coated microtube as with an uncoated tube in agreement with conventional theory. After using high-resolution optical visualization to study the meniscus propagation while the polymer swelled, the researchers realized they could put this previously-unknown behavior to good use.

Water absorption by the hydrogels occurs only when the materials remain below a specific transition temperature. When heated above that temperature, the materials no longer absorb water, eliminating the “stick-slip” phenomenon in the microtubes and allowing them to behave like ordinary tubes.

This ability to turn the stick-slip behavior on and off with temperature could provide a new way to control the flow of water-based liquid in microfluidic devices, including labs-on-a-chip. The transition temperature can be controlled by varying the chemical composition of the hydrogel.

“By locally heating or cooling the polymer inside a microfluidic chamber, you can either speed up the filling process or slow it down,” Fedorov said. “The time it takes for the liquid to travel the same distance can be varied up to three orders of magnitude. That would allow precise control of fluid flow on demand using external stimuli to change polymer film behavior.”

The heating or cooling could be done locally with lasers, tiny heaters, or thermoelectric devices placed at specific locations in the microfluidic devices.

That could allow precise timing of reactions in microfluidic devices by controlling the rate of reactant delivery and product removal, or allow a sequence of fast and slow reactions to occur. Another important application could be controlled drug release in which the desired rate of molecule delivery could be dynamically tuned over time to achieve the optimal therapeutic outcome.

In future work, Fedorov and his team hope to learn more about the physics of the hydrogel-modified capillaries and study capillary flow using partially-transparent microtubes. They also want to explore other “smart” polymers which change the flow rate in response to different stimuli, including the changing pH of the liquid, exposure to electromagnetic radiation, or the induction of mechanical stress – all of which can change the properties of a particular hydrogel designed to be responsive to those triggers.

“These experimental and theoretical results provide a new conceptual framework for liquid motion confined by soft, dynamically evolving polymer interfaces in which the system creates an energy barrier to further motion through elasto-capillary deformation, and then lowers the barrier through diffusive softening,” the paper’s authors wrote. “This insight has implications for optimal design of microfluidic and lab-on-a-chip devices based on stimuli-responsive smart polymers.”

In addition to those already mentioned, the research team included Professor Vladimir Tsukruk from the Georgia Tech School of Materials Science and Engineering and Rajesh Naik, Biotechnology Lead and Tech Advisor of the Nanostructured and Biological Materials Branch of the Air Force Research Laboratory (AFRL).

Here’s a link to and a citation for the paper,

Stick–slip water penetration into capillaries coated with swelling hydrogel by J. E. Silva, R. Geryak, D. A. Loney, P. A. Kottke, R. R. Naik, V. V. Tsukruk, and A. G. Fedorov. Soft Matter, 2015,11, 5933-5939 DOI: 10.1039/C5SM00660K First published online 23 Jun 2015

This paper is behind a paywall.