Monthly Archives: September 2016

How might artificial intelligence affect urban life in 2030? A study

Peering into the future is always a chancy business, as anyone who has seen those film shorts from the 1950s and ’60s speculating exuberantly about what the future will bring can attest.

A sober approach (appropriate to our times) has been taken in a study about the impact that artificial intelligence might have by 2030. From a Sept. 1, 2016 Stanford University news release (also on EurekAlert) by Tom Abate (Note: Links have been removed),

A panel of academic and industrial thinkers has looked ahead to 2030 to forecast how advances in artificial intelligence (AI) might affect life in a typical North American city – in areas as diverse as transportation, health care and education – and to spur discussion about how to ensure the safe, fair and beneficial development of these rapidly emerging technologies.

Titled “Artificial Intelligence and Life in 2030,” this year-long investigation is the first product of the One Hundred Year Study on Artificial Intelligence (AI100), an ongoing project hosted by Stanford to inform societal deliberation and provide guidance on the ethical development of smart software, sensors and machines.

“We believe specialized AI applications will become both increasingly common and more useful by 2030, improving our economy and quality of life,” said Peter Stone, a computer scientist at the University of Texas at Austin and chair of the 17-member panel of international experts. “But this technology will also create profound challenges, affecting jobs and incomes and other issues that we should begin addressing now to ensure that the benefits of AI are broadly shared.”

The new report traces its roots to a 2009 study that brought AI scientists together in a process of introspection that became ongoing in 2014, when Eric and Mary Horvitz created the AI100 endowment through Stanford. AI100 formed a standing committee of scientists and charged this body with commissioning periodic reports on different aspects of AI over the ensuing century.

“This process will be a marathon, not a sprint, but today we’ve made a good start,” said Russ Altman, a professor of bioengineering and the Stanford faculty director of AI100. “Stanford is excited to host this process of introspection. This work makes a practical contribution to the public debate on the roles and implications of artificial intelligence.”

The AI100 standing committee first met in 2015, led by chairwoman and Harvard computer scientist Barbara Grosz. It sought to convene a panel of scientists with diverse professional and personal backgrounds and enlist their expertise to assess the technological, economic and policy implications of potential AI applications in a societally relevant setting.

“AI technologies can be reliable and broadly beneficial,” Grosz said. “Being transparent about their design and deployment challenges will build trust and avert unjustified fear and suspicion.”

The report investigates eight domains of human activity in which AI technologies are beginning to affect urban life in ways that will become increasingly pervasive and profound by 2030.

The 28,000-word report includes a glossary to help nontechnical readers understand how AI applications such as computer vision might help screen tissue samples for cancers or how natural language processing will allow computerized systems to grasp not simply the literal definitions, but the connotations and intent, behind words.

The report is broken into eight sections focusing on applications of AI. Five examine application arenas such as transportation where there is already buzz about self-driving cars. Three other sections treat technological impacts, like the section on employment and workplace trends which touches on the likelihood of rapid changes in jobs and incomes.

“It is not too soon for social debate on how the fruits of an AI-dominated economy should be shared,” the researchers write in the report, noting also the need for public discourse.

“Currently in the United States, at least sixteen separate agencies govern sectors of the economy related to AI technologies,” the researchers write, highlighting issues raised by AI applications: “Who is responsible when a self-driven car crashes or an intelligent medical device fails? How can AI applications be prevented from [being used for] racial discrimination or financial cheating?”

The eight sections discuss:

Transportation: Autonomous cars, trucks and, possibly, aerial delivery vehicles may alter how we commute, work and shop and create new patterns of life and leisure in cities.

Home/service robots: Like the robotic vacuum cleaners already in some homes, specialized robots will clean and provide security in live/work spaces that will be equipped with sensors and remote controls.

Health care: Devices to monitor personal health and robot-assisted surgery are hints of things to come if AI is developed in ways that gain the trust of doctors, nurses, patients and regulators.

Education: Interactive tutoring systems already help students learn languages, math and other skills. More is possible if technologies like natural language processing platforms develop to augment instruction by humans.

Entertainment: The conjunction of content creation tools, social networks and AI will lead to new ways to gather, organize and deliver media in engaging, personalized and interactive ways.

Low-resource communities: Investments in uplifting technologies like predictive models to prevent lead poisoning or improve food distributions could spread AI benefits to the underserved.

Public safety and security: Cameras, drones and software to analyze crime patterns should use AI in ways that reduce human bias and enhance safety without loss of liberty or dignity.

Employment and workplace: Work should start now on how to help people adapt as the economy undergoes rapid changes as many existing jobs are lost and new ones are created.

“Until now, most of what is known about AI comes from science fiction books and movies,” Stone said. “This study provides a realistic foundation to discuss how AI technologies are likely to affect society.”

Grosz said she hopes the AI 100 report “initiates a century-long conversation about ways AI-enhanced technologies might be shaped to improve life and societies.”

You can find the AI100 website here, and the group’s first paper: “Artificial Intelligence and Life in 2030” here. Unfortunately, I don’t have time to read the report but I hope to do so soon.

The AI100 website’s About page offered a surprise,

This effort, called the One Hundred Year Study on Artificial Intelligence, or AI100, is the brainchild of computer scientist and Stanford alumnus Eric Horvitz who, among other credits, is a former president of the Association for the Advancement of Artificial Intelligence.

In that capacity Horvitz convened a conference in 2009 at which top researchers considered advances in artificial intelligence and its influences on people and society, a discussion that illuminated the need for continuing study of AI’s long-term implications.

Now, together with Russ Altman, a professor of bioengineering and computer science at Stanford, Horvitz has formed a committee that will select a panel to begin a series of periodic studies on how AI will affect automation, national security, psychology, ethics, law, privacy, democracy and other issues.

“Artificial intelligence is one of the most profound undertakings in science, and one that will affect every aspect of human life,” said Stanford President John Hennessy, who helped initiate the project. “Given Stanford’s pioneering role in AI and our interdisciplinary mindset, we feel obliged and qualified to host a conversation about how artificial intelligence will affect our children and our children’s children.”

Five leading academicians with diverse interests will join Horvitz and Altman in launching this effort. They are:

  • Barbara Grosz, the Higgins Professor of Natural Sciences at Harvard University and an expert on multi-agent collaborative systems;
  • Deirdre K. Mulligan, a lawyer and a professor in the School of Information at the University of California, Berkeley, who collaborates with technologists to advance privacy and other democratic values through technical design and policy;
  • Yoav Shoham, a professor of computer science at Stanford, who seeks to incorporate common sense into AI;
  • Tom Mitchell, the E. Fredkin University Professor and chair of the machine learning department at Carnegie Mellon University, whose studies include how computers might learn to read the Web;
  • and Alan Mackworth, a professor of computer science at the University of British Columbia [emphases mine] and the Canada Research Chair in Artificial Intelligence, who built the world’s first soccer-playing robot.

I wasn’t expecting to see a Canadian listed as a member of the AI100 standing committee and then I got another surprise (from the AI100 People webpage),

Study Panels

Study Panels are planned to convene every 5 years to examine some aspect of AI and its influences on society and the world. The first study panel was convened in late 2015 to study the likely impacts of AI on urban life by the year 2030, with a focus on typical North American cities.

2015 Study Panel Members

  • Peter Stone, UT Austin, Chair
  • Rodney Brooks, Rethink Robotics
  • Erik Brynjolfsson, MIT
  • Ryan Calo, University of Washington
  • Oren Etzioni, Allen Institute for AI
  • Greg Hager, Johns Hopkins University
  • Julia Hirschberg, Columbia University
  • Shivaram Kalyanakrishnan, IIT Bombay
  • Ece Kamar, Microsoft
  • Sarit Kraus, Bar Ilan University
  • Kevin Leyton-Brown, [emphasis mine] UBC [University of British Columbia]
  • David Parkes, Harvard
  • Bill Press, UT Austin
  • AnnaLee (Anno) Saxenian, Berkeley
  • Julie Shah, MIT
  • Milind Tambe, USC
  • Astro Teller, Google[X]

I see they have representation from Israel, India, and the private sector as well. Refreshingly, there’s more than one woman on the standing committee and in this first study group. It’s good to see these efforts at inclusiveness and I’m particularly delighted with the inclusion of an organization from Asia. All too often inclusiveness means Europe, especially the UK. So, it’s good (and I think important) to see a different range of representation.

As for the content of the report, should anyone have opinions about it, please do let me know your thoughts in the blog comments.

Cooling the skin with plastic clothing

Rather than cooling or heating an entire room, why not cool or heat the person? Engineers at Stanford University (California, US) have developed a material that helps with half of that premise: cooling. From a Sept. 1, 2016 news item on ScienceDaily,

Stanford engineers have developed a low-cost, plastic-based textile that, if woven into clothing, could cool your body far more efficiently than is possible with the natural or synthetic fabrics in clothes we wear today.

Describing their work in Science, the researchers suggest that this new family of fabrics could become the basis for garments that keep people cool in hot climates without air conditioning.

“If you can cool the person rather than the building where they work or live, that will save energy,” said Yi Cui, an associate professor of materials science and engineering and of photon science at Stanford.

A Sept. 1, 2016 Stanford University news release (also on EurekAlert) by Tom Abate, which originated the news item, further explains the information in the video,

This new material works by allowing the body to discharge heat in two ways that would make the wearer feel nearly 4 degrees Fahrenheit cooler than if they wore cotton clothing.

The material cools by letting perspiration evaporate through the material, something ordinary fabrics already do. But the Stanford material provides a second, revolutionary cooling mechanism: allowing heat that the body emits as infrared radiation to pass through the plastic textile.

All objects, including our bodies, throw off heat in the form of infrared radiation, an invisible and benign wavelength of light. Blankets warm us by trapping infrared heat emissions close to the body. This thermal radiation escaping from our bodies is what makes us visible in the dark through night-vision goggles.

“Forty to 60 percent of our body heat is dissipated as infrared radiation when we are sitting in an office,” said Shanhui Fan, a professor of electrical engineering who specializes in photonics, which is the study of visible and invisible light. “But until now there has been little or no research on designing the thermal radiation characteristics of textiles.”
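
As a rough sanity check on that 40 to 60 percent figure, here is a back-of-the-envelope Stefan-Boltzmann estimate of my own; the skin temperature, room temperature and effective radiating area are assumptions I chose, not numbers from the Stanford study,

```python
# Back-of-the-envelope estimate of radiative heat loss from a seated person.
# The area, temperatures and emissivity below are illustrative assumptions,
# not values taken from the Stanford study.
SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W / (m^2 * K^4)

emissivity = 0.98      # human skin is nearly a perfect emitter in the infrared
area_m2 = 1.0          # assumed effective radiating area of a seated, clothed adult
t_skin_k = 306.0       # ~33 degrees C skin/clothing surface temperature (assumed)
t_room_k = 297.0       # ~24 degrees C surroundings (assumed)

net_radiated_w = emissivity * SIGMA * area_m2 * (t_skin_k**4 - t_room_k**4)
print(f"Net radiative loss: {net_radiated_w:.0f} W")
# Comes out near 55 W, roughly half of the ~100 W a resting body must shed,
# which is consistent with the 40-60 percent range quoted above.
```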

Super-powered kitchen wrap

To develop their cooling textile, the Stanford researchers blended nanotechnology, photonics and chemistry to give polyethylene – the clear, clingy plastic we use as kitchen wrap – a number of characteristics desirable in clothing material: It allows thermal radiation, air and water vapor to pass right through, and it is opaque to visible light.

The easiest attribute was allowing infrared radiation to pass through the material, because this is a characteristic of ordinary polyethylene food wrap. Of course, kitchen plastic is impervious to water and is see-through as well, rendering it useless as clothing.

The Stanford researchers tackled these deficiencies one at a time.

First, they found a variant of polyethylene commonly used in battery making that has a specific nanostructure that is opaque to visible light yet is transparent to infrared radiation, which could let body heat escape. This provided a base material that was opaque to visible light for the sake of modesty but thermally transparent for purposes of energy efficiency.

They then modified the industrial polyethylene by treating it with benign chemicals to enable water vapor molecules to evaporate through nanopores in the plastic, said postdoctoral scholar and team member Po-Chun Hsu, allowing the plastic to breathe like a natural fiber.

Making clothes

That success gave the researchers a single-sheet material that met their three basic criteria for a cooling fabric. To make this thin material more fabric-like, they created a three-ply version: two sheets of treated polyethylene separated by a cotton mesh for strength and thickness.

To test the cooling potential of their three-ply construct versus a cotton fabric of comparable thickness, they placed a small swatch of each material on a surface that was as warm as bare skin and measured how much heat each material trapped.

“Wearing anything traps some heat and makes the skin warmer,” Fan said. “If dissipating thermal radiation were our only concern, then it would be best to wear nothing.”

The comparison showed that the cotton fabric made the skin surface 3.6 F warmer than their cooling textile. The researchers said this difference means that a person dressed in their new material might feel less inclined to turn on a fan or air conditioner.

The researchers are continuing their work on several fronts, including adding more colors, textures and cloth-like characteristics to their material. Adapting a material already mass produced for the battery industry could make it easier to create products.

“If you want to make a textile, you have to be able to make huge volumes inexpensively,” Cui said.

Fan believes that this research opens up new avenues of inquiry to cool or heat things, passively, without the use of outside energy, by tuning materials to dissipate or trap infrared radiation.

“In hindsight, some of what we’ve done looks very simple, but it’s because few have really been looking at engineering the radiation characteristics of textiles,” he said.

Dexter Johnson (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website) has written a Sept. 2, 2016 posting where he provides more technical detail about this work,

The nanoPE [nanoporous polyethylene] material is able to achieve this release of the IR heat because of the size of the interconnected pores. The pores can range in size from 50 to 1000 nanometers. They’re therefore comparable in size to wavelengths of visible light, which allows the material to scatter that light. However, because the pores are much smaller than the wavelength of infrared light, the nanoPE is transparent to the IR.

It is this combination of blocking visible light and allowing IR to pass through that distinguishes the nanoPE material from regular polyethylene, which allows similar amounts of IR to pass through, but can only block 20 percent of the visible light compared to nanoPE’s 99 percent opacity.
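
To see why pore size matters so much, here is a quick sketch of my own (a standard Wien’s-law calculation, not something from the paper or Dexter’s post) comparing the quoted 50-1000 nm pores with visible light and with the thermal infrared a body emits,

```python
# Compare nanoPE pore sizes with visible wavelengths and with body-heat infrared.
# The pore range is from the description above; the rest is textbook physics.
pore_nm = (50, 1000)        # interconnected pore sizes in nanoporous polyethylene
visible_nm = (380, 700)     # approximate visible spectrum

# Wien's displacement law: peak emission wavelength of skin at ~34 C (307 K).
WIEN_B_NM_K = 2.898e6       # Wien's constant in nm*K
skin_temp_k = 307.0
thermal_peak_nm = WIEN_B_NM_K / skin_temp_k

print(f"Pores:        {pore_nm[0]}-{pore_nm[1]} nm")
print(f"Visible:      {visible_nm[0]}-{visible_nm[1]} nm -> comparable to the pores, so it scatters (opaque)")
print(f"Thermal peak: {thermal_peak_nm:.0f} nm (~9.4 um) -> much larger than the pores, so it passes through")
```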

The Stanford researchers were also able to improve on the water wicking capability of the nanoPE material by using a microneedle punching technique and coating the material with a water-repelling agent. The result is that perspiration can evaporate through the material unlike with regular polyethylene.

For those who wish to further pursue their interest, Dexter has a lively writing style and he provides more detail and insight in his posting.

Here’s a link to and a citation for the paper,

Radiative human body cooling by nanoporous polyethylene textile by Po-Chun Hsu, Alex Y. Song, Peter B. Catrysse, Chong Liu, Yucan Peng, Jin Xie, Shanhui Fan, Yi Cui. Science 02 Sep 2016: Vol. 353, Issue 6303, pp. 1019-1023. DOI: 10.1126/science.aaf5471

This paper is open access.

Treating graphene with lasers for paper-based electronics

Engineers at Iowa State University have found a way they hope will make it easier to commercialize graphene. A Sept. 1, 2016 news item on phys.org describes the research,

The researchers in Jonathan Claussen’s lab at Iowa State University (who like to call themselves nanoengineers) have been looking for ways to use graphene and its amazing properties in their sensors and other technologies.

Graphene is a wonder material: The carbon honeycomb is just an atom thick. It’s great at conducting electricity and heat; it’s strong and stable. But researchers have struggled to move beyond tiny lab samples for studying its material properties to larger pieces for real-world applications.

Recent projects that used inkjet printers to print multi-layer graphene circuits and electrodes had the engineers thinking about using it for flexible, wearable and low-cost electronics. For example, “Could we make graphene at scales large enough for glucose sensors?” asked Suprem Das, an Iowa State postdoctoral research associate in mechanical engineering and an associate of the U.S. Department of Energy’s Ames Laboratory.

But there were problems with the existing technology. Once printed, the graphene had to be treated to improve electrical conductivity and device performance. That usually meant high temperatures or chemicals – both could degrade flexible or disposable printing surfaces such as plastic films or even paper.

Das and Claussen came up with the idea of using lasers to treat the graphene. Claussen, an Iowa State assistant professor of mechanical engineering and an Ames Laboratory associate, worked with Gary Cheng, an associate professor at Purdue University’s School of Industrial Engineering, to develop and test the idea.

A Sept. 1, 2016 Iowa State University news release (also on EurekAlert), which originated the news item, provides more detail about the intellectual property, as well as, the technology,

… They found treating inkjet-printed, multi-layer graphene electric circuits and electrodes with a pulsed-laser process improves electrical conductivity without damaging paper, polymers or other fragile printing surfaces.

“This creates a way to commercialize and scale-up the manufacturing of graphene,” Claussen said.

Two major grants are supporting the project and related research: a three-year grant from the National Institute of Food and Agriculture, U.S. Department of Agriculture, under award number 11901762 and a three-year grant from the Roy J. Carver Charitable Trust. Iowa State’s College of Engineering and department of mechanical engineering are also supporting the research.

The Iowa State Research Foundation Inc. has filed for a patent on the technology.

“The breakthrough of this project is transforming the inkjet-printed graphene into a conductive material capable of being used in new applications,” Claussen said.

Those applications could include sensors with biological applications, energy storage systems, electrical conducting components and even paper-based electronics.

To make all that possible, the engineers developed computer-controlled laser technology that selectively irradiates inkjet-printed graphene oxide. The treatment removes ink binders and reduces graphene oxide to graphene – physically stitching together millions of tiny graphene flakes. The process makes electrical conductivity more than a thousand times better.
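
To put “more than a thousand times better” in perspective, here is a small illustration of my own; the starting sheet resistance and film thickness are hypothetical placeholders, not values reported by the Iowa State team,

```python
# Illustrative only: what a ~1000x conductivity improvement means for a printed film.
# The as-printed sheet resistance and thickness are assumed placeholder values.
thickness_m = 100e-9              # assume a ~100 nm thick printed film
r_sheet_before = 1.0e6            # ohm/sq, hypothetical as-printed graphene oxide
improvement = 1000.0              # "more than a thousand times better" conductivity

r_sheet_after = r_sheet_before / improvement
sigma_before = 1.0 / (r_sheet_before * thickness_m)   # conductivity in S/m
sigma_after = sigma_before * improvement

print(f"Sheet resistance: {r_sheet_before:.0e} -> {r_sheet_after:.0e} ohm/sq")
print(f"Conductivity:     {sigma_before:.0e} -> {sigma_after:.0e} S/m")
```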

“The laser works with a rapid pulse of high-energy photons that do not destroy the graphene or the substrate,” Das said. “They heat locally. They bombard locally. They process locally.”

That localized, laser processing also changes the shape and structure of the printed graphene from a flat surface to one with raised, 3-D nanostructures. The engineers say the 3-D structures are like tiny petals rising from the surface. The rough and ridged structure increases the electrochemical reactivity of the graphene, making it useful for chemical and biological sensors.

All of that, according to Claussen’s team of nanoengineers, could move graphene to commercial applications.

“This work paves the way for not only paper-based electronics with graphene circuits,” the researchers wrote in their paper, “it enables the creation of low-cost and disposable graphene-based electrochemical electrodes for myriad applications including sensors, biosensors, fuel cells and (medical) devices.”

Here’s a link to and a citation for the paper,

3D nanostructured inkjet printed graphene via UV-pulsed laser irradiation enables paper-based electronics and electrochemical devices by Suprem R. Das, Qiong Nian, Allison A. Cargill, John A. Hondred, Shaowei Ding, Mojib Saei, Gary J. Cheng, and Jonathan C. Claussen. Nanoscale, 2016, 8, 15870-15879 DOI: 10.1039/C6NR04310K First published online 12 Jul 2016

This paper is open access but you do need to have registered for your free account to access the material.

Doctor Strange contest for girls in the US aged 15 – 18

The deadline is Oct. 5, 2016 so if you do qualify for entry, you’d best be quick.

David Bruggeman in his Sept. 25, 2016 posting provides more information,

… the latest contest is called The Magic of STEM Challenge and is tied to the November [2016] release of the film Doctor Strange.

The name highlights part of the dramatic arc of the film – a neurosurgeon engaging with magic as he seeks to recover from an accident.  I have not seen the film, but it may bear some resemblance to how the Thor films have tried to explain the fantastical actions of those characters with some basis in science.  But don’t look too close (as you shouldn’t in any superhero film) or the gloss of scientific realism will disappear.

But I’m writing about the contest.  There’s a short window for entries, because the contest is open until October 5th.  Entrants are girls in the U.S. from 15-18 years old (grades 10-12), and must submit a video blog (vlog) on a scientific or technological question. …

As some may know, Canadian actress Rachel McAdams is one of the leads in the film, so she’s introducing the contest and the winner of the previous STEM Marvel contest (Captain America: Civil War).

You can find out more about the contest and the rules here.

One final thing about the movie, there has been a bit of a controversy with regard to the casting of Brit actress Tilda Swinton. From an April 28, 2016 posting by Kaiser on the Celebitchy blog,

… now C. Robert Cargill, the Strange screenwriter, has come out to try to explain it.

Tilda Swinton was cast as a Tibetan monk in the Marvel movie Doctor Strange so the comic book character could be changed to a ‘Celtic’ to avoid upsetting China, a screenwriter has claimed. One of the film’s screenwriters has suggested that the casting of the British actress as sorcerer the Ancient One was partly done to avoid offending China’s government. Moviegoers in China now represent the world’s second-largest annual box office after North America but the film’s backers apparently did not want to risk losing out on the Chinese market by introducing the highly politically charged subject of Tibet.

“He originates from Tibet, so if you acknowledge that Tibet is a place and that he’s Tibetan, you risk alienating one billion people who think that that’s bullsh*t and risk the Chinese government going, ‘Hey, you know one of the biggest film-watching countries in the world? We’re not going to show your movie because you decided to get political,’” screenwriter C. Robert Cargill said in a podcast interview with the Texas-based DoubleToasted.com.

Cargill, who wasn’t involved in the casting of Swinton, said the comic book character of the Ancient One was ‘a racist stereotype.’

‘There is no other character in Marvel history that is such a cultural landmine, that is absolutely unwinnable,’ he said, adding: ‘It all comes down on to which way you are willing to lose.’

After the controversy over the 2016 Academy Awards regarding the paucity of minority nominees, which extended into a conversation about the lack of opportunity for minorities, it seems Hollywood is being held to closer account on topics of race.

As for the science end of things, I guess we can expect a light sprinkling of relatively accurate information mixed in with fantasy science.

Good luck to everyone who enters the contest and may your science be as accurate as possible.

Next generation of power lines could be carbon nanotube-coated

This research was done at the Masdar Institute in Abu Dhabi of the United Arab Emirates. From a Sept. 1, 2016 news item on Nanowerk,

A Masdar Institute Assistant Professor may have brought engineers one step closer to developing the type of next-generation power lines needed to achieve sustainable and resilient electrical power grids.

Dr. Kumar Shanmugam, Assistant Professor of Materials and Mechanical Engineering, helped develop a novel coating made from carbon nanotubes that, when layered around an aluminum-conductor composite core (ACCC) transmission line, reduces the line’s operating temperature and significantly improves its overall transmission efficiency.

An Aug. 29, 2016 Masdar Institute news release by Erica Solomon, which originated the news item, provides more detail,

The coating is made from carbon nanostructures (CNS) – which are bundles of aligned carbon nanotubes that have exceptional mechanical and electrical properties – provided by the project’s sponsor, Lockheed Martin. The second component of the coating is an epoxy resin, which is the thick material used to protect things like appliances and electronics from damage. Together, the CNS and epoxy resin help prevent power lines from overheating, increase their current-carrying capacity (the amount of current that can flow through a transmission line), and protect them from damage associated with lightning strikes, ice and other environmental impacts.

The researchers found that by replacing traditional steel-core transmission lines with ACCC cables layered with a CNS-epoxy coating (referred to in the study as ACCC-CNS lines), the amount of aluminum used in an ACCC cable can be reduced by 25%, making the cable significantly lighter and cheaper to produce. The span length of a transmission line can also increase by 30%, which will make it easier to transmit electricity across longer distances while the amount of current the line can carry can increase by 40%.

“The coating helps to dissipate the heat generated in the conductor more efficiently through radiation and convection, thereby preventing the cable from overheating and enabling it to carry more current farther distances,” Dr. Kumar explained.

Ultimately, the purpose of the coating is to effectively eliminate the transmission line losses. Each year, anywhere from 5% to over 10% of the overall power generated in a power plant is lost in transmission and distribution lines. Most of this electrical energy is lost in the form of heat; as current runs through a conductor (the transmission line), the conductor heats up because it resists the flow of electrons to some extent – a phenomenon known as resistive Joule heating. Resistive Joule heating causes the energy that was moving the electrons forward to change into heat energy, which means some of the generated power gets converted into heat and lost to the surrounding environment instead of getting to its intended destination (like our homes and offices).
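
As a toy illustration of that resistive (I²R) loss, the following sketch uses made-up line parameters of my own, not figures from the Masdar study,

```python
# Toy I^2 * R calculation: how conductor resistance turns transmitted power into heat.
# All parameters are illustrative assumptions, not ACCC or ACCC-CNS cable data.
resistance_per_km = 0.06     # ohm/km, assumed conductor resistance
length_km = 100.0            # assumed line length
current_a = 600.0            # assumed line current
voltage_v = 230e3            # assumed operating voltage (single-phase simplification)

r_total = resistance_per_km * length_km
loss_w = current_a**2 * r_total              # power dissipated as heat along the line
carried_w = voltage_v * current_a            # power the line is carrying
print(f"Heat loss: {loss_w/1e6:.2f} MW of {carried_w/1e6:.0f} MW "
      f"({100 * loss_w / carried_w:.1f}%)")
# Loss grows with the square of the current, which is why lowering conductor
# resistance (or raising the voltage to lower the current) pays off so quickly.
```

Note that the 5% to over 10% figure in the news release covers losses across the whole transmission and distribution chain, not a single line like the one sketched here.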

In addition to wasting energy, resistive Joule heating can lead to overheating, which can trigger a transmission line to “sag”, or physically droop low to the ground. Sagging power lines in turn can have catastrophic effects, including short circuits and power outages.

Efforts to reduce the problem of resistive heating and energy loss in power lines have led to significant improvements in transmission line technologies. For example, in 2002 ACCC transmission cables – which feature a carbon and glass-fiber reinforced composite core wrapped in aluminum conductor wires – were invented. The ACCC conductors are lighter and more heat-resistant than traditional steel-core cables, which means they can carry more current without overheating or sagging. Today, it is estimated that over 200 power and distribution networks use ACCC transmission cables.

While the advent of composite core cables marks the first major turning point in the development of energy-efficient transmission lines, Dr. Kumar’s CNS-epoxy coating may be the second significant advancement in the evolution of sustainable power lines.

The CNS-epoxy coating works by keeping the cable’s operating temperatures low. It does this by dissipating any generated heat away from the conductor efficiently, thereby preventing further increases in the temperature of the line and avoiding the trickle-effect that often leads to overheating.

The coating is layered twice in the ACCC cable – an outer layer, which dissipates the heat and protects the cable from environmental factors like lightning strikes and foreign object impact; and an inner layer, which protects the composite core from damage caused by stray radio frequency radiation generated by the electromagnetic pulse emanating from the high-current-carrying aluminum conductor.

The research team utilized a multi-physics modeling framework to analyze how the CNS-epoxy coating would influence the performance of an ACCC transmission line. After fabricating the coating, they characterized it, a critical step to determine its mechanical, thermal and electrical properties. These properties were then used in the computational and theoretical models to evaluate and predict the coating’s performance. Finally, a design tool was developed and used to find the optimal combination of parameters (core diameter, span distance and sag) needed to reduce the cable’s weight, sag, and operating temperature while increasing its span distance and current carrying capacity.

Dr. Kumar’s innovative transmission line technology research comes at a pivotal time, when countries all over the world, including the UAE, are seeking ways to reduce their carbon footprint in a concerted effort to mitigate global climate change. Turning to energy-efficient power lines that waste less power and in turn produce less carbon dioxide emissions will be an obvious choice for nations devoted to greater sustainability.

Here’s a link to and a citation for the paper,

High-Ampacity Overhead Power Lines With Carbon Nanostructure–Epoxy Composites by V. S. N. Ranjith Kumar, S. Kumar, G. Pal, and Tushar Shah. J. Eng. Mater. Technol 138(4), 041018 (Aug 09, 2016) (9 pages) Paper No: MATS-15-1217; doi: 10.1115/1.4034095

This paper is behind a paywall.

Scientific evidence and certainty: a controversy in the US Justice system

It seems that forensic evidence does not deliver the certainty that television and US prosecutors (I wonder if Canadian Crown Attorneys or Crown Counsels concur with their US colleagues?) would have us believe. The US President’s Council of Advisors on Science and Technology (PCAST) released a report (‘Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods‘ 174 pp PDF) on Sept. 20, 2016 that, amongst other findings, notes that more scientific rigour needs to be applied to the field of forensic science.

Here’s more from the Sept. 20, 2016 posting by Eric Lander, William Press, S. James Gates, Jr., Susan L. Graham, J. Michael McQuade, and Daniel Schrag, on the White House blog,

The study that led to the report was a response to the President’s question to his PCAST in 2015, as to whether there are additional steps on the scientific side, beyond those already taken by the Administration in the aftermath of a highly critical 2009 National Research Council report on the state of the forensic sciences, that could help ensure the validity of forensic evidence used in the Nation’s legal system.

PCAST concluded that two important gaps warranted the group’s attention: (1) the need for clarity about the scientific standards for the validity and reliability of forensic methods and (2) the need to evaluate specific forensic methods to determine whether they have been scientifically established to be valid and reliable. The study aimed to help close these gaps for a number of forensic “feature-comparison” methods—specifically, methods for comparing DNA samples, bitemarks, latent fingerprints, firearm marks, footwear, and hair.

In the course of its year-long study, PCAST compiled and reviewed a set of more than 2,000 papers from various sources, educated itself on factual matters relating to the interaction between science and the law, and obtained input from forensic scientists and practitioners, judges, prosecutors, defense attorneys, academic researchers, criminal-justice-reform advocates, and representatives of Federal agencies.

A Sept. 23, 2016 article by Daniel Denvir for Salon.com sums up the responses from some of the institutions affected by this report,

Under fire yet again, law enforcement is fighting back. Facing heavy criticism for misconduct and abuse, prosecutors are protesting a new report from President Obama’s top scientific advisors that documents what has long been clear: much of the forensic evidence used to win convictions, including complex DNA samples and bite mark analysis, is not backed up by credible scientific research.

Although the evidence of this is clear, many in law enforcement seem terrified that keeping pseudoscience out of prosecutions will make them unwinnable. Attorney General Loretta Lynch declined to accept the report’s recommendations on the admissibility of evidence and the FBI accused the advisors of making “broad, unsupported assertions.” But the National District Attorneys Association, which represents roughly 2,500 top prosecutors nationwide, went the furthest, taking it upon itself to, in its own words, “slam” the report.

Prosecutors’ actual problem with the report, produced by some of the nation’s leading scientists on the President’s Council of Advisors on Science and Technology, seems to be unrelated to science. Reached by phone NDAA president-elect Michael O. Freeman could not point to any specific problem with the research and accused the scientists of having an agenda against law enforcement.

“I’m a prosecutor and not a scientist,” Freeman, the County Attorney in Hennepin County, Minnesota, which encompasses Minneapolis, told Salon. “We think that there’s particular bias that exists in the folks who worked on this, and they were being highly critical of the forensic disciplines that we use in investigating and prosecuting cases.”

That response, devoid of any reference to hard science, has prompted some mockery, including from Robert Smith, Senior Research Fellow and Director of the Fair Punishment Project at Harvard Law School, who accused the NDAA of “fighting to turn America’s prosecutors into the Anti-Vaxxers, the Phrenologists, the Earth-Is-Flat Evangelists of the criminal justice world.”

It has, however, also lent credence to a longstanding criticism that American prosecutors are more concerned with winning than with establishing a defendant’s guilt beyond a reasonable doubt.

“Prosecutors should not be concerned principally with convictions; they should be concerned with justice,” Daniel S. Medwed, author of “Prosecution Complex: America’s Race to Convict and Its Impact on the Innocent” and a professor at Northeastern University School of Law, told Salon. “Using dodgy science to obtain convictions does not advance justice.”

Denvir’s article is lengthier and worth reading in its entirety.

Assuming there’s an association of forensic scientists, I find it interesting they don’t appear to have responded.

Finally, if there’s one thing you learn while writing about science it’s that there is no real certainty. For example, if you read about the Higgs boson discovery, you’ll note that the scientists involved in the research never stated with absolute certainty that it exists but rather that they ‘were pretty darn sure’ it does (I believe the scientific term is 5-sigma). There’s more about the Higgs boson and 5-sigma in this July 17, 2012 article by Evelyn Lamb for Scientific American,

In short, five-sigma corresponds to a p-value, or probability, of 3×10⁻⁷, or about 1 in 3.5 million. This is not the probability that the Higgs boson does or doesn’t exist; rather, it is the probability that if the particle does not exist, the data that CERN [European Particle Physics Laboratory] scientists collected in Geneva, Switzerland, would be at least as extreme as what they observed. “The reason that it’s so annoying is that people want to hear declarative statements, like ‘The probability that there’s a Higgs is 99.9 percent,’ but the real statement has an ‘if’ in there. There’s a conditional. There’s no way to remove the conditional,” says Kyle Cranmer, a physicist at New York University and member of the ATLAS team, one of the two groups that announced the new particle results in Geneva on July 4 [2012].
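
For anyone who wants to check the arithmetic, the one-sided tail probability beyond five standard deviations of a normal distribution can be computed directly; this is standard statistics, nothing specific to the CERN analysis,

```python
# One-sided tail probability of a standard normal distribution beyond 5 sigma.
import math

sigma = 5.0
p_value = 0.5 * math.erfc(sigma / math.sqrt(2))   # P(Z > 5)
print(f"p = {p_value:.2e}")                       # about 2.9e-07, i.e. roughly 3 x 10^-7
print(f"about 1 in {1 / p_value:,.0f}")           # roughly 1 in 3.5 million
```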

For the interested, there’s a lot more to Lamb’s article.

Getting back to forensic science, this PCAST report looks like an attempt to bring forensics back into line with the rest of the science world.

Graphene Canada and its second annual conference

An Aug. 31, 2016 news item on Nanotechnology Now announces Canada’s second graphene-themed conference,

The 2nd edition of Graphene & 2D Materials Canada 2016 International Conference & Exhibition (www.graphenecanadaconf.com) will take place in Montreal (Canada): 18-20 October, 2016.

– An industrial forum with focus on Graphene Commercialization (Abalonyx, Alcereco Inc, AMO GmbH, Avanzare, AzTrong Inc, Bosch GmbH, China Innovation Alliance of the Graphene Industry (CGIA), Durham University & Applied Graphene Materials, Fujitsu Laboratories Ltd., Hanwha Techwin, Haydale, IDTechEx, North Carolina Central University & Chaowei Power Ltd, NTNU&CrayoNano, Phantoms Foundation, Southeast University, The Graphene Council, University of Siegen, University of Sunderland and University of Waterloo)
– Extensive thematic workshops in parallel (Materials & Devices Characterization, Chemistry, Biosensors & Energy and Electronic Devices)
– A significant exhibition (Abalonyx, Go Foundation, Grafoid, Group NanoXplore Inc., Raymor | Nanointegris and Suragus GmbH)

As I noted in my 2015 post about Graphene Canada and its conference, the group is organized in a rather interesting fashion and I see the tradition continues, i.e., the lead organizers seem to be situated in countries other than Canada. From the Aug. 31, 2016 news item on Nanotechnology Now,

Organisers: Phantoms Foundation [located in Spain] www.phantomsnet.net
Catalan Institute of Nanoscience and Nanotechnology – ICN2 (Spain) | CEMES/CNRS (France) | GO Foundation (Canada) | Grafoid Inc (Canada) | Graphene Labs – IIT (Italy) | McGill University (Canada) | Texas Instruments (USA) | Université Catholique de Louvain (Belgium) | Université de Montreal (Canada)

You can find the conference website here.

‘Fill in the Planck’ with Tom McFadden

A science rhyming quiz set to music? Here’s more from David Bruggeman’s Aug. 30, 2016 posting (on his Pasco Phronesis blog; Note: Links have been removed),

Tom McFadden, fresh off of his featured appearance as Joseph-Louis Lagrange in William Rowan Hamilton [a science-oriented production by Tim Blais featuring music from the Broadway musical, Hamilton], has a rhyming quiz going on at his YouTube channel.  That’s right, a rhyming quiz, and it’s called Fill in the Planck.

There are two quizzes so far, one on the JUNO spacecraft and the most recent on water.  The idea is to complete each rhyme in the verse. …

McFadden includes instructions in his intro to the quiz. Here’s the second in the series, Hot Water – Fill in the Planck #2.

The ‘Fill in the Planck’ video series can be found here.

You can find out more about McFadden and his work here. There was also a call to vote for his panel ‘Hip Hop in the Science Classroom’ to be presented at the 2017 SXSWedu (South by Southwest education) conference (voting ended Sept. 2, 2016). There is no word yet as to whether or not McFadden’s presentation will be seen at the 2017 SXSWedu.

Mechanism behind interaction of silver nanoparticles with the cells of the immune system

Scientists have come to a better understanding of the mechanism affecting silver nanoparticle toxicity according to an Aug. 30, 2016 news item on Nanowerk (Note: A link has been removed),

A senior fellow at the Faculty of Chemistry, MSU (Lomonosov Moscow State University), Vladimir Bochenkov, together with his colleagues from Denmark, succeeded in deciphering the mechanism of interaction of silver nanoparticles with the cells of the immune system. The study is published in the journal Nature Communications (“Dynamic protein coronas revealed as a modulator of silver nanoparticle sulphidation in vitro”).

‘Currently, a large number of products contain silver nanoparticles: antibacterial drugs, toothpaste, polishes, paints, filters, packaging, medical and textile items. These products work because silver dissolves under oxidation and forms Ag+ ions, which have germicidal properties. At the same time, there are in vitro research data showing that silver nanoparticles are toxic to various organs, including the liver, brain and lungs. In this regard, it is essential to study the processes occurring with silver nanoparticles in biological environments, and the factors affecting their toxicity,’ says Vladimir Bochenkov.

Caption: Increased intensity of the electric field near the silver nanoparticle surface in the excitation of plasmon resonance. Credit: Vladimir Bochenkov

An Aug. 30, 2016 MSU press release on EurekAlert, which originated the news item, provides more information about the research,

The study is devoted to the protein corona – a layer of adsorbed protein molecules that forms on the surface of silver nanoparticles during their contact with a biological environment, for example blood. The protein corona masks nanoparticles and largely determines their fate: the speed of elimination from the body, the ability to penetrate a particular cell type, the distribution between organs, etc.

According to the latest research, the protein corona consists of two layers: a hard corona of protein molecules tightly bound to the silver nanoparticles, and a soft corona of weakly bound protein molecules in dynamic equilibrium with the solution. Hitherto the soft corona has been studied very little because of experimental difficulties: when the nanoparticles are separated from the protein solution, the weakly bound proteins easily desorb (leave the particle and remain in the solution), leaving only the hard corona on the nanoparticle surface.

The size of the studied silver nanoparticles was 50-88 nm, and the diameter of the proteins that made up the corona was 3-7 nm. The scientists managed to study the silver nanoparticles with their protein corona in situ, without removing them from the biological environment. Owing to the localized surface plasmon resonance used to probe the environment near the surface of the silver nanoparticles, the functions of the soft corona were the primary focus of the investigation.

‘In this work we showed that the corona may affect the ability of the nanoparticles to dissolve into silver cations Ag+, which determine the toxic effect. In the absence of a soft corona (the protein layer that rapidly exchanges with the environment), silver cations bind to the sulfur-containing amino acids in the serum medium, particularly cysteine and methionine, and precipitate as Ag2S nanocrystals in the hard corona,’ says Vladimir Bochenkov.

Ag2S (silver sulfide) famously forms easily on silver surfaces, even in air, in the presence of traces of hydrogen sulfide. Sulfur is also part of many biomolecules in the body, provoking the silver to react and be converted into sulfide. Because Ag2S nanocrystals have very low solubility, their formation reduces the bioavailability of Ag+ ions, reducing the toxicity of silver nanoparticles to practically zero. With a sufficient amount of amino-acid sulfur available for reaction, all the potentially toxic silver is converted into the nontoxic, insoluble sulfide. The scientists showed that this is what happens in the absence of a soft corona.
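
To get a sense of just how insoluble silver sulfide is, here is a rough solubility-product estimate of my own; the Ksp value is an approximate literature figure, not something from this paper,

```python
# Rough equilibrium solubility of Ag2S in water from its solubility product.
# The Ksp value (~6e-51 at 25 C) is an approximate literature figure; treat it as an assumption.
KSP_AG2S = 6e-51

# Ag2S(s) <-> 2 Ag+ + S2-, so Ksp = (2s)^2 * s = 4 s^3 and s = (Ksp / 4) ** (1/3)
s = (KSP_AG2S / 4) ** (1 / 3)        # mol/L of Ag2S that dissolves
ag_plus = 2 * s                      # mol/L of free Ag+ at equilibrium

print(f"Free Ag+ at equilibrium: ~{ag_plus:.1e} mol/L")
# On the order of 1e-17 mol/L: once Ag2S forms, essentially no silver stays bioavailable.
```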

In the presence of a soft corona, the Ag2S nanocrystals are formed in smaller quantities or not at all. The scientists attribute this to the weakly bound protein molecules transferring Ag+ ions from the nanoparticles into the solution before the sulfide can crystallize. Thus, the soft corona proteins act as ‘vehicles’ for the silver ions.

This effect, the scientists believe, should be taken into account when analyzing the stability of silver nanoparticles in a protein environment and when interpreting the results of toxicity studies. Studies of the viability of immune-system cells (J774 murine macrophages) confirmed the reduced toxicity of silver nanoparticles upon sulfidation (in the absence of a soft corona).

Vladimir Bochenkov’s challenge was to simulate the plasmon resonance spectra of the studied systems and to create a theoretical model that allowed quantitative determination of the silver sulfide content around the nanoparticles in situ by following the changes in the absorption bands of the experimental spectra. Since the frequency of the plasmon resonance is sensitive to changes in the dielectric constant near the nanoparticle surface, changes in the absorption spectra contain information about the amount of silver sulfide formed.
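
The physics behind that sensitivity can be sketched with the quasi-static (dipole) resonance condition; the Drude parameters for silver below are rough textbook numbers of my own choosing, not the model used in the paper,

```python
# Quasi-static picture of why a silver nanoparticle's plasmon peak senses its surroundings:
# resonance occurs roughly where Re(eps_metal) = -2 * eps_medium, so a higher-index
# environment (water, protein, or a growing Ag2S shell) red-shifts the absorption band.
# Drude parameters below are rough textbook values for silver (assumed; damping neglected).
import math

EPS_INF = 5.0          # background permittivity of silver (approximate)
E_PLASMA_EV = 9.0      # plasma energy of silver in eV (approximate)

def resonance_wavelength_nm(n_medium):
    # Solve Re(EPS_INF - E_PLASMA_EV**2 / E**2) = -2 * n_medium**2 for the photon energy E.
    e_res_ev = E_PLASMA_EV / math.sqrt(EPS_INF + 2 * n_medium**2)
    return 1239.84 / e_res_ev          # convert eV to nm

for n in (1.00, 1.33, 1.50):           # vacuum, water, a protein-like layer
    print(f"n = {n:.2f}: dipole resonance near {resonance_wavelength_nm(n):.0f} nm")
```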

Knowledge of the mechanisms of formation and the dynamics of the protein corona, along with information about its composition and structure, is extremely important for understanding the toxicity and hazards of nanoparticles for the human body. In the future, protein corona formation could be used to deliver drugs in the body, including for the treatment of cancer. For this purpose it would be enough to choose a protein corona composition that enables the silver nanoparticles to penetrate only the cancer cells and kill them.

Here’s a link to and a citation for the paper describing this fascinating work,

Dynamic protein coronas revealed as a modulator of silver nanoparticle sulphidation in vitro by Teodora Miclăuş, Christiane Beer, Jacques Chevallier, Carsten Scavenius, Vladimir E. Bochenkov, Jan J. Enghild, & Duncan S. Sutherland. Nature Communications 7, Article number: 11770 doi:10.1038/ncomms11770 Published 09 June 2016

This paper is open access.

Mechanically strong organic nanotubes made with light

This research comes from Nagoya University in Japan according to an Aug. 30, 2016 news item on Nanowerk,

Organic nanotubes (ONTs) are tubular nanostructures composed of organic molecules that have unique properties and have found various applications, such as electro-conductive materials and organic photovoltaics. A group of scientists at Nagoya University have developed a simple and effective method for the formation of robust covalent ONTs from simple molecules. This method is expected to be useful in generating a range of nanotube-based materials with desirable properties.

An Aug. 30, 2016 Nagoya University press release (also on EurekAlert), which originated the news item, provides more information,

Kaho Maeda, Dr. Hideto Ito, Professor Kenichiro Itami of the JST-ERATO Itami Molecular Nanocarbon Project and the Institute of Transformative Bio-Molecules (ITbM) of Nagoya University, and their colleagues have reported in the Journal of the American Chemical Society, on the development of a new and simple strategy, “helix-to-tube” to synthesize covalent organic nanotubes.

Organic nanotubes (ONTs) are organic molecules with tubular nanostructures. Nanostructures are structures that range between 1 nm and 100 nm, and ONTs have a nanometer-sized cavity. Various applications of ONTs have been reported, including molecular recognition materials, transmembrane ion channel/sensors, electro-conductive materials, and organic photovoltaics. Most ONTs are constructed by a self-assembly process based on weak non-covalent interactions such as hydrogen bonding, hydrophobic interactions and π-π interactions between aromatic rings. Due to these relatively weak interactions, most non-covalent ONTs possess a relatively fragile structure (Figure 1).

Figure 1. Conventional synthetic method for non-covalent ONTs, their applications and disadvantages.

Covalent ONTs, whose tubular skeletons are cross-linked by covalent bonding (a bond made by sharing of electrons between atoms) could be synthesized from non-covalent ONTs. While covalent ONTs show higher stability and mechanical strength than non-covalent ONTs, the general synthetic strategy for covalent ONTs was yet to be established (Figure 2).

Figure 2. Covalent ONTs derived from non-covalent ONTs by cross-linking, their properties and disadvantages.

A team led by Hideto Ito and Kenichiro Itami has succeeded in developing a simple and effective method for the synthesis of robust covalent ONTs (tube) by an operationally simple light irradiation of a readily accessible helical polymer (helix). This so-called “helix-to-tube” strategy is based on the following steps: 1) polymerization of a small molecule (monomer) to make a helical polymer, followed by 2) light-induced cross-linking at longitudinally repeating pitches across the whole helix to form covalent nanotubes (Figure 3).

Figure 3. New synthetic approach towards covalent ONTs through longitudinal cross-linking between helical pitches in helical polymers.

With their strategy, the team designed and synthesized diacetylene-based helical polymers (acetylenes are molecules that contain carbon-carbon triple bonds), poly(m-phenylene diethynylene)s (poly-PDEs), which has chiral amide side chains that are able to induce a helical folding through hydrogen-bonding interactions (Figure 4).

Figure 4. Molecular design for helical poly-PDE bearing chiral amide side chains.

The researchers revealed that light-induced cross-linking at longitudinally aligned 1,3-butadiyne moieties (a group of molecules that contain four carbons with triple bonds at the first and third carbons) could generate the desired covalent ONT (Figure 5). “This is the first time in the world to show that the photochemical polymerization reaction of diynes is applicable to the cross-linking reaction of a helical polymer,” says Maeda, a graduate student who mainly conducted the experiments.

The “helix-to-tube” method is expected to be able to generate a range of ONT-based materials by simply changing the arene (aromatic ring) unit in the monomer.

Figure 5. Synthesis of a covalent ONT by photochemical cross-linking between longitudinal aligned 1,3-butadiyne moieties (red lines).

“One of the most difficult parts of this research was how to obtain scientific evidence on the structures of poly-PDEs and covalent ONTs,” says Ito, one of the leaders of this study. “We had little experience with the analysis of polymers and macromolecules such as ONTs. Fortunately, thanks to the support of our collaborators in Nagoya University, who are specialists in these particular research fields, we finally succeeded in characterizing these macromolecules by various techniques including spectroscopy, X-ray diffraction, and microscopy.”

“Although it took us about a year to synthesize the covalent ONT, it took another one and a half year to determine the structure of the nanotube,” says Maeda. “I was extremely excited when I first saw the transmission electron microscopy (TEM) images, which indicated that we had actually made the covalent ONT that we were expecting,” she continues (Figure 6).

Figure 6. TEM images of the bundle structures of the covalent ONT.

“The best part of the research for me was finding that the photochemical cross-linking had taken place on the helix for the first time,” says Maeda. “In addition, photochemical cross-linking is known to usually occur in the solid phase, but we were able to show that the reaction takes place in the solution phase as well. As the reactions have never been carried out before, I was dubious at first, but it was a wonderful feeling to succeed in making the reaction work for the first time in the world. I can say for sure that this was a moment where I really found research interesting.”

“We were really excited to develop this simple yet powerful method to achieve the synthesis of covalent ONTs,” says Itami, the director of the JST-ERATO project and the center director of ITbM. “The “helix-to-tube” method enables molecular level design and will lead to the synthesis of various covalent ONTs with fixed diameters and tube lengths with desirable functionalities.”

“We envisage that ongoing advances in the “helix-to-tube” method may lead to the development of various ONT-based materials including electro-conductive materials and luminescent materials,” says Ito. “We are currently carrying out work on the “helix-to-tube” methodology and we hope to synthesize covalent ONTs with interesting properties for various applications.”

Here’s a link to and a citation for the paper,

Construction of Covalent Organic Nanotubes by Light-Induced Cross-Linking of Diacetylene-Based Helical Polymers by Kaho Maeda, Liu Hong, Taishi Nishihara, Yusuke Nakanishi, Yuhei Miyauchi, Ryo Kitaura, Naoki Ousaka, Eiji Yashima, Hideto Ito, and Kenichiro Itami. J. Am. Chem. Soc., Article ASAP DOI: 10.1021/jacs.6b05582 Publication Date (Web): August 3, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.