‘Smart’ fabric that’s bony

Researchers at Australia’s University of New South Wales (UNSW) have devised a means of ‘weaving’ a material that mimics the bone tissue periosteum, according to a Jan. 11, 2017 news item on ScienceDaily,

For the first time, UNSW [University of New South Wales] biomedical engineers have woven a ‘smart’ fabric that mimics the sophisticated and complex properties of one of nature’s ingenious materials, the bone tissue periosteum.

Having achieved proof of concept, the researchers are now ready to produce fabric prototypes for a range of advanced functional materials that could transform the medical, safety and transport sectors. Patents for the innovation are pending in Australia, the United States and Europe.

Potential future applications range from protective suits that stiffen under high impact for skiers, racing-car drivers and astronauts, through to ‘intelligent’ compression bandages for deep-vein thrombosis that respond to the wearer’s movement and safer steel-belt radial tyres.

A Jan. 11, 2017 UNSW press release on EurekAlert, which originated the news item, expands on the theme,

Many animal and plant tissues exhibit ‘smart’ and adaptive properties. One such material is the periosteum, a soft tissue sleeve that envelops most bony surfaces in the body. The complex arrangement of collagen, elastin and other structural proteins gives periosteum amazing resilience and provides bones with added strength under high impact loads.

Until now, a lack of scalable ‘bottom-up’ approaches by researchers has stymied their ability to use smart tissues to create advanced functional materials.

UNSW’s Paul Trainor Chair of Biomedical Engineering, Professor Melissa Knothe Tate, said her team had for the first time mapped the complex tissue architectures of the periosteum, visualised them in 3D on a computer, scaled up the key components and produced prototypes using weaving loom technology.

“The result is a series of textile swatch prototypes that mimic periosteum’s smart stress-strain properties. We have also demonstrated the feasibility of using this technique to test other fibres to produce a whole range of new textiles,” Professor Knothe Tate said.

In order to understand the functional capacity of the periosteum, the team used an incredibly high fidelity imaging system to investigate and map its architecture.

“We then tested the feasibility of rendering periosteum’s natural tissue weaves using computer-aided design software,” Professor Knothe Tate said.

The computer modelling allowed the researchers to scale up nature’s architectural patterns to weave periosteum-inspired, multidimensional fabrics using a state-of-the-art computer-controlled jacquard loom. The loom is known as the original rudimentary computer, first unveiled in 1801.

“The challenge with using collagen and elastin is that their fibres are too small to fit into the loom. So we used elastic material that mimics elastin and silk that mimics collagen,” Professor Knothe Tate said.

In a first test of the scaled-up tissue weaving concept, a series of textile swatch prototypes were woven, using specific combinations of collagen and elastin in a twill pattern designed to mirror periosteum’s weave. Mechanical testing of the swatches showed they exhibited similar properties found in periosteum’s natural collagen and elastin weave.

First author and biomedical engineering PhD candidate, Joanna Ng, said the technique had significant implications for the development of next-generation advanced materials and mechanically functional textiles.

While the materials produced by the jacquard loom have potential manufacturing applications – one tyremaker believes a titanium weave could spawn a new generation of thinner, stronger and safer steel-belt radials – the UNSW team is ultimately focused on the machine’s human potential.

“Our longer term goal is to weave biological tissues – essentially human body parts – in the lab to replace and repair our failing joints that reflect the biology, architecture and mechanical properties of the periosteum,” Ms Ng said.

An NHMRC development grant received in November [2016] will allow the team to take its research to the next phase. The researchers will work with the Cleveland Clinic and the University of Sydney’s Professor Tony Weiss to develop and commercialise prototype bone implants for pre-clinical research, using the ‘smart’ technology, within three years.

In searching for more information about this work, I found a Winter 2015 article (PDF; pp. 8-11) by Amy Coopes and Steve Offner for UNSW Magazine about Knothe Tate and her work (Note: In Australia, winter would be what we in the Northern Hemisphere consider summer),

Tucked away in a small room in UNSW’s Graduate School of Biomedical Engineering sits a 19th century–era weaver’s wooden loom. Operated by punch cards and hooks, the machine was the first rudimentary computer when it was unveiled in 1801. While on the surface it looks like a standard Jacquard loom, it has been enhanced with motherboards integrated into each of the loom’s five hook modules and connected to a computer. This state-of-the-art technology means complex algorithms control each of the 5,000 feed-in fibres with incredible precision.

That capacity means the loom can weave with an extraordinary variety of substances, from glass and titanium to rayon and silk, a development that has attracted industry attention around the world.

The interest lies in the natural advantage woven materials have over other manufactured substances. Instead of manipulating material to create new shades or hues as in traditional weaving, the fabrics’ mechanical properties can be modulated, to be stiff at one end, for example, and more flexible at the other.

“Instead of a pattern of colours we get a pattern of mechanical properties,” says Melissa Knothe Tate, UNSW’s Paul Trainor Chair of Biomedical Engineering. “Think of a rope; it’s uniquely good in tension and in bending. Weaving is naturally strong in that way.”

The interface of mechanics and physiology is the focus of Knothe Tate’s work. In March [2015], she travelled to the United States to present another aspect of her work at a meeting of the international Orthopedic Research Society in Las Vegas. That project – which has been dubbed “Google Maps for the body” – explores the interaction between cells and their environment in osteoporosis and other degenerative musculoskeletal conditions such as osteoarthritis.

Using previously top-secret semiconductor technology developed by optics giant Zeiss, and the same approach used by Google Maps to locate users with pinpoint accuracy, Knothe Tate and her team have created “zoomable” anatomical maps from the scale of a human joint down to a single cell.
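
The “zoomable” maps described here rest on the same idea as web map tiling: a multi-resolution image pyramid in which each zoom level doubles the resolution. As a back-of-the-envelope sketch (the specific scales below are my own illustrative assumptions, not figures from the article), only a modest number of levels is needed to span the roughly four orders of magnitude between a human joint and a single cell:

```python
import math

# Assumed scales (illustrative only, not from the article):
joint_scale_m = 0.1    # field of view at the coarsest zoom level (~10 cm)
cell_scale_m = 10e-6   # detail resolved at the finest zoom level (~10 µm)

# In a tiled image pyramid (the scheme behind slippy maps such as
# Google Maps), each zoom level doubles the resolution, so the number
# of levels needed is log2 of the scale ratio, rounded up.
levels = math.ceil(math.log2(joint_scale_m / cell_scale_m))
print(levels)  # 14 doublings cover joint-to-cell
```

Fourteen doublings is small by web-mapping standards (global maps commonly use around 20 zoom levels), which is why the tiling approach transfers so naturally to anatomical imaging.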

She has also spearheaded a groundbreaking partnership that includes the Cleveland Clinic, and Brown and Stanford universities to help crunch terabytes of data gathered from human hip studies – all processed with the Google technology. Analysis that once took 25 years can now be done in a matter of weeks, bringing researchers ever closer to a set of laws that govern biological behaviour. [p. 9]

I gather she was recruited from the US to work at the University of New South Wales and this article was to highlight why they recruited her and to promote the university’s biomedical engineering department, which she chairs.

Getting back to 2017, here’s a link to and citation for the paper,

Scale-up of nature’s tissue weaving algorithms to engineer advanced functional materials by Joanna L. Ng, Lillian E. Knothe, Renee M. Whan, Ulf Knothe & Melissa L. Knothe Tate. Scientific Reports 7, Article number: 40396 (2017) doi:10.1038/srep40396 Published online: 11 January 2017

This paper is open access.

One final comment: that’s a lot of people (three out of five) with the last name Knothe in the author list for the paper.

Canadian Science Policy Conference inaugurates Lecture Series: Science Advice in a Troubled World

The Canadian Science Policy Centre (CSPC) launched a lecture series on Monday, Jan. 16, 2017 with Sir Peter Gluckman as the first speaker in a talk titled, Science Advice in a Troubled World. From a Jan. 18, 2017 CSPC announcement (received via email),

The inaugural session of the Canadian Science Policy Lecture Series was hosted by ISSP [the University of Ottawa’s Institute for Science, Society and Policy] on Monday, January 16th [2017] at the University of Ottawa. Sir Peter Gluckman, Chief Science Advisor to the Prime Minister of New Zealand, gave a presentation titled “Science Advise [sic] in a troubled world”. For a summary of the event, video and pictures please visit the event page.

The session started with speeches by Monica Gattinger, Director, Institute for Science, Society and Policy; Jacques Frémont, President of the University of Ottawa; as well as Mehrdad Hariri, CEO and President of the Canadian Science Policy Centre (CSPC).

The talk itself is about 50 mins., but there are lengthy introductions, including a rather unexpected (by me) reference to the recent US election from the president of the University of Ottawa, Jacques Frémont (formerly the head of Québec’s Human Rights Commission). There were also a number of questions after the talk, so the running time for the video is 1 hr. 12 mins.

Here’s a bit more information about Sir Peter, from the Science Advice in a Troubled World event page on the CSPC website,

Sir Peter Gluckman ONZ FRS is the first Chief Science Advisor to the Prime Minister of New Zealand, having been appointed in 2009. He is also science envoy and advisor to the Ministry of Foreign Affairs and Trade. He is chair of the International Network for Government Science Advice (INGSA), which operates under the aegis of the International Council for Science (ICSU). He chairs the APEC Chief Science Advisors and Equivalents group and is the coordinator of the secretariat of the Small Advanced Economies Initiative. In 2016 he received the AAAS award in Science Diplomacy. He trained as a pediatric and biomedical scientist and holds a Distinguished University Professorship at the Liggins Institute of the University of Auckland. He has published over 700 scientific papers and several technical and popular science books. He has received the highest scientific (Rutherford Medal) and civilian (Order of New Zealand, limited to 20 living persons) honours in NZ and numerous international scientific awards. He is a Fellow of the Royal Society of London, a member of the National Academy of Medicine (USA) and a fellow of the Academy of Medical Sciences (UK).

I listened to the entire video and Gluckman presented a thoughtful, nuanced lecture in which he also mentioned Calestous Juma and his 2016 book, Innovation and Its Enemies (btw, I will be writing a commentary about Juma’s extraordinary effort). He also referenced the concepts of post-truth and post-trust, and made an argument for viewing evidence-based science as part of the larger policymaking process rather than the dominant or only factor. From the Science Advice in a Troubled World event page,

Lecture Introduction

The world is facing many challenges from environmental degradation and climate change to global health issues, and many more.  Societal relationships are changing; sources of information, reliable and otherwise, and their transmission are affecting the nature of public policy.

Within this context the question arises: how can scientific advice to governments help address these emerging issues in a more unstable and uncertain world?
The relationship between science and politics is complex and the challenges at their interface are growing. What does scientific advice mean within this context?
How can science better inform policy where decision making is increasingly made against a background of post-truth polemic?

I’m not in perfect agreement with Gluckman with regard to post-truth as I have been influenced by an essay of Steve Fuller’s suggesting that science too can be post-truth. (Fuller’s essay was highlighted in my Jan. 6, 2017 posting.)

Gluckman seems to be wielding a fair amount of influence on the Canadian scene. This is his second CSPC visit in the last few months. He was an invited speaker at the Eighth Annual CSPC conference in November 2016 and, while he’s here in Jan. 2017, he’s chairing the Canadian Institutes of Health Research (CIHR) International Panel on Peer Review. (The CIHR is one of Canada’s three major government funding agencies for the sciences.)

In other places too: he’s going to be a member of a panel at the Oxford Martin School at the University of Oxford in late January 2017. From the “Is a post-truth world a post-expert world?” event page on the Oxford Martin webspace,

Winston Churchill advised that “experts should be on tap but never on top”. In 2017, is a post-truth world a post-expert world? What does this mean for future debates on difficult policy issues? And what place can researchers usefully occupy in an academic landscape that emphasises policy impact but a political landscape that has become wary of experts? Join us for a lively discussion on academia and the provision of policy advice, examining the role of evidence and experts and exploring how gaps with the public and politicians might be bridged.

This event will be chaired by Achim Steiner, Director of the Oxford Martin School and former Executive Director of the United Nations Environment Programme, with panellists including Oxford Martin Visiting Fellow Professor Sir Peter Gluckman, Chief Science Advisor to the Prime Minister of New Zealand and Chair of the International Network for Government Science Advice; Dr Gemma Harper, Deputy Director for Marine Policy and Evidence and Chief Social Scientist in the Department for Environment, Food and Rural Affairs (Defra), and Professor Stefan Dercon, Chief Economist of the Department for International Development (DFID) and Professor of Economic Policy at the Blavatnik School of Government.

This discussion will be followed by a drinks reception, all welcome.

Here are the logistics should you be lucky enough to be able to attend (from the event page),

25 January 2017 17:00 – 18:15

Lecture Theatre, Oxford Martin School

34 Broad Street (corner of Holywell and Catte Streets)

Registration (right-hand column) is free.

Finally, Gluckman published a paper on the digital economy in Nov. 2016, which can be found here (PDF).

Investigating nanoparticles and their environmental impact for industry?

It seems the Center for the Environmental Implications of Nanotechnology (CEINT) at Duke University (North Carolina, US) is making an adjustment to its focus and opening the door to industry, as well as government, research. For some years (my first post about the CEINT at Duke University is an Aug. 15, 2011 post about its mesocosms), it has focused on examining the impact of nanoparticles (also called nanomaterials) on plant life and aquatic systems. This Jan. 9, 2017 US National Science Foundation (NSF) news release (h/t Jan. 9, 2017 Nanotechnology Now news item) provides a general description of the work,

We can’t see them, but nanomaterials, both natural and manmade, are literally everywhere, from our personal care products to our building materials–we’re even eating and drinking them.

At the NSF-funded Center for Environmental Implications of Nanotechnology (CEINT), headquartered at Duke University, scientists and engineers are researching how some of these nanoscale materials affect living things. One of CEINT’s main goals is to develop tools that can help assess possible risks to human health and the environment. A key aspect of this research happens in mesocosms, which are outdoor experiments that simulate the natural environment – in this case, wetlands. These simulated wetlands in Duke Forest serve as a testbed for exploring how nanomaterials move through an ecosystem and impact living things.

CEINT is a collaborative effort bringing together researchers from Duke, Carnegie Mellon University, Howard University, Virginia Tech, University of Kentucky, Stanford University, and Baylor University. CEINT academic collaborations include on-going activities coordinated with faculty at Clemson, North Carolina State and North Carolina Central universities, with researchers at the National Institute of Standards and Technology and the Environmental Protection Agency labs, and with key international partners.

The research in this episode was supported by NSF award #1266252, Center for the Environmental Implications of NanoTechnology.

The mention of industry is in this video by O’Brien and Kellan, which describes CEINT’s latest work,

Somewhat similar in approach, although without a direct reference to industry, Canada’s Experimental Lakes Area (ELA) is being used as a test site for silver nanoparticles. Here’s more from the Distilling Science at the Experimental Lakes Area: Nanosilver project page,

Water researchers are interested in nanotechnology, and one of its most commonplace applications: nanosilver. Today these tiny particles with anti-microbial properties are being used in a wide range of consumer products. The problem with nanoparticles is that we don’t fully understand what happens when they are released into the environment.

The research at the IISD-ELA [International Institute for Sustainable Development Experimental Lakes Area] will look at the impacts of nanosilver on ecosystems. What happens when it gets into the food chain? And how does it affect plants and animals?

Here’s a video describing the Nanosilver project at the ELA,

You may have noticed a certain tone to the video and it is due to some political shenanigans, which are described in this Aug. 8, 2016 article by Bartley Kives for the Canadian Broadcasting Corporation’s (CBC) online news.

Prawn (shrimp) shopping bags and saving the earth

Using a material (shrimp shells) that is disposed of as waste to create a biodegradable product (shopping bags) can only be described as a major win. A Jan. 10, 2017 news item on Nanowerk makes the announcement,

Bioengineers at The University of Nottingham are trialling how to use shrimp shells to make biodegradable shopping bags, as a ‘green’ alternative to oil-based plastic, and as a new food packaging material to extend product shelf life.

The new material for these affordable ‘eco-friendly’ bags is being optimised for Egyptian conditions, as effective waste management is one of the country’s biggest challenges.

An expert in testing the properties of materials, Dr Nicola Everitt from the Faculty of Engineering at Nottingham, is leading the research together with academics at Nile University in Egypt.

“Non-degradable plastic packaging is causing environmental and public health problems in Egypt, including contamination of water supplies which particularly affects living conditions of the poor,” explains Dr Everitt.

Natural biopolymer products made from plant materials are a ‘green’ alternative growing in popularity, but with competition for land with food crops, it is not a viable solution in Egypt.

A Jan. 10, 2017 University of Nottingham press release, which originated the news item, expands on the theme,

This new project aims to turn shrimp shells, which are part of the country’s waste problem, into part of the solution.

Dr Everitt said: “Use of a degradable biopolymer made of prawn shells for carrier bags would lead to lower carbon emissions and reduce food and packaging waste accumulating in the streets or at illegal dump sites. It could also make exports more acceptable to a foreign market within a 10-15-year time frame. All priorities at a national level in Egypt.”

Degradable nanocomposite material

The research is being undertaken to produce an innovative biopolymer nanocomposite material which is degradable, affordable and suitable for shopping bags and food packaging.

Chitosan is a man-made polymer derived from the organic compound chitin, which is extracted from shrimp shells, first using acid (to remove the calcium carbonate “backbone” of the crustacean shell) and then alkali (to produce the long molecular chains which make up the biopolymer).

The dried chitosan flakes can then be dissolved into solution and polymer film made by conventional processing techniques.

Chitosan was chosen because it is a promising biodegradable polymer already used in pharmaceutical packaging due to its antimicrobial, antibacterial and biocompatible properties. The second strand of the project is to develop an active polymer film that absorbs oxygen.

Enhancing food shelf life and cutting food waste

This future generation food packaging could have the ability to enhance food shelf life with high efficiency and low energy consumption, making a positive impact on food wastage in many countries.

If successful, Dr Everitt plans to approach UK packaging manufacturers with the product.

Additionally, the research aims to identify a production route by which these degradable biopolymer materials for shopping bags and food packaging could be manufactured.

I also found the funding for this project to be of interest (from the press release),

The project is sponsored by the Newton Fund and the Newton-Mosharafa Fund grant and is one of 13 Newton-funded collaborations for The University of Nottingham.

The collaborations are designed to tackle community issues through science and innovation, with links formed with countries such as Brazil, Egypt, the Philippines and Indonesia.

Since the Newton Fund was established in 2014, the University has been awarded a total of £4.5m in funding. It also boasts the highest number of institutional-led collaborations.

Professor Nick Miles Pro-Vice-Chancellor for Global Engagement said: “The University of Nottingham has a long and established record in global collaboration and research.

The Newton Fund plays to these strengths and enables us to work with institutions around the world to solve some of the most pressing issues facing communities.”

From a total of 68 universities, The University of Nottingham has emerged as the top awardee of British Council Newton Fund Institutional Links grants (13) and is joint top awardee from a total of 160 institutions competing for British Council Newton Fund Researcher Links Workshop awards (6).

Professor Miles added: “This is testament to the incredible research taking place across the University – both here in the UK and in the campuses in Malaysia and China – and underlines the strength of our research partnerships around the world.”

That’s it!

Panasonic and its next generation makeup mirror

Before leaping to Panasonic’s latest makeup mirror news, here’s an earlier iteration of their product at the 2016 Consumer Electronics Show (CES),

That was posted on Jan. 10, 2016 by Makeup University.

Panasonic has come back in 2017 to hype its “Snow Beauty Mirror,” a product which builds on its predecessor’s abilities by allowing the mirror to create a makeup look, which it then produces for the user. At least, they hope it will—in 2020. From a Jan. 8, 2017 article by Shusuke Murai about the mirror and Japan’s evolving appliances market for The Japan Times,

Panasonic Corp. is developing a “magic” mirror for 2020 that will use nanotechnology for high-definition TVs to offer advice on how to become more beautiful.

The aim of the Snow Beauty Mirror is “to let people become what they want to be,” said Panasonic’s Sachiko Kawaguchi, who is in charge of the product’s development.

“Since 2012 or 2013, many female high school students have taken advantage of blogs and other platforms to spread their own messages,” Kawaguchi said. “Now the trend is that, in this digital era, they change their faces (on a photo) as they like to make them appear as they want to be.”

When one sits in front of the computerized mirror, a camera and sensors start scanning the face to check the skin. It then shines a light to analyze reflection and absorption rates, find flaws like dark spots, wrinkles and large pores, and offer tips on how to improve appearances.

But this is when the real “magic” begins.

Tap print on the results screen and a special printer for the mirror churns out an ultrathin, 100-nanometer makeup-coated patch that is tailor-made for the person examined.

The patch is made of a safe material often used for surgery so it can be directly applied to the face. Once the patch settles, it is barely noticeable and resists falling off unless sprayed with water.

The technologies behind the patch involve Panasonic’s know-how in organic light-emitting diodes (OLED), Kawaguchi said. By using the company’s technology to spray OLED material precisely onto display substrates, the printer connected to the computerized mirror prints a makeup ink that is made of material similar to that used in foundation, she added.

Though the product is still in the early stages of development, Panasonic envisions the mirror allowing users to download their favorite makeups from a database and apply them. It also believes the makeup sheet can be used to cover blemishes and birthmarks.

Before coming up with the smart mirror, Panasonic conducted a survey involving more than 50 middle- to upper-class women from six major Asian cities whose ages ranged from their 20s to 40s about makeup habits and demands.

Some respondents said they were not sure how to care for their skin to make it look its best, while others said they were hesitant to visit makeup counters in department stores.

“As consumer needs are becoming increasingly diverse, the first thing to do is to offer a tailor-made solution to answer each individual’s needs,” Kawaguchi said.

Panasonic aims to introduce the smart mirror and cosmetics sheets at department stores and beauty salons by 2020.

But Kawaguchi said there are many technological and marketing hurdles that must first be overcome — including how to mass-produce the ultrathin sheets.

“We are still at about 30 percent of overall progress,” she said, adding that the company hopes to market the makeup sheet at a price as low as foundation and concealer combined.

“I hope that, by 2020, applying facial sheets will become a major way to do makeup,” she said.

For anyone interested in Japan’s appliances market, please read Murai’s article in its entirety.

US Environmental Protection Agency finalizes its one-time reporting requirements for nanomaterials

The US Environmental Protection Agency (EPA) has announced its one-time reporting requirement for nanomaterials. From a Jan. 12, 2017 news item on Nanowerk,

The U.S. Environmental Protection Agency (EPA) is requiring one-time reporting and recordkeeping requirements on nanoscale chemical substances in the marketplace. These substances are nano-sized versions of chemicals that are already in the marketplace.

EPA seeks to facilitate innovation while ensuring safety of the substances. EPA currently reviews new chemical substances manufactured or processed as nanomaterials prior to introduction into the marketplace to ensure that they are safe.

For the first time, EPA is using [the] TSCA [Toxic Substances Control Act] to collect existing exposure and health and safety information on chemicals currently in the marketplace when manufactured or processed as nanoscale materials.

The companies will notify EPA of certain information:
– specific chemical identity;
– production volume;
– methods of manufacture;
– processing, use, exposure, and release information; and
– available health and safety data.

David Stegon writes about the requirement in a Jan. 12, 2017 posting on Chemical Watch,

The US EPA has finalised its nanoscale materials reporting rule, completing a process that began more than 11 years ago.

The US position contrasts with that of the European Commission, which has rejected the idea of a specific mandatory reporting obligation for nanomaterials. Instead it insists such data can be collected under REACH’s registration rules for substances in general. It has told ECHA [European Chemicals Agency] to develop ‘nano observatory’ pages on its website with existing nanomaterial information. Meanwhile, Canada set its reporting requirements in 2015.

The US rule, which comes under section 8(a) of TSCA, will take effect 120 days after publication in the Federal Register.

It defines nanomaterials as chemical substances that are:

  • solids at 25 degrees Celsius at standard atmospheric pressure;
  • manufactured or processed in a form where any particles, including aggregates and agglomerates, are between 1 and 100 nanometers (nm) in at least one dimension; and
  • manufactured or processed to exhibit one or more unique and novel properties.

The rule does not apply to chemical substances manufactured or processed in forms that contain less than 1% by weight of any particles between 1 and 100nm.
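
Taken together, the definition and the exemption amount to a small set of yes/no tests. A minimal sketch of that logic, assuming simplified inputs (the function and parameter names are my own, and the rule’s actual legal tests are more nuanced than this):

```python
def is_reportable(solid_at_25c, particle_size_nm,
                  has_size_dependent_property, weight_fraction_nano):
    """Rough paraphrase of the TSCA section 8(a) nanoscale reporting tests."""
    if not solid_at_25c:                       # must be a solid at 25 °C, 1 atm
        return False
    if not (1 <= particle_size_nm <= 100):     # at least one dimension 1-100 nm
        return False
    if not has_size_dependent_property:        # must show unique/novel properties
        return False
    if weight_fraction_nano < 0.01:            # <1% by weight of nano particles is exempt
        return False
    return True

# A 50 nm solid with a size-dependent property and a 5% nano fraction:
print(is_reportable(True, 50, True, 0.05))   # True
# Same size, but behaves like the bulk chemical:
print(is_reportable(True, 50, False, 0.05))  # False
```

Note how the second call captures the EPA’s clarification quoted above: being in the 1-100nm size range is not enough on its own; the substance must also have a property that differs from the same chemical at larger sizes.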

Taking account of comments received on the rulemaking, the EPA made three changes to the proposed definition:

  • it added the definition of unique and novel properties to help identify substances that act differently at nano sizes;
  • it clarified that a substance is not a nanomaterial if it fits the specified size range, but does not have a size-dependent property that differs from the same chemical at sizes greater than 100nm; and
  • it eliminated part of the nanomaterial definition that had said a reportable chemical may not include a substance that only has trace amounts of primary particles, aggregates, or agglomerates in the size range of 1 to 100nm.

The EPA has added the new information gathering rule (scroll down about 50% of the way) on its Control of Nanoscale Materials under the Toxic Substances Control Act webpage.

There’s also this Jan. 17, 2017 article by Meagan Parrish for ChemInfo, which provides an alternative perspective and includes what appears to be some misinformation (Note: A link has been removed),

It was several years in the making, but in the final stages of its rule-making process for nanomaterial reporting, the Environmental Protection Agency declined to consider feedback from the industry.

Now, with the final language published and the rule set to go into effect in May, some in the industry are concerned that the agency is requiring an unnecessary amount of costly reporting that isn’t likely to reveal potential hazards. The heightened regulations could also hamper the pace of innovation underway in the industry.

“The poster child for nanotechnology is carbon nanotubes,” says James Votaw, a partner with Manatt, Phelps & Phillips, of the form of carbon that is 10,000 times smaller than a human hair but stronger than steel. “It can be used to make very strong materials and as an additive in plastics to make them electrically conductive or stiffer.”

The EPA has been attempting to define nanomaterials since 2004 and assess the potential for environmental or human health risks associated with their use. In 2008, the EPA launched an effort to collect voluntarily submitted information from key players in the industry, but after a few years, the agency wasn’t happy with the number of responses. The effort to create a mandatory reporting requirement was launched in 2010.

Yet, according to Votaw, after the 2015 proposal of the rule was extensively criticized by the industry for being overly ambiguous and overly inclusive in its coverage, the industry asked the EPA to reopen a dialogue on the rule. The EPA declined.

The new reporting requirement is expected to cost companies about $27.79 million during the first year and $3.09 million in subsequent years. [emphasis mine]

As far as I’m aware, this is a one-time reporting requirement. Although I’m sure many would like to see that change.

As for the Canadian situation, I mentioned the nanomaterials mandatory survey noted in Stegon’s piece in a July 29, 2015 posting. It was one of a series of mandatory surveys (currently, a survey on asbestos is underway) issued as part of Canada’s Chemicals Management Plan. You can find more information about the nanomaterials notice and the approach to the survey, although there doesn’t appear to have been a report made public; perhaps it’s too soon. From the Nanomaterials Mandatory Survey page,

The Government of Canada is undertaking a stepwise approach to address nanoscale forms of substances on the DSL. The proposed approach consists of three phases:

  • Establishment of a list of existing nanomaterials in Canada (this includes the section 71 Notice);
  • Prioritization of existing nanomaterials for action; and
  • Action on substances identified for further work.

The overall approach was first described in a consultation document entitled Proposed Approach to Address Nanoscale Forms of Substances on the Domestic Substances List, published on March 18, 2015. This consultation document was open for a 60-day public comment period to solicit feedback from stakeholders, particularly on the first phase of the approach.

A second consultation document entitled Proposed Prioritization Approach for Nanoscale Forms of Substances on the Domestic Substances List was published on July 27, 2016. In this document, the approach proposed for prioritization of existing nanomaterials on the DSL is described, taking into consideration the results of the section 71 Notice.  Comments on this consultation document may be submitted prior to September 25, 2016 …

I look forward to discovering a report on the Canadian nanomaterials survey should one be made public.

Essays on Frankenstein

Slate.com is dedicating a month (January 2017) to Frankenstein. This means there will be one or more essays each week on one aspect or another of Frankenstein and science. These essays are one of a series of initiatives jointly supported by Slate, Arizona State University, and an organization known as New America. It gets confusing since these essays are listed as part of two initiatives: Futurography and Future Tense.

The really odd part, as far as I’m concerned, is that there is no mention of Arizona State University’s (ASU) The Frankenstein Bicentennial Project (mentioned in my Oct. 26, 2016 posting). Perhaps they’re concerned that people will think ASU is advertising the project?


Getting back to the essays, a Jan. 3, 2017 article by Jacob Brogan explains, in a ‘Question and Answer’ format, why the book and the monster maintain popular interest after two centuries (Note: We never do find out who or how many people are supplying the answers),

OK, fine. I get that this book is important, but why are we talking about it in a series about emerging technology?

Though people still tend to weaponize it as a simple anti-scientific screed, Frankenstein, which was first published in 1818, is much richer when we read it as a complex dialogue about our relationship to innovation—both our desire for it and our fear of the changes it brings. Mary Shelley was just a teenager when she began to compose Frankenstein, but she was already grappling with our complex relationship to new forces. Almost two centuries on, the book is just as propulsive and compelling as it was when it was first published. That’s partly because it’s so thick with ambiguity—and so resistant to easy interpretation.

Is it really ambiguous? I mean, when someone calls something frankenfood, they aren’t calling it “ethically ambiguous food.”

It’s a fair point. For decades, Frankenstein has been central to discussions in and about bioethics. Perhaps most notably, it frequently crops up as a reference point in discussions of genetically modified organisms, where the prefix Franken- functions as a sort of convenient shorthand for human attempts to meddle with the natural order. Today, the most prominent flashpoint for those anxieties is probably the clustered regularly interspaced short palindromic repeats, or CRISPR, gene-editing technique [emphasis mine]. But it’s really oversimplifying to suggest Frankenstein is a cautionary tale about monkeying with life.

As we’ll see throughout this month on Futurography, it’s become a lens for looking at the unintended consequences of things like synthetic biology, animal experimentation, artificial intelligence, and maybe even social networking. Facebook, for example, has arguably taken on a life of its own, as its algorithms seem to influence the course of elections. Mark Zuckerberg, who’s sometimes been known to disavow the power of his own platform, might well be understood as a Frankensteinian figure, amplifying his creation’s monstrosity by neglecting its practical needs.

But this book is almost 200 years old! Surely the actual science in it is bad.

Shelley herself would probably be the first to admit that the science in the novel isn’t all that accurate. Early in the novel, Victor Frankenstein meets with a professor who castigates him for having read the wrong works of “natural philosophy.” Shelley’s protagonist has mostly been studying alchemical tomes and otherwise fantastical works, the sort of things that were recognized as pseudoscience, even by the standards of the day. Near the start of the novel, Frankenstein attends a lecture in which the professor declaims on the promise of modern science. He observes that where the old masters “promised impossibilities and performed nothing,” the new scientists achieve far more in part because they “promise very little; they know that metals cannot be transmuted and that the elixir of life is a chimera.”

Is it actually about bad science, though?

Not exactly, but it has been read as a story about bad scientists.

Ultimately, Frankenstein outstrips his own teachers, of course, and pulls off the very feats they derided as mere fantasy. But Shelley never seems to confuse fact and fiction, and, in fact, she largely elides any explanation of how Frankenstein pulls off the miraculous feat of animating dead tissue. We never actually get a scene of the doctor awakening his creature. The novel spends far more time dwelling on the broader reverberations of that act, showing how his attempt to create one life destroys countless others. Read in this light, Frankenstein isn’t telling us that we shouldn’t try to accomplish new things, just that we should take care when we do.

This speaks to why the novel has stuck around for so long. It’s not about particular scientific accomplishments but the vagaries of scientific progress in general.

Does that make it into a warning against playing God?

It’s probably a mistake to suggest that the novel is just a critique of those who would usurp the divine mantle. Instead, you can read it as a warning about the ways that technologists fall short of their ambitions, even in their greatest moments of triumph.

Look at what happens in the novel: After bringing his creature to life, Frankenstein effectively abandons it. Later, when it entreats him to grant it the rights it thinks it deserves, he refuses. Only then—after he reneges on his responsibilities—does his creation really go bad. We all know that Frankenstein is the doctor and his creation is the monster, but to some extent it’s the doctor himself who’s made monstrous by his inability to take responsibility for what he’s wrought.

I encourage you to read Brogan’s piece in its entirety and perhaps supplement the reading. Mary Shelley has a pretty interesting history. In 1814, at the age of seventeen, she ran off with Percy Bysshe Shelley, who was married to another woman at the time. Her parents, William Godwin and Mary Wollstonecraft, were both well known and respected intellectuals and philosophers. By the time Mary Shelley wrote her book, her first baby had died and she had given birth to a second child, a boy. That son and a third child also died young, as did Percy Shelley a few years later. (Her fourth child, born in 1819, did survive.) I mention the births because one analysis I read suggests the novel is also a commentary on childbirth. In fact, the Frankenstein narrative has been examined from many perspectives (other than science), including feminism and LGBTQ studies.

Getting back to the science fiction end of things, the next part of the Futurography series is titled “A Cheat-Sheet Guide to Frankenstein” and that too is written by Jacob Brogan with a publication date of Jan. 3, 2017,

Key Players

Marilyn Butler: Butler, a literary critic and English professor at the University of Cambridge, authored the seminal essay “Frankenstein and Radical Science.”

Jennifer Doudna: A professor of chemistry and biology at the University of California, Berkeley, Doudna helped develop the CRISPR gene-editing technique [emphasis mine].

Stephen Jay Gould: Gould is an evolutionary biologist and has written in defense of Frankenstein’s scientific ambitions, arguing that hubris wasn’t the doctor’s true fault.

Seán Ó hÉigeartaigh: As executive director of the Centre for the Study of Existential Risk at the University of Cambridge, Ó hÉigeartaigh leads research into technologies that threaten the existence of our species.

Jim Hightower: This columnist and activist helped popularize the term frankenfood to describe genetically modified crops.

Mary Shelley: Shelley, the author of Frankenstein, helped create science fiction as we now know it.

J. Craig Venter: A leading genomic researcher, Venter has pursued a variety of human biotechnology projects.






‘Franken’ and CRISPR

The first essay is a Jan. 6, 2017 article by Katy Waldman focusing on the ‘franken’ prefix (Note: links have been removed),

In a letter to the New York Times on June 2, 1992, an English professor named Paul Lewis lopped off the top of Victor Frankenstein’s surname and sewed it onto a tomato. Railing against genetically modified crops, Lewis put a new generation of natural philosophers on notice: “If they want to sell us Frankenfood, perhaps it’s time to gather the villagers, light some torches and head to the castle,” he wrote.

William Safire, in a 2000 New York Times column, tracked the creation of the franken- prefix to this moment: an academic channeling popular distrust of science by invoking the man who tried to improve upon creation and ended up disfiguring it. “There’s no telling where or how it will end,” he wrote wryly, referring to the spread of the construction. “It has enhanced the sales of the metaphysical novel that Ms. Shelley’s husband, the poet Percy Bysshe Shelley, encouraged her to write, and has not harmed sales at ‘Frank’n’Stein,’ the fast-food chain whose hot dogs and beer I find delectably inorganic.” Safire went on to quote the American Dialect Society’s Laurence Horn, who lamented that despite the ’90s flowering of frankenfruits and frankenpigs, people hadn’t used Frankensense to describe “the opposite of common sense,” as in “politicians’ motivations for a creatively stupid piece of legislation.”

A year later, however, Safire returned to franken- in dead earnest. In an op-ed for the Times avowing the ethical value of embryonic stem cell research, the columnist suggested that a White House conference on bioethics would salve the fears of Americans concerned about “the real dangers of the slippery slope to Frankenscience.”

All of this is to say that franken-, the prefix we use to talk about human efforts to interfere with nature, flips between “funny” and “scary” with ease. Like Shelley’s monster himself, an ungainly patchwork of salvaged parts, it can seem goofy until it doesn’t—until it taps into an abiding anxiety that technology raises in us, a fear of overstepping.

Waldman’s piece hints at how language can shape discussions while retaining a rather playful quality.

This series looks to be a good introduction while being a bit problematic in spots, which roughly sums up my conclusion about their ‘nano’ series in my Oct. 7, 2016 posting titled: Futurography’s nanotechnology series: a digest.

By the way, I noted the mention of CRISPR as it brought up an issue that they don’t appear to be addressing in this series (perhaps they will do this elsewhere?): intellectual property.

There’s a patent dispute over CRISPR as noted in this American Chemical Society’s Chemistry and Engineering News Jan. 9, 2017 video,

Playing God

This series on Frankenstein is taking on other contentious issues. A perennial favourite is ‘playing God’ as noted in Bina Venkataraman’s Jan. 11, 2017 essay on the topic,

Since its publication nearly 200 years ago, Shelley’s gothic novel has been read as a cautionary tale of the dangers of creation and experimentation. James Whale’s 1931 film took the message further, assigning explicitly the hubris of playing God to the mad scientist. As his monster comes to life, Dr. Frankenstein, played by Colin Clive, triumphantly exclaims: “Now I know what it feels like to be God!”

The admonition against playing God has since been ceaselessly invoked as a rhetorical bogeyman. Secular and religious, critic and journalist alike have summoned the term to deride and outright dismiss entire areas of research and technology, including stem cells, genetically modified crops, recombinant DNA, geoengineering, and gene editing. As we near the two-century commemoration of Shelley’s captivating story, we would be wise to shed this shorthand lesson—and to put this part of the Frankenstein legacy to rest in its proverbial grave.

The trouble with the term arises first from its murkiness. What exactly does it mean to play God, and why should we find it objectionable on its face? All but zealots would likely agree that it’s fine to create new forms of life through selective breeding and grafting of fruit trees, or to use in-vitro fertilization to conceive life outside the womb to aid infertile couples. No one objects when people intervene in what some deem “acts of God,” such as earthquakes, to rescue victims and provide relief. People get fully behind treating patients dying of cancer with “unnatural” solutions like chemotherapy. Most people even find it morally justified for humans to mete out decisions as to who lives or dies in the form of organ transplant lists that prize certain people’s survival over others.

So what is it—if not the imitation of a deity or the creation of life—that inspires people to invoke the idea of “playing God” to warn against, or even stop, particular technologies? A presidential commission charged in the early 1980s with studying the ethics of genetic engineering of humans, in the wake of the recombinant DNA revolution, sheds some light on underlying motivations. The commission sought to understand the concerns expressed by leaders of three major religious groups in the United States—representing Protestants, Jews, and Catholics—who had used the phrase “playing God” in a 1980 letter to President Jimmy Carter urging government oversight. Scholars from the three faiths, the commission concluded, did not see a theological reason to flat-out prohibit genetic engineering. Their concerns, it turned out, weren’t exactly moral objections to scientists acting as God. Instead, they echoed those of the secular public; namely, they feared possible negative effects from creating new human traits or new species. In other words, the religious leaders who called recombinant DNA tools “playing God” wanted precautions taken against bad consequences but did not inherently oppose the use of the technology as an act of human hubris.

She presents an interesting argument and offers this as a solution,

The lesson for contemporary science, then, is not that we should cease creating and discovering at the boundaries of current human knowledge. It’s that scientists and technologists ought to steward their inventions into society, and to more rigorously participate in public debate about their work’s social and ethical consequences. Frankenstein’s proper legacy today would be to encourage researchers to address the unsavory implications of their technologies, whether it’s the cognitive and social effects of ubiquitous smartphone use or the long-term consequences of genetically engineered organisms on ecosystems and biodiversity.

Some will undoubtedly argue that this places an undue burden on innovators. Here, again, Shelley’s novel offers a lesson. Scientists who cloister themselves as Dr. Frankenstein did—those who do not fully contemplate the consequences of their work—risk later encounters with the horror of their own inventions.

At a guess, Venkataraman seems to be assuming that if scientists communicate and make their case, the public will cease to panic over moralistic and other concerns. My understanding is that social scientists have found this is not the case. Someone may understand the technology quite well and still oppose it.

Frankenstein and anti-vaxxers

The Jan. 16, 2017 essay by Charles Kenny is the weakest of the lot, so far (Note: Links have been removed),

In 1780, University of Bologna physician Luigi Galvani found something peculiar: When he applied an electric current to the legs of a dead frog, they twitched. Thirty-seven years later, Mary Shelley had Galvani’s experiments in mind as she wrote her fable of Faustian overreach, wherein Dr. Victor Frankenstein plays God by reanimating flesh.

And a little less than halfway between those two dates, English physician Edward Jenner demonstrated the efficacy of a vaccine against smallpox—one of the greatest killers of the age. Given the suspicion with which Romantic thinkers like Shelley regarded scientific progress, it is no surprise that many at the time damned the procedure as against the natural order. But what is surprising is how that suspicion continues to endure, even after two centuries of spectacular successes for vaccination. This anti-vaccination stance—which now infects even the White House—demonstrates the immense harm that can be done by excessive distrust of technological advance.

Kenny employs history as a framing device. Crudely, Galvani’s experiments led to Mary Shelley’s Frankenstein which is a fable about ‘playing God’. (Kenny seems unaware there are many other readings of and perspectives on the book.) As for his statement ” … the suspicion with which Romantic thinkers like Shelley regarded scientific progress … ,” I’m not sure how he arrived at his conclusion about Romantic thinkers. According to Richard Holmes (in his book, The Age of Wonder: How the Romantic Generation Discovered the Beauty and Terror of Science), their relationship to science was more complex. Percy Bysshe Shelley ran ballooning experiments and wrote poetry about science, which included footnotes for the literature and concepts he was referencing; John Keats was a medical student prior to his establishment as a poet; and Samuel Taylor Coleridge (The Rime of the Ancient Mariner, etc.) maintained a healthy correspondence with scientists of the day sometimes influencing their research. In fact, when you analyze the matter, you realize even scientists are, on occasion, suspicious of science.

As for the anti-vaccination wars, I wish this essay had been more thoughtful. Yes, Andrew Wakefield’s research showing a link between MMR (measles, mumps, and rubella) vaccinations and autism is a sham. However, having concerns and suspicions about technology does not render you a fool who hasn’t progressed from 18th/19th century concerns and suspicions about science and technology. For example, vaccines are being touted for all kinds of things, the latest being a possible antidote to opiate addiction (see Susan Gaidos’ June 28, 2016 article for Science News). Are we going to be vaccinated for everything? What happens when you keep piling vaccination on top of vaccination? Instead of a debate, the discussion has devolved to: “I’m right and you’re wrong.”

For the record, I’m grateful for the vaccinations I’ve had and the diminishment of diseases that were devastating and seem to be making a comeback with this current anti-vaccination fever. That said, I think there are some important questions about vaccines.

Kenny’s essay could have been a nuanced discussion of vaccines that have clearly raised the bar for public health and some of the concerns regarding the current pursuit of yet more vaccines. Instead, he’s been quite dismissive of anyone who questions vaccination orthodoxy.

The end of this piece

There will be more essays in Slate’s Frankenstein series but I don’t have time to digest and write commentary for all of them.

Please use this piece as a critical counterpoint to some of the series and, if I’ve done my job, you’ll critique this critique. Please do let me know if you find any errors, or add your own opinion or critique in the Comments of this blog.

Fusing graphene flakes for 3D graphene structures that are 10x as strong as steel

A Jan. 6, 2017 news item on Nanowerk describes how geometry may have as much to do with the strength of 3D graphene structures as the graphene used to create them, or more,

A team of researchers at MIT [Massachusetts Institute of Technology] has designed one of the strongest lightweight materials known, by compressing and fusing flakes of graphene, a two-dimensional form of carbon. The new material, a sponge-like configuration with a density of just 5 percent, can have a strength 10 times that of steel.

In its two-dimensional form, graphene is thought to be the strongest of all known materials. But researchers until now have had a hard time translating that two-dimensional strength into useful three-dimensional materials.

The new findings show that the crucial aspect of the new 3-D forms has more to do with their unusual geometrical configuration than with the material itself, which suggests that similar strong, lightweight materials could be made from a variety of materials by creating similar geometric features.

The findings are being reported today [Jan. 6, 2017] in the journal Science Advances, in a paper by Markus Buehler, the head of MIT’s Department of Civil and Environmental Engineering (CEE) and the McAfee Professor of Engineering; Zhao Qin, a CEE research scientist; Gang Seob Jung, a graduate student; and Min Jeong Kang MEng ’16, a recent graduate.

A Jan. 6, 2017 MIT news release (also on EurekAlert), which originated the news item, describes the research in more detail,

Other groups had suggested the possibility of such lightweight structures, but lab experiments so far had failed to match predictions, with some results exhibiting several orders of magnitude less strength than expected. The MIT team decided to solve the mystery by analyzing the material’s behavior down to the level of individual atoms within the structure. They were able to produce a mathematical framework that very closely matches experimental observations.

Two-dimensional materials — basically flat sheets that are just one atom in thickness but can be indefinitely large in the other dimensions — have exceptional strength as well as unique electrical properties. But because of their extraordinary thinness, “they are not very useful for making 3-D materials that could be used in vehicles, buildings, or devices,” Buehler says. “What we’ve done is to realize the wish of translating these 2-D materials into three-dimensional structures.”

The team was able to compress small flakes of graphene using a combination of heat and pressure. This process produced a strong, stable structure whose form resembles that of some corals and microscopic creatures called diatoms. These shapes, which have an enormous surface area in proportion to their volume, proved to be remarkably strong. “Once we created these 3-D structures, we wanted to see what’s the limit — what’s the strongest possible material we can produce,” says Qin. To do that, they created a variety of 3-D models and then subjected them to various tests. In computational simulations, which mimic the loading conditions in the tensile and compression tests performed in a tensile loading machine, “one of our samples has 5 percent the density of steel, but 10 times the strength,” Qin says.
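As a rough back-of-envelope check of what those quoted figures imply, here is the arithmetic in a few lines of Python. The steel numbers below are generic textbook values I've assumed for illustration, not figures from the paper; only the "5 percent the density, 10 times the strength" relationship comes from the quote above.

```python
# Back-of-envelope: strength-to-weight comparison implied by the quote
# "5 percent the density of steel, but 10 times the strength".
steel_density = 7850.0    # kg/m^3, typical structural steel (assumed value)
steel_strength = 400.0e6  # Pa, rough strength of mild steel (assumed value)

# Apply the quoted ratios to get the graphene assembly's figures
assembly_density = 0.05 * steel_density   # "5 percent the density"
assembly_strength = 10 * steel_strength   # "10 times the strength"

# Specific strength = strength / density; compare the two materials
ratio = (assembly_strength / assembly_density) / (steel_strength / steel_density)
print(ratio)  # -> 200.0
```

In other words, whatever absolute steel values you plug in, the quoted ratios imply a strength-to-weight ratio roughly 200 times that of steel.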

Buehler says that what happens to their 3-D graphene material, which is composed of curved surfaces under deformation, resembles what would happen with sheets of paper. Paper has little strength along its length and width, and can be easily crumpled up. But when made into certain shapes, for example rolled into a tube, suddenly the strength along the length of the tube is much greater and can support substantial weight. Similarly, the geometric arrangement of the graphene flakes after treatment naturally forms a very strong configuration.

The new configurations have been made in the lab using a high-resolution, multimaterial 3-D printer. They were mechanically tested for their tensile and compressive properties, and their mechanical response under loading was simulated using the team’s theoretical models. The results from the experiments and simulations matched accurately.

The new, more accurate results, based on atomistic computational modeling by the MIT team, ruled out a possibility proposed previously by other teams: that it might be possible to make 3-D graphene structures so lightweight that they would actually be lighter than air, and could be used as a durable replacement for helium in balloons. The current work shows, however, that at such low densities, the material would not have sufficient strength and would collapse from the surrounding air pressure.
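A quick sketch of the arithmetic behind that ruled-out lighter-than-air idea (the densities below are standard reference values I've assumed, not figures from the paper):

```python
# To float in air, a structure's bulk density must be below air's.
air_density = 1.225     # kg/m^3 at sea level (assumed standard value)
steel_density = 7850.0  # kg/m^3 (assumed value)

# The sample described in the article: 5 percent the density of steel
sample_density = 0.05 * steel_density
print(sample_density)  # -> 392.5 kg/m^3, still hundreds of times denser than air

# Density needed to float, expressed as a fraction of steel's density
max_fraction_of_steel = air_density / steel_density
print(max_fraction_of_steel)  # roughly 0.00016, i.e. about 0.016% of steel
```

So a floating version would need to be hundreds of times less dense than the already sponge-like samples, which is where, per the study, the material loses the strength to resist air pressure.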

But many other possible applications of the material could eventually be feasible, the researchers say, for uses that require a combination of extreme strength and light weight. “You could either use the real graphene material or use the geometry we discovered with other materials, like polymers or metals,” Buehler says, to gain similar advantages of strength combined with advantages in cost, processing methods, or other material properties (such as transparency or electrical conductivity).

“You can replace the material itself with anything,” Buehler says. “The geometry is the dominant factor. It’s something that has the potential to transfer to many things.”

The unusual geometric shapes that graphene naturally forms under heat and pressure look something like a Nerf ball — round, but full of holes. These shapes, known as gyroids, are so complex that “actually making them using conventional manufacturing methods is probably impossible,” Buehler says. The team used 3-D-printed models of the structure, enlarged to thousands of times their natural size, for testing purposes.

For actual synthesis, the researchers say, one possibility is to use the polymer or metal particles as templates, coat them with graphene by chemical vapor deposition before heat and pressure treatments, and then chemically or physically remove the polymer or metal phases to leave 3-D graphene in the gyroid form. For this, the computational model given in the current study provides a guideline to evaluate the mechanical quality of the synthesis output.

The same geometry could even be applied to large-scale structural materials, they suggest. For example, concrete for a structure such as a bridge might be made with this porous geometry, providing comparable strength with a fraction of the weight. This approach would have the additional benefit of providing good insulation because of the large amount of enclosed airspace within it.

Because the shape is riddled with very tiny pore spaces, the material might also find application in some filtration systems, for either water or chemical processing. The mathematical descriptions derived by this group could facilitate the development of a variety of applications, the researchers say.

“This is an inspiring study on the mechanics of 3-D graphene assembly,” says Huajian Gao, a professor of engineering at Brown University, who was not involved in this work. “The combination of computational modeling with 3-D-printing-based experiments used in this paper is a powerful new approach in engineering research. It is impressive to see the scaling laws initially derived from nanoscale simulations resurface in macroscale experiments under the help of 3-D printing,” he says.

This work, Gao says, “shows a promising direction of bringing the strength of 2-D materials and the power of material architecture design together.”

There’s a video describing the work,

Here’s a link to and a citation for the paper,

The mechanics and design of a lightweight three-dimensional graphene assembly by Zhao Qin, Gang Seob Jung, Min Jeong Kang, and Markus J. Buehler. Science Advances 06 Jan 2017: Vol. 3, No. 1, e1601536 DOI: 10.1126/sciadv.1601536

This paper appears to be open access.

Understanding nanotechnology with Timbits; a peculiarly Canadian explanation

For the uninitiated, Timbits are also known as donut holes. Tim Hortons, founded by the late National Hockey League player Tim Horton, has taken such a hold on Canada’s language and culture that one of our scientists trying to explain nanotechnology thought it would be best understood in terms of Timbits. From a Jan. 14, 2017 article (How nanotechnology could change our lives) by Vanessa Lu for thestar.com,

The future is all in the tiny.

Known as nanoparticles, these are the tiniest particles, so small that we can’t see them or even imagine how small they are.

University of Waterloo’s Frank Gu paints a picture of their scale.

“Take a Timbit and start slicing it into smaller and smaller pieces, so small that every Canadian — about 35 million of us — can hold a piece of the treat,” he said. “And those tiny pieces are still a little bigger than a nanoparticle.”
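Out of curiosity, the analogy can be checked with some rough arithmetic. The Timbit dimensions below are my assumptions, not Gu's, so treat the result as an order-of-magnitude sketch:

```python
import math

# Slice one Timbit into ~35 million equal pieces (one per Canadian) and
# see how big each piece is compared with a nanoparticle (~1-100 nm).
timbit_diameter_cm = 3.5  # assumed; roughly the size of a donut hole
timbit_volume = (4 / 3) * math.pi * (timbit_diameter_cm / 2) ** 3  # ~22 cm^3

pieces = 35_000_000
piece_volume = timbit_volume / pieces     # cm^3 per piece
piece_edge_cm = piece_volume ** (1 / 3)   # edge of an equivalent cube

piece_edge_nm = piece_edge_cm * 1e7  # 1 cm = 10^7 nm
print(piece_edge_nm)  # ~86,000 nm, i.e. still well above the nanoscale
```

With these assumptions each piece comes out around 86 micrometres across, comfortably larger than a nanoparticle, which is the thrust of Gu's analogy.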

For years, consumers have seen the benefits of nanotechnology in everything from shrinking cellphones to ultrathin televisions. Apple’s iPhones have become more powerful as they have become smaller — where a chip now holds billions of transistors.

“As you go smaller, it creates less footprint and more power,” said Gu, who holds the Canada research chair in advanced targeted delivery systems. “FaceTime, Skype — they are all powered by nanotechnology, with their retina display.”

Lu wrote a second January 14, 2017 article (Researchers developing nanoparticles to purify water) for thestar.com,

When scientists go with their gut or act on a hunch, it can pay off.

For Tim Leshuk, a PhD student in nanotechnology at the University of Waterloo, he knew it was a long shot.

Leshuk had been working with Frank Gu, who leads a nanotechnology research group, on using tiny nanoparticles that have been tweaked with certain properties to purify contaminated water.

Leshuk was working on the process, treating dirty water such as that found in Alberta’s oilsands, with the nanoparticles combined with ultraviolet light. He wondered what might happen if exposed to actual sunlight.

“I didn’t have high hopes,” he said. “For the heck of it, I took some beakers out and put them on the roof. And when I came back, it was far more effective than we had seen with regular UV light.

“It was high-fives all around,” Leshuk said. “It’s not like a Brita filter or a sponge that just soaks up pollutants. It completely breaks them down.”

Things are accelerating quickly, with a spinoff company now formally created called H2nanO, with more ongoing tests scheduled. The research has drawn attention from oilsands companies, and [a] large pre-pilot project to be funded by the Canadian Oil Sands Innovation Alliance is due to get under way soon.

The excitement comes because it’s an entirely green process, converting solar energy for cleanup, and the nanoparticle material is reusable, over and over.

It’s good to see a couple of articles about nanotechnology. The work by Tim Leshuk was highlighted here in a Dec. 1, 2015 posting titled: New photocatalytic approach to cleaning wastewater from oil sands. I see the company wasn’t mentioned in that posting, so it must be new; you can find H2nanO here.

Discussion of a divisive topic: the Oilsands

As for the oilsands, it’s been an interesting few days, with Prime Minister Justin Trudeau’s suggestion that dependence on the oilsands would be phased out causing a furor of sorts. From a Jan. 13, 2017 article by James Wood for the Calgary Herald,

Prime Minister Justin Trudeau’s musings about phasing out the oilsands Friday [Jan. 13, 2017] were met with a barrage of criticism from Alberta’s conservative politicians and a pledge from Premier Rachel Notley that the province’s energy industry was “not going anywhere, any time soon.”

Asked at a town hall event in Peterborough [Ontario] about the federal government’s recent approval of Kinder Morgan’s Trans Mountain pipeline expansion, Trudeau reiterated his longstanding remarks that he is attempting to balance economic and environmental concerns.

“We can’t shut down the oilsands tomorrow. We need to phase them out. We need to manage the transition off of our dependence on fossil fuels but it’s going to take time and in the meantime we have to manage that transition,” he added.

Northern Alberta’s oilsands are a prime target for environmentalists because of their significant output of greenhouse gas emissions linked to global climate change.

Trudeau, who will be in Calgary for a cabinet retreat on Jan. 23 and 24 [2017], also said again that it is the responsibility of the national government to get Canadian resources to market.

Meanwhile, Jane Fonda, Hollywood actress, weighed in on the issue of the Alberta oilsands with this (from a Jan. 11, 2017 article by Tristan Hopper for the National Post),

Fort McMurrayites might have assumed the celebrity visits would stop after the city was swept first by recession, and then by wildfire.

Or when the provincial government introduced a carbon tax and started phasing out coal.

And surely, with Donald Trump in the White House, even the oiliest corner of Canada would shift to the activist back burner.

But no; here comes Jane Fonda.

“We don’t need new pipelines,” she told a Wednesday [Jan. 11, 2017] press conference at the University of Alberta where she also dismissed Prime Minister Justin Trudeau as a “good-looking Liberal” who couldn’t be trusted.

Saying that her voice was joined with the “Indigenous people of Canada,” Fonda explained her trip to Alberta by saying “when you’re famous you can help amplify the voices of people that can’t necessarily get a lot of press people to come out.”

Fonda is in Alberta at the invitation of Greenpeace, which has brought her here in support of the Treaty Alliance Against Tar Sands Expansion — a group of Canadian First Nations and U.S. tribes opposed to new pipelines to the Athabasca oilsands.

Appearing alongside Fonda, at a table with a sign reading “Respect Indigenous Decisions,” was Grand Chief Stewart Phillip, who, as leader of the Union of B.C. Indian Chiefs, has led anti-pipeline protests and litigation in British Columbia.

“The future is going to be incredibly litigious,” he said in reference to the approved expansion of the Trans-Mountain pipeline.

The event also included Grand Chief Derek Nepinak of the Assembly of Manitoba Chiefs, which is leading a legal challenge to federal approval of the Line 3 pipeline.

Although much of Athabasca’s oil production now comes from “steam-assisted gravity drainage” projects that require minimal surface disturbance, on Tuesday Fonda took the requisite helicopter tour of a Fort McMurray-area open pit mine.

As you can see, there are not going to be any easy answers.

Sea sponges don’t buckle under pressure

You wouldn’t think a sponge (the sea creature) was particularly tough, but it is, according to a Jan. 4, 2017 news item on Nanowerk,

Judging by their name alone, orange puffball sea sponges might seem unlikely paragons of structural strength. But maintaining their shape at the bottom of the churning ocean is critical to the creatures’ survival, and new research shows that tiny structural rods in their bodies have evolved the optimal shape to avoid buckling under pressure.

The rods, called strongyloxea spicules, measure about 2 millimeters long and are thinner than a human hair. Hundreds of them are bundled together, forming stiff rib-like structures inside the orange puffball’s spongy body. It was the odd and remarkably consistent shape of each spicule that caught the eye of Brown University engineers Haneesh Kesari and Michael Monn. Each one is symmetrically tapered along its length — going gradually from fatter in the middle to thinner at the ends.

Caption: Tiny rods found inside the bodies of orange puffball sea sponges have an interesting tapered shape. That shape, new research shows, turns out to be a match for the Clausen profile, a column shape shown to be optimal for resistance to buckling failure. Credit: Michael Monn, Haneesh Kesari / Brown University

A Jan. 4, 2017 Brown University news release on EurekAlert, which originated the news item, describes the research in more detail,

Using structural mechanics models and a bit of digging in obscure mathematics journals, Monn and Kesari showed the peculiar shape of the spicules to be optimal for resistance to buckling, the primary mode of failure for slender structures. This natural shape could provide a blueprint for increasing the buckling resistance in all kinds of slender human-made structures, from building columns to bicycle spokes to arterial stents, the researchers say.

“This is one of the rare examples that we’re aware of where a natural structure is not just well-suited for a given function, but actually approaches a theoretical optimum,” said Kesari, an assistant professor of engineering at Brown. “There’s no engineering analog for this shape — we don’t see any columns or other slender structures that are tapered in this way. So in this case, nature has shown us something quite new that we think could be useful in engineering.”

The findings are published in the journal Scientific Reports.

Function and form

Orange puffball sponges (Tethya aurantia) are native to the Mediterranean Sea. They live mainly in rocky coastal environments, where they’re subject to the constant stress of underwater waves and tidal forces. Sponges are filter feeders — they pump water through their bodies to extract nutrients and oxygen. To do this, their bodies need to be porous and compliant, but they also need enough stiffness to avoid being deformed too much.

“If you compress them too much, you’re essentially choking them,” Kesari said. “So maintaining their stiffness is critical to their survival.”

And that means the spicules, which make up the rib-like structures that give sponges their stiffness, are critical components. When Monn and Kesari saw the shapes of the spicules under a microscope, the consistency of the tapered shape from spicule to spicule was hard to miss.

“We saw the shape and wondered if there might be an engineering principle at work here,” Kesari said.

To figure that out, the researchers first needed to understand what forces were acting on each individual spicule. So Monn and Kesari developed a structural mechanics model of spicules bundled within a sponge’s ribs. The model showed that the mismatch in stiffness between the bulk of the sponge’s soft body and the more rigid spicules causes each spicule to experience primarily one type of mechanical loading — a compression load on each of its ends.

“You can imagine taking a toothpick and trying to squeeze it longways between your fingers,” Monn said. “That’s how these spicules see the world.”

The primary mode of failure for a structure with this mechanical load is through buckling. At a certain critical load, the structure starts to bend somewhere along its length. Once the bending starts, the force transferred by the load is amplified at the bending point, which causes the structure to break or collapse.
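The buckling failure described above is the classic Euler instability for slender columns. As a rough illustration (this is not the researchers’ model; the material and dimension values below are my assumptions for a spicule-sized glass rod, not measurements from the paper), Euler’s formula gives the critical compressive load for a pinned-pinned cylindrical column:

```python
import math

def euler_critical_load(E, I, L):
    """Euler critical buckling load (N) for a pinned-pinned slender column.

    E: Young's modulus (Pa)
    I: second moment of area of the cross-section (m^4)
    L: column length (m)
    """
    return math.pi ** 2 * E * I / L ** 2

def second_moment_circle(d):
    """Second moment of area (m^4) of a solid circular cross-section, diameter d."""
    return math.pi * d ** 4 / 64

# Illustrative, assumed values: silica E ~ 70 GPa, spicule length ~2 mm
# (as reported above), and an assumed mid-diameter of 30 micrometres.
E = 70e9   # Pa
L = 2e-3   # m
d = 30e-6  # m

P_cr = euler_critical_load(E, second_moment_circle(d), L)
print(f"Critical buckling load: {P_cr * 1000:.2f} mN")
```

Loads above this threshold cause the rod to bow sideways and fail, which is why a shape that raises the critical load without adding material is valuable.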

Once Kesari and Monn knew what forces were acting on the spicules and how they would fail, the next step was looking to see if there was anything special about them that helped them resist buckling. Scanning electron microscope images of the inside of a spicule and other tests showed that they were monolithic silica — essentially glass.

“We could see that there was no funny business going on with the material properties,” Monn said. “If there was anything contributing to its mechanical performance, it would have to be the shape.”

Optimal shape

Kesari and Monn combed the literature to see if they could find anything on tapering in slender structures. They came up empty in the modern engineering literature. But they found something interesting published more than 150 years ago by a German scientist named Thomas Clausen.

In 1851, Clausen proposed that columns that are tapered toward their ends should have more buckling resistance than plain cylinders, which had been and still are the primary design for architectural columns. In the 1960s, mathematician Joseph Keller published an ironclad mathematical proof that the Clausen column was indeed optimal for resistance to buckling — having 33 percent better resistance than a cylinder. Even compared to a very similar shape — an ellipse, which is slightly fatter in the middle and pointier at the ends — the Clausen column had 18 percent better buckling resistance.
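A quick back-of-the-envelope check of the quoted ratios (my arithmetic, not from the paper): if the Clausen column resists buckling 33 percent better than a cylinder and 18 percent better than an ellipse-profiled column, the ellipse is implied to be roughly 13 percent better than the cylinder.

```python
# Relative critical buckling loads, cylinder normalized to 1.0,
# using only the percentages quoted in the news release.
cylinder = 1.0
clausen = 1.33 * cylinder  # "33 percent better resistance than a cylinder"
ellipse = clausen / 1.18   # Clausen has "18 percent better" resistance than the ellipse

print(f"cylinder: {cylinder:.2f}")
print(f"ellipse:  {ellipse:.2f}")  # roughly 1.13, i.e. ~13% better than a cylinder
print(f"Clausen:  {clausen:.2f}")
```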

Knowing what the optimal column shape is, Monn and Kesari started making precise dimensional measurements of dozens of spicules. They showed that their shapes were remarkably consistent and nearly identical to that of the Clausen column.

“The spicules were a match for the best shape of all possible column shapes,” Monn said.

It seems in this case, natural selection figured out something that engineers have not. Despite the fact that it’s been mathematically shown to be the optimal column shape, the Clausen profile isn’t widely known in the engineering community. Kesari and Monn hope this work might bring it out of the shadows.

“We see this as an addition to our library of structural designs,” Monn said. “We’re not just talking about an improvement of a few percent. This shape is 33 percent better than the cylinder, which is quite an improvement.”

In particular, the shape would be useful in a new generation of materials made from nanoscale truss structures. “It would be easy to 3-D print the Clausen profile into these materials, and you’d get a tremendous increase in buckling resistance, which is often how these materials fail.”

Lessons from nature

The field of bio-inspired engineering began at a time when many people viewed adaptive evolution as an unceasing march toward perfection. If that were true, scientists should find untold numbers of optimal structures in nature.

But the modern understanding of evolution is a bit different. It’s now understood that in order for a trait to be conserved by natural selection, it doesn’t need to be optimal. It just needs to be good enough to work. That has put a bit of a damper on the enthusiasm for bio-inspired engineering, Kesari and Monn say.

However, they say, this work shows that nearly optimal structures are out there if researchers look in the right places. In this case, they looked at creatures from a very old phylum — sea sponges are among the very first animals on Earth — with plenty of time to evolve under consistent selection pressures.

Sponges are also fairly simple creatures, so understanding the function of a given trait is relatively straightforward. In this case, the spicule appears to have one and only one job to do — provide stiffness. Compare that to, for example, human bone, which not only provides support but must also accommodate arteries, provide attachment points for muscles and house bone marrow. Those other functions may cause tradeoffs in adaptations for strength or stiffness.

“With the sponges, you have lots of evolutionary pressure, lots of time and opportunity to respond to that pressure, and functional elements that can be easily identified,” Kesari said.

With those as guiding principles, there may well be more ideal structures out there waiting to be found.

“This work shows that nature can hit an optimum,” Kesari said, “and the biological world can still be hiding completely new designs of considerable technological significance in plain sight.”

Here’s a link to and a citation for the paper,

A new structure-property connection in the skeletal elements of the marine sponge Tethya aurantia that guards against buckling instability by Michael A. Monn & Haneesh Kesari. Scientific Reports 7, Article number: 39547 (2017) doi:10.1038/srep39547 Published online: 04 January 2017

This paper is open access.

Kesari and Monn have researched sea sponges previously, as can be seen in my April 7, 2015 posting, which highlights their work on strength and the Venus’ flower basket sea sponge.