Canada’s Chief Science Advisor: the first annual report

Dr. Mona Nemer, Canada’s Chief Science Advisor, and her office have issued their 2018 annual report, the office’s first. (Brief bit of history: There was a similar position, National Science Advisor, from 2004 to 2008. Dr. Arthur Carty, the advisor, was dumped and the position eliminated when the former opposition party won the federal election and formed a new government.)

The report can be found in HTML format here, which is where you’ll also find a link to download the full 40 pp. report as a PDF.

Being an inveterate ‘skip to the end’ kind of a reader, I took a boo at the Conclusion,

A chief science advisor plays an important role in building consensus and supporting positive change to our national institutions. Having laid the foundation for the function of the Chief Science Advisor in year one, I look forward to furthering the work that has been started and responding to new requests from the Prime Minister, the Minister of Science and members of Cabinet in my second year. I intend to continue working closely with agency heads, deputy ministers and the science community to better position Canada for global leadership in science and innovation.

I think the conclusion could have done with a little more imagination and/or personality, even by government report standards. There is some interesting material in other parts of the report, and on the whole it is written in a pleasant, accessible style, but it raises questions.

Let’s start here with one of the sections that raises questions,

The Budget 2018 commitment of $2.8 billion to build multi-purpose, collaborative federal science and technology facilities presents a once-in-a-generation opportunity to establish a strong foundation for Canada’s future federal science enterprise. Similarly momentous opportunities exist to create a Digital Research Infrastructure Strategy [emphasis mine] for extramural science and a strategic national approach to Major Research Facilities. This is an important step forward for the Government of Canada in bringing principles and standards for federal science in line with practices in the broader scientific community; as such, the Model Policy will not only serve for the benefit of federal science, but will also help facilitate collaboration with the broader scientific community.

Over the past year, my office has participated in the discussion around these potentially transformative initiatives. We offered perspectives on the evidence needed to guide decisions regarding the kind of science infrastructure that can support the increasingly collaborative and multidisciplinary nature of science.

Among other things, I took an active part in the deliberations of the Deputy Ministers Science Committee (DMSC) and in the production of recommendations to Cabinet with respect to federal science infrastructure renewal. I have also analyzed the evolving nature and needs for national science facilities that support the Canadian science community.

Success in building the science infrastructure of the next 50 years and propelling Canadian science and research to new heights will require that the multiple decisions around these foundational initiatives dovetail based on a deep and integrated understanding of Canada’s science system and commitment to maintaining a long-term vision.

Ultimately, these infrastructure investments will need to support a collective vision and strategy for science in Canada, including which functions federal science should perform, and which should be shared with or carried out separately by academic and private sector researchers, in Canada and abroad. These are some of the essential considerations that must guide questions of infrastructure and resource allocation.

Canadian government digital infrastructure is a pretty thorny issue and while the article I’m referencing is old, I strongly suspect they still have many of the same problems. I’m going to excerpt sections that are focused on IT, specifically, the data centres and data storage. Prepare yourself for a bit of a shock as you discover the government did not know how many data centres it had. From a December 28, 2016 article by James Bagnall for the Ottawa Citizen,

Information technology [IT] offered an even richer vein of potential savings. For half a century, computer networks and software applications had multiplied willy-nilly as individual departments and agencies looked after their own needs. [emphases mine] The result was a patchwork of incompatible, higher-cost systems. Standardizing common, basic technologies such as email, data storage and telecommunications seemed logical. [emphasis mine]

It had been tried before. But attempts to centralize the buying of high-tech gear and services had failed, largely because federal departments were allowed to opt out. Most did so. They did not want to give up control of their IT networks to a central agency.

The PCO determined this time would be different. The prime minister had the authority to create a new federal department through a simple cabinet approval known as an order-in-council. Most departments, including a reluctant Canada Revenue Agency and Department of National Defence, would be forced to carve out a significant portion of their IT groups and budgets — about 40 per cent on average — and hand them over to Shared Services.

Crucially, the move would not be subject to scrutiny by Parliament. And so Shared Services [Shared Services Canada] was born on Aug. 3, 2011.

Speed was demanded of the agency from the start. Minutes of meetings involving senior Shared Services staff are studded with references to “tight schedules” and the “urgency” of getting projects done.

Part of that had to do with the sheer age of the government’s infrastructure. The hardware was in danger of breaking down and the underlying software for many applications was so old that suppliers such as Microsoft, PeopleSoft and Adobe had stopped supporting it. [emphases mine]

The faster Shared Services could install new networks, the less money it would be forced to throw at solving the problems caused by older technology.

… Because the formation of Shared Services had been shrouded in such secrecy, the hiring of experienced outsiders happened last minute. Grant Westcott would join Shared Services as COO in mid-September.

Sixty-three years old at the time, Westcott had served as assistant deputy minister of Public Services in the late 1990s, when he managed the federal government’s telecommunications networks. In that role, he oversaw the successful effort to prepare systems for the year 2000.

More relevant to his new position at Shared Services, Westcott had also worked for nearly a decade at CIBC. As executive vice-president, he had been instrumental in overhauling the bank’s IT infrastructure. Twenty-two of the Canadian Imperial Bank of Commerce’s [CIBC] data centres were shrunk into just two, saving the bank more than $40 million annually in operating costs. Among other things, Westcott and his team consolidated 15 global communications networks into a single voice and data system. More savings resulted.

It wasn’t without glitches. CIBC was forced on several occasions to apologize to customers for a breakdown in certain banking services.

Nevertheless, Westcott’s experience suggested this was not rocket science, at least not in the private sector. [emphasis mine] Streamlining and modernizing IT networks could be done under the right conditions. …

While Shared Services is paying Bell only for the mailboxes it has already moved [there was an email project, which has since been resolved], the agency is still responsible for keeping the lights on at the government’s legacy data centres [emphasis mine]— where most of the electronic mailboxes, and much of the government’s software applications, for that matter, are currently stored.

These jumbles of computers, servers, wires and cooling systems are weighed down by far too much history — and it took Shared Services a long time to get its arms around it.

“We were not created with a transformation plan already in place because to do that, we needed to know what we had,” Forand [Liseanne Forand, then president of Shared Services Canada] explained to members of Parliament. “So we spent a year counting things.” [emphasis mine]

Shared Services didn’t have to start from scratch. Prior to the agency’s 2011 launch, the PCO’s administrative review tried to estimate the number of data centres in government by interviewing the IT czars for each of the departments. Symptomatic of the far-flung nature of the government’s IT networks — and how loosely these were managed — the PCO [Privy Council Office] didn’t even come close.

“When I was at Privy Council, we thought there might be about 200 data centres,” Benoit Long, senior assistant deputy minister of Shared Services told the House of Commons committee last May [2016], “After a year, we had counted 495 and I am still discovering others today.” [emphasis mine]

Most of these data centres were small — less than 1,000 square feet — often set up by government scientists who didn’t bother telling their superiors. Just 22 were larger than 5,000 square feet — and most of these were in the National Capital Region. These were where the bulk of the government’s data and software applications were stored.

In 2012, Grant Westcott organized a six-week tour of these data centres to see for himself what Shared Services was dealing with. According to an agency employee familiar with the tour’s findings, Westcott came away profoundly shocked.

Nearly all the large data centres were decrepit [emphasis mine], undercapitalized and inefficient users of energy. Yet just one was in the process of being replaced. This was a Heron Road facility with a leaking roof, considered key because it housed Canada Revenue Agency [CRA] data.

Bell Canada had been awarded a contract to build and operate a new data centre in Buckingham. The CRA data would be transferred there in the fall of 2013. A separate part of that facility is also meant to host the government’s email system.

The remaining data centres offered Westcott a depressing snapshot. [emphasis mine]

Most of the gear was housed in office towers, rather than facilities tailormade for heavy-duty electronics. With each new generation of computer or servers came requirements for more power and more robust cooling systems. The entire system was approaching a nervous breakdown.[emphasis mine]

Not only were the data centres inherited by Shared Services running out of space, the arrangements for emergency power bordered on makeshift. One Defence Department data centre, for instance, relied [on] a backup generator powered by a relatively ancient jet engine. [emphases mine] While unconventional, there was nothing unique about pressing military hardware into this type of service. The risk had to do with training — the older the jet engine, the fewer employees there were who knew how to fix it. [emphasis mine]

The system for backing up data wasn’t much better. At Statistics Canada — where Westcott’s group was actually required to swear an oath to keep the data confidential before entering the storage rooms — the backup procedure called for storing copies of the data in a separate but nearby facility. While that’s OK for most disasters, the relative proximity still meant Statcan’s repository of historical data was vulnerable to a bombing or major earthquake.

Two and a quarter years later, I imagine they’ve finished counting their data centres, but how much progress they might have made on replacing and upgrading the hardware and the facilities is not known to me. It’s hard to believe that all of the fixes have been made, especially after reading the Chief Science Advisor’s 2018 annual report (the relevant bits are coming up shortly).

If a more comprehensive view of the current government digital infrastructure interests you, I highly recommend Bagnall’s article. It’s heavy going for someone who’s not part of the Ottawa scene, but it’s worth the effort as it gives you some insight into the federal government’s workings, which seem remarkably similar regardless of which party is in power. Plus, it’s really well written.

Getting back to the report, I realize it’s not within the Chief Science Advisor’s scope to discuss the entirety of the government’s digital infrastructure, but “… 40 percent of facilities are more than 50 years old …” doesn’t really convey the depth and breadth of the problem. Let’s start with an easy question: 40 percent of what? Just how many of these facilities are more than 50 years old? By the way, the situation, as the Chief Science Advisor’s Office describes it, gets a little worse; keep reading,

Over the past two years, the Treasury Board Secretariat has worked with the federal science community to develop an inventory of federal science infrastructure.

The project was successful in creating a comprehensive list of federal science facilities and documenting their state of repair. It reveals that some 40 percent of facilities are more than 50 years old and another 40 percent are more than 25 years old. The problem is most acute in the National Capital Region.

A similar inventory of research equipment is now underway. This will allow for effective coordination of research activities across organizations and with external collaborators. The Canada Foundation for Innovation, which provides federal funding toward the costs of academic research infrastructure, has created a platform upon which this work will build.

Bravo to the Treasury Board for tracking down those data centres and more. Following on Bagnall’s outline of the problems, it had to be discouraging work, necessary but discouraging. Also, was this part of the Shared Services Canada counting exercise? In any event, they have even more work ahead of them, both the Treasury Board and the Chief Science Advisor.

Couldn’t there have been a link to the inventory? Is it secret information? It would have been nice if Canada’s Chief Science Advisor’s website had offered additional information or a link to supplement the 2018 annual report.

Admittedly there probably aren’t that many people who’d like more information about infrastructure and federal science offices but surely they could somehow give us a little more detail. For example, I understand there’s an AI (artificial intelligence) initiative within the government and given some of the issues, such as the Phoenix payroll system debacle and the digital infrastructure consisting of ‘chewing gum and baling wire’, my confidence is shaken. Have they learned any lessons? The report doesn’t offer any assurance they are taking these ‘lessons’ into account as they forge onwards.

Lies, damned lies, and statistics

I’ve always liked that line about lies and I’ll get to its applicability with regard to the Chief Science Advisor’s 2018 report but first, from the Lies, Damned Lies, and Statistics Wikipedia entry (Note: Links have been removed),

“Lies, damned lies, and statistics” is a phrase describing the persuasive power of numbers, particularly the use of statistics to bolster weak arguments. It is also sometimes colloquially used to doubt statistics used to prove an opponent’s point.

The phrase was popularized in the United States by Mark Twain (among others), who attributed it to the British prime minister Benjamin Disraeli: “There are three kinds of lies: lies, damned lies, and statistics.” However, the phrase is not found in any of Disraeli’s works and the earliest known appearances were years after his death. Several other people have been listed as originators of the quote, and it is often erroneously attributed to Twain himself.[1]

Here’s a portion of the 2018 report, which is “using the persuasive power of numbers”, in this case, to offer a suggestion about why people mistrust science,

A recent national public survey found that eight in ten respondents wanted to know more about science and how it affects our world, and roughly the same percentage of people are comfortable knowing that scientific answers may not be definitive. 1 [emphases mine] While this would seem to be a positive result, it may also suggest why some people mistrust science, [emphasis mine] believing that results are fluid and can support several different positions. Again, this reveals the importance of effectively communicating science, not only to counter misinformation, but to inspire critical thinking and an appreciation for curiosity and discovery.

I was happy to see the footnote but surprised that the writer(s) simply trotted out a statistic without any hedging about the reliability of the data. Just because someone says that eight out of ten respondents feel a certain way doesn’t give me confidence in the statistic and I explain why in the next section.

They even added that ‘people are aware that scientific answers are not necessarily definitive’. So why not be transparent about the fallibility of statistics in your own report? If that’s not possible in the body of the report, then maybe put the more detailed, nuanced analysis in an appendix. Finally, how did they derive from the data the notion that people’s awareness of science as a process of refining and adjusting knowledge might lead to distrust of science? It’s possible, of course, but how do you make that leap based on the data you’re referencing? As for that footnote, where was the link to the report? Curious, I took a look at the report.

Ontario Science Centre and its statistics

For the curious, you can find the Ontario Science Centre Canadian Science Attitudes Research report (July 6, 2018) here, where you’ll see the methodology is a little light on detail. (The company doing this work, Leger, is a marketing research and analytics company.) Here’s the methodology, such as it is,


An online survey of 1501 Canadians was completed between June 18 and 26, 2018, using Leger’s online panel. The margin of error for this study was +/-2.5%, 19 times out of 20.


Where applicable, this year’s results have been compared back to a similar study done in 2017 (OSC Canadian Science Attitudes Research, August 2017). The use of arrows indicates significant changes between the two datasets.


Leger’s online panel has approximately 450,000 members nationally and has a retention rate of 90%.


Stringent quality assurance measures allow Leger to achieve the high-quality standards set by the company. As a result, its methods of data collection and storage outperform the norms set by WAPOR (The World Association for Public Opinion Research). These measures are applied at every stage of the project: from data collection to processing, through to analysis. We aim to answer our clients’ needs with honesty, total confidentiality, and integrity.

I didn’t find any more substantive description of the research methodology, but there is a demographic breakdown at the end of the report and, because they’ve given you the number of people surveyed (1501) at the beginning of the report, you can figure out just how many people fit into the various categories.
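As a sanity check, that +/-2.5% figure is exactly what the standard margin-of-error formula gives for a sample of 1501 at 95% confidence. Here’s a quick sketch (illustrative only: an opt-in online panel isn’t a true random sample, so the formula’s assumptions don’t strictly hold, which is part of why the methodology questions below matter):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a proportion at 95% confidence (z = 1.96).

    p = 0.5 is the worst case, which pollsters conventionally report.
    """
    return z * math.sqrt(p * (1 - p) / n)

n = 1501  # respondents in the Leger survey
print(f"Margin of error: +/-{margin_of_error(n) * 100:.1f}%")  # +/-2.5%

# The "eight in ten" finding (p = 0.8) has a slightly tighter interval:
print(f"For p = 0.8: +/-{margin_of_error(n, p=0.8) * 100:.1f}%")
```

The formula reproduces Leger’s stated figure, but it says nothing about panel composition, self-selection, or weighting, which is where the real uncertainty lives.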

Now, the questions: What is a Leger online panel? How can they confirm that the people who took the survey are Canadian? How did they put together this Leger panel, e.g., do people self-select? Why did they establish a category for people aged 65+? (Actually, in one section of the report, they make reference to 55+.) Read on for why that last one might be a good question.

I haven’t seen any surveys or polls originating in Canada that seem to recognize that a 65 year old is not an 80 year old. After all, they don’t lump in 20 year olds with 40 year olds, because these people are at different stages in their lifespans, often with substantive differences in their outlook. Maybe there are no differences between 65 year olds and 80 year olds, but we’ll never know because no one ever asks. When you add in Canada’s aging population, the tendency to lump all ‘old’ people together seems even more thoughtless.

These are just a few questions that spring to mind and most likely, the pollsters have a more substantive methodology. There may have been a perfectly good reason for not including more detail in the version of the report I’ve linked to. For example, a lot of people will stop reading a report that’s ‘bogged down’ in too much detail. Fair enough but why not give a link to a more substantive methodology or put it in an appendix?

One more thing: I couldn’t find any data in the Ontario Science Centre report that would support the Chief Science Advisor Office’s speculation about why people might not trust science. They did hedge, as they should (“… seem to be a positive result, it may also suggest why some people mistrust science …”), but the hedging follows directly after the “… eight in ten respondents …” statistic, which structurally suggests that there is data supporting the speculation. If there is, it’s not in the pollster’s report.

Empire building?

I just wish the office of the Chief Science Advisor had created the foundation to support this,

Establishing a National Science Advisory System

Science and scientific information permeate the work of the federal government, creating a continuous need for decision-makers to negotiate uncertainty in interpreting often variable and generally incomplete scientific evidence for policy making. With science and technology increasingly a part of daily life, the need for science advice will only intensify.

As such, my office has been asked to assess and recommend ways to improve the existing science advisory function [emphasis mine] within the federal government. To that end, we examined the science advisory systems in other countries, such as Australia, New Zealand, the United Kingdom, and the United States, as well as the European Union. [emphasis mine]

A key feature of all the systems was a network of department-level science advisors. These are subject matter experts who work closely with senior departmental officials and support the mandate of the chief science advisor. They stand apart from day-to-day operations and provide a neutral sounding board for senior officials and decision-makers evaluating various streams of information. They also facilitate the incorporation of evidence in decision-making processes and act as a link between the department and external stakeholders.

I am pleased to report that our efforts to foster a Canadian science advisory network are already bearing fruit. Four organizations [emphasis mine] have so far moved to create a departmental science advisor position, and the first incumbents at the Canadian Space Agency and the National Research Council [emphasis mine] are now in place. I have every confidence that the network of departmental science advisors will play an important role in enhancing science advice and science activities planning, especially on cross-cutting issues.

Who asked the office to assess and recommend ways to improve the existing science advisory function? By the way, the European Union axed the position of Chief Science Adviser, after Anne Glover, the first such adviser in their history, completed the term of her appointment in 2014 (see James Wilsdon’s November 13, 2014 article for the Guardian). Perhaps they meant countries in the European Union?

Which four organizations have created departmental science advisor positions? Presumably the Canadian Space Agency and the National Research Council were two of the organizations?

If you’re looking at another country’s science advisory systems do you have some publicly available information, perhaps a report and analysis? Is there a set of best practices? And, how do these practices fit into the Canadian context?

More generally, how much is this going to cost? Are these departmental advisors expected to report to the Canadian federal government’s Chief Science Advisor as well as their own ministry’s deputy minister or higher? Will the Office of the Chief Science Advisor need more staff?

I could be persuaded that increasing the number of advisors and increasing costs to support these efforts are good ideas, but it would be nice to see the Office of the Chief Science Advisor put some effort into persuasion rather than assuming that someone outside the tight nexus of federal government institutions based in Ottawa is willing to accept statements that aren’t detailed and/or supported by data.

Final thoughts

I’m torn. I’m supportive of the position of the Chief Science Advisor of Canada and happy to see a report. It looks like there’s some exciting work being done.

I also hold the Chief Science Advisor and the Office to a high standard. It’s understandable that they may have decided to jettison detail in favour of making the 2018 annual report more readable, but what I don’t understand is the failure to provide a more substantive report or information that supports and details the work. There’s a lack of transparency and, also, clarity. What do they mean by the word ‘science’? At a guess, they’re not including the social sciences.

On the whole, the report looks a little sloppy and that’s the last thing I expect from the Chief Science Advisor.

Change the shape of water with nanotubes

An August 24, 2018 news item on ScienceDaily describes a ‘shapeshifting’ water technique,

First, according to Rice University engineers, get a nanotube hole. Then insert water. If the nanotube is just the right width, the water molecules will align into a square rod.

Rice materials scientist Rouzbeh Shahsavari and his team used molecular models to demonstrate their theory that weak van der Waals forces between the inner surface of the nanotube and the water molecules are strong enough to snap the oxygen and hydrogen atoms into place.

Shahsavari referred to the contents as two-dimensional “ice,” because the molecules freeze regardless of the temperature. He said the research provides valuable insight on ways to leverage atomic interactions between nanotubes and water molecules to fabricate nanochannels and energy-storing nanocapacitors.

An August 24, 2018 Rice University news release (also on EurekAlert and received via email), which originated the news item, delves further,

Shahsavari and his colleagues built molecular models of carbon and boron nitride nanotubes with adjustable widths. They discovered boron nitride is best at constraining the shape of water when the nanotubes are 10.5 angstroms wide. (One angstrom is one hundred-millionth of a centimeter.)

The researchers already knew that hydrogen atoms in tightly confined water take on interesting structural properties. Recent experiments by other labs showed strong evidence for the formation of nanotube ice and prompted the researchers to build density functional theory models to analyze the forces responsible.

Shahsavari’s team modeled water molecules, which are about 3 angstroms wide, inside carbon and boron nitride nanotubes of various chiralities (the angles of their atomic lattices) and between 8 and 12 angstroms in diameter. They discovered that nanotubes in the middle diameters had the most impact on the balance between molecular interactions and van der Waals pressure that prompted the transition from a square water tube to ice.

“If the nanotube is too small and you can only fit one water molecule, you can’t judge much,” Shahsavari said. “If it’s too large, the water keeps its amorphous shape. But at about 8 angstroms, the nanotubes’ van der Waals force [if you’re not familiar with the term, see below the link and citation for my brief explanation] starts to push water molecules into organized square shapes.”
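To put those dimensions in perspective, here’s a back-of-the-envelope sketch of the scale arithmetic (the figures are the ones quoted above; the “molecule widths across” estimate is my own crude illustration, not part of the researchers’ model):

```python
ANGSTROM_TO_CM = 1e-8   # one angstrom is one hundred-millionth of a centimetre

water_width = 3.0                    # approximate width of a water molecule (angstroms)
tube_diameters = [8.0, 10.5, 12.0]   # modelled nanotube diameters (angstroms)

for d in tube_diameters:
    # rough count of water-molecule widths spanning the tube
    across = d / water_width
    print(f"{d:4.1f} angstrom tube = {d * ANGSTROM_TO_CM:.2e} cm, "
          f"~{across:.1f} water-molecule widths across")
```

In other words, only two to four water molecules fit across these tubes, which is why the confinement is tight enough for van der Waals forces to lock the molecules into a square arrangement.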

He said the strongest interactions were found in boron nitride nanotubes due to the particular polarization of their atoms.

Shahsavari said nanotube ice could find use in molecular machines or as nanoscale capillaries, or foster ways to deliver a few molecules of water or sequestered drugs to targeted cells, like a nanoscale syringe.

Lead author Farzaneh Shayeganfar, a former visiting scholar at Rice, is an instructor at Shahid Rajaee Teacher Training University in Tehran, Iran. Co-principal investigator Javad Beheshtian is a professor at Amirkabir University, Tehran.

Supercomputer resources were provided with support from the [US] National Institutes of Health and an IBM Shared Rice University Research grant.

Here’s a link to and a citation for the paper,

First Principles Study of Water Nanotubes Captured Inside Carbon/Boron Nitride Nanotubes by Farzaneh Shayeganfar, Javad Beheshtian, and Rouzbeh Shahsavari. Langmuir, DOI: 10.1021/acs.langmuir.8b00856 Publication Date (Web): August 23, 2018

Copyright © 2018 American Chemical Society

This paper is behind a paywall.

For the purposes of the posting, van der Waals force(s) are weak adhesive forces measured at the nanoscale. Humans don’t feel them (we’re too big), but gecko lizards can exploit those forces, which is why they are able to hang from the ceiling by a single toe. There’s a more informed description here, in the van der Waals force entry on Wikipedia.

Researchers, manufacturers, and administrators need to consider shared quality control challenges to advance the nanoparticle manufacturing industry

Manufacturing remains a bit of an issue where nanotechnology is concerned due to the difficulties of producing nanoparticles of a consistent size and type,

Electron micrograph showing gallium arsenide nanoparticles of varying shapes and sizes. Such heterogeneity [variation] can increase costs and limit profits when making nanoparticles into products. A new NIST study recommends that researchers, manufacturers and administrators work together to solve this, and other common problems, in nanoparticle manufacturing. Credit: A. Demotiere, E. Shevchenko/Argonne National Laboratory

The US National Institute of Standards and Technology (NIST) has produced a paper focusing on how nanoparticle manufacturing might become more effective, from an August 22, 2018 news item on ScienceDaily,

Nanoparticle manufacturing, the production of material units less than 100 nanometers in size (100,000 times smaller than a marble), is proving the adage that “good things come in small packages.” Today’s engineered nanoparticles are integral components of everything from the quantum dot nanocrystals coloring the brilliant displays of state-of-the-art televisions to the miniscule bits of silver helping bandages protect against infection. However, commercial ventures seeking to profit from these tiny building blocks face quality control issues that, if unaddressed, can reduce efficiency, increase production costs and limit commercial impact of the products that incorporate them.

To help overcome these obstacles, the National Institute of Standards and Technology (NIST) and the nonprofit World Technology Evaluation Center (WTEC) advocate that nanoparticle researchers, manufacturers and administrators “connect the dots” by considering their shared challenges broadly and tackling them collectively rather than individually. This includes transferring knowledge across disciplines, coordinating actions between organizations and sharing resources to facilitate solutions.

The recommendations are presented in a new paper in the journal ACS Applied Nano Materials.

An August 22, 2018 NIST news release, which originated the news item, describes how the authors of the ACS [American Chemical Society] Applied Nano Materials paper developed their recommendations,

“We looked at the big picture of nanoparticle manufacturing to identify problems that are common for different materials, processes and applications,” said NIST physical scientist Samuel Stavis, lead author of the paper. “Solving these problems could advance the entire enterprise.”

The new paper provides a framework to better understand these issues. It is the culmination of a study initiated by a workshop organized by NIST that focused on the fundamental challenge of reducing or mitigating heterogeneity, the inadvertent variations in nanoparticle size, shape and other characteristics that occur during their manufacture.

“Heterogeneity can have significant consequences in nanoparticle manufacturing,” said NIST chemical engineer and co-author Jeffrey Fagan.

In their paper, the authors noted that the most profitable innovations in nanoparticle manufacturing minimize heterogeneity during the early stages of the operation, reducing the need for subsequent processing. This decreases waste, simplifies characterization and improves the integration of nanoparticles into products, all of which save money.

The authors illustrated the point by comparing the production of gold nanoparticles and carbon nanotubes. For gold, they stated, the initial synthesis costs can be high, but the similarity of the nanoparticles produced requires less purification and characterization. Therefore, they can be made into a variety of products, such as sensors, at relatively low costs.

In contrast, the more heterogeneous carbon nanotubes are less expensive to synthesize but require more processing to yield those with desired properties. The added costs during manufacturing currently make nanotubes only practical for high-value applications such as digital logic devices.

“Although these nanoparticles and their end products are very different, the stakeholders in their manufacture can learn much from each other’s best practices,” said NIST materials scientist and co-author J. Alexander Liddle. “By sharing knowledge, they might be able to improve both seemingly disparate operations.”

Finding ways like this to connect the dots, the authors said, is critically important for new ventures seeking to transfer nanoparticle technologies from laboratory to market.

“Nanoparticle manufacturing can become so costly that funding expires before the end product can be commercialized,” said WTEC nanotechnology consultant and co-author Michael Stopa. “In our paper, we outlined several opportunities for improving the odds that new ventures will survive their journeys through this technology transfer ‘valley of death.’”

Finally, the authors considered how manufacturing challenges and innovations are affecting the ever-growing number of applications for nanoparticles, including those in the areas of electronics, energy, health care and materials.

Here’s a link to and a citation for the paper,

Nanoparticle Manufacturing – Heterogeneity through Processes to Products by Samuel M. Stavis, Jeffrey A. Fagan, Michael Stopa, and J. Alexander Liddle. ACS Appl. Nano Mater., Article ASAP. DOI: 10.1021/acsanm.8b01239. Publication date (web): August 16, 2018

Copyright © 2018 American Chemical Society

This paper is behind a paywall.

I looked at this paper briefly and found it to give a good overview. The focus is on manufacturing and making money. I imagine any discussion about the life cycle of the materials and possible environmental and health risks would have been considered ‘scope creep’.

I have two postings that provide additional information about manufacturing concerns, my February 10, 2014 posting:  ‘Valley of Death’, ‘Manufacturing Middle’, and other concerns in new government report about the future of nanomanufacturing in the US and my September 5, 2016 posting: An examination of nanomanufacturing and nanofabrication.

Structural colo(u)r from transparent 3D printed nanostructures

Caption: Light hits the 3-D printed nanostructures from below. After it is transmitted through, the viewer sees only green light — the remaining colors are redirected. Credit: Thomas Auzinger [downloaded from]

An August 17, 2018 news item on ScienceDaily announces the work illustrated by the image above,

Most of the objects we see are colored by pigments, but using pigments has disadvantages: such colors can fade, industrial pigments are often toxic, and certain color effects are impossible to achieve. The natural world, however, also exhibits structural coloration, where the microstructure of an object causes various colors to appear. Peacock feathers, for instance, are pigmented brown, but — because of long hollows within the feathers — reflect the gorgeous, iridescent blues and greens we see and admire. Recent advances in technology have made it practical to fabricate the kind of nanostructures that result in structural coloration, and computer scientists from the Institute of Science and Technology Austria (IST Austria) and the King Abdullah University of Science and Technology (KAUST) have now created a computational tool that automatically creates 3D-print templates for nanostructures that correspond to user-defined colors. Their work demonstrates the great potential for structural coloring in industry, and opens up possibilities for non-experts to create their own designs. This project will be presented at this year’s top computer graphics conference, SIGGRAPH 2018, by first author and IST Austria postdoc Thomas Auzinger. This is one of five IST Austria presentations at the conference this year.

SIGGRAPH 2018, now ended, was mentioned in my Aug. 9, 2018 posting, but since this presentation is accompanied by a paper, it rates its own posting. As one more excuse, there’s my fascination with structural colour.

An August 17, 2018 Institute of Science and Technology Austria press release (also on EurekAlert), which originated the news item, delves into the work,

The changing colors of a chameleon and the iridescent blues and greens of the morpho butterfly, among many others in nature, are the result of structural coloration, where nanostructures cause interference effects in light, resulting in a variety of colors when viewed macroscopically. Structural coloration has certain advantages over coloring with pigments (where particular wavelengths are absorbed), but until recently, the limits of technology meant fabricating such nanostructures required highly specialized methods. New “direct laser writing” set-ups, however, cost about as much as a high-quality industrial 3D printer, and allow for printing at the scale of hundreds of nanometers (a hundred to a thousand times thinner than a human hair), opening up possibilities for scientists to experiment with structural coloration.

So far, scientists have primarily experimented with nanostructures that they had observed in nature, or with simple, regular nanostructural designs (e.g. row after row of pillars). Thomas Auzinger and Bernd Bickel of IST Austria, together with Wolfgang Heidrich of KAUST, however, took an innovative new approach that differs in several key ways. First, they solve the inverse design task: the user enters the color they want to replicate, and then the computer creates a nanostructure pattern that gives that color, rather than attempting to reproduce structures found in nature. Moreover, “our design tool is completely automatic,” says Thomas Auzinger. “No extra effort is required on the part of the user.”

Second, the nanostructures in the template do not follow a particular pattern or have a regular structure; they appear to be randomly composed—a radical break from previous methods, but one with many advantages. “When looking at the template produced by the computer I cannot tell by the structure alone, if I see a pattern for blue or red or green,” explains Auzinger. “But that means the computer is finding solutions that we, as humans, could not. This free-form structure is extremely powerful: it allows for greater flexibility and opens up possibilities for additional coloring effects.” For instance, their design tool can be used to print a square that appears red from one angle, and blue from another (known as directional coloring).

Finally, previous efforts have also stumbled when it came to actual fabrication: the designs were often impossible to print. The new design tool, however, guarantees that the user will end up with a printable template, which makes it extremely useful for the future development of structural coloration in industry. “The design tool can be used to prototype new colors and other tools, as well as to find interesting structures that could be produced industrially,” adds Auzinger. Initial tests of the design tool have already yielded successful results. “It’s amazing to see something composed entirely of clear materials appear colored, simply because of structures invisible to the human eye,” says Bernd Bickel, professor at IST Austria, “we’re eager to experiment with additional materials, to expand the range of effects we can achieve.”

“It’s particularly exciting to witness the growing role of computational tools in fabrication,” concludes Auzinger, “and even more exciting to see the expansion of ‘computer graphics’ to encompass physical as well as virtual images.”

Here’s a link to and a citation for the paper,

Computational Design of Nanostructural Color for Additive Manufacturing by Thomas Auzinger, Wolfgang Heidrich, and Bernd Bickel. ACM Trans. Graph. 37, 4, Article 159 (August 2018). 16 pages.

This appears to be open access.

There is also a project page bearing the same title as the paper, Computational Design of Nanostructural Color for Additive Manufacturing.

Nanoparticles in combination could be more toxic

It seems that one set of nanoparticles, e.g., silver nanoparticles, in combination with another material, e.g., cadmium ions, is more dangerous than either one separately, according to an August 17, 2018 University of Southern Denmark press release by Birgitte Svennevig (also on EurekAlert but dated August 20, 2018),

Researchers warn that a combination of nanoparticles and contaminants may form a cocktail that is harmful to our cells. In their study, 72 pct. of cells died after exposure to a cocktail of nano-silver and cadmium ions.

Nanoparticles are becoming increasingly widespread in our environment. Thousands of products contain nanoparticles because of their unique properties. Silver nanoparticles are one example: They have an effective antibacterial effect and can be found in refrigerators, sports clothes, cosmetics, tooth brushes, water filters, etc.

There is a significant difference between how the cells react when exposed to nanosilver alone and when they are exposed to a cocktail of nanosilver and cadmium ions. Cadmium ions are naturally found everywhere around us on Earth.

In the study, 72 pct. of the cells died when exposed to both nanosilver and cadmium ions. When exposed to nanosilver only, 25 pct. died. When exposed to cadmium ions only, 12 pct. died.

The study was conducted on human liver cancer cells.

  • This study indicates that we should not look at nanoparticles in isolation when we investigate and discuss the effects they may have on our health. We need to take cocktail effects into account, said Professor Frank Kjeldsen, Dept of Biochemistry and Molecular Biology, SDU, adding:
  • Products with nanoparticles are being developed and manufactured every day, but in most countries there are no regulations, so there is no way of knowing what and how many nanoparticles are being released into the environment. In my opinion, this should be stopped.

Other studies led by Professor Kjeldsen have previously shown that human cells interact with metal nanoparticles.

One study showed that nano-silver leads to the formation of free radicals in cells and changes in the form and amount of proteins. Many serious diseases are characterized by an overproduction of free radicals in cells. This applies to cancer and neurological diseases such as Alzheimer’s and Parkinson’s.

This is not great news but there are a few things to note about this research. First, it was conducted on cells and therefore not subject to some of the defensive systems found in complete biological organisms, such as a mouse or a dandelion plant, for example.

Also, since they were cancer cells, one might suspect their reactions might differ from those of healthy cells. As for how the cells were exposed to the contaminants, I think (???) they were sitting in a solution of contaminants, and most of us do not live in that kind of environment. Finally, with regard to the concentrations, I have no idea if they are greater than one might expect to encounter in a lifetime, but it’s always worth questioning just how much exposure you might expect during yours or a mouse’s or a dandelion’s life.

These caveats aside, Professor Frank Kjeldsen’s work raises some very concerning issues and his work adds to a growing body of evidence.

Here’s a video featuring Dr. Kjeldsen talking about his work,

Here’s a link to and a citation for the paper,

Co-exposure to silver nanoparticles and cadmium induce metabolic adaptation in HepG2 cells by Renata Rank Miranda, Vladimir Gorshkov, Barbara Korzeniowska, Stefan J. Kempf, Francisco Filipak Neto, & Frank Kjeldsen. Nanotoxicology. Published online: 11 Jul 2018

This paper is open access.

It’s a very ‘carbony’ time: graphene jacket, graphene-skinned airplane, and schwarzite

In August 2018, I stumbled across several stories about graphene-based products and a new form of carbon.

Graphene jacket

The company producing this jacket has as its goal “… creating bionic clothing that is both bulletproof and intelligent.” Well, ‘bionic’ means biologically-inspired engineering and ‘intelligent’ usually means there’s some kind of computing capability in the product. This jacket, which is the first step towards the company’s goal, is not bionic, bulletproof, or intelligent. Nonetheless, it represents a very interesting science experiment in which you, the consumer, are part of step two in the company’s R&D (research and development).

Onto Vollebak’s graphene jacket,

Courtesy: Vollebak

From an August 14, 2018 article by Jesus Diaz for Fast Company,

Graphene is the thinnest possible form of graphite, which you can find in your everyday pencil. It’s purely bi-dimensional, a single layer of carbon atoms that has unbelievable properties that have long threatened to revolutionize everything from aerospace engineering to medicine. …

Despite its immense promise, graphene still hasn’t found much use in consumer products, thanks to the fact that it’s hard to manipulate and manufacture in industrial quantities. The process of developing Vollebak’s jacket, according to the company’s cofounders, brothers Steve and Nick Tidball, took years of intensive research, during which the company worked with the same material scientists who built Michael Phelps’ 2008 Olympic Speedo swimsuit (which was famously banned for shattering records at the event).

The jacket is made out of a two-sided material, which the company invented during the extensive R&D process. The graphene side looks gunmetal gray, while the flipside appears matte black. To create it, the scientists turned raw graphite into something called graphene “nanoplatelets,” which are stacks of graphene that were then blended with polyurethane to create a membrane. That, in turn, is bonded to nylon to form the other side of the material, which Vollebak says alters the properties of the nylon itself. “Adding graphene to the nylon fundamentally changes its mechanical and chemical properties–a nylon fabric that couldn’t naturally conduct heat or energy, for instance, now can,” the company claims.

The company says that it’s reversible so you can enjoy graphene’s properties in different ways as the material interacts with either your skin or the world around you. “As physicists at the Max Planck Institute revealed, graphene challenges the fundamental laws of heat conduction, which means your jacket will not only conduct the heat from your body around itself to equalize your skin temperature and increase it, but the jacket can also theoretically store an unlimited amount of heat, which means it can work like a radiator,” Tidball explains.

He means it literally. You can leave the jacket out in the sun, or on another source of warmth, as it absorbs heat. Then, the company explains on its website, “If you then turn it inside out and wear the graphene next to your skin, it acts like a radiator, retaining its heat and spreading it around your body. The effect can be visibly demonstrated by placing your hand on the fabric, taking it away and then shooting the jacket with a thermal imaging camera. The heat of the handprint stays long after the hand has left.”

There’s a lot more to the article, although it does feature some hype, and I’m not sure I believe Diaz’s claim (August 14, 2018 article) that ‘graphene-based’ hair dye is perfectly safe (Note: A link has been removed),

Graphene is the thinnest possible form of graphite, which you can find in your everyday pencil. It’s purely bi-dimensional, a single layer of carbon atoms that has unbelievable properties that will one day revolutionize everything from aerospace engineering to medicine. Its diverse uses are seemingly endless: It can stop a bullet if you add enough layers. It can change the color of your hair with no adverse effects. [emphasis mine] It can turn the walls of your home into a giant fire detector. “It’s so strong and so stretchy that the fibers of a spider web coated in graphene could catch a falling plane,” as Vollebak puts it in its marketing materials.

Not unless things have changed greatly since March 2018. My August 2, 2018 posting featured the graphene-based hair dye announcement from March 2018 and a cautionary note from Dr. Andrew Maynard (scroll down about 50% of the way for a longer excerpt of Maynard’s comments),

Northwestern University’s press release proudly announced, “Graphene finds new application as nontoxic, anti-static hair dye.” The announcement spawned headlines like “Enough with the toxic hair dyes. We could use graphene instead,” and “’Miracle material’ graphene used to create the ultimate hair dye.”

From these headlines, you might be forgiven for getting the idea that the safety of graphene-based hair dyes is a done deal. Yet having studied the potential health and environmental impacts of engineered nanomaterials for more years than I care to remember, I find such overly optimistic pronouncements worrying – especially when they’re not backed up by clear evidence.

These studies need to be approached with care, as the precise risks of graphene exposure will depend on how the material is used, how exposure occurs and how much of it is encountered. Yet there’s sufficient evidence to suggest that this substance should be used with caution – especially where there’s a high chance of exposure or that it could be released into the environment.

The full text of Dr. Maynard’s comments about graphene hair dyes and risk can be found here.

Bearing in mind that graphene-based hair dye is an entirely different class of product from the jacket, I wouldn’t necessarily dismiss risks; I would like to know what kind of risk assessment and safety testing has been done. Due to their understandable enthusiasm, the brothers Tidball have focused all their marketing on the benefits and the opportunity for the consumer to test their product (from the graphene jacket product webpage),

While it’s completely invisible and only a single atom thick, graphene is the lightest, strongest, most conductive material ever discovered, and has the same potential to change life on Earth as stone, bronze and iron once did. But it remains difficult to work with, extremely expensive to produce at scale, and lives mostly in pioneering research labs. So following in the footsteps of the scientists who discovered it through their own highly speculative experiments, we’re releasing graphene-coated jackets into the world as experimental prototypes. Our aim is to open up our R&D and accelerate discovery by getting graphene out of the lab and into the field so that we can harness the collective power of early adopters as a test group. No-one yet knows the true limits of what graphene can do, so the first edition of the Graphene Jacket is fully reversible with one side coated in graphene and the other side not. If you’d like to take part in the next stage of this supermaterial’s history, the experiment is now open. You can now buy it, test it and tell us about it. [emphasis mine]

How maverick experiments won the Nobel Prize

While graphene’s existence was first theorised in the 1940s, it wasn’t until 2004 that two maverick scientists, Andre Geim and Konstantin Novoselov, were able to isolate and test it. Through highly speculative and unfunded experimentation known as their ‘Friday night experiments,’ they peeled layer after layer off a shaving of graphite using Scotch tape until they produced a sample of graphene just one atom thick. After similarly leftfield thinking won Geim the 2000 Ig Nobel prize for levitating frogs using magnets, the pair won the Nobel prize in 2010 for the isolation of graphene.

Should you be interested in beta-testing the jacket, it will cost you $695 (presumably USD); order here. One last thing: Vollebak is based in the UK.

Graphene skinned plane

An August 14, 2018 news item (also published as an August 1, 2018 Haydale press release) by Sue Keighley on Azonano heralds a new technology for airplanes,

Haydale, (AIM: HAYD), the global advanced materials group, notes the announcement made yesterday from the University of Central Lancashire (UCLAN) about the recent unveiling of the world’s first graphene skinned plane at the internationally renowned Farnborough air show.

The prepreg material, developed by Haydale, has potential value for fuselage and wing surfaces in larger scale aero and space applications especially for the rapidly expanding drone market and, in the longer term, the commercial aerospace sector. By incorporating functionalised nanoparticles into epoxy resins, the electrical conductivity of fibre-reinforced composites has been significantly improved for lightning-strike protection, thereby achieving substantial weight saving and removing some manufacturing complexities.

Before getting to the photo, here’s a definition for pre-preg from its Wikipedia entry (Note: Links have been removed),

Pre-preg is “pre-impregnated” composite fibers where a thermoset polymer matrix material, such as epoxy, or a thermoplastic resin is already present. The fibers often take the form of a weave and the matrix is used to bond them together and to other components during manufacture.

Haydale has supplied graphene enhanced prepreg material for Juno, a three-metre wide graphene-enhanced composite skinned aircraft, that was revealed as part of the ‘Futures Day’ at Farnborough Air Show 2018. [downloaded from]

A July 31, 2018 University of Central Lancashire (UCLan) press release provides a tiny bit more (pun intended) detail,

The University of Central Lancashire (UCLan) has unveiled the world’s first graphene skinned plane at an internationally renowned air show.

Juno, a three-and-a-half-metre wide graphene skinned aircraft, was revealed on the North West Aerospace Alliance (NWAA) stand as part of the ‘Futures Day’ at Farnborough Air Show 2018.

The University’s aerospace engineering team has worked in partnership with the Sheffield Advanced Manufacturing Research Centre (AMRC), the University of Manchester’s National Graphene Institute (NGI), Haydale Graphene Industries (Haydale) and a range of other businesses to develop the unmanned aerial vehicle (UAV), which also includes graphene batteries and 3D printed parts.

Billy Beggs, UCLan’s Engineering Innovation Manager, said: “The industry reaction to Juno at Farnborough was superb with many positive comments about the work we’re doing. Having Juno at one of the world’s biggest air shows demonstrates the great strides we’re making in leading a programme to accelerate the uptake of graphene and other nano-materials into industry.

“The programme supports the objectives of the UK Industrial Strategy and the University’s Engineering Innovation Centre (EIC) to increase industry relevant research and applications linked to key local specialisms. Given that Lancashire represents the fourth largest aerospace cluster in the world, there is perhaps no better place to be developing next generation technologies for the UK aerospace industry.”

Previous graphene developments at UCLan have included the world’s first flight of a graphene skinned wing and the launch of a specially designed graphene-enhanced capsule into near space using high altitude balloons.

UCLan engineering students have been involved in the hands-on project, helping build Juno on the Preston Campus.

Haydale supplied much of the material and all the graphene used in the aircraft. Ray Gibbs, Chief Executive Officer, said: “We are delighted to be part of the project team. Juno has highlighted the capability and benefit of using graphene to meet key issues faced by the market, such as reducing weight to increase range and payload, defeating lightning strike and protecting aircraft skins against ice build-up.”

David Bailey Chief Executive of the North West Aerospace Alliance added: “The North West aerospace cluster contributes over £7 billion to the UK economy, accounting for one quarter of the UK aerospace turnover. It is essential that the sector continues to develop next generation technologies so that it can help the UK retain its competitive advantage. It has been a pleasure to support the Engineering Innovation Centre team at the University in developing the world’s first full graphene skinned aircraft.”

The Juno project team represents the latest phase in a long-term strategic partnership between the University and a range of organisations. The partnership is expected to go from strength to strength following the opening of the £32m EIC facility in February 2019.

The next step is to fly Juno and conduct further tests over the next two months.

Next item, a new carbon material.

Schwarzite

I love watching this gif of a schwarzite,

The three-dimensional cage structure of a schwarzite that was formed inside the pores of a zeolite. (Graphics by Yongjin Lee and Efrem Braun)

An August 13, 2018 news item on Nanowerk announces the new carbon structure,

The discovery of buckyballs [also known as fullerenes, C60, or buckminsterfullerenes] surprised and delighted chemists in the 1980s, nanotubes jazzed physicists in the 1990s, and graphene charged up materials scientists in the 2000s, but one nanoscale carbon structure – a negatively curved surface called a schwarzite – has eluded everyone. Until now.

University of California, Berkeley [UC Berkeley], chemists have proved that three carbon structures recently created by scientists in South Korea and Japan are in fact the long-sought schwarzites, which researchers predict will have unique electrical and storage properties like those now being discovered in buckminsterfullerenes (buckyballs or fullerenes for short), nanotubes and graphene.

An August 13, 2018 UC Berkeley news release by Robert Sanders, which originated the news item, describes how the Berkeley scientists and the members of their international collaboration from Germany, Switzerland, Russia, and Italy have contributed to the current state of schwarzite research,

The new structures were built inside the pores of zeolites, crystalline forms of silicon dioxide – sand – more commonly used as water softeners in laundry detergents and to catalytically crack petroleum into gasoline. Called zeolite-templated carbons (ZTC), the structures were being investigated for possible interesting properties, though the creators were unaware of their identity as schwarzites, which theoretical chemists have worked on for decades.

Based on this theoretical work, chemists predict that schwarzites will have unique electronic, magnetic and optical properties that would make them useful as supercapacitors, battery electrodes and catalysts, and with large internal spaces ideal for gas storage and separation.

UC Berkeley postdoctoral fellow Efrem Braun and his colleagues identified these ZTC materials as schwarzites based on their negative curvature, and developed a way to predict which zeolites can be used to make schwarzites and which can’t.

“We now have the recipe for how to make these structures, which is important because, if we can make them, we can explore their behavior, which we are working hard to do now,” said Berend Smit, an adjunct professor of chemical and biomolecular engineering at UC Berkeley and an expert on porous materials such as zeolites and metal-organic frameworks.

Smit, the paper’s corresponding author, Braun and their colleagues in Switzerland, China, Germany, Italy and Russia will report their discovery this week in the journal Proceedings of the National Academy of Sciences. Smit is also a faculty scientist at Lawrence Berkeley National Laboratory.

Playing with carbon

Diamond and graphite are well-known three-dimensional crystalline arrangements of pure carbon, but carbon atoms can also form two-dimensional “crystals” — hexagonal arrangements patterned like chicken wire. Graphene is one such arrangement: a flat sheet of carbon atoms that is not only the strongest material on Earth, but also has a high electrical conductivity that makes it a promising component of electronic devices.

The cage structure of a schwarzite that was formed inside the pores of a zeolite. The zeolite is subsequently dissolved to release the new material. (Graphics by Yongjin Lee and Efrem Braun)

Graphene sheets can be wadded up to form soccer ball-shaped fullerenes – spherical carbon cages that can store molecules and are being used today to deliver drugs and genes into the body. Rolling graphene into a cylinder yields fullerenes called nanotubes, which are being explored today as highly conductive wires in electronics and storage vessels for gases like hydrogen and carbon dioxide. All of these are submicroscopic, 10,000 times smaller than the width of a human hair.

To date, however, only positively curved fullerenes and graphene, which has zero curvature, have been synthesized, feats rewarded by Nobel Prizes in 1996 and 2010, respectively.

In the 1880s, German physicist Hermann Schwarz investigated negatively curved structures that resemble soap-bubble surfaces, and when theoretical work on carbon cage molecules ramped up in the 1990s, Schwarz’s name became attached to the hypothetical negatively curved carbon sheets.

“The experimental validation of schwarzites thus completes the triumvirate of possible curvatures to graphene; positively curved, flat, and now negatively curved,” Braun added.

Minimize me

Like soap bubbles on wire frames, schwarzites are topologically minimal surfaces. When made inside a zeolite, a vapor of carbon-containing molecules is injected, allowing the carbon to assemble into a two-dimensional graphene-like sheet lining the walls of the pores in the zeolite. The surface is stretched tautly to minimize its area, which makes all the surfaces curve negatively, like a saddle. The zeolite is then dissolved, leaving behind the schwarzite.
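A brief aside of my own (a back-of-the-envelope note, not from the news release): the jump from “stretched tautly to minimize its area” to “curve negatively, like a saddle” can be made precise with one line of differential geometry. A minimal surface has zero mean curvature H at every point, which forces the Gaussian curvature K to be non-positive,

```latex
H = \frac{\kappa_1 + \kappa_2}{2} = 0
\quad\Rightarrow\quad \kappa_2 = -\kappa_1
\quad\Rightarrow\quad K = \kappa_1 \kappa_2 = -\kappa_1^{2} \le 0
```

So wherever the carbon sheet actually bends (κ₁ ≠ 0), its Gaussian curvature is strictly negative, which is exactly the saddle geometry that defines a schwarzite.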

A computer-rendered negatively curved soap bubble that exhibits the geometry of a carbon schwarzite. (Felix Knöppel image)

“These negatively-curved carbons have been very hard to synthesize on their own, but it turns out that you can grow the carbon film catalytically at the surface of a zeolite,” Braun said. “But the schwarzites synthesized to date have been made by choosing zeolite templates through trial and error. We provide very simple instructions you can follow to rationally make schwarzites and we show that, by choosing the right zeolite, you can tune schwarzites to optimize the properties you want.”

Researchers should be able to pack unusually large amounts of electrical charge into schwarzites, which would make them better capacitors than conventional ones used today in electronics. Their large interior volume would also allow storage of atoms and molecules, which is also being explored with fullerenes and nanotubes. And their large surface area, equivalent to the surface areas of the zeolites they’re grown in, could make them as versatile as zeolites for catalyzing reactions in the petroleum and natural gas industries.

Braun modeled ZTC structures computationally using the known structures of zeolites, and worked with topological mathematician Senja Barthel of the École Polytechnique Fédérale de Lausanne in Sion, Switzerland, to determine which of the minimal surfaces the structures resembled.

The team determined that, of the approximately 200 zeolites created to date, only 15 can be used as a template to make schwarzites, and only three of them have been used to date to produce schwarzite ZTCs. Over a million zeolite structures have been predicted, however, so there could be many more possible schwarzite carbon structures made using the zeolite-templating method.

Other co-authors of the paper are Yongjin Lee, Seyed Mohamad Moosavi and Barthel of the École Polytechnique Fédérale de Lausanne, Rocio Mercado of UC Berkeley, Igor Baburin of the Technische Universität Dresden in Germany and Davide Proserpio of the Università degli Studi di Milano in Italy and Samara State Technical University in Russia.

Here’s a link to and a citation for the paper,

Generating carbon schwarzites via zeolite-templating by Efrem Braun, Yongjin Lee, Seyed Mohamad Moosavi, Senja Barthel, Rocio Mercado, Igor A. Baburin, Davide M. Proserpio, and Berend Smit. PNAS, 201805062; published ahead of print August 14, 2018.

This paper appears to be open access.

Happy International Women’s Day on March 8, 2019—with a shout-out to women in science

I did a very quick search for today’s (March 8, 2019) women in science stories and found three to highlight here. First, a somewhat downbeat Canadian story.

Can Canadians name a woman scientist or engineer?

According to Emily Chung’s March 8, 2019 article on the Canadian Broadcasting Corporation’s (CBC) online news site, the answer is: no,

You’ve probably heard of Stephen Hawking, Albert Einstein and Mark Zuckerberg.

But can you name a woman scientist or engineer? Half of Canadians can’t, suggests a new poll.

The online survey of 1,511 Canadians was commissioned by the non-profit group Girls Who Code and conducted by the market research firm Maru/Blue from March 1-3 and released for International Women’s Day today [March 8, 2019].

It was intended to collect data about how people felt about science, technology, engineering and math (STEM) careers and education in Canada, said Reshma Saujani, founder and CEO of the group, which aims to close the gender gap in technology by teaching girls coding skills.

The poll found:

When asked how many women scientists/engineers they could name, 52 per cent of respondents said “none.”

When asked to picture a computer scientist, 82 per cent of respondents immediately imagined a man rather than a woman.

77 per cent of respondents think increased media representation of women in STEM careers or leadership roles would help close the gender gap in STEM.

Sandra Corbeil, who’s involved a Women in STEM initiative at Ingenium, the organization that oversees Canada’s national museums of science and innovation, agrees that women scientists are under-recognized.

… Ingenium organized an event where volunteers from the public collaborated to add more women scientists to the online encyclopedia Wikipedia for the International Day of Women and Girls in Science this past February [2019].

The 21 participants added four articles, including ones on Dr. Anna Marion Hilliard, who developed a simple Pap test for early detection of cervical cancer, and Marla Sokolowski, who discovered an important gene that affects both metabolism and behaviour in fruit flies. The volunteer editors also updated and translated several other entries.

Similar events have been held around the world to boost the representation of women on Wikipedia, where as of March 4, 2019, only 17.7 per cent of biographies were of women — even 2018’s winner of the Nobel Prize in Physics, Donna Strickland, didn’t have a Wikipedia entry until the prize was announced.

Corbeil acknowledged that in science, the individual contributions of scientists, whether they are men or women, tend to not be well known by the public.[emphasis mine]

“We don’t treat them like superstars … to me, it’s something that we probably should change because their contributions matter.”

Chung points to a criticism of the Girls Who Code poll: it didn’t ask Canadians whether they could name male scientists or engineers. While Reshma Saujani acknowledged the criticism, she also brushed it off (from Chung’s article),

Saujani acknowledges that the poll didn’t ask how many male scientists or engineers they could name, but thinks the answer would “probably” be different. [emphasis mine]

Chung seems to be hinting at the problem (with the double quotes around the word probably), but I’m going to be blunt: that isn’t good science. Then again, Saujani is not a scientist (from the Girls Who Code About page),

Reshma began her career as an attorney and activist. In 2010, she surged onto the political scene as the first Indian American woman to run for U.S. Congress. During the race, Reshma visited local schools and saw the gender gap in computing classes firsthand, which led her to start Girls Who Code. She has also served as Deputy Public Advocate for New York City and ran a spirited campaign for Public Advocate in 2013.

I’m inclined to believe that Saujani is right, but I’d want to test the hypothesis. I have looked at what I believe to be the entire report here. I’m happy to see the questions, but I do have a few questions about the methodology (happily, also included in the report),

… online survey was commissioned by Girls Who Code of 1,511 randomly selected Canadian adults who are Maru Voice panelists.

If it’s an online survey, how can the pollsters be sure the respondents are Canadian, or sure about any of the other demographic details? What is a Maru Voice panelist? Is there some form of self-selection inherent in being a Maru Voice panelist? (If I remember my social science research guidelines properly, self-selected groups are not the same as the general population.)

All I’m saying is that this report is interesting but seems problematic, so treat it with a little caution.
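To put a number on why the methodology matters: the textbook margin of error for a poll this size is small, but the formula only holds for a true random sample of the population — it says nothing about the bias introduced by an opt-in panel. A quick sketch (my own arithmetic, not from the report):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion -- valid only for a true random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# The poll's headline figure: 52% of 1,511 respondents could name no woman scientist.
moe = margin_of_error(0.52, 1511)
print(f"+/- {moe * 100:.1f} percentage points")  # roughly +/- 2.5
```

So the sample size itself is respectable; the open question is whether Maru Voice panelists resemble Canadians at large.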

Celebrating women in science in UK (United Kingdom)

This story comes from the UK’s N8 Research Partnership (I’m pretty sure that N8 is meant to be pronounced as ‘innate’). On March 7, 2019 they put up a webpage celebrating women in science,

All #N8women deliver our vision of making the N8 Research Partnership an exceptionally effective cluster of research innovation and training excellence; we celebrate all of your contributions and thank you for everything that you do. Read more about the women below or find out about them on our social channels by searching #N8Women.

Professor Dame Sue Black

Professor Dame Sue Black from Lancaster University pioneered research techniques to identify an individual by their hand alone, a technique that has been used successfully in Court to identify perpetrators in relation to child abuse cases. Images have been taken from more than 5000 participants to form an open-source dataset which has allowed a breakthrough in the study of anatomical variation.

Professor Diana Williams

Professor Diana Williams from The University of Liverpool has led research with Farming Online into a digital application that predicts when and where disease is likely to occur. This is hoped to help combat the £300m that UK agriculture loses per year to the liver fluke parasite, which affects livestock across the globe.

Professor Louise Heathwaite

Professor Louise Heathwaite from Lancaster University has gained not only international recognition for her research into environmental pollution and water quality, but she also received the royal seal of approval after being awarded a CBE in the Queen’s Birthday Honours 2018.

Professor Sue Black

Professor Sue Black from Durham University has helped support 100 women retraining into tech roles thanks to the development of the online programme TechUP. Supported by the Institute of Coding, the programme lasts six months and concludes with a job interview, internship or apprenticeship.

Dr Anna Olsson-Brown

Dr Anna Olsson-Brown from the University of Liverpool has been instrumental in research into next-generation drugs that can treat patients with more advanced, malignant cancers and help them deal with the toxicity that can accompany novel therapies.

Professor Katherine Denby

Professor Katherine Denby, Director of N8 Agrifood, based at the University of York, has been at the forefront of developing novel ways to enhance and enable breeding of crops resistant to environmental stress and disease.

Most recently, she was involved in the development of a genetic control system that enables plants to strengthen their defence response against deadly pathogens.

Dr Louise Ellis

Dr Louise Ellis, Director of Sustainability at the University of Leeds, has been leading their campaign, Single Out: 2023PlasticFree, which crucially commits the University and Union to phasing out single-use plastic across the board, not just in catering and office spaces.

Professor Philippa Browning

Professor Philippa Browning from the University of Manchester wanted to be an astronaut when she was a child but found that there was a lack of female role models in her field. She is leading work on the interactions between plasmas and magnetic fields and is a mentor for young solar physicists.

Dr Anh Phan

Dr Anh Phan is a Lecturer of Chemical Engineering in the School of Engineering at Newcastle University. She has been leading research into cold plasma pyrolysis, a process that could be used to turn plastic waste into green energy. This is a novel process that could revolutionise our problem with plastic and realise the true value of plastic waste.

So, Canadians take note of these women and the ones featured in the next item.

Canada Science and Technology Museum’s (an Ingenium museum) International Women’s Day video

It was posted on YouTube in 2017 but given the somewhat downbeat Canadian story I started with I thought this appropriate,

It’s never too late to learn about women in science and engineering. The women featured in the video are: Ursula Franklin, Maude Abbott, Janice Zinck, and Indira Samarasekera.

Popcorn-powered robots

A soft robotic device powered by popcorn, constructed by researchers in Cornell’s Collective Embodied Intelligence Lab. Courtesy: Cornell University

What an intriguing idea, popcorn-powered robots, and one I have difficulty imagining even with the help of the image above. A July 26, 2018 Cornell University news release (an edited version is on EurekAlert) by Melanie Lefkowitz describes the concept,

Cornell researchers have discovered how to power simple robots with a novel substance that, when heated, can expand more than 10 times in size, change its viscosity by a factor of 10 and transition from regular to highly irregular granules with surprising force.

You can also eat it with a little butter and salt.

“Popcorn-Driven Robotic Actuators,” a recent paper co-authored by doctoral student Steven Ceron, mechanical engineering, and Kirstin H. Petersen, assistant professor of electrical and computer engineering, examines how popcorn’s unique qualities can power inexpensive robotic devices that grip, expand or change rigidity.

“The goal of our lab is to try to make very minimalistic robots which, when deployed in high numbers, can still accomplish great things,” said Petersen, who runs Cornell’s Collective Embodied Intelligence Lab. “Simple robots are cheap and less prone to failures and wear, so we can have many operating autonomously over a long time. So we are always looking for new and innovative ideas that will permit us to have more functionalities for less, and popcorn is one of those.”

The study is the first to consider powering robots with popcorn, which is inexpensive, readily available, biodegradable and of course, edible. Since kernels can expand rapidly, exerting force and motion when heated, they could potentially power miniature jumping robots. Edible devices could be ingested for medical procedures. The mix of hard, unpopped granules and lighter popped corn could replace fluids in soft robots without the need for air pumps or compressors.

“Pumps and compressors tend to be more expensive, and they add a lot of weight and expense to your robot,” said Ceron, the paper’s lead author. “With popcorn, in some of the demonstrations that we showed, you just need to apply voltage to get the kernels to pop, so it would take all the bulky and expensive parts out of the robots.”

Since kernels can’t shrink once they’ve popped, a popcorn-powered mechanism can generally be used only once, though multiple uses are conceivable because popped kernels can dissolve in water, Ceron said.

The researchers experimented with Amish Country Extra Small popcorn, which they chose because the brand did not use additives. The extra-small variety had the highest expansion ratio of those they tested.

After studying popcorn’s properties using different types of heating, the researchers constructed three simple robotic actuators – devices used to perform a function.

For a jamming actuator, 36 kernels of popcorn heated with nichrome wire were used to stiffen a flexible silicone beam. For an elastomer actuator, they constructed a three-fingered soft gripper, whose silicone fingers were stuffed with popcorn heated by nichrome wire. When the kernels popped, the expansion exerted pressure against the outer walls of the fingers, causing them to curl. For an origami actuator, they folded recycled Newman’s Own organic popcorn bags into origami bellows folds, filled them with kernels and microwaved them. The expansion of the kernels was strong enough to support the weight of a nine-pound kettlebell.
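As a rough illustration of the volume change driving those actuators, here is a back-of-envelope sketch. The per-kernel volume and the flat 10x ratio below are my own assumptions for illustration, not measurements from the paper (which reports expansion ratios above 10x):

```python
# Back-of-envelope estimate of the volume change in a kernel-filled actuator.
# KERNEL_VOLUME_ML is an assumed figure, not a measurement from the paper.
KERNEL_VOLUME_ML = 0.1   # assumed volume of one unpopped kernel, in mL
EXPANSION_RATIO = 10     # "more than 10 times in size" per the news release
N_KERNELS = 36           # kernels used in the jamming actuator described above

unpopped = N_KERNELS * KERNEL_VOLUME_ML
popped = unpopped * EXPANSION_RATIO
print(f"{unpopped:.1f} mL -> {popped:.1f} mL of popped corn")
```

Even on these conservative assumptions, a handful of kernels fills roughly ten times its original volume, which is why the expansion can curl a silicone finger or lift a kettlebell.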

The paper was presented at the IEEE [Institute of Electrical and Electronics Engineers] International Conference on Robotics and Automation in May and co-authored with Aleena Kurumunda ’19, Eashan Garg ’20, Mira Kim ’20 and Tosin Yeku ’20. Petersen said she hopes it inspires researchers to explore the possibilities of other nontraditional materials.

“Robotics is really good at embracing new ideas, and we can be super creative about what we use to generate multifunctional properties,” she said. “In the end we come up with very simple solutions to fairly complex problems. We don’t always have to look for high-tech solutions. Sometimes the answer is right in front of us.”

The work was supported by the Cornell Engineering Learning Initiative, the Cornell Electrical and Computer Engineering Early Career Award and the Cornell Sloan Fellowship.

Here’s a link to and a citation for the paper,

Popcorn-Driven Robotic Actuators by Steven Ceron, Aleena Kurumunda, Eashan Garg, Mira Kim, Tosin Yeku, and Kirstin Petersen. Presented at the IEEE International Conference on Robotics and Automation, May 21-25, 2018, Brisbane, Australia.

The researchers have made this video demonstrating the technology,

Bristly hybrid materials

Caption: [Image 1] A carbon fiber covered with a spiky forest of NiCoHC nanowires. Credit: All images reproduced from reference 1 under a Creative Commons Attribution 4.0 International License© 2018 KAUST

It makes me think of small, cuddly things like cats and dogs but it’s not. From an August 7, 2018 King Abdullah University of Science and Technology (KAUST; Saudi Arabia) news release (also published on August 12, 2018 on EurekAlert),

By combining multiple nanomaterials into a single structure, scientists can create hybrid materials that incorporate the best properties of each component and outperform any single substance. A controlled method for making triple-layered hollow nanostructures has now been developed at KAUST. The hybrid structures consist of a conductive organic core sandwiched between layers of electrocatalytically active metals: their potential uses range from better battery electrodes to renewable fuel production.

Although several methods exist to create two-layer materials, making three-layered structures has proven much more difficult, says Peng Wang from the Water Desalination and Reuse Center who co-led the current research with Professor Yu Han, member of the Advanced Membranes and Porous Materials Center at KAUST. The researchers developed a new, dual-template approach, explains Sifei Zhuo, a postdoctoral member of Wang’s team.

The researchers grew their hybrid nanomaterial directly on carbon paper–a mat of electrically conductive carbon fibers. They first produced a bristling forest of nickel cobalt hydroxyl carbonate (NiCoHC) nanowires on the surface of each carbon fiber (image 1). Each tiny inorganic bristle was coated with an organic layer called hydrogen substituted graphdiyne (HsGDY) (image 2 [not included here]).

Next was the key dual-template step. When the team added a chemical mixture that reacts with the inner NiCoHC, the HsGDY acted as a partial barrier. Some nickel and cobalt ions from the inner layer diffused outward, where they reacted with thiomolybdate from the surrounding solution to form the outer nickel-, cobalt-co-doped MoS2 (Ni,Co-MoS2) layer. Meanwhile, some sulfur ions from the added chemicals diffused inwards to react with the remaining nickel and cobalt. The resulting substance (image 3 [not included here]) had the structure Co9S8, Ni3S2@HsGDY@Ni,Co-MoS2, in which the conductive organic HsGDY layer is sandwiched between two inorganic layers (image 4 [not included here]).

The triple-layer material showed good performance at electrocatalytically breaking up water molecules to generate hydrogen, a potential renewable fuel. The researchers also created other triple-layer materials using the dual-template approach.

“These triple-layered nanostructures hold great potential in energy conversion and storage,” says Zhuo. “We believe it could be extended to serve as a promising electrode in many electrochemical applications, such as in supercapacitors and sodium-/lithium-ion batteries, and for use in water desalination.”

Here’s a link to and a citation for the paper,

Dual-template engineering of triple-layered nanoarray electrode of metal chalcogenides sandwiched with hydrogen-substituted graphdiyne by Sifei Zhuo, Yusuf Shi, Lingmei Liu, Renyuan Li, Le Shi, Dalaver H. Anjum, Yu Han, & Peng Wang. Nature Communications, volume 9, Article number: 3132 (2018). Published 07 August 2018.

This paper is open access.


Oobleck (non-Newtonian goo) and bras from Reebok

I have taken a liberty with the title of this piece; strictly speaking, the non-Newtonian goo in the bra isn’t the stuff (oobleck) made of cornstarch and water from your childhood science experiments, but it has many of the same qualities. The material in the Reebok bra, the PureMove, is called Shear Thickening Fluid. It was developed at the University of Delaware in 2005 and subsequently employed by NASA (US National Aeronautics and Space Administration) in astronaut suits, as noted in an August 6, 2018 article by Elizabeth Segran for Fast Company, who explains how it came to be used for the latest sports bra,

While the activewear industry floods the market with hundreds of different sports bras every season, research shows that most female consumers are unsatisfied with their sports bra options, and 1 in 5 women avoid exercise altogether because they don’t have a sports bra that fits them properly.

Reebok wants to make that experience a thing of the past. Today, it launches a new bra, the PureMove, that adapts to your movements, tightening up when you’re moving fast and relaxing when you’re not. …

When I visited Reebok’s Boston headquarters, Witek [Danielle Witek, Reebok designer who spearheaded the R&D making the bra possible] handed me a jar of the fluid with a stick in it. When I moved the stick quickly, it seemed to turn into a solid, and when I moved it slowly, it had the texture of honey. Witek and the scientists have incorporated this fluid into a fabric that Reebok dubs “Motion Sense Technology.” The fluid is woven into the textile, so that on the surface, it looks and feels like the synthetic material you might find in any sports bra. But what you can’t see is that the fabric adapts to the body’s shape, the velocity of the breast tissue in motion, and the type and force of movement. It stretches less with high-impact movements and then stretches more during rest and lower intensity activities.

I tested an early version of the PureMove bra a few months ago, before it had even gone into production. I did a high-intensity workout that involved doing jumping jacks and sprints, followed by a cool-down session. The best thing about the bra was that I didn’t notice it at all. I didn’t feel stifled when I was just strolling around the gym, and I didn’t feel like I was unsupported when I was running around. Ultimately, the best bras are the ones that you don’t have to think about so you can focus on getting on with your life.

Since this technology is so new, Reebok had to do a lot of testing to make sure the bra would actually do what it advertised. The company set up a breast biomechanics testing center with the help of the University of Delaware, with 54 separate motion sensors tracking and measuring various parts of a tester’s chest area. This is a far more rigorous approach than that of most testing facilities in the industry, which typically use only between two and four sensors. Over the course of a year, the facility gathered the data required for the scientists and Reebok product designers to develop the PureMove bra.

… If it’s well-received, the logical next step would be to incorporate the Motion Sense Technology into other products, like running tights or swimsuits, since transitioning between compression and looseness is something that we want in all of our sportswear. …

According to the Reebok PureMove bra webpage, it was available from August 16, 2018,

Credit: Reebok

It’s $60 (I imagine those are US dollars).

For anyone interested in the science of non-Newtonian goo, shear thickening fluid, and NASA, there’s a November 24, 2015 article by Lydia Chain for Popular Science (Note: Links have been removed),

There’s an experiment you may have done in high school: When you mix cornstarch with water—a concoction colloquially called oobleck—and give it a stir, it acts like a liquid. But scrape it quickly or hit it hard, and it stiffens up into a solid. If you set the right pace, you can even run on top of a pool of the stuff. This phenomenon is called shear force thickening, and scientists have been trying to understand how it happens for decades.

There are two main theories, and figuring out which is right could affect the way we make things like cement, body armor, concussion preventing helmets, and even spacesuits.

The prevailing theory is that it’s all about the fluid dynamics (the nature of how fluids move) of the liquid and the particles in a solution. As the particles are pushed closer and closer together, it becomes harder to squeeze the liquid out from between them. Eventually, it’s too hard to squeeze out any more fluid and the particles lock up into hydrodynamic clusters, still separated by a thin film of fluid. They then move together, thickening the mixture and forming a solid.

The other idea is that contact forces like friction keep the particles locked together. Under this theory, when force is applied, the particles actually touch. The shearing force and friction keep them pressed together, which makes the solution more solid.
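Whichever theory wins, the macroscopic signature is the same: apparent viscosity that rises with shear rate. A common way to sketch that behaviour is the power-law (Ostwald–de Waele) model, where a flow index n greater than 1 gives shear thickening and n less than 1 gives shear thinning; the parameter values below are illustrative, not fitted to any real cornstarch suspension.

```python
# Power-law (Ostwald-de Waele) model of apparent viscosity:
#   eta = K * shear_rate ** (n - 1)
# n > 1: shear thickening (oobleck-like); n < 1: shear thinning.
# K and n here are illustrative values, not measured data.
def apparent_viscosity(shear_rate, K=1.0, n=1.8):
    return K * shear_rate ** (n - 1)

for rate in (0.1, 1.0, 10.0, 100.0):
    eta = apparent_viscosity(rate)
    print(f"shear rate {rate:6.1f} 1/s -> apparent viscosity {eta:8.2f} Pa.s")
```

Real cornstarch suspensions actually thicken far more abruptly than any smooth power law (the jump is nearly discontinuous), which is part of what makes the microscopic mechanism so hard to pin down.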

“The debate has been raging, and we’ve been wracking our brains to think of a method to conclusively go one way or the other,” says Itai Cohen, a physicist at Cornell University. He and his team recently ran a new experiment that seems to point to friction as the driving cause of shear thickening.

Norman Wagner, a chemical engineer at the University of Delaware, says that research into frictional interactions like this is important, but notes that he isn’t completely convinced, as Cohen’s team didn’t measure friction directly (they inferred it from their modeling but didn’t measure the friction between the particles themselves). He also says that there’s a lot of data in the field already that strongly indicates hydrodynamic clusters as the cause of shear thickening.

Wagner and his team are working on a NASA funded project to improve space suits so that micrometeorites or other debris can’t puncture them. They have also bent their technology to make padding for helmets and shin guards that would do a better job protecting athletes from harmful impacts. They are even making puncture resistant gloves that would give healthcare workers the same dexterity as current ones but with extra protection against accidental needle sticks.

“It’s a very exciting area,” says Wagner. He’s very interested in designing materials that automatically protect someone, without robotics or power. …

I guess that in 2015 Wagner didn’t realize his work would also end up in a 2018 sports bra.