Tag Archives: Tim Harper

Better hair dyes with graphene and a cautionary note

Beauty products aren’t usually the first applications that come to mind when discussing graphene or any other research and development (R&D), as I learned when teaching a course a few years ago. But R&D in that field is imperative, as every company is scrambling either for a short-lived competitive advantage from a truly new product or for a perceived competitive advantage in a field where a lot of products are pretty much the same.

This March 15, 2018 news item on ScienceDaily describes graphene as a potential hair dye,

Graphene, a naturally black material, could provide a new strategy for dyeing hair in difficult-to-create dark shades. And because it’s a conductive material, hair dyed with graphene might also be less prone to staticky flyaways. Now, researchers have put it to the test. In an article published March 15 [2018] in the journal Chem, they used sheets of graphene to make a dye that adheres to the surface of hair, forming a coating that is resistant to at least 30 washes without the need for chemicals that open up and damage the hair cuticle.

Courtesy: Northwestern University

A March 15, 2018 Cell Press news release on EurekAlert, which originated the news item, fills in more of the story,

Most permanent hair dyes used today are harmful to hair. “Your hair is covered in these cuticle scales like the scales of a fish, and people have to use ammonia or organic amines to lift the scales and allow dye molecules to get inside a lot quicker,” says senior author Jiaxing Huang, a materials scientist at Northwestern University. But lifting the cuticle makes the strands of the hair more brittle, and the damage is only exacerbated by the hydrogen peroxide that is used to trigger the reaction that synthesizes the dye once the pigment molecules are inside the hair.

These problems could theoretically be solved by a dye that coats rather than penetrates the hair. “However, the obvious problem of coating-based dyes is that they tend to wash out very easily,” says Huang. But when he and his team coated samples of human hair with a solution of graphene sheets, they were able to turn platinum blond hair black and keep it that way for at least 30 washes–the number necessary for a hair dye to be considered “permanent.”

This effectiveness has to do with the structure of graphene: it’s made up of thin, flexible sheets that can adapt to uneven surfaces. “Imagine a piece of paper. A business card is very rigid and doesn’t flex by itself. But if you take a much bigger sheet of newspaper–if you still can find one nowadays–it can bend easily. This makes graphene sheets a good coating material,” he says. And once the coating is formed, the graphene sheets are particularly good at keeping out water during washes, which keeps the water from eroding both the graphene and the polymer binder that the team also added to the dye solution to help with adhesion.

The graphene dye has additional advantages. Each coated hair is like a little wire in that it is able to conduct heat and electricity. This means that it’s easy for graphene-dyed hair to dissipate static electricity, eliminating the problem of flyaways on dry winter days. The graphene flakes are large enough that they won’t absorb through the skin like other dye molecules. And although graphene is typically black, its precursor, graphene oxide, is light brown. But the color of graphene oxide can be gradually darkened with heat or chemical reactions, meaning that this dye could be used for a variety of shades or even for an ombre effect.

What Huang thinks is particularly striking about this application of graphene is that it takes advantage of graphene’s most obvious property. “In many potential graphene applications, the black color of graphene is somewhat undesirable and something of a sore point,” he says. Here, though, it’s applied to a field where creating dark colors has historically been a problem.

The graphene used for hair dye also doesn’t need to be of the same high quality as it does for other applications. “For hair dye, the most important property is graphene being black. You can have graphene that is too lousy for higher-end electronic applications, but it’s perfectly okay for this. So I think this application can leverage the current graphene product as is, and that’s why I think that this could happen a lot sooner than many of the other proposed applications,” he says.

Making it happen is his next goal. He hopes to get funding to continue the research and make these dyes a reality for the people whose lives they would improve. “This is an idea that was inspired by curiosity. It was very fun to do, but it didn’t sound very big and noble when we started working on it,” he says. “But after we deep-dived into studying hair dyes, we realized that, wow, this is actually not at all a small problem. And it’s one that graphene could really help to solve.”

Northwestern University’s Amanda Morris also wrote a March 15, 2018 news release (it’s repetitive but there are some interesting new details; Note: Links have been removed),

It’s an issue that has plagued the beauty industry for more than a century: Dying hair too often can irreparably damage your silky strands.

Now a Northwestern University team has used materials science to solve this age-old problem. The team has leveraged super material graphene to develop a new hair dye that is less harmful [emphasis mine], non-damaging and lasts through many washes without fading. Graphene’s conductive nature also opens up new opportunities for hair, such as turning it into in situ electrodes or integrating it with wearable electronic devices.

Dying hair might seem simple and ordinary, but it’s actually a sophisticated chemical process. Called the cuticle, the outermost layer of a hair is made of cells that overlap in a scale-like pattern. Commercial dyes work by using harsh chemicals, such as ammonia and bleach, to first pry open the cuticle scales to allow colorant molecules inside and then trigger a reaction inside the hair to produce more color. Not only does this process cause hair to become more fragile, some of the small molecules are also quite toxic.

Huang and his team bypassed harmful chemicals altogether by leveraging the natural geometry of graphene sheets. While current hair dyes use a cocktail of small molecules that work by chemically altering the hair, graphene sheets are soft and flexible, so they wrap around each hair for an even coat. Huang’s ink formula also incorporates edible, non-toxic polymer binders to ensure that the graphene sticks — and lasts through at least 30 washes, which is the commercial requirement for permanent hair dye. An added bonus: graphene is anti-static, so it keeps winter-weather flyaways to a minimum.

“It’s similar to the difference between a wet paper towel and a tennis ball,” Huang explained, comparing the geometry of graphene to that of other black pigment particles, such as carbon black or iron oxide, which can only be used in temporary hair dyes. “The paper towel is going to wrap and stick much better. The ball-like particles are much more easily removed with shampoo.”

This geometry also contributes to why graphene is a safer alternative. Whereas small molecules can easily be inhaled or pass through the skin barrier, graphene is too big to enter the body. “Compared to those small molecules used in current hair dyes, graphene flakes are humongous,” said Huang, who is a member of Northwestern’s International Institute of Nanotechnology.

Ever since graphene — the two-dimensional network of carbon atoms — burst onto the science scene in 2004, the possibilities for the promising material have seemed nearly endless. With its ultra-strong and lightweight structure, graphene has potential for many applications in high-performance electronics, high-strength materials and energy devices. But development of those applications often requires graphene materials to be as structurally perfect as possible in order to achieve extraordinary electrical, mechanical or thermal properties.

The most important graphene property for Huang’s hair dye, however, is simply its color: black. So Huang’s team used graphene oxide, an imperfect version of graphene that is a cheaper, more available oxidized derivative.

“Our hair dye solves a real-world problem without relying on very high-quality graphene, which is not easy to make,” Huang said. “Obviously more work needs to be done, but I feel optimistic about this application.”

Still, future versions of the dye could someday potentially leverage graphene’s notable properties, including its highly conductive nature.

“People could apply this dye to make hair conductive on the surface,” Huang said. “It could then be integrated with wearable electronics or become a conductive probe. We are only limited by our imagination.”

So far, Huang has developed graphene-based hair dyes in multiple shades of brown and black. Next, he plans to experiment with more colors.

Interestingly, the tiny note of caution, “less harmful,” doesn’t appear in the Cell Press news release. Never fear, Dr. Andrew Maynard (Director of the Risk Innovation Lab at Arizona State University) has written a March 20, 2018 essay on The Conversation suggesting a little further investigation (Note: Links have been removed),

Northwestern University’s press release proudly announced, “Graphene finds new application as nontoxic, anti-static hair dye.” The announcement spawned headlines like “Enough with the toxic hair dyes. We could use graphene instead,” and “’Miracle material’ graphene used to create the ultimate hair dye.”

From these headlines, you might be forgiven for getting the idea that the safety of graphene-based hair dyes is a done deal. Yet having studied the potential health and environmental impacts of engineered nanomaterials for more years than I care to remember, I find such overly optimistic pronouncements worrying – especially when they’re not backed up by clear evidence.

Tiny materials, potentially bigger problems

Engineered nanomaterials like graphene and graphene oxide (the particular form used in the dye experiments) aren’t necessarily harmful. But nanomaterials can behave in unusual ways that depend on particle size, shape, chemistry and application. Because of this, researchers have long been cautious about giving them a clean bill of health without first testing them extensively. And while a large body of research to date doesn’t indicate graphene is particularly dangerous, neither does it suggest it’s completely safe.

A quick search of scientific papers over the past few years shows that, since 2004, over 2,000 studies have been published that mention graphene toxicity; nearly 500 were published in 2017 alone.

This growing body of research suggests that if graphene gets into your body or the environment in sufficient quantities, it could cause harm. A 2016 review, for instance, indicated that graphene oxide particles could result in lung damage at high doses (equivalent to around 0.7 grams of inhaled material). Another review published in 2017 suggested that these materials could affect the biology of some plants and algae, as well as invertebrates and vertebrates toward the lower end of the ecological pyramid. The authors of the 2017 study concluded that research “unequivocally confirms that graphene in any of its numerous forms and derivatives must be approached as a potentially hazardous material.”

These studies need to be approached with care, as the precise risks of graphene exposure will depend on how the material is used, how exposure occurs and how much of it is encountered. Yet there’s sufficient evidence to suggest that this substance should be used with caution – especially where there’s a high chance of exposure or that it could be released into the environment.

Unfortunately, graphene-based hair dyes tick both of these boxes. Used in this way, the substance is potentially inhalable (especially with spray-on products) and ingestible through careless use. It’s also almost guaranteed that excess graphene-containing dye will wash down the drain and into the environment.

Undermining other efforts?

I was alerted to just how counterproductive such headlines can be by my colleague Tim Harper, founder of G2O Water Technologies – a company that uses graphene oxide-coated membranes to treat wastewater. Like many companies in this area, G2O has been working to use graphene responsibly by minimizing the amount of graphene that ends up released to the environment.

Yet as Tim pointed out to me, if people are led to believe “that bunging a few grams of graphene down the drain every time you dye your hair is OK, this invalidates all the work we are doing making sure the few nanograms of graphene on our membranes stay put.” Many companies that use nanomaterials are trying to do the right thing, but it’s hard to justify the time and expense of being responsible when someone else’s more cavalier actions undercut your efforts.

Overpromising results and overlooking risk

This is where researchers and their institutions need to move beyond an “economy of promises” that spurs on hyperbole and discourages caution, and think more critically about how their statements may ultimately undermine responsible and beneficial development of a technology. They may even want to consider using guidelines, such as the Principles for Responsible Innovation developed by the organization Society Inside, for instance, to guide what they do and say.

If you have time, I encourage you to read Andrew’s piece in its entirety.

Here’s a link to and a citation for the paper,

Multifunctional Graphene Hair Dye by Chong Luo, Lingye Zhou, Kevin Chiou, and Jiaxing Huang. Chem DOI: https://doi.org/10.1016/j.chempr.2018.02.02 Publication stage: In Press Corrected Proof

This paper appears to be open access.

*Two paragraphs (repetitions) were deleted from the excerpt of Dr. Andrew Maynard’s essay on August 14, 2018

FrogHeart’s good-bye to 2017 and hello to 2018

This is going to be relatively short and sweet(ish). Starting with the 2017 review:

Nano blogosphere and the Canadian blogosphere

From my perspective there’s been a change taking place in the nano blogosphere over the last few years. There are fewer blogs, along with fewer postings from those who still blog. Interestingly, some blogs are becoming more generalized. At the same time, Foresight Institute’s Nanodot blog has expanded its range of topics (as has FrogHeart) to include artificial intelligence and other subjects. Andrew Maynard’s 2020 Science blog now exists in an archived form but, before its demise, it too had started to include other topics, notably risk in its many forms as opposed to risk and nanomaterials. Dexter Johnson’s blog, Nanoclast (on the IEEE [Institute of Electrical and Electronics Engineers] website), maintains its 3x weekly postings. Tim Harper, who often wrote about nanotechnology on his Cientifica blog, appears to have found a more freewheeling approach dominated by his Twitter feed, although he also seems (I can’t confirm that the latest posts were written in 2017) to blog here on timharper.net.

The Canadian science blogosphere seems to be getting quieter, if Science Borealis (a blog aggregator) is any measure. My overall impression is that the bloggers have been a bit quieter this year, with fewer postings on the feed, or perhaps that’s due to some technical issues (sometimes FrogHeart posts do not get onto the feed). On the promising side, Science Borealis teamed with the Science Writers and Communicators of Canada Association to run a contest, “2017 People’s Choice Awards: Canada’s Favourite Science Online!” There were two categories (Favourite Science Blog and Favourite Science Site) and you can find a list of the finalists with links to the winners here.

Big congratulations to the winners: Body of Evidence won Canada’s Favourite Blog 2017 (Dec. 6, 2017 article by Alina Fisher for Science Borealis) and Let’s Talk Science won the Canada’s Favourite Science Online 2017 category, as per this announcement.

However, I can’t help wondering: where were ASAP Science, Acapella Science, Quirks & Quarks, IFLS (I f***ing love science), and others on the list of finalists? I would have thought any of these would have a lock on a position as a finalist. These are Canadian online science purveyors and they are hugely popular, which should mean they’d have no problem getting nominated and getting votes. I can’t find the criteria for nominations (or any hint there will be a 2018 contest), so I imagine their absence from the 2017 finalists list will remain a mystery to me.

Looking forward to 2018, I think that the nano blogosphere will continue with its transformation into a more general science/technology-oriented community. To some extent, I believe this reflects the fact that nanotechnology is being absorbed into the larger science/technology effort as foundational (something wiser folks than me predicted some years ago).

As for Science Borealis and the Canadian science online effort, I’m going to interpret the quieter feeds as a sign of a maturing community. After all, there are always ups and downs in terms of enthusiasm and participation and as I noted earlier the launch of an online contest is promising as is the collaboration with Science Writers and Communicators of Canada.

Canadian science policy

It was a big year.

Canada’s Chief Science Advisor

Canada’s first chief science advisor in many years, Dr. Mona Nemer, stepped into her position in fall 2017. The official announcement was made on Sept. 26, 2017. I covered the event in my Sept. 26, 2017 posting, which includes a few more details than found in the official announcement.

You’ll also find in that Sept. 26, 2017 posting a brief discourse on the Naylor report (also known as the Review of Fundamental Science) and some speculation on why, to my knowledge, there has been no action taken as a consequence.  The Naylor report was released April 10, 2017 and was covered here in a three-part review, published on June 8, 2017,

INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research (Review of fundamental research final report): 1 of 3

INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research (Review of fundamental research final report): 2 of 3

INVESTING IN CANADA’S FUTURE; Strengthening the Foundations of Canadian Research (Review of fundamental research final report): 3 of 3

I have found another commentary (much briefer than mine) by Paul Dufour on the Canadian Science Policy Centre website (November 9, 2017).

Subnational and regional science funding

This began in 2016 with a workshop mentioned in my November 10, 2016 posting: “Council of Canadian Academies and science policy for Alberta.” By the time the report was published, the endeavour had been transformed into: Science Policy: Considerations for Subnational Governments (report here and my June 22, 2017 commentary here).

I don’t know what will come of this, but I imagine scientists will be supportive as it means more money, and they are always looking for more money. Still, the new government in British Columbia has only one ‘science entity’ and I’m not sure it’s still operational, but it was called the Premier’s Technology Council. To my knowledge, there is no ministry or other agency that is focused primarily or partially on science.

Meanwhile, a couple of representatives from the health sciences (neither of whom was involved in the production of the report) seem quite enthused about the prospects for provincial money in their October 27, 2017 opinion piece for the Canadian Science Policy Centre (Bev Holmes, Interim CEO, Michael Smith Foundation for Health Research, British Columbia, and Patrick Odnokon, CEO, Saskatchewan Health Research Foundation).

Artificial intelligence and Canadians

An event which I find more interesting with time was the announcement of the Pan-Canadian Artificial Intelligence Strategy in the 2017 Canadian federal budget. Since then there has been a veritable gold rush mentality with regard to artificial intelligence in Canada, with one announcement after the next about various corporations opening new offices in Toronto or Montréal.

What has really piqued my interest recently is a report being written for Canada’s Treasury Board by Michael Karlin (you can learn more from his Twitter feed, although you may need to scroll down past some of his more personal tweets; there’s something about cassoulet in the Dec. 29, 2017 tweets). As for Karlin’s report, which is a work in progress, you can find out more about the report and Karlin in a December 12, 2017 article by Rob Hunt for the Algorithmic Media Observatory (sponsored by the Social Sciences and Humanities Research Council of Canada [SSHRC], the Centre for the Study of Democratic Citizenship, and the Fonds de recherche du Québec: Société et culture).

You can ring in 2018 by reading and making comments, which could influence the final version, on Karlin’s “Responsible Artificial Intelligence in the Government of Canada,” part of the government’s Digital Disruption White Paper Series.

As for other 2018 news, the Council of Canadian Academies is expected to publish “The State of Science and Technology and Industrial Research and Development in Canada” at some point soon (we hope). This report follows and incorporates two previous ‘states’, The State of Science and Technology in Canada, 2012 (the first of these was a 2006 report) and the 2013 version of The State of Industrial R&D in Canada. There is already some preliminary data for this latest ‘state of’  (you can find a link and commentary in my December 15, 2016 posting).

FrogHeart then (2017) and soon (2018)

On looking back I see that the year started out at quite a clip as I was attempting to hit the 5000th blog posting mark, which I did on March 3, 2017. I have cut back somewhat from the high of 3 postings/day to approximately 1 posting/day. It makes things more manageable, allowing me to focus on other matters.

By the way, you may note that the ‘Donate’ button has disappeared from my sidebar. I thank everyone who donated from the bottom of my heart. The money was more than currency; it also symbolized encouragement. On the sad side, I moved from one hosting service to a new one (Sibername) late in December 2016 and have been experiencing serious bandwidth issues which result in FrogHeart’s disappearance from the web for days at a time. I am trying to resolve the issues and hope that such actions as removing the ‘Donate’ button will help.

I wish my readers all the best for 2018 as we explore nanotechnology and other emerging technologies!

(I apologize for any and all errors. I usually take a little more time to write this end-of-year and coming-year piece but due to bandwidth issues I was unable to access my draft and give it at least one review. And at this point, I’m too tired to try spotting errors. If you see any, please do let me know.)

Job posting: G2O Water Technologies is looking for a PhD level scientist to join a fast-growing and well-funded start-up company developing graphene-based water treatment.

This is the June 6, 2017 G2O Water Technologies notice I received via email,

Senior Application Scientist Vacancy

This is an opportunity for a PhD level scientist to join a fast growing and well funded start up developing graphene based water treatment.

The company has developed coatings for existing filter materials with applications in oil/water separation, waste water treatment, dehydration of organic liquids and desalination, with addressable markets in excess of £2.8Bn.

We have a vacancy for an exceptional individual with an in depth understanding of membranes and 2D materials to join our team as a Senior Application Scientist. This post carries a high degree of responsibility to deliver results, a salary to match, will report directly to the company’s CEO and will be based with the Water@Leeds interdisciplinary group in the University of Leeds.

Key responsibilities include:

* Managing the company’s internal and external research and development activities with both academic and commercial partners;
* To further develop graphene oxide (GO)-based coatings/membranes for highly efficient water purification. This will involve working closely with materials suppliers and end users to understand and deliver the required performance;
* Developing test methodologies to quantify membrane performance;
* Supporting current and future government funded grant work;
* Further developing and strengthening G2O’s IP portfolio.

The successful candidate will be expected to have:

* The ability to design, manage and deliver technology R&D projects;
* Experience in working with academic institutions in an industrial environment;
* An in depth knowledge of formulation of 2D material dispersions;
* A PhD or other suitable academic qualifications to be accepted as a Visiting Fellow by the company’s academic partners.

Before getting to the contact information, a few words about one of the company’s principals, Tim Harper, G2O Chief Executive Officer. I’ve never met him in person but have known him online for many years (we’ve exchanged emails and tweets). He has been an active member of the ‘nano’ blogosphere and social media environment for many years. He has run his own consulting company on emerging technologies, Cientifica (About Us), since 1997, as well as other companies. He’s been involved with the World Economic Forum and has consulted internationally for governments and other entities. That said, there are no guarantees with start-up companies and you do need to perform your own due diligence, as I’m sure Tim Harper would counsel you. One other piece of information before you dash off: the company’s headquarters are in Manchester, where the university boasts it’s the ‘home of graphene’ and houses the National Graphene Institute.

Here are a few places you might want to check:

G2O website

About G2O webpage

Contact us (for more details about the position)

Good Luck!

2016 thoughts and 2017 hopes from FrogHeart

This is the 4900th post on this blog and as FrogHeart moves forward to 5000, I’m thinking there will be some changes although I’m not sure what they’ll be. In the meantime, here are some random thoughts on the year that was in Canadian science and on the FrogHeart blog.

Changeover to Liberal government: year one

Hopes were high after the Trudeau government was elected. Certainly, there seems to have been a loosening where science communication policies have been concerned although it may not have been quite the open and transparent process people dreamed of. On the plus side, it’s been easier to participate in public consultations but there has been no move (perceptible to me) towards open government science or better access to government-funded science papers.

Open Science in Québec

As far as I know, la crème de la crème of open science (internationally) is the Montreal Neurological Institute (Montreal Neuro), affiliated with McGill University. They bookended the year with two announcements. In January 2016, Montreal Neuro announced it was going to be an “Open Science” institution (my Jan. 22, 2016 posting),

The Montreal Neurological Institute (MNI) in Québec, Canada, known informally and widely as Montreal Neuro, has ‘opened’ its science research to the world. David Bruggeman tells the story in a Jan. 21, 2016 posting on his Pasco Phronesis blog (Note: Links have been removed),

The Montreal Neurological Institute (MNI) at McGill University announced that it will be the first academic research institute to become what it calls ‘Open Science.’  As Science is reporting, the MNI will make available all research results and research data at the time of publication.  Additionally it will not seek patents on any of the discoveries made on research at the Institute.

Will this catch on?  I have no idea if this particular combination of open access research data and results with no patents will spread to other university research institutes.  But I do believe that those elements will continue to spread.  More universities and federal agencies are pursuing open access options for research they support.  Elon Musk has opted to not pursue patent litigation for any of Tesla Motors’ patents, and has not pursued patents for SpaceX technology (though it has pursued litigation over patents in rocket technology). …

Then, there’s my Dec. 19, 2016 posting about this Montreal Neuro announcement,

It’s one heck of a Christmas present. Canadian businessman Larry Tanenbaum and his wife Judy have given the Montreal Neurological Institute (Montreal Neuro), which is affiliated with McGill University, a $20M donation. From a Dec. 16, 2016 McGill University news release,

The Prime Minister of Canada, Justin Trudeau, was present today at the Montreal Neurological Institute and Hospital (MNI) for the announcement of an important donation of $20 million by the Larry and Judy Tanenbaum family. This transformative gift will help to establish the Tanenbaum Open Science Institute, a bold initiative that will facilitate the sharing of neuroscience findings worldwide to accelerate the discovery of leading edge therapeutics to treat patients suffering from neurological diseases.

‟Today, we take an important step forward in opening up new horizons in neuroscience research and discovery,” said Mr. Larry Tanenbaum. ‟Our digital world provides for unprecedented opportunities to leverage advances in technology to the benefit of science.  That is what we are celebrating here today: the transformation of research, the removal of barriers, the breaking of silos and, most of all, the courage of researchers to put patients and progress ahead of all other considerations.”

Neuroscience has reached a new frontier, and advances in technology now allow scientists to better understand the brain and all its complexities in ways that were previously deemed impossible. The sharing of research findings amongst scientists is critical, not only due to the sheer scale of data involved, but also because diseases of the brain and the nervous system are amongst the most compelling unmet medical needs of our time.

Neurological diseases, mental illnesses, addictions, and brain and spinal cord injuries directly impact 1 in 3 Canadians, representing approximately 11 million people across the country.

“As internationally-recognized leaders in the field of brain research, we are uniquely placed to deliver on this ambitious initiative and reinforce our reputation as an institution that drives innovation, discovery and advanced patient care,” said Dr. Guy Rouleau, Director of the Montreal Neurological Institute and Hospital and Chair of McGill University’s Department of Neurology and Neurosurgery. “Part of the Tanenbaum family’s donation will be used to incentivize other Canadian researchers and institutions to adopt an Open Science model, thus strengthening the network of like-minded institutes working in this field.”

Chief Science Advisor

Getting back to the federal government, we’re still waiting for a Chief Science Advisor. Should you be interested in the job, apply here. The job search was launched in early Dec. 2016 (see my Dec. 7, 2016 posting for details), a little over a year after the Liberal government was elected. I’m not sure why the process is taking so long. It’s not as if the Canadian government is inventing a position or trailblazing in this regard. Many, many countries and jurisdictions have chief science advisors. Heck, the European Union managed to find its first chief science advisor in considerably less time than we’ve spent on the project. My guess: it just wasn’t a priority.

Prime Minister Trudeau, quantum, nano, and Canada’s 150th birthday

In April 2016, Prime Minister Justin Trudeau stunned many when he was able to answer, in an articulate and informed manner, a question about quantum physics during a press conference at the Perimeter Institute in Waterloo, Ontario (my April 18, 2016 post discussing that incident and the so called ‘quantum valley’ in Ontario).

In Sept. 2016, the University of Waterloo publicized the world’s smallest Canadian flag to celebrate the country’s upcoming 150th birthday and to announce its presence in QUANTUM: The Exhibition (a show which will tour across Canada). Here’s more from my Sept. 20, 2016 posting,

The record-setting flag was unveiled at IQC’s [Institute of Quantum Computing at the University of Waterloo] open house on September 17 [2016], which attracted nearly 1,000 visitors. It will also be on display in QUANTUM: The Exhibition, a Canada 150 Fund Signature Initiative, and part of Innovation150, a consortium of five leading Canadian science-outreach organizations. QUANTUM: The Exhibition is a 4,000-square-foot, interactive, travelling exhibit IQC developed highlighting Canada’s leadership in quantum information science and technology.

“I’m delighted that IQC is celebrating Canadian innovation through QUANTUM: The Exhibition and Innovation150,” said Raymond Laflamme, executive director of IQC. “It’s an opportunity to share the transformative technologies resulting from Canadian research and bring quantum computing to fellow Canadians from coast to coast to coast.”

The first of its kind, the exhibition will open at THEMUSEUM in downtown Kitchener on October 14 [2016], and then travel to science centres across the country throughout 2017.

You can find the English language version of QUANTUM: The Exhibition website here and the French language version of QUANTUM: The Exhibition website here.

There are currently four other venues for the show once it finishes its run in Waterloo. From QUANTUM’S Join the Celebration webpage,

2017

  • Science World at TELUS World of Science, Vancouver
  • TELUS Spark, Calgary
  • Discovery Centre, Halifax
  • Canada Science and Technology Museum, Ottawa

I gather they’re still looking for other venues to host the exhibition. If interested, there’s this: Contact us.

Other than the flag, which is both nanoscale and microscale, they haven’t revealed what else will be included in their 4,000-square-foot exhibit, but it will be “bilingual, accessible, and interactive.” Also, there will be stories.

Hmm. The exhibition is opening in roughly three weeks and they have no details. Strategy or disorganization? Only time will tell.

Calgary and quantum teleportation

This is one of my favourite stories of the year. Scientists at the University of Calgary teleported photons six kilometres from the university to city hall, breaking the teleportation record. What I found particularly interesting was the support for science from Calgary City Hall. Here’s more from my Sept. 21, 2016 post,

Through a collaboration between the University of Calgary, The City of Calgary and researchers in the United States, a group of physicists led by Wolfgang Tittel, professor in the Department of Physics and Astronomy at the University of Calgary have successfully demonstrated teleportation of a photon (an elementary particle of light) over a straight-line distance of six kilometres using The City of Calgary’s fibre optic cable infrastructure. The project began with an Urban Alliance seed grant in 2014.

This accomplishment, which set a new record for distance of transferring a quantum state by teleportation, has landed the researchers a spot in the prestigious Nature Photonics scientific journal. The finding was published back-to-back with a similar demonstration by a group of Chinese researchers.

The research could not be possible without access to the proper technology. One of the critical pieces of infrastructure that support quantum networking is accessible dark fibre. Dark fibre, so named because of its composition — a single optical cable with no electronics or network equipment on the alignment — doesn’t interfere with quantum technology.

The City of Calgary is building and provisioning dark fibre to enable next-generation municipal services today and for the future.

“By opening The City’s dark fibre infrastructure to the private and public sector, non-profit companies, and academia, we help enable the development of projects like quantum encryption and create opportunities for further research, innovation and economic growth in Calgary,” said Tyler Andruschak, project manager with Innovation and Collaboration at The City of Calgary.

As for the science of it (also from my post),

A Sept. 20, 2016 article by Robson Fletcher for CBC (Canadian Broadcasting Corporation) online provides a bit more insight from the lead researcher (Note: A link has been removed),

“What is remarkable about this is that this information transfer happens in what we call a disembodied manner,” said physics professor Wolfgang Tittel, whose team’s work was published this week in the journal Nature Photonics.

“Our transfer happens without any need for an object to move between these two particles.”

A Sept. 20, 2016 University of Calgary news release by Drew Scherban, which originated the news item, provides more insight into the research,

“Such a network will enable secure communication without having to worry about eavesdropping, and allow distant quantum computers to connect,” says Tittel.

Experiment draws on ‘spooky action at a distance’

The experiment is based on the entanglement property of quantum mechanics, also known as “spooky action at a distance” — a property so mysterious that not even Einstein could come to terms with it.

“Being entangled means that the two photons that form an entangled pair have properties that are linked regardless of how far the two are separated,” explains Tittel. “When one of the photons was sent over to City Hall, it remained entangled with the photon that stayed at the University of Calgary.”

Next, the photon whose state was teleported to the university was generated in a third location in Calgary and then also travelled to City Hall where it met the photon that was part of the entangled pair.

“What happened is the instantaneous and disembodied transfer of the photon’s quantum state onto the remaining photon of the entangled pair, which is the one that remained six kilometres away at the university,” says Tittel.

Council of Canadian Academies and The State of Science and Technology and Industrial Research and Development in Canada

Preliminary data was released by the CCA’s expert panel in mid-December 2016. I reviewed that material briefly in my Dec. 15, 2016 post but am eagerly awaiting the full report, due late 2017, when, hopefully, I’ll have the time to critique the material; I hope it will have more surprises and offer greater insights than the preliminary report did.

Colleagues

Thank you to my online colleagues. While we don’t interact much, it’s impossible to overstate how encouraging it is to know that these people continually participate in and help create the nano and/or science blogosphere.

David Bruggeman at his Pasco Phronesis blog keeps me up-to-date on science policy in the US, in Canada, and internationally, as well as keeping me abreast of the performing arts/science scene. Also, kudos to David for raising my (and his audience’s) awareness of just how much science is discussed on late night US television. I don’t know how he does it, but he keeps scooping me on Canadian science policy matters. Thankfully, I’m not bitter and hope he continues to scoop me, which will mean that I will get the information from somewhere since it won’t be from the Canadian government.

Tim Harper of Cientifica Research keeps me on my toes as he keeps shifting his focus. Most lately, it’s been on smart textiles and wearables. You can download his latest White Paper titled, Fashion, Smart Textiles, Wearables and Disappearables, from his website. Tim consults on nanotechnology and other emerging technologies at the international level.

Dexter Johnson of the Nanoclast blog on the IEEE (Institute of Electrical and Electronics Engineers) website consistently provides informed insight into how a particular piece of research fits into the nano scene and often provides historical details that you’re not likely to get from anyone else.

Dr. Andrew Maynard is currently the founding Director of the Risk Innovation Lab at Arizona State University. I know him through his 2020 Science blog where he posts text and videos on many topics including emerging technologies, nanotechnologies, risk, science communication, and much more. Do check out 2020 Science as it is a treasure trove.

2017 hopes and dreams

I hope Canada’s Chief Science Advisor brings some fresh thinking to science in government and that the Council of Canadian Academies’ upcoming assessment on The State of Science and Technology and Industrial Research and Development in Canada is visionary. Also, let’s send up some collective prayers for the Canada Science and Technology Museum which has been closed since 2014 (?) due to black mold (?). It would be lovely to see it open in time for Canada’s 150th anniversary.

I’d like to see the nanotechnology promise come closer to a reality, which benefits as many people as possible.

As for me and FrogHeart, I’m not sure about the future. I do know there’s one more Steep project (I’m working with Raewyn Turner on a multiple project endeavour known as Steep; this project will involve sound and gold nanoparticles).

Should anything sparkling occur to me, I will add it at a future date.

In the meantime, Happy New Year and thank you from the bottom of my heart for reading this blog!

Tempest in a teapot or a sign of things to come? UK’s National Graphene Institute kerfuffle

A scandal-in-the-offing, intellectual property, miffed academics, a chortling businessman, graphene, and much more make this a fascinating story.

Before launching into the main attractions, those unfamiliar with the UK graphene effort might find this background information useful. Graphene was first isolated at the University of Manchester in 2004 by scientists Andre Geim* and Konstantin Novoselov, Russian immigrants, both of whom have since become Nobel laureates and knights of the realm. The excitement in the UK and elsewhere is due to graphene’s extraordinary properties, which could lead to transparent electronics, foldable/bendable electronics, better implants, efficient and inexpensive (they hope) water filters, and more. The UK government has invested a lot of money in graphene, as has the European Union (1B Euros in the Graphene Flagship), in the hope that huge economic benefits will be reaped.

Dexter Johnson’s March 15, 2016 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) provides details about the situation (Note: Links have been removed),

A technology that, a year ago, was being lauded as the “first commercially viable consumer product” using graphene now appears to be caught up in an imbroglio over who owns its intellectual property rights. The resulting controversy has left the research institute behind the technology in a bit of a public relations quagmire.

The venerable UK publication The Sunday Times reported this week on what appeared to be a mutiny occurring at the National Graphene Institute (NGI) located at the University of Manchester. Researchers at the NGI had reportedly stayed away from working at the institute’s gleaming new $71 million research facility over fears that their research was going to end up in the hands of foreign companies, in particular a Taiwan-based company called BGT Materials.

The “first commercially viable consumer product” noted in Dexter’s posting was a graphene-based lightbulb which was announced by the NGI to much loud crowing in March 2015 (see my March 30, 2015 posting). The company producing the lightbulb was announced as “… Graphene Lighting PLC is a spin-out based on a strategic partnership with the National Graphene Institute (NGI) at The University of Manchester to create graphene applications.” There was no mention of BGT.

Dexter describes the situation from the BGT perspective (from his March 15, 2016 posting), Note: Links have been removed,

… BGT did not demur when asked by  the Times whether it owned the technology. In fact, Chung Ping Lai, BGT’s CEO, claimed it was his company that had invented the technology for the light bulb and not the NGI. The Times report further stated that Lai controls all the key patents and claims to be delighted with his joint venture with the university. “I believe in luck and I have had luck in Manchester,” Lai told the Times.

With companies outside the UK holding majority stakes in the companies spun out of the NGI—allowing them to claim ownership of the technologies developed at the institute—one is left to wonder what was the purpose of the £50 million (US $79 million) earmarked for graphene research in the UK more than four years ago? Was it to develop a local economy based around graphene—a “Graphene Valley”, if you will? Or was it to prop up the local construction industry through the building of shiny new buildings that reportedly few people occupy? That’s the charge leveled by Andre Geim, Nobel laureate for his discovery of graphene, and NGI’s shining star. Geim reportedly described the new NGI building as: “Money put in the British building industry rather than science.”

Dexter ends his March 15, 2016 posting with an observation that will seem familiar to Canadians,

Now, it seems the government’s eagerness to invest in graphene research—or at least, the facilities for conducting that research—might have ended up bringing it to the same place as its previous lack of investment: the science is done in the UK and the exploitation of the technology is done elsewhere.

The March 13, 2016 Sunday Times article [ETA on April 3, 2016: This article is now behind a paywall] by Tom Harper, Jon Ungoed-Thomas and Michael Sheridan, which seems to be the source of Dexter’s posting, takes a more partisan approach,

ACADEMICS are boycotting a top research facility after a company linked to China was given access to lucrative confidential material from one of Britain’s greatest scientific breakthroughs.

Some scientists at Manchester University working on graphene, a wonder substance 200 times stronger than steel, refuse to work at the new £61m national institution, set up to find ways to exploit the material, amid concerns over a deal struck between senior university management and BGT Materials.

The academics are concerned that the National Graphene Institute (NGI), which was opened last year by George Osborne, the chancellor, and forms one of the key planks of his “northern powerhouse” industrial strategy, does not have the necessary safeguards to protect their confidential research, which could revolutionise the electronics, energy, health and building industries.

BGT, which is controlled by a Taiwanese businessman, subsequently agreed to work with a Chinese manufacturing company and university to develop similar graphene technology.

BGT says its work in Manchester has been successful and it is “offensive” and “untrue” to suggest that it would unfairly use intellectual property. The university say there is no evidence “whatsoever” of unfair use of confidential information. Manchester says it is understandable that some scientists are cautious about the collaborative environment of the new institute. But one senior academic said the arrangement with BGT had caused the university’s graphene research to descend into “complete anarchy”.

The academic said: “The NGI is a national facility, and why should we use it for a company, which is not even an English [owned] company? How much [intellectual property] is staying in England and how much is going to Taiwan?”

The row highlights concerns that the UK has dawdled in developing one of its greatest discoveries. Nearly 50% of ­graphene-related patents have been filed in China, and just 1% in Britain.

Manchester signed a £5m “research collaboration agreement” with BGT Materials in October 2013. Although the company is controlled by a Taiwanese businessman, Chung-ping Lai, the university does have a 17.5% shareholding.

Manchester claimed that the commercial deal would “attract a significant number of jobs to the city” and “benefit the UK economy”.

However, an investigation by The Sunday Times has established:

Only four jobs have been created as a result of the deal and BGT has not paid the full £5m due under the agreement after two projects were cancelled.

Pictures sent to The Sunday Times by a source at the university last month show that the offices at the NGI [National Graphene Institute], which can accommodate 120 staff, were deserted.

British-based businessmen working with graphene have also told The Sunday Times of their concerns about the institute’s information security. Tim Harper, a Manchester-based graphene entrepreneur, said: “We looked at locating there [at the NGI] but we take intellectual property extremely seriously and it is a problem locating in such a facility.

“If you don’t have control over your computer systems or the keys to your lab, then you’ve got a problem.”

I recommend reading Dexter’s post and the Sunday Times article as they provide some compelling insight into the UK situation vis-à-vis nanotechnology, science, and innovation.

*’Gheim’ corrected to ‘Geim’ on March 30, 2016.

Swinging from 2015 to 2016 with FrogHeart

On Thursday, Dec. 31, 2015, the bear ate me (borrowed from Joan Armatrading’s song “Eating the bear”) or, if you prefer, I had a meltdown when I lost more than half of a post that I’d worked on for hours.

There’s been a problem dogging me for some months. I will write up something and save it as a draft, only to find that most of the text has been replaced by a single URL repeated several times. I have not been able to trace the source of the problem, which is intermittent. (sigh)

Moving on to happier thoughts, it’s a new year. Happy 2016!

As a way of swinging into the new year, here’s a brief wrap up for 2015.

International colleagues

As always, I thank my international colleagues David Bruggeman (Pasco Phronesis blog), Dexter Johnson (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website), and Dr. Andrew Maynard (2020 Science blog and Risk Innovation Laboratory at Arizona State University), all of whom have been blogging as long as or longer than I have (FYI, FrogHeart began in April/May 2008). More importantly, they have been wonderful sources of information and inspiration.

In particular, David, thank you for keeping me up to date on the Canadian and international science policy situations. Also, darn you for scooping me on the Canadian science policy scene, on more than one occasion.

Dexter, thank you for all those tidbits about the science and the business of nanotechnology that you tuck into your curated blog. There’s always a revelation or two to be found in your writings.

Andrew, congratulations on your move to Arizona State University (from the University of Michigan Risk Science Center) where you are founding their Risk Innovation Lab.

While Andrew’s blog has become more focused on the topic of risk, Andrew continues to write about nanotechnology by extending the topic to emerging technologies.

In fact, I have a Dec. 3, 2015 post featuring a recent Nature article by Andrew on the occasion of the upcoming 2016 World Economic Forum in Davos. In it he discusses new approaches to risk occasioned by the rise of emerging technologies such as synthetic biology, nanotechnology, and more.

While Tim Harper, serial entrepreneur and scientist, is not actively blogging about nanotechnology these days, his writings do pop up in various places, notably on the Azonano website where he is listed as an expert, which he most assuredly is. His focus these days is on establishing graphene-based startups.

Moving on to another somewhat related topic. While no one else seems to be writing about nanotechnology as extensively as I do, there are many, many Canadian science bloggers.

Canadian colleagues

Thank you to Gregor Wolbring, ur Canadian science blogger and professor at the University of Calgary. His writing about human enhancement has become increasingly timely as we continue to introduce electronics onto and into our bodies. While he writes regularly, I don’t believe he’s blogging regularly. However, you can find out more about Gregor and his work  at  http://www.crds.org/research/faculty/Gregor_Wolbring2.shtml
or on his facebook page
https://www.facebook.com/GregorWolbring

Science Borealis (scroll down to get to the feeds), a Canadian science blog aggregator, is my main source of information on the Canadian scene. Thank you for my second Editors Pick award. In 2014 the award was in the Science in Society category and in 2015 it’s in the Engineering & Tech category (last item on the list).

While I haven’t yet heard about the results of Paige Jarreau’s and Science Borealis’ joint survey of Canadian science blog readers (the reader doesn’t have to be Canadian but the science blog has to be), I was delighted to be asked and to participate. My Dec. 14, 2015 posting listed preliminary results,

They have compiled some preliminary results:

  • 21 bloggers + Science Borealis hosted the survey.
  • 523 respondents began the survey.
  • 338 respondents entered their email addresses to win a prize
  • 63% of 400 Respondents are not science bloggers
  • 56% of 402 Respondents describe themselves as scientists
  • 76% of 431 Respondents were not familiar with Science Borealis before taking the survey
  • 85% of 403 Respondents often, very often or always seek out science information online.
  • 59% of 402 Respondents rarely or never seek science content that is specifically Canadian
  • Of 400 Respondents, locations were: 35% Canada, 35% US, 30% Other.

And most of all, a heartfelt thank you to all who read this blog.

FrogHeart and 2015

There won’t be any statistics from the software packaged with my hosting service (AWSTATS and Webalizer). Google and its efforts to minimize spam (or so it claims) had a devastating effect on my visit numbers. As I used those numbers as motivation, fantasizing that my readership was increasing, I had to find other means of motivation. I am not quite sure how I did it, but I upped publication to three posts per day (five-day week) throughout most of the year.

With 260 working days (roughly) in a year, that would have meant a total of 780 posts. I’ve rounded that down to 700 posts to allow for days off and days where I didn’t manage three.

In 2015 I logged my 4000th post and substantially contributed to the Science Borealis 2015 output. In the editors’ Dec. 20, 2015 post,

… Science Borealis now boasts a membership of 122 blogs  — about a dozen up from last year. Together, this year, our members have posted over 4,400 posts, with two weeks still to go….

At a rough guess, I’d estimate that FrogHeart was responsible for 15% of the Science Borealis output and 121 bloggers were responsible for the other 85%.
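For anyone curious about the arithmetic behind those figures, here’s a minimal back-of-envelope sketch in Python, purely for illustration; the 260 working days, 3 posts/day, 700-post and 4,400-post figures come from the paragraphs above, and everything else is just rounding:

```python
# Back-of-envelope check of the 2015 posting estimates quoted above.
working_days = 260        # approximate working days in a year (five-day weeks)
posts_per_day = 3         # FrogHeart's target rate for most of 2015
print(working_days * posts_per_day)   # 780 posts if every working day hit the target

frogheart_posts = 700     # rounded down to allow for days off and missed posts
borealis_total = 4400     # combined 2015 output of Science Borealis members
print(f"{frogheart_posts / borealis_total:.0%}")  # ~16%, consistent with the 15% rough guess
```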

That’s enough for 2015.

FrogHeart and 2016

Bluntly, I do not know anything other than a change of some sort is likely.

Hopefully, I will be doing more art/science projects (my last one was ‘A digital poetry of gold nanoparticles’). I was awarded a small grant ($400 CAD) from the Canadian Academy of Independent Scholars (thank you!) for a spoken word project to be accomplished later this year.

As for this blog, I hope to continue.

In closing, I think it’s only fair to share Joan Armatrading’s song, ‘Eating the bear’. May we all do so in 2016,

Bonne Année!

Nanopores and a new technique for desalination

There’s been more than one piece here about water desalination, purification, and/or remediation efforts, and at least one of them claims to have successfully overcome issues, such as reverse osmosis energy needs, which are hampering the adoption of various technologies. Now, researchers at the University of Illinois at Urbana-Champaign have developed another new technique for desalinating water while sidestepping reverse osmosis issues, according to a Nov. 11, 2015 news item on Nanowerk (Note: A link has been removed),

University of Illinois engineers have found an energy-efficient material for removing salt from seawater that could provide a rebuttal to poet Samuel Taylor Coleridge’s lament, “Water, water, every where, nor any drop to drink.”

The material, a nanometer-thick sheet of molybdenum disulfide (MoS2) riddled with tiny holes called nanopores, is specially designed to let high volumes of water through but keep salt and other contaminants out, a process called desalination. In a study published in the journal Nature Communications (“Water desalination with a single-layer MoS2 nanopore”), the Illinois team modeled various thin-film membranes and found that MoS2 showed the greatest efficiency, filtering through up to 70 percent more water than graphene membranes. [emphasis mine]

I’ll get to the professor’s comments about graphene membranes in a minute. Meanwhile, a Nov. 11, 2015 University of Illinois news release (also on EurekAlert), which originated the news item, provides more information about the research,

“Even though we have a lot of water on this planet, there is very little that is drinkable,” said study leader Narayana Aluru, a U. of I. professor of mechanical science and engineering. “If we could find a low-cost, efficient way to purify sea water, we would be making good strides in solving the water crisis.

“Finding materials for efficient desalination has been a big issue, and I think this work lays the foundation for next-generation materials. These materials are efficient in terms of energy usage and fouling, which are issues that have plagued desalination technology for a long time,” said Aluru, who also is affiliated with the Beckman Institute for Advanced Science and Technology at the U. of I.

Most available desalination technologies rely on a process called reverse osmosis to push seawater through a thin plastic membrane to make fresh water. The membrane has holes in it small enough to not let salt or dirt through, but large enough to let water through. They are very good at filtering out salt, but yield only a trickle of fresh water. Although thin to the eye, these membranes are still relatively thick for filtering on the molecular level, so a lot of pressure has to be applied to push the water through.

“Reverse osmosis is a very expensive process,” Aluru said. “It’s very energy intensive. A lot of power is required to do this process, and it’s not very efficient. In addition, the membranes fail because of clogging. So we’d like to make it cheaper and make the membranes more efficient so they don’t fail as often. We also don’t want to have to use a lot of pressure to get a high flow rate of water.”

One way to dramatically increase the water flow is to make the membrane thinner, since the required force is proportional to the membrane thickness. Researchers have been looking at nanometer-thin membranes such as graphene. However, graphene presents its own challenges in the way it interacts with water.

Aluru’s group has previously studied MoS2 nanopores as a platform for DNA sequencing and decided to explore its properties for water desalination. Using the Blue Waters supercomputer at the National Center for Supercomputing Applications at the U. of I., they found that a single-layer sheet of MoS2 outperformed its competitors thanks to a combination of thinness, pore geometry and chemical properties.

A MoS2 molecule has one molybdenum atom sandwiched between two sulfur atoms. A sheet of MoS2, then, has sulfur coating either side with the molybdenum in the center. The researchers found that creating a pore in the sheet that left an exposed ring of molybdenum around the center of the pore created a nozzle-like shape that drew water through the pore.

“MoS2 has inherent advantages in that the molybdenum in the center attracts water, then the sulfur on the other side pushes it away, so we have much higher rate of water going through the pore,” said graduate student Mohammad Heiranian, the first author of the study. “It’s inherent in the chemistry of MoS2 and the geometry of the pore, so we don’t have to functionalize the pore, which is a very complex process with graphene.”

In addition to the chemical properties, the single-layer sheets of MoS2 have the advantages of thinness, requiring much less energy, which in turn dramatically reduces operating costs. MoS2 also is a robust material, so even such a thin sheet is able to withstand the necessary pressures and water volumes.

The Illinois researchers are establishing collaborations to experimentally test MoS2 for water desalination and to test its rate of fouling, or clogging of the pores, a major problem for plastic membranes. MoS2 is a relatively new material, but the researchers believe that manufacturing techniques will improve as its high performance becomes more sought-after for various applications.

“Nanotechnology could play a great role in reducing the cost of desalination plants and making them energy efficient,” said Amir Barati Farimani, who worked on the study as a graduate student at Illinois and is now a postdoctoral fellow at Stanford University. “I’m in California now, and there’s a lot of talk about the drought and how to tackle it. I’m very hopeful that this work can help the designers of desalination plants. This type of thin membrane can increase return on investment because they are much more energy efficient.”
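
The release’s point that the required driving force scales with membrane thickness is easy to illustrate with a rough sketch of my own (the thickness values below are illustrative assumptions, not figures from the study):

```python
# Illustrative scaling only: driving pressure assumed proportional to membrane
# thickness, as described in the news release. Thickness values are assumptions.
polymer_active_layer_nm = 100.0   # order of magnitude for a conventional RO active layer
mos2_monolayer_nm = 0.65          # approximate thickness of single-layer MoS2

reduction_factor = polymer_active_layer_nm / mos2_monolayer_nm
print(f"Idealized pressure reduction: ~{reduction_factor:.0f}x")   # ~154x
# In practice, pore density, fouling and the mechanical support all matter,
# which is why the team ran detailed simulations on Blue Waters.
```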

Here’s a link to and a citation for the paper,

Water desalination with a single-layer MoS2 nanopore by Mohammad Heiranian, Amir Barati Farimani, & Narayana R. Aluru. Nature Communications 6, Article number: 8616 doi:10.1038/ncomms9616 Published 14 October 2015

Graphene membranes

In a July 13, 2015 essay on Nanotechnology Now, Tim Harper provides an overview of the research into using graphene for water desalination and purification/remediation, about which he is quite hopeful. There is no mention of an issue with interactions between water and graphene. It should be noted that Tim Harper is the Chief Executive Officer of G2O Water, a company which produces a graphene-based solution (graphene oxide sheets) that can desalinate water and can purify/remediate it. Tim is a scientist, and while you might have some hesitation given his financial interests, his essay is worth reading as he supplies context and explanations of the science.

Graphene and water (G2O Water commentary)

Tim Harper, Chief Executive Officer (CEO) of G2O Water, published a July 13, 2015 commentary on Nanotechnology Now. Harper, a longtime figure in the nanotechnology community (formerly CEO of Cientifica, an emerging technologies consultancy) and a current member of the World Economic Forum, not unexpectedly focused on water,

In the 2015 World Economic Forum’s Global Risks Report survey participants ranked Water Crises as the biggest of all risks, higher than Weapons of Mass Destruction, Interstate Conflict and the Spread of Infectious Diseases (pandemics). Our dependence on the availability of fresh water is well documented, and the United Nations World Water Development Report 2015 highlights a 40% global shortfall between forecast water demand and available supply within the next fifteen years. Agriculture accounts for much of the demand, up to 90% in most of the world’s least-developed countries, and there is a clear relationship between water availability, health, food production and the potential for civil unrest or interstate conflict.

The looming crisis is not limited to water for drinking or agriculture. Heavy metals from urban pollution are finding their way into the aquatic ecosystem, as are drug residues and nitrates from fertilizer use that can result in massive algal blooms. To date, there has been little to stop this accretion of pollutants and in closed systems such as lakes these pollutants are being concentrated with unknown long term effects.

Ten years ago, following discussions with former Israeli Prime Minister Shimon Peres, I organised a conference in Amsterdam called Nanowater to look at how nanotechnology could address global water issues. [emphasis mine] While the meeting raised many interesting points, and many companies proposed potential solutions, there was little subsequent progress.

Rather than a simple mix of one or two contaminants, most real world water can contain hundreds of different materials, and pollutants like heavy metals may be in the form of metal ions that can be removed, but are equally likely to be bound to other larger pieces of organic matter which cannot be simply filtered through nanopores. In fact the biggest obstacle to using nanotechnology in water treatment is the simple fact that small holes are easily blocked, and susceptibility to fouling means that most nanopore membranes quickly become barriers instead of filters.

Fortunately some recent developments in the ‘wonder material’ graphene may change the economics of water. One of the major challenges in the commercialisation of graphene is the ability to create large areas of defect-free material that would be suitable for displays or electronics, and this is a major research topic in Europe where the European Commission is funding graphene research to the tune of a billion euros. …

Tim goes on to describe some graphene-based solutions, including a technology developed at the University of South Carolina, which is also mentioned in a July 16, 2015 G2O Water press release,

Fouling of nano/ultrafiltration membranes in oil/water separation is a longstanding issue and a major economic barrier for their widespread adoption. Currently membranes typically show severe fouling, resulting from the strong adhesion of oil on the membrane surface and/or oil penetration inside the membranes. This greatly degrades their performance and shortens service lifetime as well as increasing the energy usage.

G2O™s bio inspired approach uses graphene oxide (GO) for the fabrication of fully-recoverable membranes for high flux, antifouling oil/water separation via functional and structural mimicking of fish scales. The ultra-thin, amphiphilic, water-locking GO coating mimics the thin mucus layer covering fish scales, while the combination of corrugated GO flakes and intrinsic roughness of the porous supports successfully reproduces the hierarchical roughness of fish scales. Cyclic membrane performance evaluation tests revealed circa 100% membrane recovery by facile surface water flushing, establishing their excellent easy-to-recover capability.

The pore sizes can be tuned to specific applications such as water desalination, oil/water separation, storm water treatment and industrial waste water recovery. By varying the GO concentration in water, GO membranes with different thickness can be easily fabricated via a one-time filtration process.
G2O™s patented graphene oxide technology acts as a functional coating for modifying the surface properties of existing filter media resulting in:
Higher pure water flux;
High fouling resistance;
Excellent mechanical strength;
High chemical stability;
Good thermal stability;
Low cost.
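
The last point in the release, that membrane thickness can be tuned simply by varying the GO concentration, follows from mass balance in a filtration process. Here’s a rough sketch of my own with hypothetical numbers (none of these values come from G2O):

```python
# Hypothetical illustration of thickness control in a one-time filtration process.
# All values are assumptions for the sake of the example, not G2O data.
go_concentration_mg_per_ml = 0.01    # GO concentration in the suspension
volume_filtered_ml = 50.0            # volume pushed through the filter
filter_area_cm2 = 10.0               # active filter area
go_film_density_g_per_cm3 = 1.8      # approximate density of a dried GO film

deposited_mass_g = go_concentration_mg_per_ml * volume_filtered_ml / 1000.0
thickness_cm = deposited_mass_g / (go_film_density_g_per_cm3 * filter_area_cm2)
print(f"Estimated coating thickness: ~{thickness_cm * 1e7:.0f} nm")   # ~280 nm
# Doubling the GO concentration (or the filtered volume) doubles the deposited
# mass and hence the coating thickness, which is the tuning knob described above.
```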

We’re going through a water shortage here in Vancouver, Canada, after a long spring season that distinguished itself with a lack of rain and a heatwave extending into summer. It is by no means equivalent to the situation in many parts of the world, but it does give even those of us who are usually waterlogged some insight into what it means when there isn’t enough water.

For more insight into water crises with a special focus on the Middle East (notice Harper mentioned Israel’s former Prime Minister Shimon Peres in his commentary), I have a Feb. 24, 2014 posting (Water desalination to be researched at Oman’s newly opened Nanotechnology Laboratory at Sultan Qaboos University) and a June 25, 2013 post (Nanotechnology-enabled water resource collaboration between Israel and Chicago).

You can check out the World Economic Forum’s Outlook on the Global Agenda 2015 here.

The Outlook on the Global Agenda 2015 features an analysis of the Top 10 trends which will preoccupy our experts for the next 12-18 months as well as the key challenges facing the world’s regions, an overview of global leadership and governance, and the emerging issues that will define our future.

G2O Water can be found here.

Europe’s search for raw materials and hopes for nanotechnology-enabled solutions

A Feb. 27, 2015 news item on Nanowerk highlights the concerns over the availability of raw materials and European efforts to address those concerns,

‘Critical raw materials’ are crucial to many European industries but they are vulnerable to scarcity and supply disruption. As such, it is vital that Europe develops strategies for meeting the demand for raw materials. One such strategy is finding methods or substances that can replace the raw materials that we currently use. With this in mind, four EU projects working on substitution in catalysis, electronics and photonics presented their work at the Third Innovation Network Workshop on substitution of Critical Raw Materials hosted by the CRM_INNONET project in Brussels earlier this month [February 2015].

A Feb. 26, 2015 CORDIS press release, which originated the news item, goes on to describe four European Union projects working on nanotechnology-enabled solutions,

NOVACAM

NOVACAM, a coordinated Japan-EU project, aims to develop catalysts using non-critical elements designed to unlock the potential of biomass into a viable energy and chemical feedstock source.

The project is using a ‘catalyst by design’ approach for the development of next generation catalysts (nanoscale inorganic catalysts), as NOVACAM project coordinator Prof. Emiel Hensen from Eindhoven University of Technology in the Netherlands explained. Launched in September 2013, the project is developing catalysts which incorporate non-critical metals to catalyse the conversion of lignocellulose into industrial chemical feedstocks and bio-fuels. The first part of the project has been to develop the principle chemistry while the second part is to demonstrate proof of process. Prof. Hensen predicts that perhaps only two of three concepts will survive to this phase.

The project has already made significant progress in glucose and ethanol conversion, according to Prof. Hensen, and has produced some important scientific publications. The consortium is working with an industrial advisory board comprising Shell in the EU and Nippon Shokubai in Japan.

FREECATS

The FREECATS project, presented by project coordinator Prof. Magnus Rønning from the Norwegian University of Science and Technology, has been working over the past three years to develop new metal-free catalysts. These would be either in the form of bulk nanomaterials or in hierarchically organised structures – both of which would be capable of replacing traditional noble metal-based catalysts in catalytic transformations of strategic importance.

Prof. Magnus Rønning explained that the application of the new materials could eliminate the need for the use of platinum group metals (PGM) and rare earth metals – in both cases Europe is very reliant on other countries for these materials. Over the course of its research, FREECATS targeted three areas in particular – fuel cells, the production of light olefins and water and wastewater purification.

By working to replace the platinum in fuel cells, the project is supporting the EU’s aim of replacing the internal combustion engine by 2050. However, as Prof. Rønning noted, while platinum has been optimized for use over several decades, the materials FREECATS are using are new and thus come with their own challenges, which the project is addressing.

HARFIR

Prof. Atsufumi Hirohata of the University of York in the United Kingdom, project coordinator of HARFIR, described how the project aims to discover an antiferromagnetic alloy that does not contain the rare metal Iridium. Iridium is becoming more and more widely used in numerous spin electronic storage devices, including read heads in hard disk drives. The world supply depends on Platinum ore that comes mainly from South Africa. The situation is much worse than for other rare earth elements as the price has been shooting up over recent years, according to Prof. Hirohata.

The HARFIR team, divided between Europe and Japan, aims to replace Iridium alloys with Heusler alloys. The EU team, led by Prof. Hirohata, has been working on the preparation of polycrystalline and epitaxial thin films of Heusler Alloys, with the material design led by theoretical calculations. The Japanese team, led by Prof. Koki Takanashi at Tohoku University, is meanwhile working on the preparation of epitaxial thin films, measurements of fundamental properties and structural/magnetic characterisation by neutron and synchrotron x-ray beams.

One of the biggest challenges has been that Heusler alloys have a relatively complicated atomic structure. In terms of HARFIR’s work, if there is any atomic disordering at the edges of nanopillar devices, the magnetic properties that are needed are lost. The team is exploring solutions to this challenge.

IRENA

Prof. Esko Kauppinen of Aalto University in Finland closed off the first session of the morning with his presentation of the IRENA project. Launched in September 2013, the project will run until mid 2017, working towards the aim of developing high-performance materials, specifically metallic and semiconducting single-walled carbon nanotube (SWCNT) thin films, to completely eliminate the use of critical metals in electronic devices. The ultimate aim is to replace Indium in transparent conducting films, and Indium and Gallium as a semiconductor in thin film field effect transistors (TFTs).

The IRENA team is developing an alternative that is flexible, transparent and stretchable so that it can meet the demands of the electronics of the future – including the possibility to print electronics.

IRENA involves three partners from Europe and three from Japan. The team has expertise in nanotube synthesis, thin film manufacturing and flexible device manufacturing, modelling of nanotube growth and thin film charge transport processes, and the project has benefitted from exchanges of team members between institutions. One of the key achievements so far is that the project has succeeded in using a nanotube thin film, for the first time, as both the electrode and the hole blocking layer in an organic solar cell.

You’ll note that Japan is a partner in all of these projects. In all probability, these initiatives have something to do with rare earths, which are used in much of today’s electronics technology and which Japan sorely lacks. China, by comparison, has dominated the rare earths export industry; here’s an excerpt from my Nov. 1, 2013 posting where I outline the situation (which I suspect hasn’t changed much since),

As for the short supply mentioned in the first line of the news item, the world’s largest exporter of rare earth elements at 90% of the market, China, recently announced a cap, according to a Sept. 6, 2013 article by David Stanway for Reuters. The Chinese government appears to be curtailing exports as part of an ongoing, multi-year strategy. Here’s how Cientifica’s (an emerging technologies consultancy, etc.) white paper about critical materials, Simply No Substitute?, published in 2012 (?), described the situation,

Despite their name, REE are not that rare in the Earth’s crust. What has happened in the past decade is that REE exports from China undercut prices elsewhere, leading to the closure of mines such as the Mountain Pass REE mine in California. Once China had acquired a dominant market position, prices began to rise. But this situation will likely ease. The US will probably begin REE production from the Mountain Pass mine later in 2012, and mines in other countries are expected to start operation soon as well.

Nevertheless, owing to their broad range of uses REE will continue to exert pressures on their supply – especially for countries without notable REE deposits. This highlights two aspects of importance for strategic materials: actual rarity and strategic supply issues such as these seen for REE. Although strategic and diplomatic supply issues may have easier solutions, their consideration for manufacturing industries will almost be the same – a shortage of crucial supply lines.

Furthermore, as the example of REE shows, the identification of long-term supply problems can often be difficult, and not every government has the same strategic foresight that the Chinese demonstrated. And as new technologies emerge, new elements may see an unexpected, sudden demand in supply. (pp. 16-17)

Meanwhile, in response to China’s decision to cap its 2013 REE exports, the Russian government announced a $1B investment to 2018 in rare earth production, according to a Sept. 10, 2013 article by Polina Devitt for Reuters.

I’m not sure you’ll be able to access Tim Harper’s white paper, as he is now an independent, serial entrepreneur. I most recently mentioned him in relation to his articles (on Azonano) about the nanotechnology scene in a Feb. 12, 2015 posting, where you’ll also find contact details for him.

The business of nano; the business of graphene

There are a couple of recent columns by Tim Harper, a well-known and longstanding figure in the ‘nano community’, about business predictions and the nanotechnology and graphene markets, respectively, that I want to feature here. (See my July 15, 2011 interview with Harper about his report on global funding of nanotechnology for a description of him, his then-business, Cientifica, and his perspective on the nanotechnology enterprise at that time.)

One of Harper’s most recent pieces, a Jan. 2, 2015 column on Azonano, is a look back at business predictions for nanotechnology (Note: Links have been removed). What makes this particularly interesting is that Tim was part of the UK effort in its earliest days and has consulted with governments (including Canada) on their nanotechnology and commercialization efforts,

One of the most widely repeated predictions for nanotechnologies was its role in the creation of a trillion dollar industry by 2015, predicted by Mike Roco [one of the moving forces behind the US National Nanotechnology Initiative enacted in 2000] and his colleagues at the National Science Foundation.2

Looking back at the original National Nanotechnology Initiative forecasts, the biggest economic contributions of nanotechnology came from materials ($340bn), electronics ($300bn), pharmaceuticals ($180bn), chemicals ($100bn), transportation ($70bn) and sustainability ($100bn).

But as is often the case with headline numbers, these were not the product of a huge data collection exercise, but estimates based on a few reports and private communications (see below).

….

The large numbers caused some debate at the time as to whether it was the value of the nanotechnology, or the value of the product, that should be used. One oft-cited example was that in some analyses, the addition of a nanotech-based anti-scratch paint to an automobile would result in the entire value of the car being added to the “nanotechnology market’ column, while in others it would be just the value of the nanoparticles used.

My preference at the time was to use the value of something that would not have existed without the nanotechnology; the automobile clearly would have done, but the anti-scratch paint would not.

While market numbers are always speculative I can still point to one prediction I got right: “there is not, and never will be, a nanotechnology industry”.3

Fifteen years on from the inception of the National Nanotechnology Initiative, there’s not much to carp about. Nanotechnology research is well funded globally, and leading to exactly the kind of breakthroughs that were envisaged back in the late 90’s. As nobody managed to predict the iPhone, Twitter or Facebook, that is remarkable.

The greatest legacy of the mythical “trillion dollar market” was the fear of missing out (or even of allowing the US to dominate), and that was sufficient to spur many similar efforts in other countries. This, combined with widespread adoption of the Internet, made nanotechnology the first truly global scientific revolution.
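
Incidentally, the sector forecasts Harper quotes do add up to roughly the headline figure:

```python
# Summing the original NNI sector forecasts quoted above (in billions of dollars).
forecasts_bn = {
    "materials": 340,
    "electronics": 300,
    "pharmaceuticals": 180,
    "chemicals": 100,
    "transportation": 70,
    "sustainability": 100,
}
total_bn = sum(forecasts_bn.values())
print(f"Total: ${total_bn}bn")   # $1090bn, i.e. roughly the 'trillion dollar' headline
```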

For anyone who likes to research, Tim provides a list of references used to support his contentions.

He then writes a Feb. 4, 2015 column on Azonano about graphene, which provides an interesting contrast (Note: Links have been removed),

The discussion of the trillion dollar market for nanotechnologies has generated quite a bit of interest and discussion. Anyone who remembers nanotechnology a decade ago will notice that graphene is going through a similar period of hype.

The one thing missing from all the discussion of graphene is any inflated market numbers. In fact, compared to the frenzied overhyping of nanotechnologies, the estimates for graphene markets tend to be conservative in the extreme.

A rash of recent market estimates towards the end of last year put the international market for graphene in the range of a few hundred million dollars. That’s pretty much the same amount as has been raised by or invested in graphene producers around the world, and investing $150 million to unlock a market worth $150 million doesn’t seem to make very much sense to me. So are graphene producers completely wrong, or are the market estimates wildly inaccurate?

Confusingly, it appears that everybody is right. It just happens that we are talking about different kinds of graphene at different points in the value chain.

… Some have bought pure graphene to play with themselves, but in reality industry wants to buy inks, dispersions and master batches, rather than have the hassle of taking a bag of black powder and adapting it for applications which may be rather ill-defined at this point. Providing those ready-to-use products is what will unlock the market for graphene.

This turns out to be rather good news for graphene producers, because in general an ink containing perhaps a 20% loading of graphene nano platelets (GNPs) can represent a 5000% markup over the cost of the raw material. A rather simplistic extrapolation from this suggests a $1 billion graphene intermediates market within five years.

And it gets better. Some of the GNPs show good potential as a carbon black substitute – a 2% loading of GNP could perform at least as well as a 20% loading of carbon black. Even if the GNP price is 7-8 times higher than carbon black, there is still a significant margin for the end user to play with.

Woohoo! Now that’s something I can probably talk to investors about without being shown the door after my second PowerPoint slide. And when the inevitable comment, “you predict a market of a billion while these guys say 100 million,” comes up, I’ll have a snappy comeback.
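
Harper’s carbon black comparison is easy to sanity-check. The loadings and the 7-8 times price ratio are from his column; the baseline carbon black price below is my own placeholder, purely for illustration:

```python
# Rough check of the GNP-versus-carbon-black economics described above.
# The carbon black price is a placeholder; the loadings and the price ratio
# are the figures Harper cites.
compound_mass_kg = 100.0
carbon_black_price_per_kg = 1.0                       # placeholder baseline price
gnp_price_per_kg = 7.5 * carbon_black_price_per_kg    # "7-8 times higher"

carbon_black_cost = 0.20 * compound_mass_kg * carbon_black_price_per_kg  # 20% loading
gnp_cost = 0.02 * compound_mass_kg * gnp_price_per_kg                    # 2% loading

print(f"Filler cost, carbon black: {carbon_black_cost:.2f}")  # 20.00
print(f"Filler cost, GNP:          {gnp_cost:.2f}")           # 15.00
# Even at a much higher per-kilogram price, the lower loading leaves the end
# user with a margin -- the point Harper makes above.
```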

There’s more information about Tim and there are more posts on his website, timharper.net. While he does offer three different links to additional biographical information on timharper.net, I have a particular affection for his Visualize.me bio page.