Tag Archives: Korea

Courtesy of graphene: world’s thinnest light bulb

Columbia University’s (US) School of Engineering and Applied Science is trumpeting an achievement with graphene, i.e., the world’s thinnest light bulb. From a June 15, 2015 Columbia Engineering news release (also on EurekAlert),

Led by Young Duck Kim, a postdoctoral research scientist in James Hone’s group at Columbia Engineering, a team of scientists from Columbia, Seoul National University (SNU), and Korea Research Institute of Standards and Science (KRISS) reported today that they have demonstrated — for the first time — an on-chip visible light source using graphene, an atomically thin and perfectly crystalline form of carbon, as a filament. They attached small strips of graphene to metal electrodes, suspended the strips above the substrate, and passed a current through the filaments to cause them to heat up.

“We’ve created what is essentially the world’s thinnest light bulb,” says Hone, Wang Fon-Jen Professor of Mechanical Engineering at Columbia Engineering and coauthor of the study. “This new type of ‘broadband’ light emitter can be integrated into chips and will pave the way towards the realization of atomically thin, flexible, and transparent displays, and graphene-based on-chip optical communications.”

The news release goes on to describe some of the issues associated with generating light on a chip and how the researchers approached the problems (quick answer: they used graphene as the filament),

Creating light in small structures on the surface of a chip is crucial for developing fully integrated “photonic” circuits that do with light what is now done with electric currents in semiconductor integrated circuits. Researchers have developed many approaches to do this, but have not yet been able to put the oldest and simplest artificial light source—the incandescent light bulb—onto a chip. This is primarily because light bulb filaments must be extremely hot—thousands of degrees Celsius—in order to glow in the visible range and micro-scale metal wires cannot withstand such temperatures. In addition, heat transfer from the hot filament to its surroundings is extremely efficient at the microscale, making such structures impractical and leading to damage of the surrounding chip.

By measuring the spectrum of the light emitted from the graphene, the team was able to show that the graphene was reaching temperatures of above 2500 degrees Celsius, hot enough to glow brightly. “The visible light from atomically thin graphene is so intense that it is visible even to the naked eye, without any additional magnification,” explains Kim, first and co-lead author on the paper.
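
(A quick back-of-the-envelope check, not from the paper: treating the filament as an ideal blackbody, Wien’s displacement law puts the emission peak of a roughly 2500 °C object in the near infrared, with only a modest but visible tail in the 380 to 750 nm band, which is why a filament has to get this hot before the naked eye can see it glow. Here is a minimal Python sketch of that arithmetic; real graphene is not an ideal blackbody, so the numbers are indicative only.)

    import numpy as np

    h = 6.626e-34   # Planck constant, J*s
    c = 2.998e8     # speed of light, m/s
    kB = 1.381e-23  # Boltzmann constant, J/K

    def planck(wavelength_m, T):
        """Spectral radiance of an ideal blackbody (W per m^3 per steradian)."""
        return (2.0 * h * c**2 / wavelength_m**5) / np.expm1(h * c / (wavelength_m * kB * T))

    T = 2500 + 273.15                          # reported filament temperature, in kelvin
    peak_nm = 2.898e-3 / T * 1e9               # Wien's displacement law
    visible = np.linspace(380e-9, 750e-9, 500)
    visible_radiance = np.sum(planck(visible, T)) * (visible[1] - visible[0])
    total_radiance = 5.670e-8 * T**4 / np.pi   # Stefan-Boltzmann law, per steradian

    print(f"Emission peak: ~{peak_nm:.0f} nm (near infrared)")
    print(f"Fraction of thermal radiance in the visible band: {visible_radiance / total_radiance:.1%}")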

Interestingly, the spectrum of the emitted light showed peaks at specific wavelengths, which the team discovered was due to interference between the light emitted directly from the graphene and light reflecting off the silicon substrate and passing back through the graphene. Kim notes, “This is only possible because graphene is transparent, unlike any conventional filament, and allows us to tune the emission spectrum by changing the distance to the substrate.”
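
(The tunable peaks can be illustrated with a toy model: light leaving the graphene directly interferes with light that makes a round trip down to the substrate and back, so wavelengths that satisfy the round-trip phase condition are enhanced. The sketch below is mine, assuming an idealized, perfectly smooth reflector and a made-up graphene-to-substrate gap of 900 nm purely for illustration; the real device geometry and the wavelength-dependent reflectance of silicon are more involved.)

    import numpy as np

    d = 900e-9                # assumed graphene-to-substrate gap (illustrative value only)
    r = 0.9                   # assumed reflection amplitude of the substrate
    phase_on_reflect = np.pi  # phase flip on reflection from a denser medium

    wavelengths = np.linspace(400e-9, 800e-9, 2000)
    # Direct wave plus a wave reflected after a 2*d round trip below the graphene
    spectrum_factor = np.abs(1 + r * np.exp(1j * (4 * np.pi * d / wavelengths + phase_on_reflect)))**2

    # Local maxima of the interference factor correspond to enhanced emission wavelengths
    inner = spectrum_factor[1:-1]
    is_peak = (inner > spectrum_factor[:-2]) & (inner > spectrum_factor[2:])
    print("Enhanced wavelengths (nm):", np.round(wavelengths[1:-1][is_peak] * 1e9))
    # Changing d shifts these peaks, which is the tuning effect Kim describes.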

The ability of graphene to achieve such high temperatures without melting the substrate or the metal electrodes is due to another interesting property: as it heats up, graphene becomes a much poorer conductor of heat. This means that the high temperatures stay confined to a small “hot spot” in the center.

“At the highest temperatures, the electron temperature is much higher than that of acoustic vibrational modes of the graphene lattice, so that less energy is needed to attain temperatures needed for visible light emission,” Myung-Ho Bae, a senior researcher at KRISS and co-lead author, observes. “These unique thermal properties allow us to heat the suspended graphene up to half of the temperature of the sun, and improve efficiency 1000 times, as compared to graphene on a solid substrate.”

The team also demonstrated the scalability of their technique by realizing large-scale arrays of chemical-vapor-deposited (CVD) graphene light emitters.

Yun Daniel Park, professor in the Department of Physics and Astronomy at Seoul National University and co-lead author, notes that they are working with the same material that Thomas Edison used when he invented the incandescent light bulb: “Edison originally used carbon as a filament for his light bulb and here we are going back to the same element, but using it in its pure form—graphene—and at its ultimate size limit—one atom thick.”

The group is currently working to further characterize the performance of these devices—for example, how fast they can be turned on and off to create “bits” for optical communications—and to develop techniques for integrating them into flexible substrates.

Hone adds, “We are just starting to dream about other uses for these structures—for example, as micro-hotplates that can be heated to thousands of degrees in a fraction of a second to study high-temperature chemical reactions or catalysis.”

Here’s a link to and a citation for the paper,

Bright visible light emission from graphene by Young Duck Kim, Hakseong Kim, Yujin Cho, Ji Hoon Ryoo, Cheol-Hwan Park, Pilkwang Kim, Yong Seung Kim, Sunwoo Lee, Yilei Li, Seung-Nam Park, Yong Shim Yoo, Duhee Yoon, Vincent E. Dorgan, Eric Pop, Tony F. Heinz, James Hone, Seung-Hyun Chun, Hyeonsik Cheong, Sang Wook Lee, Myung-Ho Bae, & Yun Daniel Park. Nature Nanotechnology (2015) doi:10.1038/nnano.2015.118 Published online 15 June 2015

This paper is behind a paywall.

Two final notes: there was an announcement earlier this year (mentioned in my March 30, 2015 post) that a graphene light bulb would be in stores this year. Dexter Johnson notes in his June 15, 2015 post (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website) that the earlier light bulb has a graphene coating. You may want to check out Dexter’s posting about the latest light bulb achievement as he also includes an embedded video illustrating how Columbia Engineering’s graphene filament works.

Nanotech and the oil and gas industry: a webinar

How serendipitous! I stumbled on an announcement from Park Systems for a webinar designed for the oil and gas industry after my June 8, 2015 post featuring Abakan and its new Alberta (Canada)-based cladding facility designed for oil and gas pipes in particular. From a June 8, 2015 news item on Nanowerk,

Park Systems, a world leader in atomic force microscopy (AFM), today announced a webinar on next generation technology to improve oil and gas production in both traditional drilling and hydraulic fracturing, aimed at oil & gas producers and equipment manufacturers as they continue to pursue the latest developments in production efficiencies.

A June 8, 2015 Park Systems news release, which originated the news item, expands on the theme,

The oil and gas industry is ripe for innovation and the cost of extracting oil can be reduced. Research at the PETRO Case Consortium is uncovering new materials, chemicals, and coatings that improve yield and reduce costs, with an eye towards diminishing the impact on our environment. This webinar is part of an ongoing series offered by Park Systems’ new Nano Academy, a platform for providing education and shared knowledge on the latest advancements across a wide spectrum of nanosciences.

This webinar, titled Nanostructured Polymers and Nanomaterials for Oil & Gas, will be given June 11 [2015] by Dr. Rigoberto Advincula, Director of the PETRO Case Consortium and Professor with the Department of Macromolecular Science and Engineering at Case Western Reserve University, and is designed to offer innovations in microscopy and nanotechnology for oil & gas producers and suppliers.

“Our best in class AFM equipment registers nanoparticle observations and analysis not previously available that extends the ability to analyze chemicals and materials to develop the optimum efficiency,” said Keibock Lee, President of Park Systems. “We are proud to offer this webinar for the oil & gas industry, showcasing Dr. Advincula’s outstanding contribution towards cost reduction and sustainability for the current energy producers and paving the way for future innovations that can enable global energy solutions.”

The PETRO Case Consortium at Case Western [Reserve] University, led by Dr. Advincula, is working hard to ensure that the industry can catch up with new technology and apply it to oil & gas production, improving productivity by creating longer-lasting concrete and coatings and applying other methods to increase yield in production. This webinar is the first of a series that will cover multiple topics related to nanoscale developments across a wide variety of research applications and bioscientific fields.

“Hydraulic fracturing and directional drilling have unlocked many resources,” states Dr. Advincula. “Revolutionary new microscopy technology provided through Park Systems AFM (atomic force microscopy) and new innovations in chemical and material research indicate that there is a defined opportunity to use the advances in chemistry, materials, and nanoscience to make valuable industry process updates.”

For the last 10 years there has been an increase in interest and research in new materials useful for upstream, midstream, and downstream processes that must function in demanding environments, including directional drilling and hydraulic fracturing. High-pressure, high-temperature (HPHT) and brine conditions pose a challenge for emulsification, demulsification, and viscosity of drilling fluids. Historically, “easy” or conventional oil has allowed technologies, even those dating back to the first oil well in Pennsylvania, to remain very profitable. But with HPHT conditions in the most challenging wells, many of the established technologies and materials do not suffice.

The discovery-driven group, the PETRO Case Consortium at Case Western Reserve University, a Park AFM user, investigates the molecular, macromolecular, and supramolecular synthesis and structure of polymers and nanomaterials capable of controlled assembly into ultrathin films and dispersions, with the aim of finding new technologies and materials that improve on and replace established oil and gas field formations.

For instance, evaluating chemicals and changing or altering their formulas can greatly improve production yields. Chemicals used in the field include inhibitors for scaling, fouling, corrosion, asphaltene control, formation damage, and differential pressures in multiphase environments, needs that will be met by new synthesis methods including metathesis reactions, bio-based feedstocks, new polymer surfactants, living polymers, and nanoparticles. Other uses of new chemical technologies include tracers and reporters for geomapping and well connectivity, different types of fluid-loss agents that prevent formation damage or maintain well integrity, and smart, stimuli-responsive nanoparticles that can be used to improve gelation.

This webinar is available at no cost and is part of Park Systems Nano Academy which will offer valuable education and shared knowledge across many Nano Science Disciplines and Industries as a way to further enable NanoScale advancements. To register go to: http://bit.do/polyoilgas

Webinar logistics (from the Park Systems news release),

About Webinar
Title: Nanostructured Polymers and Nanomaterials for Oil & Gas
Date: June 11, 2015
Time: 9am PST
To Register, go to: http://bit.do/polyoilgas
Pre-requisite: Knowledge of oil field chemicals and rubber materials is preferred but not required.

Here’s more about the expert (from the news release),

About Prof. Rigoberto Advincula
Prof. Rigoberto Advincula, Director of the Petro Case Consortium, is recognized industry-wide as an expert regarding polymer and materials challenges of the oil-gas industry. He is currently a Professor with the Department of Macromolecular Science and Engineering at Case Western Reserve University and is the recipient of numerous awards including Fellow of the American Chemical Society, Herman Mark Scholar Award of the Polymer Division, and Humboldt Fellow.

The news release also included some information about Park Systems,

About Park Systems
Park Systems is a world-leading manufacturer of atomic force microscopy (AFM) systems with a complete range of products for researchers and industry engineers in the chemistry, materials, physics, life sciences, semiconductor, and data storage industries. Park’s products are used by over a thousand institutions and corporations worldwide. Park’s AFM provides the highest data accuracy at nanoscale resolution, superior productivity, and the lowest operating cost thanks to its unique technology and innovative engineering. Park Systems, Inc. is headquartered in Santa Clara, California, with its global manufacturing and R&D headquarters in Korea. Park’s products are sold and supported worldwide, with regional headquarters in the US, Korea, Japan, and Singapore, and distribution partners throughout Europe, Asia, and America. Please visit http://www.parkafm.com or call 408-986-1110 for more information.

So there you have it.

The birth of a molecule

This research comes from Korea’s Institute for Basic Science, as described in a Feb. 27, 2015 news item on Azonano,

The research team of the Center for Nanomaterials and Chemical Reactions at the Institute for Basic Science (IBS) has successfully visualized the entire process of bond formation in solution by using femtosecond time-resolved X-ray liquidography (femtosecond TRXL) for the first time in the world.

A Feb. 18, 2015 IBS press release, which originated the news item, provides more details,

Every researcher’s longstanding dream to observe real-time bond formation in chemical reactions has come true. Since this formation takes less than one picosecond, researchers have not been able to visualize the birth of molecules.

The research team has used femtosecond TRXL in order to visualize the formation of a gold trimer complex in real time without being limited by slow diffusion.

They focused on the process of photoinduced bond formation between gold (Au) atoms dissolved in water. In the ground (S0) state, Au atoms are weakly bound to each other in a bent geometry by van der Waals interactions. On photoexcitation, the S0 state rapidly converts into an excited (S1) state, leading to the formation of covalent Au-Au bonds and a bent-to-linear transition. Then, the S1 state changes to a triplet (T1) state with a time constant of 1.6 picoseconds, accompanied by further bond contraction of 0.1 Å. Later, the T1 state of the trimer transforms into a tetramer on a nanosecond time scale, and the Au atoms return to their original bent structure.
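
(To get a feel for how far apart those two timescales are, here is a minimal sketch, my own illustration rather than anything from the paper, treating the S1 → T1 → tetramer sequence as two consecutive first-order steps. The 1.6 ps constant comes from the press release; the tetramer step is described only as happening on a “nanosecond time scale,” so the 2 ns value below is an arbitrary placeholder, and in reality that step also depends on encountering another gold complex.)

    import numpy as np

    tau1 = 1.6e-12  # S1 -> T1 time constant from the press release (1.6 ps)
    tau2 = 2e-9     # T1 -> tetramer, placeholder for "nanosecond time scale"

    def populations(t):
        """Sequential first-order kinetics (two-step Bateman solution)."""
        s1 = np.exp(-t / tau1)
        t1 = tau2 / (tau2 - tau1) * (np.exp(-t / tau2) - np.exp(-t / tau1))
        return s1, t1, 1.0 - s1 - t1

    for t_probe in (1e-12, 10e-12, 1e-9, 10e-9):
        s1, t1, tet = populations(t_probe)
        print(f"t = {t_probe:.0e} s: S1 = {s1:.2f}, T1 = {t1:.2f}, tetramer = {tet:.2f}")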

“By using femtosecond TRXL, we will be able to observe molecular vibration and rotation in the solution phase in real time,” says Hyotcherl Ihee, the group leader of the Center for Nanomaterials at IBS, as well as the professor of the Department of Chemistry at Korea Advanced Institute of Science and Technology.

Here’s a link to and a citation for the paper,

Direct observation of bond formation in solution with femtosecond X-ray scattering by Kyung Hwan Kim, Jong Goo Kim, Shunsuke Nozawa, Tokushi Sato, Key Young Oang, Tae Wu Kim, Hosung Ki, Junbeom Jo, Sungjun Park, Changyong Song, Takahiro Sato, Kanade Ogawa, Tadashi Togashi, Kensuke Tono, Makina Yabashi, Tetsuya Ishikawa, Joonghan Kim, Ryong Ryoo, Jeongho Kim, Hyotcherl Ihee & Shin-ichi Adachi. Nature 518, 385–389 (19 February 2015) doi:10.1038/nature14163 Published online 18 February 2015

This paper is behind a paywall although there is a free preview via ReadCube access.

Part 2 (b) of 3: Science Culture: Where Canada Stands; an expert assessment (reconstructed)

Carrying on from part 2 (a) of this commentary on the Science Culture: Where Canada Stands assessment by the Council of Canadian Academies (CCA).

One of the most intriguing aspects of this assessment was the reliance on an unpublished inventory of Canadian science outreach initiatives (informal science education) that was commissioned by the Korean Foundation for the Advancement of Science and Creativity,

The system of organizations, programs, and initiatives that supports science culture in any country is dynamic. As a result, any inventory provides only a snapshot at a single point in time, and risks quickly becoming out of date. No sustained effort has been made to track public science outreach and engagement efforts in Canada at the national or regional level. Some of the Panel’s analysis relies on data from an unpublished inventory of public science communication initiatives in Canada undertaken in 2011 by Bernard Schiele, Anik Landry, and Alexandre Schiele for the Korean Foundation for the Advancement of Science and Creativity (Schiele et al., 2011). This inventory identified over 700 programs and organizations across all provinces and regions in Canada, including over 400 initiatives related to museums, science centres, zoos, or aquariums; 64 associations or NGOs involved in public science outreach; 49 educational initiatives; 60 government policies and programs; and 27 media programs. (An update of this inventory completed by the Panel brings the total closer to 800 programs.) The inventory is used throughout the chapter [chapter five] to characterize different components of the Canadian system supporting public science outreach, communication, and engagement. (p. 130 PDF; p. 98 print)

I’m fascinated by the Korean interest and wonder if this is due to perceived excellence or to budgetary considerations. The cynic in me suspects the Korean foundation was interested in the US scene but decided that information about the Canadian scene would be cheaper to acquire and that the data could be extrapolated to give a perspective on the US scene.

In addition to the usual suspects (newspapers, television, radio, science centres, etc.), the Expert Panel did recognize the importance of online science sources (they would have looked foolish if they hadn’t),

Canadians are increasingly using the internet to seek out information relating to science. This activity can take the form of generalized searches about science-related issues or more targeted forms of information acquisition. For example, Canadians report using the internet to seek out information on health and medical issues an average of 47 times a year, or nearly every week. Other forms of online exposure to scientific content also appear to be common. For example, 46% of Canadians report having read a blog post or listserv related to science and technology at least once in the last three months, and 62% having watched an online video related to science and technology.

An increasing reliance on the internet as the main source of information about science and technology is consistent with the evolution of the media environment, as well as with survey data from other countries. Based on the Panel’s survey, 17% of Canadians, for example, report reading a printed newspaper daily, while 40% report reading about the news or current events online every day. (pp. 132/3 PDF; pp. 100/1 print)

In common with the rest of the world, Canadians are producing and enjoying science festivals,

In Canada there are two established, large-scale science festivals. Science Rendezvous [founded in 2008 as per its Wikipedia entry] takes place in about 20 cities across the country and combines a variety of programming to comprise a day-long free event (Science Rendezvous, 2013).

The annual Eureka! Festival in Montréal (see Figure 5.6 [founded in 2007 as per its program list]) has over 100 activities over three days; it attracted over 68,000 attendees in 2012 (Eureka! Festival, 2013). More science festivals have recently been created. The University of Toronto launched the Toronto Science Festival in fall 2013 (UofT, 2013), and Beakerhead, a new festival described as a “collision of art and culture, technology, and engineering,” was launched in 2013 in Calgary (Beakerhead, 2013). Two Canadian cities have also recently won bids to host STEMfest (Saskatoon in 2015 and Halifax in 2018), an international festival of science, technology, engineering, and mathematics (Global STEM States, 2014). (pp. 145/6 PDF; pp. 113/4 print)

The assessment notes a grand total of five radio and television programmes devoted to science: The Nature of Things, Daily Planet, Quirks and Quarks, Découverte, and Les années lumière (p. 150 PDF; p. 118 print) and a dearth of science journalism,

Dedicated science coverage is notably absent from the majority of newspapers and other print journalism in Canada. As shown in Table 5.3, none of the top 11 newspapers by weekly readership in Canada has a dedicated science section, including nationals such as The Globe and Mail and National Post. Nine of these newspapers have dedicated technology sections, which sometimes contain sub-sections with broader coverage of science or environment stories; however, story coverage tends to be dominated by technology or business (or gaming) stories. Few Canadian newspapers have dedicated science journalists on staff, and The Globe and Mail is unique among Canadian papers in having a science reporter, a medicine and health reporter, and a technology reporter. (p. 152 PDF; p. 120 print)

Not stated explicitly in the assessment is this: those science and technology stories you see in the newspaper are syndicated stories, i.e., written by reporters for the Associated Press, Reuters, and other international press organizations or simply reprinted (with credit) from another newspaper.

The report does cover science blogging with this,

Science blogs are another potential source of information about developments in science and technology. A database compiled by the Canadian Science Writers’ Association, as of March of 2013, lists 143 Canadian science blogs, covering all areas of science and other aspects of science such as science policy and science culture (CSWA, 2013). Some blogs are individually authored and administered, while others are affiliated with larger networks or other organizations (e.g., Agence Science-Presse, PLOS Blogs). Canadian science blogger Maryse de la Giroday has also published an annual round-up of Canadian science blogs on her blog (www.frogheart.ca) for the past three years, and a new aggregator of Canadian science blogs was launched in 2013 (www.scienceborealis.ca). [emphases mine]

Data from the Panel’s survey suggest that blogs are becoming a more prominent source of information about science and technology for the general public. As noted at the beginning of the chapter, 46% of Canadians report having read a blog post about science or technology at least once in the past three months. Blogs are also influencing the way that scientific research is carried out and disseminated. A technical critique in a blog post by Canadian microbiologist Rosie Redfield in 2010, for example, catalyzed a widely publicized debate on the validity of a study published in Science, exploring the ability of bacteria to incorporate arsenic into their DNA. The incident demonstrated the potential impact of blogs on mainstream scientific research. CBC highlighted the episode as the Canadian science story of the year (Strauss, 2011), and Nature magazine identified Redfield as one of its 10 newsmakers of the year in 2011 as a result of her efforts to replicate the initial study and publicly document her progress and results (Hayden, 2011).

The impact of online information sources, however, is not limited to blogs, with 42% of Canadians reporting having heard about a science and technology news story through social media sources like Twitter and Facebook in the last three months. And, as noted earlier, the internet is often used to search for information about specific science and technology topics, both for general issues such as climate change, and for more personalized information on medical and health issues. (pp. 153/4 PDF; pp. 121/2 print)

Yes, I got a shout out, as did Rosie Redfield. We were the only two science bloggers namechecked. (Years ago, when the Guardian newspaper was developing a science blog network and its first list of bloggers drew fierce criticism, the editor claimed he couldn’t find many female science bloggers. The claim was immediately repudiated, not only by individuals but also by someone who compiled a list of hundreds of female science bloggers.) Still, the perception persists, and I’m thrilled that the panel struck out in a different direction. I was also pleased to see Science Borealis (a Canadian science blog aggregator) mentioned. Having been involved with its founding, I’m also delighted its first anniversary was celebrated in Nov. 2014.

I doubt many people know we have a science press organization in Canada, Agence Science-Presse, but perhaps this mention in the assessment will help raise awareness in Canada’s English language media,

Founded in 1978 with the motto Parce que tout le monde s’intéresse à la science (“because everyone is interested in science”), Agence Science-Presse is a not-for-profit organization in Quebec that supports media coverage of science by distributing articles on scientific research or other topical science and technology issues to media outlets in Canada and abroad. The organization also supports science promotion activities aimed at youth. For example, it currently edits and maintains an aggregation of blogs designed for young science enthusiasts and science journalists (Blogue ta science). (p. 154 PDF; p. 122)

The final chapter (the 6th) of the assessment makes five key recommendations for ‘Cultivating a strong science culture':

  1. Support lifelong science learning
  2. Make science inclusive
  3. Adapt to new technologies
  4. Enhance science communication and engagement
  5. Provide national or regional leadership

Presumably the agriculture reference in the chapter title is tongue-in-cheek. Assuming that’s not one of my fantasies, it’s good to see a little humour.

On to the first recommendation, lifelong learning,

… Science centres and museums, science programs on radio and television, science magazines and journalism, and online resources can all help fulfil this function by providing accessible resources for adult science learning, and by anticipating emerging information needs based on topical issues.

Most informal science learning organizations already provide these opportunities to varying degrees; however, this conception of the relative roles of informal and formal science learning providers differs from the traditional understanding, which often emphasizes how informal environments can foster engagement in science (particularly among youth), thereby triggering additional interest and the later acquisition of knowledge (Miller, 2010b). [emphasis mine] Such a focus may be appropriate for youth programming, but neglects the role that these institutions can play in ongoing education for adults, who often seek out information on science based on specific, well-defined interests or needs (e.g., a medical diagnosis, a newspaper article on the threat of a viral pandemic, a new technology brought into the workplace) (Miller, 2012). [emphases mine] Informal science learning providers can take advantage of such opportunities by anticipating these needs, providing useful and accessible information, and then simultaneously building and deepening knowledge of the underlying science through additional content.

I’m glad to see the interest in adult informal science education, although the emphasis on health/medical and workplace technology issues suggests the panel underestimates, despite the data from its own survey, Canadians’ curiosity about and interest in science and technology. The panel also underestimates the tenacity with which many gatekeepers hold to the belief that no one is interested in science. It took me two years before a local organizer would talk to me about including one science-themed meeting in his programme (the final paragraph in my April 14, 2014 post describes some of the process and my April 18, 2014 post describes the somewhat disappointing outcome). In the end, it was great to see a science-themed ‘city conversation’ but I don’t believe the organizer found it to be a success, which means it’s likely to be a long time before there’s another one.

The next recommendation, ‘Making science inclusive’, is something that I think needs better practice. If one is going to be the change one wants to see, that means appointing people to your expert panels who reflect that inclusiveness and explaining to your audience how your expert panel is inclusive.

The ‘Adapting to new technologies’ recommendation is where I expected to see some mention of the social impact of such emerging technologies as robotics, nanotechnology, synthetic biology, etc. That wasn’t the case,

Science culture in Canada and other countries is now evolving in a rapidly changing technological environment. Individuals are increasingly turning to online sources for information about science and technology, and science communicators and the media are also adapting to the new channels of communication and outreach provided over the internet. As people engage more with new forms of technology in their home and work lives, organizations may be able to identify new ways to take advantage of available technologies to support learning and foster science interest and engagement. At the same time, as noted in Chapter 2, this transition is also challenging traditional models of operation for many organizations such as science centres, museums, and science media providers, forcing them to develop new strategies.

Examples of the use of new technologies to support learning are now commonplace. Nesta, an innovation-oriented organization based in the United Kingdom, conducted a study investigating the extent to which new technologies are transforming learning among students (Luckin et al., 2012) (p. 185 PDF; p. 153 print)

Admittedly, the panel was not charged with looking too far into the future, but it does seem odd that in a science culture report there isn’t much mention (other than a cursory comment in an early chapter) of these emerging technologies and the major changes they are bringing with them. If nothing else, the panel might have wanted to mention artificial intelligence and how the increasing role of automated systems may be affecting science culture in Canada. For example, in my July 16, 2014 post I described a deal Associated Press (AP) signed with a company that automates the process of writing sports and business stories. You may well have read a business story (AP contracted for business stories) written by an artificial intelligence system or, if you prefer the term, an algorithm.

The recommendation for ‘Enhancing science communication and engagement’ is where I believe the Expert Panel should be offered a bouquet,

… Given the significance of government science in many areas of research, government science communication constitutes an important vector for increasing public awareness and understanding about science. In Canada current policies governing how scientists working in federal departments and agencies are allowed to interact with the media and the public have come under heavy criticism in recent years …

Concerns about the federal government’s current policies on government scientists’ communication with the media have been widely reported in Canadian and international press in recent years (e.g., Ghosh, 2012; CBC, 2013c; Gatehouse, 2013; Hume, 2013; Mancini, 2013; Munro, 2013). These concerns were also recently voiced by the editorial board of Nature (2012), which unfavourably compared Canada’s current approach with the more open policies now in place in the United States. Scientists at many U.S. federal agencies are free to speak to the media without prior departmental approval, and to express their personal views as long as they clearly state that they are not speaking on behalf of the government. In response to such concerns, and to a formal complaint filed by the Environmental Law Clinic at the University of Victoria and Democracy Watch, on April 2, 2013 Canada’s Information Commissioner launched an investigation into whether current policies and policy instruments in seven federal departments and agencies are “restricting or prohibiting government scientists from speaking with or sharing research with the media and the Canadian public” (OICC, 2013).

Since these concerns have come to light, many current and former government scientists have discussed how these policies have affected their interactions with the media. Marley Waiser, a former scientist with Environment Canada, has spoken about how that department’s policies prevented her from discussing her research on chemical pollutants in Wascana Creek near Regina (CBC, 2013c). Dr. Kristi Miller, a geneticist with the Department of Fisheries and Oceans, was reportedly prevented from speaking publicly about a study she published in Science, which investigated whether a viral infection might be the cause of declines in Sockeye salmon stocks in the Fraser River (Munro, 2011).

According to data from Statistics Canada (2012), nearly 20,000 science and technology professionals work for the federal government. The ability of these researchers to communicate with the media and the Canadian public has a clear bearing on Canada’s science culture. Properly supported, government scientists can serve as a useful conduit for informing the public about their scientific work, and engaging the public in discussions about the social relevance of their research; however, the concerns reported above raise questions about the extent to which current federal policies in Canada are limiting these opportunities for public communication and engagement. (pp. 190/1 PDF; pp. 158/9 print)

Kudos for including the information and for this passage as well,

Many organizations including science centres and museums, research centres, and even governments may be perceived as having a science promotion agenda that portrays only the benefits of science. As a result, these organizations are not always seen as promoters of debate through questioning, which is a crucial part of the scientific process. Acknowledging complexity and controversy is another means to improve the quality of public engagement in science in a range of different contexts. (p. 195 PDF; p. 163 print)

One last happy note, which is about integrating the arts and design into the STEM (science, technology, engineering, and mathematics) communities,

Linking Science to the Arts and Design: U.S. advocates for “STEM to STEAM” call for an incorporation of the arts in discussions of science, technology, engineering, and mathematics in an effort to “achieve a synergistic balance” (Piro, 2010). They cite positive outcomes such as cognitive development, reasoning skills, and concentration abilities. Piro (2010) argues that “if creativity, collaboration, communication, and critical thinking — all touted as hallmark skills for 21st-century success — are to be cultivated, we need to ensure that STEM subjects are drawn closer to the arts.” Such approaches offer new techniques to engage both student and adult audiences in science learning and engagement opportunities.

What I find fascinating about this STEM to STEAM movement is that many of these folks don’t seem to realize that, until fairly recently, the arts and sciences were closely allied. James Clerk Maxwell was also a poet, which was not uncommon amongst 19th-century scientists.

In Canada one example of this approach is found in the work of Michael R. Hayden, who has conducted extensive genetic research on Huntington disease. In the lead-up to the 2000 Human Genome Project World Conference, Hayden commissioned Vancouver’s Electric Company Theatre to fuse “the spheres of science and art in a play that explored the implications of the revolutionary technology of the Human Genome Project” (ECT, n.d.). This play, The Score, was later adapted into a film. Hayden believes that his play “transforms the scientific ideas explored in the world of the laboratory into universal themes of human identity, freedom and creativity, and opens up a door for a discussion between the scientific community and the public in general” (Genome Canada, 2006). (p. 196 PDF; p. 164 print)

I’m not sure why the last recommendation presents an either/or choice, ‘Providing national or regional leadership’, while the following content suggests a much more fluid state,

…  it should be recognized that establishing a national or regional vision for science culture is not solely the prerogative of government. Such a vision requires broad support and participation from the community of affected stakeholders to be effective, and can also emerge from that community in the absence of a strong governmental role.

The final chapter (the seventh) restates the points the panel has made throughout its report. Unexpectedly, part 2 got bigger, ’nuff said.

Part 2 (a) of 3: Science Culture: Where Canada Stands; an expert assessment (reconstructed)

Losing over 2000 words, i.e., part 2 of this commentary on the Science Culture: Where Canada Stands assessment by the Council of Canadian Academies (CCA), on New Year’s Eve 2014 was a bit of a blow. So, here’s my attempt at reconstructing my much mourned part 2.

There was acknowledgement of Canada as an Arctic country and an acknowledgement of this country’s extraordinary geographical relationship to the world’s marine environment,

Canada’s status as an Arctic nation also has a bearing on science and science culture. Canada’s large and ecologically diverse Arctic landscape spans a substantial part of the circumpolar Arctic, and comprises almost 40% of the country’s landmass (Statistics Canada, 2009). This has influenced the development of Canadian culture more broadly, and also created opportunities in the advancement of Arctic science. Canada’s northern inhabitants, the majority of whom are Indigenous peoples, represent a source of knowledge that contributes to scientific research in the North (CCA, 2008).

These characteristics have contributed to the exploration of many scientific questions including those related to environmental science, resource development, and the health and well-being of northern populations. Canada also has the longest coastline of any country, and these extensive coastlines and marine areas give rise to unique research opportunities in ocean science (CCA, 2013a). (p. 55 PDF; p. 23 print)

Canada’s aging population is acknowledged in a backhanded way,

Like most developed countries, Canada’s population is also aging. In 2011 the median age in Canada was 39.9 years, up from 26.2 years in 1971 (Statistics Canada, n.d.). This ongoing demographic transition will have an impact on science culture in Canada in years to come. An aging population will be increasingly interested in health and medical issues. The ability to make use of this kind of information will depend in large part on the combination of access to the internet, skill in navigating it, and a conceptual toolbox that includes an understanding of genes, probability, and related constructs (Miller, 2010b). (p. 56 PDF; p. 24 print)

Yes, the only science topics of interest for an old person are health and medicine. Couldn’t they have included one sentence suggesting an aging population’s other interests and other possible impacts on science culture?

On the plus side, the report offers a list of selected Canadian science culture milestones,

• 1882 – Royal Society of Canada is established.
• 1916 – National Research Council is established.
• 1923 – Association canadienne-française pour l’avancement des sciences (ACFAS) is established.
• 1930 – Canadian Geographic is first published by the Royal Canadian Geographical Society.
• 1951 – Massey–Lévesque Commission calls for the creation of a national science and technology museum.
• 1959 – Canada sees its first science fairs in Winnipeg, Edmonton, Hamilton, Toronto, Montréal, and Vancouver; volunteer coordination eventually grows into Youth Science Canada.
• 1960 – CBC’s Nature of Things debuts on television; Fernand Séguin hosts “Aux frontières de la science.”
• 1962 – ACFAS creates Le Jeune scientifique, which becomes Québec Science in 1970.
• 1966 – Science Council of Canada is created to advise Parliament on science and technology issues.
• 1967 – Canada Museum of Science and Technology is created.
• 1969 – Ontario Science Centre opens its doors (the Exploratorium in San Francisco opens the same year).
• 1971 – Canadian Science Writers’ Association is formed.
• 1975 – Symons Royal Commission on Canadian Studies speaks to how understanding the role of science in society is important to understanding Canadian culture and identity.
• 1975 – Quirks and Quarks debuts on CBC Radio.
• 1976 – OWL children’s magazine begins publication.
• 1977 – Association des communicateurs scientifiques du Québec is established.
• 1978 – L’Agence Science-Presse is created.
• 1981 – Association des communicateurs scientifiques creates the Fernand-Séguin scholarship to identify promising young science journalists.
• 1982 – Les Débrouillards is launched in Quebec. (p. 58 PDF; p. 26 print)

The list spills onto the next page and into the 2000s.

It’s a relief to see the Expert Panel give a measured response to the claims made about science culture and its various impacts, especially on the economy (in my book, some of the claims have bordered on hysteria),

The Panel found little definitive empirical evidence of causal relationships between the dimensions of science culture and higher-level social objectives like stronger economic performance or more effective public policies. As is the case with much social science research, isolating the impacts of a single variable on complex social phenomena is methodologically challenging, and few studies have attempted to establish such relationships in any detail. As noted in 1985 by the Bodmer report (a still-influential report on public understanding of science in the United Kingdom), although there is good reason prima facie to believe that improving public understanding of science has national economic benefits, empirical proof for such a link is often elusive (RS & Bodmer, 1985). This remains the case today. Nevertheless, many pieces of evidence suggest why a modern, industrialized society should cultivate a strong science culture. Literature from the domains of cognitive science, sociology, cultural studies, economics, innovation, political science, and public policy provides relevant insights. (p. 63 PDF; p. 31 print)

Intriguingly, while the panel has made extensive use of social science methods for this assessment there are some assumptions made about skill sets required for the future,

Technological innovation depends on the presence of science and technology skills in the workforce. While at one point it may have been possible for relatively low-skilled individuals to substantively contribute to technological development, in the 21st century this is no longer the case. [emphasis mine] Advanced science and technology skills are now a prerequisite for most types of technological innovation. (p. 72 PDF; p. 40 print)

Really, it’s no longer possible for relatively low-skilled individuals to contribute to technological development? Maybe the expert panel missed this bit in my March 27, 2013 post,

Getting back to Bittel’s Slate article, he mentions Foldit (here’s my first piece in an Aug. 6, 2010 posting [scroll down about 1/2 way]), a protein-folding game which has generated some very exciting science. He also notes some of that science was generated by older, ‘uneducated’ women. Bittel linked to Jeff Howe’s Feb. 27, 2012 article about Foldit and other crowdsourced science projects for Slate where I found this very intriguing bit,

“You’d think a Ph.D. in biochemistry would be very good at designing protein molecules,” says Zoran Popović, the University of Washington game designer behind Foldit. Not so. “Biochemists are good at other things. But Foldit requires a narrow, deeper expertise.”

Or as it turns out, more than one. Some gamers have a preternatural ability to recognize patterns, an innate form of spatial reasoning most of us lack. Others—often “grandmothers without a high school education,” says Popovic—exercise a particular social skill. “They’re good at getting people unstuck. They get them to approach the problem differently.” What big pharmaceutical company would have anticipated the need to hire uneducated grandmothers? (I know a few, if Eli Lilly HR is thinking of rejiggering its recruitment strategy.) [emphases mine]

It’s not the idea that technical and scientific skills are needed that concerns me; it’s the report’s hard line about ‘low skills’ (which is a term that is not defined). In addition to the notion that future jobs require only individuals with ‘high level’ skills, there’s the notion (not mentioned in this report but gaining general acceptance in the media) that we shouldn’t ever have to perform repetitive and boring activities. It’s a notion which completely ignores a certain aspect of the learning process. Very young children repeat over and over and over and over … . Apprenticeships in many skills-based crafts were designed with years of boring, repetitive work as part of the training. It seems counter-intuitive, but boring, repetitive activities can lead to very high level skills such as the ability to ‘unstick’ a problem for an expert with a PhD in biochemistry.

Back to the assessment, the panel commissioned a survey, conducted in 2013, to gather data about science culture in Canada,

The Panel’s survey of Canadian science culture, designed to be comparable to surveys undertaken in other countries as well as to the 1989 Canadian survey, assessed public attitudes towards science and technology, levels and modes of public engagement in science, and public science knowledge or understanding. (The evidence reported in this chapter on the fourth dimension, science and technology skills, is drawn from other sources such as Statistics Canada and the OECD).

Conducted in April 2013, the survey relied on a combination of landline and mobile phone respondents (60%) and internet respondents (40%), randomly recruited from the general population. In analyzing the results, responses to the survey were weighted based on Statistics Canada data according to region, age, education, and gender to ensure that the sample was representative of the Canadian public. A total of 2,004 survey responses were received, with regional breakdowns presented in Table 4.1. At a national level, survey results are accurate within a range of plus or minus 2.2% 19 times out of 20 (i.e., at the 95% confidence interval), and margins of error for regional results range from 3.8% to 7.1%. Three open-ended questions were also included in the survey, which were coded using protocols previously applied to these questions in other international surveys. All open-ended questions were coded independently by at least three bilingual coders, and any discrepancies in coding were settled through a review by a fourth coder. (p. 79 PDF; p. 47 print)
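
(For anyone who wants to check the survey arithmetic, the ±2.2% national figure is simply the worst-case margin of error for a simple random sample of 2,004 respondents at the 95% confidence level. The sketch below is mine, not the report’s; the regional sample sizes aren’t reproduced in the excerpt, so the values used to bracket the 3.8% to 7.1% range are back-calculated and only indicative.)

    import math

    def margin_of_error(n, z=1.96, p=0.5):
        """Worst-case (p = 0.5) margin of error for a simple random sample of size n."""
        return z * math.sqrt(p * (1 - p) / n)

    print(f"National sample, n = 2004: +/- {margin_of_error(2004):.1%}")  # ~2.2%
    # Regional margins of 3.8% to 7.1% imply regional samples of very roughly 660 down to 190
    for n in (660, 190):
        print(f"Regional sample, n = {n}: +/- {margin_of_error(n):.1%}")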

The infographic’s data in part 1 of this commentary, What Do Canadians Think About Science and Technology (S&T)?, is based on the survey and other statistical information included in the report, especially chapter four, which focuses on measurements (pp. 77–127 PDF; pp. 45–95 print). While the survey presents a somewhat rosier picture of Canadian science culture than the one I experience on a daily basis, the data seem to have been gathered in a thoughtful fashion. Regardless of the assessment’s findings and my opinions, how Canadians view science became a matter of passionate debate in the Canadian science blogging community (at least parts of it) in late 2014, as per a Dec. 3, 2014 posting by the Science Borealis team on their eponymous blog (Note: Links have been removed),

The CBC’s Rick Mercer is a staunch science advocate, and his November 19th rant was no exception. He addressed the state of basic science in Canada, saying that Canadians are “passionate and curious about science.”

In response, scientist David Kent wrote a post on the Black Hole Blog in which he disagreed with Mercer, saying, “I do not believe Mr. Mercer’s idea that Canadians as a whole are interested although I, like him, would wish it to be the case.”

Kent’s post has generated some fierce discussion, both in the comments on his original post and in the comments on a Facebook post by Evidence for Democracy.

Here at Science Borealis, we rely on a keen and enthusiastic public to engage with the broad range of science-based work our bloggers share, so we decided to address some of the arguments Kent presented in his post.

Anecdotal evidence versus data

Kent says “Mr. Mercer’s claims about Canadians’ passions are anecdotal at best, and lack any evidence – indeed it is possible that Canadians don’t give a hoot about science for science’s sake.”

Unfortunately, Kent’s own argument is based on anecdotal evidence (“To me it appears that… the average Canadian adult does not particularly care about how or why something works.”).

If you’re looking for data, they’re available in a recent Council of Canadian Academies report that specifically studied science culture in Canada. Results show that Canadians are very interested in science.

You can find David Kent’s Nov. 26, 2014 post about Canadians, Rick Mercer and science here. Do take a look at the blog’s comments which feature a number of people deeply involved in promoting and producing Canadian science culture.

I promised disturbing statistics in the head for this posting and here they are in the second paragraph,

Canadian students perform well in PISA [the Organization for Economic Cooperation and Development’s (OECD) Programme for International Student Assessment], with relatively high scores on all three of the major components of the assessment (reading, science, and mathematics) compared with students in other countries (Table 4.4). In 2012 only seven countries or regions had mean scores on the science assessment higher than Canada on a statistically significant basis: Shanghai–China, Hong Kong–China, Singapore, Japan, Finland, Estonia, and Korea (Brochu et al., 2013). A similar pattern holds for mathematics scores, where nine countries had mean scores higher than Canada on a statistically significant basis: Shanghai–China, Singapore, Hong Kong–China, Chinese Taipei, Korea, Macao–China, Japan, Liechtenstein, and Switzerland (Brochu et al., 2013). Regions scoring higher than Canada are concentrated in East Asia, and tend to be densely populated, urban areas. Among G8 countries, Canada ranks second on mean science and mathematics scores, behind Japan.

However, the 2012 PISA results also show statistically significant declines in Canada’s scores on both the mathematics and science components. Canada’s science score declined by nine points from its peak in 2006 (with a fall in ranking from 3rd to 10th), and the math score declined by 14 points since first assessed in 2003 (a fall from 7th to 13th) (Brochu et al., 2013). Changes in Canada’s standing relative to other countries reflect both the addition of new countries or regions over time (i.e., the addition of regions such as Hong Kong–China and Chinese Taipei in 2006, and of Shanghai–China in 2009) and statistically significant declines in mean scores.

My Oct. 9, 2013 post discusses the scores in more detail and, as the Expert Panel notes, the drop is disconcerting and disturbing. Hopefully, it doesn’t indicate a trend.

Part 2 (b) follows immediately.

Gelatin nanoparticles for drug delivery after a stroke

A Dec. 24, 2014 news item on phys.org describes a treatment that could mitigate the effects of a stroke by extending the window of opportunity for recuperative treatments (Note: Links have been removed),

Stroke victims could have more time to seek treatment that could reduce harmful effects on the brain, thanks to tiny blobs of gelatin that could deliver the medication to the brain noninvasively.

University of Illinois researchers and colleagues in South Korea, led by U. of I. electrical and computer engineering senior research scientist Hyungsoo Choi and professor Kyekyoon “Kevin” Kim, published details about the gelatin nanoparticles in the journal Drug Delivery and Translational Research.

A Dec. 23, 2014 University of Illinois at Urbana-Champaign news release, which originated the news item, explains how the gelatin nanoparticles are directed to the injury site (Note: links have been removed),

The researchers found that gelatin nanoparticles could be laced with medications for delivery to the brain, and that they could extend the treatment window for when a drug could be effective. Gelatin is biocompatible, biodegradable, and classified as “Generally Recognized as Safe” by the Food and Drug Administration. Once administered, the gelatin nanoparticles target damaged brain tissue thanks to an abundance of gelatin-munching enzymes produced in injured regions.

The tiny gelatin particles have a huge benefit: They can be administered nasally, a noninvasive and direct route to the brain. This allows the drug to bypass the blood-brain barrier, a biological fence that prevents the vast majority of drugs from entering the brain through the bloodstream.

“Overcoming the difficulty of delivering therapeutic agents to specific regions of the brain presents a major challenge to treatment of most neurological disorders,” said Choi.  “However, if drug substances can be transferred along the olfactory nerve cells, they can bypass the blood-brain barrier and enter the brain directly.”

To test gelatin nanoparticles as a drug-delivery system, the researchers used the drug osteopontin (OPN), which in rats can help to reduce inflammation and prevent brain cell death if administered immediately after a stroke.

“It is crucial to treat ischemic strokes within three hours to improve the chances of recovery. However, a significant number of stroke victims don’t get to the hospital in time for the treatment,” Kim said.

By lacing gelatin nanoparticles with OPN, the researchers found that they could extend the treatment window in rats, so much so that treating a rat with nanoparticles six hours after a stroke showed the same efficacy rate as giving them OPN alone after one hour – 70 percent recovery of dead volume in the brain.

The researchers hope the gelatin nanoparticles, administered through the nasal cavity, can help deliver other drugs to more effectively treat a variety of brain injuries and neurological diseases.

“Gelatin nanoparticles are a delivery vehicle that could be used to deliver many therapeutics to the brain,” Choi said. “They will be most effective in delivering drugs that cannot cross the blood-brain barrier. In addition, they can be used for drugs of high toxicity or a short half-life.”

I expect the next steps will include some human clinical trials. In the meantime for those who are interested, here’s a link to and a citation for the paper,

Gelatin nanoparticles enhance the neuroprotective effects of intranasally administered osteopontin in rat ischemic stroke model by Elizabeth Joachim, Il-Doo Kim, Yinchuan Jin, Kyekyoon (Kevin) Kim, Ja-Kyeong Lee, and Hyungsoo Choi. Drug Delivery and Translational Research Volume 4, Issue 5-6, pp. 395-399 DOI 10.1007/s13346-014-0208-9 Published online Nov. 8, 2014

This paper is behind a paywall.

Flexible electronics and Inorganic-based Laser Lift-off (ILLO) in Korea

Korean scientists are trying to make the process of creating flexible electronics easier according to a Nov. 25, 2014 news item on ScienceDaily,

Flexible electronics have been touted as the next generation in electronics in various areas, ranging from consumer electronics to bio-integrated medical devices. In spite of their merits, insufficient performance of organic materials arising from inherent material properties and processing limitations in scalability have posed big challenges to developing all-in-one flexible electronics systems in which display, processor, memory, and energy devices are integrated. The high temperature processes, essential for high performance electronic devices, have severely restricted the development of flexible electronics because of the fundamental thermal instabilities of polymer materials.

A research team headed by Professor Keon Jae Lee of the Department of Materials Science and Engineering at KAIST provides an easier methodology to realize high-performance flexible electronics by using Inorganic-based Laser Lift-off (ILLO).

A Nov. 26, 2014 KAIST news release on ResearchSEA, which originated the news item (the date discrepancy is probably due to time zone differences), provides more detail about the ILLO technique,

The ILLO process involves depositing a laser-reactive exfoliation layer on rigid substrates, and then fabricating ultrathin inorganic electronic devices, e.g., high density crossbar memristive memory, on top of the exfoliation layer. By laser irradiation through the back of the substrate, only the ultrathin inorganic device layers are exfoliated from the substrate as a result of the reaction between the laser and the exfoliation layer, and then subsequently transferred onto any kind of receiver substrate such as plastic, paper, and even fabric.

This ILLO process can enable not only nanoscale processes for high density flexible devices but also the high temperature process that was previously difficult to achieve on plastic substrates. The transferred device successfully demonstrates fully-functional random access memory operation on flexible substrates even under severe bending.

Professor Lee said, “By selecting an optimized set of inorganic exfoliation layer and substrate, a nanoscale process at a high temperature of over 1000 °C can be utilized for high performance flexible electronics. The ILLO process can be applied to diverse flexible electronics, such as driving circuits for displays and inorganic-based energy devices such as battery, solar cell, and self-powered devices that require high temperature processes.”

Here’s a link to and a citation for the research paper,

Flexible Crossbar-Structured Resistive Memory Arrays on Plastic Substrates via Inorganic-Based Laser Lift-Off by Seungjun Kim, Jung Hwan Son, Seung Hyun Lee, Byoung Kuk You, Kwi-Il Park, Hwan Keon Lee, Myunghwan Byun and Keon Jae Lee. Advanced Materials, Volume 26, Issue 44, pages 7480–7487, November 26, 2014. Article first published online: 8 SEP 2014. DOI: 10.1002/adma.201402472

© 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

Here’s an image the researchers have made available,

This photo shows the flexible RRAM device on a plastic substrate. Courtesy: KAIST

Finally, the research paper is behind a paywall.

Nanosafety research: a quality control issue

Toxicologist Dr. Harald Krug has published a review of several thousand studies on nanomaterials safety exposing problematic research methodologies and conclusions. From an Oct. 29, 2014 news item on Nanowerk (Note: A link has been removed),

Empa [Swiss Federal Laboratories for Materials Science and Technology] toxicologist Harald Krug has lambasted his colleagues in the journal Angewandte Chemie (“Nanosafety Research—Are We on the Right Track?”). He evaluated several thousand studies on the risks associated with nanoparticles and discovered no end of shortcomings: poorly prepared experiments and results that don’t carry any clout. Instead of merely leveling criticism, however, Empa is also developing new standards for such experiments within an international network.

An Oct. 29, 2014 Empa press release (also on EurekAlert), which originated the news item, describes the new enthusiasm for research into nanomaterials and safety,

Researching the safety of nanoparticles is all the rage. Thousands of scientists worldwide are conducting research on the topic, examining the question of whether titanium dioxide nanoparticles from sun creams can get through the skin and into the body, whether carbon nanotubes from electronic products are as hazardous for the lungs as asbestos used to be or whether nanoparticles in food can get into the blood via the intestinal flora, for instance. Public interest is great, research funds are flowing – and the number of scientific projects is skyrocketing: between 1980 and 2010, a total of 5,000 projects were published, followed by another 5,000 in just the last three years. However, the amount of new knowledge has only increased marginally. After all, according to Krug the majority of the projects are poorly executed and all but useless for risk assessments.

The press release goes on to describe various pathways into the body and problems with research methodologies,

How do nanoparticles get into the body?

Artificial nanoparticles measuring between one and 100 nanometers in size can theoretically enter the body in three ways: through the skin, via the lungs and via the digestive tract. Almost every study concludes that healthy, undamaged skin is an effective protective barrier against nanoparticles. When it comes to the route through the stomach and gut, however, the research community is at odds. But upon closer inspection the value of many alarmist reports is dubious – such as when nanoparticles made of soluble substances like zinc oxide or silver are being studied. Although the particles disintegrate and the ions drifting into the body are cytotoxic, this effect has nothing to do with the topic of nanoparticles but is merely linked to the toxicity of the (dissolved) substance and the ingested dose.

Laboratory animals die in vain – drastic overdoses and other errors

Krug also discovered that some researchers maltreat their laboratory animals with absurdly high amounts of nanoparticles. Chinese scientists, for instance, fed mice five grams of titanium oxide per kilogram of body weight, without detecting any effects. By way of comparison: half the amount of kitchen salt would already have killed the animals. A sloppy job is also being made of things in the study of lung exposure to nanoparticles: inhalation experiments are expensive and complex because a defined number of particles has to be swirled around in the air. Although it is easier to place the particles directly in the animal’s windpipe (“instillation”), some researchers overdo it to such an extent that the animals suffocate on the sheer mass of nanoparticles.
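
For readers who want a feel for those numbers, here is a minimal back-of-the-envelope sketch in Python. The 5 g/kg titanium dioxide dose and the observation that half that amount of kitchen salt would already have been lethal come from the press release; the mouse body mass (about 25 g) is my own illustrative assumption.

```python
# Back-of-the-envelope comparison of the titanium dioxide dose criticized by
# Krug with the roughly lethal dose of kitchen salt mentioned in the press
# release. The mouse body mass is an illustrative assumption.

TIO2_DOSE_G_PER_KG = 5.0     # dose fed to mice in the criticized study
LETHAL_SALT_G_PER_KG = 2.5   # "half the amount of kitchen salt" cited as lethal
MOUSE_MASS_KG = 0.025        # assumed typical laboratory mouse (~25 g)

tio2_per_mouse_g = TIO2_DOSE_G_PER_KG * MOUSE_MASS_KG
lethal_salt_per_mouse_g = LETHAL_SALT_G_PER_KG * MOUSE_MASS_KG

print(f"TiO2 fed per mouse: {tio2_per_mouse_g * 1000:.0f} mg")
print(f"Roughly lethal salt dose per mouse: {lethal_salt_per_mouse_g * 1000:.0f} mg")
print(f"The TiO2 dose is {TIO2_DOSE_G_PER_KG / LETHAL_SALT_G_PER_KG:.0f}x that salt dose (per kg)")
```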

While others might well make do without animal testing and conduct in vitro experiments on cells, here, too, cell cultures are covered by layers of nanoparticles that are 500 nanometers thick, causing them to die from a lack of nutrients and oxygen alone – not from a real nano-effect. And even the most meticulous experiment is worthless if the particles used have not been characterized rigorously beforehand. Some researchers simply skip this preparatory work and use the particles “straight out of the box”. Such experiments are irreproducible, warns Krug.

As noted in the news item, the scientists at Empa have devised a solution to some of the problems (from the press release),

The solution: inter-laboratory tests with standard materials
Empa is thus collaborating with research groups like EPFL’s Powder Technology Laboratory, with industrial partners and with Switzerland’s Federal Office of Public Health (FOPH) to find a solution to the problem: on 9 October the “NanoScreen” programme, one of the “CCMX Materials Challenges”, got underway, which is expected to yield a set of pre-validated methods for lab experiments over the next few years. It involves using test materials that have a closely defined particle size distribution, possess well-documented biological and chemical properties and can be altered in certain parameters – such as surface charge. “Thanks to these methods and test substances, international labs will be able to compare, verify and, if need be, improve their experiments,” explains Peter Wick, Head of Empa’s laboratory for Materials-Biology Interactions.

Instead of the all-too-familiar “fumbling around in the dark”, this would provide an opportunity for internationally coordinated research strategies to not only clarify the potential risks of new nanoparticles in retrospect but even be able to predict them. The Swiss scientists therefore coordinate their research activities with the National Institute of Standards and Technology (NIST) in the US, the European Commission’s Joint Research Centre (JRC) and the Korea Research Institute of Standards and Science (KRISS).

Bravo and thank you, Dr. Krug and Empa, for confirming something I’ve suspected due to hints from more informed commentators. Unfortunately, my ignorance about research protocols has not permitted me to undertake a better analysis of the research.

Here’s a link to and a citation for the paper,

Nanosafety Research—Are We on the Right Track? by Prof. Dr. Harald F. Krug. Angewandte Chemie International Edition. DOI: 10.1002/anie.201403367. Article first published online: 10 OCT 2014

This is an open access paper.

Battery-free cardiac pacemaker

This particular energy-harvesting pacemaker has been tested ‘in vivo’ or, as some like to say, ‘on animal models’. From an Aug. 31, 2014 European Society of Cardiology news release (also on EurekAlert),

A new batteryless cardiac pacemaker based on an automatic wristwatch and powered by heart motion was presented at ESC Congress 2014 today by Adrian Zurbuchen from Switzerland. The prototype device does not require battery replacement.

Mr Zurbuchen, a PhD candidate in the Cardiovascular Engineering Group at ARTORG, University of Bern, Switzerland, said: “Batteries are a limiting factor in today’s medical implants. Once they reach a critically low energy level, physicians see themselves forced to replace a correctly functioning medical device in a surgical intervention. This is an unpleasant scenario which increases costs and the risk of complications for patients.”

Four years ago Professor Rolf Vogel, a cardiologist and engineer at the University of Bern, had the idea of using an automatic wristwatch mechanism to harvest the energy of heart motion. Mr Zurbuchen said: “The heart seems to be a very promising energy source because its contractions are repetitive and present for 24 hours a day, 7 days a week. Furthermore the automatic clockwork, invented in the year 1777, has a good reputation as a reliable technology to scavenge energy from motion.”

The researchers’ first prototype is based on a commercially available automatic wristwatch. All unnecessary parts were removed to reduce weight and size. In addition, they developed a custom-made housing with eyelets that allows suturing the device directly onto the myocardium (photo 1).

The prototype works the same way it would on a person’s wrist. When it is exposed to an external acceleration, the eccentric mass of the clockwork starts rotating. This rotation progressively winds a mechanical spring. After the spring is fully charged it unwinds and thereby spins an electrical micro-generator.

To test the prototype, the researchers developed an electronic circuit to condition the harvested energy and store it in a small buffer capacitor. They then connected the system to a custom-made cardiac pacemaker (photo 2). The system worked in three steps. First, the harvesting prototype acquired energy from the heart. Second, the energy was temporarily stored in the buffer capacitor. And finally, the buffered energy was used by the pacemaker to apply minute stimuli to the heart.
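
As a way of visualizing that three-step chain – harvest, buffer, stimulate – here is a minimal sketch in Python. All of the numbers (energy harvested per heartbeat, energy per pacing pulse, buffer size) are placeholder assumptions chosen purely for illustration; the news release does not report these values.

```python
# Minimal sketch of the three-step energy chain described above:
# (1) harvest energy from heart motion, (2) buffer it in a capacitor,
# (3) draw small amounts from the buffer to deliver pacing stimuli.
# All numerical values are placeholder assumptions for illustration only.

HARVEST_PER_BEAT_J = 5e-6   # assumed energy scavenged per heartbeat (5 microjoules)
PACE_PULSE_J = 3e-6         # assumed energy needed per pacing stimulus (3 microjoules)
BUFFER_MAX_J = 1e-3         # assumed maximum energy the buffer capacitor can hold
HEART_RATE_BPM = 130        # overdrive-pacing rate mentioned in the news release

def buffered_energy_after(beats: int) -> float:
    """Return the energy (J) left in the buffer after `beats` heartbeats,
    assuming one pacing pulse is delivered per beat."""
    stored = 0.0
    for _ in range(beats):
        stored = min(stored + HARVEST_PER_BEAT_J, BUFFER_MAX_J)  # steps 1 and 2
        if stored >= PACE_PULSE_J:                               # step 3
            stored -= PACE_PULSE_J
    return stored

print(f"Buffered energy after one minute: {buffered_energy_after(HEART_RATE_BPM):.2e} J")
```

As long as the assumed per-beat harvest exceeds the assumed pulse energy, the buffer slowly fills rather than drains, which is the condition a batteryless design has to meet.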

The researchers successfully tested the system in in vivo experiments with domestic pigs. The newly developed system allowed them for the first time to perform batteryless overdrive-pacing at 130 beats per minute.

Mr Zurbuchen said: “We have shown that it is possible to pace the heart using the power of its own motion. The next step in our prototype is to integrate both the electronic circuit for energy storage and the custom-made pacemaker directly into the harvesting device. This will eliminate the need for leads.”

He concluded: “Our new pacemaker tackles the two major disadvantages of today’s pacemakers. First, pacemaker leads are prone to fracture and can pose an imminent threat to the patient. And second, the lifetime of a pacemaker battery is limited. Our energy harvesting system is located directly on the heart and has the potential to avoid both disadvantages by providing the world with a batteryless and leadless pacemaker.”

This project seems the furthest along with regard to its prospects for replacing batteries in pacemakers (with leadlessness being a definite plus), but there are other projects, such as one by Korea’s Professor Keon Jae Lee of KAIST and Professor Boyoung Joung, M.D., at Severance Hospital of Yonsei University, who are working on a piezoelectric nanogenerator, according to a June 26, 2014 article by Colin Jeffrey for Gizmodo.com,

… Unfortunately, the battery technology used to power these devices [cardiac pacemakers] has not kept pace and the batteries need to be replaced on average every seven years, which requires further surgery. To address this problem, a group of researchers from Korea Advanced Institute of Science and Technology (KAIST) has developed a cardiac pacemaker that is powered semi-permanently by harnessing energy from the body’s own muscles.

The research team, headed by Professor Keon Jae Lee of KAIST and Professor Boyoung Joung, M.D. at Severance Hospital of Yonsei University, has created a flexible piezoelectric nanogenerator that has been used to directly stimulate the heart of a live rat using electrical energy produced from small body movements of the animal.

… the team created their new high-performance flexible nanogenerator from a thin film semiconductor material. In this case, lead magnesium niobate-lead titanate (PMN-PT) was used rather than the graphene oxide and carbon nanotubes of previous versions. As a result, the new device was able to generate up to 8.2 V and 0.22 mA from small flexing motions of the nanogenerator. The resultant voltage and current generated in this way were of sufficient levels to stimulate the rat’s heart directly.
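
Taking those figures at face value, and assuming (my assumption, not the article’s) that the peak voltage and peak current occur at the same moment, the corresponding instantaneous power works out to roughly P = V × I = 8.2 V × 0.22 mA ≈ 1.8 mW, comfortably above the microwatt-scale power budget usually quoted for cardiac pacemakers.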

I gather this project too was tested on animal models, in this case, rats.

Gaining some attention at roughly the same time as the Korean researchers, a French team’s work with a ‘living battery’ is mentioned in a June 17, 2014 news item on the Open Knowledge website,

Philippe Cinquin, Serge Cosnier and their team at Joseph Fourier University in France have invented a ‘living battery.’ The device – a fuel cell and conductive wires modified with reactive enzymes – has the power to tap into the body’s endless supply of glucose and convert simple sugar, which constitutes the energy source of living cells, into electricity.

Visions of implantable biofuel cells that use the body’s natural energy sources to power pacemakers or artificial hearts have been around since the 1960s, but the French team’s innovation represents the closest anyone has ever come to harnessing this energy.

The French team was a finalist for the 2014 European Inventor Award. Here’s a description of how their invention works, from their 2014 European Inventor Award’s webpage,

Biofuel cells that harvest energy from glucose in the body function much like everyday batteries, which conduct electricity through positive and negative terminals called anodes and cathodes and a charge-conducting medium known as the electrolyte. Electricity is produced via a series of electrochemical reactions between these three components. These reactions are catalysed using enzymes that react with glucose stored in the blood.

Bodily fluids, which contain glucose and oxygen, serve as the electrolyte. To create an anode, two enzymes are used. The first enzyme breaks down the sugar glucose, which is produced every time the animal or person consumes food. The second enzyme oxidises the simpler sugars to release electrons. A current then flows as the electrons are drawn to the cathode. A capacitor that is hooked up to the biofuel cell stores the electric charge produced.
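
To give a rough sense of the electrochemistry, here is a minimal Faraday’s-law sketch in Python. The two-electrons-per-glucose figure is the standard value for enzymatic glucose oxidation; the glucose consumption rate and cell voltage are placeholder assumptions, not numbers from the French team.

```python
# Minimal Faraday's-law sketch: glucose oxidized at the enzymatic anode
# releases electrons, and the rate of oxidation sets the current.
# The glucose consumption rate and cell voltage are illustrative assumptions.

FARADAY_C_PER_MOL = 96485     # charge carried by one mole of electrons (C/mol)
ELECTRONS_PER_GLUCOSE = 2     # typical for enzymatic glucose oxidation
GLUCOSE_MOL_PER_S = 1e-10     # assumed glucose consumed per second (0.1 nmol/s)
CELL_VOLTAGE_V = 0.3          # assumed operating voltage of the biofuel cell

current_a = ELECTRONS_PER_GLUCOSE * FARADAY_C_PER_MOL * GLUCOSE_MOL_PER_S
power_w = current_a * CELL_VOLTAGE_V

print(f"Theoretical current: {current_a * 1e6:.1f} microamps")
print(f"Theoretical power: {power_w * 1e6:.1f} microwatts")
```

With these assumed values the cell would deliver a few microwatts, which is the power scale generally discussed for implantable glucose biofuel cells.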

I wish all the researchers good luck as they race towards a new means of powering pacemakers, deep brain stimulators, and other implantable devices that now rely on batteries which need to be changed, forcing the patient to undergo further surgery.

Self-powered batteries for pacemakers, etc. have been mentioned here before:

April 3, 2009 posting

July 12, 2010 posting

March 8, 2013 posting

Nanojuice in your gut

A July 7, 2014 news item on Azonano features a new technique that could help doctors better diagnose problems in the intestines (guts),

Located deep in the human gut, the small intestine is not easy to examine. X-rays, MRIs and ultrasound images provide snapshots but each suffers limitations. Help is on the way.

University at Buffalo [State University of New York] researchers are developing a new imaging technique involving nanoparticles suspended in liquid to form “nanojuice” that patients would drink. Upon reaching the small intestine, doctors would strike the nanoparticles with a harmless laser light, providing an unparalleled, non-invasive, real-time view of the organ.

A July 5, 2014 University at Buffalo news release (also on EurekAlert) by Cory Nealon, which originated the news item, describes some of the challenges associated with medical imaging of small intestines,

“Conventional imaging methods show the organ and blockages, but this method allows you to see how the small intestine operates in real time,” said corresponding author Jonathan Lovell, PhD, UB assistant professor of biomedical engineering. “Better imaging will improve our understanding of these diseases and allow doctors to more effectively care for people suffering from them.”

The average human small intestine is roughly 23 feet long and 1 inch thick. Sandwiched between the stomach and large intestine, it is where much of the digestion and absorption of food takes place. It is also where symptoms of irritable bowel syndrome, celiac disease, Crohn’s disease and other gastrointestinal illnesses occur.

To assess the organ, doctors typically require patients to drink a thick, chalky liquid called barium. Doctors then use X-rays, magnetic resonance imaging and ultrasounds to assess the organ, but these techniques are limited with respect to safety, accessibility and lack of adequate contrast, respectively.

Also, none are highly effective at providing real-time imaging of movement such as peristalsis, which is the contraction of muscles that propels food through the small intestine. Dysfunction of these movements may be linked to the previously mentioned illnesses, as well as side effects of thyroid disorders, diabetes and Parkinson’s disease.

The news release goes on to describe how the researchers manipulated dyes that are usually unsuitable for the purpose of imaging an organ in the body,

Lovell and a team of researchers worked with a family of dyes called naphthalcyanines. These small molecules absorb large portions of light in the near-infrared spectrum, which is the ideal range for biological contrast agents.

They are unsuitable for the human body, however, because they don’t disperse in liquid and they can be absorbed from the intestine into the blood stream.

To address these problems, the researchers formed nanoparticles called “nanonaps” that contain the colorful dye molecules and added the abilities to disperse in liquid and move safely through the intestine.

In laboratory experiments performed with mice, the researchers administered the nanojuice orally. They then used photoacoustic tomography (PAT), which uses pulsed laser light to generate pressure waves that, when measured, provide a real-time and more nuanced view of the small intestine.
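
For readers curious how measured pressure waves become an image, the core idea in photoacoustic imaging is time-of-flight: the absorbed laser pulse heats the absorber (here, the nanonaps), which emits an ultrasound wave, and the wave’s arrival delay at the detector indicates the absorber’s depth, since the speed of sound in soft tissue is roughly known. The sketch below illustrates that general principle with an assumed speed of sound; it is not the reconstruction method used in the paper.

```python
# Generic time-of-flight illustration of photoacoustic imaging: the arrival
# delay of the laser-induced pressure wave indicates the absorber's depth.
# The speed of sound and the example delay are illustrative assumptions.

SPEED_OF_SOUND_M_PER_S = 1540.0   # assumed average speed of sound in soft tissue

def depth_from_delay(delay_s: float) -> float:
    """Estimate absorber depth (m) from the acoustic arrival delay (s)."""
    return SPEED_OF_SOUND_M_PER_S * delay_s

# Example: a wave arriving 6.5 microseconds after the laser pulse would place
# the absorber about 1 cm below the detector.
print(f"Estimated depth: {depth_from_delay(6.5e-6) * 100:.1f} cm")
```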

The researchers plan to continue to refine the technique for human trials, and move into other areas of the gastrointestinal tract.

Here’s an image of the nanojuice in the guts of a mouse,

The combination of "nanojuice" and photoacoustic tomography illuminates the intestine of a mouse. (Credit: Jonathan Lovell)

The combination of “nanojuice” and photoacoustic tomography illuminates the intestine of a mouse. (Credit: Jonathan Lovell)

This is an international collaboration both from a research perspective and a funding perspective (from the news release),

Additional authors of the study come from UB’s Department of Chemical and Biological Engineering, Pohang University of Science and Technology in Korea, Roswell Park Cancer Institute in Buffalo, the University of Wisconsin-Madison, and McMaster University in Canada.

The research was supported by grants from the National Institutes of Health, the Department of Defense and the Korean Ministry of Science, ICT and Future Planning.

Here’s a link to and a citation for the paper,

Non-invasive multimodal functional imaging of the intestine with frozen micellar naphthalocyanines by Yumiao Zhang, Mansik Jeon, Laurie J. Rich, Hao Hong, Jumin Geng, Yin Zhang, Sixiang Shi, Todd E. Barnhart, Paschalis Alexandridis, Jan D. Huizinga, Mukund Seshadri, Weibo Cai, Chulhong Kim, & Jonathan F. Lovell. Nature Nanotechnology (2014). DOI: 10.1038/nnano.2014.130. Published online 06 July 2014

This paper is behind a paywall.