Part 2 (b) of 3: Science Culture: Where Canada Stands; an expert assessment (reconstructed)

Carrying on from part 2 (a) of this commentary on the Science Culture: Where Canada Stands assessment by the Council of Canadian Academies (CCA).

One of the most intriguing aspects of this assessment was the reliance on an unpublished inventory of Canadian science outreach initiatives (informal science education) that was commissioned by the Korean Foundation for the Advancement of Science and Creativity,

The system of organizations, programs, and initiatives that supports science culture in any country is dynamic. As a result, any inventory provides only a snapshot at a single point in time, and risks quickly becoming out of date. No sustained effort has been made to track public science outreach and engagement efforts in Canada at the national or regional level. Some of the Panel’s analysis relies on data from an unpublished inventory of public science communication initiatives in Canada undertaken in 2011 by Bernard Schiele, Anik Landry, and Alexandre Schiele for the Korean Foundation for the Advancement of Science and Creativity (Schiele et al., 2011). This inventory identified over 700 programs and organizations across all provinces and regions in Canada, including over 400 initiatives related to museums, science centres, zoos, or aquariums; 64 associations or NGOs involved in public science outreach; 49 educational initiatives; 60 government policies and programs; and 27 media programs. (An update of this inventory completed by the Panel brings the total closer to 800 programs.) The inventory is used throughout the chapter [chapter five] to characterize different components of the Canadian system supporting public science outreach, communication, and engagement. (p. 130 PDF; p. 98 print)

I’m fascinated by the Korean interest and wonder if this is due to perceived excellence or to budgetary considerations. The cynic in me suspects the Korean foundation was interested in the US scene but decided that information from the Canadian scene would be cheaper to acquire and the data could be extrapolated to give a perspective on the US scene.

In addition to the usual suspects (newspapers, television, radio, science centres, etc.), the Expert Panel did recognize the importance of online science sources (they would have looked foolish if they hadn’t),

Canadians are increasingly using the internet to seek out information relating to science. This activity can take the form of generalized searches about science-related issues or more targeted forms of information acquisition. For example, Canadians report using the internet to seek out information on health and medical issues an average of 47 times a year, or nearly every week. Other forms of online exposure to scientific content also appear to be common. For example, 46% of Canadians report having read a blog post or listserv related to science and technology at least once in the last three months, and 62% having watched an online video related to science and technology.

An increasing reliance on the internet as the main source of information about science and technology is consistent with the evolution of the media environment, as well as with survey data from other countries. Based on the Panel’s survey, 17% of Canadians, for example, report reading a printed newspaper daily, while 40% report reading about the news or current events online every day. (pp. 132/3 PDF; pp. 100/1 print)

In common with the rest of the world, Canadians are producing and enjoying science festivals,

In Canada there are two established, large-scale science festivals. Science Rendezvous [founded in 2008 as per its Wikipedia entry] takes place in about 20 cities across the country and combines a variety of programming to comprise a day-long free event (Science Rendezvous, 2013).

The annual Eureka! Festival in Montréal (see Figure 5.6 [founded in 2007 as per its program list]) has over 100 activities over three days; it attracted over 68,000 attendees in 2012 (Eureka! Festival, 2013). More science festivals have recently been created. The University of Toronto launched the Toronto Science Festival in fall 2013 (UofT, 2013), and Beakerhead, a new festival described as a “collision of art and culture, technology, and engineering,” was launched in 2013 in Calgary (Beakerhead, 2013). Two Canadian cities have also recently won bids to host STEMfest (Saskatoon in 2015 and Halifax in 2018), an international festival of science, technology, engineering, and mathematics (Global STEM States, 2014). (pp. 145/6 PDF; pp. 113/4 print)

The assessment notes a grand total of five radio and television programmes devoted to science: The Nature of Things, Daily Planet, Quirks and Quarks, Découverte, and Les années lumière (p. 150 PDF; p. 118 print) and a dearth of science journalism,

Dedicated science coverage is notably absent from the majority of newspapers and other print journalism in Canada. As shown in Table 5.3, none of the top 11 newspapers by weekly readership in Canada has a dedicated science section, including nationals such as The Globe and Mail and National Post. Nine of these newspapers have dedicated technology sections, which sometimes contain sub-sections with broader coverage of science or environment stories; however, story coverage tends to be dominated by technology or business (or gaming) stories. Few Canadian newspapers have dedicated science journalists on staff, and The Globe and Mail is unique among Canadian papers in having a science reporter, a medicine and health reporter, and a technology reporter. (p. 152 PDF; p. 120 print)

Not stated explicitly in the assessment is this: those science and technology stories you see in the newspaper are syndicated stories, i.e., written by reporters for the Associated Press, Reuters, and other international press organizations or simply reprinted (with credit) from another newspaper.

The report does cover science blogging with this,

Science blogs are another potential source of information about developments in science and technology. A database compiled by the Canadian Science Writers’ Association, as of March of 2013, lists 143 Canadian science blogs, covering all areas of science and other aspects of science such as science policy and science culture (CSWA, 2013). Some blogs are individually authored and administered, while others are affiliated with larger networks or other organizations (e.g., Agence Science-Presse, PLOS Blogs). Canadian science blogger Maryse de la Giroday has also published an annual round-up of Canadian science blogs on her blog (www.frogheart.ca) for the past three years, and a new aggregator of Canadian science blogs was launched in 2013 (www.scienceborealis.ca). [emphases mine]

Data from the Panel’s survey suggest that blogs are becoming a more prominent source of information about science and technology for the general public. As noted at the beginning of the chapter, 46% of Canadians report having read a blog post about science or technology at least once in the past three months. Blogs are also influencing the way that scientific research is carried out and disseminated. A technical critique in a blog post by Canadian microbiologist Rosie Redfield in 2010, for example, catalyzed a widely publicized debate on the validity of a study published in Science, exploring the ability of bacteria to incorporate arsenic into their DNA. The incident demonstrated the potential impact of blogs on mainstream scientific research. CBC highlighted the episode as the Canadian science story of the year (Strauss, 2011), and Nature magazine identified Redfield as one of its 10 newsmakers of the year in 2011 as a result of her efforts to replicate the initial study and publicly document her progress and results (Hayden, 2011).

The impact of online information sources, however, is not limited to blogs, with 42% of Canadians reporting having heard about a science and technology news story through social media sources like Twitter and Facebook in the last three months. And, as noted earlier, the internet is often used to search for information about specific science and technology topics, both for general issues such as climate change, and more personalized information on medical and health issues. (pp. 153/4 PDF; pp. 121/2 print)

Yes, I got a shout out, as did Rosie Redfield. We were the only two science bloggers namechecked. (Years ago, when the Guardian newspaper was developing a science blog network, the editor claimed, after fierce criticism of its first list of bloggers, that he couldn’t find many female science bloggers. This was immediately repudiated, not only by individuals but also by someone who compiled a list of hundreds of female science bloggers.) Still, the perception persists, and I’m thrilled that the panel struck out in a different direction. I was also pleased to see Science Borealis (a Canadian science blog aggregator) mentioned. Having been involved with its founding, I’m also delighted its first anniversary was celebrated in Nov. 2014.

I doubt many people know we have a science press organization in Canada, Agence Science-Presse, but perhaps this mention in the assessment will help raise awareness in Canada’s English language media,

Founded in 1978 with the motto Parce que tout le monde s’intéresse à la science (“because everyone is interested in science”), Agence Science-Presse is a not-for-profit organization in Quebec that supports media coverage of science by distributing articles on scientific research or other topical science and technology issues to media outlets in Canada and abroad. The organization also supports science promotion activities aimed at youth. For example, it currently edits and maintains an aggregation of blogs designed for young science enthusiasts and science journalists (Blogue ta science). (p. 154 PDF; p. 122)

The sixth chapter of the assessment makes five key recommendations for ‘Cultivating a strong science culture’:

  1. Support lifelong science learning
  2. Make science inclusive
  3. Adapt to new technologies
  4. Enhance science communication and engagement
  5. Provide national or regional leadership

Presumably the agriculture reference in the chapter title is tongue-in-cheek. Assuming that’s not one of my fantasies, it’s good to see a little humour.

On to the first recommendation, lifelong learning,

… Science centres and museums, science programs on radio and television, science magazines and journalism, and online resources can all help fulfil this function by providing accessible resources for adult science learning, and by anticipating emerging information needs based on topical issues.

Most informal science learning organizations already provide these opportunities to varying degrees; however, this conception of the relative roles of informal and formal science learning providers differs from the traditional understanding, which often emphasizes how informal environments can foster engagement in science (particularly among youth), thereby triggering additional interest and the later acquisition of knowledge (Miller, 2010b). [emphasis mine] Such a focus may be appropriate for youth programming, but neglects the role that these institutions can play in ongoing education for adults, who often seek out information on science based on specific, well-defined interests or needs (e.g., a medical diagnosis, a newspaper article on the threat of a viral pandemic, a new technology brought into the workplace) (Miller, 2012). [emphases mine] Informal science learning providers can take advantage of such opportunities by anticipating these needs, providing useful and accessible information, and then simultaneously building and deepening knowledge of the underlying science through additional content.

I’m glad to see the interest in adult informal science education, although the emphasis on health/medical and workplace technology issues suggests the panel underestimates, despite the data from its own survey, Canadians’ curiosity about and interest in science and technology. The panel also underestimates the tenacity with which many gatekeepers hold to the belief that no one is interested in science. It took me two years before a local organizer would talk to me about including one science-themed meeting in his programme (the final paragraph in my April 14, 2014 post describes some of the process and my April 18, 2014 post describes the somewhat disappointing outcome). In the end, it was great to see a science-themed ‘city conversation’ but I don’t believe the organizer found it to be a success, which means it’s likely to be a long time before there’s another one.

The next recommendation, ‘Making science inclusive’, is something I think needs better practice. If one is going to be the change one wants to see, that means getting people onto your expert panels who reflect your inclusiveness, and explaining to your audience how your expert panel is inclusive.

The ‘Adapting to new technologies’ recommendation is where I expected to see some mention of the social impact of such emerging technologies as robotics, nanotechnology, synthetic biology, etc. That wasn’t the case,

Science culture in Canada and other countries is now evolving in a rapidly changing technological environment. Individuals are increasingly turning to online sources for information about science and technology, and science communicators and the media are also adapting to the new channels of communication and outreach provided over the internet. As people engage more with new forms of technology in their home and work lives, organizations may be able to identify new ways to take advantage of available technologies to support learning and foster science interest and engagement. At the same time, as noted in Chapter 2, this transition is also challenging traditional models of operation for many organizations such as science centres, museums, and science media providers, forcing them to develop new strategies.

Examples of the use of new technologies to support learning are now commonplace. Nesta, an innovation-oriented organization based in the United Kingdom, conducted a study investigating the extent to which new technologies are transforming learning among students (Luckin et al., 2012). (p. 185 PDF; p. 153 print)

Admittedly, the panel was not charged with looking too far into the future, but it does seem odd that in a science culture report there isn’t much mention (other than a cursory comment in an early chapter) of these emerging technologies and the major changes they are bringing with them. If nothing else, the panel might have wanted to mention artificial intelligence and how the increasing role of automated systems may be affecting science culture in Canada. For example, in my July 16, 2014 post I described a deal Associated Press (AP) signed with a company that automates the process of writing sports and business stories. You may well have read a business story (AP contracted for business stories) written by an artificial intelligence system or, if you prefer the term, an algorithm.

The recommendation for ‘Enhancing science communication and engagement’ is where I believe the Expert Panel should be offered a bouquet,

… Given the significance of government science in many areas of research, government science communication constitutes an important vector for increasing public awareness and understanding about science. In Canada current policies governing how scientists working in federal departments and agencies are allowed to interact with the media and the public have come under heavy criticism in recent years …

Concerns about the federal government’s current policies on government scientists’ communication with the media have been widely reported in Canadian and international press in recent years (e.g., Ghosh, 2012; CBC, 2013c; Gatehouse, 2013; Hume, 2013; Mancini, 2013; Munro, 2013). These concerns were also recently voiced by the editorial board of Nature (2012), which unfavourably compared Canada’s current approach with the more open policies now in place in the United States. Scientists at many U.S. federal agencies are free to speak to the media without prior departmental approval, and to express their personal views as long as they clearly state that they are not speaking on behalf of the government. In response to such concerns, and to a formal complaint filed by the Environmental Law Clinic at the University of Victoria and Democracy Watch, on April 2, 2013 Canada’s Information Commissioner launched an investigation into whether current policies and policy instruments in seven federal departments and agencies are “restricting or prohibiting government scientists from speaking with or sharing research with the media and the Canadian public” (OICC, 2013).

Since these concerns have come to light, many current and former government scientists have discussed how these policies have affected their interactions with the media. Marley Waiser, a former scientist with Environment Canada, has spoken about how that department’s policies prevented her from discussing her research on chemical pollutants in Wascana Creek near Regina (CBC, 2013c). Dr. Kristi Miller, a geneticist with the Department of Fisheries and Oceans, was reportedly prevented from speaking publicly about a study she published in Science, which investigated whether a viral infection might be the cause of declines in Sockeye salmon stocks in the Fraser River (Munro, 2011).

According to data from Statistics Canada (2012), nearly 20,000 science and technology professionals work for the federal government. The ability of these researchers to communicate with the media and the Canadian public has a clear bearing on Canada’s science culture. Properly supported, government scientists can serve as a useful conduit for informing the public about their scientific work, and engaging the public in discussions about the social relevance of their research; however, the concerns reported above raise questions about the extent to which current federal policies in Canada are limiting these opportunities for public communication and engagement. (pp. 190/1 PDF; p. 158/9 print)

Kudos for including the information and for this passage as well,

Many organizations including science centres and museums, research centres, and even governments may be perceived as having a science promotion agenda that portrays only the benefits of science. As a result, these organizations are not always seen as promoters of debate through questioning, which is a crucial part of the scientific process. Acknowledging complexity and controversy is another means to improve the quality of public engagement in science in a range of different contexts. (p. 195 PDF; p. 163 print)

One last happy note, which is about integrating the arts and design into the STEM (science, technology, engineering, and mathematics) communities,

Linking Science to the Arts and Design U.S. advocates for “STEM to STEAM” call for an incorporation of the arts in discussions of science, technology, engineering, and mathematics in an effort to “achieve a synergistic balance” (Piro, 2010). They cite positive outcomes such as cognitive development, reasoning skills, and concentration abilities. Piro (2010) argues that “if creativity, collaboration, communication, and critical thinking — all touted as hallmark skills for 21st-century success — are to be cultivated, we need to ensure that STEM subjects are drawn closer to the arts.” Such approaches offer new techniques to engage both student and adult audiences in science learning and engagement opportunities.

What I find fascinating about this STEM to STEAM movement is that many of these folks don’t seem to realize that, until fairly recently, the arts and sciences have always been closely allied. James Clerk Maxwell was also a poet, not uncommon amongst 19th century scientists.

In Canada one example of this approach is found in the work of Michael R. Hayden, who has conducted extensive genetic research on Huntington disease. In the lead-up to the 2000 Human Genome Project World Conference, Hayden commissioned Vancouver’s Electric Company Theatre to fuse “the spheres of science and art in a play that explored the implications of the revolutionary technology of the Human Genome Project” (ECT, n.d.). This play, The Score, was later adapted into a film. Hayden believes that his play “transforms the scientific ideas explored in the world of the laboratory into universal themes of human identity, freedom and creativity, and opens up a door for a discussion between the scientific community and the public in general” (Genome Canada, 2006). (p. 196 PDF; p. 164 print)

I’m not sure why the last recommendation presents an either/or choice, ‘Providing national or regional leadership’, while the following content suggests a much more fluid state,

…  it should be recognized that establishing a national or regional vision for science culture is not solely the prerogative of government. Such a vision requires broad support and participation from the community of affected stakeholders to be effective, and can also emerge from that community in the absence of a strong governmental role.

The final chapter (the seventh) restates the points the panel has made throughout its report. Unexpectedly, part 2 got bigger, ’nuff said.

Part 2 (a) of 3: Science Culture: Where Canada Stands; an expert assessment (reconstructed)

Losing over 2000 words, i.e., part 2 of this commentary on the Science Culture: Where Canada Stands assessment by the Council of Canadian Academies (CCA), on New Year’s Eve 2014 was a bit of a blow. So, here’s my attempt at reconstructing my much mourned part 2.

There was acknowledgement of Canada as an Arctic country and of this country’s extraordinary geographical relationship to the world’s marine environment,

Canada’s status as an Arctic nation also has a bearing on science and science culture. Canada’s large and ecologically diverse Arctic landscape spans a substantial part of the circumpolar Arctic, and comprises almost 40% of the country’s landmass (Statistics Canada, 2009). This has influenced the development of Canadian culture more broadly, and also created opportunities in the advancement of Arctic science. Canada’s northern inhabitants, the majority of whom are Indigenous peoples, represent a source of knowledge that contributes to scientific research in the North (CCA, 2008).

These characteristics have contributed to the exploration of many scientific questions including those related to environmental science, resource development, and the health and well-being of northern populations. Canada also has the longest coastline of any country, and these extensive coastlines and marine areas give rise to unique research opportunities in ocean science (CCA, 2013a). (p. 55 PDF; p. 23 print)

Canada’s aging population is acknowledged in a backhanded way,

Like most developed countries, Canada’s population is also aging. In 2011 the median age in Canada was 39.9 years, up from 26.2 years in 1971 (Statistics Canada, n.d.). This ongoing demographic transition will have an impact on science culture in Canada in years to come. An aging population will be increasingly interested in health and medical issues. The ability to make use of this kind of information will depend in large part on the combination of access to the internet, skill in navigating it, and a conceptual toolbox that includes an understanding of genes, probability, and related constructs (Miller, 2010b). (p. 56 PDF; p. 24 print)

Yes, the only science topics of interest for an old person are health and medicine. Couldn’t they have included one sentence suggesting an aging population’s other interests and other possible impacts on science culture?

On the plus side, the report offers a list of selected Canadian science culture milestones,

• 1882 – Royal Society of Canada is established.
• 1916 – National Research Council is established.
• 1923 – Association canadienne-française pour l’avancement des sciences (ACFAS) is established.
• 1930 – Canadian Geographic is first published by the Royal Canadian Geographical Society.
• 1951 – Massey–Lévesque Commission calls for the creation of a national science and technology museum.
• 1959 – Canada sees its first science fairs in Winnipeg, Edmonton, Hamilton, Toronto, Montréal, and Vancouver; volunteer coordination eventually grows into Youth Science Canada.
• 1960 – CBC’s Nature of Things debuts on television; Fernand Séguin hosts “Aux frontières de la science.”
• 1962 – ACFAS creates Le Jeune scientifique, which becomes Québec Science in 1970.
• 1966 – Science Council of Canada is created to advise Parliament on science and technology issues.
• 1967 – Canada Museum of Science and Technology is created.
• 1969 – Ontario Science Centre opens its doors (the Exploratorium in San Francisco opens the same year).
• 1971 – Canadian Science Writers’ Association is formed.
• 1975 – Symons Royal Commission on Canadian Studies speaks to how understanding the role of science in society is important to understanding Canadian culture and identity.
• 1975 – Quirks and Quarks debuts on CBC Radio.
• 1976 – OWL children’s magazine begins publication.
• 1977 – Association des communicateurs scientifiques du Québec is established.
• 1978 – L’Agence Science-Presse is created.
• 1981 – Association des communicateurs scientifiques creates the Fernand-Séguin scholarship to identify promising young science journalists.
• 1982 – Les Débrouillards is launched in Quebec. (p. 58 PDF; p. 26 print)

The list spills onto the next page and into the 2000s.

It’s a relief to see the Expert Panel give a measured response to the claims made about science culture and its various impacts, especially on the economy (in my book, some of the claims have bordered on hysteria),

The Panel found little definitive empirical evidence of causal relationships between the dimensions of science culture and higher-level social objectives like stronger economic performance or more effective public policies. As is the case with much social science research, isolating the impacts of a single variable on complex social phenomena is methodologically challenging, and few studies have attempted to establish such relationships in any detail. As noted in 1985 by the Bodmer report (a still-influential report on public understanding of science in the United Kingdom), although there is good reason prima facie to believe that improving public understanding of science has national economic benefits, empirical proof for such a link is often elusive (RS & Bodmer, 1985). This remains the case today. Nevertheless, many pieces of evidence suggest why a modern, industrialized society should cultivate a strong science culture. Literature from the domains of cognitive science, sociology, cultural studies, economics, innovation, political science, and public policy provides relevant insights. (p. 63 PDF; p. 31 print)

Intriguingly, while the panel made extensive use of social science methods for this assessment, it makes some assumptions about the skill sets required for the future,

Technological innovation depends on the presence of science and technology skills in the workforce. While at one point it may have been possible for relatively low-skilled individuals to substantively contribute to technological development, in the 21st century this is no longer the case. [emphasis mine] Advanced science and technology skills are now a prerequisite for most types of technological innovation. (p. 72 PDF; p. 40 print)

Really, it’s no longer possible for relatively low-skilled individuals to contribute to technological development? Maybe the expert panel missed this bit in my March 27, 2013 post,

Getting back to Bittel’s Slate article, he mentions Foldit (here’s my first piece in an Aug. 6, 2010 posting [scroll down about 1/2 way]), a protein-folding game which has generated some very exciting science. He also notes some of that science was generated by older, ‘uneducated’ women. Bittel linked to Jeff Howe’s Feb. 27, 2012 article about Foldit and other crowdsourced science projects for Slate where I found this very intriguing bit,

“You’d think a Ph.D. in biochemistry would be very good at designing protein molecules,” says Zoran Popović, the University of Washington game designer behind Foldit. Not so. “Biochemists are good at other things. But Foldit requires a narrow, deeper expertise.”

Or as it turns out, more than one. Some gamers have a preternatural ability to recognize patterns, an innate form of spatial reasoning most of us lack. Others—often “grandmothers without a high school education,” says Popovic—exercise a particular social skill. “They’re good at getting people unstuck. They get them to approach the problem differently.” What big pharmaceutical company would have anticipated the need to hire uneducated grandmothers? (I know a few, if Eli Lilly HR is thinking of rejiggering its recruitment strategy.) [emphases mine]

It’s not the idea that technical and scientific skills are needed that concerns me; it’s the report’s hard line about ‘low skills’ (which is a term that is not defined). In addition to the notion that future jobs require only individuals with ‘high level’ skills; there’s the notion (not mentioned in this report but gaining general acceptance in the media) that we shouldn’t ever have to perform repetitive and boring activities. It’s a notion which completely ignores a certain aspect of the learning process. Very young children repeat over and over and over and over … . Apprenticeships in many skills-based crafts were designed with years of boring, repetitive work as part of the training. It seems counter-intuitive but boring, repetitive activities can lead to very high level skills such as the ability to ‘unstick’ a problem for an expert with a PhD in biochemistry.

Back to the assessment, the panel commissioned a survey, conducted in 2013, to gather data about science culture in Canada,

The Panel’s survey of Canadian science culture, designed to be comparable to surveys undertaken in other countries as well as to the 1989 Canadian survey, assessed public attitudes towards science and technology, levels and modes of public engagement in science, and public science knowledge or understanding. (The evidence reported in this chapter on the fourth dimension, science and technology skills, is drawn from other sources such as Statistics Canada and the OECD).

Conducted in April 2013, the survey relied on a combination of landline and mobile phone respondents (60%) and internet respondents (40%), randomly recruited from the general population. In analyzing the results, responses to the survey were weighted based on Statistics Canada data according to region, age, education, and gender to ensure that the sample was representative of the Canadian public. A total of 2,004 survey responses were received, with regional breakdowns presented in Table 4.1. At a national level, survey results are accurate within a range of plus or minus 2.2%, 19 times out of 20 (i.e., at the 95% confidence interval), and margins of error for regional results range from 3.8% to 7.1%. Three open-ended questions were also included in the survey, which were coded using protocols previously applied to these questions in other international surveys. All open-ended questions were coded independently by at least three bilingual coders, and any discrepancies in coding were settled through a review by a fourth coder. (p. 79 PDF; p. 47 print)
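As a sanity check on those numbers, the headline ±2.2% follows from the standard margin-of-error formula for a proportion at the 95% confidence level. Here’s a minimal Python sketch (the function name is mine, and it ignores the panel’s weighting and any design effects, so it only reproduces the simple-random-sample figure):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    in percentage points, using the conservative worst case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n) * 100

# The national sample of 2,004 responses gives roughly the report's 2.2%
print(round(margin_of_error(2004), 1))  # 2.2
```

Smaller regional samples are what push the regional margins of error up into the 3.8% to 7.1% range the report cites.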

The infographic’s data in part 1 of this commentary, What Do Canadians Think About Science and Technology (S&T)?, is based on the survey and other statistical information included in the report, especially chapter four, which focuses on measurements (pp. 77–127 PDF; pp. 45–95 print). While the survey presents a somewhat rosier picture of Canadian science culture than the one I experience on a daily basis, the data seems to have been gathered in a thoughtful fashion. Regardless of the assessment’s findings and my opinions, how Canadians view science became a matter of passionate debate in the Canadian science blogging community (at least parts of it) in late 2014, as per a Dec. 3, 2014 posting by the Science Borealis team on their eponymous blog (Note: Links have been removed),

The CBC’s Rick Mercer is a staunch science advocate, and his November 19th rant was no exception. He addressed the state of basic science in Canada, saying that Canadians are “passionate and curious about science.”

In response, scientist David Kent wrote a post on the Black Hole Blog in which he disagreed with Mercer, saying, “I do not believe Mr. Mercer’s idea that Canadians as a whole are interested although I, like him, would wish it to be the case.”

Kent’s post has generated some fierce discussion, both in the comments on his original post and in the comments on a Facebook post by Evidence for Democracy.

Here at Science Borealis, we rely on a keen and enthusiastic public to engage with the broad range of science-based work our bloggers share, so we decided to address some of the arguments Kent presented in his post.

Anecdotal evidence versus data

Kent says “Mr. Mercer’s claims about Canadians’ passions are anecdotal at best, and lack any evidence – indeed it is possible that Canadians don’t give a hoot about science for science’s sake.”

Unfortunately, Kent’s own argument is based on anecdotal evidence (“To me it appears that… the average Canadian adult does not particularly care about how or why something works.”).

If you’re looking for data, they’re available in a recent Council of Canadian Academies report that specifically studied science culture in Canada. Results show that Canadians are very interested in science.

You can find David Kent’s Nov. 26, 2014 post about Canadians, Rick Mercer and science here. Do take a look at the blog’s comments which feature a number of people deeply involved in promoting and producing Canadian science culture.

I promised disturbing statistics at the head of this posting, and here they are, in the second paragraph,

Canadian students perform well in PISA [Organization for Economic Cooperation and Development’s (OECD) Programme for International Student Assessment (PISA)] , with relatively high scores on all three of the major components of the assessment (reading, science, and mathematics) compared with students in other countries (Table 4.4). In 2012 only seven countries or regions had mean scores on the science assessment higher than Canada on a statistically significant basis: Shanghai–China, Hong Kong–China, Singapore, Japan, Finland, Estonia, and Korea (Brochu et al., 2013). A similar pattern holds for mathematics scores, where nine countries had mean scores higher than Canada on a statistically significant basis: Shanghai–China, Singapore, Hong Kong–China, Chinese Taipei, Korea, Macao–China, Japan, Lichtenstein, and Switzerland (Brochu et al., 2013). Regions scoring higher than Canada are concentrated in East Asia, and tend to be densely populated, urban areas. Among G8 countries, Canada ranks second on mean science and mathematics scores, behind Japan.

However, the 2012 PISA results also show statistically significant declines in Canada’s scores on both the mathematics and science components. Canada’s science score declined by nine points from its peak in 2006 (with a fall in ranking from 3rd to 10th), and the math score declined by 14 points since first assessed in 2003 (a fall from 7th to 13th) (Brochu et al., 2013). Changes in Canada’s standing relative to other countries reflect both the addition of new countries or regions over time (i.e., the addition of regions such as Hong Kong–China and Chinese Taipei in 2006, and of Shanghai–China in 2009) and statistically significant declines in mean scores.

My Oct. 9, 2013 post discusses the scores in more detail and, as the Expert Panel notes, the drop is disconcerting and disturbing. Hopefully, it doesn’t indicate a trend.

Part 2 (b) follows immediately.

Gelatin nanoparticles for drug delivery after a stroke

A Dec. 24, 2014 news item on phys.org describes a treatment that could mitigate the effects of a stroke by extending the window of opportunity for recuperative treatments (Note: Links have been removed),

Stroke victims could have more time to seek treatment that could reduce harmful effects on the brain, thanks to tiny blobs of gelatin that could deliver the medication to the brain noninvasively.

University of Illinois researchers and colleagues in South Korea, led by U. of I. electrical and computer engineering senior research scientist Hyungsoo Choi and professor Kyekyoon “Kevin” Kim, published details about the gelatin nanoparticles in the journal Drug Delivery and Translational Research.

A Dec. 23, 2014 University of Illinois at Urbana-Champaign news release, which originated the news item, explains how the gelatin nanoparticles are directed to the injury site (Note: links have been removed),

The researchers found that gelatin nanoparticles could be laced with medications for delivery to the brain, and that they could extend the treatment window for when a drug could be effective. Gelatin is biocompatible, biodegradable, and classified as “Generally Recognized as Safe” by the Food and Drug Administration. Once administered, the gelatin nanoparticles target damaged brain tissue thanks to an abundance of gelatin-munching enzymes produced in injured regions.

The tiny gelatin particles have a huge benefit: They can be administered nasally, a noninvasive and direct route to the brain. This allows the drug to bypass the blood-brain barrier, a biological fence that prevents the vast majority of drugs from entering the brain through the bloodstream.

“Overcoming the difficulty of delivering therapeutic agents to specific regions of the brain presents a major challenge to treatment of most neurological disorders,” said Choi.  “However, if drug substances can be transferred along the olfactory nerve cells, they can bypass the blood-brain barrier and enter the brain directly.”

To test gelatin nanoparticles as a drug-delivery system, the researchers used the drug osteopontin (OPN), which in rats can help to reduce inflammation and prevent brain cell death if administered immediately after a stroke.

“It is crucial to treat ischemic strokes within three hours to improve the chances of recovery. However, a significant number of stroke victims don’t get to the hospital in time for the treatment,” Kim said.

By lacing gelatin nanoparticles with OPN, the researchers found that they could extend the treatment window in rats, so much so that treating a rat with nanoparticles six hours after a stroke showed the same efficacy rate as giving them OPN alone after one hour – 70 percent recovery of dead volume in the brain.

The researchers hope the gelatin nanoparticles, administered through the nasal cavity, can help deliver other drugs to more effectively treat a variety of brain injuries and neurological diseases.

“Gelatin nanoparticles are a delivery vehicle that could be used to deliver many therapeutics to the brain,” Choi said. “They will be most effective in delivering drugs that cannot cross the blood-brain barrier. In addition, they can be used for drugs of high toxicity or a short half-life.“

I expect the next steps will include some human clinical trials. In the meantime for those who are interested, here’s a link to and a citation for the paper,

Gelatin nanoparticles enhance the neuroprotective effects of intranasally administered osteopontin in rat ischemic stroke model by Elizabeth Joachim, Il-Doo Kim, Yinchuan Jin, Kyekyoon (Kevin) Kim, Ja-Kyeong Lee, and Hyungsoo Choi. Drug Delivery and Translational Research, Volume 4, Issue 5-6, pp. 395–399. DOI: 10.1007/s13346-014-0208-9. Published online Nov. 8, 2014.

This paper is behind a paywall.

Flexible electronics and Inorganic-based Laser Lift-off (ILLO) in Korea

Korean scientists are trying to make the process of creating flexible electronics easier, according to a Nov. 25, 2014 news item on ScienceDaily,

Flexible electronics have been touted as the next generation in electronics in various areas, ranging from consumer electronics to bio-integrated medical devices. In spite of their merits, insufficient performance of organic materials arising from inherent material properties and processing limitations in scalability have posed big challenges to developing all-in-one flexible electronics systems in which display, processor, memory, and energy devices are integrated. The high temperature processes, essential for high performance electronic devices, have severely restricted the development of flexible electronics because of the fundamental thermal instabilities of polymer materials.

A research team headed by Professor Keon Jae Lee of the Department of Materials Science and Engineering at KAIST provides an easier methodology to realize high performance flexible electronics by using the Inorganic-based Laser Lift-off (ILLO).

The process is described in a Nov. 26, 2014 KAIST news release on ResearchSEA, which originated the news item (the date discrepancy is probably due to time zone differences) and which provides more detail about the ILLO technique,

The ILLO process involves depositing a laser-reactive exfoliation layer on rigid substrates, and then fabricating ultrathin inorganic electronic devices, e.g., high density crossbar memristive memory on top of the exfoliation layer. By laser irradiation through the back of the substrate, only the ultrathin inorganic device layers are exfoliated from the substrate as a result of the reaction between laser and exfoliation layer, and then subsequently transferred onto any kind of receiver substrate such as plastic, paper, and even fabric.

This ILLO process can enable not only nanoscale processes for high density flexible devices but also the high temperature process that was previously difficult to achieve on plastic substrates. The transferred device successfully demonstrates fully-functional random access memory operation on flexible substrates even under severe bending.

Professor Lee said, “By selecting an optimized set of inorganic exfoliation layer and substrate, a nanoscale process at a high temperature of over 1000 °C can be utilized for high performance flexible electronics. The ILLO process can be applied to diverse flexible electronics, such as driving circuits for displays and inorganic-based energy devices such as battery, solar cell, and self-powered devices that require high temperature processes.”

Here’s a link to and a citation for the research paper,

Flexible Crossbar-Structured Resistive Memory Arrays on Plastic Substrates via Inorganic-Based Laser Lift-Off by Seungjun Kim, Jung Hwan Son, Seung Hyun Lee, Byoung Kuk You, Kwi-Il Park, Hwan Keon Lee, Myunghwan Byun and Keon Jae Lee. Advanced Materials Volume 26, Issue 44, pages 7480–7487, November 26, 2014 Article first published online: 8 SEP 2014 DOI: 10.1002/adma.201402472

© 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

Here’s an image the researchers have made available,

This photo shows the flexible RRAM device on a plastic substrate. Courtesy: KAIST


Finally, the research paper is behind a paywall.

Nanosafety research: a quality control issue

Toxicologist Dr. Harald Krug has published a review of several thousand studies on nanomaterials safety exposing problematic research methodologies and conclusions. From an Oct. 29, 2014 news item on Nanowerk (Note: A link has been removed),

Empa [Swiss Federal Laboratories for Materials Science and Technology] toxicologist Harald Krug has lambasted his colleagues in the journal Angewandte Chemie (“Nanosafety Research—Are We on the Right Track?”). He evaluated several thousand studies on the risks associated with nanoparticles and discovered no end of shortcomings: poorly prepared experiments and results that don’t carry any clout. Instead of merely leveling criticism, however, Empa is also developing new standards for such experiments within an international network.

An Oct. 29, 2014 Empa press release (also on EurekAlert), which originated the news item, describes the new enthusiasm for research into nanomaterials and safety,

Researching the safety of nanoparticles is all the rage. Thousands of scientists worldwide are conducting research on the topic, examining the question of whether titanium dioxide nanoparticles from sun creams can get through the skin and into the body, whether carbon nanotubes from electronic products are as hazardous for the lungs as asbestos used to be or whether nanoparticles in food can get into the blood via the intestinal flora, for instance. Public interest is great, research funds are flowing – and the number of scientific projects is skyrocketing: between 1980 and 2010, a total of 5,000 projects were published, followed by another 5,000 in just the last three years. However, the amount of new knowledge has only increased marginally. After all, according to Krug the majority of the projects are poorly executed and all but useless for risk assessments.

The press release goes on to describe various pathways into the body and problems with research methodologies,

How do nanoparticles get into the body?

Artificial nanoparticles measuring between one and 100 nanometers in size can theoretically enter the body in three ways: through the skin, via the lungs and via the digestive tract. Almost every study concludes that healthy, undamaged skin is an effective protective barrier against nanoparticles. When it comes to the route through the stomach and gut, however, the research community is at odds. But upon closer inspection the value of many alarmist reports is dubious – such as when nanoparticles made of soluble substances like zinc oxide or silver are being studied. Although the particles disintegrate and the ions drifting into the body are cytotoxic, this effect has nothing to do with the topic of nanoparticles but is merely linked to the toxicity of the (dissolved) substance and the ingested dose.

Laboratory animals die in vain – drastic overdoses and other errors

Krug also discovered that some researchers maltreat their laboratory animals with absurdly high amounts of nanoparticles. Chinese scientists, for instance, fed mice five grams of titanium oxide per kilogram of body weight, without detecting any effects. By way of comparison: half the amount of kitchen salt would already have killed the animals. A sloppy job is also being made of things in the study of lung exposure to nanoparticles: inhalation experiments are expensive and complex because a defined number of particles has to be swirled around in the air. Although it is easier to place the particles directly in the animal’s windpipe (“instillation”), some researchers overdo it to such an extent that the animals suffocate on the sheer mass of nanoparticles.

While others might well make do without animal testing and conduct in vitro experiments on cells, here, too, cell cultures are covered by layers of nanoparticles that are 500 nanometers thick, causing them to die from a lack of nutrients and oxygen alone – not from a real nano-effect. And even the most meticulous experiment is worthless if the particles used have not been characterized rigorously beforehand. Some researchers simply skip this preparatory work and use the particles “straight out of the box”. Such experiments are irreproducible, warns Krug.

As noted in the news item, the scientists at Empa have devised a solution to some to of the problems (from the press release),

The solution: inter-laboratory tests with standard materials
Empa is thus collaborating with research groups like EPFL’s Powder Technology Laboratory, with industrial partners and with Switzerland’s Federal Office of Public Health (FOPH) to find a solution to the problem: on 9 October the “NanoScreen” programme, one of the “CCMX Materials Challenges”, got underway, which is expected to yield a set of pre-validated methods for lab experiments over the next few years. It involves using test materials that have a closely defined particle size distribution, possess well-documented biological and chemical properties and can be altered in certain parameters – such as surface charge. “Thanks to these methods and test substances, international labs will be able to compare, verify and, if need be, improve their experiments,” explains Peter Wick, Head of Empa’s laboratory for Materials-Biology Interactions.

Instead of the all-too-familiar “fumbling around in the dark”, this would provide an opportunity for internationally coordinated research strategies to not only clarify the potential risks of new nanoparticles in retrospect but even be able to predict them. The Swiss scientists therefore coordinate their research activities with the National Institute of Standards and Technology (NIST) in the US, the European Commission’s Joint Research Center (JRC) and the Korean Institute of Standards and Science (KRISS).

Bravo, and thank you, Dr. Krug and Empa, for confirming something I’ve suspected due to hints from more informed commentators. Unfortunately, my ignorance about research protocols has not permitted me to undertake a better analysis of the research.

Here’s a link to and a citation for the paper,

Nanosafety Research—Are We on the Right Track? by Prof. Dr. Harald F. Krug. Angewandte Chemie International Edition DOI: 10.1002/anie.201403367 Article first published online: 10 OCT 2014

This is an open access paper.

Battery-free cardiac pacemaker

This particular energy-harvesting pacemaker has been tested ‘in vivo’ or, as some like to say, ‘on animal models’. From an Aug. 31, 2014 European Society of Cardiology news release (also on EurekAlert),

A new batteryless cardiac pacemaker based on an automatic wristwatch and powered by heart motion was presented at ESC Congress 2014 today by Adrian Zurbuchen from Switzerland. The prototype device does not require battery replacement.

Mr Zurbuchen, a PhD candidate in the Cardiovascular Engineering Group at ARTORG, University of Bern, Switzerland, said: “Batteries are a limiting factor in today’s medical implants. Once they reach a critically low energy level, physicians see themselves forced to replace a correctly functioning medical device in a surgical intervention. This is an unpleasant scenario which increases costs and the risk of complications for patients.”

Four years ago Professor Rolf Vogel, a cardiologist and engineer at the University of Bern, had the idea of using an automatic wristwatch mechanism to harvest the energy of heart motion. Mr Zurbuchen said: “The heart seems to be a very promising energy source because its contractions are repetitive and present for 24 hours a day, 7 days a week. Furthermore the automatic clockwork, invented in the year 1777, has a good reputation as a reliable technology to scavenge energy from motion.”

The researchers’ first prototype is based on a commercially available automatic wristwatch. All unnecessary parts were removed to reduce weight and size. In addition, they developed a custom-made housing with eyelets that allows suturing the device directly onto the myocardium (photo 1).

The prototype works the same way it would on a person’s wrist. When it is exposed to an external acceleration, the eccentric mass of the clockwork starts rotating. This rotation progressively winds a mechanical spring. After the spring is fully charged it unwinds and thereby spins an electrical micro-generator.

To test the prototype, the researchers developed an electronic circuit to transform and store the signal into a small buffer capacity. They then connected the system to a custom-made cardiac pacemaker (photo 2). The system worked in three steps. First, the harvesting prototype acquired energy from the heart. Second, the energy was temporarily stored in the buffer capacity. And finally, the buffered energy was used by the pacemaker to apply minute stimuli to the heart.
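The three-step chain described above (harvest, buffer, stimulate) can be sketched as a toy energy-balance simulation. All the parameter values below are illustrative assumptions, not the Bern group's measurements:

```python
# Toy simulation of the harvest -> buffer -> pace energy chain.
# All parameters are illustrative assumptions, not measured values.

HARVEST_UJ_PER_BEAT = 2.0   # energy scavenged per heartbeat (microjoules)
PULSE_COST_UJ = 1.0         # energy per pacing stimulus (microjoules)
BUFFER_CAP_UJ = 50.0        # capacity of the storage capacitor (microjoules)

def simulate(beats, buffer_uj=0.0):
    """Return (pulses delivered, energy left in the buffer) after `beats` heartbeats."""
    pulses = 0
    for _ in range(beats):
        # Step 1: harvest energy from the beat, capped by the capacitor size
        buffer_uj = min(buffer_uj + HARVEST_UJ_PER_BEAT, BUFFER_CAP_UJ)
        # Steps 2 and 3: if the buffer holds enough energy, deliver a stimulus
        if buffer_uj >= PULSE_COST_UJ:
            buffer_uj -= PULSE_COST_UJ
            pulses += 1
    return pulses, buffer_uj

pulses, leftover = simulate(100)
print(pulses, leftover)  # prints 100 49.0
```

With these toy numbers the harvester keeps up with demand and the buffer saturates; setting HARVEST_UJ_PER_BEAT below PULSE_COST_UJ would instead show the device pacing only intermittently.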

The researchers successfully tested the system in in vivo experiments with domestic pigs. The newly developed system allowed them for the first time to perform batteryless overdrive-pacing at 130 beats per minute.

Mr Zurbuchen said: “We have shown that it is possible to pace the heart using the power of its own motion. The next step in our prototype is to integrate both the electronic circuit for energy storage and the custom-made pacemaker directly into the harvesting device. This will eliminate the need for leads.”

He concluded: “Our new pacemaker tackles the two major disadvantages of today’s pacemakers. First, pacemaker leads are prone to fracture and can pose an imminent threat to the patient. And second, the lifetime of a pacemaker battery is limited. Our energy harvesting system is located directly on the heart and has the potential to avoid both disadvantages by providing the world with a batteryless and leadless pacemaker.”

This project seems the furthest along with regard to its prospects for replacing batteries in pacemakers (with leadlessness being a definite plus), but there are other projects, such as one in Korea where Professor Keon Jae Lee of KAIST and Professor Boyoung Joung, M.D., at Severance Hospital of Yonsei University are working on a piezoelectric nanogenerator, according to a June 26, 2014 article by Colin Jeffrey for Gizmodo.com,

… Unfortunately, the battery technology used to power these devices [cardiac pacemakers] has not kept pace and the batteries need to be replaced on average every seven years, which requires further surgery. To address this problem, a group of researchers from Korea Advanced Institute of Science and Technology (KAIST) has developed a cardiac pacemaker that is powered semi-permanently by harnessing energy from the body’s own muscles.

The research team, headed by Professor Keon Jae Lee of KAIST and Professor Boyoung Joung, M.D. at Severance Hospital of Yonsei University, has created a flexible piezoelectric nanogenerator that has been used to directly stimulate the heart of a live rat using electrical energy produced from small body movements of the animal.

… the team created their new high-performance flexible nanogenerator from a thin film semiconductor material. In this case, lead magnesium niobate-lead titanate (PMN-PT) was used rather than the graphene oxide and carbon nanotubes of previous versions. As a result, the new device was able to harvest up to 8.2 V and 0.22 mA of electrical energy as a result of small flexing motions of the nanogenerator. The resultant voltage and current generated in this way were of sufficient levels to stimulate the rat’s heart directly.

I gather this project too was tested on animal models, in this case, rats.

Gaining some attention at roughly the same time as the Korean researchers, a French team’s work with a ‘living battery’ is mentioned in a June 17, 2014 news item on the Open Knowledge website,

Philippe Cinquin, Serge Cosnier and their team at Joseph Fourier University in France have invented a ‘living battery.’ The device – a fuel cell and conductive wires modified with reactive enzymes – has the power to tap into the body’s endless supply of glucose and convert simple sugar, which constitutes the energy source of living cells, into electricity.

Visions of implantable biofuel cells that use the body’s natural energy sources to power pacemakers or artificial hearts have been around since the 1960s, but the French team’s innovation represents the closest anyone has ever come to harnessing this energy.

The French team was a finalist for the 2014 European Inventor Award. Here’s a description of how their invention works, from their 2014 European Inventor Award’s webpage,

Biofuel cells that harvest energy from glucose in the body function much like every-day batteries that conduct electricity through positive and negative terminals called anodes and cathodes and a medium conducive to electric charge known as the electrolyte. Electricity is produced via a series of electrochemical reactions between these three components. These reactions are catalysed using enzymes that react with glucose stored in the blood.

Bodily fluids, which contain glucose and oxygen, serve as the electrolyte. To create an anode, two enzymes are used. The first enzyme breaks down the sugar glucose, which is produced every time the animal or person consumes food. The second enzyme oxidises the simpler sugars to release electrons. A current then flows as the electrons are drawn to the cathode. A capacitor that is hooked up to the biofuel cell stores the electric charge produced.
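As a rough illustration of the chemistry described above, Faraday's law links the rate of glucose oxidation at the anode to the current produced. The two-electrons-per-glucose figure and the oxidation rate below are illustrative assumptions, not the French team's numbers:

```python
# Faraday's-law estimate of current from enzymatic glucose oxidation.
# Assumes 2 electrons transferred per glucose molecule (illustrative).

FARADAY = 96485.0            # coulombs per mole of electrons
ELECTRONS_PER_GLUCOSE = 2    # assumed electrons released per molecule oxidized

def current_amperes(glucose_mol_per_s):
    """Current produced by oxidizing glucose at the given molar rate."""
    return glucose_mol_per_s * ELECTRONS_PER_GLUCOSE * FARADAY

# Oxidizing 1 nanomole of glucose per second:
i = current_amperes(1e-9)
print(i)  # ~1.93e-4 A, i.e. about 0.2 mA
```

Even a modest oxidation rate yields currents in the range that low-power implants draw, which is why glucose is such an attractive fuel.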

I wish all the researchers good luck as they race towards a new means of powering pacemakers, deep brain stimulators, and other implantable devices that now rely on batteries, which need to be changed periodically, forcing the patient to undergo major surgery.

Self-powered batteries for pacemakers, etc. have been mentioned here before:

April 3, 2009 posting

July 12, 2010 posting

March 8, 2013 posting

Nanojuice in your gut

A July 7, 2014 news item on Azonano features a new technique that could help doctors better diagnose problems in the intestines (guts),

Located deep in the human gut, the small intestine is not easy to examine. X-rays, MRIs and ultrasound images provide snapshots but each suffers limitations. Help is on the way.

University at Buffalo [State University of New York] researchers are developing a new imaging technique involving nanoparticles suspended in liquid to form “nanojuice” that patients would drink. Upon reaching the small intestine, doctors would strike the nanoparticles with a harmless laser light, providing an unparalleled, non-invasive, real-time view of the organ.

A July 5, 2014 University at Buffalo news release (also on EurekAlert) by Cory Nealon, which originated the news item, describes some of the challenges associated with medical imaging of small intestines,

“Conventional imaging methods show the organ and blockages, but this method allows you to see how the small intestine operates in real time,” said corresponding author Jonathan Lovell, PhD, UB assistant professor of biomedical engineering. “Better imaging will improve our understanding of these diseases and allow doctors to more effectively care for people suffering from them.”

The average human small intestine is roughly 23 feet long and 1 inch thick. Sandwiched between the stomach and large intestine, it is where much of the digestion and absorption of food takes place. It is also where symptoms of irritable bowel syndrome, celiac disease, Crohn’s disease and other gastrointestinal illnesses occur.

To assess the organ, doctors typically require patients to drink a thick, chalky liquid called barium. Doctors then use X-rays, magnetic resonance imaging and ultrasounds to assess the organ, but these techniques are limited with respect to safety, accessibility and lack of adequate contrast, respectively.

Also, none are highly effective at providing real-time imaging of movement such as peristalsis, which is the contraction of muscles that propels food through the small intestine. Dysfunction of these movements may be linked to the previously mentioned illnesses, as well as side effects of thyroid disorders, diabetes and Parkinson’s disease.

The news release goes on to describe how the researchers manipulated dyes that are usually unsuitable for the purpose of imaging an organ in the body,

Lovell and a team of researchers worked with a family of dyes called naphthalcyanines. These small molecules absorb large portions of light in the near-infrared spectrum, which is the ideal range for biological contrast agents.

They are unsuitable for the human body, however, because they don’t disperse in liquid and they can be absorbed from the intestine into the blood stream.

To address these problems, the researchers formed nanoparticles called “nanonaps” that contain the colorful dye molecules and added the abilities to disperse in liquid and move safely through the intestine.

In laboratory experiments performed with mice, the researchers administered the nanojuice orally. They then used photoacoustic tomography (PAT), in which pulsed laser light generates pressure waves that, when measured, provide a real-time and more nuanced view of the small intestine.

The researchers plan to continue to refine the technique for human trials, and move into other areas of the gastrointestinal tract.

Here’s an image of the nanojuice in the guts of a mouse,

The combination of "nanojuice" and photoacoustic tomography illuminates the intestine of a mouse. (Credit: Jonathan Lovell)


This is an international collaboration both from a research perspective and a funding perspective (from the news release),

Additional authors of the study come from UB’s Department of Chemical and Biological Engineering, Pohang University of Science and Technology in Korea, Roswell Park Cancer Institute in Buffalo, the University of Wisconsin-Madison, and McMaster University in Canada.

The research was supported by grants from the National Institutes of Health, the Department of Defense and the Korean Ministry of Science, ICT and Future Planning.

Here’s a link to and a citation for the paper,

Non-invasive multimodal functional imaging of the intestine with frozen micellar naphthalocyanines by Yumiao Zhang, Mansik Jeon, Laurie J. Rich, Hao Hong, Jumin Geng, Yin Zhang, Sixiang Shi, Todd E. Barnhart, Paschalis Alexandridis, Jan D. Huizinga, Mukund Seshadri, Weibo Cai, Chulhong Kim, & Jonathan F. Lovell. Nature Nanotechnology (2014) doi:10.1038/nnano.2014.130 Published online 06 July 2014

This paper is behind a paywall.

Carbon nanotubes: OCSiAl’s deal in Korea and their effect on the body after one year

I have two news items related only by their focus on carbon nanotubes. First, there’s a July 3, 2014 news item on Azonano featuring OCSiAl’s deal with a Korean company announced at NANO KOREA 2014,

At NANO KOREA 2014 OCSiAl announced an unprecedentedly large-scale deal with Korean company Applied Carbon Nano Technology [ACN] Co., Ltd. – one of the key industry players.

OCSiAl, the dominant manufacturer of graphene tubes, which has successfully presented its products and technology in Europe and the USA, is now entering Asian nanotech markets. At NANO KOREA 2014 the company introduced TUBALL, a universal nanomodifier of materials featuring >75% single-wall carbon nanotubes, and announced the signing of a supply agreement with Applied Carbon Nano Technology Co., Ltd. (hereinafter referred to as ACN), a recognized future-oriented innovative company.

A July 3, 2014 OCSiAl news release, which originated the news item, describes the memorandum of understanding (MOU) in greater detail,

Under this MoU, ACN will buy 100 kg of TUBALL. The upcoming deal is the first of OCSiAl’s Korean contracts to be performed in 2015 and the largest in the SWCNT market, whose annual turnover until recently barely reached 500 kg. The agreement is exceptionally significant as it opens fundamental opportunities for manufacturing new nanomaterial-based products with unique properties that were not possible before.

“OCSiAl’s entry to Korean market required thorough preparation. We invested time and efforts to prove that our company, our technology and our products worth credibility, – says Viktor Kim, OCSiAl Vice President, – we urged major playmakers to take TUBALL for testing to verify the quality and effectiveness. We believe that ACN is more than an appropriate partner to start – they are experts at the market and they understand its future perspectives very clearly. We believe that mutually beneficial partnership with ACN will path the way for future contracts, since it will become indicative to other companies in Asia and all over the world”.

“It comes as no surprise that OCSiAl’s products here in Korea will be in a great demand soon. The country strives to become world’s leader in advanced technology, and we do realize the benefits of nanomaterial’s exploitation. TUBALL is a truly versatile additive which may be used across many market sectors, where adoption of new materials with top-class performance is essential”, – says Mr. Dae-Yeol Lee, CEO of ACN.

OCSiAl’s entry into the Korean market will undoubtedly have a far-reaching impact on the industry. The recent merger with American Zyvex Technologies made OCSiAl not only the world’s largest nanomaterial producer but also a first-rate developer of carbon-nanotube-based modifiers for different materials. To its Korean partners OCSiAl offers TUBALL, the raw ‘as produced’ SWCNT material, and masterbatches, which can be either custom-made or ready-to-use mixtures for different applications, including li-ion batteries, car tires, transparent conductive coatings and many others. “Since Korea is increasingly dynamic, our success here will build on continuous development of our product,” adds Viktor Kim. “And we are constantly working on new applications of graphene tubes to meet the sophisticated demands of nanotech-savvy Korean consumers.”

OCSiAl’s Zyvex acquisition was mentioned in a June 23, 2014 posting here.

My second tidbit concerns a July 4, 2014 news item on Nanowerk about carbon nanotubes and their effect on the body (Note: A link has been removed),

Having perfected an isotope labeling method allowing extremely sensitive detection of carbon nanotubes in living organisms, CEA and CNRS researchers have looked at what happens to nanotubes after one year inside an animal. Studies in mice revealed that a very small percentage (0.75%) of the initial quantity of nanotubes inhaled crossed the pulmonary epithelial barrier and translocated to the liver, spleen, and bone marrow. Although these results cannot be extrapolated to humans, this work highlights the importance of developing ultrasensitive methods for assessing the behavior of nanoparticles in animals. It has been published in the journal ACS Nano (“Carbon Nanotube Translocation to Distant Organs after Pulmonary Exposure: Insights from in Situ 14C-Radiolabeling and Tissue Radioimaging”).

A July 1, 2014 CNRS [France Centre national de la recherche scientifique] press release, which originated the news item, describes both applications for carbon nanotubes and the experiment in greater detail,

Carbon nanotubes are highly specific nanoparticles with outstanding mechanical and electronic properties that make them suitable for use in a wide range of applications, from structural materials to certain electronic components. Their many present and future uses explain why research teams around the world are now focusing on their impact on human health and the environment.

Researchers from CEA and the CNRS joined forces to study the distribution over time of these nanoparticles in mice, following contamination by inhalation. They combined radiolabeling with radio imaging tools for optimum detection sensitivity. When making the carbon nanotubes, stable carbon (12C) atoms were replaced directly by radioactive carbon (14C) atoms in the very structure of the tubes. This method allows the use of carbon nanotubes similar to those produced in industry, but labeled with 14C. Radio imaging tools make it possible to detect up to twenty or so carbon nanotubes on an animal tissue sample.

A single dose of 20 µg [micrograms] of labeled nanotubes was administered at the start of the protocol, then monitored for one year. The carbon nanotubes were observed to translocate from the lungs to other organs, especially the liver, spleen, and bone marrow. The study demonstrates that these nanoparticles are capable of crossing the pulmonary epithelial barrier, or air-blood barrier. It was also observed that the quantity of carbon nanotubes in these organs rose steadily over time, thus demonstrating that these particles are not eliminated on this timescale. Further studies will have to determine whether this observation remains true beyond a year.
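The figures above allow a quick back-of-the-envelope check on how much material actually crossed the air-blood barrier. This is a minimal sketch using only the numbers quoted in the press release (a 20 µg dose and the 0.75% translocated fraction); everything else is plain arithmetic:

```python
# Rough estimate of the nanotube mass that translocated from the lungs,
# based on the quoted figures: 20 ug inhaled dose, 0.75% translocated.
dose_ug = 20.0              # single administered dose, in micrograms
translocated_frac = 0.0075  # 0.75% of the initial quantity

translocated_ug = dose_ug * translocated_frac
print(f"Translocated mass: {translocated_ug:.2f} ug (= {translocated_ug * 1000:.0f} ng)")
# -> Translocated mass: 0.15 ug (= 150 ng)
```

A sub-microgram quantity distributed across the liver, spleen, and bone marrow is exactly why the ultrasensitive 14C radiolabeling method was needed in the first place.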

The CEA [French Alternative Energies and Atomic Energy Commission {Commissariat à l’énergie atomique et aux énergies alternatives}] and CNRS teams have developed highly specific skills that enable them to study the health and environmental impact of nanoparticles from various angles. Nanotoxicology and nanoecotoxicology research such as this is both a priority for society and a scientific challenge, involving experimental approaches and still emerging concepts.

This work is conducted as part of CEA’s interdisciplinary Toxicology and Nanosciences programs. These are management, coordination and support structures set up to promote multidisciplinary approaches for studying the potential impact on living organisms of various components of industrial interest, including heavy metals, radionuclides, and new products.

At the CNRS, these concerns are reflected in particular in major initiatives such as the International Consortium for the Environmental Implications of Nano Technology (i-CEINT), a CNRS-led international initiative focusing on the ecotoxicology of nanoparticles. CNRS teams also have a long tradition of close involvement in matters relating to standards and regulations. Examples of this include the ANR NanoNORMA program, led by the CNRS, or ongoing work within the French C’Nano network.

For those who would prefer to read the French-language version of the July 1, 2014 CNRS press release (La biodistribution des nanotubes de carbone dans l’organisme), it can be found here.

Here’s a link to and a citation for the paper,

Carbon Nanotube Translocation to Distant Organs after Pulmonary Exposure: Insights from in Situ 14C-Radiolabeling and Tissue Radioimaging by Bertrand Czarny, Dominique Georgin, Fannely Berthon, Gael Plastow, Mathieu Pinault, Gilles Patriarche, Aurélie Thuleau, Martine Mayne L’Hermite, Frédéric Taran, and Vincent Dive. ACS Nano, 2014, 8 (6), pp 5715–5724 DOI: 10.1021/nn500475u Publication Date (Web): May 22, 2014

Copyright © 2014 American Chemical Society

This paper is behind a paywall.

Cardiac pacemakers: Korea’s in vivo demonstration of a self-powered one* and UK’s breath-based approach

As best I can determine, the last mention of a self-powered pacemaker and the like on this blog was in a Nov. 5, 2012 posting (Developing self-powered batteries for pacemakers). This latest news from The Korea Advanced Institute of Science and Technology (KAIST) is, I believe, the first time that such a device has been successfully tested in vivo. From a June 23, 2014 news item on ScienceDaily,

As the number of pacemakers implanted each year reaches into the millions worldwide, improving the lifespan of pacemaker batteries has been a great concern for developers and manufacturers. Currently, pacemaker batteries last seven years on average, requiring frequent replacements, which may expose patients to the risks involved in repeated medical procedures.

A research team from the Korea Advanced Institute of Science and Technology (KAIST), headed by Professor Keon Jae Lee of the Department of Materials Science and Engineering at KAIST and Professor Boyoung Joung, M.D. of the Division of Cardiology at Severance Hospital of Yonsei University, has developed a self-powered artificial cardiac pacemaker that is operated semi-permanently by a flexible piezoelectric nanogenerator.

A June 23, 2014 KAIST news release on EurekAlert, which originated the news item, provides more details,

The artificial cardiac pacemaker is widely acknowledged as medical equipment that is integrated into the human body to regulate the heartbeats through electrical stimulation to contract the cardiac muscles of people who suffer from arrhythmia. However, repeated surgeries to replace pacemaker batteries have exposed elderly patients to health risks such as infections or severe bleeding during operations.

The team’s newly designed flexible piezoelectric nanogenerator directly stimulated a living rat’s heart using electrical energy converted from the small body movements of the rat. This technology could facilitate the use of self-powered flexible energy harvesters, not only prolonging the lifetime of cardiac pacemakers but also realizing real-time heart monitoring.

The research team fabricated high-performance flexible nanogenerators utilizing a bulk single-crystal PMN-PT thin film (iBULe Photonics). The harvested energy reached up to 8.2 V and 0.22 mA by bending and pushing motions, which were high enough values to directly stimulate the rat’s heart.
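It is worth putting those harvester figures in perspective. The sketch below assumes, as an upper bound, that the quoted peak voltage and peak current occur simultaneously (the press release does not say whether they do), so the result should be read as a ceiling on instantaneous output rather than a sustained power level:

```python
# Upper bound on the instantaneous power of the KAIST PMN-PT nanogenerator,
# assuming the quoted peaks (8.2 V, 0.22 mA) coincide in time -- an assumption,
# not a claim from the press release.
peak_voltage = 8.2      # volts, from bending and pushing motions
peak_current = 0.22e-3  # amperes (0.22 mA)

peak_power_mw = peak_voltage * peak_current * 1000  # convert W to mW
print(f"Peak instantaneous power: ~{peak_power_mw:.2f} mW")
# -> Peak instantaneous power: ~1.80 mW
```

Even a couple of milliwatts at peak compares favourably with the microwatt-scale demands typically cited for pacing pulses, which is why direct cardiac stimulation was feasible.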

Professor Keon Jae Lee said:

“For clinical purposes, the current achievement will benefit the development of self-powered cardiac pacemakers as well as prevent heart attacks via the real-time diagnosis of heart arrhythmia. In addition, the flexible piezoelectric nanogenerator could also be utilized as an electrical source for various implantable medical devices.”

This image illustrating a self-powered nanogenerator for a cardiac pacemaker has been provided by KAIST,

This picture shows that a self-powered cardiac pacemaker is enabled by a flexible piezoelectric energy harvester. Credit: KAIST
Here’s a link to and a citation for the paper,

Self-Powered Cardiac Pacemaker Enabled by Flexible Single Crystalline PMN-PT Piezoelectric Energy Harvester by Geon-Tae Hwang, Hyewon Park, Jeong-Ho Lee, SeKwon Oh, Kwi-Il Park, Myunghwan Byun, Hyelim Park, Gun Ahn, Chang Kyu Jeong, Kwangsoo No, HyukSang Kwon, Sang-Goo Lee, Boyoung Joung, and Keon Jae Lee. Advanced Materials DOI: 10.1002/adma.201400562
Article first published online: 17 APR 2014

© 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.

There was a May 15, 2014 KAIST news release on EurekAlert announcing this same piece of research but from a technical perspective,

The energy efficiency of KAIST’s piezoelectric nanogenerator has increased by almost 40 times, one step closer toward the commercialization of flexible energy harvesters that can supply power infinitely to wearable, implantable electronic devices

Nanogenerators are innovative self-powered energy harvesters that convert kinetic energy from vibrational and mechanical sources into electrical power, removing the need for external circuits or batteries in electronic devices. This innovation is vital in realizing sustainable energy generation in isolated, inaccessible, or indoor environments and even in the human body.

Nanogenerators, flexible and lightweight energy harvesters on a plastic substrate, can scavenge energy from the extremely tiny movements of natural sources and the human body, such as wind, water flow, heartbeats, and diaphragm and respiratory activity, to generate electrical signals. The generators are not only self-powered, flexible devices but can also provide permanent power sources to implantable biomedical devices, including cardiac pacemakers and deep brain stimulators.

However, poor energy efficiency and a complex fabrication process have posed challenges to the commercialization of nanogenerators. Keon Jae Lee, Associate Professor of Materials Science and Engineering at KAIST, and his colleagues have recently proposed a solution by developing a robust technique to transfer a high-quality piezoelectric thin film from bulk sapphire substrates to plastic substrates using laser lift-off (LLO).

Applying the inorganic-based laser lift-off (LLO) process, the research team produced large-area PZT thin-film nanogenerators on flexible substrates (2 cm x 2 cm).

“We were able to convert a high-output performance of ~250 V from the slight mechanical deformation of a single thin plastic substrate. Such output power is just enough to turn on 100 LED lights,” Keon Jae Lee explained.

The self-powered nanogenerators can also work with finger and foot motions. For example, under the irregular and slight bending motions of a human finger, the measured current signals reached a high of ~8.7 μA. In addition, the piezoelectric nanogenerator has world-record power conversion efficiency, almost 40 times higher than previously reported results from similar research, addressing the drawbacks of fabrication complexity and low energy efficiency.

Lee further commented,

“Building on this concept, it is highly expected that tiny mechanical motions, including human body movements of muscle contraction and relaxation, can be readily converted into electrical energy and, furthermore, acted as eternal power sources.”

The research team is currently studying a method to build three-dimensional stacking of flexible piezoelectric thin films to enhance output power, as well as conducting a clinical experiment with a flexible nanogenerator.

In addition to the 2012 posting I mentioned earlier, there was also this July 12, 2010 posting which described research on harvesting biomechanical movement (heart beat, blood flow, muscle stretching, or even irregular vibration) at the Georgia (US) Institute of Technology, where the lead researcher observed,

…  Wang [Professor Zhong Lin Wang at Georgia Tech] tells Nanowerk. “However, the applications of the nanogenerators under in vivo and in vitro environments are distinct. Some crucial problems need to be addressed before using these devices in the human body, such as biocompatibility and toxicity.”

Bravo to the KAIST researchers for getting this research to the in vivo testing stage.

Meanwhile, researchers at the University of Bristol and the University of Bath have received funding for a new approach to cardiac pacemakers, one designed with the breath in mind. From a June 24, 2014 news item on Azonano,

Pacemaker research from the Universities of Bath and Bristol could revolutionise the lives of over 750,000 people who live with heart failure in the UK.

The British Heart Foundation (BHF) is awarding funding to researchers developing a new type of heart pacemaker that modulates its pulses to match breathing rates.

A June 23, 2014 University of Bristol press release, which originated the news item, provides some context,

During 2012-13 in England, more than 40,000 patients had a pacemaker fitted.

Currently, the pulses from pacemakers are set at a constant rate when fitted which doesn’t replicate the natural beating of the human heart.

The normal healthy variation in heart rate during breathing is lost in cardiovascular disease and is an indicator for sleep apnoea, cardiac arrhythmia, hypertension, heart failure and sudden cardiac death.

The device is then briefly described (from the press release),

The novel device being developed by scientists at the Universities of Bath and Bristol uses synthetic neural technology to restore this natural variation of heart rate with lung inflation, and is targeted towards patients with heart failure.

The device works by saving the heart energy, improving its pumping efficiency and enhancing blood flow to the heart muscle itself.  Pre-clinical trials suggest the device gives a 25 per cent increase in the pumping ability, which is expected to extend the life of patients with heart failure.

One aim of the project is to miniaturise the pacemaker device to the size of a postage stamp and to develop an implant that could be used in humans within five years.

Dr Alain Nogaret, Senior Lecturer in Physics at the University of Bath, explained: “This is a multidisciplinary project with strong translational value. By combining fundamental science and nanotechnology we will be able to deliver a unique treatment for heart failure which is not currently addressed by mainstream cardiac rhythm management devices.”

The research team has already patented the technology and is working with NHS consultants at the Bristol Heart Institute, the University of California at San Diego and the University of Auckland. [emphasis mine]

Professor Julian Paton, from the University of Bristol, added: “We’ve known for almost 80 years that the heart beat is modulated by breathing but we have never fully understood the benefits this brings. The generous new funding from the BHF will allow us to reinstate this natural occurring synchrony between heart rate and breathing and understand how it brings therapy to hearts that are failing.”

Professor Jeremy Pearson, Associate Medical Director at the BHF, said: “This study is a novel and exciting first step towards a new generation of smarter pacemakers. More and more people are living with heart failure so our funding in this area is crucial. The work from this innovative research team could have a real impact on heart failure patients’ lives in the future.”

Given some current events (‘Tesla opens up its patents’, Mike Masnick’s June 12, 2014 posting on Techdirt), I wonder what the situation will be vis-à-vis patents by the time this device gets to market.

* ‘one’ added to title on Aug. 13, 2014.

Graphene-based sensor mimics pain (mu-opioid) receptor

I once had a job where I had to perform literature searches and read papers on pain research as it related to morphine tolerance. Not a pleasant task, it has left me eager to encourage and write about alternatives to animal testing, a key component of pain research. So, with a ‘song in my heart’, I feature this research from the University of Pennsylvania written up in a May 12, 2014 news item on ScienceDaily,

Almost every biological process involves sensing the presence of a certain chemical. Finely tuned over millions of years of evolution, the body’s different receptors are shaped to accept certain target chemicals. When they bind, the receptors tell their host cells to produce nerve impulses, regulate metabolism, defend the body against invaders or myriad other actions depending on the cell, receptor and chemical type.

Now, researchers from the University of Pennsylvania have led an effort to create an artificial chemical sensor based on one of the human body’s most important receptors, one that is critical in the action of painkillers and anesthetics. In these devices, the receptors’ activation produces an electrical response rather than a biochemical one, allowing that response to be read out by a computer.

By attaching a modified version of this mu-opioid receptor to strips of graphene, they have shown a way to mass produce devices that could be useful in drug development and a variety of diagnostic tests. And because the mu-opioid receptor belongs to the most common class of such chemical sensors, the findings suggest that the same technique could be applied to detect a wide range of biologically relevant chemicals.

A May 6, 2014 University of Pennsylvania news release, which originated the news item, describes the main teams involved in this research along with why and how they worked together (Note: Links have been removed),

The study, published in the journal Nano Letters, was led by A.T. Charlie Johnson, director of Penn’s Nano/Bio Interface Center and professor of physics in Penn’s School of Arts & Sciences; Renyu Liu, assistant professor of anesthesiology in Penn’s Perelman School of Medicine; and Mitchell Lerner, then a graduate student in Johnson’s lab. It was made possible through a collaboration with Jeffery Saven, professor of chemistry in Penn Arts & Sciences. The Penn team also worked with researchers from the Seoul National University in South Korea.

Their study combines recent advances from several disciplines.

Johnson’s group has extensive experience attaching biological components to nanomaterials for use in chemical detectors. Previous studies have involved wrapping carbon nanotubes with single-stranded DNA to detect odors related to cancer and attaching antibodies to nanotubes to detect the presence of the bacteria associated with Lyme disease.

After Saven and Liu addressed these problems with the redesigned receptor, they saw that it might be useful to Johnson, who had previously published a study on attaching a similar receptor protein to carbon nanotubes. In that case, the protein was difficult to grow genetically, and Johnson and his colleagues also needed to include additional biological structures from the receptors’ natural membranes in order to keep them stable.

In contrast, the computationally redesigned protein could be readily grown and attached directly to graphene, opening up the possibility of mass producing biosensor devices that utilize these receptors.

“Due to the challenges associated with isolating these receptors from their membrane environment without losing functionality,” Liu said, “the traditional methods of studying them involved indirectly investigating the interactions between opioid and the receptor via radioactive or fluorescent labeled ligands, for example. This multi-disciplinary effort overcame those difficulties, enabling us to investigate these interactions directly in a cell free system without the need to label any ligands.”

With Saven and Liu providing a version of the receptor that could stably bind to sheets of graphene, Johnson’s team refined their process of manufacturing those sheets and connecting them to the circuitry necessary to make functional devices.

The news release provides more technical details about the graphene sensor,

“We start by growing a piece of graphene that is about six inches wide by 12 inches long,” Johnson said. “That’s a pretty big piece of graphene, but we don’t work with the whole thing at once. Mitchell Lerner, the lead author of the study, came up with a very clever idea to cut down on chemical contamination. We start with a piece that is about an inch square, then separate them into ribbons that are about 50 microns across.

“The nice thing about these ribbons is that we can put them right on top of the rest of the circuitry, and then go on to attach the receptors. This really reduces the potential for contamination, which is important because contamination greatly degrades the electrical properties we measure.”

Because the mechanism by which the device reports on the presence of the target molecule relies only on the receptor’s proximity to the nanostructure when it binds to the target, Johnson’s team could employ the same chemical technique for attaching the antibodies and other receptors used in earlier studies.

Once attached to the ribbons, the opioid receptors would produce changes in the surrounding graphene’s electrical properties whenever they bound to their target. Those changes would then produce electrical signals that would be transmitted to a computer via neighboring electrodes.

The high reliability of the manufacturing process — only one of the 193 devices on the chip failed — enables applications in both clinical diagnostics and further research. [emphasis mine]

“We can measure each device individually and average the results, which greatly reduces the noise,” said Johnson. “Or you could imagine attaching 10 different kinds of receptors to 20 devices each, all on the same chip, if you wanted to test for multiple chemicals at once.”
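The two numbers quoted here, one failed device out of 193 and averaging across devices, can be made concrete. The sketch below computes the implied fabrication yield, and illustrates the standard statistical result (an assumption on my part, not a claim from the paper) that averaging N devices with independent, uncorrelated noise improves the signal-to-noise ratio by a factor of sqrt(N):

```python
import math

# Fabrication yield implied by the figures quoted above: 193 devices, 1 failure.
total_devices, failed = 193, 1
yield_pct = 100 * (total_devices - failed) / total_devices
print(f"Device yield: {yield_pct:.1f}%")
# -> Device yield: 99.5%

# Averaging N devices with independent noise reduces that noise by sqrt(N),
# the textbook result for uncorrelated measurements. Using Johnson's example
# of 20 devices per receptor type:
n_averaged = 20
noise_reduction = math.sqrt(n_averaged)
print(f"Noise reduction from averaging {n_averaged} devices: ~{noise_reduction:.1f}x")
# -> Noise reduction from averaging 20 devices: ~4.5x
```

A roughly 4.5x noise reduction per receptor type, while still fitting ten receptor types on one chip, is what makes the multiplexed-diagnostics scenario Johnson describes plausible.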

In the researchers’ experiment, they tested their devices’ ability to detect the concentration of a single type of molecule. They used naltrexone, a drug used in alcohol and opioid addiction treatment, because it binds to and blocks the natural opioid receptors that produce the narcotic effects patients seek.

“It’s not clear whether the receptors on the devices are as selective as they are in the biological context,” Saven said, “as the ones on your cells can tell the difference between an agonist, like morphine, and an antagonist, like naltrexone, which binds to the receptor but does nothing. By working with the receptor-functionalized graphene devices, however, not only can we make better diagnostic tools, but we can also potentially get a better understanding of how the biomolecular system actually works in the body.”

“Many novel opioids have been developed over the centuries,” Liu said. “However, none of them has achieved potent analgesic effects without notorious side effects, including devastating addiction and respiratory depression. This novel tool could potentially aid the development of new opioids that minimize these side effects.”

Wherever these devices find applications, they are a testament to the potential usefulness of the Nobel-prize winning material they are based on.

“Graphene gives us an advantage,” Johnson said, “in that its uniformity allows us to make 192 devices on a one-inch chip, all at the same time. There are still a number of things we need to work out, but this is definitely a pathway to making these devices in large quantities.”

There is no mention of animal research but it seems likely to me that this work could lead to a decreased use of animals in pain research.

This project must have been quite something as it involved collaboration across many institutions (from the news release),

Also contributing to the study were Gang Hee Han, Sung Ju Hong and Alexander Crook of Penn Arts & Sciences’ Department of Physics and Astronomy; Felipe Matsunaga and Jin Xi of the Department of Anesthesiology at the Perelman School of Medicine, José Manuel Pérez-Aguilar of Penn Arts & Sciences’ Department of Chemistry; and Yung Woo Park of Seoul National University. Mitchell Lerner is now at SPAWAR Systems Center Pacific, Felipe Matsunaga at Albert Einstein College of Medicine, José Manuel Pérez-Aguilar at Cornell University and Sung Ju Hong at Seoul National University.

Here’s a link to and a citation for the paper,

Scalable Production of Highly Sensitive Nanosensors Based on Graphene Functionalized with a Designed G Protein-Coupled Receptor by Mitchell B. Lerner, Felipe Matsunaga, Gang Hee Han, Sung Ju Hong, Jin Xi, Alexander Crook, Jose Manuel Perez-Aguilar, Yung Woo Park, Jeffery G. Saven, Renyu Liu, and A. T. Charlie Johnson. Nano Lett., Article ASAP
DOI: 10.1021/nl5006349 Publication Date (Web): April 17, 2014
Copyright © 2014 American Chemical Society

This paper is behind a paywall.