Tag Archives: Horizon 2020

Nanotechnology research protocols for Environment, Health and Safety Studies in the US and a nanomedicine characterization laboratory in the European Union

I have two items relating to nanotechnology and the development of protocols. The first item concerns the launch of a new web portal by the US National Institute of Standards and Technology.

US National Institute of Standards and Technology (NIST)

From a July 1, 2015 news item on Azonano,

As engineered nanomaterials increasingly find their way into commercial products, researchers who study the potential environmental or health impacts of those materials face a growing challenge to accurately measure and characterize them. These challenges affect measurements of basic chemical and physical properties as well as toxicology assessments.

To help nano-EHS (Environment, Health and Safety) researchers navigate the often complex measurement issues, the National Institute of Standards and Technology (NIST) has launched a new website devoted to NIST-developed (or co-developed) and validated laboratory protocols for nano-EHS studies.

A July 1, 2015 NIST news release on EurekAlert, which originated the news item, offers more details about the information available through the web portal,

In common lab parlance, a “protocol” is a specific step-by-step procedure used to carry out a measurement or related activity, including all the chemicals and equipment required. Any peer-reviewed journal article reporting an experimental result has a “methods” section where the authors document their measurement protocol, but those descriptions are necessarily brief and condensed, and may lack validation of any sort. By comparison, on NIST’s new Protocols for Nano-EHS website the protocols are extraordinarily detailed. For ease of citation, they’re published individually–each with its own unique digital object identifier (DOI).

The protocols detail not only what you should do, but why and what could go wrong. The specificity is important, according to program director Debra Kaiser, because of the inherent difficulty of making reliable measurements of such small materials. “Often, if you do something seemingly trivial–use a different size pipette, for example–you get a different result. Our goal is to help people get data they can reproduce, data they can trust.”

A typical caution, for example, notes that if you’re using an instrument that measures the size of nanoparticles in a solution by how they scatter light, it’s important also to measure the transmission spectrum of the particles if they’re colored, because if they happen to absorb light strongly at the same frequency as your instrument, the result may be biased.
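That caution can be captured in a simple pre-measurement check: compare the sample's absorbance at the instrument's laser wavelength against a cutoff before trusting the scattering result. The sketch below is illustrative only; the function names, the threshold, and the example spectrum are my own assumptions, not part of any NIST protocol.

```python
# Illustrative sketch of the caution above: before trusting a dynamic light
# scattering (DLS) size result for colored particles, check whether the sample
# absorbs strongly at the instrument's laser wavelength. All names, values,
# and the example spectrum are hypothetical, not from a NIST protocol.

LASER_WAVELENGTH_NM = 633  # He-Ne laser wavelength used by many DLS instruments

def absorbance_at(spectrum, wavelength_nm):
    """Linearly interpolate absorbance at a wavelength from a measured
    spectrum, given as (wavelength_nm, absorbance) pairs sorted by wavelength."""
    for (w1, a1), (w2, a2) in zip(spectrum, spectrum[1:]):
        if w1 <= wavelength_nm <= w2:
            t = (wavelength_nm - w1) / (w2 - w1)
            return a1 + t * (a2 - a1)
    raise ValueError("laser wavelength outside measured spectrum")

def dls_result_suspect(spectrum, threshold=0.1):
    """Flag the DLS measurement if absorbance at the laser line exceeds an
    (illustrative) threshold, since strong absorption can bias the sizing."""
    return absorbance_at(spectrum, LASER_WAVELENGTH_NM) > threshold

# Example: a gold-nanoparticle-like spectrum with a plasmon peak near 520 nm
# that still absorbs noticeably at 633 nm
spectrum = [(400, 0.05), (520, 0.80), (633, 0.30), (800, 0.02)]
print(dls_result_suspect(spectrum))  # absorbs strongly at the laser line -> True
```

The point is the workflow, not the numbers: the transmission spectrum is measured first, and the scattering result is only accepted if the sample is effectively transparent at the laser line.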

“These measurements are difficult because of the small size involved,” explains Kaiser. “Very few new instruments have been developed for this. People are adapting existing instruments and methods for the job, but often those instruments are being operated close to their limits and the methods were developed for chemicals or bulk materials and not for nanomaterials.”

“For example, NIST offers a reference material for measuring the size of gold nanoparticles in solution, and we report six different sizes depending on the instrument you use. We do it that way because different instruments sense different aspects of a nanoparticle’s dimensions. An electron microscope is telling you something different than a dynamic light scattering instrument, and the researcher needs to understand that.”
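One concrete reason those sizes differ: dynamic light scattering never images the particle at all. It measures a diffusion coefficient and converts it to a hydrodynamic diameter via the Stokes-Einstein relation, which includes the particle's solvation shell, whereas an electron microscope images the dry core. A minimal sketch of that conversion follows; the diffusion coefficient and the "~30 nm gold particle" scenario are illustrative values of my own, not NIST data.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter_nm(diffusion_m2_per_s, temp_k=298.15,
                             viscosity_pa_s=8.9e-4):
    """Stokes-Einstein relation: d_H = k_B * T / (3 * pi * eta * D).

    DLS measures the diffusion coefficient D and reports this hydrodynamic
    diameter, which includes the solvation shell; electron microscopy images
    the dry core, so the two 'sizes' legitimately differ. Default temperature
    and viscosity are for water at 25 degrees C.
    """
    return (K_B * temp_k
            / (3 * math.pi * viscosity_pa_s * diffusion_m2_per_s)) * 1e9

# Illustrative diffusion coefficient for a roughly 30 nm particle in water
d = hydrodynamic_diameter_nm(1.5e-11)
print(f"{d:.1f} nm")
```

A particle whose core measures, say, 30 nm by electron microscopy can plausibly report a few nanometers larger by DLS for exactly this reason, and the researcher needs to know which quantity each instrument is reporting.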

The nano-EHS protocols offered by the NIST site, Kaiser says, could form the basis for consensus-based, formal test methods such as those published by ASTM and ISO.

NIST’s nano-EHS protocol site currently lists 12 different protocols in three categories: sample preparation, physico-chemical measurements and toxicological measurements. More protocols will be added as they are validated and documented. Suggestions for additional protocols are welcome at nanoprotocols@nist.gov.

The next item concerns European nanomedicine.

CEA-LETI and Europe’s first nanomedicine characterization laboratory

A July 1, 2015 news item on Nanotechnology Now describes the partnership which has led to launch of the new laboratory,

CEA-Leti today announced the launch of the European Nano-Characterisation Laboratory (EU-NCL) funded by the European Union’s Horizon 2020 research and innovation programme[1]. Its main objective is to reach a level of international excellence in nanomedicine characterisation for medical indications like cancer, diabetes, inflammatory diseases or infections, and make it accessible to all organisations developing candidate nanomedicines prior to their submission to regulatory agencies to get the approval for clinical trials and, later, marketing authorization.

“As reported in the ETPN White Paper[2], there is a lack of infrastructure to support nanotechnology-based innovation in healthcare,” said Patrick Boisseau, head of business development in nanomedicine at CEA-Leti and chairman of the European Technology Platform Nanomedicine (ETPN). “Nanocharacterisation is the first bottleneck encountered by companies developing nanotherapeutics. The EU-NCL project is of most importance for the nanomedicine community, as it will contribute to the competitiveness of nanomedicine products and tools and facilitate regulation in Europe.”

EU-NCL is partnered with the sole international reference facility, the Nanotechnology Characterization Lab of the National Cancer Institute in the U.S. (US-NCL)[3], to get faster international harmonization of analytical protocols.

“We are excited to be part of this cooperative arrangement between Europe and the U.S.,” said Scott E. McNeil, director of U.S. NCL. “We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.”

A July 2, 2015 EMPA (Swiss Federal Laboratories for Materials Science and Technology) news release on EurekAlert provides more detail about the laboratory and the partnerships,

The «European Nanomedicine Characterization Laboratory» (EU-NCL), which was launched on 1 June 2015, has a clear-cut goal: to help bring more nanomedicine candidates into the clinic and on the market, for the benefit of patients and the European pharmaceutical industry. To achieve this, EU-NCL is partnered with the sole international reference facility, the «Nanotechnology Characterization Laboratory» (US-NCL) of the US-National Cancer Institute, to get faster international harmonization of analytical protocols. EU-NCL is also closely connected to national medicine agencies and the European Medicines Agency to continuously adapt its analytical services to requests of regulators. EU-NCL is designed, organized and operated according to the highest EU regulatory and quality standards. «We are excited to be part of this cooperative project between Europe and the U.S.,» says Scott E. McNeil, director of US-NCL. «We hope this collaboration will help standardize regulatory requirements for clinical evaluation and marketing of nanomedicines internationally. This venture holds great promise for using nanotechnologies to overcome cancer and other major diseases around the world.»

Nine partners from eight countries

EU-NCL, which is funded by the EU for a four-year period with nearly 5 million Euros, brings together nine partners from eight countries: CEA-Tech in Leti and Liten, France, the coordinator of the project; the Joint Research Centre of the European Commission in Ispra, Italy; European Research Services GmbH in Münster, Germany; Leidos Biomedical Research, Inc. in Frederick, USA; Trinity College in Dublin, Ireland; SINTEF in Oslo, Norway; the University of Liverpool in the UK; Empa, the Swiss Federal Laboratories for Materials Science and Technology in St. Gallen, Switzerland; Westfälische Wilhelms-Universität (WWU) and Gesellschaft für Bioanalytik, both in Münster, Germany. Together, the partnering institutions will provide a trans-disciplinary testing infrastructure covering a comprehensive set of preclinical characterization assays (physical, chemical, in vitro and in vivo biological testing), which will allow researchers to fully comprehend the biodistribution, metabolism, pharmacokinetics, safety profiles and immunological effects of their medicinal nano-products. The project will also foster the use and deployment of standard operating procedures (SOPs), benchmark materials and quality management for the preclinical characterization of medicinal nano-products. Yet another objective is to promote intersectoral and interdisciplinary communication among key drivers of innovation, especially between developers and regulatory agencies.

The goal: to bring safe and efficient nano-therapeutics faster to the patient

Within EU-NCL, six analytical facilities will offer transnational access to their existing analytical services for public and private developers, and will also develop new or improved analytical assays to keep EU-NCL at the cutting edge of nanomedicine characterization. A complementary set of networking activities will enable EU-NCL to deliver to European academic or industrial scientists the high-quality analytical services they require for accelerating the industrial development of their candidate nanomedicines. The Empa team of Peter Wick at the «Particles-Biology Interactions» lab will be in charge of the quality management of all analytical methods, a key task to guarantee the best possible reproducibility and comparability of the data between the various analytical labs within the consortium. «EU-NCL supports our research activities in developing innovative and safe nanomaterials for healthcare within an international network, which will actively shape future standards in nanomedicine and strengthen Empa as an enabler to facilitate the transfer of novel nanomedicines from bench to bedside», says Wick.

You can find more information about the laboratory on the Horizon 2020 (a European Union science funding programme) project page for the EU-NCL laboratory. For anyone curious about CEA-Leti, it’s a double-layered organization. CEA is France’s Alternative Energies and Atomic Energy Commission (Commissariat à l’énergie atomique et aux énergies alternatives); you can go here to their French language site (there is an English language clickable option on the page). Leti is one of the CEA’s institutes and is known as either Leti or CEA-Leti. I have no idea what Leti stands for. Here’s the Leti website (this is the English language version).

SEMANTICS, a major graphene project based in Ireland

A Jan. 28, 2015 news item on Nanowerk profiles SEMANTICS, a major graphene project based in Ireland (Note: A link has been removed),

Graphene is the strongest, most impermeable and conductive material known to man. Graphene sheets are just one atom thick, but 200 times stronger than steel. The European Union is investing heavily in the exploitation of graphene’s unique properties through a number of research initiatives such as the SEMANTICS project running at Trinity College Dublin.

A Dec. 16, 2014 European Commission press release, which originated the news item, provides an overview of the graphene enterprise in Europe,

It is no surprise that graphene, a substance with better electrical and thermal conductivity, mechanical strength and optical purity than any other, is being heralded as the ‘wonder material’ of the 21st century, as plastics were in the 20th century.

Graphene could be used to create ultra-fast electronic transistors, foldable computer displays and light-emitting diodes. It could increase and improve the efficiency of batteries and solar cells, help strengthen aircraft wings and even revolutionise tissue engineering and drug delivery in the health sector.

It is this huge potential which has convinced the European Commission to commit €1 billion to the Future and Emerging Technologies (FET) Graphene Flagship project, the largest-ever research initiative funded in the history of the EU. It has a guaranteed €54 million in funding for the first two years with much more expected over the next decade.

Sustained funding for the full duration of the Graphene Flagship project comes from the EU’s Research Framework Programmes, principally from Horizon 2020 (2014-2020).

The aim of the Graphene Flagship project, likened in scale to NASA’s mission to put a man on the moon in the 1960s, or the Human Genome project in the 1990s, is to take graphene and related two-dimensional materials such as silicene (a single layer of silicon atoms) from a state of raw potential to a point where they can revolutionise multiple industries and create economic growth and new jobs in Europe.

The research effort will cover the entire value chain, from materials production to components and system integration. It will help to develop the strong position Europe already has in the field and provide an opportunity for European initiatives to lead in global efforts to fully exploit graphene’s miraculous properties.

Under the EU plan, 126 academics and industry groups from 17 countries will work on 15 individual but connected projects.

The press release then goes on to describe a new project, SEMANTICS,

… this is not the only support being provided by the EU for research into the phenomenal potential of graphene. The SEMANTICS research project, led by Professor Jonathan Coleman at Trinity College Dublin, is funded by the European Research Council (ERC) and has already achieved some promising results.

The ERC does not assign funding to particular challenges or objectives, but selects the best scientists with the best ideas on the sole criterion of excellence. By providing complementary types of funding, both to individual scientists to work on their own ideas, and to large-scale consortia to coordinate top-down programmes, the EU is helping to progress towards a better knowledge and exploitation of graphene.

“It is no overestimation to state that graphene is one of the most exciting materials of our lifetime,” Prof. Coleman says. “It has the potential to provide answers to the questions that have so far eluded us. Technology, energy and aviation companies worldwide are racing to discover the full potential of graphene. Our research will be an important element in helping to realise that potential.”

With the help of European Research Council (ERC) Starting and Proof of Concept Grants, Prof. Coleman and his team are researching methods for obtaining single-atom layers of graphene and other layered compounds through exfoliation (peeling off) from the multilayers, followed by deposition on a range of surfaces to prepare films displaying specific behaviour.

“We’re working towards making graphene and other single-atom layers available on an economically viable industrial scale, and making it cheaply,” Prof. Coleman continues.

“At CRANN [Centre for Research on Adaptive Nanostructures and Nanodevices at Trinity College Dublin], we are developing nanosheets of graphene and other single-atom materials which can be made in very large quantities,” he adds. “When you put these sheets in plastic, for example, you make the plastic stronger. Not only that – you can massively increase its electrical properties, you can improve its thermal properties and you can make it less permeable to gases. The applications for industry could be endless.”

Prof. Coleman admits that scientists are regularly taken aback by the potential of graphene. “We are continually amazed at what graphene and other single-atom layers can do,” he reveals. “Recently it has been discovered that, when added to glue, graphene can make it more adhesive. Who would have thought that? It’s becoming clear that graphene just makes things a whole lot better,” he concludes.

So far, the project has developed a practical method for producing two-dimensional nanosheets in large quantities. Crucially, these nanosheets are already being used for a range of applications, including the production of reinforced plastics and metals, building super-capacitors and batteries which store energy, making cheap light detectors, and enabling ultra-sensitive position and motion sensors. As the number of applications grows, increased demand for these materials is anticipated. In response, the SEMANTICS team has scaled up the production process and is now producing 2D nanosheets at a rate more than 1000 times faster than was possible just a year ago.

I believe that the new graphene production process is the ‘blender’ technique featured here in an April 23, 2014 post. There’s also a profile of the ‘blender’ project in a Dec. 10, 2014 article by Ben Deighton for the European Commission’s Horizon magazine (Horizon 2020 is the European Union’s framework science funding programme). Deighton’s article hosts a video of Jonathan Coleman speaking about nanotechnology, blenders, and more on Dec. 1, 2014 at TEDxBrussels.

German nanotechnology industry mood lightens

A March 11, 2014 news item on Nanowerk proclaims a mood change for some sectors, including nanotechnology, of German industry,

For the German companies dealing with microtechnology, nanotechnology, advanced materials and optical technologies, business in 2013 has developed just as the industry had predicted earlier that year: at a constant level. But things are supposed to get better in 2014. The companies do not expect enormous growth, but they are more positive than they have been ever since the outbreak of the financial and economic crisis. Orders, production and sales figures are expected to rise noticeably in 2014. Areas excluded from an optimistic outlook are staff and financing: the number of employees is likely to remain static in 2014 while the funding situation might even reach a new low.

The March 11, 2014 IVAM news release, which originated the news item, provides more detail about this change of mood,

The situation and mood of the micro- and nanotechnology industry, which the IVAM Microtechnology Network queried in a recent economic data survey, coincides with the overall economic development in Germany and general forecasts for 2014. According to publications of the German Federal Statistical Office, the gross domestic product in Germany has grown by only 0.4 percent in 2013 – the lowest growth since the crisis year 2009. For 2014, the Ifo Institute predicts a strong growth for the German economy. Especially exports are expected to increase.

The German micro- and nanotechnology industry is expecting increases during 2014 above all in the area of orders. Production and sales are likely to rise for more than 60 percent of companies each. But just a quarter of companies intend to hire more staff. Only one tenth of companies expect increases in the field of financing. Nevertheless, 30 percent of companies are planning to make investments, which is a higher proportion than in previous years.

The new research funding program of the European Union, Horizon 2020, has aroused certain hopes of enhancing financing opportunities for innovation projects. Compared to the 7th Framework Program, Horizon 2020 is designed to facilitate access to EU funding for small and medium-sized enterprises. Especially small companies are still a little sceptical in this regard.

In the IVAM survey, 43 percent of micro- and nanotechnology companies say that EU funding is essential for them in order to implement their innovation projects. Three quarters of companies are planning to apply for funds from the new program. But only 23 percent of companies think that their opportunities to obtain EU funding have improved with Horizon 2020. Many small high-tech enterprises presume that the application still takes too much time and effort. However, since the program had just started at the time of the survey, there is no experience yet that might confirm or refute these first impressions.

The NSA surveillance scandal has caused a great insecurity among the micro- and nanotechnology companies in Germany concerning the safety of their technical knowledge. The majority of respondents (54 percent) would not even make a guess at whether their company’s know-how is safe from spying. A quarter of companies believe that they are sufficiently safe from spying. Only 21 percent are convinced that they do not have adequate protection. A little more than a third of companies have drawn consequences from the NSA scandal and taken steps to enhance the safety of their data.

Most companies agree that each company is responsible for ensuring the best possible safety of its data. But still, almost 90 percent demand that authorities like national governments and the European Commission should intervene and impose stricter regulations. They feel that although each company bears a partial responsibility, the state must also fulfil its responsibilities, establish a clear legal framework for data security, make sure that regulations are complied with, and impose sanctions when they are not.

IVAM has provided this chart amongst others to illustrate their data,

Courtesy: IVAM. [downloaded from http://www.ivam.de/news/pm_ivam_survey_2014]

You can access the 2014 IVAM survey from this page.

‘Valley of Death’, ‘Manufacturing Middle’, and other concerns in new government report about the future of nanomanufacturing in the US

A Feb. 8, 2014 news item on Nanowerk features a US Government Accountability Office (GAO) publication announcement (Note:  A link has been removed),

In a new report on nanotechnology manufacturing (or nanomanufacturing) released yesterday (“Nanomanufacturing: Emergence and Implications for U.S. Competitiveness, the Environment, and Human Health”; pdf), the U.S. Government Accountability Office finds flaws in America’s approach to many things nano.

At a July 2013 forum, participants from industry, government, and academia discussed the future of nanomanufacturing; investments in nanotechnology R&D and challenges to U.S. competitiveness; ways to enhance U.S. competitiveness; and EHS concerns.

A summary and a PDF version of the report, published Jan. 31, 2014, can be found here on the GAO’s GAO-14-181SP (report’s document number) webpage.  From the summary,

The forum’s participants described nanomanufacturing as a future megatrend that will potentially match or surpass the digital revolution’s effect on society and the economy. They anticipated further scientific breakthroughs that will fuel new engineering developments; continued movement into the manufacturing sector; and more intense international competition.

Although limited data on international investments made comparisons difficult, participants viewed the U.S. as likely leading in nanotechnology research and development (R&D) today. At the same time, they identified several challenges to U.S. competitiveness in nanomanufacturing, such as inadequate U.S. participation and leadership in international standard setting; the lack of a national vision for a U.S. nanomanufacturing capability; some competitor nations’ aggressive actions and potential investments; and funding or investment gaps in the United States (illustrated in the figure, below), which may hamper U.S. innovators’ attempts to transition nanotechnology from R&D to full-scale manufacturing.

[downloaded from http://www.gao.gov/products/GAO-14-181SP]

I read through (skimmed) this 125 pp. (PDF version; 119 pp. print version) report and although it’s not obvious in the portion I’ve excerpted from the summary or in the following sections, the participants did seem to feel that the US national nanotechnology effort was in relatively good shape overall but with some shortcomings that may become significant in the near future.

First, government investment illustrates the importance the US has placed on its nanotechnology efforts (excerpted from p. 11 PDF; p. 5 print),

Focusing on U.S. public investment since 2001, the overall growth in the funding of nanotechnology has been substantial, as indicated by the funding of the federal interagency National Nanotechnology Initiative (NNI), with a cumulative investment of about $18 billion for fiscal years 2001 through 2013. Adding the request for fiscal year 2014 brings the total to almost $20 billion. However, the amounts budgeted in recent years have not shown an increasing trend.

Next, the participants in the July 2013 forum focused on four innovations in four different industry sectors as a means of describing the overall situation (excerpted from p. 16 PDF; p. 10 print):

Semiconductors (Electronics and semiconductors)

Battery-powered vehicles (Energy and power)

Nano-based concrete (Materials and chemical industries)

Nanotherapeutics (Pharmaceuticals, biomedical, and biotechnology)

There was some talk about nanotechnology as a potentially disruptive technology,

Nanomanufacturing could eventually bring disruptive innovation and the creation of new jobs—at least for the nations that are able to compete globally. According to the model suggested by Christensen (2012a; 2012b), which was cited by a forum participant, the widespread disruption of existing industries (and their supply chains) can occur together with the generation of broader markets, which can lead to net job creation, primarily for nations that bring the disruptive technology to market. The Ford automobile plant (with its dramatic changes in the efficient assembly of vehicles) again provides an historical example: mass-produced automobiles made cheaply enough—through economies of scale—were sold to vast numbers of consumers, replacing horse and buggy transportation and creating jobs to (1) manufacture large numbers of cars and develop the supply chain; (2) retail new cars; and (3) service them. The introduction of minicomputers and then personal computers in the 1980s and 1990s provides another historical example; the smaller computers disrupted the dominant mainframe computing industry (Christensen et al. 2000). Personal computers were provided to millions of homes, and an analyst in the Bureau of Labor Statistics (Freeman 1996) documented the creation of jobs in related areas such as selling home computers and software. According to Christensen (2012b), “[A]lmost all net growth in jobs in America has been created by companies that were empowering—companies that made complicated things affordable and accessible so that more people could own them and use them.”14 As a counterpoint, a recent report analyzing manufacturing today (Manyika et al. 2012, 4) claims that manufacturing “cannot be expected to create mass employment in advanced economies on the scale that it did decades ago.”

Interestingly, there is no mention in any part of the report of the darker sides of a disruptive technology. After all, there were people who were very, very upset over the advent of computers. For example, a student (I was teaching a course on marketing communication) once informed me that she and her colleagues used to regularly clear bullets from the computerized equipment they were sending up to the camps (memory fails as to whether these were mining or logging camps) in northern British Columbia in the early days of the industry’s computerization.

Getting back to the report, I wasn’t expecting to see that one of the perceived problems is the US failure to participate in setting standards (excerpted from p. 23 PDF; p. 17 print),

Lack of sufficient U.S. participation in setting standards for nanotechnology or nanomanufacturing. Some participants discussed a possible need for a stronger role for the United States in setting commercial standards for nanomanufactured goods (including defining basic terminology in order to sell products in global markets).17

The participants discussed the ‘Valley of Death’ and the ‘Missing Middle’ (excerpted from pp. 31-2 PDF; pp. 25-6 print)

Forum participants said that middle-stage funding, investment, and support gaps occur for not only technology innovation but also manufacturing innovation. They described the Valley of Death (that is, the potential lack of funding or investment that may characterize the middle stages in the development of a technology or new product) and the Missing Middle (that is, a similar lack of adequate support for the middle stages of developing a manufacturing process or approach), as explained below.

The Valley of Death refers to a gap in funding or investment that can occur after research on a new technology and its initial development—for example, when the technology moves beyond tests in a controlled laboratory setting.22 In the medical area, participants said the problem of inadequate funding/investment may be exacerbated by requirements for clinical trials. To illustrate, one participant said that $10 million to $20 million is needed to bring a new medical treatment into clinical trials, but “support from [a major pharmaceutical company] typically is not forthcoming until Phase II clinical trials,” resulting in a Valley of Death for some U.S. medical innovations. Another participant mentioned an instance where a costly trial was required for an apparently low-risk medical device—and this participant tied high costs of this type to potential difficulties that medical innovators might have obtaining venture capital. A funding/investment gap at this stage can prevent further development of a technology.

The term Missing Middle has been used to refer to the lack of funding/investment that can occur with respect to manufacturing innovation—that is, maturing manufacturing capabilities and processes to produce technologies at scale, as illustrated in figure 8.23 Here, another important lack of support may be the absence of what one participant called an “industrial commons” to sustain innovation within a manufacturing sector.24 Logically, successful transitioning across the middle stages of manufacturing development is a prerequisite to achieving successful new approaches to manufacturing at scale.

There was discussion of the international scene with regard to the ‘Valley of Death’ and the ‘Missing Middle’ (excerpted from pp. 41-2 PDF; pp. 35-6 print)

Participants said that the Valley of Death and Missing Middle funding and investment gaps, which are of concern in the United States, do not apply to the same extent in some other countries—for example, China and Russia—or are being addressed. One participant said that other countries in which these gaps have occurred “have zeroed in [on them] with a laser beam.” Another participant summed up his view of the situation with the statement: “Government investments in establishing technology platforms, technology transfer, and commercialization are higher in other countries than in the United States.” He further stated that those making higher investments include China, Russia, and the European Union.

Multiple participants referred to the European Commission’s upcoming Horizon 2020 program, which will have major funding extending over 7 years. In addition to providing major funding for fundamental research, the Horizon 2020 website states that the program will help to:

“…bridge the gap between research and the market by, for example, helping innovative enterprises to develop their technological breakthroughs into viable products with real commercial potential. This market-driven approach will include creating partnerships with the private sector and Member States to bring together the resources needed.”

A key program within Horizon 2020 consists of the European Institute of Innovation and Technology (EIT), which as illustrated in the “Knowledge Triangle” shown in figure 11, below, emphasizes the nexus of business, research, and higher education. The 2014-2020 budget for this portion of Horizon 2020 is 2.7 billion euros (or close to $3.7 billion in U.S. dollars as of January 2014).
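The dollar figure is easy to sanity-check. Assuming an exchange rate of roughly $1.36 per euro (my approximation of the January 2014 rate; the report doesn't state which rate it used), 2.7 billion euros does come out close to $3.7 billion:

```python
# Sanity-check the report's currency conversion. The exchange rate is an
# assumption (roughly the EUR/USD rate in January 2014), not a figure
# taken from the GAO report.
budget_eur = 2.7e9
usd_per_eur = 1.36  # approximate January 2014 rate

budget_usd = budget_eur * usd_per_eur
print(f"${budget_usd / 1e9:.2f} billion")  # about $3.67 billion
```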

As is often the case with technology and science, participants mentioned intellectual property (IP) (excerpted from pp. 43-44 PDF; pp. 37-8 print),

Several participants discussed threats to IP associated with global competition. One participant described persistent attempts by other countries (or by certain elements in other countries) to breach information systems at his nanomanufacturing company. Another described an IP challenge pertaining to research at U.S. universities, as follows:

•due to a culture of openness, especially among students, ideas and research are “leaking out” of universities prior to the initial researchers having patented or fully pursued them;

•there are many foreign students at U.S. universities; and

•there is a current lack of awareness about “leakage” and of university policies or training to counter it.

Additionally, one of our earlier interviewees said that one country targeted specific research projects at U.S. universities—and then required its own citizen-students to apply for admission to each targeted U.S. university and seek work on the targeted project.

Taken together with other factors, this situation can result in an overall failure to protect IP and undermine U.S. research competitiveness. (Although a culture of openness and the presence of foreign students are generally considered strengths of the U.S. system, in this context such factors could represent a challenge to capturing the full value of U.S. investments.)

I would have liked to see a more critical response to the discussion about IP issues given the well-documented concerns regarding IP and its depressing effect on competitiveness as per my June 28, 2012 posting titled: Billions lost to patent trolls; US White House asks for comments on intellectual property (IP) enforcement; and more on IP, my Oct. 10, 2012 posting titled: UN's International Telecommunications Union holds patent summit in Geneva on Oct. 10, 2012, and my Oct. 31, 2011 posting titled: Patents as weapons and obstacles, amongst many, many others here.

This is a very readable report and it answered a few questions for me about the state of nanomanufacturing.

ETA Feb. 10, 2014 at 2:45 pm PDT, The Economist magazine has a Feb. 7, 2014 online article about this new report from the US.

ETA April 2, 2014: There's an April 1, 2014 posting about this report on the Foresight Institute blog titled, US government report highlights flaws in US nanotechnology effort.

Canada-European Union research and Horizon 2020 funding opportunities

Thanks to the Society of Italian Researchers and Professionals of Western Canada (ARPICO), I received a Jan. 15, 2014 notice about ERA-Can's (European Research Area and Canada) upcoming Horizon 2020 information sessions, i.e., funding opportunities for Canadian researchers,

The Canadian partners* to ERA-Can+ invite you to learn about Horizon 2020, a European funding opportunity that is accessible to Canadians working in science, technology, and innovation.

Horizon 2020 is a multi-year (2014-2020) program for science and technology funded by the European Commission. With a budget of almost Euro 80 billion (CAD $118 billion) Horizon 2020 forms a central part of the EU’s economic policy agenda. The program’s main goals are to encourage scientific excellence, increase the competitiveness of industries, and develop solutions to societal challenges in Europe and abroad.

ERA-Can+ has been established to help Canadians access Horizon 2020 funding. Building on several years of successful collaboration, ERA-Can+ will encourage bilateral exchange across the science, technology, and innovation chain. The project will also enrich the EU-Canada policy dialogue, enhance coordination between European and Canadian sector leaders, and stimulate transatlantic collaboration by increasing awareness of the funding opportunities available.

The European Commission released its first call for proposals under Horizon 2020 in December 2013. Canadian and European researchers and innovators can submit proposals for projects in a variety of fields including personalized health and care; food security; the sustainable growth of marine and maritime sectors; digital security; smart cities and communities; competitive low-carbon energy; efficient transportation; waste management; and disaster resilience. Further calls for proposals will be released later this year.

You are invited to attend one of four upcoming information sessions on Horizon 2020 opportunities for Canadians. These sessions will explain the structure of research funding in Europe and provide information on upcoming funding opportunities and the mechanisms by which Canadians can participate. Martina De Sole, Coordinator of ERA-Can+, and numerous Canadian partners will be on hand to share their expertise on these topics. Participants also will have the opportunity to learn about current and developing collaborations between Canadian and European researchers and innovators.

ERA-CAN+ Information Session Dates – Precise times to be confirmed.

Toronto: Morning of January 28th
MaRS Discovery District, 101 College Street

Kitchener-Waterloo: Morning of January 29th
Canadian Digital Media Network, 151 Charles Street West, Suite 100, Kitchener

Ottawa: Morning of January 30th
University of Ottawa; precise location on campus to be confirmed.

Montreal: Morning of January 31st
Intercontinental Hotel, 360 Rue Saint Antoine Ouest

This session is organised in partnership with the Ministère de l’Enseignement supérieur, de la Recherche, de la Science, de la Technologie du Québec.

For further information please contact eracanplus@ppforum.ca.

* ERA-Can+ Project Partners
APRE – Agenzia per la Promozione della Ricerca Europea (Italy)
AUCC – Association of Universities and Colleges of Canada (Canada)
CNRS – Centre National de la Recherche Scientifique (France)
DFATD – Department of Foreign Affairs, Trade and Development Canada (Canada)
DLR – Deutsches Zentrum fur Luft- und Raumfahrt e.V. (Germany)
PPF – The Public Policy Forum (Canada)
ZSI – Zentrum fur Soziale Innovation (Austria)

You can go to ERA-Can’s Information Sessions webpage to register for a specific event.

There are plans to hold sessions elsewhere in Canada,

Plans to have Info Sessions in other parts of Canada are underway.

For further information please contact eracanplus@ppforum.ca

Funding opportunities from the European Union's Horizon 2020 programme and US DARPA's Young Faculty Award program

A Dec. 12, 2013 news item on Nanowerk announces a call for proposals from the European Union’s (EU) massive science funding programme, Horizon 2020, which replaces the EU’s previous Framework Programme 7 initiative,

The European Commission presented for the first time today calls for Proposals under Horizon 2020, the European Union’s new 80 billion euro research and innovation program, which runs from 2014 to 2020. Worth more than 15 billion euros over the first two years, the funding is intended to help boost Europe’s knowledge-driven economy, and tackle issues that will make a difference in people’s lives. International cooperation is a priority in Horizon 2020 with the program open to participation of researchers from across the world, including the United States.

“It’s time to get down to business,” said European Research, Innovation and Science Commissioner Maire Geoghegan-Quinn. “Horizon 2020 funding is vital for the future of research and innovation in Europe, and will contribute to growth, jobs and a better quality of life. We have designed Horizon 2020 to produce results, and we have slashed red tape to make it easier to participate. So I am calling on researchers, universities, businesses including SMEs, and others to sign up!”

A Dec. 11, 2013 EU press release provides more details about the call and about Horizon 2020,

For the first time, the Commission has indicated funding priorities over two years, providing researchers and businesses with more certainty than ever before on the direction of EU research policy. Most calls from the 2014 budget are already open for submissions as of today, with more to follow over the course of the year. Calls in the 2014 budget alone are worth around €7.8 billion, with funding focused on the three key pillars of Horizon 2020:

  • Excellent Science: Around €3 billion, including €1.7 billion for grants from the European Research Council for top scientists and €800 million for Marie Skłodowska-Curie fellowships for younger researchers (see MEMO/13/1123).
  • Industrial Leadership: €1.8 billion to support Europe’s industrial leadership in areas like ICT, nanotechnologies, advanced manufacturing, robotics, biotechnologies and space.
  • Societal challenges: €2.8 billion for innovative projects addressing Horizon 2020’s seven societal challenges, broadly: health; agriculture, maritime and bioeconomy; energy; transport; climate action, environment, resource efficiency and raw materials; reflective societies; and security.
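As a quick sanity check on the figures quoted above, the three pillar allocations sum to about €7.6 billion of the roughly €7.8 billion in 2014 calls, with the remainder presumably covering cross-cutting actions; a trivial sketch:

```python
# Sum of the 2014 pillar allocations quoted in the press release
# (figures in billions of euros; the ~7.8 bn total is also from the release).
pillars = {
    "Excellent Science": 3.0,
    "Industrial Leadership": 1.8,
    "Societal Challenges": 2.8,
}

total = sum(pillars.values())
print(f"Pillar total: {total:.1f} bn EUR of the ~7.8 bn EUR in 2014 calls")
```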

Background

Horizon 2020 is the EU's biggest ever research and innovation framework programme with a seven year budget worth nearly €80 billion. Most EU research funding is allocated on the basis of competitive calls, but the Horizon 2020 budget also includes funding for the Joint Research Centre, the European Commission's in-house science service; the European Institute of Innovation and Technology; and research carried out within the framework of the Euratom Treaty. Separate calls will also be published under specific Partnerships with industry and with Member States (see IP/13/668). In 2014 the total EU research budget, including these items and administrative expenditure, will be around €9.3 billion, rising to around €9.9 billion in 2015. Final 2015 amounts are subject to the decision on the 2015 annual budget.

The funding opportunities under Horizon 2020 are set out in work programmes published on the EU’s digital portal for research funding, which has been redesigned for quicker, paperless procedures. Participants will also find simpler programme architecture and funding, a single set of rules, and a reduced burden from financial controls and audits.

The 2014-15 calls include €500 million over two years dedicated to innovative small and medium-sized enterprises (SMEs) through a brand new SME Instrument. Gender aspects are expected to be included in many of the projects, and there is funding to further stimulate debate on science’s role within society. There are also new rules to make ‘open access’ a requirement for Horizon 2020, so that publications of project results are freely accessible to all.

The EU's Horizon 2020 programme has created a How to get funding? webpage, which should answer your questions and does provide links to applications and more.

Moving on: Jessica Leber writes about a US DARPA (Defense Advanced Research Projects Agency) call for research proposals in her Dec. 11, 2013 article for Fast Company (Note: Links have been removed),

The Pentagon’s advanced research arm is always dreaming up crazy, futuristic technologies that will shape the future of the military and society. DARPA was involved in early Internet development, and these days the agency works on everything from drone-slaying lasers to humanoid robots that could save your life.

Every year, DARPA gives out young faculty awards aimed at recruiting the “rising star” researchers in academia to devote their brains to the military’s technological needs. “The long-term goal of the program is to develop the next generation of scientists and engineers in the research community who will focus a significant portion of their future careers on DoD and National Security issues,” this year’s grant program announcement reads.

A Nov. 19, 2013 DARPA news release describes the Young Faculty Awards program, eligibility (you must be employed in a US institution of higher learning), and their areas of interest,

2014 YFA announcement increases the number of research topics from 13 to 18 and for the first time permits teaming with subcontractors

DARPA defines its research portfolio within a framework that puts the Agency’s enduring mission in the context of tomorrow’s environment for national security and technology. An integral part of this strategy includes establishing and sustaining a pipeline of talented scientists, engineers, and mathematicians who are motivated to pursue high risk, high payoff fundamental research in disciplines that are critical to maintaining the technological superiority of the U.S. military.

DARPA’s Young Faculty Awards (YFA) program addresses this need by funding the work of promising researchers and pairing them with DARPA program managers.  This pairing provides YFA researchers with mentoring and networking opportunities as well as exposure to DoD technology needs and the overall research and development process. The 2014 YFA solicitation includes technical topic areas in the physical sciences, engineering, materials, mathematics, biology, computing, informatics and manufacturing disciplines that are relevant to the research interests of DARPA’s Defense Sciences and Microsystems Technology Offices.

“YFA offers promising junior faculty members and their peers at nonprofit research institutions the chance to do potentially revolutionary work much earlier in their careers than they otherwise could,” said William Casebeer, DARPA program manager for the 2014 class. “By expanding the list of research topics this year from 13 to 18—our largest portfolio since the program started in 2006—we hope to attract even more creative proposals that could lead to future breakthroughs on critical defense challenges. The growth reflects how successful past awardees have been in supporting DARPA’s mission.”

Eligible applicants must be employed in U.S. institutions of higher learning and within five years of appointment to a tenure-track position, or hold equivalent positions at non-profit research institutions.

Researchers selected for YFA grants receive up to $500,000 in funding over a 24-month period. As many as four of the most exceptional performers may be selected to receive up to another $500,000 over an additional year under a DARPA Director’s Fellowship.

DARPA is, for the first time, permitting proposers to form partnerships with subcontractors. The subcontractor relationship cannot exceed 30 percent of the total grant value. In addition to enhancing the competitiveness of proposed research plans, this change is designed to provide young investigators with the opportunity to manage a multidisciplinary team and gain a better understanding of the work performed by a DARPA program manager.

“The YFA program represents a strategic investment in fundamental research and professional development of the next generation of scientists and engineers focused on defense and national security issues,” said Mari Maeda, director of DARPA’s Defense Sciences Office. “It also benefits the young researchers and their institutions by engaging them in valuable, high-risk, high-impact research, providing a mentoring relationship with a DARPA program manager, expanding channels for future ideas to flow, and, now, exposing them to the rigors of managing a multidisciplinary team.”

The list of technical topic areas for 2014 includes:

  • Optimizing Supervision for Improved Autonomy
  • Neurobiological Mechanisms of Social Media Processing
  • Next-generation Neural Sensing for Brain-Machine Interfaces
  • Mathematical and Computational Methods to Identify and Characterize Logical and Causal Relations in Information
  • Time-Dependent Integrated Computational Materials Engineering
  • Long-range Detection of Special Nuclear Materials
  • Alternate Fusion Concepts
  • New Materials and Devices for Monitoring and Modulating Local Physiology
  • Methods and Theory for Fundamental Circuit-Level Understanding of the Human Brain
  • Hierarchically Complex Materials that Respond and Adapt
  • Disruptive Materials Processing
  • Disruptive Computing Architectures
  • Appliqué Antenna Elements for Platform Integration
  • Modeling Phonon Generation and Transport in the Near Junction Region of Wide-Bandgap Transistors
  • Advanced Automation and Microfluidic Technologies for Engineering Biology
  • Energy Recovery in Post-CMOS Technologies
  • Thin Film Transistors for High-performance RF and Power Electronics
  • Neural-inspired Computer Engineering

You can go to http://www.grants.gov/web/grants/view-opportunity.html?oppId=247637 for all the details about DARPA's YFA call for proposals.

As for deadlines, I had some difficulty finding one for the current Horizon 2020 call for proposals, as I gather there are a number of calls being announced in the news item on Nanowerk. You can find more information on the How to participate page, but it is only one of several starting points for your journey through this remarkable and huge funding programme.

Meanwhile, the current deadline for the DARPA YFA proposals is Jan. 7, 2014.

Good luck!

The European Union’s Technology Platform on Nanomedicine (ETPN) and Canada’s Prime Minister Stephen Harper

Sometimes the European Union (EU) projects make my head spin and this is one of those times. From a June 6, 2013 news item on Nanowerk (Note: Links have been removed),

The European Technology Platform on Nanomedicine (ETPN) and the NANOMED2020 project published the White Paper on Contribution of Nanomedicine to Horizon 2020 (pdf), the next European Framework Programme for Research and Innovation till 2020. This strategic document provides a set of key recommendations for the European Commission and the EU Member States to create a favourable ecosystem for the successful deployment of Nanomedicine in Europe. It lays thereby the groundwork to manage the efficient translation of nanotechnology from a Key Enabling Technology (KET) into new and innovative medical products.

I found more information on the ETPN webpage,

Based on an in-depth analysis of the main bottlenecks in the translation of nanomedicine to the market, i.e. the inefficient selection process of translatable projects and the lack of technical infrastructures, such as pharmacology and toxicology facilities, the White Paper strives for the creation of a strong SME [small to medium enterprises] based supply chain for innovative therapeutics and diagnostics as a profitable industrial sector. This will support innovation and keep production and high tech jobs in Europe for the benefit of both European economy and European patients. The pivotal proposition of the White Paper is the establishment of a Nanomedicine Translation Hub designed as an umbrella for a set of complementary actions and initiatives such as:

  • a dedicated Nanomedicine Translation Advisory Board (TAB) with experienced industrial experts to select, guide and push forward the best translatable concepts,
  • a European Nano-Characterisation Laboratory (EU-NCL) for physical, chemical and biological characterisation of nanomaterials intended for medical use,
  • GMP manufacturing pilot lines for clinical batches to assist both academia and SMEs to develop nanomedical materials for validation in clinical trials, before transfer to dedicated manufacturing organisations,
  • a dedicated NanoMed SME instrument aiming at funding discovery projects and innovative SMEs in order to keep excellence in nanomedicine research and development.

I have looked at the White Paper, which is a slim 32 pp. and was particularly struck by this passage,

Based on this experience, the ETPN will cooperate with health care industry organisations to bring together EU, national and private resources to create an SME based supply chain for innovative ground breaking therapeutics and diagnostics. It will focus on nanotechnology based medicine and regenerative medicine, moving European SMEs to a position where their knowledge of development competes with the best in the US and Far East. This strengthening of the nanomedicine supply chain will support nanomedicine innovation and keep high tech jobs in Europe for the benefit both of the European economy and patients. (p. 6)

The ETPN folks are prepared to be generous in light of changing medical and health care research conditions,

Much has changed in the last decade in healthcare companies, with the result that pharmaceutical companies for example are moving to a global Open Innovation model, with a reduction in their own research capacities, especially in Europe. They have found their revenues under pressure and their pipeline of new products unable to provide the income to support the headcount without a loss of jobs. Most large pharmaceutical companies have now radically changed and embraced in-licencing and Open Innovation (OI) to try to reduce risks, to lower costs and to raise innovation standards. One consequence of this move is the cut in R&D staff in Europe’s research centres, and an investment in business development with the aim of technology scouting and in-licencing from academia or SMEs. This global strategy aims to recognise novel putative products or concepts, from SMEs and academics, wherever they originate in the world. Large companies are now focusing their business on late clinical validation, regulatory approval and worldwide distribution. The advantage for large pharmaceutical companies as well as for diagnostic and imaging companies is that it does not require early investment of resources, just the use of the knowledge of what makes a good potential marketable product. However, this OI concept will only succeed in Europe, if projects from the emerging healthcare supply chain of SMEs (and even academics) are developable.

If I read this rightly, they want to keep jobs in Europe and they’re perfectly happy to receive from other parts of the world. It seems to me that this could develop into a one-directional relationship and it provides an interesting contrast to Canadian Prime Minister Stephen Harper’s current campaign for a Canada-EU trade pact. From the Financial Post’s June 13, 2013 article by Stephen Rennie about Harper’s appeal to the British Parliament,

The prime minister’s pitch included talk of Canada’s economic strength relative to much of the rest of the world. Harper argued, as he often has in the past, against protectionism and touted trade as the main driver of prosperity.

“Another value whose certainty has been repeatedly proven, though sadly sometimes more in the breach than the application, is that everyone gains in an open economy,” Harper said.

“Our businesses grow when new markets are opened.”

To sum this up, I find the ETPN project intriguing and applaud the transparency although I find the messages a little confusing. As for Prime Minister Stephen Harper, I hope he has someone on his staff paying attention to these things.

EuroNanoforum workshop about products based on nanocellulose and other wood nanotechnology

EuroNanoforum 2013 will be held in Dublin, Ireland from June 18 – 20, 2013 and will feature a workshop on nanocellulose and other wood nanotechnology products according to an Apr. 18, 2013 news release from CEPI (Confederation of European Paper Industries),

The EuroNanoForum conference focuses on the impact nanotechnology is having in solving societal challenges linked to environmental, energy and health issues. It showcases innovation as a driver of economic growth. It presents new technologies arising from nanoscience and their applications and discusses potential new end products. It addresses commercialisation and co-operative alliances and schemes that accelerate their deployment, whilst also considering other key enabling technologies: advanced materials, nanoelectronics and manufacturing.

Participation will be in excess of 1000 key stakeholders from Europe and elsewhere, including nanotechnology applied researchers, industry stakeholders, and the decision-makers responsible for European R&D funding. This is your opportunity to influence decisions on the future of European Nanotechnology R&D.  [emphasis mine] The event offers a bridge to Horizon 2020, the European Union’s future funding programme for research and innovation (2014-2020). The conference will look at how nanotechnologies will fit into the targeted key priority areas of Horizon 2020: Excellent Science, Industrial Leadership and Societal Challenges.

The Forest Technology Platform, together with the Technical Research Centre of Finland (VTT), is organising a workshop on ‘Products based on wood nanotechnology’ on Thursday 20 June, 09:00 – 10:30h. Wood-based nanotechnologies and the production of nano-based products have many promising application areas. The world production of nanocellulose has just passed 100 ton per year and meanwhile new multi-functional nanoparticle coatings are being developed and the research on wood-based carbon-nanotubes is moving fast.

I had not realized that world production of nanocellulose was so high and this is the first I’ve heard about wood-based carbon nanotubes. Good to know.

There is more information about the workshop itself on the EuroNanoforum workshop page,

… how long will it take until we see the first market applications? In what way will wood-nanotechnologies satisfy the need of consumers? Will nanotechnology-derived applications evolve gradually or will we experience a revolutionary paradigm-shift for the forest-based industries and European consumers?

  • Johan Elvnert, Managing Director, FTP – Introduction to the renewed European Strategic Research and Innovation Agenda (SRA) for the forest-based sector
  • Alexander Bismarck, Professor, Vienna University – Nanosized bacterial cellulose, truly green and fully renewable composites, and novel macroporous polymers
  • Pia Qvintus, VTT – Plant-based nanocellulose – from research to applications
  • Esa Laurinsilta, Director, UPM – The market for nanocellulose in 5 years
  • Anna Suurnäkki, Chief research scientist, VTT – Overview of other emerging nano-application research areas
  • Panel Discussion: Will nanotechnology-derived applications take the market gradually or will we see a paradigm-shift for the forest-based industry and consumers?

I first wrote about EuroNanoforum 2013 in my Mar. 14, 2013 posting where I highlighted some of their ‘Hot Topics’ at the upcoming conference.

Opening up Open Access: European Union, UK, Argentina, US, and Vancouver (Canada)

There is a furor growing internationally and it’s all about open access. It ranges from a petition in the US to a comprehensive ‘open access’ project from the European Union to a decision in the Argentinian Legislature to a speech from David Willetts, UK Minister of State for Universities and Science to an upcoming meeting in June 2012 being held in Vancouver (Canada).

As this goes forward, I’ll try to be clear as to which kind of open access I’m discussing,  open access publication (access to published research papers), open access data (access to research data), and/or both.

The European Commission has adopted a comprehensive approach to giving easy, open access to research funded through the European Union under the auspices of the current 7th Framework Programme and the upcoming Horizon 2020 (or what would have been called the 8th Framework Programme under the old system), according to the May 9, 2012 news item on Nanowerk,

To make it easier for EU-funded projects to make their findings public and more readily accessible, the Commission is funding, through FP7, the project ‘Open access infrastructure for research in Europe’ ( OpenAIRE). This ambitious project will provide a single access point to all the open access publications produced by FP7 projects during the course of the Seventh Framework Programme.

OpenAIRE is a repository network and is based on a technology developed in an earlier project called Driver. The Driver engine trawled through existing open access repositories of universities, research institutions and a growing number of open access publishers. It would index all these publications and provide a single point of entry for individuals, businesses or other scientists to search a comprehensive collection of open access resources. Today Driver boasts an impressive catalogue of almost six million publications taken from 327 open access repositories from across Europe and beyond.

OpenAIRE uses the same underlying technology to index FP7 publications and results. FP7 project participants are encouraged to publish their papers, reports and conference presentations to their institutional open access repositories. The OpenAIRE engine constantly trawls these repositories to identify and index any publications related to FP7-funded projects. Working closely with the European Commission’s own databases, OpenAIRE matches publications to their respective FP7 grants and projects providing a seamless link between these previously separate data sets.
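The indexing-and-matching step described above can be sketched in a few lines. Below is a minimal, hypothetical illustration in Python: the record format, titles, and grant numbers are invented for the example, although the `info:eu-repo/grantAgreement` relation is the convention repositories commonly use to tag EC-funded publications (the real OpenAIRE engine harvests over the OAI-PMH protocol and uses richer metadata schemas).

```python
import re
import xml.etree.ElementTree as ET

# A toy repository response; real harvesters fetch records over OAI-PMH.
# Titles and grant numbers below are invented for illustration.
SAMPLE_RECORDS = """
<records>
  <record>
    <title>Nanoparticle toxicity screening</title>
    <relation>info:eu-repo/grantAgreement/EC/FP7/262163</relation>
  </record>
  <record>
    <title>Open data in materials science</title>
    <relation>info:eu-repo/grantAgreement/EC/FP7/283595</relation>
  </record>
  <record>
    <title>Unrelated local thesis</title>
    <relation>hdl:10062/1234</relation>
  </record>
</records>
"""

# Pattern for the EC/FP7 grant-tagging convention; captures the grant number.
FP7_GRANT = re.compile(r"info:eu-repo/grantAgreement/EC/FP7/(\d+)")

def index_by_grant(xml_text):
    """Map FP7 grant numbers to the titles of publications citing them."""
    index = {}
    for record in ET.fromstring(xml_text).iter("record"):
        title = record.findtext("title")
        relation = record.findtext("relation") or ""
        match = FP7_GRANT.search(relation)
        if match:  # records without an FP7 grant tag are simply skipped
            index.setdefault(match.group(1), []).append(title)
    return index

print(index_by_grant(SAMPLE_RECORDS))
```

Linking the resulting grant numbers to the Commission's own project database is then a straightforward join, which is essentially the "seamless link between these previously separate data sets" the project describes.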

OpenAIRE is also linked to CERN’s open access repository for ‘orphan’ publications. Any FP7 participants that do not have access to their own institutional repository can still submit open access publications by placing them in the CERN repository.

Here’s why I described this project as comprehensive, from the May 9, 2012 news item,

‘OpenAIRE is not just about developing new technologies,’ notes Ms Manola [Natalia Manola, the project’s manager], ‘because a significant part of the project focuses on promoting open access in the FP7 community. We are committed to promotional and policy-related activities, advocating open access publishing so projects can fully contribute to Europe’s knowledge infrastructure.’

The project is collecting usage statistics of the portal and the volume of open access publications. It will provide this information to the Commission and use this data to inform European policy in this domain.

OpenAIRE is working closely to integrate its information with the CORDA database, the master database of all EU-funded research projects. Soon it should be possible to click on a project in CORDIS (the EU’s portal for research funding), for example, and access all the open access papers published by that project. Project websites will also be able to provide links to the project’s peer reviewed publications and make dissemination of papers virtually effortless.

The project participants are also working with EU Members to develop a European-wide ‘open access helpdesk’ which will answer researchers’ questions about open access publishing and coordinate the open access initiatives currently taking place in different countries. The helpdesk will build up relationships and identify additional open access repositories to add to the OpenAIRE network.

Meanwhile, there’s been a discussion on the UK’s Guardian newspaper website about an ‘open access’ issue, money,  in a May 9, 2012 posting by John Bynner,

The present academic publishing system obstructs the free communication of research findings. By erecting paywalls, commercial publishers prevent scientists from downloading research papers unless they pay substantial fees. Libraries similarly pay huge amounts (up to £1m or more per annum) to give their readers access to online journals.

There is general agreement that free and open access to scientific knowledge is desirable. The way this might be achieved has come to the fore in recent debates about the future of scientific and scholarly journals.

Our concern lies with the major proposed alternative to the current system. Under this arrangement, authors are expected to pay when they submit papers for publication in online journals: the so called “article processing cost” (APC). The fee can amount to anything between £1,000 and £2,000 per article, depending on the reputation of the journal. Although the fees may sometimes be waived, eligibility for exemption is decided by the publisher and such concessions have no permanent status and can always be withdrawn or modified.

A major problem with the APC model is that it effectively shifts the costs of academic publishing from the reader to the author and therefore discriminates against those without access to the funds needed to meet these costs. [emphasis mine] Among those excluded are academics in, for example, the humanities and the social sciences whose research funding typically does not include publication charges, and independent researchers whose only means of paying the APC is from their own pockets. Academics in developing countries in particular face discrimination under APC because of their often very limited access to research funds.

There is another approach that could be implemented for a fraction of the cost of commercial publishers’ current journal subscriptions. “Access for all” (AFA) journals, which charge neither author nor reader, are committed to meeting publishing costs in other ways.

Bynner offers a practical solution: have the libraries pay their subscription fees to an AFA journal, thereby funding ‘access for all’.

The open access discussion in the UK hasn’t stopped with a few posts in the Guardian; there’s also support from the government. David Willetts, in a May 2, 2012 speech to the UK Publishers Association Annual General Meeting, had this to say, from the UK’s Department for Business, Innovation and Skills website,

I realise this move to open access presents a challenge and opportunity for your industry, as you have historically received funding by charging for access to a publication. Nevertheless that funding model is surely going to have to change even beyond the positive transition to open access and hybrid journals that’s already underway. To try to preserve the old model is the wrong battle to fight. Look at how the music industry lost out by trying to criminalise a generation of young people for file sharing. [emphasis mine] It was companies outside the music business such as Spotify and Apple, with iTunes, that worked out a viable business model for access to music over the web. None of us want to see that fate overtake the publishing industry.

Wider access is the way forward. I understand the publishing industry is currently considering offering free public access to scholarly journals at all UK public libraries. This is a very useful way of extending access: it would be good for our libraries too, and I welcome it.

It would be deeply irresponsible to get rid of one business model and not put anything in its place. That is why I hosted a roundtable at BIS in March last year when all the key players discussed these issues. There was a genuine willingness to work together. As a result I commissioned Dame Janet Finch to chair an independent group of experts to investigate the issues and report back. We are grateful to the Publishers Association for playing a constructive role in her exercise, and we look forward to receiving her report in the next few weeks. No decisions will be taken until we have had the opportunity to consider it. But perhaps today I can share with you some provisional thoughts about where we are heading.

The crucial options are, as you know, called green and gold. Green means publishers are required to make research openly accessible within an agreed embargo period. This prompts a simple question: if an author’s manuscript is publicly available immediately, why should any library pay for a subscription to the version of record of any publisher’s journal? If you do not believe there is any added value in academic publishing you may view this with equanimity. But I believe that academic publishing does add value. So, in determining the embargo period, it’s necessary to strike a suitable balance between enabling revenue generation for publishers via subscriptions and providing public access to publicly funded information. In contrast, gold means that research funding includes the costs of immediate open publication, thereby allowing for full and immediate open access while still providing revenue to publishers.

In a May 22, 2012 posting at the Guardian website, Mike Taylor offers some astonishing figures (I had no idea academic publishing has been quite so lucrative) and notes that the funders have been a driving force in this ‘open access’ movement (Note: I have removed links from the excerpt),

The situation again, in short: governments and charities fund research; academics do the work, write and illustrate the papers, peer-review and edit each others’ manuscripts; then they sign copyright over to profiteering corporations who put it behind paywalls and sell research back to the public who funded it and the researchers who created it. In doing so, these corporations make grotesque profits of 32%-42% of revenue – far more than, say, Apple’s 24% or Penguin Books’ 10%. [emphasis mine]

… But what makes this story different from hundreds of other cases of commercial exploitation is that it seems to be headed for a happy ending. That’s taken some of us by surprise, because we thought the publishers held all the cards. Academics tend to be conservative, and often favour publishing their work in established paywalled journals rather than newer open access venues.

The missing factor in this equation is the funders. Governments and charitable trusts that pay academics to carry out research naturally want the results to have the greatest possible effect. That means publishing those results openly, free for anyone to use.

Taylor also goes on to mention the ongoing ‘open access’ petition in the US,

There is a feeling that the [US] administration fully understands the value of open access, and that a strong demonstration of public concern could be all it takes now to goad it into action before the November election. To that end a Whitehouse.gov petition has been set up urging Obama to “act now to implement open access policies for all federal agencies that fund scientific research”. Such policies would bring the US in line with the UK and Europe.

The people behind the US campaign have produced a video,

Anyone wondering about the reference to Elsevier may want to check out Thomas Lin’s Feb. 13, 2012 article for the New York Times,

More than 5,700 researchers have joined a boycott of Elsevier, a leading publisher of science journals, in a growing furor over open access to the fruits of scientific research.

You can find out more about the boycott and the White House petition at the Cost of Knowledge website.

Meanwhile, Canadians are being encouraged to sign the petition (by June 19, 2012), according to the folks over at ScienceOnline Vancouver in a description of their June 12, 2012 event, Naked Science: Excuse me, your science is showing (a cheap, cheesy, and attention-getting title—why didn’t I think of it first?),

Exposed. Transparent. Nude. All adjectives that should describe access to scientific journal articles, but currently, that’s not the case. The research paid for by our Canadian taxpayer dollars is locked behind doors. The only way to access these articles is money, and lots of it!

Right now a research article costs more than a book! About $30. Only people with university affiliations have access, and only to the journals their libraries subscribe to. Moms, dads, sisters, brothers, journalists, students, scientists, all pay for research, yet they can’t read the articles about that research without paying for it again. Now that doesn’t make sense.

….

There is also a petition going around that states that research paid for by US taxpayer dollars should be available for free to US taxpayers (and others!) on the internet. Don’t worry if you are a Canadian citizen; by signing this petition, Canadians would get access to the US research too and it would help convince the Canadian government to adopt similar rules. [emphasis mine]

Here’s where you can go to sign the petition. Whether this will encourage the Canadian government to adopt an open access philosophy, I do not know. On the one hand, the government has opened up access to data, notably Statistics Canada data, mentioned by Frances Woolley in her March 22, 2012 posting about that and other open access data initiatives by the Canadian government on the Globe and Mail blog,

The federal government is taking steps to build the country’s data infrastructure. Last year saw the launch of the open data pilot project, data.gc.ca. Earlier this year the paywall in front of Statistics Canada’s enormous CANSIM database was taken down. The National Research Council, together with University of Guelph and Carleton University, has a new data registration service, DataCite, which allows Canadian researchers to give their data permanent names in the form of digital object identifiers. In the long run, these projects should, as the press releases claim, “support innovation”, “add value-for-money for Canadians,” and promote “the reuse of existing data in commercial applications.”
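The digital object identifiers Woolley mentions resolve through the global doi.org proxy, like any other DOI. Here is a minimal sketch of turning a bare identifier into a resolvable link; the DOI shown is hypothetical, invented for illustration, not a real DataCite record.

```python
def doi_to_url(doi: str) -> str:
    """Normalize a DOI string and return its doi.org resolver URL."""
    doi = doi.strip()
    # Accept the common "doi:" prefix as well as bare identifiers.
    if doi.lower().startswith("doi:"):
        doi = doi[4:]
    return "https://doi.org/" + doi

# Hypothetical dataset DOI, for illustration only.
print(doi_to_url("doi:10.12345/example.dataset.2012"))
# → https://doi.org/10.12345/example.dataset.2012
```

Because the name is permanent, the link keeps working even if the dataset moves to a new host; the resolver, not the citation, is updated.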

That seems promising but there is a countervailing force. The Canadian government has also begun to charge subscription fees for journals that were formerly free. From the March 8, 2011 posting by Emily Chung on the CBC’s (Canadian Broadcasting Corporation) Quirks and Quarks blog,

The public has lost free online access to more than a dozen Canadian science journals as a result of the privatization of the National Research Council’s government-owned publishing arm.

Scientists, businesses, consultants, political aides and other people who want to read about new scientific discoveries in the 17 journals published by National Research Council Research Press now either have to pay $10 per article or get access through an institution that has an annual subscription.

It caused no great concern at the time,

Victoria Arbour, a University of Alberta graduate student, published her research in the Canadian Journal of Earth Sciences, one of the Canadian Science Publishing journals, both before and after it was privatized. She said it “definitely is too bad” that her new articles won’t be available to Canadians free online.

“It would have been really nice,” she said. But she said most journals aren’t open access, and the quality of the journal is a bigger concern than open access when choosing where to publish.

Then, there’s this from the new publisher, Canadian Science Publishing,

Cameron Macdonald, executive director of Canadian Science Publishing, said the impact of the change in access is “very little” on the average scientist across Canada because subscriptions have been purchased by many universities, federal science departments and scientific societies.

“I think the vast majority of researchers weren’t all that concerned,” he said. “So long as the journals continued with the same mission and mandate, they were fine with that.”

Macdonald said the journals were never strictly open access, as online access was free only inside Canadian borders and only since 2002.

So, journals that offered open access to research funded by Canadian taxpayers (to Canadians only) are now behind paywalls. Chung’s posting notes the problem already mentioned in the UK Guardian postings, money,

“It’s pretty prohibitively expensive to make things open access, I find,” she [Victoria Arbour] said.

Weir [Leslie Weir, chief librarian at the University of Ottawa] said more and more open-access journals need to impose author fees to stay afloat nowadays.

Meanwhile, the cost of electronic subscriptions to research journals has been ballooning as library budgets remain frozen, she said.

So far, no one has come up with a solution to the problem. [emphasis mine]

It seems they have designed a solution in the UK, as noted in John Bynner’s posting; perhaps we could try it out here.

Before I finish up, I should get to the situation in Argentina, from the May 27, 2012 posting on the Pasco Phronesis (David Bruggeman) blog (Note: I have removed a link in the following),

The lower house of the Argentinian legislature has approved a bill (en Español) that would require research results funded by the government be placed in institutional repositories once published.  There would be exceptions for studies involving confidential information and the law is not intended to undercut intellectual property or patent rights connected to research.  Additionally, primary research data must be published within 5 years of their collection.  This last point would, as far as I can tell, be new ground for national open access policies, depending on how quickly the U.S. and U.K. may act on this issue.

Argentina steals a march on everyone by offering open access publication and open access data, within certain, reasonable constraints.

Getting back to David’s May 27, 2012 posting, he also offers some information on the European Union situation and some thoughts on science policy in Egypt.

I have long been interested in open access publication as I feel it’s infuriating to be denied access to research that one has paid for in tax dollars. I have written on the topic before in my Beethoven inspires Open Research (Nov. 18, 2011 posting) and Princeton goes Open Access; arXiv is 10 years old (Sept. 30, 2011 posting) and elsewhere.

ETA May 28, 2012: I found this NRC Research Press website for the NRC journals and it states,

We are pleased to announce that Canadians can enjoy free access to over 100 000 back files of NRC Research Press journals, dating back to 1951. Access to material in these journals published after December 31, 2010, is available to Canadians through subscribing universities across Canada as well as the major federal science departments.

Concerned readers and authors whose institutes have not subscribed for the 2012 volume year can speak to their university librarians or can contact us to subscribe directly.

It’s good to see Canadians still have some access, although personally, I do prefer to read recent research.

ETA May 29, 2012: Yikes, I think this is one of the longest posts ever, and I’m going to add this info about libre redistribution and data mining as they relate to open access in this attempt to cover the topic as fully as possible in one posting.

First, here’s an excerpt from Ross Mounce’s May 28, 2012 posting on the Palaeophylophenomics blog about ‘libre redistribution’ (Note: I have removed a link),

I predict that the rights to electronically redistribute, and machine-read research will be vital for 21st century research – yet currently we academics often wittingly or otherwise relinquish these rights to publishers. This has got to stop. The world is networked, thus scholarly literature should move with the times and be openly networked too.

To better understand the notion of ‘libre redistribution’ you’ll want to read more of Mounce’s comments, but you might also want to check out Cameron Neylon’s comments in his March 6, 2012 posting on the Science in the Open blog,

Centralised control, failure to appreciate scale, and failure to understand the necessity of distribution and distributed systems. I have with me a device capable of holding the text of perhaps 100,000 papers. It also has the processor power to mine that text. It is my phone. In 2-3 years our phones, hell our watches, will have the capacity to not only hold the world’s literature but also to mine it, in context for what I want right now. Is Bob Campbell ready for every researcher, indeed every interested person in the world, to come into his office and discuss an agreement for text mining? Because the mining I want to do and the mining that Peter Murray-Rust wants to do will be different, and what I will want to do tomorrow is different to what I want to do today. This kind of personalised mining is going to be the accepted norm of handling information online very soon and will be at the very centre of how we discover the information we need.

This moves the discussion past access (taxpayers not seeing the research they’ve funded, researchers who don’t have subscriptions, libraries without subscriptions, etc.) to what happens when you can get access freely. It opens up new ways of doing research by means of text mining, data mining, and the redistribution of both.
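As a rough illustration of the kind of machine reading Neylon describes, here is a minimal term-frequency sketch in Python. The two “abstracts” are invented stand-ins for an openly accessible corpus; real text mining would involve far larger collections and far smarter linguistics.

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into alphabetic tokens."""
    return re.findall(r"[a-z]+", text.lower())

def term_frequencies(documents,
                     stopwords=frozenset({"the", "of", "and", "a", "to", "in", "is"})):
    """Count how often each non-stopword term appears across a corpus."""
    counts = Counter()
    for doc in documents:
        counts.update(t for t in tokenize(doc) if t not in stopwords)
    return counts

# Invented sample "abstracts" standing in for an open-access corpus.
corpus = [
    "Open access publishing removes paywalls from research articles.",
    "Text mining of research articles depends on open access.",
]

top_terms = term_frequencies(corpus).most_common(3)
```

The point of Neylon’s argument is that nothing in a loop like this cares whether it runs on a server or a phone; the only barrier is whether the papers can be legally fetched and machine-read in the first place.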

Science research spending and innovation in Europe and reflections on the Canadian situation

I thought I’d pull together some information about science funding and innovation for closer examination. First, in early July 2011 the European Union announced plans for a huge spending increase, approximately 45%, for science. Their current programme, the Seventh Framework Programme (US$79B budget), is coming to an end in 2013, and the next iteration will be called Horizon 2020 (proposed US$114B budget). Here’s more from Kit Eaton’s July 6, 2011 article on Fast Company,

The proposal still awaits approval by the E.U.’s parliament and member states, but just getting this far is a milestone. The next phase is to forge spending into the next generation of the E.U.’s Framework Programme, which is its main research spending entity, to produce a plan called Horizon 2020. The spending shift has been championed by E.U. research commissioner Máire Geoghegan-Quinn, and means that the share of the E.U. budget portioned out for scientific research will eventually double from its 4.5% figure in 2007 to 9% in 2020.

How will Europe pay for it? This is actually the biggest trick being pulled off: More than €4.5 billion would be transferred from the E.U.’s farm subsidies program, the Common Agricultural Policy. This is the enormous pile of cash paid by E.U. authorities to farmers each year to keep them in business, to keep food products rolling off the production line, and to keep fields fallow–as well as to diversify their businesses.

Nature journal also covered the news in a July 5, 2011 article by Colin Macilwane,

Other research advocates say that the proposal — although falling short of the major realignment of funding priorities they had been hoping for — was as good as could be expected in the circumstances. “Given the times we’re in, we couldn’t realistically have hoped for much more,” says Dieter Imboden, president of Eurohorcs, the body representing Europe’s national research agencies.

Geoghegan-Quinn told Nature that the proposal was “a big vote of confidence in science” but also called on researchers to push to get the proposal implemented — especially in their home countries. “The farmers will be out there lobbying, and scientists and researchers need to do the same,” she says.

While the European Union wrangles over a budget that could double their investment in science research, Canadians evince, at best, a mild interest in science research.

The latest Science, Technology and Innovation Council report, State of the Nation 2010: Canada’s Science, Technology and Innovation System, was released in June 2011 and has, so far, occasioned little interest despite an article in the Globe & Mail and a Maclean’s blog posting by Paul Wells. Hopefully, The Black Hole Blog, where Beth Swan and David Kent are writing a series about the report, will be able to stimulate some discussion.

From Beth’s July 12, 2011 posting,

The report – at least the section I’m talking about today – is based on data from the Organisation for Economic Co-operation and Development’s (OECD) Programme for International Student Assessment and Statistics Canada. Some of the interesting points include:

  • 15-year-old Canadians rank in the top 10 of OECD countries for math and science in 2009.
  • 80% of 15-19 year-old Canadians are pursuing a formal education, which is lower than the OECD average
  • But Canada ranks 1st in OECD countries for adults (ages 25–64 years) in terms of the percentage of the population with a post-secondary education (49%)
  • The numbers of Canadian students in science and engineering at the undergraduate level increased (18% increase in the number of science undergraduate degrees, 9% increase in the number of engineering undergraduate degrees) in 2008 compared to 2005

This all begs the question, though, of what those science-based graduates do once they graduate. It’s something that we’ve talked about a fair bit here on the Black Hole, and the STIC report gives us some unhappy data on it. Canada had higher unemployment rates for science-based PhDs (~3-4%) than other OECD countries (e.g., in the US, it’s about ~1-1.5%). Specifically, in 2006 Canada had the highest rates of unemployment among the OECD countries for the medical sciences (3%) and engineering (4%), and the third highest rate for the natural sciences (3%).

David, in his July 16, 2011 posting, focuses on direct and indirect Canadian federal government Research & Development (R&D) spending,

It appears from a whole host of statistics, reports, etc. that Canada lags in innovation, but what is the government’s role in helping to nurture its advancement? Is it simply to create fertile ground for “the market” to do its work? Or is it a more interventionist style of determining what sorts of projects the country needs and investing as such? Perhaps it involves altering the way we train and inspire our young people?

Beth then comments on Canadian business R&D investment, which has always been a low priority according to the material I’ve read, in her July 25, 2011 posting,

Taken together, this shows a rather unfavourable trend of Canadian businesses not investing in research & development – i.e., not contributing to innovation. We know from Dave’s last posting that Canada is not very good at contributing direct funds to research, and my first posting in this series illustrated that while Canada is pretty good at getting PhDs trained, we are not so good at having jobs for those PhDs once they have finished their schooling.

The latest July 27, 2011 posting from David asks the age-old question, Why does Canada lag in R&D spending?

Many reports have been written over the past 30 years about Canada and its R&D spending, and they clamour one after the other about Canada’s relative lack of investment into R&D.  We’ve been through periods of deep cutbacks and periods of very strong growth, yet one thing remains remarkably consistent – Canada underspends on R&D relative to other countries.

The waters around such questions are extremely murky and tangible outcomes are tough to identify and quantify when so many factors are at play.  What does seem reasonable though is to ask where this investment gap is filled from in other countries that currently outstrip Canada’s spending – is it public money, private money, foreign money, or domestic money?  Hopefully these questions are being asked and answered before we set forth on another 30 year path of poor relative investment.

As I stated in my submission to the federal government’s R&D review panel and noted in my March 15, 2011 posting about the ‘Innovation’ consultation, I think we need to approach the issues in more imaginative ways.