Tag Archives: OSTP

The US White House and its Office of Science and Technology Policy (OSTP)

It’s been a while since I first wrote this but I believe this situation has not changed.

There’s some consternation regarding the US Office of Science and Technology Policy’s (OSTP) diminishing size and lack of leadership. From a July 3, 2017 article by Bob Grant for The Scientist (Note: Links have been removed),

Three OSTP staffers did leave last week, but it was because their prearranged tenures at the office had expired, according to an administration official familiar with the situation. “I saw that there were some tweets and what-not saying that it’s zero,” the official tells The Scientist. “That is not true. We have plenty of PhDs that are still on staff that are working on science. All of the work that was being done by the three who left on Friday had been transitioned to other staffers.”

At least one of the tweets that the official is referring to came from Eleanor Celeste, who announced leaving OSTP, where she was the assistant director for biomedical and forensic sciences. “science division out. mic drop,” she tweeted on Friday afternoon.

The administration official concedes that the OSTP is currently in a state of “constant flux” and at a “weird transition period” at the moment, and expects change to continue. “I’m sure that the office will look even more different in three months than it does today, than it did six months ago,” the official says.

In two articles for Science Magazine, Jeffrey Mervis provides more detail. From his July 11, 2017 article,

OSTP now has 35 staffers, says an administration official who declined to be named because they weren’t authorized to speak to the media. Holdren [John Holdren], who in January [2017] returned to Harvard University, says the plunge in staff levels is normal during a presidential transition. “But what’s shocking is that, this far into the new administration, the numbers haven’t gone back up.”

The office’s only political appointee is Michael Kratsios, a former aide to Trump confidant and Silicon Valley billionaire Peter Thiel. Kratsios is serving as OSTP’s deputy chief technology officer and de facto OSTP head. Eight new detailees have arrived from other agencies since the inauguration.

Although there has been no formal reorganization of OSTP, a “smaller, more collaborative staff” is now grouped around three areas—science, technology, and national security—according to the Trump aide. Three holdovers from Obama’s OSTP are leading teams focused on specific themes—Lloyd Whitman in technology, Chris Fall in national security, and Deerin Babb-Brott in environment and energy. They report to Kratsios and Ted Wackler, a career civil servant who was Holdren’s deputy chief of staff and who joined OSTP under former President George W. Bush.

“It’s a very flat structure,” says the Trump official, consistent with the administration’s view that “government should be looking for ways to do more with less.” Ultimately, the official adds, the goal is [for OSTP] to have “probably closer to 50 [people].”

A briefing book prepared by Obama’s outgoing OSTP staff may be a small but telling indication of the office’s current status. The thick, three-ring binder, covering 100 issues, was modeled on one that Holdren received from John “Jack” Marburger, Bush’s OSTP director. “Jack did a fabulous job of laying out what OSTP does, including what reports it owes Congress, so we decided to do likewise,” Holdren says. “But nobody came [from Trump’s transition team] to collect it until a week before the inauguration.”

That person was Reed Cordish, the 43-year-old scion of billionaire real estate developer David Cordish. An English major in college, Reed Cordish was briefly a professional tennis player before joining the family business. He “spent an hour with us and took the book away,” Holdren says. “He told us, ‘This is an important operation and I’ll do my best to see that it flourishes.’ But we don’t know … whether he has the clout to make that happen.”

Cordish is now assistant to the president for intragovernmental and technology initiatives. He works in the new Office of American Innovation led by presidential son-in-law Jared Kushner. That office arranged a recent meeting with high-tech executives, and is also leading yet another White House attempt to “reinvent” government.

Trump has renewed the charter of the National Science and Technology Council, a multiagency group that carries out much of the day-to-day work of advancing the president’s science initiatives. … Still pending is the status of the President’s Council of Advisors on Science and Technology [emphasis mine], a body of eminent scientists and high-tech industry leaders that went out of business at the end of the Obama administration.

Mervis’ July 12, 2017 article is in the form of a Q&A (question and answer) session with the previously mentioned John Holdren, director of the OSTP in Barack Obama’s administration,

Q: Why did you have such a large staff?

A: One reason was to cover the bases. We knew from the start that the Obama administration thought cybersecurity would be an important issue and we needed to be capable in that space. We also knew we needed people who were capable in climate change, in science and technology for economic recovery and job creation and sustained economic growth, and people who knew about advanced manufacturing and nanotechnology and biotechnology.

We also recruited to carry out specific initiatives, like in precision medicine, or combating antibiotic resistance, or the BRAIN [Brain Research through Advancing Innovative Neurotechnologies] initiative. Most of the work will go on in the departments and agencies, but you need someone to oversee it.

The reason we ended up with 135 people at our peak, which was twice the number during its previous peak in the Clinton administration’s second term, was that this president was so interested in knowing what science could do to advance his agenda, on economic recovery, or energy and climate change, or national intelligence. He got it. He didn’t need to be tutored on why science and technology matters.

I feel I’ve been given undue credit for [Obama] being a science geek. It wasn’t me. He came that way. He was constantly asking what we could do to move the needle. When the first flu epidemic, H1N1, came along, the president immediately turned to me and said, “OK, I want [the President’s Council of Advisors on Science and Technology] to look in depth on this, and OSTP, and NIH [National Institutes of Health], and [the Centers for Disease Control and Prevention].” And he told us to coordinate my effort on this stuff—inform me on what can be done and assemble the relevant experts. It was the same with Ebola, with the Macondo oil spill in the Gulf, with Fukushima, where the United States stepped up to work with the Japanese.

It’s not that we had all the expertise. But our job was to reach out to those who did have the relevant expertise.

Q: OSTP now has 35 people. What does that level of staffing say to you?

A: I have to laugh.

Q: Why?

A: When I left, on 19 January [2017], we were down to 30 people. And a substantial fraction of the 30 were people who, in a sense, keep the lights on. They were the OSTP general counsel and deputy counsel, the security officer and deputy, the budget folks, the accounting folks, the executive director of NSTC [National Science and Technology Council].

There are some scientists left, and there are some scientists there still. But on 30 June the last scientist in the science division left.

Somebody said OSTP has shut down. But that’s not quite it. There was no formal decision to shut anything down. But they did not renew the contract of the last remaining science folks in the science division.

I saw somebody say, “Well, we still have some Ph.D.s left.” And that’s undoubtedly true. There are still some science Ph.D.s left in the national security and international affairs division. But because [OSTP] is headless, they have no direct connection to the president and his top advisers.

I don’t want to disparage the top people there. The top people there now are Michael Kratsios, who they named the deputy chief technology officer, and Ted Wackler, who was my deputy chief of staff and who was [former OSTP Director] Jack Marburger’s deputy, and who I kept because he’s a fabulously effective manager. And I believe that they are doing everything they can to make sure that OSTP, at the very least, does the things it has to do. … But right now I think OSTP is just hanging on.

Q: Why did some people choose to stay on?

A: A large portion of OSTP staff are borrowed from other agencies, and because the White House is the White House, we get the people we need. These are dedicated folks who want to get the job done. They want to see science and technology applied to advance the public interest. And they were willing to stay and do their best despite the considerable uncertainty about their future.

But again, most of the detailees, and the reason we went from 135 to 30 almost overnight, is that it’s pretty standard for the detailees to go back to their home agencies and wait for the next administration to decide what set of detailees it wants to advance its objectives.

So there’s nothing shocking that most of the detailees went back to their home agencies. The people who stayed are mostly employed directly by OSTP. What’s shocking is that, this far into the new administration, that number hasn’t gone back up. That is, they have only five more people than they had on January 20 [2017].

As I had been wondering about the OSTP and about the President’s Council of Advisors on Science and Technology (PCAST), it was good to get an update.

On a more parochial note, we in Canada are still waiting for an announcement about who our Chief Science Advisor might be.

$1.4B for US National Nanotechnology Initiative (NNI) in 2017 budget

According to an April 1, 2016 news item on Nanowerk, the US National Nanotechnology Initiative (NNI) has released its 2017 budget supplement,

The President’s Budget for Fiscal Year 2017 provides $1.4 billion for the National Nanotechnology Initiative (NNI), affirming the important role that nanotechnology continues to play in the Administration’s innovation agenda. Cumulatively totaling nearly $24 billion since the inception of the NNI in 2001, the President’s 2017 Budget supports nanoscale science, engineering, and technology R&D at 11 agencies.

Another 9 agencies have nanotechnology-related mission interests or regulatory responsibilities.

An April 1, 2016 NNI news release, which originated the news item, affirms the Obama administration’s commitment to the NNI and notes the supplement serves as an annual report amongst other functions,

Throughout its two terms, the Obama Administration has maintained strong fiscal support for the NNI and has implemented new programs and activities to engage the broader nanotechnology community to support the NNI’s vision that the ability to understand and control matter at the nanoscale will lead to new innovations that will improve our quality of life and benefit society.

This Budget Supplement documents progress of these participating agencies in addressing the goals and objectives of the NNI. It also serves as the Annual Report for the NNI called for under the provisions of the 21st Century Nanotechnology Research and Development Act of 2003 (Public Law 108-153, 15 USC §7501). The report also addresses the requirement for Department of Defense reporting on its nanotechnology investments, per 10 USC §2358.

For additional details and to view the full document, visit www.nano.gov/2017BudgetSupplement.

I don’t seem to have posted about the 2016 NNI budget allotment, but 2017’s $1.4B represents a drop of $100M from 2015’s $1.5B allotment.

The 2017 NNI budget supplement describes the NNI’s main focus,

Over the past year, the NNI participating agencies, the White House Office of Science and Technology Policy (OSTP), and the National Nanotechnology Coordination Office (NNCO) have been charting the future directions of the NNI, including putting greater focus on promoting commercialization and increasing education and outreach efforts to the broader nanotechnology community. As part of this effort, and in keeping with recommendations from the 2014 review of the NNI by the President’s Council of Advisors on Science and Technology, the NNI has been working to establish Nanotechnology-Inspired Grand Challenges, ambitious but achievable goals that will harness nanotechnology to solve National or global problems and that have the potential to capture the public’s imagination. Based upon inputs from NNI agencies and the broader community, the first Nanotechnology-Inspired Grand Challenge (for future computing) was announced by OSTP on October 20, 2015, calling for a collaborative effort to “create a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.” This Grand Challenge has generated broad interest within the nanotechnology community—not only NNI agencies, but also industry, technical societies, and private foundations—and planning is underway to address how the agencies and the community will work together to achieve this goal. Topics for additional Nanotechnology-Inspired Grand Challenges are under review.

Interestingly, it also offers an explanation of the images on its cover (Note: Links have been removed),

[Cover image: 2017 NNI Budget Supplement]

About the cover

Each year’s National Nanotechnology Initiative Supplement to the President’s Budget features cover images illustrating recent developments in nanotechnology stemming from NNI activities that have the potential to make major contributions to National priorities. The text below explains the significance of each of the featured images on this year’s cover.

[Front cover close-up image]

Front cover featured images (above): Images illustrating three novel nanomedicine applications.

Center: microneedle array for glucose-responsive insulin delivery imaged using fluorescence microscopy. This “smart insulin patch” is based on painless microneedles loaded with hypoxia-sensitive vesicles ~100 nm in diameter that release insulin in response to high glucose levels. Dr. Zhen Gu and colleagues at the University of North Carolina (UNC) at Chapel Hill and North Carolina State University have demonstrated that this patch effectively regulates the blood glucose of type 1 diabetic mice with faster response than current pH-sensitive formulations. The inset image on the lower right shows the structure of the nanovesicles; each microneedle contains more than 100 million of these vesicles. The research was supported by the American Diabetes Association, the State of North Carolina, the National Institutes of Health (NIH), and the National Science Foundation (NSF).

Left: colorized rendering of a candidate universal flu vaccine nanoparticle. The vaccine molecule, developed at the NIH Vaccine Research Center, displays only the conserved part of the viral spike and stimulates the production of antibodies to fight against the ever-changing flu virus. The vaccine is engineered from a ~13 nm ferritin core (blue) combined with a 7 nm influenza antigen (green). Image credit: NIH National Institute of Allergy and Infectious Diseases (NIAID).

Right: colorized scanning electron micrograph of Ebola virus particles on an infected VERO E6 cell. Blue represents individual Ebola virus particles. The image was produced by John Bernbaum and Jiro Wada at NIAID. When the Ebola outbreak struck in 2014, the Food and Drug Administration authorized emergency use of lateral flow immunoassays for Ebola detection that use gold nanoparticles for visual interpretation of the tests.

[Back cover close-up image]

Back cover featured images (above): Images illustrating examples of NNI educational outreach activities.

Center: Comic from the NSF/NNI competition Generation Nano: Small Science Superheroes. Illustration by Amina Khan, NSF.

Left of Center: Polymer Nanocone Array (biomimetic of antimicrobial insect surface) by Kyle Nowlin, UNC-Greensboro, winner from the first cycle of the NNI’s student image contest, EnvisioNano.

Right of Center: Gelatin Nanoparticles in Brain (nasal delivery of stroke medication to the brain) by Elizabeth Sawicki, University of Illinois at Urbana-Champaign, winner from the second cycle of EnvisioNano.

Outside right: still photo from the video Chlorination-less (water treatment method using reusable nanodiamond powder) by Abelardo Colon and Jennifer Gill, University of Puerto Rico at Rio Piedras, the winning video from the NNI’s Student Video Contest.

Outside left: Society of Emerging NanoTechnologies (SENT) student group at the University of Central Florida, one of the initial nodes in the developing U.S. Nano and Emerging Technologies Student Network; photo by Alexis Vilaboy.

US White House’s grand computing challenge could mean a boost for research into artificial intelligence and brains

An Oct. 20, 2015 posting by Lynn Bergeson on Nanotechnology Now announces a US White House challenge incorporating nanotechnology, computing, and brain research (Note: A link has been removed),

On October 20, 2015, the White House announced a grand challenge to develop transformational computing capabilities by combining innovations in multiple scientific disciplines. See https://www.whitehouse.gov/blog/2015/10/15/nanotechnology-inspired-grand-challenge-future-computing The Office of Science and Technology Policy (OSTP) states that, after considering over 100 responses to its June 17, 2015, request for information, it “is excited to announce the following grand challenge that addresses three Administration priorities — the National Nanotechnology Initiative, the National Strategic Computing Initiative (NSCI), and the BRAIN initiative.” The grand challenge is to “[c]reate a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.”

Here’s where the Oct. 20, 2015 posting by Lloyd Whitman, Randy Bryant, and Tom Kalil for the US White House blog, which originated the news item, gets interesting,

 While it continues to be a national priority to advance conventional digital computing—which has been the engine of the information technology revolution—current technology falls far short of the human brain in terms of both the brain’s sensing and problem-solving abilities and its low power consumption. Many experts predict that fundamental physical limitations will prevent transistor technology from ever matching these twin characteristics. We are therefore challenging the nanotechnology and computer science communities to look beyond the decades-old approach to computing based on the Von Neumann architecture as implemented with transistor-based processors, and chart a new path that will continue the rapid pace of innovation beyond the next decade.

There are growing problems facing the Nation that the new computing capabilities envisioned in this challenge might address, from delivering individualized treatments for disease, to allowing advanced robots to work safely alongside people, to proactively identifying and blocking cyber intrusions. To meet this challenge, major breakthroughs are needed not only in the basic devices that store and process information and the amount of energy they require, but in the way a computer analyzes images, sounds, and patterns; interprets and learns from data; and identifies and solves problems. [emphases mine]

Many of these breakthroughs will require new kinds of nanoscale devices and materials integrated into three-dimensional systems and may take a decade or more to achieve. These nanotechnology innovations will have to be developed in close coordination with new computer architectures, and will likely be informed by our growing understanding of the brain—a remarkable, fault-tolerant system that consumes less power than an incandescent light bulb.

Recent progress in developing novel, low-power methods of sensing and computation—including neuromorphic, magneto-electronic, and analog systems—combined with dramatic advances in neuroscience and cognitive sciences, lead us to believe that this ambitious challenge is now within our reach. …

This is the first time I’ve come across anything that publicly links the BRAIN initiative to computing, artificial intelligence, and artificial brains. (For my own sake, I make an arbitrary distinction between algorithms [artificial intelligence] and devices that simulate neural plasticity [artificial brains].) The emphasis in the past has always been on new strategies for dealing with Parkinson’s and other neurological diseases and conditions.
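For readers wondering what brain-like computation looks like at its simplest, here’s a toy sketch of a leaky integrate-and-fire neuron, the standard textbook building block of the neuromorphic systems mentioned above. It is purely my own illustration with arbitrary parameters, not something taken from the challenge documents.

```python
# Toy leaky integrate-and-fire (LIF) neuron: my own illustration of the
# kind of brain-inspired computing the grand challenge gestures at.
# All parameters are arbitrary textbook-style values.
import numpy as np

def simulate_lif(current, dt=1e-3, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Integrate membrane voltage over time; emit a spike at threshold."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(current):
        # Leak toward rest, plus input current scaled by membrane resistance.
        dv = (-(v - v_rest) + r_m * i_in) / tau
        v += dv * dt
        if v >= v_thresh:          # threshold crossed: spike and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes

# Constant 2 nA input for one second of simulated time.
spike_times = simulate_lif(np.full(1000, 2e-9))
print(f"{len(spike_times)} spikes; first few at {spike_times[:3]}")
```

The point of the toy: information is carried by the timing of discrete spikes emerging from continuous voltage dynamics rather than by clocked logic, which is one reason neuromorphic hardware promises such low power consumption.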

US White House Office of Science and Technology Policy issues a Nanotechnology Grand Challenges request for information

First, there was the Bill and Melinda Gates Foundation Grand Challenges, then there was the Canadian government-backed Grand Challenges Canada, and now there’s the US government Nanotechnology-Inspired Grand Challenges for the Next Decade.

I find it fascinating that ‘Grand Challenges’ have become so popular given the near certainty of at least one defeat and the possibility the entire project will fail. By definition, it’s not a challenge if it’s an easy accomplishment.

Enough musing; a June 18, 2015 news item on Azonano announces the US government (White House Office of Science and Technology Policy [OSTP]) request for information (RFI), which has a deadline of July 16, 2015,

The National Nanotechnology Coordination Office (NNCO) is pleased to highlight an important Request for Information (RFI) issued today by the White House Office of Science and Technology Policy (OSTP) seeking suggestions for Nanotechnology-Inspired Grand Challenges for the Next Decade: ambitious but achievable goals that harness nanoscience, nanotechnology, and innovation to solve important national or global problems and have the potential to capture the public’s imagination.

A June 17, 2015 NNCO news release further describes the RFI,

The RFI can be found online at https://federalregister.gov/a/2015-14914 and is discussed in a White House blog post at https://www.whitehouse.gov/blog/2015/06/17/call-nanotechnology-inspired-grand-challenges. Responses must be received by July 16, 2015, to be considered.

As explained by Dr. Michael Meador, Director of the NNCO, the RFI is a key step in responding to the most recent assessment of the National Nanotechnology Initiative (NNI) by the President’s Council of Advisors on Science and Technology (PCAST). “PCAST specifically recommended that the Federal government launch nanotechnology grand challenges in order to focus and amplify the impact of Federal nanotechnology investments and activities.”

The RFI includes a number of potential grand challenges as examples. Federal agencies participating in the NNI (see www.nano.gov), working with NNCO and OSTP, developed examples in the areas of health care, electronics, materials, sustainability, and product safety in order to illustrate how such grand challenges should be framed and to help stimulate the development of additional grand challenges by the wider community.

The RFI seeks input from nanotechnology stakeholders including researchers in academia and industry, non-governmental organizations, scientific and professional societies, and all other interested members of the public. “We strongly encourage everyone to spread the word about this request,” adds Meador. “We are excited about this request and hope to receive suggestions for bold and exciting challenges that nanotechnology can solve.”

A June 17, 2015 blog posting on the White House website (referred to previously) by Lloyd Whitman and Tom Kalil provides more insight into the ‘Grand Challenges’,

In a recent review of the NNI [US National Nanotechnology Initiative], the President’s Council of Advisors on Science and Technology called for government agencies, industry, and the research community to identify and pursue nanotechnology Grand Challenges. Through today’s RFI, we want to hear your game-changing ideas for Grand Challenges that harness nanoscience and nanotechnology to solve important national or global problems. These Grand Challenges should stimulate additional public and private investment, and foster the commercialization of Federally-funded nanotechnology research.

By 2025, the nanotechnology R&D community is challenged to achieve the following:

  1. Increase the five-year survival rates by 50% for the most difficult to treat cancers.
  2. Create devices no bigger than a grain of rice that can sense, compute, and communicate without wires or maintenance for 10 years, enabling an “internet of things” revolution.
  3. Create computer chips that are 100 times faster yet consume less power.
  4. Manufacture atomically-precise materials with fifty times the strength of aluminum at half the weight and the same cost.
  5. Reduce the cost of turning sea water into drinkable water by a factor of four.
  6. Determine the environmental, health, and safety characteristics of a nanomaterial in a month.

What would you propose? Read more about what makes an effective Grand Challenge and how to propose your own Nanotechnology-Inspired Grand Challenges for the Next Decade and comment on these examples here. Responses must be received by July 16, 2015 to be considered.

Good luck!

Memories, science, archiving, and authenticity

This is going to be one of my more freewheeling excursions into archiving and memory. I’ll be starting with a movement afoot in the US government to give citizens open access to science research, moving on to a network dedicated to archiving nanoscience- and nanotechnology-oriented information, examining the notion of authenticity in regard to the Tiananmen Square incident of June 4, 1989, and finishing with the Council of Canadian Academies’ Expert Panel on Memory Institutions and the Digital Revolution.

In his June 4, 2013 posting on the Pasco Phronesis blog, David Bruggeman features information and an overview of the US Office of Science and Technology Policy’s efforts to introduce open access to science research for citizens (Note: Links have been removed),

Back in February, the Office of Science and Technology Policy (OSTP) issued a memorandum to federal science agencies on public access for research results.  Federal agencies with over $100 million in research funding have until August 22 to submit their access plans to OSTP.  This access includes research publications, metadata on those publications, and underlying research data (in a digital format).

A collection of academic publishers, including the Association of American Publishers and the organization formerly known as the American Association for the Advancement of Science (publisher of Science), has offered a proposal for a publishing industry repository for public access to federally funded research that they publish.

David provides a somewhat caustic perspective on the publishers’ proposal, while Jocelyn Kaiser’s June 4, 2013 article for ScienceInsider examines it in more detail (Note: Links have been removed),

Organized in part by the Association of American Publishers (AAP), which represents many commercial and nonprofit journals, the group calls its project the Clearinghouse for the Open Research of the United States (CHORUS). In a fact sheet that AAP gave to reporters, the publishers describe CHORUS as a “framework” that would “provide a full solution for agencies to comply with the OSTP memo.”

As a starting point, the publishers have begun to index papers by the federal grant numbers that supported the work. That index, called FundRef, debuted in beta form last week. You can search by agency and get a list of papers linked to the journal’s own websites through digital object identifiers (DOIs), widely used ID codes for individual papers. The pilot project involved just a few agencies and publishers, but many more will soon join FundRef, says Fred Dylla, executive director of the American Institute of Physics. (AAAS, which publishes ScienceInsider, is among them and has also signed on to CHORUS.)

The next step is to make the full-text papers freely available after agencies decide on embargo dates, Dylla says. (The OSTP memo suggests 12 months but says that this may need to be adjusted for some fields and journals.) Eventually, the full CHORUS project will also allow searches of the full-text articles. “We will make the corpus available for anybody’s search tool,” says Dylla, who adds that search agreements will be similar to those that publishers already have with Google Scholar and Microsoft Academic Search.

I couldn’t find any mention in Kaiser’s article as to how long the materials would be available. Is this supposed to be an archive as well as a repository? Regardless, I found the beta project, FundRef, a little confusing. The link from the ScienceInsider article takes you to this May 28, 2013 news release,

FundRef, the funder identification service from CrossRef [crossref.org], is now available for publishers to contribute funding data and for retrieval of that information. FundRef is the result of collaboration between funding agencies and publishers that correlates grants and other funding with the scholarly output of that support.

Publishers participating in FundRef add funding data to the bibliographic metadata they already provide to CrossRef for reference linking. FundRef data includes the name of the funder and a grant or award number. Manuscript tracking systems can incorporate a taxonomy of 4000 global funder names, which includes alternate names, aliases, and abbreviations enabling authors to choose from a standard list of funding names. Then the tagged funding data will travel through publishers’ production systems to be stored at CrossRef.

I was hoping that clicking on the FundRef button would take me to a database that I could test or tour. At this point, I wouldn’t have described the project as being at the beta stage (from a user’s perspective) as they are still building it and gathering data. However, there is lots of information on the FundRef webpage including an Additional Resources section featuring a webinar,

Attend an Introduction to FundRef Webinar – Thursday, June 6, 2013 at 11:00 am EDT

You do need to sign up for the webinar. Happily, it is open to international participants as well as US participants.
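Since I couldn’t test the database directly, here’s a minimal sketch of how one might eventually query FundRef data programmatically. It assumes CrossRef’s public REST API at api.crossref.org, which exposes funder records and per-funder work listings; the endpoints shown are my understanding of that API (which postdates the FundRef beta) and are worth verifying against CrossRef’s documentation.

```python
# Minimal sketch: look up funders and their associated works via CrossRef.
# Assumes the public CrossRef REST API (api.crossref.org); endpoint and
# field names should be checked against current CrossRef documentation.
import requests

API = "https://api.crossref.org"

def find_funders(name, rows=5):
    """Return (id, name) pairs for funder records matching a name."""
    resp = requests.get(f"{API}/funders", params={"query": name, "rows": rows})
    resp.raise_for_status()
    return [(f["id"], f["name"]) for f in resp.json()["message"]["items"]]

def works_for_funder(funder_id, rows=5):
    """Return (DOI, title) pairs for works that deposited this funder's ID."""
    resp = requests.get(f"{API}/funders/{funder_id}/works", params={"rows": rows})
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    return [(w.get("DOI"), (w.get("title") or ["(untitled)"])[0]) for w in items]

if __name__ == "__main__":
    funders = find_funders("National Science Foundation")
    for fid, fname in funders:
        print(fid, fname)
    if funders:
        for doi, title in works_for_funder(funders[0][0]):
            print(doi, title)
```

The DOIs returned resolve to the publishers’ own sites, which is exactly the linking arrangement CHORUS proposes to build on.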

Getting back to my question on whether or not this effort is also an archive of sorts, there is a project closer to home (nanotechnologywise, anyway) that touches on these issues from an unexpected perspective. From the About webpage for Nanoscience and Emerging Technologies in Society: Sharing Research and Learning Tools (NETS),

The Nanoscience and Emerging Technologies in Society: Sharing Research and Learning Tools (NETS) is an IMLS-funded [Institute of Museum and Library Services] project to investigate the development of a disciplinary repository for the Ethical, Legal and Social Implications (ELSI) of nanoscience and emerging technologies research. NETS partners will explore future integration of digital services for researchers studying ethical, legal, and social implications associated with the development of nanotechnology and other emerging technologies.

NETS will investigate digital resources to advance the collection, dissemination, and preservation of this body of research, addressing the challenge of marshaling resources, academic collaborators, appropriately skilled data managers, and digital repository services for large-scale, multi-institutional and disciplinary research projects. The central activity of this project involves a spring 2013 workshop that will gather key researchers in the field and digital librarians together to plan the development of a disciplinary repository of data, curricula, and methodological tools.

Societal dimensions research investigating the impacts of new and emerging technologies in nanoscience is among the largest research programs of its kind in the United States, with an explicit mission to communicate outcomes and insights to the public. By 2015, scholars across the country affiliated with this program will have spent ten years collecting qualitative and quantitative data and developing analytic and methodological tools for examining the human dimensions of nanotechnology. The sharing of data and research tools in this field will foster a new kind of social science inquiry and ensure that the outcomes of research reach public audiences through multiple pathways.

NETS will be holding a stakeholders workshop June 27 – 28, 2013 (invite only). From the workshop description webpage,

What is the value of creating a dedicated Nano ELSI repository?
The benefits of having these data in a shared infrastructure are: the centralization of research and ease of discovery; uniformity of access; standardization of metadata and the description of projects; and facilitation of compliance with funder requirements for data management going forward. Additional benefits of this project will be the expansion of data curation capabilities for data repositories into the nanotechnology domain, and research into the development of disciplinary repositories, for which very little literature exists.

What would a dedicated Nano ELSI repository contain?
Potential materials that need to be curated are both qualitative and quantitative in nature, including:

  • survey instruments, data, and analyses
  • interview transcriptions and analyses
  • images or multimedia
  • reports
  • research papers, books, and their supplemental data
  • curricular materials

What will the Stakeholder Workshop accomplish?
The Stakeholder Workshop aims to bring together the key researchers and digital librarians to draft a detailed project plan for the implementation of a dedicated Nano ELSI repository. The Workshop will be used as a venue to discuss questions such as:

  • How can a repository extend research in this area?
  • What is the best way to collect all the research in this area?
  • What tools would users envision using with this resource?
  • Who should maintain and staff a repository like this?
  • How much would a repository like this cost?
  • How long will it take to implement?

What is expected of Workshop participants?
The workshop will bring together key researchers and digital librarians to discuss the requirements for a dedicated Nano ELSI repository. To inform that discussion, some participants will be requested to present on their current or past research projects and collaborations. In addition, workshop participants will be enlisted to contribute to the draft of the final project report and make recommendations for the implementation plan.

While my proposal did not get accepted (full disclosure), I do look forward to hearing more about the repository although I notice there’s no mention made of archiving the materials.
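As a thought experiment on what the “standardization of metadata” promised in the workshop description might mean concretely, here’s a hypothetical sketch of a repository record using Dublin Core-style fields. Every field choice and value is my own invention for illustration; NETS has not, to my knowledge, published a schema.

```python
# Hypothetical metadata record for a Nano ELSI repository item, using
# Dublin Core-style fields. Field choices and values are invented for
# illustration; NETS has not published a schema.
record = {
    "dc:title": "Public perceptions of nanotechnology: 2012 survey instrument",
    "dc:creator": ["Example Researcher"],          # invented name
    "dc:subject": ["nanotechnology", "ELSI", "survey"],
    "dc:description": "Questionnaire and codebook for a national survey "
                      "on attitudes toward nanotechnology.",
    "dc:date": "2012-09-01",
    "dc:type": "survey instrument",                # cf. workshop materials list
    "dc:format": "application/pdf",
    "dc:identifier": "doi:10.9999/example.1234",   # placeholder DOI
    "dc:rights": "CC BY 4.0",
    "funder": {"name": "National Science Foundation",
               "award": "SES-0000000"},            # FundRef-style funding data
}
```

A shared record format along these lines is what would make the promised “uniformity of access” and funder-compliance reporting tractable across institutions.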

The importance of repositories and archives was brought home to me when I came across a June 4, 2013 article by Glyn Moody for Techdirt about the Tiananmen Square incident and subtle and unsubtle ways of censoring access to information,

Today is June 4th, a day pretty much like any other day in most parts of the world. But in China, June 4th has a unique significance because of the events that took place in Tiananmen Square on that day in 1989.

Moody recounts some of the ways in which people have attempted to commemorate the day online while evading the authorities’ censorship efforts. Do check out the article for the inside scoop on why ‘Big Yellow Duck’ is a censored term. One of the more subtle censorship efforts provides some chills (from the Moody article),

… according to this article in the Wall Street Journal, it looks like the Chinese authorities are trying out a new tactic for handling this dangerous topic:

On Friday, a China Real Time search for “Tiananmen Incident” did not return the customary message from Sina informing the user that search results could not be displayed due to “relevant laws, regulations and policies.” Instead the search returned results about a separate Tiananmen incident that occurred on Tomb Sweeping Day in 1976, when Beijing residents flooded the area to protest after they were prevented from mourning the recently deceased Premiere [sic] Zhou Enlai.

This business of erasing a traumatic and disturbing historical event and substituting something less contentious reminded me both of the saying ‘history is written by the victors’ and of Luciana Duranti and her talk titled, Trust and Authenticity in the Digital Environment: An Increasingly Cloudy Issue, which took place in Vancouver (Canada) last year (mentioned in my May 18, 2012 posting).

Duranti raised many, many issues that most of us don’t consider when we blithely store information in the ‘cloud’ or create blogs that turn out to be repositories of a sort (and then don’t know what to do with them; that’s me). She also previewed a Sept. 26 – 28, 2013 conference to be hosted in Vancouver by UNESCO (United Nations Educational, Scientific, and Cultural Organization), “Memory of the World in the Digital Age: Digitization and Preservation.” (UNESCO’s Memory of the World programme hosts a number of these themed conferences and workshops.)

The Sept. 2013 UNESCO ‘memory of the world’ conference in Vancouver seems rather timely in retrospect. The Council of Canadian Academies (CCA) announced that Dr. Doug Owram would be chairing their Memory Institutions and the Digital Revolution assessment (mentioned in my Feb. 22, 2013 posting; scroll down 80% of the way) and, after checking recently, I noticed that the Expert Panel has been assembled and it includes Duranti. Here’s the assessment description from the CCA’s ‘memory institutions’ webpage,

Library and Archives Canada has asked the Council of Canadian Academies to assess how memory institutions, which include archives, libraries, museums, and other cultural institutions, can embrace the opportunities and challenges of the changing ways in which Canadians are communicating and working in the digital age.
Background

Over the past three decades, Canadians have seen a dramatic transformation in both personal and professional forms of communication due to new technologies. Where the early personal computer and word-processing systems were largely used and understood as extensions of the typewriter, advances in technology since the 1980s have enabled people to adopt different approaches to communicating and documenting their lives, culture, and work. Increased computing power, inexpensive electronic storage, and the widespread adoption of broadband computer networks have thrust methods of communication far ahead of our ability to grasp the implications of these advances.

These trends present both significant challenges and opportunities for traditional memory institutions as they work towards ensuring that valuable information is safeguarded and maintained for the long term and for the benefit of future generations. It requires that they keep track of new types of records that may be of future cultural significance, and of any changes in how decisions are being documented. As part of this assessment, the Council’s expert panel will examine the evidence as it relates to emerging trends, international best practices in archiving, and strengths and weaknesses in how Canada’s memory institutions are responding to these opportunities and challenges. Once complete, this assessment will provide an in-depth and balanced report that will support Library and Archives Canada and other memory institutions as they consider how best to manage and preserve the mass quantity of communications records generated as a result of new and emerging technologies.

The Council’s assessment is running concurrently with the Royal Society of Canada’s expert panel assessment on Libraries and Archives in 21st century Canada. Though similar in subject matter, these assessments have a different focus and follow a different process. The Council’s assessment is concerned foremost with opportunities and challenges for memory institutions as they adapt to a rapidly changing digital environment. In navigating these issues, the Council will draw on a highly qualified and multidisciplinary expert panel to undertake a rigorous assessment of the evidence and of significant international trends in policy and technology now underway. The final report will provide Canadians, policy-makers, and decision-makers with the evidence and information needed to consider policy directions. In contrast, the RSC panel focuses on the status and future of libraries and archives, and will draw upon a public engagement process.

Question

How might memory institutions embrace the opportunities and challenges posed by the changing ways in which Canadians are communicating and working in the digital age?

Sub-questions

With the use of new communication technologies, what types of records are being created and how are decisions being documented?
How is information being safeguarded for usefulness in the immediate to mid-term across technologies considering the major changes that are occurring?
How are memory institutions addressing issues posed by new technologies regarding their traditional roles in assigning value, respecting rights, and assuring authenticity and reliability?
How can memory institutions remain relevant as a trusted source of continuing information by taking advantage of the collaborative opportunities presented by new social media?

From the Expert Panel webpage (go there for all the links), here’s a complete listing of the experts,

Expert Panel on Memory Institutions and the Digital Revolution

Dr. Doug Owram, FRSC (Chair): Professor and Former Deputy Vice-Chancellor and Principal, University of British Columbia Okanagan Campus (Kelowna, BC)

Sebastian Chan: Director of Digital and Emerging Media, Smithsonian Cooper-Hewitt National Design Museum (New York, NY)

C. Colleen Cook: Trenholme Dean of Libraries, McGill University (Montréal, QC)

Luciana Duranti: Chair and Professor of Archival Studies, the School of Library, Archival and Information Studies at the University of British Columbia (Vancouver, BC)

Lesley Ellen Harris: Copyright Lawyer; Consultant, Author, and Educator; Owner, Copyrightlaws.com (Washington, D.C.)

Kate Hennessy: Assistant Professor, Simon Fraser University, School of Interactive Arts and Technology (Surrey, BC)

Kevin Kee: Associate Vice-President Research (Social Sciences and Humanities) and Canada Research Chair in Digital Humanities, Brock University (St. Catharines, ON)

Slavko Manojlovich: Associate University Librarian (Information Technology), Memorial University of Newfoundland (St. John’s, NL)

David Nostbakken: President/CEO of Nostbakken and Nostbakken, Inc. (N + N); Instructor of Strategic Communication and Social Entrepreneurship at the School of Journalism and Communication, Carleton University (Ottawa, ON)

George Oates: Art Director, Stamen Design (San Francisco, CA)

Seamus Ross: Dean and Professor, iSchool, University of Toronto (Toronto, ON)

Bill Waiser, SOM, FRSC: Professor of History and A.S. Morton Distinguished Research Chair, University of Saskatchewan (Saskatoon, SK)

Barry Wellman, FRSC: S.D. Clark Professor, Department of Sociology, University of Toronto (Toronto, ON)

I notice they have a lawyer whose specialty is copyright, Lesley Ellen Harris. I did check out her website, copyrightlaws.com, and could not find anything that hinted at strong opinions on the topic. She seems to feel that copyright is a good thing, but how far she’d like to take this is a mystery to me based on the blog postings I viewed.

I’ve also noticed that this panel has 13 people, four of whom are women, which equals a little more (June 5, 2013, 1:35 pm PDT: I substituted the word ‘more’ for the word ‘less’; my apologies for the arithmetic error) than 25% representation. That’s a surprising percentage given how heavily the fields of library and archival studies are weighted towards women.

I have meandered somewhat but my key points are these:

  • How are we going to keep information available? It’s all very well to have a repository, but how long will the data be kept in the repository and where does it go afterwards?
  • There’s certainly a bias, with the NETS workshop and, likely, the CCA Expert Panel on Memory Institutions and the Digital Revolution, toward institutions as the source for information that’s worth keeping for however long or short a time that should be. What about individual efforts? e.g., Don’t Leave Canada Behind; FrogHeart; Techdirt; The Last Word on Nothing, and many other blogs?
  • The online redirection of Tiananmen Square incident queries is chilling, but I’ve often wondered what would happen if someone wanted to remove ‘objectionable material’ from an e-book, e.g. To Kill a Mockingbird. A new reader wouldn’t notice the loss if the material had been excised in a subtle or professional fashion.

As for how this has an impact on science, it’s been claimed that Isaac Newton attempted to excise Robert Hooke from history (my Jan. 19, 2012 posting). Whether it’s true or not, there is remarkably little about Robert Hooke despite his accomplishments, and his obscurity is a reminder that we must always take care to retain our memories.

ETA June 6, 2013: David Bruggeman added more information and links about CHORUS in his June 5, 2013 post (On The Novelty Of Corporate-Government Partnership In STEM Education),

Before I dive into today’s post, a brief word about CHORUS. Thanks to commenter Joe Kraus for pointing me to this Inside Higher Ed post, which includes a link to the fact sheet CHORUS organizers distributed to reporters. While there are additional details, there are still not many details to sink one’s teeth in. And I remain surprised at the relative lack of attention the announcement has received. On a related note, nobody who’s been following open access should be surprised by Michael Eisen’s reaction to CHORUS.

I encourage you to check out David’s post as he provides some information about a new STEM (science, technology, engineering, mathematics) collaboration between the US National Science Foundation and companies such as GE and Intel.

Report on nano EHS from the US Government Accountability Office (GAO)

According to the June 22, 2012 news item on Nanowerk, the US Government Accountability Office (GAO) has released a new report titled, Nanotechnology: Improved Performance Information Needed for Environmental, Health, and Safety Research (published May 2012). From the report,

Nanotechnology involves the ability to control matter at approximately 1 to 100 nanometers. Worldwide trends suggest that products that rely on nanotechnology will be a $3 trillion market by 2020. However, some of the EHS [Environmental, Health, and Safety] impacts of nanotechnology are unknown. The NSTC [National Science and Technology Council] coordinates and oversees the NNI [National Nanotechnology Initiative], an interagency program that, among other things, develops national strategy documents for federal efforts in nanotechnology.

In this context, GAO examined: (1) changes in federal funding for nanotechnology EHS research from fiscal years 2006 to 2010; (2) the nanomaterials that NNI member agencies’ EHS research focused on in fiscal year 2010; (3) the extent to which NNI member agencies collaborate with stakeholders on this research and related strategies; and (4) the extent to which NNI strategy documents address desirable characteristics of national strategies. GAO’s review included seven NNI agencies that funded 93 percent of the EHS research dollars in fiscal year 2010. This report is based on analysis of NNI and agency documents and responses to a questionnaire of nonfederal stakeholders.

GAO recommends that the Director of the Office of Science and Technology Policy (OSTP), which administers the NSTC, (1) coordinate development of performance information for NNI EHS research needs and publicly report this information; and (2) estimate the costs and resources necessary to meet the research needs. OSTP and the seven included agencies neither agreed nor disagreed with the recommendations. [p.2 of the PDF]

This provides some interesting contrast to the National Nanotechnology Initiative’s (NNI) 4th assessment report which I wrote about in my May 2, 2012 posting,

PCAST [President’s Council of Advisors on Science and Technology] acknowledges that the NSET [Nanoscale Science, Engineering, and Technology Subcommittee coordinates planning, budgeting, program implementation, and review of the NNI] has acted on our recommendation to identify a central coordinator for nanotechnology-related EHS research within NNCO. The EHS coordinator has done a laudable job developing and communicating the 2011 NNI EHS research strategy. [emphasis mine] However, there is still a lack of integration between nanotechnology-related EHS research funded through the NNI and the kind of information policy makers need to effectively manage potential risks from nanomaterials. The establishment of the Emerging Technologies Interagency Policy Coordination Committee (ETIPC) through OSTP has begun to bridge that gap, but without close integration between ETIPC and the NEHI working group [Nanotechnology Environmental and Health Implications Working Group], the gap may not be sufficiently narrowed. OSTP and the NSET Subcommittee should expand the charter of the NEHI working group to enable the group to address cross-agency nanotechnology-related policy issues more broadly.

Alphabet soup, eh? The best I can gather is that the GAO report has identified gaps that are acknowledged by the NNI (and which it has begun to address), as per my emphasis in the excerpt from the 4th assessment. As someone who does not know the politics or have access to inside information, I find the GAO report recommendations much simpler to understand, as the issues are laid out from a more ‘global’ (or big picture) perspective on US EHS nanotechnology research efforts. The NNI’s 4th assessment report offers more detail and, frankly, I found it more confusing.

This is my 2nd GAO report and, again, I love the writing and organization of the report. (Note: I am lauding the report writing skills.)  Thank you to Frank Rusco, Dan Haas, Krista Anderson, Nirmal Chaudhary, Elizabeth Curda, Lorraine Ettaro, Alison O’Neill, Tind Shepper Ryen, Jeanette Soares, Ruth Solomon, Hai Tran, and Jack Wang.

Public access to publicly funded research; a consultation in the US

There are two requests from the US White House’s Office of Science and Technology Policy (OSTP) for information about public access to publicly funded research. From the Nov. 4, 2011 posting by David Bruggeman on his Pasco Phronesis blog,

In today’s Federal Register there are two requests for comment on the topic of public access to federally funded research.  They come from the Office of Science and Technology Policy (OSTP).  One focuses on the digital data produced by that research, the other concerns the publications that result from this research.  … part of the reauthorization of the America COMPETES Act.  The report is focused on determining standards and policies to help ensure long-term preservation and access to digital data and research publications produced from federally funded research.

So one request for information (RFI) is about open access to scientific data and the other is about open access to published research. The RFI for open access to scientific data is more detailed. Some 13 questions are asked; responders may choose to address their own open data access issues rather than answering the questions. The questions are split into two categories: (1) Preservation, Discoverability, and Access and (2) Standards for Interoperability, Re-Use and Re-Purposing. The deadline for responses on this request is January 12, 2012.

The RFI for public access to peer-reviewed, publicly funded research in scholarly publications is less detailed, with eight questions asked. There’s this one, for example,

(1) Are there steps that agencies could take to grow existing and new markets related to the access and analysis of peer-reviewed publications that result from federally funded scientific research? How can policies for archiving publications and making them publically accessible be used to grow the economy and improve the productivity of the scientific enterprise? What are the relative costs and benefits of such policies? What type of access to these publications is required to maximize U.S. economic growth and improve the productivity of the American scientific enterprise?

For this RFI, respondents need to meet a January 2, 2012 deadline.

Both of the RFIs ask questions about how open access can grow the economy. Although I didn’t see any reference to the economy when I was checking out a Canadian government pilot project (Open Data Pilot Project), I expect we are just as interested in possible economic benefits as our US neighbour. (I mentioned the Canadian project in my March 13, 2011 posting.)

2011 Scientific integrity processes: the US and Canada

Given recent scientific misconduct (see the ‘July is science scandal month’ post of July 25, 2011 at The Prodigal Academic blog) and a very slow news month this August, I thought I’d take a look at scientific integrity in the US and in Canada.

First, here’s a little history. On March 9, 2009, US President Barack Obama issued a Presidential Memorandum on Scientific Integrity (excerpted),

Science and the scientific process must inform and guide decisions of my Administration on a wide range of issues, including improvement of public health, protection of the environment, increased efficiency in the use of energy and other resources, mitigation of the threat of climate change, and protection of national security.

The public must be able to trust the science and scientific process informing public policy decisions.  Political officials should not suppress or alter scientific or technological findings and conclusions.  If scientific and technological information is developed and used by the Federal Government, it should ordinarily be made available to the public.  To the extent permitted by law, there should be transparency in the preparation, identification, and use of scientific and technological information in policymaking.  The selection of scientists and technology professionals for positions in the executive branch should be based on their scientific and technological knowledge, credentials, experience, and integrity.

On December 17, 2010, John P. Holdren, Assistant to the President for Science and Technology and Director of the Office of Science and Technology Policy, issued his own memorandum requesting compliance with the President’s order (from the Dec. 17, 2010 posting on The White House blog),

Today, in response to the President’s request, I am issuing a Memorandum to the Heads of Departments and Agencies that provides further guidance to Executive Branch leaders as they implement Administration policies on scientific integrity. The new memorandum describes the minimum standards expected as departments and agencies craft scientific integrity rules appropriate for their particular missions and cultures, including a clear prohibition on political interference in scientific processes and expanded assurances of transparency. It requires that department and agency heads report to me on their progress toward completing those rules within 120 days.

Here’s my edited version (I removed fluff, i.e. material along these lines: scientific integrity is of utmost importance …) of the list Holdren provided,

Foundations

  1. Ensure a culture of scientific integrity.
  2. Strengthen the actual and perceived credibility of Government research. Of particular importance are (a) ensuring that selection of candidates for scientific positions in the executive branch is based primarily on their scientific and technological knowledge, credentials, experience, and integrity, (b) ensuring that data and research used to support policy decisions undergo independent peer review by qualified experts where feasible and appropriate, and consistent with law, (c) setting clear standards governing conflicts, and (d) adopting appropriate whistleblower protections.
  3. Facilitate the free flow of scientific and technological information, consistent with privacy and classification standards. … Consistent with the Administration’s Open Government Initiative, agencies should expand and promote access to scientific and technological information by making it available online in open formats. Where appropriate, this should include data and models underlying regulatory proposals and policy decisions.
  4. Establish principles for conveying scientific and technological information to the public. … Agencies should communicate scientific and technological findings by including a clear explication of underlying assumptions; accurate contextualization of uncertainties; and a description of the probabilities associated with optimistic and pessimistic projections, including best-case and worst-case scenarios where appropriate.

Public communication

  1. In response to media interview requests about the scientific and technological dimensions of their work, agencies will offer articulate and knowledgeable spokespersons who can, in an objective and nonpartisan fashion, describe and explain these dimensions to the media and the American people.
  2. Federal scientists may speak to the media and the public about scientific and technological matters based on their official work, with appropriate coordination with their immediate supervisor and their public affairs office. In no circumstance may public affairs officers ask or direct Federal scientists to alter scientific findings.
  3. Mechanisms are in place to resolve disputes that arise from decisions to proceed or not to proceed with proposed interviews or other public information-related activities. …

(The sections on Federal Advisory Committees and professional development were less relevant to this posting, so I haven’t included them here.)

It seems to have taken the agencies a little longer than the 120-day deadline that John Holdren gave them, but all (or at least many) of the agencies have complied, according to an August 15, 2011 posting by David J. Hanson on the Chemical & Engineering News (C&EN) website,

OSTP director John P. Holdren issued the call for the policies on May 5 in response to a 2009 Presidential memorandum (C&EN, Jan. 10, page 28). [emphasis mine] The memorandum was a response to concerns about politicization of science during the George W. Bush Administration.

The submitted integrity plans include 14 draft policies and five final policies. The final policies are from the National Aeronautics & Space Administration, the Director of National Intelligence for the intelligence agencies, and the Departments of Commerce, Justice, and Interior.

Draft integrity policies are in hand from the Departments of Agriculture, Defense, Education, Energy, Homeland Security, Health & Human Services, Labor, and Transportation and from the National Oceanic & Atmospheric Administration, National Science Foundation, Environmental Protection Agency, Social Security Administration, OSTP, and Veterans Administration.

The drafts still under review are from the Department of State, the Agency for International Development, and the National Institute of Standards & Technology.

The dates in this posting don’t match up with what I’ve found, but it’s possible that the original deadline was moved to better accommodate the various reporting agencies. In any event, David Bruggeman at his Pasco Phronesis blog has commented on this initiative in a number of posts, including this August 10, 2011 posting,

… I’m happy to see something out there at all, given the paltry public response from most of the government. Comments are open until September 6.

Regrettably, the EPA [Environmental Protection Agency] policy falls into a trap that is all too common. The support of scientific integrity is all too often narrowly assumed to simply mean that agency (or agency-funded) scientists need to behave, and there will be consequences for demonstrated bad behavior.

But there is a serious problem of interference from non-scientific agency staff that would go beyond reasonable needs for crafting the public message.

David goes on to discuss a lack of clarity in this policy and in the Dept. of the Interior’s policy.

His August 11, 2011 posting notes the OSTP’s claim that 19 departments/agencies have submitted draft or final policies,

… Not only does the OSTP blog post not include draft or finalized policies submitted to their office, it fails to mention any timeframe for making them publicly available.  Even more concerning, there is no mention of those policies that have been publicly released.  That is, regrettably, consistent with past practice. While the progress report notes that OSTP will create a policy for its own activities, and that OSTP is working with the Office of Management and Budget on a policy for all of the Executive Office of the President, there’s no discussion of a government-wide policy.

In the last of his recent series, an August 12, 2011 posting, he focuses on a Dept. of Commerce memo (Note: The US Dept. of Commerce includes the National Oceanic and Atmospheric Administration and the National Institute of Standards and Technology),

“This memorandum confirms that DAO 219-1 [a Commerce Department order concerning scientific communications] allows scientists to engage in oral fundamental research communications (based on their official work) with the media and the public without notification or prior approval to their supervisor or to the Office of Public Affairs. [emphasis David Bruggeman] Electronic communications with the media related to fundamental research that are the equivalent of a dialogue are considered to be oral communications; thus, prior approval is not required for a scientist to engage in online discussions or email with the media about fundamental research, subject to restrictions on protected nonpublic information as set forth in 219-1.”

I find the exercise rather interesting, especially in light of Margaret Munro’s July 27, 2011 article, Feds silence scientist over salmon study, for Postmedia,

Top bureaucrats in Ottawa have muzzled a leading fisheries scientist whose discovery could help explain why salmon stocks have been crashing off Canada’s West Coast, according to documents obtained by Postmedia News.

The documents show the Privy Council Office, which supports the Prime Minister’s Office, stopped Kristi Miller from talking about one of the most significant discoveries to come out of a federal fisheries lab in years.

Science, one of the world’s top research journals, published Miller’s findings in January. The journal considered the work so significant it notified “over 7,400” journalists worldwide about Miller’s “Suffering Salmon” study.

The documents show major media outlets were soon lining up to speak with Miller, but the Privy Council Office said no to the interviews.

In a Twitter conversation with me, David Bruggeman did note that the Science paywall also acts as a kind of muzzle.

I was originally going to end the posting with that last paragraph, but I made a discovery, quite by accident. Canada’s Tri-Agency Funding Councils opened a consultation with stakeholders on Ethics and Integrity for Institutions, Applicants, and Award Holders on August 15, 2011, which will run until September 30, 2011. (This differs somewhat from the US exercise, which is solely focussed on science as practiced in various government agencies. The equivalent in Canada would be if Stephen Harper requested scientific integrity guidelines from the Ministries of Environment, Natural Resources, Health, Industry, etc.) From the NSERC Ethics and Integrity Guidelines page,

Upcoming Consultation on the Draft Tri-Agency Framework: Responsible Conduct of Research

The Canadian Institutes of Health Research (CIHR), the Social Sciences and Humanities Research Council of Canada (SSHRC), and NSERC (the tri-agencies) continue to work on improving their policy framework for research and scholarly integrity, and financial accountability. From August 15 to September 30, 2011, the three agencies are consulting with a wide range of stakeholders in the research community on the draft consultation document, Tri-Agency Framework: Responsible Conduct of Research.

I found the answers to these two questions in the FAQs particularly interesting,

  • What are some of the new elements in this draft Framework?

The draft Framework introduces new elements, including the following:

A strengthened Tri-Agency Research Integrity Policy
The draft Framework includes a strengthened Tri-Agency Research Integrity Policy that clarifies the responsibilities of the researcher.

‘Umbrella’ approach to RCR
The draft Framework provides an overview of all applicable research policies, including those related to the ethical conduct of research involving humans and financial management, as well as research integrity. It also clarifies the roles and responsibilities of researchers, institutions and Agencies in responding to all types of alleged breaches of Agency policies, for example, misuse of funds, unethical conduct of research involving human participants or plagiarism.

A definition of a policy breach
The draft Framework clarifies what constitutes a breach of an Agency policy.

Disclosure
The draft Framework requires researchers to disclose, at the time of application, whether they have ever been found to have breached any Canadian or other research policies, regardless of the source of funds that supported the research and whether or not the findings originated in Canada or abroad.

The Agencies are currently seeking advice from privacy experts on the scope of the information to be requested.

Institutional Investigations
The Agencies currently specify that institutional investigation committee membership must exclude those in conflict of interest. The draft Framework stipulates also that an investigation committee must include at least one member external to the Institution, and that an Agency may conduct its own review or compliance audit, or require the Institution to conduct an independent review/audit.

Timeliness of investigation
Currently, it is up to institutions to set timelines for investigations. The draft Framework states that inquiry and investigation reports are to be submitted to the relevant Agency within two and seven months, respectively, following receipt of the allegation by the institution.

  • Who is being consulted?

The Agencies have targeted their consultation to individual researchers, post-secondary institutions and other eligible organizations that apply for and receive Agency funding.

As far as I can tell, there is no mention of ethical issues where the government has interfered in the dissemination of scientific information; it seems there is an assumption that almost all ethical misbehaviour is on the part of the individual researcher or a problem with an institution failing to follow policy. There is one section devoted to breaches by institutions (all two paragraphs of it),

5 Breaches of Agency Policies by Institutions

In accordance with the MOU signed by the Agencies and each Institution, the Agencies require that each Institution complies with Agency policies as a condition of eligibility to apply for and administer Agency funds.

The process followed by the Agencies to address an allegation of a breach of an Agency policy by an Institution, and the recourse that the Agencies may exercise, commensurate with the severity of a confirmed breach, are outlined in the MOU.

My criticism of this is similar to the one that David Bruggeman made of the US policies: the focus is primarily on the individual.

A tale of two countries and nanotechnology strategies (part 2 of an occasional series)

The US National Nanotechnology Initiative’s (NNI) tenth anniversary celebration, titled Nanotechnology Innovation Summit, was announced about a week ago, around the same time I received a copy of the documentation outlining the Canadian government’s expenditures on nanotechnology for the fiscal years 2005/6 to 2008/9.

The documentation, which was issued in response to a question by Member of Parliament Peter Julian, runs to some 80 pages and is not organized in a way that makes for easy reading. (I interviewed Peter Julian, New Democratic Party, about his private member’s bill on nanotechnology here in part 1, part 2, and part 3.) Since there is no single nanotechnology funding hub, each ministry or funding agency issues its own records, usually in the form of spreadsheets, and each agency has its own organizing strategy. It’s going to take a little more time before I can make much sense of it, but once I do, I’ll try to post the results here.
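For the curious, here is a minimal sketch of the kind of aggregation involved. Everything in it is an assumption made for illustration: the file names are hypothetical, and it presumes a common column layout (“fiscal_year”, “amount_cad”) that, as noted above, the actual records do not share.

```python
# A minimal sketch, not the actual records: the file names and the
# common column layout ("fiscal_year", "amount_cad") are hypothetical;
# each agency's real spreadsheet uses its own organizing strategy.
import pandas as pd

AGENCY_FILES = {  # hypothetical file names
    "NRC": "nrc_nano_2005-2009.xlsx",
    "NSERC": "nserc_nano_2005-2009.xlsx",
    "CIHR": "cihr_nano_2005-2009.xlsx",
}

frames = []
for agency, path in AGENCY_FILES.items():
    df = pd.read_excel(path)  # each agency issues its own spreadsheet
    df["agency"] = agency     # record the source agency
    frames.append(df[["agency", "fiscal_year", "amount_cad"]])

combined = pd.concat(frames, ignore_index=True)

# Total nanotechnology spending by agency and by fiscal year
totals = combined.groupby(["agency", "fiscal_year"])["amount_cad"].sum()
print(totals)
print("Grand total:", combined["amount_cad"].sum())
```

The hard part, of course, is getting each agency’s records into any common layout in the first place, which is precisely the work those 80 pages demand.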

Meanwhile, I found this July 26, 2010 news item about the NNI’s 10th anniversary on Nanowerk,

The Nano Science and Technology Institute (NSTI), in cooperation with the Office of Science and Technology Policy (OSTP) and the National Nanotechnology Coordination Office (NNCO), announced today a National Nanotechnology Innovation Summit to mark the 10th anniversary of the National Nanotechnology Initiative (NNI) to be held December 8-10, 2010 at the Gaylord National Hotel & Convention Center in National Harbor, MD. The event, in cooperation with OSTP and NNCO and organized by NSTI, with key support from the National Venture Capital Association (NVCA), will serve as a forum for the nation’s nanotechnology innovators, investors, policy makers and leading corporate developers and integrators.

Since its formal launch in 2001 under President Clinton, the National Nanotechnology Initiative has strategically invested and coordinated over $12 billion in nanotechnology development. [emphasis mine] The NNI Nanotechnology Innovation Summit will spotlight revolutionary technologies from the 10-year NNI funding effort, with a special emphasis on showcasing commercially transformational technologies directly funded or catalyzed by the multi-agency partnership of the NNI. Participants will hear from some of the top researchers, industry leaders, technology investors and visionary policy makers of our time as they speak about the impact of nanotechnology innovation over the past 10 years and look toward the future.

Intriguing, yes? In the US, they can state they’ve spent $12 billion US over 10 years (I assume they can break those figures down), while in Canada the figures don’t appear to have been aggregated even on an agency-by-agency basis.

I think it comes down to a basic philosophical difference in how nanotechnology has been approached. In the US (and many other jurisdictions), it’s been treated as a specialty in and of itself. That approach makes sense, since chemistry at the nanoscale is significantly different from chemistry at the macroscale.

In Canada, we seem to have taken the perspective that nanotechnology is a continuation of scientific exploration: while the particulars differ dramatically, nanotechnology itself is a logical progression of the scientific enterprise.

I don’t know that one approach is better than the other, but the US approach makes funding questions a lot easier to answer.