At this point it’s more rumour than fact, but it does seem that the US White House is in the process of creating a new interagency group on emerging technologies. Andrew Maynard’s blog, 2020 Science, notes that the announcement was made by Tom Kalil, director of policy for the Office of Science and Technology Policy (OSTP), at a workshop in Washington, DC, last week. This news comes via the American Association for the Advancement of Science (AAAS). (I have checked the AAAS site since Andrew blogged yesterday but they still haven’t posted a news release or alert about this new interagency group.)
What’s interesting about this proposed new interagency group is the hope for a new approach to developing policies. From Andrew’s posting,
Looking forward, there is a need to develop emerging technology-related policies that are balanced by considerations other than technology promotion alone. But on top of this, there is a need to develop more holistic approaches to emerging technologies in general. Nanotechnology is not the only new technology on the block – technologies emerging under the banners of synthetic biology, robotics, geoengineering, cognitive enhancement and a plethora of others are coming up fast. Then there are the gray areas between these where convergence leads to increasingly complex and ill-defined technologies. In the face of accelerating innovation, should policies be developed for each and every new technology that comes along? This would be exceedingly difficult to achieve now, and an impossible task I suspect a few years down the line.
One solution – and the one the White House seems to be pursuing – is to take a high-level approach to emerging technology policy that ensures cross-agency coordination, identifies emerging hot-spots and enables a balanced and socially-responsible approach to emerging opportunities and issues. In some ways this is a role that the long-defunct Office of Technology Assessment within the US Congress played. But looking to an increasingly technologically-complex future, I suspect that a complete rethink of how to ensure the benefits of new technologies are realized and the dangers avoided is needed.
There does seem to be some sort of movement to respond in a manner appropriate to the fast-paced and ever-changing science and technology environment of the 21st century. For example, there seems to be some interest in the UK in developing a regulatory environment that is responsive rather than perpetually playing catch-up. I gather that the UK’s adoption of the precautionary principle towards nanotechnology research and adoption has resulted in a more inflexible and cumbersome response to the technology in many spheres, not just the regulatory framework. (As I recall, this is the law of unintended consequences, usually associated with sociologist Robert K. Merton. Note: According to Wikipedia, the ‘law‘ can be traced back to Adam Smith, at least.)
In related news, the Project on Emerging Nanotechnologies is hosting an event,
REINVENTING TECHNOLOGY ASSESSMENT
A 21st Century Model

Around the world the pace, complexity, and social significance of technological changes are increasing. Yet the broad social ramifications are often not considered until after new technologies become widely adopted and entrenched. This makes the need for technology assessment (TA) greater than ever, sparking renewed interest in TA models, practices, and evaluation.
Join us on Wednesday, April 28th, at 3:00 p.m. for a discussion of a new report that explores possible future options for technology assessment and ways to use citizen participation, collaboration, and expert analysis to inform and improve decision-making on issues involving science and technology.
You must register to attend the event.
Please RSVP at stip@wilsoncenter.org
*** Webcast LIVE at www.wilsoncenter.org/stip ***
No RSVP required to view the webcast.
What: REINVENTING TECHNOLOGY ASSESSMENT: A 21st Century Model
When: Wednesday, April 28, 2010, 3:00 – 4:30 PM (reception to follow)
Who:
Richard Sclove, Ph.D., Founder and Senior Fellow, The Loka Institute
Commenter: Paul Stern, Ph.D., National Research Council
Moderator: David Rejeski, Director, Science and Technology Innovation Program
Where: Woodrow Wilson International Center for Scholars, 5th Floor Conference Room
Media planning to cover the event should contact Patrick Polischuk at (202) 691-4283 or at patrick.polischuk@wilsoncenter.org
More on memristors
Dr. Leon Chua very kindly (and on his way out of town) responded to an email asking about the second part of a 2003 paper he authored, which was mentioned by Forrest H Bennett III in his interview with me last week. Here’s Dr. Chua’s response,
Part 2 has not yet been written! I had very little feedback from Part 1 so I thought…there is little interest–until the hp paper. Ever since I have been bombarded with: When it will be written? I will try to find some time next year.
He also added a few comments about the ‘fourth circuit element’ debate,
For now, it may help you to know that there are two technical reasons why the memristor is the fourth element.
First, one can prove from circuit-theoretic principles that it is impossible to build a memristor using only two-terminal resistors, inductors, and capacitors, even if one uses such active 2-terminal elements as negative resistors, or tunnel diodes. Following the logical principles from Aristotle, it would be only logical to classify the memristor as a different element from the other three.
The second reason is even though Part 1 [of the 2003 paper] shows there is an infinite number of circuit elements, and even though all can in principle be built using transistors (this does not contradict my statement above since transistors are 3-terminal devices, while the memristor being a 2-terminal device, should also be realized with 2-terminal devices), only memristors can be built without transistors, op amps, batteries, etc. All the higher-order elements are active, and hence do not exist in nature. They must be made with active elements and need a power supply.
In contrast, the hp memristor is passive and hence non-volatile. This is analogous to chemistry where elements with higher numbers are unstable, and radioactive. Hope above helps.
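As background to the ‘fourth element’ claim (my own summary here, not part of Dr. Chua’s reply): circuit theory works with four basic variables – voltage, current, charge, and flux linkage – and each classical two-terminal element is defined by a relation between one pair of them. The memristor is defined by the one pairing, flux versus charge, that the resistor, capacitor, and inductor leave uncovered:

```latex
% Four basic circuit variables: voltage v, current i,
% charge q = \int i\,dt, and flux linkage \varphi = \int v\,dt.
% Each element is defined by a relation between one pair of variables:
\begin{align*}
  v       &= R\,i  &&\text{(resistor: voltage--current)}\\
  q       &= C\,v  &&\text{(capacitor: charge--voltage)}\\
  \varphi &= L\,i  &&\text{(inductor: flux--current)}\\
  \varphi &= \hat{\varphi}(q)
    \;\Rightarrow\; v = M(q)\,i,
    \quad M(q) = \frac{d\hat{\varphi}}{dq}
                   &&\text{(memristor: flux--charge)}
\end{align*}
```

Because the memristance M(q) depends on the total charge that has flowed through the device, its resistance encodes the history of the current – which is why a passive memristor, like the hp device, behaves as a non-volatile memory.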
Thank you Dr. Chua.
Public engagement: I’m a scientist – Get me out of here
I have written about this project before and I mention it again because I so admire it. Sophia Collins, the producer for the event I’m a scientist – Get me out of here, has written a guest blog on 2020 Science where she discusses more details and some of the unintended outcomes of the project. From the posting,
“itz hometime but we want to stay and ask questions”
These are the words of a 14 year old student, at a school in inner-city London. The school has some of the poorest academic results in the school district, well below the national average. And yet a classroom science activity had the students so gripped that when the bell went for the end of the school day, they insisted on staying for another 15 minutes to ask more questions.
…
One scientist told me that this was “the most science-related fun I’ve had in ages,” while a teacher emailed to tell me her class was splitting into fan clubs for the different scientists, “with the sort of devotion they’ve only had for pop stars up until now.”
The project itself was designed around a cash prize, which the students voted to award to the scientist of their choice after asking questions and chatting online. Do read Collins’ posting if you’re interested in her theories on why this project worked, and in learning how the conversations and chats evolved over time and elicited some profound thinking.