Tag Archives: University of California

Revolutionizing electronics with liquid metal technology?

I’m not sure I’d call it the next big advance in electronics (there are too many advances jockeying for that position), but this work from Australia and the US is fascinating. From a Feb. 17, 2017 news item on ScienceDaily,

A new technique using liquid metals to create integrated circuits that are just atoms thick could lead to the next big advance for electronics.

The process opens the way for the production of large wafers around 1.5 nanometres in depth (a sheet of paper, by comparison, is 100,000nm thick).

Other techniques have proven unreliable in terms of quality, difficult to scale up and function only at very high temperatures — 550 degrees or more.

A Feb. 17, 2017 RMIT University press release (also on EurekAlert), which originated the news item, expands on the theme (Note: A link has been removed),

Distinguished Professor Kourosh Kalantar-zadeh, from RMIT’s School of Engineering, led the project, which also included colleagues from RMIT and researchers from CSIRO, Monash University, North Carolina State University and the University of California.

He said the electronics industry had hit a barrier.

“The fundamental technology of car engines has not progressed since 1920 and now the same is happening to electronics. Mobile phones and computers are no more powerful than five years ago.

“That is why this new 2D printing technique is so important – creating many layers of incredibly thin electronic chips on the same surface dramatically increases processing power and reduces costs.

“It will allow for the next revolution in electronics.”

Benjamin Carey, a researcher with RMIT and the CSIRO, said creating electronic wafers just atoms thick could overcome the limitations of current chip production.

It could also produce materials that were extremely bendable, paving the way for flexible electronics.

“However, none of the current technologies are able to create homogenous surfaces of atomically thin semiconductors on large surface areas that are useful for the industrial scale fabrication of chips.

“Our solution is to use the metals gallium and indium, which have a low melting point.

“These metals produce an atomically thin layer of oxide on their surface that naturally protects them. It is this thin oxide which we use in our fabrication method.

“By rolling the liquid metal, the oxide layer can be transferred on to an electronic wafer, which is then sulphurised. The surface of the wafer can be pre-treated to form individual transistors.

“We have used this novel method to create transistors and photo-detectors of very high gain and very high fabrication reliability in large scale.”

Here’s a link to and a citation for the paper,

Wafer-scale two-dimensional semiconductors from printed oxide skin of liquid metals by Benjamin J. Carey, Jian Zhen Ou, Rhiannon M. Clark, Kyle J. Berean, Ali Zavabeti, Anthony S. R. Chesman, Salvy P. Russo, Desmond W. M. Lau, Zai-Quan Xu, Qiaoliang Bao, Omid Kavehei, Brant C. Gibson, Michael D. Dickey, Richard B. Kaner, Torben Daeneke, & Kourosh Kalantar-zadeh. Nature Communications 8, Article number: 14482 (2017) doi:10.1038/ncomms14482
Published online: 17 February 2017

This paper is open access.

$81M for US National Nanotechnology Coordinated Infrastructure (NNCI)

Academic, small-business, and industry researchers are the big winners in a US National Science Foundation bonanza, according to a Sept. 16, 2015 news item on Nanowerk,

To advance research in nanoscale science, engineering and technology, the National Science Foundation (NSF) will provide a total of $81 million over five years to support 16 sites and a coordinating office as part of a new National Nanotechnology Coordinated Infrastructure (NNCI).

The NNCI sites will provide researchers from academia, government, and companies large and small with access to university user facilities with leading-edge fabrication and characterization tools, instrumentation, and expertise within all disciplines of nanoscale science, engineering and technology.

A Sept. 16, 2015 NSF news release provides a brief history of US nanotechnology infrastructures and describes this latest effort in slightly more detail (Note: Links have been removed),

The NNCI framework builds on the National Nanotechnology Infrastructure Network (NNIN), which enabled major discoveries, innovations, and contributions to education and commerce for more than 10 years.

“NSF’s long-standing investments in nanotechnology infrastructure have helped the research community to make great progress by making research facilities available,” said Pramod Khargonekar, assistant director for engineering. “NNCI will serve as a nationwide backbone for nanoscale research, which will lead to continuing innovations and economic and societal benefits.”

The awards run for up to five years and range from $500,000 to $1.6 million per site per year. Nine of the sites have at least one regional partner institution. The 16 sites are located in 15 states and involve 27 universities across the nation. (As a rough check, 16 sites averaging about $1 million a year over five years comes to roughly $80 million, in line with the $81 million total.)

Through a fiscal year 2016 competition, one of the newly awarded sites will be chosen to coordinate the facilities. This coordinating office will enhance the sites’ impact as a national nanotechnology infrastructure and establish a web portal to link the individual facilities’ websites to provide a unified entry point to the user community of overall capabilities, tools and instrumentation. The office will also help to coordinate and disseminate best practices for national-level education and outreach programs across sites.

New NNCI awards:

Mid-Atlantic Nanotechnology Hub for Research, Education and Innovation, University of Pennsylvania with partner Community College of Philadelphia, principal investigator (PI): Mark Allen

Texas Nanofabrication Facility, University of Texas at Austin, PI: Sanjay Banerjee

Northwest Nanotechnology Infrastructure, University of Washington with partner Oregon State University, PI: Karl Bohringer

Southeastern Nanotechnology Infrastructure Corridor, Georgia Institute of Technology with partners North Carolina A&T State University and University of North Carolina-Greensboro, PI: Oliver Brand

Midwest Nano Infrastructure Corridor, University of Minnesota Twin Cities with partner North Dakota State University, PI: Stephen Campbell

Montana Nanotechnology Facility, Montana State University with partner Carleton College, PI: David Dickensheets

Soft and Hybrid Nanotechnology Experimental Resource, Northwestern University with partner University of Chicago, PI: Vinayak Dravid

The Virginia Tech National Center for Earth and Environmental Nanotechnology Infrastructure, Virginia Polytechnic Institute and State University, PI: Michael Hochella

North Carolina Research Triangle Nanotechnology Network, North Carolina State University with partners Duke University and University of North Carolina-Chapel Hill, PI: Jacob Jones

San Diego Nanotechnology Infrastructure, University of California, San Diego, PI: Yu-Hwa Lo

Stanford Site, Stanford University, PI: Kathryn Moler

Cornell Nanoscale Science and Technology Facility, Cornell University, PI: Daniel Ralph

Nebraska Nanoscale Facility, University of Nebraska-Lincoln, PI: David Sellmyer

Nanotechnology Collaborative Infrastructure Southwest, Arizona State University with partners Maricopa County Community College District and Science Foundation Arizona, PI: Trevor Thornton

The Kentucky Multi-scale Manufacturing and Nano Integration Node, University of Louisville with partner University of Kentucky, PI: Kevin Walsh

The Center for Nanoscale Systems at Harvard University, Harvard University, PI: Robert Westervelt

The universities are trumpeting this latest nanotechnology funding,

NSF-funded network set to help businesses, educators pursue nanotechnology innovation (North Carolina State University, Duke University, and University of North Carolina at Chapel Hill)

Nanotech expertise earns Virginia Tech a spot in National Science Foundation network

ASU [Arizona State University] chosen to lead national nanotechnology site

UChicago, Northwestern awarded $5 million nanotechnology infrastructure grant

That is a lot of excitement.

American National Standards Institute’s (ANSI) nanotechnology standards panel to meet in February 2013 and one more standard

The American National Standards Institute’s (ANSI) Nanotechnology Standards Panel (NSP) was scheduled to meet in Oct. 2012, but Hurricane Sandy, which hit the eastern part of the continent at that time, necessitated rescheduling to Feb. 4, 2013, as per the Dec. 20, 2012 posting on Thomas.net,

Originally scheduled for October 30, 2012, ANSI’s Nanotechnology Standards Panel meeting was postponed as a result of Hurricane Sandy and will now be held on February 4, 2013. Meeting will examine how current nanotechnology standards are being utilized and how standards activities meet existing stakeholder needs. Benefits of participating in nanotechnology standardization and the possibilities for greater collaboration between stakeholders in this area will also be discussed.

The Dec. 14, 2012 ANSI news release provides more details about the Feb. 4, 2013 meeting to be held in Washington, DC,

The half-day meeting will examine how current nanotechnology standards are being utilized and how standards activities meet existing stakeholder needs. The benefits for companies, organizations, and other groups to participate in nanotechnology standardization and the possibilities for greater collaboration between stakeholders in this area will also be discussed.

Formed in 2004, ANSI’s NSP serves as the cross-sector coordinating body for the facilitation of standards development in the area of nanotechnology. Shaun Clancy, Ph.D., the director of product regulatory services for the Evonik Degussa Corporation, and Ajit Jilavenkatesa, Ph.D., the senior standards policy advisor for the National Institute of Standards and Technology (NIST) of the U.S. Department of Commerce (DoC), serve as the ANSI-NSP’s co-chairs.

… The ANSI-NSP works to provide a forum for standards developing organizations (SDOs), government entities, academia, and industry to identify needs and establish recommendations for the creation or updating of standards related to nanotechnology and nanomaterials. In addition, the ANSI-NSP solicits participation from nanotechnology-related groups that have not traditionally been involved in the voluntary consensus standards system, while also promoting cross-sector collaborative efforts.

Attendance at the February meeting is free. All attendees are required to register here for the meeting; individuals who registered for the October 2012 event must register again. [emphasis mine] For more information, visit the ANSI-NSP webpage or contact Heather Benko (hbenko@ansi.org), ANSI senior manager, nanotechnology standardization activities.

Standardization is also one of the topics highlighted in Michael Berger’s Dec. 20, 2012 Nanowerk Spotlight article about environmental health and safety. The article features a high-throughput screening (HTS) platform, developed at the University of California’s Center for Environmental Implications of Nanotechnology (CEIN), that can perform toxicity screening of 24 metal oxide nanoparticles simultaneously,

According to the team, the HTS platform that has been demonstrated in this study could easily be adapted to study other nanomaterials of interest. The capability of HTS would also allow researchers to analyze multiple samples at different concentrations, time points, as well as varying experimental parameters – all in one setup. The standardization of the whole screening process by this HTS platform also minimizes human intervention and errors during the experiment.

I guess it’s the season for standardization. Ho, ho, ho!

Princeton goes Open Access; arXiv is 10 years old

Open access to science research papers seems only right given that most Canadian research is publicly funded. (As I understand it, most research worldwide is publicly funded.)

This week, Princeton University declared that its researchers’ work would be mostly open access (from the Sept. 28, 2011 news item on physorg.com),

Prestigious US academic institution Princeton University has banned researchers from giving the copyright of scholarly articles to journal publishers, except in certain cases where a waiver may be granted.

Here’s a little more from Sunanda Creagh’s Sept. 28, 2011 posting on The Conversation blog (Creagh is based in Australia),

The new rule is part of an Open Access policy aimed at broadening the reach of their scholarly work and encouraging publishers to adjust standard contracts that commonly require exclusive copyright as a condition of publication.

Universities pay millions of dollars a year for academic journal subscriptions. People without subscriptions, which can cost up to $25,000 a year for some journals or hundreds of dollars for a single issue, are often prevented from reading taxpayer funded research. Individual articles are also commonly locked behind pay walls.

Researchers and peer reviewers are not paid for their work but academic publishers have said such a business model is required to maintain quality.

This Sept. 29, 2011 article by James Chang for the Princetonian adds a few more details,

“In the interest of better disseminating the fruits of our scholarship to the world, we did not want to put it artificially behind a pay wall where much of the world won’t have access to it,” committee chair and computer science professor Andrew Appel ’81 said.

The policy passed the Faculty Advisory Committee on Policy with a unanimous vote, and the proposal was approved on Sept. 19 by the general faculty without any changes.

A major challenge for the committee, which included faculty members in both the sciences and humanities, was designing a policy that could comprehensively address the different cultures of publication found across different disciplines.

While science journals have generally adopted open-access into their business models, humanities publishers have not. In the committee, there was an initial worry that bypassing the scholarly peer-review process that journals facilitate, particularly in the humanities, could hurt the scholarly industry.

At the end, however, the committee said they felt that granting the University non-exclusive rights would not harm the publishing system and would, in fact, give the University leverage in contract negotiations.

That last comment about contract negotiations is quite interesting, as it brings to mind the California boycott of the Nature journals last year, when Nature made a bold attempt to raise subscription fees substantially (by 400%) after having given the University of California special deals for years (see my June 15, 2010 posting).

Creagh’s posting features some responses from Australian academics such as Simon Marginson,

Having prestigious universities such as Princeton and Harvard fly the open access flag represented a step forward, said open access advocate Professor Simon Marginson from the University of Melbourne’s Centre for the Study of Higher Education.

“The achievement of free knowledge flows, and installation of open access publishing on the web as the primary form of publishing rather than oligopolistic journal publishing subject to price barriers, now depends on whether this movement spreads further among the peak research and scholarly institutions,” he said.

“Essentially, this approach – if it becomes general – normalises an open access regime and offers authors the option of opting out of that regime. This is a large improvement on the present position whereby copyright restrictions and price barriers are normal and authors have to attempt to opt in to open access publishing, or risk prosecution by posting their work in breach of copyright.”

“The only interests that lose out under the Princeton proposal are the big journal publishers. Everyone else gains.”

Whether you view Princeton’s action as a negotiating ploy or a high-minded attempt to give freer access to publicly funded research, it certainly puts pressure on the business models that scholarly publishers follow.

arXiv, celebrating its 10th anniversary this year, is another open access initiative although it didn’t start that way. From the Sept. 28, 2011 news item on physorg.com,

“I’ve heard a lot about how democratic the arXiv is,” Ginsparg [Paul Ginsparg, professor of physics and information science] said Sept. 23 in a talk commemorating the anniversary. People have, for example, praised the fact that the arXiv makes scientific papers easily available to scientists in developing countries where subscriptions to journals are not always affordable. “But what I was trying to do was set up a system that eliminated the hierarchy in my field,” he said. As a physicist at Los Alamos National Laboratory, “I was receiving preprints long before graduate students further down the food chain,” Ginsparg said. “When we have success we like to think it was because we worked harder, not just because we happened to have access.”

Bill Steele’s Sept. 27, 2011 article for Cornell University’s ChronicleOnline notes,

One of the surprises, Ginsparg said, is that electronic publishing has not transformed the seemingly irrational scholarly publishing system in which researchers give their work to publishing houses from which their academic institutions buy it back by subscribing to journals. Scholarly publishing is still in transition, Ginsparg said, due to questions about how to fund electronic publication and how to maintain quality control. The arXiv has no peer-review process, although it does restrict submissions to those with scientific credentials.

But the lines of communication are definitely blurring. Ginsparg reported that a recent paper posted on the arXiv by Alexander Gaeta, Cornell professor of applied and engineering physics, was picked up by bloggers and spread out from there. The paper is to be published in the journal Nature and is still under a press embargo, but an article about it has appeared in the journal Science.

Interesting, eh? It seems that scholarly publishing need not disappear, but there’s no question its business models are changing.

California boycott of Nature journals?

It seems the California Digital Library (CDL), which manages subscriptions for the University of California, has been getting a deal on its Nature journal subscriptions, and now the publisher, Nature Publishing Group (NPG), has raised its subscription price by approximately 400 percent. Predictably, the librarians are protesting the rate hike. Less predictably, they are calling for a boycott.

The Pasco Phronesis blog notes,

The negotiations continue via press releases. Independent of the claims both sides are making, this fight brings out the point that journal subscription rates have continually increased at rates that challenge many universities to keep up. NPG is not the only company charging high rates, it’s just that the long-standing agreement with the CDL has become no longer sustainable for NPG. Given the continuing budget problems California faces, it seems quite likely that the CDL may no longer find NPG subscriptions sustainable.

The article by Jennifer Howard for the Chronicle of Higher Education offers some details about the proposed boycott. In addition to canceling subscriptions,

The voluntary boycott would “strongly encourage” researchers not to contribute papers to those journals or review manuscripts for them. It would urge them to resign from Nature’s editorial boards and to encourage similar “sympathy actions” among colleagues outside the University of California system.

The boycott’s impact on faculty is not something that immediately occurred to me, but Dr. Free-Ride at Adventures in Ethics and Science notes,

One bullet point that I think ought to be included above — something that I hope UC faculty and administrators will consider seriously — is that hiring, retention, tenure, and promotion decisions within the UC system should not unfairly penalize those who have opted to publish their scholarly work elsewhere, including in peer-reviewed journals that may not currently have the impact factor (or whatever other metric that evaluators lean on so as not to have to evaluate the quality of scholarly output themselves) that the NPG journals do. Otherwise, there’s a serious career incentive for faculty to knuckle under to NPG rather than honoring the boycott.

There is both support and precedent for such a boycott, according to Howard’s article,

Keith Yamamoto is a professor of molecular biology and executive vice dean of the School of Medicine at UC-San Francisco. He stands ready to help organize a boycott, if necessary, a tactic he and other researchers used successfully in 2003 when another big commercial publisher, Elsevier, bought Cell Press and tried to raise its journal prices.

After the letter went out on Tuesday, Mr. Yamamoto received an “overwhelmingly positive” response from other university researchers. He said he’s confident that there will be broad support for a boycott among the faculty if the Nature Group doesn’t negotiate, even if it means some hardships for individual researchers.

“There’s a strong feeling that this is an irresponsible action on the part of NPG,” he told The Chronicle. That feeling is fueled by what he called “a broad awareness in the scientific community that the world is changing rather rapidly with respect to scholarly publication.”

Although researchers still have “a very strong tie to traditional journals” like Nature, he said, scientific publishing has evolved in the seven years since the Elsevier boycott. “In many ways it doesn’t matter where the work’s published, because scientists will be able to find it,” Mr. Yamamoto said.

I feel sympathy for both sides, as neither is doing well economically these days. I do have to wonder at the decision to quadruple the subscription rates overnight, as it smacks of a negotiating tactic in a situation where the CDL had come to expect a significantly lowered subscription rate. With this tactic there’s the potential for a perceived win-win: the CDL will triumphantly negotiate a lower subscription rate, and the publisher will get the increase it wanted in the first place. That’s my theory.

Plenty of Room at the Bottom’s 50th anniversary; new advance in nanoassembly; satirizing the copyright wars; China’s social media map

“There’s Plenty of Room at the Bottom,” Richard Feynman’s December 29, 1959 talk for the American Physical Society, is considered to be the starting point or origin of nanotechnology, and this December marks its 50th anniversary. Chris Toumey, a cultural anthropologist at the University of South Carolina NanoCenter, has an interesting commentary about it (on Nanowerk) in which he poses the question: would nanotechnology have existed without Feynman’s talk? Toumey answers yes. You can read the commentary here.

In contrast to Toumey’s speculations, there’s Colin Milburn (a professor at the University of California, Davis) who, in his essay “Nanotechnology in the Age of Posthuman Engineering: Science Fiction as Science,” suggests that nanotechnology originated in science fiction. You can read more about Milburn, find the citations for the essay I’ve mentioned, and/or download three of his other essays from here.

Ting Xu and her colleagues at the US Dept. of Energy’s Lawrence Berkeley National Laboratory have developed a new technique for self-assembling nanoparticles. From the news item on Physorg.com,

“Bring together the right basic components – nanoparticles, polymers and small molecules – stimulate the mix with a combination of heat, light or some other factors, and these components will assemble into sophisticated structures or patterns,” says Xu. “It is not dissimilar from how nature does it.”

More details are available here.

TechDirt featured a clip from This Hour Has 22 Minutes, a satirical Canadian TV comedy programme, which pokes fun at the scaremongering that features mightily in discussions about copyright. You can find the clip here on YouTube.

I’ve been meaning to mention this tiny item from Fast Company (by Noah Robischon) about China’s social media. From the news bit,

The major players in the U.S. social media world can be counted on one hand: Facebook, MySpace, Twitter, LinkedIn. Not so in China, where the country’s 300 million online users have a panoply of popular social networks to choose from–and Facebook doesn’t even crack the top 10.

Go here to see the infographic illustrating China’s social media landscape.

Happy weekend!