An April 11, 2016 news item on phys.org highlights some research presented at the IEEE (Institute of Electrical and Electronics Engineers) Haptics (touch) Symposium 2016,
Using your skin as a touchscreen has been brought a step closer after UK scientists successfully created tactile sensations on the palm using ultrasound sent through the hand.
The University of Sussex-led study – funded by the Nokia Research Centre and the European Research Council – is the first to find a way for users to feel what they are doing when interacting with displays projected on their hand.
This solves one of the biggest challenges for technology companies who see the human body, particularly the hand, as the ideal display extension for the next generation of smartwatches and other smart devices.
Current ideas rely on vibrations or pins, which both need contact with the palm to work, interrupting the display.
However, this new innovation, called SkinHaptics, sends sensations to the palm from the other side of the hand, leaving the palm free to display the screen.
The device uses ‘time-reversal’ processing to send ultrasound waves through the hand. This technique is effectively like ripples in water but in reverse – the waves become more targeted as they travel through the hand, ending at a precise point on the palm.
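For readers who want a feel for how time-reversal focusing works, here is a toy numerical sketch (my own illustration, not the SkinHaptics implementation): each transducer element records how a pulse from the target point arrives, plays its recording backwards, and the re-emitted signals all line up at the target no matter how different the individual travel times are.

```python
import numpy as np

# Toy model of time-reversal focusing (illustrative only; not the
# SkinHaptics implementation). Four transducer elements sit at
# different, hypothetical distances from a target point on the palm.
fs = 1_000_000                                            # sample rate, Hz
pulse = np.sin(2 * np.pi * 40_000 * np.arange(64) / fs)   # short 40 kHz burst

delays = np.array([120, 95, 143, 110])   # per-element travel times, in samples
n = 400

# Step 1: each element records the burst arriving from the target point.
received = np.zeros((len(delays), n))
for i, d in enumerate(delays):
    received[i, d:d + len(pulse)] = pulse

# Step 2: each element plays its recording backwards. Travelling the same
# delay again, every contribution reaches the target at the same instant.
focus = np.zeros(2 * n)
for i, d in enumerate(delays):
    focus[d:d + n] += received[i, ::-1]

peak = int(np.argmax(np.abs(focus)))   # all echoes align just before sample n
```

However the individual delays are shuffled, the reversed recordings always add up constructively at the target, which is the "ripples in reverse" effect described above.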
It draws on a rapidly growing field of technology called haptics, which is the science of applying touch sensation and control to interaction with computers and technology.
Professor Sriram Subramanian, who leads the research team at the University of Sussex, says that technologies will inevitably need to engage other senses, such as touch, as we enter what designers are calling an ‘eye-free’ age of technology.
He says: “Wearables are already big business and will only get bigger. But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important.
“If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user.
“What we offer people is the ability to feel their actions when they are interacting with the hand.”
The findings were presented at the IEEE Haptics Symposium [April 8 – 11] 2016 in Philadelphia, USA, by the study’s co-author Dr Daniel Spelmezan, a research assistant in the Interact Lab.
There is a video of the work (I was unable to activate the sound, if there is any accompanying this video),
The consequence of watching this silent video was that I found the whole thing somewhat mysterious.
A March 31, 2016 news item on ScienceDaily highlights some research into 2D (two-dimensional) boron at Rice University (Texas, US),
Rice University scientists have determined that two-dimensional boron is a natural low-temperature superconductor. In fact, it may be the only 2-D material with such potential.
Rice theoretical physicist Boris Yakobson and his co-workers published their calculations that show atomically flat boron is metallic and will transmit electrons with no resistance. …
The hitch, as with most superconducting materials, is that it loses its resistivity only when very cold, in this case between 10 and 20 kelvins (roughly, minus-430 degrees Fahrenheit). But for making very small superconducting circuits, it might be the only game in town.
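For anyone checking the unit conversion, kelvins convert to Fahrenheit as F = K × 9/5 − 459.67:

```python
# Quick sanity check of the quoted temperatures: F = K * 9/5 - 459.67.
def kelvin_to_fahrenheit(kelvin):
    return kelvin * 9 / 5 - 459.67

low = kelvin_to_fahrenheit(10)    # about -441.7 °F
high = kelvin_to_fahrenheit(20)   # about -423.7 °F
# So "roughly minus-430 degrees Fahrenheit" is a fair middle figure
# for the 10-20 kelvin window.
```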
The basic phenomenon of superconductivity has been known for more than 100 years, said Evgeni Penev, a research scientist in the Yakobson group, but its presence in atomically flat boron had not been tested.
“It’s well-known that the material is pretty light because the atomic mass is small,” Penev said. “If it’s metallic too, these are two major prerequisites for superconductivity. That means at low temperatures, electrons can pair up in a kind of dance in the crystal.”
“Lower dimensionality is also helpful,” Yakobson said. “It may be the only, or one of very few, two-dimensional metals. So there are three factors that gave the initial motivation for us to pursue the research. Then we just got more and more excited as we got into it.”
Electrons with opposite momenta and spins effectively become Cooper pairs; they attract each other at low temperatures with the help of lattice vibrations, the so-called “phonons,” and give the material its superconducting properties, Penev said. “Superconductivity becomes a manifestation of the macroscopic wave function that describes the whole sample. It’s an amazing phenomenon,” he said.
It wasn’t entirely by chance that the first theoretical paper establishing superconductivity in a 2-D material appeared at roughly the same time the first samples of the material were made by laboratories in the United States and China. In fact, an earlier paper by the Yakobson group had offered a road map for doing so.
That 2-D boron has now been produced is a good thing, according to Yakobson and lead authors Penev and Alex Kutana, a postdoctoral researcher at Rice. “We’ve been working to characterize boron for years, from cage clusters to nanotubes to planar sheets, but the fact that these papers appeared so close together means these labs can now test our theories,” Yakobson said.
“In principle, this work could have been done three years ago as well,” he said. “So why didn’t we? Because the material remained hypothetical; okay, theoretically possible, but we didn’t have a good reason to carry it too far.
“But then last fall it became clear from professional meetings and interactions that it can be made. Now those papers are published. When you think it’s coming for real, the next level of exploration becomes more justifiable,” Yakobson said.
Boron atoms can make more than one pattern when coming together as a 2-D material, another characteristic predicted by Yakobson and his team that has now come to fruition. These patterns, known as polymorphs, may allow researchers to tune the material’s conductivity “just by picking a selective arrangement of the hexagonal holes,” Penev said.
He also noted boron’s qualities were hinted at when researchers discovered more than a decade ago that magnesium diboride is a high-temperature electron-phonon superconductor. “People realized a long time ago that the superconductivity is due to the boron layer,” Penev said. “The magnesium acts to dope the material by spilling some electrons into the boron layer. In this case, we don’t need them because the 2-D boron is already metallic.”
Penev suggested that isolating 2-D boron between layers of inert hexagonal boron nitride (aka “white graphene”) might help stabilize its superconducting nature.
Without the availability of a block of time on several large government supercomputers, the study would have taken a lot longer, Yakobson said. “Alex did the heavy lifting on the computational work,” he said. “To turn it from a lunchtime discussion into a real quantitative research result took a very big effort.”
The paper is the first by Yakobson’s group on the topic of superconductivity, though Penev is a published author on the subject. “I started working on superconductivity in 1993, but it was always kind of a hobby, and I hadn’t done anything on the topic in 10 years,” Penev said. “So this paper brings it full circle.”
Dexter Johnson has published an April 5, 2016 post on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) about this latest Rice University work on 2D boron that includes comments from his email interview with Penev.
Both the University of Georgia (US) and the American Associates Ben-Gurion University of the Negev (Israel) have issued press releases about a joint research project resulting in the world’s smallest diode.
Researchers at the University of Georgia and at Ben-Gurion University in Israel have demonstrated for the first time that nanoscale electronic components can be made from single DNA molecules. Their study, published in the journal Nature Chemistry, represents a promising advance in the search for a replacement for the silicon chip.
The finding may eventually lead to smaller, more powerful and more advanced electronic devices, according to the study’s lead author, Bingqian Xu.
“For 50 years, we have been able to place more and more computing power onto smaller and smaller chips, but we are now pushing the physical limits of silicon,” said Xu, an associate professor in the UGA College of Engineering and an adjunct professor in chemistry and physics. “If silicon-based chips become much smaller, their performance will become unstable and unpredictable.”
To find a solution to this challenge, Xu turned to DNA. He says DNA’s predictability, diversity and programmability make it a leading candidate for the design of functional electronic devices using single molecules.
In the Nature Chemistry paper, Xu and collaborators at Ben-Gurion University of the Negev describe using a single molecule of DNA to create the world’s smallest diode. A diode is a component vital to electronic devices that allows current to flow in one direction but prevents its flow in the other direction.
Xu and a team of graduate research assistants at UGA isolated a specifically designed single duplex DNA of 11 base pairs and connected it to an electronic circuit only a few nanometers in size. After the measured current showed no special behavior, the team site-specifically intercalated a small molecule named coralyne into the DNA. They found the current flowing through the DNA was 15 times stronger for negative voltages than for positive voltages, a necessary feature of a diode.
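The 15-times figure is what device physicists call a rectification ratio: the magnitude of the current under negative bias divided by the magnitude under positive bias. A minimal sketch with made-up current readings (not the paper's data):

```python
# Rectification ratio from a pair of hypothetical current readings.
# The values are made up for illustration; they are not the paper's data.
def rectification_ratio(current_negative_bias, current_positive_bias):
    """Magnitude of current at negative bias over magnitude at positive bias."""
    return abs(current_negative_bias) / abs(current_positive_bias)

# e.g. -15 nA at -1 V against 1 nA at +1 V reproduces the reported
# 15-fold asymmetry that marks diode-like behavior.
ratio = rectification_ratio(-15.0, 1.0)   # currents in nA; ratio is 15.0
```

A ratio near 1 means the device conducts symmetrically (no diode behavior); the larger the ratio, the better the one-way valve.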
“This finding is quite counterintuitive because the molecular structure is still seemingly symmetrical after coralyne intercalation,” Xu said.
A theoretical model developed by Yonatan Dubi of Ben-Gurion University indicated the diode-like behavior of DNA originates from the bias voltage-induced breaking of spatial symmetry inside the DNA molecule after the coralyne is inserted.
“Our discovery can lead to progress in the design and construction of nanoscale electronic elements that are at least 1,000 times smaller than current components,” Xu said.
The research team plans to continue its work, with the goal of constructing additional molecular devices and enhancing the performance of the molecular diode.
The world’s smallest diode, the size of a single molecule, has been developed collaboratively by U.S. and Israeli researchers from the University of Georgia and Ben-Gurion University of the Negev (BGU).
“Creating and characterizing the world’s smallest diode is a significant milestone in the development of molecular electronic devices,” explains Dr. Yoni Dubi, a researcher in the BGU Department of Chemistry and Ilse Katz Institute for Nanoscale Science and Technology. “It gives us new insights into the electronic transport mechanism.”
Continuous demand for more computing power is pushing the limitations of present day methods. This need is driving researchers to look for molecules with interesting properties and find ways to establish reliable contacts between molecular components and bulk materials in an electrode, in order to mimic conventional electronic elements at the molecular scale.
An example of such an element is the nanoscale diode (or molecular rectifier), which operates like a valve to facilitate electronic current flow in one direction. A collection of these nanoscale diodes, or molecules, has properties that resemble traditional electronic components such as a wire, transistor or rectifier. The emerging field of single-molecule electronics may provide a way to extend Moore’s Law – the observation that over the history of computing hardware the number of transistors in a dense integrated circuit has doubled approximately every two years – beyond the limits of conventional silicon integrated circuits.
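The doubling observation is easy to put into numbers. Using the usual anchor point of roughly 2,300 transistors on Intel's 1971 microprocessor (an illustrative starting point, not something from the press release):

```python
# Moore's observation as arithmetic: a doubling every two years.
# The 1971 starting point (roughly 2,300 transistors on Intel's first
# microprocessor) is the usual anchor and is illustrative here.
def projected_transistors(start_count, start_year, year, doubling_years=2):
    return start_count * 2 ** ((year - start_year) / doubling_years)

count_2011 = projected_transistors(2300, 1971, 2011)   # 20 doublings
# Twenty doublings of 2,300 is about 2.4 billion transistors.
```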
Prof. Bingqian Xu’s group at the College of Engineering at the University of Georgia took a single DNA molecule constructed from 11 base pairs and connected it to an electronic circuit only a few nanometers in size. When they measured the current through the molecule, it did not show any special behavior. However, when molecules of “coralyne” were inserted (or intercalated) between layers of the DNA, the behavior of the circuit changed drastically: the current became 15 times larger at negative than at positive voltages – a necessary feature for a nano diode. “In summary, we have constructed a molecular rectifier by intercalating specific, small molecules into designed DNA strands,” explains Prof. Xu.
Dr. Dubi and his student, Elinor Zerah-Harush, constructed a theoretical model of the DNA molecule inside the electric circuit to better understand the results of the experiment. “The model allowed us to identify the source of the diode-like feature, which originates from breaking spatial symmetry inside the DNA molecule after coralyne is inserted.”
There’s an April 4, 2016 posting on the Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) which provides a brief overview and a link to a previous essay, Whatever Happened to the Molecular Computer?
Since that news story broke, damage control from the NGI [UK National Graphene Institute], the University of Manchester, and BGT Materials, the company identified in the Times article, has been coming fast and furious. Even this blog’s coverage of the story has gotten comments from representatives of BGT Materials and the University of Manchester.
There was perhaps no greater effort in this coordinated defense than getting Andre Geim, a University of Manchester researcher who was a co-discoverer of graphene, to weigh in. …
Despite Geim’s recent public defense, and a full-on PR campaign to turn around the perception that the UK government was investing millions into UK research only to have the fruits of that research sold off to foreign interests, there was news last week that the UK Parliament would be launching an inquiry into the “benefits and disbenefits of the way that graphene’s intellectual property and commercialisation has been managed, including through research and innovation collaborations.”
The timing for the inquiry is intriguing, but there have been no public comments or hints that the NGI kerfuffle precipitated the Graphene Inquiry,
The Science and Technology Committee issues a call for written submissions for its inquiry on graphene.
The inquiry explores the lessons from graphene for research and innovation in other areas, as well as the management and commercialisation of graphene’s intellectual property. Issues include:
The research obstacles that have had to be overcome for graphene, including identifying research priorities and securing research funding, and the lessons from this for other areas of research.
The factors that have contributed to the successful development of graphene and how these might be applied in other areas, including translating research into innovation, managing/sharing intellectual property, securing development funding, and bringing key stakeholders together.
The benefits and disbenefits of the way that graphene’s intellectual property and commercialisation has been managed, including through research and innovation collaborations, and the lessons from this for other areas.
The deadline for submissions is midday on Monday 18 April 2016.
The Committee expects to take oral evidence later in April 2016.
Getting back to the NGI, BGT Materials, and University of Manchester situation, there’s a forceful comment from Daniel Cochlin (identified as a graphene communications and marketing manager at the University of Manchester in an April 2, 2015 posting on Nanoclast) in Dexter’s latest posting about the NGI. From the comments section of a March 29, 2016 posting on the Nanoclast blog,
Maybe the best way to respond is to directly counter some of your assertions.
1. The NGI’s comments on this blog were to counter factual inaccuracies contained in your story. Your Editor-in-Chief and Editorial Director, Digital were also emailed to complain about the story, with not so much as an acknowledgement of the email.
2. There was categorically no ‘coaxing’ of Sir Andre to make comments. He was motivated to do so by the inaccuracies and insinuations of the Sunday Times article.
3. Members of the Science and Technology Select Committee visited the NGI about ten days before the Sunday Times article and this was followed by their desire to hold an evidence session to discuss graphene commercialisation.
4. The matter of how many researchers work in the NGI is not ‘hotly contested’. The NGI is 75% full with around 130 researchers regularly working there. We would expect this figure to grow by 10-15% within the next few days as other facilities are closed down.
5. Graphene Lighting PLC is the spin-out company set up to produce and market the lightbulb. To describe them as a ‘shadowy spin-out’ is unjustified and, I would suggest, libelous [emphasis mine].
6. Your question about why, if BGT Materials is a UK company, was it not mentioned [emphasis mine] in connection with the lightbulb is confusing – as stated earlier the company set up to manage the lightbulb was Graphene Lighting PLC.
Let’s hope it doesn’t take three days for this to be accepted by your moderators, as it did last time.
*ETA March 31, 2016 at 1530 hours PDT: Dexter has posted response comments in answer to Cochlin’s. You can read them for yourself here.*

I have a couple of observations. (1) The use of the word ‘libelous’ seems a bit over the top. However, it should be noted that it’s much easier to sue someone for libel in England, where the University of Manchester is located, than it is in most jurisdictions. In fact, there’s an industry known as ‘libel tourism’ where litigious companies and individuals shop around for a jurisdiction, such as England, where they can easily file suit. (2) As for BGT Materials not being mentioned in the 2015 press release for the graphene lightbulb, I cannot emphasize enough how unusual that is. Generally speaking, everyone and every agency that had any involvement in developing and bringing to market a new product, especially one billed as the ‘first consumer graphene-based product’, is mentioned. When you consider that BGT Materials is a newish company according to its About page,
BGT Materials Limited (BGT), established in 2013, is dedicated to the development of graphene technologies that utilize this “wonder material” to enhance our lives. BGT has pioneered the mass production of large-area, high-quality graphene rapidly achieving the first milestone required for the commercialization of graphene-enhanced applications.
the situation grows more peculiar. A new company wants and needs that kind of exposure to attract investment and/or keep current stakeholders happy. One last comment about BGT Materials and its public relations: Thanasis Georgiou, VP at BGT Materials and visiting scientist at the University of Manchester (more can be found on his website’s About page), waded into the comments section of Dexter’s March 15, 2016 posting, the first about the kerfuffle. Georgiou starts out in a relatively friendly fashion but his followup has a sharper tone,
I appreciate your position but a simple email to us and we would clarify most of the issues that you raised. Indeed your article carries the same inaccuracies that the initial Sunday Times article does, which is currently the subject of a legal claim by BGT Materials. [emphasis mine]
For example, BGT Materials is a UK registered company, not a Taiwanese one. A quick google search and you can confirm this. There was no “shadowy Canadian investor”, the company went through a round of financing, as most technology startups do, in order to reach the market quickly.
It’s hard to tell if Georgiou is trying to inform Dexter or threaten him in his comment to the March 15, 2016 posting, but taken together with Daniel Cochlin’s claim of libel in his comment to the March 29, 2016 posting, it suggests an attempt at intimidation.
These are understandable responses given the stakes involved but moving to the most damaging munitions in your arsenal is usually not a good choice for your first or second response.
Some research from RMIT University (Australia) and the University of Adelaide (Australia) is making quite an impression. A Feb. 19, 2016 article by Caleb Radford for The Lead explains some of the excitement,
NEW light-manipulating nano-technology may soon be used to make smart contact lenses.
The University of Adelaide in South Australia worked closely with RMIT University to develop small hi-tech lenses to filter harmful optical radiation without distorting vision.
Dr Withawat Withayachumnankul from the University of Adelaide helped conceive the idea and said the potential applications of the technology included creating new high-performance devices that connect to the Internet.
The light manipulation relies on creating tiny artificial crystals termed “dielectric resonators”, which are a fraction of the wavelength of light – 100-200 nanometers, or over 500 times thinner than a human hair.
The research combined the University of Adelaide researchers’ expertise in interaction of light with artificial materials with the materials science and nanofabrication expertise at RMIT University.
Dr Withawat Withayachumnankul, from the University of Adelaide’s School of Electrical and Electronic Engineering, said: “Manipulation of light using these artificial crystals uses precise engineering.
“With advanced techniques to control the properties of surfaces, we can dynamically control their filter properties, which allow us to potentially create devices for high data-rate optical communication or smart contact lenses.
“The current challenge is that dielectric resonators only work for specific colours, but with our flexible surface we can adjust the operation range simply by stretching it.”
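To see why stretching shifts the filter band, note that a dielectric resonator's resonant wavelength scales roughly with its physical size. This toy model (my own simplification, not the researchers' method) captures the idea:

```python
# Toy scaling model for stretch tuning (my simplification, not the
# researchers' method): a dielectric resonator's resonant wavelength
# scales roughly with its physical dimension, so stretching the rubbery
# substrate by a factor s shifts the filter band by about the same factor.
def stretched_wavelength(resting_wavelength_nm, stretch_factor):
    return resting_wavelength_nm * stretch_factor

# A filter resting at 550 nm (green light), stretched by 10 percent,
# would operate near 605 nm (orange).
shifted = stretched_wavelength(550.0, 1.10)
```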
Associate Professor Madhu Bhaskaran, Co-Leader of the Functional Materials and Microsystems Research Group at RMIT, said the devices were made on a rubber-like material used for contact lenses.
“We embed precisely-controlled crystals of titanium oxide, a material that is usually found in sunscreen, in these soft and pliable materials,” she said.
“Both materials are proven to be bio-compatible, forming an ideal platform for wearable optical devices.
“By engineering the shape of these common materials, we can create a device that changes properties when stretched. This modifies the way the light interacts with and travels through the device, which holds promise of making smart contact lenses and stretchable colour changing surfaces.”
Lead author and RMIT researcher Dr. Philipp Gutruf said the major scientific hurdle overcome by the team was combining high temperature processed titanium dioxide with the rubber-like material, and achieving nanoscale features.
“With this technology, we now have the ability to develop light weight wearable optical components which also allow for the creation of futuristic devices such as smart contact lenses or flexible ultra thin smartphone cameras,” Gutruf said.
ETA Feb. 24, 2016: Dexter Johnson (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website) has chimed in with additional insight into this research in his Feb. 23, 2016 posting.
There are already a number of biosensors based on plasmonic interferometry in use but this latest breakthrough from Brown University (US) could make them cheaper and more accessible. A Feb. 16, 2016 Brown University news release (also on EurekAlert), announces the new technique,
Imagine a hand-held environmental sensor that can instantly test water for lead, E. coli, and pesticides all at the same time, or a biosensor that can perform a complete blood workup from just a single drop. That’s the promise of nanoscale plasmonic interferometry, a technique that combines nanotechnology with plasmonics–the interaction between electrons in a metal and light.
Now researchers from Brown University’s School of Engineering have made an important fundamental advance that could make such devices more practical. The research team has developed a technique that eliminates the need for highly specialized external light sources that deliver coherent light, which the technique normally requires. The advance could enable more versatile and more compact devices.
“It has always been assumed that coherent light was necessary for plasmonic interferometry,” said Domenico Pacifici, a professor of engineering who oversaw the work with his postdoctoral researcher Dongfang Li, and graduate student Jing Feng. “But we were able to disprove that assumption.”
The research is described in Nature Scientific Reports.
Plasmonic interferometers make use of the interaction between light and surface plasmon polaritons, density waves created when light energy rattles free electrons in a metal. One type of interferometer looks like a bull’s-eye structure etched into a thin layer of metal. In the center is a hole poked through the metal layer with a diameter of about 300 nanometers–about 1,000 times smaller than the diameter of a human hair. The hole is encircled by a series of etched grooves, with diameters of a few micrometers. Thousands of these bull’s-eyes can be placed on a chip the size of a fingernail.
When light from an external source is shone onto the surface of an interferometer, some of the photons go through the central hole, while others are scattered by the grooves. Those scattered photons generate surface plasmons that propagate through the metal inward toward the hole, where they interact with photons passing through the hole. That creates an interference pattern in the light emitted from the hole, which can be recorded by a detector beneath the metal surface.
When a liquid is deposited on top of an interferometer, the light and the surface plasmons propagate through that liquid before they interfere with each other. That alters the interference patterns picked up by the detector depending on the chemical makeup of the liquid or compounds present in it. By using different sizes of groove rings around the hole, the interferometers can be tuned to detect the signature of specific compounds or molecules. With the ability to put many differently tuned interferometers on one chip, engineers can hypothetically make a versatile detector.
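The sensing principle boils down to two-path interference: the detected intensity depends on the phase difference the liquid imposes between the plasmon path and the direct path. A simplified sketch with illustrative numbers (not the Brown group's actual geometry):

```python
import math

# Two-path interference sketch: the photon transmitted through the hole
# and the surface-plasmon path acquire a phase difference that depends on
# the liquid's refractive index. All numbers are illustrative.
def detected_intensity(i_direct, i_plasmon, path_nm, wavelength_nm, n_liquid):
    delta_phi = 2 * math.pi * n_liquid * path_nm / wavelength_nm
    return (i_direct + i_plasmon
            + 2 * math.sqrt(i_direct * i_plasmon) * math.cos(delta_phi))

# Water (n ~ 1.33) versus a slightly different liquid (n ~ 1.35) gives a
# measurably different reading for the same groove geometry.
i_water = detected_intensity(1.0, 0.5, 3000, 500, 1.33)
i_other = detected_intensity(1.0, 0.5, 3000, 500, 1.35)
```

That dependence on refractive index is what lets differently tuned groove rings pick out different compounds.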
Up to now, all plasmonic interferometers have required the use of highly specialized external light sources that can deliver coherent light–beams in which light waves are parallel, have the same wavelength, and travel in-phase (meaning the peaks and valleys of the waves are aligned). Without coherent light sources, the interferometers cannot produce usable interference patterns. Those kinds of light sources, however, tend to be bulky, expensive, and require careful alignment and periodic recalibration to obtain a reliable optical response.
But Pacifici and his group have come up with a way to eliminate the need for external coherent light. In the new method, fluorescent light-emitting atoms are integrated directly within the tiny hole in the center of the interferometer. An external light source is still necessary to excite the internal emitters, but it need not be a specialized coherent source.
“This is a whole new concept for optical interferometry,” Pacifici said, “an entirely new device.”
In this new device, incoherent light shone on the interferometer causes the fluorescent atoms inside the center hole to generate surface plasmons. Those plasmons propagate outward from the hole, bounce off the groove rings, and propagate back toward the hole. Once a plasmon propagates back, it interacts with the atom that released it, causing an interference with the directly transmitted photon. Because the emission of a photon and the generation of a plasmon are indistinguishable alternative paths originating from the same emitter, the process is naturally coherent and interference can therefore occur even though the emitters are excited incoherently.
“The important thing here is that this is a self-interference process,” Pacifici said. “It doesn’t matter that you’re using incoherent light to excite the emitters, you still get a coherent process.”
In addition to eliminating the need for specialized external light sources, the approach has several advantages, Pacifici said. Because the surface plasmons travel out from the hole and back again, they probe the sample on top of the interferometer surface twice. That makes the device more sensitive.
But that’s not the only advantage. In the new device, external light can be projected from underneath the metal surface containing the interferometers instead of from above. That eliminates the need for complex illumination architectures on top of the sensing surface, which could make for easier integration into compact devices.
The embedded light emitters also eliminate the need to control the amount of sample liquid deposited on the interferometer’s surface. Large droplets of liquid can cause lensing effects, a bending of light that can scramble the results from the interferometer. Most plasmonic sensors make use of tiny microfluidic channels to deliver a thin film of liquid to avoid lensing problems. But with internal light emitters excited from the bottom surface, the external light never comes in contact with the sample, so lensing effects are negated, as is the need for microfluidics.
Finally, the internal emitters produce a low-intensity light. That’s good for probing delicate samples, such as proteins, that can be damaged by high-intensity light.
More work is required to get the system out of the lab and into devices, and Pacifici and his team plan to continue to refine the idea. The next step will be to try eliminating the external light source altogether. It might be possible, the researchers say, to eventually excite the internal emitters using tiny fiber optic lines, or perhaps electric current.
Still, this initial proof-of-concept is promising, Pacifici said.
“From a fundamental standpoint, we think this new device represents a significant step forward,” he said, “a first demonstration of plasmonic interferometry with incoherent light.”
One final comment, Dexter Johnson has a Feb. 18, 2016 posting about this interferometer where he references Pacifici’s past work in this area, as well as, this latest breakthrough. Dexter’s posting can be found on his Nanoclast blog which is on the IEEE (Institute of Electrical and Electronics Engineers) website.
There’s some exciting news from the University of Alberta. It emerges from a team that has reconsidered transistor architecture, from a Feb. 9, 2016 news item on ScienceDaily,
An engineering research team at the University of Alberta has invented a new transistor that could revolutionize thin-film electronic devices.
The team was exploring new uses for thin film transistors (TFT), which are most commonly found in low-power, low-frequency devices like the display screen you’re reading from now. Efforts by researchers and the consumer electronics industry to improve the performance of the transistors have been slowed by the challenges of developing new materials or slowly improving existing ones for use in traditional thin film transistor architecture, known technically as the metal oxide semiconductor field effect transistor (MOSFET).
But the U of A electrical engineering team did an end-run around the problem. Instead of developing new materials, the researchers improved performance by designing a new transistor architecture that takes advantage of a bipolar action. In other words, instead of using one type of charge carrier, as most thin film transistors do, it uses electrons and the absence of electrons (referred to as “holes”) to contribute to electrical output. Their first breakthrough was forming an ‘inversion’ hole layer in a ‘wide-bandgap’ semiconductor, which has been a great challenge in the solid-state electronics field.
Once this was achieved, “we were able to construct a unique combination of semiconductor and insulating layers that allowed us to inject ‘holes’ at the MOS interface,” said Gem Shoute, a PhD student in the Department of Electrical and Computer Engineering who is lead author on the article. Adding holes at the interface increased the chances of an electron “tunneling” across a dielectric barrier. Through this phenomenon, a type of quantum tunnelling, “we were finally able to achieve a transistor that behaves like a bipolar transistor.”
“It’s actually the best performing [TFT] device of its kind–ever,” said materials engineering professor Ken Cadien, a co-author on the paper. “This kind of device is normally limited by the non-crystalline nature of the material that they are made of.”
The dimensions of the device can be scaled with ease to improve performance and keep up with the need for miniaturization, an advantage that modern TFTs lack. The transistor has power-handling capabilities at least 10 times greater than commercially produced thin film transistors.
Electrical engineering professor Doug Barlage, who is Shoute’s PhD supervisor and one of the paper’s lead authors, says his group was determined to try new approaches and break new ground. He says the team knew it could produce a high-power thin film transistor–it was just a matter of finding out how.
“Our goal was to make a thin film transistor with the highest power handling and switching speed possible. Not many people want to look into that, but the raw properties of the film indicated dramatic performance increase was within reach,” he said. “The high quality sub-30 nanometre (a human hair is 50,000 nanometres wide) layers of materials produced by Professor Cadien’s group enabled us to successfully try these difficult concepts.”
In the end, the team took advantage of the very phenomena other researchers considered roadblocks.
“Usually tunnelling current is considered a bad thing in MOSFETs and it contributes to unnecessary loss of power, which manifests as heat,” explained Shoute. “What we’ve done is build a transistor that considers tunnelling current a benefit.”
ETA Feb. 12, 2016: Dexter Johnson has written up the research in a Feb. 11, 2016 posting (on his Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website) where he offers enthusiasm (rare) and additional explanation.
On Thursday, Dec. 31, 2015, the bear ate me (borrowed from Joan Armatrading’s song “Eating the bear”) or, if you prefer this phrase, I had a meltdown when I lost more than 1/2 of a post that I’d worked on for hours.
There’s been a problem dogging me for some months. I will write up something and save it as a draft only to find that most of the text has been replaced by a single URL repeated several times. I have not been able to track down the source of the problem, which is intermittent. (sigh)
Moving on to happier thoughts, it’s a new year. Happy 2016!
As a way of swinging into the new year, here’s a brief wrap up for 2015.
As always, I thank my international colleagues David Bruggeman (Pasco Phronesis blog), Dexter Johnson (Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website), and Dr. Andrew Maynard (2020 science blog and Risk Innovation Laboratory at Arizona State University), all of whom have been blogging as long as or longer than I have (FYI, FrogHeart began in April/May 2008). More importantly, they have been wonderful sources of information and inspiration.
In particular, David, thank you for keeping me up to date on the Canadian and international science policy situations. Also, darn you for scooping me on the Canadian science policy scene, on more than one occasion.
Dexter, thank you for all those tidbits about the science and the business of nanotechnology that you tuck into your curated blog. There’s always a revelation or two to be found in your writings.
Andrew, congratulations on your move to Arizona State University (from the University of Michigan Risk Science Center) where you are founding their Risk Innovation Lab.
While Andrew’s blog has become more focused on the topic of risk, Andrew continues to write about nanotechnology by extending the topic to emerging technologies.
In fact, I have a Dec. 3, 2015 post featuring a recent Nature article by Andrew on the occasion of the upcoming 2016 World Economic Forum in Davos. In it he discusses new approaches to risk occasioned by the rise of emerging technologies such as synthetic biology, nanotechnology, and more.
While Tim Harper, serial entrepreneur and scientist, is not actively blogging about nanotechnology these days, his writings do pop up in various places, notably on the Azonano website where he is listed as an expert, which he most assuredly is. His focus these days is on establishing graphene-based startups.
Moving on to another somewhat related topic. While no one else seems to be writing about nanotechnology as extensively as I do, there are many, many Canadian science bloggers.
Science Borealis (scroll down to get to the feeds), a Canadian science blog aggregator, is my main source of information on the Canadian scene. Thank you for my second Editors Pick award. In 2014 the award was in the Science in Society category and in 2015 it’s in the Engineering & Tech category (last item on the list).
While I haven’t yet heard about the final results of Paige Jarreau’s and Science Borealis’ joint survey of Canadian science blog readers (the reader doesn’t have to be Canadian but the science blog has to be), I was delighted to be asked and to participate. My Dec. 14, 2015 posting listed preliminary results,
They have compiled some preliminary results:
21 bloggers + Science Borealis hosted the survey.
523 respondents began the survey.
338 respondents entered their email addresses to win a prize.
63% of 400 respondents are not science bloggers.
56% of 402 respondents describe themselves as scientists.
76% of 431 respondents were not familiar with Science Borealis before taking the survey.
85% of 403 respondents often, very often, or always seek out science information online.
59% of 402 respondents rarely or never seek science content that is specifically Canadian.
And most of all, a heartfelt thank you to all who read this blog.
FrogHeart and 2015
There won’t be any statistics from the software packaged with my hosting service (AWSTATS and Webalizer). Google and its efforts to minimize spam (or so it claims) had a devastating effect on my visit numbers. As I had used those numbers as motivation, fantasizing that my readership was increasing, I had to find other means of motivation. I’m not quite sure how I managed it, but I upped publication to three posts per day (five-day week) throughout most of the year.
With 260 working days (roughly) in a year that would have meant a total of 780 posts. I’ve rounded that down to 700 posts to allow for days off and days where I didn’t manage three.
In 2015 I logged my 4000th post and substantially contributed to the Science Borealis 2015 output. In the editors’ Dec. 20, 2015 post,
… Science Borealis now boasts a membership of 122 blogs — about a dozen up from last year. Together, this year, our members have posted over 4,400 posts, with two weeks still to go….
At a rough guess, I’d estimate that FrogHeart was responsible for 15% of the Science Borealis output and 121 bloggers were responsible for the other 85%.
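For anyone checking my arithmetic, the two estimates above can be verified quickly (a sanity check on my own rounded numbers, nothing more):

```python
# Rough posting arithmetic for 2015, using the rounded figures quoted above.
working_days = 52 * 5          # roughly 260 working days in a year
max_posts = working_days * 3   # 780 posts at three per weekday
my_estimate = 700              # rounded down for days off and missed posts

borealis_total = 4400          # approximate Science Borealis 2015 output
share = my_estimate / borealis_total

print(max_posts)               # 780
print(round(share * 100))      # 16 -- close to the rough 15% guess
```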
That’s enough for 2015.
FrogHeart and 2016
Bluntly, I do not know anything other than a change of some sort is likely.
Hopefully, I will be doing more art/science projects (my last one was ‘A digital poetry of gold nanoparticles’). I was awarded a small grant ($400 CAD) from the Canadian Academy of Independent Scholars (thank you!) for a spoken word project to be accomplished later this year.
As for this blog, I hope to continue.
In closing, I think it’s only fair to share Joan Armatrading’s song, ‘Eating the bear’. May we all do so in 2016,
Duke University’s (North Carolina, US) Center for the Environmental Implications of NanoTechnology (CEINT) is back in the news. An August 18, 2015 news item on Nanotechnology Now highlights two new projects intended to launch the field of nanoinformatics,
In two new studies, researchers from across the country spearheaded by Duke University faculty have begun to design the framework on which to build the emerging field of nanoinformatics.
Nanoinformatics is, as the name implies, the combination of nanoscale research and informatics. It attempts to determine which information is relevant to the field and then develop effective ways to collect, validate, store, share, analyze, model and apply that information — with the ultimate goal of helping scientists gain new insights into human health, the environment and more.
In the first paper, published on August 10, 2015, in the Beilstein Journal of Nanotechnology, researchers begin the conversation of how to standardize the way nanotechnology data are curated.
Because the field is young and yet extremely diverse, data are collected and reported in different ways in different studies, making it difficult to compare apples to apples. Silver nanoparticles in a Florida swamp could behave entirely differently if studied in the Amazon River. And even if two studies are both looking at their effects in humans, slight variations like body temperature, blood pH levels or nanoparticles only a few nanometers larger can give different results. For future studies to combine multiple datasets to explore more complex questions, researchers must agree on what they need to know when curating nanomaterial data.
“We chose curation as the focus of this first paper because there are so many disparate efforts that are all over the road in terms of their missions, and the only thing they all have in common is that somehow they have to enter data into their resources,” said Christine Hendren, a research scientist at Duke and executive director of the Center for the Environmental Implications of NanoTechnology (CEINT). “So we chose that as the kernel of this effort to be as broad as possible in defining a baseline for the nanoinformatics community.”
The paper is the first in a series of six that will explore what people mean — their vocabulary, definitions, assumptions, research environments, etc. — when they talk about gathering data on nanomaterials in digital form. And to get everyone on the same page, the researchers are seeking input from all stakeholders, including those conducting basic research, studying environmental implications, harnessing nanomaterial properties for applications, developing products and writing government regulations.
The daunting task is being undertaken by the Nanomaterial Data Curation Initiative (NDCI), a project of the National Cancer Informatics Program Nanotechnology Working Group (NCIP NanoWG) led by a diverse team of nanomaterial data stakeholders. If successful, not only will these disparate interests be able to combine their data, but the project will also highlight what data are missing and help drive the research priorities of the field.
In the second paper, published on July 16, 2015, in Science of The Total Environment, Hendren and her colleagues at CEINT propose a new, standardized way of studying the properties of nanomaterials.
“If we’re going to move the field forward, we have to be able to agree on what measurements are going to be useful, which systems they should be measured in and what data gets reported, so that we can make comparisons,” said Hendren.
The proposed strategy uses functional assays — relatively simple tests carried out in standardized, well-described environments — to measure nanomaterial behavior in actual systems.
For some time, the nanomaterial research community has been trying to use measured nanomaterial properties to predict outcomes. For example, what size and composition of a nanoparticle is most likely to cause cancer? The problem, argues Mark Wiesner, director of CEINT, is that this question is far too complex to answer.
“Environmental researchers use a parameter called biological oxygen demand to predict how much oxygen a body of water needs to support its ecosystem,” explains Wiesner. “What we’re basically trying to do with nanomaterials is the equivalent of trying to predict the oxygen level in a lake by taking an inventory of every living organism, mathematically map all of their living mechanisms and interactions, add up all of the oxygen each would take, and use that number as an estimate. But that’s obviously ridiculous and impossible. So instead, you take a jar of water, shake it up, see how much oxygen is taken and extrapolate that. Our functional assay paper is saying do that for nanomaterials.”
The paper makes suggestions as to what nanomaterials’ “jar of water” should be. It identifies what parameters should be noted when studying a specific environmental system, like digestive fluids or wastewater, so that they can be compared down the road.
It also suggests two meaningful processes for nanoparticles that should be measured by functional assays: attachment efficiency (does it stick to surfaces or not) and dissolution rate (does it release ions).
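To make those two quantities concrete, here is a minimal sketch of how they might be computed from raw measurements. The function names, the first-order dissolution assumption, and all the numbers are mine, chosen for illustration; they are not taken from the CEINT paper:

```python
import math

def dissolution_rate_constant(mass_initial, mass_remaining, hours):
    """First-order dissolution rate constant k (per hour), assuming
    the undissolved particle mass decays as m(t) = m0 * exp(-k*t)."""
    return math.log(mass_initial / mass_remaining) / hours

def attachment_efficiency(particles_attached, particles_total):
    """Fraction of particles that stuck to the collector surface."""
    return particles_attached / particles_total

# Hypothetical measurements: 100 ug of particles drops to 80 ug of
# undissolved material after 24 hours; 350 of 1000 particles attach.
k = dissolution_rate_constant(100.0, 80.0, 24.0)
alpha = attachment_efficiency(350, 1000)
print(round(k, 4))   # 0.0093 per hour
print(alpha)         # 0.35
```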
In describing how a nanoinformatics approach informs the implementation of a functional assay testing strategy, Hendren said “We’re trying to anticipate what we want to ask the data down the road. If we’re banking all of this comparable data while doing our near-term research projects, we should eventually be able to support more mechanistic investigations to make predictions about how untested nanomaterials will behave in a given scenario.”
The first paper listed is open access, while the second paper is behind a paywall.
I’m (mostly) giving the final comments to Dexter Johnson who in an August 20, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) had this to say (Note: Links have been removed),
It can take days for a supercomputer to unravel all the data contained in a single human genome. So it wasn’t long after mapping the first human genome that researchers coined the umbrella term “bioinformatics” in which a variety of methods and computer technologies are used for organizing and analyzing all that data.
Now teams of researchers led by scientists at Duke University believe that the field of nanotechnology has reached a critical mass of data and that a new field needs to be established, dubbed “nanoinformatics.”
While being able to better organize and analyze data to study the impact of nanomaterials on the environment should benefit the field, what seems to remain a more pressing concern is having the tools for measuring nanomaterials outside of a vacuum and in water and air environments.
I gather Christine Hendren has succeeded Mark Wiesner as CEINT’s executive director.
Princeton University’s 3D printed contact lenses with LEDs (light-emitting diodes) included are not meant for use by humans or other living beings but they are a flashy demonstration. From a Dec. 10, 2014 news item on phys.org,
As part of a project demonstrating new 3-D printing techniques, Princeton researchers have embedded tiny light-emitting diodes into a standard contact lens, allowing the device to project beams of colored light.
Michael McAlpine, the lead researcher, cautioned that the lens is not designed for actual use—for one, it requires an external power supply. Instead, he said the team created the device to demonstrate the ability to “3-D print” electronics into complex shapes and materials.
“This shows that we can use 3-D printing to create complex electronics including semiconductors,” said McAlpine, an assistant professor of mechanical and aerospace engineering. “We were able to 3-D print an entire device, in this case an LED.”
A Dec. 9, 2014 Princeton University news release by John Sullivan, which originated the news item, describes the 3D lens, the objectives for this project, and an earlier project involving a ‘bionic ear’ in more detail (Note: Links have been removed),
The hard contact lens is made of plastic. The researchers used tiny crystals, called quantum dots, to create the LEDs that generated the colored light. Different size dots can be used to generate various colors.
“We used the quantum dots [also known as nanoparticles] as an ink,” McAlpine said. “We were able to generate two different colors, orange and green.”
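The size-to-color relationship mentioned above follows from quantum confinement: shrinking the crystal widens its effective bandgap, shifting emission toward the blue. Here is a rough sketch using the Brus confinement term with illustrative CdSe-like constants (round values I have assumed, not numbers from the Princeton paper; the Coulomb correction is omitted for simplicity):

```python
import math

HBAR = 1.0545718e-34   # reduced Planck constant, J*s
M0 = 9.10938e-31       # free-electron mass, kg
EV = 1.602176e-19      # joules per electron-volt

def emission_wavelength_nm(radius_nm, bulk_gap_ev=1.74, me=0.13, mh=0.45):
    """Approximate emission wavelength of a spherical quantum dot:
    bulk bandgap plus the particle-in-a-sphere confinement energy."""
    r = radius_nm * 1e-9
    confinement_j = (HBAR**2 * math.pi**2 / (2 * r**2)) * (
        1 / (me * M0) + 1 / (mh * M0))
    gap_ev = bulk_gap_ev + confinement_j / EV
    return 1240.0 / gap_ev  # photon wavelength in nm

# Smaller dots emit at shorter (bluer) wavelengths:
print(round(emission_wavelength_nm(2.0)))  # ~464
print(round(emission_wavelength_nm(3.0)))  # ~576
```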
The contact lens is also part of an ongoing effort to use 3-D printing to assemble diverse, and often hard-to-combine, materials into functioning devices. In the recent past, a team of Princeton professors including McAlpine created a bionic ear out of living cells with an embedded antenna that could receive radio signals.
Yong Lin Kong, a researcher on both projects, said the bionic ear presented a different type of challenge.
“The main focus of the bionic ear project was to demonstrate the merger of electronics and biological materials,” said Kong, a graduate student in mechanical and aerospace engineering.
Kong, the lead author of the Oct. 31 article describing the current work in the journal Nano Letters, said that the contact lens project, on the other hand, involved the printing of active electronics using diverse materials. The materials were often mechanically, chemically or thermally incompatible — for example, using heat to shape one material could inadvertently destroy another material in close proximity. The team had to find ways to handle these incompatibilities and also had to develop new methods to print electronics, rather than use the techniques commonly used in the electronics industry.
“For example, it is not trivial to pattern a thin and uniform coating of nanoparticles and polymers without the involvement of conventional microfabrication techniques, yet the thickness and uniformity of the printed films are two of the critical parameters that determine the performance and yield of the printed active device,” Kong said.
To solve these interdisciplinary challenges, the researchers collaborated with Ian Tamargo, who graduated this year with a bachelor’s degree in chemistry; Hyoungsoo Kim, a postdoctoral research associate and fluid dynamics expert in the mechanical and aerospace engineering department; and Barry Rand, an assistant professor of electrical engineering and the Andlinger Center for Energy and the Environment.
McAlpine said that one of 3-D printing’s greatest strengths is its ability to create electronics in complex forms. Unlike traditional electronics manufacturing, which builds circuits in flat assemblies and then stacks them into three dimensions, 3-D printers can create vertical structures as easily as horizontal ones.
“In this case, we had a cube of LEDs,” he said. “Some of the wiring was vertical and some was horizontal.”
To conduct the research, the team built a new type of 3-D printer that McAlpine described as “somewhere between off-the-shelf and really fancy.” Dan Steingart, an assistant professor of mechanical and aerospace engineering and the Andlinger Center, helped design and build the new printer, which McAlpine estimated cost in the neighborhood of $20,000.
McAlpine said that he does not envision 3-D printing replacing traditional manufacturing in electronics any time soon; instead, they are complementary technologies with very different strengths. Traditional manufacturing, which uses lithography to create electronic components, is a fast and efficient way to make multiple copies with a very high reliability. Manufacturers are using 3-D printing, which is slow but easy to change and customize, to create molds and patterns for rapid prototyping.
Prime uses for 3-D printing are situations that demand flexibility and that need to be tailored to a specific use. For example, conventional manufacturing techniques are not practical for medical devices that need to be fit to a patient’s particular shape or devices that require the blending of unusual materials in customized ways.
“Trying to print a cellphone is probably not the way to go,” McAlpine said. “It is customization that gives the power to 3-D printing.”
In this case, the researchers were able to custom 3-D print electronics on a contact lens by first scanning the lens, and feeding the geometric information back into the printer. This allowed for conformal 3-D printing of an LED on the contact lens.
Here’s what the contact lens looks like,
Michael McAlpine, an assistant professor of mechanical and aerospace engineering at Princeton, is leading a research team that uses 3-D printing to create complex electronics devices such as this light-emitting diode printed in a plastic contact lens. (Photos by Frank Wojciechowski)
Also, here’s a link to and a citation for the research paper,
3D Printed Quantum Dot Light-Emitting Diodes by Yong Lin Kong, Ian A. Tamargo, Hyoungsoo Kim, Blake N. Johnson, Maneesh K. Gupta, Tae-Wook Koh, Huai-An Chin, Daniel A. Steingart, Barry P. Rand, and Michael C. McAlpine. Nano Lett., 2014, 14 (12), pp 7017–7023 DOI: 10.1021/nl5033292 Publication Date (Web): October 31, 2014
I’m always a day behind on Dexter Johnson’s postings on the Nanoclast blog (located on the IEEE [Institute of Electrical and Electronics Engineers] website) so I didn’t see his Dec. 11, 2014 post about these 3D printed LED-embedded contact lenses until this morning (Dec. 12, 2014). In any event, I’m excerpting his very nice description of quantum dots,
The LED was made out of the somewhat exotic nanoparticles known as quantum dots. Quantum dots are nanocrystals that have been fashioned out of semiconductor materials and possess distinct optoelectronic properties, most notably fluorescence, which makes them applicable in this case for the LEDs of the contact lens.
“We used the quantum dots [also known as nanoparticles] as an ink,” McAlpine said. “We were able to generate two different colors, orange and green.”
I encourage you to read Dexter’s post as he provides additional insights based on his long-standing membership within the nanotechnology community.