Tag Archives: Massachusetts Institute of Technology

Swallow your technology and wear it inside (wearable tech: 2 of 3)

While there are a number of wearable and fashionable pieces of technology that monitor heart rate and breathing, they are all worn on the outside of your body. Researchers are working on an alternative that can be swallowed and will monitor vital signs from within the gastrointestinal tract. I believe this is a prototype of the device,

This ingestible electronic device invented at MIT can measure heart rate and respiratory rate from inside the gastrointestinal tract. Image: Albert Swiston/MIT Lincoln Laboratory Courtesy: MIT

From a Nov. 18, 2015 news item on phys.org,

This type of sensor could make it easier to assess trauma patients, monitor soldiers in battle, perform long-term evaluation of patients with chronic illnesses, or improve training for professional and amateur athletes, the researchers say.

The new sensor calculates heart and breathing rates from the distinctive sound waves produced by the beating of the heart and the inhalation and exhalation of the lungs.

“Through characterization of the acoustic wave, recorded from different parts of the GI tract, we found that we could measure both heart rate and respiratory rate with good accuracy,” says Giovanni Traverso, a research affiliate at MIT’s Koch Institute for Integrative Cancer Research, a gastroenterologist at Massachusetts General Hospital, and one of the lead authors of a paper describing the device in the Nov. 18 issue of the journal PLOS One.

A Nov. 18, 2015 Massachusetts Institute of Technology (MIT) news release by Anne Trafton, which originated the news item, further explains the research,

Doctors currently measure vital signs such as heart and respiratory rate using techniques including electrocardiograms (ECG) and pulse oximetry, which require contact with the patient’s skin. These vital signs can also be measured with wearable monitors, but those are often uncomfortable to wear.

Inspired by existing ingestible devices that can measure body temperature, and others that take internal digestive-tract images, the researchers set out to design a sensor that would measure heart and respiratory rate, as well as temperature, from inside the digestive tract.

The simplest way to achieve this, they decided, would be to listen to the body using a small microphone. Listening to the sounds of the chest is one of the oldest medical diagnostic techniques, practiced by Hippocrates in ancient Greece. Since the 1800s, doctors have used stethoscopes to listen to these sounds.

The researchers essentially created “an extremely tiny stethoscope that you can swallow,” Swiston says. “Using the same sensor, we can collect both your heart sounds and your lung sounds. That’s one of the advantages of our approach — we can use one sensor to get two pieces of information.”

To translate these acoustic data into heart and breathing rates, the researchers had to devise signal processing systems that distinguish the sounds produced by the heart and lungs from each other, as well as from background noise produced by the digestive tract and other parts of the body.
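
To make that signal-processing step a little more concrete, here's a minimal sketch of my own (not the researchers' actual pipeline) showing how heart and respiratory rates might be pulled out of a single acoustic recording with band-pass filtering and peak detection; the frequency bands and peak spacings are illustrative assumptions,

```python
# Illustrative sketch only -- not the researchers' algorithm. Assumed bands:
# heart sounds roughly 20-150 Hz, breath sounds roughly 200-800 Hz.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, find_peaks

def bandpass(x, fs, lo, hi, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def heart_and_breath_rates(audio, fs):
    """Return (beats per minute, breaths per minute) from one microphone signal."""
    # Heart sounds: low-frequency band; the envelope peaks once per beat.
    heart_env = np.abs(hilbert(bandpass(audio, fs, 20.0, 150.0)))
    beats, _ = find_peaks(heart_env, distance=int(0.4 * fs))    # beats >= 0.4 s apart
    # Breath sounds: higher band; the slow amplitude envelope tracks breathing.
    breath_env = np.abs(hilbert(bandpass(audio, fs, 200.0, 800.0)))
    b, a = butter(2, 1.0 / (fs / 2), btype="low")               # smooth to ~1 Hz
    breath_env = filtfilt(b, a, breath_env)
    breaths, _ = find_peaks(breath_env, distance=int(1.5 * fs)) # breaths >= 1.5 s apart
    minutes = len(audio) / fs / 60.0
    return len(beats) / minutes, len(breaths) / minutes
```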

The entire sensor is about the size of a multivitamin pill and consists of a tiny microphone packaged in a silicone capsule, along with electronics that process the sound and wirelessly send radio signals to an external receiver, with a range of about 3 meters.

In tests along the GI tract of pigs, the researchers found that the device could accurately pick up heart rate and respiratory rate, even when conditions such as the amount of food being digested were varied.

“The authors introduce some interesting and radically different approaches to wearable physiological status monitors, in which the devices are not worn on the skin or on clothing, but instead reside, in a transient fashion, inside the gastrointestinal tract. The resulting capabilities provide a powerful complement to those found in wearable technologies as traditionally conceived,” says John Rogers, a professor of materials science and engineering at the University of Illinois who was not part of the research team.

Better diagnosis

The researchers expect that the device would remain in the digestive tract for only a day or two, so for longer-term monitoring, patients would swallow new capsules as needed.

For the military, this kind of ingestible device could be useful for monitoring soldiers for fatigue, dehydration, tachycardia, or shock, the researchers say. When combined with a temperature sensor, it could also detect hypothermia, hyperthermia, or fever from infections.

In the future, the researchers plan to design sensors that could diagnose heart conditions such as abnormal heart rhythms (arrhythmias), or breathing problems including emphysema or asthma. Currently doctors require patients to wear a harness (Holter) monitor for up to a week to detect such problems, but these often fail to produce a diagnosis because patients are uncomfortable wearing them 24 hours a day.

“If you could ingest a device that would listen for those pathological sounds, rather than wearing an electrical monitor, that would improve patient compliance,” Swiston says.

The researchers also hope to create sensors that would not only diagnose a problem but also deliver a drug to treat it.

“We hope that one day we’re able to detect certain molecules or a pathogen and then deliver an antibiotic, for example,” Traverso says. “This development provides the foundation for that kind of system down the line.”

MIT has provided a video with two of the researchers describing their work and plans for its future development,

Here’s a link to and a citation for the paper,

Physiologic Status Monitoring via the Gastrointestinal Tract by G. Traverso, G. Ciccarelli, S. Schwartz, T. Hughes, T. Boettcher, R. Barman, R. Langer, & A. Swiston. PLOS ONE DOI: 10.1371/journal.pone.0141666 Published: November 18, 2015

This paper is open access.

Note added Nov. 25, 2015 at 1625 hours PST: US National Public Radio (NPR) has a story on this research. You can find the Nov. 23, 2015 podcast (about six minutes), along with a series of textual excerpts from it featuring Albert Swiston, biomaterials scientist at MIT, and Stephen Shankland, senior writer for CNET covering digital technology, here.

Royal Institution, science, and nanotechnology 101 and #RE_IMAGINE at the London College of Fashion

I’m featuring two upcoming events in London (UK).

Nanotechnology 101: The biggest thing you’ve never seen

Gold Nanowire Array
Credit: lacomj via Flickr: www.flickr.com/photos/40137058@N07/3790862760 [downloaded from http://www.rigb.org/whats-on/events-2015/october/public-nanotechnology-101-the-biggest-thing-you]

Already sold out, this event is scheduled for Oct. 20, 2015. Here’s why you might want to put yourself on a waiting list, from the Royal Institution’s Nanotechnology 101 event page,

How could nanotechnology be used to create smart and extremely resilient materials? Or to boil water three times faster? Join former NASA Nanotechnology Project Manager Michael Meador to learn about the fundamentals of nanotechnology—what it is and why it’s unique—and how this emerging, disruptive technology will change the world. From invisibility cloaks to lightweight fuel-efficient vehicles and a cure for cancer, nanotechnology might just be the biggest thing you can’t see.

About the speaker

Michael Meador is currently Director of the U.S. National Nanotechnology Coordination Office, on secondment from NASA where he had been managing the Nanotechnology Project in the Game Changing Technology Program, working to mature nanotechnologies with high potential for impact on NASA missions. One part of his current job is to communicate nanotechnology research to policy-makers and the public.

Here’s some logistical information from the event page,

7.00pm to 8.30pm, Tuesday 20 October
The Theatre

Standard £12
Concession £8
Associate £6
Free to Members, Faraday Members and Fellows

For anyone who may not know offhand where the Royal Institution and its theatre are located,

The Royal Institution of Great Britain
21 Albemarle Street

+44 (0) 20 7409 2992
(9.00am – 6.00pm Mon – Fri)

Here’s a description of the Royal Institution from its Wikipedia entry (Note: Links have been removed),

The Royal Institution of Great Britain (often abbreviated as the Royal Institution or RI) is an organisation devoted to scientific education and research, based in London.

The Royal Institution was founded in 1799 by the leading British scientists of the age, including Henry Cavendish and its first president, George Finch, the 9th Earl of Winchilsea,[1] for

diffusing the knowledge, and facilitating the general introduction, of useful mechanical inventions and improvements; and for teaching, by courses of philosophical lectures and experiments, the application of science to the common purposes of life.
— [2]

Much of its initial funding and the initial proposal for its founding were given by the Society for Bettering the Conditions and Improving the Comforts of the Poor, under the guidance of philanthropist Sir Thomas Bernard and American-born British scientist Sir Benjamin Thompson, Count Rumford. Since its founding it has been based at 21 Albemarle Street in Mayfair. Its Royal Charter was granted in 1800. The Institution announced in January 2013 that it was considering sale of its Mayfair headquarters to meet its mounting debts.[3]


While this isn’t a nanotechnology event, it does touch on topics discussed here many times: wearable technology, futuristic fashion, and the integration of technology into the body. The Digital Anthropology Lab (of the London College of Fashion, which is part of the University of the Arts London) is being officially launched with a special event on Oct. 16, 2015. Before describing the event, here’s more about the Digital Anthropology Lab from its homepage,

Crafting fashion experience digitally

The Digital Anthropology Lab, launching in Autumn 2015, London College of Fashion, University of the Arts London is a research studio bringing industry and academia together to develop a new way of making smarter with technology.

The Digital Anthropology Lab, London College of Fashion, experiments with artefacts, communities, consumption and making in the digital space, using 3D printing, body scanning, code and electronics. We focus on an experimental approach to digital anthropology, allowing us to practically examine future ways in which digital collides with the human experience. We connect commercial partners to leading research academics and graduate students, exploring seed ideas for fashion tech.


We radically re-imagine this emerging fashion-tech space, exploring both the beautification of technology for wearables and critically explore the ‘why.’


Join us to experiment with, ‘The Internet of Fashion Things.’ Where the Internet of Things, invisible big data technologies, virtual fit and meta-data collide.


With the luxury of the imagination, we aim to re-wire our digital ambitions and think again about designing future digital fashion experiences for generation 2050.

Here’s information from the Sept. 30, 2015 announcement I received via email,

The Digital Anthropology Lab at London College of Fashion, UAL invites you to #RE_IMAGINE: A forum exploring the now, near and future of fashion technology.

#RE_IMAGINE, the Digital Anthropology Lab’s launch event, will present a fantastically diverse range of digital speakers and ask them to respond to the question – ‘Where are our digital selves heading?’

Join us to hear from pioneers, risk takers, entrepreneurs, designers and inventors including Ian Livingston CBE, Luke Robert Mason from New Bionics, Katie Baron from Stylus, J. Meejin Yoon from MIT among others. Also come to see what happened when we made fashion collide with the Internet of Things, they are wearable but not as you know it…

#RE_IMAGINE aims to be an informative, networked and enlightening brainstorm of a day. To book your place please follow this link.

To coincide with the exhibition Digital Disturbances, Fashion Space Gallery presents a late night opening event. Alongside a curator tour will be a series of interactive demonstrations and displays which bring together practitioners working across design, science and technology to investigate possible human and material futures. We’d encourage you to stay and enjoy this networking opportunity.

Friday 16th October 2015

9.30am – 5pm – Forum event 

5pm – 8.30pm – Digital Disturbances networking event

London College of Fashion

20 John Princes Street
W1G 0BJ 

Ticket prices are £75.00 for a standard ticket and £35.00 for concession tickets (more details here).

For more #RE_IMAGINE specifics, there’s the event’s Agenda page. As for Digital Disturbances, here’s more from the Fashion Space Gallery’s Exhibition homepage,

Digital Disturbances

11th September – 12th December 2015

Digital Disturbances examines the influence of digital concepts and tools on fashion. It provides a lens onto the often strange effects that emerge from interactions across material and virtual platforms – information both lost and gained in the process of translation. It presents the work of seven designers and creative teams whose work documents these interactions and effects, both in the design and representation of fashion. They can be traced across the surfaces of garments, through the realisation of new silhouettes, in the remixing of images and bodies in photography and film, and into the nuances of identity projected into social and commercial spaces.

Designers include: ANREALAGE, Bart Hess, POSTmatter, Simone C. Niquille and Alexander Porter, Flora Miranda, Texturall and Tigran Avetisyan.

Digital Disturbances is curated by Leanne Wierzba.

Two events—two peeks into the future.

US National Institute of Standards and Technology and molecules made of light (lightsabres anyone?)

As I recall, lightsabres are a Star Wars invention. I gather we’re a long way from running around with lightsabres, but there is hope, if that should be your dream, according to a Sept. 9, 2015 news item on Nanowerk,

… a team including theoretical physicists from JQI [Joint Quantum Institute] and NIST [US National Institute of Standards and Technology] has taken another step toward building objects out of photons, and the findings hint that weightless particles of light can be joined into a sort of “molecule” with its own peculiar force.

Here’s an artist’s conception of the light “molecule” provided by the researchers,

Researchers show that two photons, depicted in this artist’s conception as waves (left and right), can be locked together at a short distance. Under certain conditions, the photons can form a state resembling a two-atom molecule, represented as the blue dumbbell shape at center. Credit: E. Edwards/JQI

A Sept. 8, 2015 NIST news release (also available on EurekAlert*), which originated the news item, provides more information about the research (Note: Links have been removed),

The findings build on previous research that several team members contributed to before joining NIST. In 2013, collaborators from Harvard, Caltech and MIT found a way to bind two photons together so that one would sit right atop the other, superimposed as they travel. Their experimental demonstration was considered a breakthrough, because no one had ever constructed anything by combining individual photons—inspiring some to imagine that real-life lightsabers were just around the corner.

Now, in a paper forthcoming in Physical Review Letters, the NIST and University of Maryland-based team (with other collaborators) has shown theoretically that by tweaking a few parameters of the binding process, photons could travel side by side, a specific distance from each other. The arrangement is akin to the way that two hydrogen atoms sit next to each other in a hydrogen molecule.

“It’s not a molecule per se, but you can imagine it as having a similar kind of structure,” says NIST’s Alexey Gorshkov. “We’re learning how to build complex states of light that, in turn, can be built into more complex objects. This is the first time anyone has shown how to bind two photons a finite distance apart.”
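
For readers who like equations, here's a schematic of my own (suggested by the paper's title rather than taken from its contents) for picturing what binding two photons a finite distance apart means: treat the photons' relative coordinate as a particle in an attractive, Coulomb-like effective potential, and a negative-energy bound state then sits at a finite separation,

```latex
% Schematic only: the photons' relative coordinate r treated as a particle of
% effective mass m* in an attractive, Coulomb-like effective potential.
\[
  \Bigl[-\frac{\hbar^{2}}{2m^{*}}\frac{d^{2}}{dr^{2}} + V_{\mathrm{eff}}(r)\Bigr]\psi(r) = E\,\psi(r),
  \qquad
  V_{\mathrm{eff}}(r) \sim -\frac{C}{|r|}, \quad C > 0,
\]
\[
  E < 0 \;\Longrightarrow\; \text{a bound state, with } |\psi(r)|^{2}
  \text{ peaked at a finite photon--photon separation.}
\]
```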

While the new findings appear to be a step in the right direction—if we can build a molecule of light, why not a sword?—Gorshkov says he is not optimistic that Jedi Knights will be lining up at NIST’s gift shop anytime soon. The main reason is that binding photons requires extreme conditions difficult to produce with a roomful of lab equipment, let alone fit into a sword’s handle. Still, there are plenty of other reasons to make molecular light—humbler than lightsabers, but useful nonetheless.

“Lots of modern technologies are based on light, from communication technology to high-definition imaging,” Gorshkov says. “Many of them would be greatly improved if we could engineer interactions between photons.”

For example, engineers need a way to precisely calibrate light sensors, and Gorshkov says the findings could make it far easier to create a “standard candle” that shines a precise number of photons at a detector. Perhaps more significant to industry, binding and entangling photons could allow computers to use photons as information processors, a job that electronic switches in your computer do today.

Not only would this provide a new basis for creating computer technology, but it also could result in substantial energy savings. Phone messages and other data that currently travel as light beams through fiber optic cables have to be converted into electrons for processing—an inefficient step that wastes a great deal of electricity. If both the transport and the processing of the data could be done with photons directly, it could reduce these energy losses.

Gorshkov says it will be important to test the new theory in practice for these and other potential benefits.

“It’s a cool new way to study photons,” he says. “They’re massless and fly at the speed of light. Slowing them down and binding them may show us other things we didn’t know about them before.”

Here are links and citations for the paper. First, there’s an early version on arXiv.org, and then there’s the peer-reviewed version, which is not yet available,

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, A. V. Gorshkov. arXiv:1505.03859 [quant-ph] (or arXiv:1505.03859v1 [quant-ph] for this version)

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, and A. V. Gorshkov.
Phys. Rev. Lett. forthcoming in September 2015.

The first version (arXiv) is open access; I’m not sure whether the Physical Review Letters version will be behind a paywall or available as an open access paper.

*EurekAlert link added 10:34 am PST on Sept. 11, 2015.

People for the Ethical Treatment of Animals (PETA) and a grant for in vitro nanotoxicity testing

This grant seems to have gotten its start at a workshop held at the US Environmental Protection Agency (EPA) in Washington, D.C., Feb. 24-25, 2015, as per this webpage on the People for the Ethical Treatment of Animals (PETA) International Science Consortium Limited website,

The invitation-only workshop included experts from different sectors (government, industry, academia and NGO) and disciplines (in vitro and in vivo inhalation studies of NMs, fibrosis, dosimetry, fluidic models, aerosol engineering, and regulatory assessment). It focused on the technical details for the development and preliminary assessment of the relevance and reliability of an in vitro test to predict the development of pulmonary fibrosis in cells co-cultured at the air-liquid interface following exposure to aerosolized multi-walled carbon nanotubes (MWCNTs). During the workshop, experts made recommendations on cell types, exposure systems, endpoints and dosimetry considerations required to develop the in vitro model for hazard identification of MWCNTs.

The method is intended to be included in a non-animal test battery to reduce and eventually replace the use of animals in studies to assess the inhalation toxicity of engineered NMs. The long-term vision is to develop a battery of in silico and in vitro assays that can be used in an integrated testing strategy, providing comprehensive information on biological endpoints relevant to inhalation exposure to NMs which could be used in the hazard ranking of substances in the risk assessment process.

A September 1, 2015 news item on Azonano provides an update,

The PETA International Science Consortium Ltd. announced today the winners of a $200,000 award for the design of an in vitro test to predict the development of lung fibrosis in humans following exposure to nanomaterials, such as multi-walled carbon nanotubes.

Professor Dr. Barbara Rothen-Rutishauser of the Adolphe Merkle Institute at the University of Fribourg, Switzerland and Professor Dr. Vicki Stone of the School of Life Sciences at Heriot-Watt University, Edinburgh, U.K. will jointly develop the test method. Professor Rothen-Rutishauser co-chairs the BioNanomaterials research group at the Adolphe Merkle Institute, where her research is focused on the study of nanomaterial-cell interactions in the lung using three-dimensional cell models. Professor Vicki Stone is the Director of the Nano Safety Research Group at Heriot-Watt University and the Director of Toxicology for SAFENANO.

The Science Consortium is also funding MatTek Corporation for the development of a three-dimensional reconstructed primary human lung tissue model to be used in Professors Rothen-Rutishauser and Stone’s work. MatTek Corporation has extensive expertise in manufacturing human cell-based, organotypic in vitro models for use in regulatory and basic research applications. The work at MatTek will be led by Dr. Patrick Hayden, Vice President of Scientific Affairs, and Dr. Anna Maione, head of MatTek’s airway models research group.

I was curious about MatTek Corporation and found this on the company’s About Us webpage,

MatTek Corporation was founded in 1985 by two chemical engineering professors from MIT. In 1991 the company leveraged its core polymer surface modification technology into the emerging tissue engineering market.

MatTek Corporation is at the forefront of tissue engineering and is a world leader in the production of innovative 3D reconstructed human tissue models. Our skin, ocular, and respiratory tissue models are used in regulatory toxicology (OECD, EU guidelines) and address toxicology and efficacy concerns throughout the cosmetics, chemical, pharmaceutical and household product industries.

EpiDerm™, MatTek’s first 3D human cell based in vitro model, was introduced in 1993 and became an immediate technical and commercial success.

I wish them good luck in their research on developing better ways to test toxicity.

Carbon nanotubes as sensors in the body

Rachel Ehrenberg has written an Aug. 21, 2015 news item about the latest and greatest carbon nanotube-based biomedical sensors for the journal Nature,

The future of medical sensors may be going down the tubes. Chemists are developing tiny devices made from carbon nanotubes wrapped with polymers to detect biologically important compounds such as insulin, nitric oxide and the blood-clotting protein fibrinogen. The hope is that these sensors could simplify and automate diagnostic tests.

Preliminary experiments in mice, reported by scientists at a meeting of the American Chemical Society in Boston, Massachusetts, this week [Aug. 16 – 20, 2015], suggest that the devices are safe to introduce into the bloodstream or implant under the skin. Researchers also presented data showing that the nanotube–polymer complexes could measure levels of large molecules, a feat that has been difficult for existing technologies.

Ehrenberg focuses on one laboratory in particular (Note: Links have been removed),

“Anything the body makes, it is meant to degrade,” says chemical engineer Michael Strano, whose lab at the Massachusetts Institute of Technology (MIT) in Cambridge is behind much of the latest work [1]. “Our vision is to make a sensing platform that can monitor a whole range of molecules, and do it in the long term.”

To design one sensor, MIT researchers coated nanotubes with a mix of polymers and nucleotides and screened for configurations that would bind to the protein fibrinogen. This large molecule is important for building blood clots; its concentration can indicate bleeding disorders, liver disease or impending cardiovascular trouble. The team recently hit on a material that worked — a first for such a large molecule, according to MIT nanotechnology specialist Gili Bisker. Bisker said at the chemistry meeting that the fibrinogen-detecting nanotubes could be used to measure levels of the protein in blood samples, or implanted in body tissue to detect changing fibrinogen levels that might indicate a clot.

The MIT team has also developed [2] a sensor that can be inserted beneath the skin to monitor glucose or insulin levels in real time, Bisker reported. The team imagines putting a small patch that contains a wireless device on the skin just above the embedded sensor. The patch would shine light on the sensor and measure its fluorescence, then transmit that data to a mobile phone for real-time monitoring.
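
As a rough illustration of what the patch-and-phone combination would have to do with that fluorescence signal, here's a sketch of my own (with invented calibration numbers, not anything from MIT) that turns a measured intensity into a glucose estimate by interpolating a calibration curve,

```python
# Illustrative sketch only -- not MIT's device or firmware. The calibration
# numbers below are invented purely to show the idea.
import numpy as np

# Hypothetical calibration curve: fluorescence intensity (a.u.) vs glucose (mg/dL).
CAL_INTENSITY = np.array([0.10, 0.25, 0.45, 0.70, 0.90])
CAL_GLUCOSE   = np.array([40.0, 70.0, 100.0, 160.0, 250.0])

def glucose_from_fluorescence(intensity: float) -> float:
    """Interpolate the calibration curve (values outside it are clamped)."""
    return float(np.interp(intensity, CAL_INTENSITY, CAL_GLUCOSE))

reading = 0.52  # one fluorescence sample relayed by the skin patch (assumed units)
print(f"estimated glucose: {glucose_from_fluorescence(reading):.0f} mg/dL")
```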

Another version of the sensor, developed [3] at MIT by biomedical engineer Nicole Iverson and colleagues, detects nitric oxide. This signalling molecule typically indicates inflammation and is associated with many cancer cells. When embedded in a hydrogel matrix, the sensor kept working in mice for more than 400 days and caused no local inflammation, MIT chemical engineer Michael Lee reported. The nitric oxide sensors also performed well when injected into the bloodstreams of mice, successfully passing through small capillaries in the lungs, which are an area of concern for nanotube toxicity. …

There’s at least one corporate laboratory (Google X) working on biosensors, although its focus is a little different. From a Jan. 9, 2015 article by Brian Womack and Anna Edney for BloombergBusiness,

Google Inc. sent employees with ties to its secretive X research group to meet with U.S. regulators who oversee medical devices, raising the possibility of a new product that may involve biosensors from the unit that developed computerized glasses.

The meeting included at least four Google workers, some of whom have connections with Google X — and have done research on sensors, including contact lenses that help wearers monitor their biological data. Google staff met with those at the Food and Drug Administration who regulate eye devices and diagnostics for heart conditions, according to the agency’s public calendar. [emphasis mine]

This approach from Google is considered noninvasive,

“There is actually one interface on the surface of the body that can literally provide us with a window of what happens inside, and that’s the surface of the eye,” Parviz [Babak Parviz, … was involved in the Google Glass project and has talked about putting displays on contact lenses, including lenses that monitor wearer’s health]  said in a video posted on YouTube. “It’s a very interesting chemical interface.”

Of course, the assumption is that all this monitoring is going to result in healthier people, but I can’t help thinking about an old saying: ‘a little knowledge can be a dangerous thing’. For example, we lived in a world where bacteria roamed free and then we learned how to make them visible, determined they were disease-causing, and began campaigns for killing them off. Now, it turns out that at least some bacteria are good for us and, moreover, we’ve created other, more dangerous bacteria that are drug-resistant. Based on the bacteria example, is it possible that with these biosensors we will observe new phenomena and make similar mistakes?

Scaling graphene production up to industrial strength

If graphene is going to be a ubiquitous material in the future, production methods need to change. An Aug. 7, 2015 news item on Nanowerk announces a new technique to achieve that goal,

Producing graphene in bulk is critical when it comes to the industrial exploitation of this exceptional two-dimensional material. To that end, [European Commission] Graphene Flagship researchers have developed a novel variant on the chemical vapour deposition process which yields high quality material in a scalable manner. This advance should significantly narrow the performance gap between synthetic and natural graphene.

An Aug. 7, 2015 European Commission Graphene Flagship press release by Francis Sedgemore, which originated the news item, describes the problem,

Media-friendly Nobel laureates peeling layers of graphene from bulk graphite with sticky tape may capture the public imagination, but as a manufacturing process the technique is somewhat lacking. Mechanical exfoliation may give us pristine graphene, but industry requires scalable and cost-effective production processes with much higher yields.

On to the new method (from the press release),

Flagship-affiliated physicists from RWTH Aachen University and Forschungszentrum Jülich have together with colleagues in Japan devised a method for peeling graphene flakes from a CVD substrate with the help of intermolecular forces. …

Key to the process is the strong van der Waals interaction that exists between graphene and hexagonal boron nitride, another 2d material within which it is encapsulated. The van der Waals force is the attractive sum of short-range electric dipole interactions between uncharged molecules.

Thanks to strong van der Waals interactions between graphene and boron nitride, CVD graphene can be separated from the copper and transferred to an arbitrary substrate. The process allows for re-use of the catalyst copper foil in further growth cycles, and minimises contamination of the graphene due to processing.

Raman spectroscopy and transport measurements on the graphene/boron nitride heterostructures reveal high electron mobilities comparable with those observed in similar assemblies based on exfoliated graphene. Furthermore – and this comes as something of a surprise to the researchers – no noticeable performance changes are detected between devices developed in the first and subsequent growth cycles. This confirms the copper as a recyclable resource in the graphene fabrication process.

“Chemical vapour deposition is a highly scalable and cost-efficient technology,” says Christoph Stampfer, head of the 2nd Institute of Physics A in Aachen, and co-author of the technical article. “Until now, graphene synthesised this way has been significantly lower in quality than that obtained with the scotch-tape method, especially when it comes to the material’s electronic properties. But no longer. We demonstrate a novel fabrication process based on CVD that yields ultra-high quality synthetic graphene samples. The process is in principle suitable for industrial-scale production, and narrows the gap between graphene research and its technological applications.”

With their dry-transfer process, Banszerus and his colleagues have shown that the electronic properties of CVD-grown graphene can in principle match those of ultrahigh-mobility exfoliated graphene. The key is to transfer CVD graphene from its growth substrate in such a way that chemical contamination is avoided. The high mobility of pristine graphene is thus preserved, and the approach allows for the substrate material to be recycled without degradation.

Here’s a link to and citation for the paper,

Ultrahigh-mobility graphene devices from chemical vapor deposition on reusable copper by Luca Banszerus, Michael Schmitz, Stephan Engels, Jan Dauber, Martin Oellers, Federica Haupt, Kenji Watanabe, Takashi Taniguchi, Bernd Beschoten, and Christoph Stampfer. Science Advances  31 Jul 2015: Vol. 1, no. 6, e1500222 DOI: 10.1126/sciadv.1500222

This article appears to be open access.

For those interested in finding out more about chemical vapour deposition (CVD), David Chandler has written a June 19, 2015 article for the Massachusetts Institute of Technology (MIT) titled: Explained: chemical vapor deposition (Technique enables production of pure, uniform coatings of metals or polymers, even on contoured surfaces.)

Nanoscale imaging of a mouse brain

Researchers have developed a new brain imaging tool they would like to use as a founding element for a national brain observatory. From a July 30, 2015 news item on Azonano,

A new imaging tool developed by Boston scientists could do for the brain what the telescope did for space exploration.

In the first demonstration of how the technology works, published July 30 in the journal Cell, the researchers look inside the brain of an adult mouse at a scale previously unachievable, generating images at a nanoscale resolution. The inventors’ long-term goal is to make the resource available to the scientific community in the form of a national brain observatory.

A July 30, 2015 Cell Press news release on EurekAlert, which originated the news item, expands on the theme,

“I’m a strong believer in bottom-up science, which is a way of saying that I would prefer to generate a hypothesis from the data and test it,” says senior study author Jeff Lichtman, of Harvard University. “For people who are imagers, being able to see all of these details is wonderful and we’re getting an opportunity to peer into something that has remained somewhat intractable for so long. It’s about time we did this, and it is what people should be doing about things we don’t understand.”

The researchers have begun the process of mining their imaging data by looking first at an area of the brain that receives sensory information from mouse whiskers, which help the animals orient themselves and are even more sensitive than human fingertips. The scientists used a program called VAST, developed by co-author Daniel Berger of Harvard and the Massachusetts Institute of Technology, to assign different colors and piece apart each individual “object” (e.g., neuron, glial cell, blood vessel cell, etc.).
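
VAST itself is far more sophisticated, but the core idea of giving every segmented 'object' its own color can be sketched in a few lines (a toy example of my own, not VAST's code),

```python
# Toy example only -- not VAST's code. Gives each labelled "object" in a
# segmented volume (neuron, glial cell, blood vessel, ...) its own color.
import numpy as np

def colorize_labels(labels: np.ndarray, seed: int = 0) -> np.ndarray:
    """labels: integer array of segment IDs (0 = background). Returns an RGB volume."""
    rng = np.random.default_rng(seed)
    n_segments = int(labels.max()) + 1
    palette = rng.integers(40, 255, size=(n_segments, 3), dtype=np.uint8)
    palette[0] = 0                      # keep the background black
    return palette[labels]              # one color per voxel, chosen by segment ID

# A 4x4x4 toy segmentation with three "objects".
toy = np.zeros((4, 4, 4), dtype=np.int64)
toy[0, :2, :2] = 1
toy[1:3, 2:, 2:] = 2
toy[3, :, 0] = 3
rgb = colorize_labels(toy)              # shape (4, 4, 4, 3)
```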

“The complexity of the brain is much more than what we had ever imagined,” says study first author Narayanan “Bobby” Kasthuri, of the Boston University School of Medicine. “We had this clean idea of how there’s a really nice order to how neurons connect with each other, but if you actually look at the material it’s not like that. The connections are so messy that it’s hard to imagine a plan to it, but we checked and there’s clearly a pattern that cannot be explained by randomness.”

The researchers see great potential in the tool’s ability to answer questions about what a neurological disorder actually looks like in the brain, as well as what makes the human brain different from other animals and different between individuals. Who we become is very much a product of the connections our neurons make in response to various life experiences. To be able to compare the physical neuron-to-neuron connections in an infant, a mathematical genius, and someone with schizophrenia would be a leap in our understanding of how our brains shape who we are (or vice versa).

The cost and data storage demands for this type of research are still high, but the researchers expect expenses to drop over time (as has been the case with genome sequencing). To facilitate data sharing, the scientists are now partnering with Argonne National Laboratory with the hopes of creating a national brain laboratory that neuroscientists around the world can access within the next few years.

“It’s bittersweet that there are many scientists who think this is a total waste of time as well as a big investment in money and effort that could be better spent answering questions that are more proximal,” Lichtman says. “As long as data is showing you things that are unexpected, then you’re definitely doing the right thing. And we are certainly far from being out of the surprise element. There’s never a time when we look at this data that we don’t see something that we’ve never seen before.”

Here’s a link to and a citation for the paper,

Saturated Reconstruction of a Volume of Neocortex by Narayanan Kasthuri, Kenneth Jeffrey Hayworth, Daniel Raimund Berger, Richard Lee Schalek, José Angel Conchello, Seymour Knowles-Barley, Dongil Lee, Amelio Vázquez-Reina, Verena Kaynig, Thouis Raymond Jones, Mike Roberts, Josh Lyskowski Morgan, Juan Carlos Tapia, H. Sebastian Seung, William Gray Roncal, Joshua Tzvi Vogelstein, Randal Burns, Daniel Lewis Sussman, Carey Eldin Priebe, Hanspeter Pfister, Jeff William Lichtman. Cell Volume 162, Issue 3, p648–661, 30 July 2015 DOI: http://dx.doi.org/10.1016/j.cell.2015.06.054

This appears to be an open access paper.

IBM and its working 7nm test chip

I wrote about IBM and its plans for a 7nm computer chip last year in a July 11, 2014 posting, which featured IBM along with mention of HP Labs and other companies’ plans for shrinking their computer chips. Almost one year later, IBM has announced, in a July 9, 2015 IBM news release on PRnewswire.com, the accomplishment of a working 7nm test chip,

An alliance led by IBM Research (NYSE: IBM) today announced that it has produced the semiconductor industry’s first 7nm (nanometer) node test chips with functioning transistors.  The breakthrough, accomplished in partnership with GLOBALFOUNDRIES and Samsung at SUNY Polytechnic Institute’s Colleges of Nanoscale Science and Engineering (SUNY Poly CNSE), could result in the ability to place more than 20 billion tiny switches — transistors — on the fingernail-sized chips that power everything from smartphones to spacecraft.

To achieve the higher performance, lower power and scaling benefits promised by 7nm technology, researchers had to bypass conventional semiconductor manufacturing approaches. Among the novel processes and techniques pioneered by the IBM Research alliance were a number of industry-first innovations, most notably Silicon Germanium (SiGe) channel transistors and Extreme Ultraviolet (EUV) lithography integration at multiple levels.

Industry experts consider 7nm technology crucial to meeting the anticipated demands of future cloud computing and Big Data systems, cognitive computing, mobile products and other emerging technologies. Part of IBM’s $3 billion, five-year investment in chip R&D (announced in 2014), this accomplishment was made possible through a unique public-private partnership with New York State and joint development alliance with GLOBALFOUNDRIES, Samsung and equipment suppliers. The team is based at SUNY Poly’s NanoTech Complex in Albany [New York state].

“For business and society to get the most out of tomorrow’s computers and devices, scaling to 7nm and beyond is essential,” said Arvind Krishna, senior vice president and director of IBM Research. “That’s why IBM has remained committed to an aggressive basic research agenda that continually pushes the limits of semiconductor technology. Working with our partners, this milestone builds on decades of research that has set the pace for the microelectronics industry, and positions us to advance our leadership for years to come.”

Microprocessors utilizing 22nm and 14nm technology power today’s servers, cloud data centers and mobile devices, and 10nm technology is well on the way to becoming a mature technology. The IBM Research-led alliance achieved close to 50 percent area scaling improvements over today’s most advanced technology, introduced SiGe channel material for transistor performance enhancement at 7nm node geometries, process innovations to stack them below 30nm pitch and full integration of EUV lithography at multiple levels. These techniques and scaling could result in at least a 50 percent power/performance improvement for next generation mainframe and POWER systems that will power the Big Data, cloud and mobile era.
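
Some back-of-the-envelope arithmetic of my own (the die size is an assumption, not an IBM figure) shows what 20 billion transistors on a fingernail-sized chip and roughly 50 percent area scaling imply,

```python
# Back-of-the-envelope arithmetic; the die area is my assumption, not an IBM figure.
die_area_mm2 = 17.0 * 17.0              # a "fingernail-sized" die, assumed ~289 mm^2
transistors = 20e9                      # "more than 20 billion" switches
density = transistors / die_area_mm2    # transistors per mm^2
print(f"~{density / 1e6:.0f} million transistors per mm^2")

# "Close to 50 percent area scaling" means each transistor takes roughly half the
# area it did at the previous node, i.e. about double the transistor density.
print(f"implied previous-node density: ~{density / 2 / 1e6:.0f} million per mm^2")
```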

“Governor Andrew Cuomo’s trailblazing public-private partnership model is catalyzing historic innovation and advancement. Today’s [July 8, 2015] announcement is just one example of our collaboration with IBM, which furthers New York State’s global leadership in developing next generation technologies,” said Dr. Michael Liehr, SUNY Poly Executive Vice President of Innovation and Technology and Vice President of Research.  “Enabling the first 7nm node transistors is a significant milestone for the entire semiconductor industry as we continue to push beyond the limitations of our current capabilities.”

“Today’s announcement marks the latest achievement in our long history of collaboration to accelerate development of next-generation technology,” said Gary Patton, CTO and Head of Worldwide R&D at GLOBALFOUNDRIES. “Through this joint collaborative program based at the Albany NanoTech Complex, we are able to maintain our focus on technology leadership for our clients and partners by helping to address the development challenges central to producing a smaller, faster, more cost efficient generation of semiconductors.”

The 7nm node milestone continues IBM’s legacy of historic contributions to silicon and semiconductor innovation. They include the invention or first implementation of the single cell DRAM, the Dennard Scaling Laws, chemically amplified photoresists, copper interconnect wiring, Silicon on Insulator, strained engineering, multi core microprocessors, immersion lithography, high speed SiGe, High-k gate dielectrics, embedded DRAM, 3D chip stacking and Air gap insulators.

In 2014, they were talking about carbon nanotubes with regard to the 7nm chip, so this shift to silicon germanium is interesting.

Sebastian Anthony in a July 9, 2015 article for Ars Technica offers some intriguing insight into the accomplishment and the technology (Note: A link has been removed),

… While it should be stressed that commercial 7nm chips remain at least two years away, this test chip from IBM and its partners is extremely significant for three reasons: it’s a working sub-10nm chip (this is pretty significant in itself); it’s the first commercially viable sub-10nm FinFET logic chip that uses silicon-germanium as the channel material; and it appears to be the first commercially viable design produced with extreme ultraviolet (EUV) lithography.

Technologically, SiGe and EUV are both very significant. SiGe has higher electron mobility than pure silicon, which makes it better suited for smaller transistors. The gap between two silicon nuclei is about 0.5nm; as the gate width gets ever smaller (about 7nm in this case), the channel becomes so small that the handful of silicon atoms can’t carry enough current. By mixing some germanium into the channel, electron mobility increases, and adequate current can flow. Silicon generally runs into problems at sub-10nm nodes, and we can expect Intel and TSMC to follow a similar path to IBM, GlobalFoundries, and Samsung (aka the Common Platform alliance).

EUV lithography is an even more interesting innovation. Basically, as chip features get smaller, you need a narrower beam of light to etch those features accurately, or you need to use multiple patterning (which we won’t go into here). The current state of the art for lithography is a 193nm ArF (argon fluoride) laser; that is, the wavelength is 193nm wide. Complex optics and multiple painstaking steps are required to etch 14nm features using a 193nm light source. EUV has a wavelength of just 13.5nm, which will handily take us down into the sub-10nm realm, but so far it has proven very difficult and expensive to deploy commercially (it has been just around the corner for quite a few years now).
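
A standard way to see why the wavelength matters so much is the Rayleigh criterion for the smallest printable feature; the formula is my addition, as Anthony's article doesn't spell it out,

```latex
% Rayleigh criterion for the smallest printable feature (critical dimension):
\[
  \mathrm{CD} \approx k_{1}\,\frac{\lambda}{\mathrm{NA}}
\]
% At comparable process factor k_1 and numerical aperture NA, moving from a
% 193 nm ArF source to 13.5 nm EUV shrinks the printable feature by roughly
% 193/13.5, i.e. about a factor of 14.
```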

If you’re interested in the nuances, I recommend reading Anthony’s article in its entirety.

One final comment: there was no discussion of electrodes or other metallic components associated with computer chips. The metallic components are a topic of some interest to me (anyway), given some research published by scientists at the Massachusetts Institute of Technology (MIT) last year. From my Oct. 14, 2014 posting,

Research from the Massachusetts Institute of Technology (MIT) has revealed a new property of metal nanoparticles, in this case, silver. From an Oct. 12, 2014 news item on ScienceDaily,

A surprising phenomenon has been found in metal nanoparticles: They appear, from the outside, to be liquid droplets, wobbling and readily changing shape, while their interiors retain a perfectly stable crystal configuration.

The research team behind the finding, led by MIT professor Ju Li, says the work could have important implications for the design of components in nanotechnology, such as metal contacts for molecular electronic circuits. [my emphasis added]

This discovery and others regarding materials and phase changes at ever diminishing sizes hint that a computer with a functioning 7nm chip might be a bit further off than IBM is suggesting.

Yarns of niobium nanowire for small electronic device boost at the University of British Columbia (Canada) and Massachusetts Institute of Technology (US)

It turns out that this research concerning supercapacitors is a collaboration between the University of British Columbia (Canada) and the Massachusetts Institute of Technology (MIT). From a July 7, 2015 news item by Stuart Milne for Azonano,

A team of researchers from MIT and University of British Columbia has discovered an innovative method to deliver short bursts of high power required by wearable electronic devices.

Such devices are used for monitoring health and fitness and as such are rapidly growing in the consumer electronics industry. However, a major drawback of these devices is that they are integrated with small batteries, which fail to deliver sufficient amount of power required for data transmission.

According to the research team, one way to resolve this issue is to develop supercapacitors, which are capable of storing and releasing short bursts of electrical power required to transmit data from smartphones, computers, heart-rate monitors, and other wearable devices. Supercapacitors can also prove useful for other applications where short bursts of high power are required, for instance autonomous microrobots.

A July 7, 2015 MIT news release provides more detail about the research,

The new approach uses yarns, made from nanowires of the element niobium, as the electrodes in tiny supercapacitors (which are essentially pairs of electrically conducting fibers with an insulator between). The concept is described in a paper in the journal ACS Applied Materials and Interfaces by MIT professor of mechanical engineering Ian W. Hunter, doctoral student Seyed M. Mirvakili, and three others at the University of British Columbia.

Nanotechnology researchers have been working to increase the performance of supercapacitors for the past decade. Among nanomaterials, carbon-based nanoparticles — such as carbon nanotubes and graphene — have shown promising results, but they suffer from relatively low electrical conductivity, Mirvakili says.

In this new work, he and his colleagues have shown that desirable characteristics for such devices, such as high power density, are not unique to carbon-based nanoparticles, and that niobium nanowire yarn is a promising alternative.

“Imagine you’ve got some kind of wearable health-monitoring system,” Hunter says, “and it needs to broadcast data, for example using Wi-Fi, over a long distance.” At the moment, the coin-sized batteries used in many small electronic devices have very limited ability to deliver a lot of power at once, which is what such data transmissions need.

“Long-distance Wi-Fi requires a fair amount of power,” says Hunter, the George N. Hatsopoulos Professor in Thermodynamics in MIT’s Department of Mechanical Engineering, “but it may not be needed for very long.” Small batteries are generally poorly suited for such power needs, he adds.

“We know it’s a problem experienced by a number of companies in the health-monitoring or exercise-monitoring space. So an alternative is to go to a combination of a battery and a capacitor,” Hunter says: the battery for long-term, low-power functions, and the capacitor for short bursts of high power. Such a combination should be able to either increase the range of the device, or — perhaps more important in the marketplace — to significantly reduce size requirements.

The new nanowire-based supercapacitor exceeds the performance of existing batteries, while occupying a very small volume. “If you’ve got an Apple Watch and I shave 30 percent off the mass, you may not even notice,” Hunter says. “But if you reduce the volume by 30 percent, that would be a big deal,” he says: Consumers are very sensitive to the size of wearable devices.

The innovation is especially significant for small devices, Hunter says, because other energy-storage technologies — such as fuel cells, batteries, and flywheels — tend to be less efficient, or simply too complex to be practical when reduced to very small sizes. “We are in a sweet spot,” he says, with a technology that can deliver big bursts of power from a very small device.

Ideally, Hunter says, it would be desirable to have a high volumetric power density (the amount of power stored in a given volume) and high volumetric energy density (the amount of energy in a given volume). “Nobody’s figured out how to do that,” he says. However, with the new device, “We have fairly high volumetric power density, medium energy density, and a low cost,” a combination that could be well suited for many applications.
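
For anyone wondering how those volumetric figures of merit are defined, here are the standard textbook relations with hypothetical example numbers of my own (none of these values come from the paper),

```python
# Standard textbook relations, with hypothetical example numbers; none of these
# values come from the paper.
def volumetric_energy_Wh_per_L(C_farad, V_volt, volume_litre):
    joules = 0.5 * C_farad * V_volt**2               # E = 1/2 C V^2
    return joules / 3600.0 / volume_litre            # convert J to Wh, then per litre

def volumetric_power_W_per_L(V_volt, R_ohm, volume_litre):
    return V_volt**2 / (4.0 * R_ohm) / volume_litre  # matched-load peak power

# Hypothetical device: 0.1 F, 2 V, 5 ohm equivalent series resistance, 0.1 mL.
print(volumetric_energy_Wh_per_L(0.1, 2.0, 1e-4))    # ~0.56 Wh/L
print(volumetric_power_W_per_L(2.0, 5.0, 1e-4))      # ~2000 W/L
```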

Niobium is a fairly abundant and widely used material, Mirvakili says, so the whole system should be inexpensive and easy to produce. “The fabrication cost is cheap,” he says. Other groups have made similar supercapacitors using carbon nanotubes or other materials, but the niobium yarns are stronger and 100 times more conductive. Overall, niobium-based supercapacitors can store up to five times as much power in a given volume as carbon nanotube versions.

Niobium also has a very high melting point — nearly 2,500 degrees Celsius — so devices made from these nanowires could potentially be suitable for use in high-temperature applications.

In addition, the material is highly flexible and could be woven into fabrics, enabling wearable forms; individual niobium nanowires are just 140 nanometers in diameter — 140 billionths of a meter across, or about one-thousandth the width of a human hair.

So far, the material has been produced only in lab-scale devices. The next step, already under way, is to figure out how to design a practical, easily manufactured version, the researchers say.

“The work is very significant in the development of smart fabrics and future wearable technologies,” says Geoff Spinks, a professor of engineering at the University of Wollongong, in Australia, who was not associated with this research. This paper, he adds, “convincingly demonstrates the impressive performance of niobium-based fiber supercapacitors.”

Here’s a link to and a citation for the paper,

High-Performance Supercapacitors from Niobium Nanowire Yarns by Seyed M. Mirvakili, Mehr Negar Mirvakili, Peter Englezos, John D. W. Madden, and Ian W. Hunter. ACS Appl. Mater. Interfaces, 2015, 7 (25), pp 13882–13888 DOI: 10.1021/acsami.5b02327 Publication Date (Web): June 12, 2015

Copyright © 2015 American Chemical Society

This paper is behind a paywall.

LiquiGlide, a nanotechnology-enabled coating for food packaging and oil and gas pipelines

Getting condiments out of their bottles should be a lot easier in several European countries in the near future. A June 30, 2015 news item on Nanowerk describes the technology and the business deal (Note: A link has been removed),

The days of wasting condiments — and other products — that stick stubbornly to the sides of their bottles may be gone, thanks to MIT [Massachusetts Institute of Technology] spinout LiquiGlide, which has licensed its nonstick coating to a major consumer-goods company.

Developed in 2009 by MIT’s Kripa Varanasi and David Smith, LiquiGlide is a liquid-impregnated coating that acts as a slippery barrier between a surface and a viscous liquid. Applied inside a condiment bottle, for instance, the coating clings permanently to its sides, while allowing the condiment to glide off completely, with no residue.

In 2012, amidst a flurry of media attention following LiquiGlide’s entry in MIT’s $100K Entrepreneurship Competition, Smith and Varanasi founded the startup — with help from the Institute — to commercialize the coating.

Today [June 30, 2015], Norwegian consumer-goods producer Orkla has signed a licensing agreement to use the LiquiGlide’s coating for mayonnaise products sold in Germany, Scandinavia, and several other European nations. This comes on the heels of another licensing deal, with Elmer’s [Elmer’s Glue & Adhesives], announced in March [2015].

A June 30, 2015 MIT news release, which originated the news item, provides more details about the researcher/entrepreneurs’ plans,

But this is only the beginning, says Varanasi, an associate professor of mechanical engineering who is now on LiquiGlide’s board of directors and chief science advisor. The startup, which just entered the consumer-goods market, is courting deals with numerous producers of foods, beauty supplies, and household products. “Our coatings can work with a whole range of products, because we can tailor each coating to meet the specific requirements of each application,” Varanasi says.

Apart from providing savings and convenience, LiquiGlide aims to reduce the surprising amount of wasted products — especially food — that stick to container sides and get tossed. For instance, in 2009 Consumer Reports found that up to 15 percent of bottled condiments are ultimately thrown away. Keeping bottles clean, Varanasi adds, could also drastically cut the use of water and energy, as well as the costs associated with rinsing bottles before recycling. “It has huge potential in terms of critical sustainability,” he says.

Varanasi says LiquiGlide aims next to tackle buildup in oil and gas pipelines, which can cause corrosion and clogs that reduce flow. [emphasis mine] Future uses, he adds, could include coatings for medical devices such as catheters, deicing roofs and airplane wings, and improving manufacturing and process efficiency. “Interfaces are ubiquitous,” he says. “We want to be everywhere.”

The news release goes on to describe the research process in more detail and offers a plug for MIT’s innovation efforts,

LiquiGlide was originally developed while Smith worked on his graduate research in Varanasi’s research group. Smith and Varanasi were interested in preventing ice buildup on airplane surfaces and methane hydrate buildup in oil and gas pipelines.

Some initial work was on superhydrophobic surfaces, which trap pockets of air and naturally repel water. But both researchers found that these surfaces don’t, in fact, shed every bit of liquid. During phase transitions — when vapor turns to liquid, for instance — water droplets condense within microscopic gaps on surfaces, and steadily accumulate. This leads to loss of anti-icing properties of the surface. “Something that is nonwetting to macroscopic drops does not remain nonwetting for microscopic drops,” Varanasi says.

Inspired by the work of researcher David Quéré, of ESPCI in Paris, on slippery “hemisolid-hemiliquid” surfaces, Varanasi and Smith invented permanently wet “liquid-impregnated surfaces” — coatings that don’t have such microscopic gaps. The coatings consist of textured solid material that traps a liquid lubricant through capillary and intermolecular forces. The coating wicks through the textured solid surface, clinging permanently under the product, allowing the product to slide off the surface easily; other materials can’t enter the gaps or displace the coating. “One can say that it’s a self-lubricating surface,” Varanasi says.

Mixing and matching the materials, however, is a complicated process, Varanasi says. Liquid components of the coating, for instance, must be compatible with the chemical and physical properties of the sticky product, and generally immiscible. The solid material must form a textured structure while adhering to the container. And the coating can’t spoil the contents: Foodstuffs, for instance, require safe, edible materials, such as plants and insoluble fibers.

To help choose ingredients, Smith and Varanasi developed the basic scientific principles and algorithms that calculate how the liquid and solid coating materials, and the product, as well as the geometry of the surface structures will all interact to find the optimal “recipe.”
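
Here's a toy version of that kind of ingredient screening; it's loosely modelled on the constraints described above and is my own illustration, with no relation to LiquiGlide's actual algorithms or data,

```python
# A toy screen, loosely modelled on the constraints described above -- not
# LiquiGlide's algorithms or data. Every candidate below is hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    solid: str
    lubricant: str
    lubricant_wicks_into_texture: bool   # capillary forces must hold the lubricant
    miscible_with_product: bool          # lubricant must NOT mix with the product
    food_safe: bool                      # required when the product is a foodstuff

def viable(c: Candidate, product_is_food: bool) -> bool:
    if not c.lubricant_wicks_into_texture:
        return False
    if c.miscible_with_product:
        return False
    if product_is_food and not c.food_safe:
        return False
    return True

candidates = [
    Candidate("textured wax", "plant oil", True, False, True),
    Candidate("textured polymer", "silicone oil", True, False, False),
]
print([c.solid for c in candidates if viable(c, product_is_food=True)])  # ['textured wax']
```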

Today, LiquiGlide develops coatings for clients and licenses the recipes to them. Included are instructions that detail the materials, equipment, and process required to create and apply the coating for their specific needs. “The state of the coating we end up with depends entirely on the properties of the product you want to slide over the surface,” says Smith, now LiquiGlide’s CEO.

Having researched materials for hundreds of different viscous liquids over the years — from peanut butter to crude oil to blood — LiquiGlide also has a database of optimal ingredients for its algorithms to pull from when customizing recipes. “Given any new product you want LiquiGlide for, we can zero in on a solution that meets all requirements necessary,” Varanasi says.

MIT: A lab for entrepreneurs

For years, Smith and Varanasi toyed around with commercial applications for LiquiGlide. But in 2012, with help from MIT’s entrepreneurial ecosystem, LiquiGlide went from lab to market in a matter of months.

Initially the idea was to bring coatings to the oil and gas industry. But one day, in early 2012, Varanasi saw his wife struggling to pour honey from its container. “And I thought, ‘We have a solution for that,’” Varanasi says.

The focus then became consumer packaging. Smith and Varanasi took the idea through several entrepreneurship classes — such as 6.933 (Entrepreneurship in Engineering: The Founder’s Journey) — and MIT’s Venture Mentoring Service and Innovation Teams, where student teams research the commercial potential of MIT technologies.

“I did pretty much every last thing you could do,” Smith says. “Because we have such a brilliant network here at MIT, I thought I should take advantage of it.”

That May [2012], Smith, Varanasi, and several MIT students entered LiquiGlide in the MIT $100K Entrepreneurship Competition, earning the Audience Choice Award — and the national spotlight. A video of ketchup sliding out of a LiquiGlide-coated bottle went viral. Numerous media outlets picked up the story, while hundreds of companies reached out to Varanasi to buy the coating. “My phone didn’t stop ringing, my website crashed for a month,” Varanasi says. “It just went crazy.”

That summer [2012], Smith and Varanasi took their startup idea to MIT’s Global Founders’ Skills Accelerator program, which introduced them to a robust network of local investors and helped them build a solid business plan. Soon after, they raised money from family and friends, and won $100,000 at the MassChallenge Entrepreneurship Competition.

When LiquiGlide Inc. launched in August 2012, clients were already knocking down the door. The startup chose a select number to pay for the development and testing of the coating for its products. Within a year, LiquiGlide was cash-flow positive, and had grown from three to 18 employees in its current Cambridge headquarters.

Looking back, Varanasi attributes much of LiquiGlide’s success to MIT’s innovation-based ecosystem, which promotes rapid prototyping for the marketplace through experimentation and collaboration. This ecosystem includes the Deshpande Center for Technological Innovation, the Martin Trust Center for MIT Entrepreneurship, the Venture Mentoring Service, and the Technology Licensing Office, among other initiatives. “Having a lab where we could think about … translating the technology to real-world applications, and having this ability to meet people, and bounce ideas … that whole MIT ecosystem was key,” Varanasi says.

Here’s the latest LiquiGlide video,


Video: Melanie Gonick/MIT
Additional footage courtesy of LiquiGlide™
Music sampled from “Candlepower” by Chris Zabriskie

I had thought the EU (European Union) offered more roadblocks to marketing nanotechnology-enabled products used in food packaging than the US. If anyone knows why a US company would market its products in Europe first, I would love to find out.