Tag Archives: Massachusetts Institute of Technology

Handling massive digital datasets the quantum way

A Jan. 25, 2016 news item on phys.org describes a new approach to analyzing and managing huge datasets,

From gene mapping to space exploration, humanity continues to generate ever-larger sets of data—far more information than people can actually process, manage, or understand.

Machine learning systems can help researchers deal with this ever-growing flood of information. Some of the most powerful of these analytical tools are based on a strange branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched every which way.

Such topological systems are especially useful for analyzing the connections in complex networks, such as the internal wiring of the brain, the U.S. power grid, or the global interconnections of the Internet. But even with the most powerful modern supercomputers, such problems remain daunting and impractical to solve. Now, a new approach that would use quantum computers to streamline these problems has been developed by researchers at the Massachusetts Institute of Technology (MIT), the University of Waterloo, and the University of Southern California (USC).

A Jan. 25, 2016 MIT news release (*also on EurekAlert*), which originated the news item, describes the theory in more detail,

… Seth Lloyd, the paper’s lead author and the Nam P. Suh Professor of Mechanical Engineering, explains that algebraic topology is key to the new method. This approach, he says, helps to reduce the impact of the inevitable distortions that arise every time someone collects data about the real world.

In a topological description, basic features of the data (How many holes does it have? How are the different parts connected?) are considered the same no matter how much they are stretched, compressed, or distorted. Lloyd explains that it is often these fundamental topological attributes “that are important in trying to reconstruct the underlying patterns in the real world that the data are supposed to represent.”

It doesn’t matter what kind of dataset is being analyzed, he says. The topological approach to looking for connections and holes “works whether it’s an actual physical hole, or the data represents a logical argument and there’s a hole in the argument. This will find both kinds of holes.”
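For the simplest kind of dataset, a network of points and connections, “counting holes” has a concrete meaning that’s easy to check by hand. This little sketch (my own illustration, not the paper’s algorithm) computes the number of independent loops in a graph, which is the first Betti number topologists use:

```python
def count_holes(num_vertices, edges):
    """First Betti number of a graph: independent loops = E - V + components."""
    parent = list(range(num_vertices))

    def find(x):
        # Union-find with path halving to track connected components.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in edges:
        parent[find(u)] = find(v)

    components = len({find(v) for v in range(num_vertices)})
    return len(edges) - num_vertices + components

# A square loop of 4 points has one hole...
print(count_holes(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))  # 1
# ...while the same points joined in a chain have none.
print(count_holes(4, [(0, 1), (1, 2), (2, 3)]))          # 0
```

This toy runs instantly because a graph only has pairwise connections; the full topological analysis Lloyd describes also tracks higher-dimensional holes across every scale of the data, which is where the cost explodes.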

Using conventional computers, that approach is too demanding for all but the simplest situations. Topological analysis “represents a crucial way of getting at the significant features of the data, but it’s computationally very expensive,” Lloyd says. “This is where quantum mechanics kicks in.” The new quantum-based approach, he says, could exponentially speed up such calculations.

Lloyd offers an example to illustrate that potential speedup: If you have a dataset with 300 points, a conventional approach to analyzing all the topological features in that system would require “a computer the size of the universe,” he says. That is, it would take 2^300 (two to the 300th power) processing units — approximately the number of all the particles in the universe. In other words, the problem is simply not solvable in that way.

“That’s where our algorithm kicks in,” he says. Solving the same problem with the new system, using a quantum computer, would require just 300 quantum bits — and a device this size may be achieved in the next few years, according to Lloyd.
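To get a feel for the scale gap Lloyd is describing, here’s a quick back-of-the-envelope check (my own arithmetic, not from the paper):

```python
# Classical cost: one processing unit per possible combination of 300 points.
classical_units = 2 ** 300

# Rough count of particles in the observable universe, ~10^80.
particles_in_universe = 10 ** 80

print(f"2^300 = {classical_units:.3e}")        # roughly 2e90
print(classical_units > particles_in_universe)  # True

# Quantum cost: n qubits can hold a superposition over 2^n states,
# so 300 qubits suffice to represent the same space.
qubits_needed = 300
```

2^300 is about 10^90, ten billion times the particle count, which is why Lloyd calls the classical version unsolvable while the quantum version needs only a few hundred qubits.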

“Our algorithm shows that you don’t need a big quantum computer to kick some serious topological butt,” he says.

There are many important kinds of huge datasets where the quantum-topological approach could be useful, Lloyd says, for example understanding interconnections in the brain. “By applying topological analysis to datasets gleaned by electroencephalography or functional MRI, you can reveal the complex connectivity and topology of the sequences of firing neurons that underlie our thought processes,” he says.

The same approach could be used for analyzing many other kinds of information. “You could apply it to the world’s economy, or to social networks, or almost any system that involves long-range transport of goods or information,” says Lloyd, who holds a joint appointment as a professor of physics. But the limits of classical computation have prevented such approaches from being applied before.

While this work is theoretical, “experimentalists have already contacted us about trying prototypes,” he says. “You could find the topology of simple structures on a very simple quantum computer. People are trying proof-of-concept experiments.”

Ignacio Cirac, a professor at the Max Planck Institute of Quantum Optics in Munich, Germany, who was not involved in this research, calls it “a very original idea, and I think that it has a great potential.” He adds, “I guess that it has to be further developed and adapted to particular problems. In any case, I think that this is top-quality research.”

Here’s a link to and a citation for the paper,

Quantum algorithms for topological and geometric analysis of data by Seth Lloyd, Silvano Garnerone, & Paolo Zanardi. Nature Communications 7, Article number: 10138. DOI: 10.1038/ncomms10138. Published 25 January 2016.

This paper is open access.

ETA Jan. 25, 2016 1245 hours PST,

Shown here are the connections between different regions of the brain in a control subject (left) and a subject under the influence of the psychedelic compound psilocybin (right). This demonstrates a dramatic increase in connectivity, which explains some of the drug’s effects (such as “hearing” colors or “seeing” smells). Such an analysis, involving billions of brain cells, would be too complex for conventional techniques, but could be handled easily by the new quantum approach, the researchers say. Courtesy of the researchers

*’also on EurekAlert’ text and link added Jan. 26, 2016.

Swallow your technology and wear it inside (wearable tech: 2 of 3)

While there are a number of wearable and fashionable pieces of technology that monitor heart rate and breathing, they are all worn on the outside of your body. Researchers are working on an alternative that can be swallowed and will monitor vital signs from within the gastrointestinal tract. I believe this is a prototype of the device,

This ingestible electronic device invented at MIT can measure heart rate and respiratory rate from inside the gastrointestinal tract. Image: Albert Swiston/MIT Lincoln Laboratory. Courtesy: MIT

From a Nov. 18, 2015 news item on phys.org,

This type of sensor could make it easier to assess trauma patients, monitor soldiers in battle, perform long-term evaluation of patients with chronic illnesses, or improve training for professional and amateur athletes, the researchers say.

The new sensor calculates heart and breathing rates from the distinctive sound waves produced by the beating of the heart and the inhalation and exhalation of the lungs.

“Through characterization of the acoustic wave, recorded from different parts of the GI tract, we found that we could measure both heart rate and respiratory rate with good accuracy,” says Giovanni Traverso, a research affiliate at MIT’s Koch Institute for Integrative Cancer Research, a gastroenterologist at Massachusetts General Hospital, and one of the lead authors of a paper describing the device in the Nov. 18 issue of the journal PLOS ONE.

A Nov. 18, 2015 Massachusetts Institute of Technology (MIT) news release by Anne Trafton, which originated the news item, further explains the research,

Doctors currently measure vital signs such as heart and respiratory rate using techniques including electrocardiograms (ECG) and pulse oximetry, which require contact with the patient’s skin. These vital signs can also be measured with wearable monitors, but those are often uncomfortable to wear.

Inspired by existing ingestible devices that can measure body temperature, and others that take internal digestive-tract images, the researchers set out to design a sensor that would measure heart and respiratory rate, as well as temperature, from inside the digestive tract.

The simplest way to achieve this, they decided, would be to listen to the body using a small microphone. Listening to the sounds of the chest is one of the oldest medical diagnostic techniques, practiced by Hippocrates in ancient Greece. Since the 1800s, doctors have used stethoscopes to listen to these sounds.

The researchers essentially created “an extremely tiny stethoscope that you can swallow,” Swiston says. “Using the same sensor, we can collect both your heart sounds and your lung sounds. That’s one of the advantages of our approach — we can use one sensor to get two pieces of information.”

To translate these acoustic data into heart and breathing rates, the researchers had to devise signal processing systems that distinguish the sounds produced by the heart and lungs from each other, as well as from background noise produced by the digestive tract and other parts of the body.
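The paper doesn’t reproduce its signal-processing pipeline here, but the basic trick of pulling two rates out of one audio stream can be sketched in the frequency domain. All the numbers below are illustrative (a synthetic one-minute recording, not real GI-tract audio):

```python
import numpy as np

fs = 50.0                     # sample rate in Hz (illustrative, not the device's)
t = np.arange(0, 60, 1 / fs)  # one minute of synthetic "acoustic" data

heart_hz, breath_hz = 1.2, 0.25  # 72 beats/min and 15 breaths/min
signal = (np.sin(2 * np.pi * heart_hz * t)
          + 2.0 * np.sin(2 * np.pi * breath_hz * t)
          + 0.3 * np.random.default_rng(0).standard_normal(t.size))

# Heart and breathing live in different frequency bands, so a spectrum
# lets one sensor yield two measurements.
freqs = np.fft.rfftfreq(t.size, 1 / fs)
spectrum = np.abs(np.fft.rfft(signal))

def dominant(lo, hi):
    """Return the strongest frequency (Hz) within a band."""
    band = (freqs >= lo) & (freqs < hi)
    return freqs[band][np.argmax(spectrum[band])]

breath_rate = dominant(0.1, 0.7) * 60  # breaths per minute
heart_rate = dominant(0.7, 3.0) * 60   # beats per minute
print(round(heart_rate), round(breath_rate))  # 72 15
```

The real system has the much harder job of rejecting digestive-tract noise and overlapping harmonics, but the band-separation idea is the same.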

The entire sensor is about the size of a multivitamin pill and consists of a tiny microphone packaged in a silicone capsule, along with electronics that process the sound and wirelessly send radio signals to an external receiver, with a range of about 3 meters.

In tests along the GI tract of pigs, the researchers found that the device could accurately pick up heart rate and respiratory rate, even when conditions such as the amount of food being digested were varied.

“The authors introduce some interesting and radically different approaches to wearable physiological status monitors, in which the devices are not worn on the skin or on clothing, but instead reside, in a transient fashion, inside the gastrointestinal tract. The resulting capabilities provide a powerful complement to those found in wearable technologies as traditionally conceived,” says John Rogers, a professor of materials science and engineering at the University of Illinois who was not part of the research team.

Better diagnosis

The researchers expect that the device would remain in the digestive tract for only a day or two, so for longer-term monitoring, patients would swallow new capsules as needed.

For the military, this kind of ingestible device could be useful for monitoring soldiers for fatigue, dehydration, tachycardia, or shock, the researchers say. When combined with a temperature sensor, it could also detect hypothermia, hyperthermia, or fever from infections.

In the future, the researchers plan to design sensors that could diagnose heart conditions such as abnormal heart rhythms (arrhythmias), or breathing problems including emphysema or asthma. Currently doctors require patients to wear a harness (Holter) monitor for up to a week to detect such problems, but these often fail to produce a diagnosis because patients are uncomfortable wearing them 24 hours a day.

“If you could ingest a device that would listen for those pathological sounds, rather than wearing an electrical monitor, that would improve patient compliance,” Swiston says.

The researchers also hope to create sensors that would not only diagnose a problem but also deliver a drug to treat it.

“We hope that one day we’re able to detect certain molecules or a pathogen and then deliver an antibiotic, for example,” Traverso says. “This development provides the foundation for that kind of system down the line.”

MIT has provided a video with two of the researchers describing their work and plans for its future development,

Here’s a link to and a citation for the paper,

Physiologic Status Monitoring via the Gastrointestinal Tract by G. Traverso, G. Ciccarelli, S. Schwartz, T. Hughes, T. Boettcher, R. Barman, R. Langer, & A. Swiston. PLOS ONE. DOI: 10.1371/journal.pone.0141666. Published: November 18, 2015.

This paper is open access.

Note added Nov. 25, 2015 at 1625 hours PDT: US National Public Radio (NPR) has a story on this research. You can find the Nov. 23, 2015 podcast (about six minutes) and a series of textual excerpts featuring Albert Swiston, biomaterials scientist at MIT, and Stephen Shankland, senior writer for CNET covering digital technology, from the podcast here.

Royal Institution, science, and nanotechnology 101 and #RE_IMAGINE at the London College of Fashion

I’m featuring two upcoming events in London (UK).

Nanotechnology 101: The biggest thing you’ve never seen

Gold Nanowire Array
Credit: lacomj via Flickr: www.flickr.com/photos/40137058@N07/3790862760 [downloaded from http://www.rigb.org/whats-on/events-2015/october/public-nanotechnology-101-the-biggest-thing-you]

Already sold out, this event is scheduled for Oct. 20, 2015. Here’s why you might want to put yourself on a waiting list, from the Royal Institution’s Nanotechnology 101 event page,

How could nanotechnology be used to create smart and extremely resilient materials? Or to boil water three times faster? Join former NASA Nanotechnology Project Manager Michael Meador to learn about the fundamentals of nanotechnology—what it is and why it’s unique—and how this emerging, disruptive technology will change the world. From invisibility cloaks to lightweight fuel-efficient vehicles and a cure for cancer, nanotechnology might just be the biggest thing you can’t see.

About the speaker

Michael Meador is currently Director of the U.S. National Nanotechnology Coordination Office, on secondment from NASA where he had been managing the Nanotechnology Project in the Game Changing Technology Program, working to mature nanotechnologies with high potential for impact on NASA missions. One part of his current job is to communicate nanotechnology research to policy-makers and the public.

Here’s some logistical information from the event page,

7.00pm to 8.30pm, Tuesday 20 October
The Theatre

Standard £12
Concession £8
Associate £6
Free to Members, Faraday Members and Fellows

For anyone who may not know offhand where the Royal Institution and its theatre are located,

The Royal Institution of Great Britain
21 Albemarle Street

+44 (0) 20 7409 2992
(9.00am – 6.00pm Mon – Fri)

Here’s a description of the Royal Institution from its Wikipedia entry (Note: Links have been removed),

The Royal Institution of Great Britain (often abbreviated as the Royal Institution or RI) is an organisation devoted to scientific education and research, based in London.

The Royal Institution was founded in 1799 by the leading British scientists of the age, including Henry Cavendish and its first president, George Finch, the 9th Earl of Winchilsea,[1] for

diffusing the knowledge, and facilitating the general introduction, of useful mechanical inventions and improvements; and for teaching, by courses of philosophical lectures and experiments, the application of science to the common purposes of life.
— [2]

Much of its initial funding and the initial proposal for its founding were given by the Society for Bettering the Conditions and Improving the Comforts of the Poor, under the guidance of philanthropist Sir Thomas Bernard and American-born British scientist Sir Benjamin Thompson, Count Rumford. Since its founding it has been based at 21 Albemarle Street in Mayfair. Its Royal Charter was granted in 1800. The Institution announced in January 2013 that it was considering sale of its Mayfair headquarters to meet its mounting debts.[3]


While this isn’t a nanotechnology event, it does touch on topics discussed here many times: wearable technology, futuristic fashion, and the integration of technology into the body. The Digital Anthropology Lab (of the London College of Fashion, which is part of the University of the Arts London) is being officially launched with a special event on Oct. 16, 2015. Before describing the event, here’s more about the Digital Anthropology Lab from its homepage,

Crafting fashion experience digitally

The Digital Anthropology Lab, launching in Autumn 2015 at London College of Fashion, University of the Arts London, is a research studio bringing industry and academia together to develop a new way of making smarter with technology.

The Digital Anthropology Lab, London College of Fashion, experiments with artefacts, communities, consumption and making in the digital space, using 3D printing, body scanning, code and electronics. We focus on an experimental approach to digital anthropology, allowing us to practically examine future ways in which digital collides with the human experience. We connect commercial partners to leading research academics and graduate students, exploring seed ideas for fashion tech.


We radically re-imagine this emerging fashion-tech space, exploring the beautification of technology for wearables and critically exploring the ‘why.’


Join us to experiment with ‘The Internet of Fashion Things,’ where the Internet of Things, invisible big data technologies, virtual fit and meta-data collide.


With the luxury of the imagination, we aim to re-wire our digital ambitions and think again about designing future digital fashion experiences for generation 2050.

Here’s information from the Sept. 30, 2015 announcement I received via email,

The Digital Anthropology Lab at London College of Fashion, UAL invites you to #RE_IMAGINE: A forum exploring the now, near and future of fashion technology.

#RE_IMAGINE, the Digital Anthropology Lab’s launch event, will present a fantastically diverse range of digital speakers and ask them to respond to the question – ‘Where are our digital selves heading?’

Join us to hear from pioneers, risk takers, entrepreneurs, designers and inventors including Ian Livingston CBE, Luke Robert Mason from New Bionics, Katie Baron from Stylus, J. Meejin Yoon from MIT among others. Also come to see what happened when we made fashion collide with the Internet of Things, they are wearable but not as you know it…

#RE_IMAGINE aims to be an informative, networked and enlightening brainstorm of a day. To book your place please follow this link.

To coincide with the exhibition Digital Disturbances, Fashion Space Gallery presents a late night opening event. Alongside a curator tour will be a series of interactive demonstrations and displays which bring together practitioners working across design, science and technology to investigate possible human and material futures. We’d encourage you to stay and enjoy this networking opportunity.

Friday 16th October 2015

9.30am – 5pm – Forum event 

5pm – 8.30pm – Digital Disturbances networking event

London College of Fashion

20 John Princes Street
W1G 0BJ 

Ticket prices are £75.00 for a standard ticket and £35.00 for concession tickets (more details here).

For more #RE_IMAGINE specifics, there’s the event’s Agenda page. As for Digital Disturbances, here’s more from the Fashion Space Gallery’s Exhibition homepage,

Digital Disturbances

11th September – 12th December 2015

Digital Disturbances examines the influence of digital concepts and tools on fashion. It provides a lens onto the often strange effects that emerge from interactions across material and virtual platforms – information both lost and gained in the process of translation. It presents the work of seven designers and creative teams whose work documents these interactions and effects, both in the design and representation of fashion. They can be traced across the surfaces of garments, through the realisation of new silhouettes, in the remixing of images and bodies in photography and film, and into the nuances of identity projected into social and commercial spaces.

Designers include: ANREALAGE, Bart Hess, POSTmatter, Simone C. Niquille and Alexander Porter, Flora Miranda, Texturall and Tigran Avetisyan.

Digital Disturbances is curated by Leanne Wierzba.

Two events—two peeks into the future.

US National Institute of Standards and Technology and molecules made of light (lightsabres anyone?)

As I recall, lightsabres are a Star Wars invention. I gather we’re a long way from running around with lightsabres, but there is hope, if that should be your dream, according to a Sept. 9, 2015 news item on Nanowerk,

… a team including theoretical physicists from JQI [Joint Quantum Institute] and NIST [US National Institute of Standards and Technology] has taken another step toward building objects out of photons, and the findings hint that weightless particles of light can be joined into a sort of “molecule” with its own peculiar force.

Here’s an artist’s conception of the light “molecule” provided by the researchers,

Researchers show that two photons, depicted in this artist’s conception as waves (left and right), can be locked together at a short distance. Under certain conditions, the photons can form a state resembling a two-atom molecule, represented as the blue dumbbell shape at center. Credit: E. Edwards/JQI

A Sept. 8, 2015 NIST news release (also available on EurekAlert*), which originated the news item, provides more information about the research (Note: Links have been removed),

The findings build on previous research that several team members contributed to before joining NIST. In 2013, collaborators from Harvard, Caltech and MIT found a way to bind two photons together so that one would sit right atop the other, superimposed as they travel. Their experimental demonstration was considered a breakthrough, because no one had ever constructed anything by combining individual photons—inspiring some to imagine that real-life lightsabers were just around the corner.

Now, in a paper forthcoming in Physical Review Letters, the NIST and University of Maryland-based team (with other collaborators) has shown theoretically that by tweaking a few parameters of the binding process, photons could travel side by side, a specific distance from each other. The arrangement is akin to the way that two hydrogen atoms sit next to each other in a hydrogen molecule.

“It’s not a molecule per se, but you can imagine it as having a similar kind of structure,” says NIST’s Alexey Gorshkov. “We’re learning how to build complex states of light that, in turn, can be built into more complex objects. This is the first time anyone has shown how to bind two photons a finite distance apart.”

While the new findings appear to be a step in the right direction—if we can build a molecule of light, why not a sword?—Gorshkov says he is not optimistic that Jedi Knights will be lining up at NIST’s gift shop anytime soon. The main reason is that binding photons requires extreme conditions difficult to produce with a roomful of lab equipment, let alone fit into a sword’s handle. Still, there are plenty of other reasons to make molecular light—humbler than lightsabers, but useful nonetheless.

“Lots of modern technologies are based on light, from communication technology to high-definition imaging,” Gorshkov says. “Many of them would be greatly improved if we could engineer interactions between photons.”

For example, engineers need a way to precisely calibrate light sensors, and Gorshkov says the findings could make it far easier to create a “standard candle” that shines a precise number of photons at a detector. Perhaps more significant to industry, binding and entangling photons could allow computers to use photons as information processors, a job that electronic switches in your computer do today.

Not only would this provide a new basis for creating computer technology, but it also could result in substantial energy savings. Phone messages and other data that currently travel as light beams through fiber optic cables have to be converted into electrons for processing — an inefficient step that wastes a great deal of electricity. If both the transport and the processing of the data could be done with photons directly, it could reduce these energy losses.

Gorshkov says it will be important to test the new theory in practice for these and other potential benefits.

“It’s a cool new way to study photons,” he says. “They’re massless and fly at the speed of light. Slowing them down and binding them may show us other things we didn’t know about them before.”

Here are links to and citations for the paper. First, there’s an early version on arXiv.org and then the peer-reviewed version, which is not yet available,

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, A. V. Gorshkov. arXiv:1505.03859 [quant-ph] (or arXiv:1505.03859v1 [quant-ph] for this version)

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, and A. V. Gorshkov.
Phys. Rev. Lett. forthcoming in September 2015.

The first version (arXiv) is open access and I’m not sure whether the Physical Review Letters study will be behind a paywall or be available as an open access paper.

*EurekAlert link added 10:34 am PST on Sept. 11, 2015.

People for the Ethical Treatment of Animals (PETA) and a grant for in vitro nanotoxicity testing

This grant seems to have gotten its start at a workshop held at the US Environmental Protection Agency (EPA) in Washington, D.C., Feb. 24-25, 2015 as per this webpage on the People for the Ethical Treatment of Animals (PETA) International Science Consortium Limited website,

The invitation-only workshop included experts from different sectors (government, industry, academia and NGO) and disciplines (in vitro and in vivo inhalation studies of NMs, fibrosis, dosimetry, fluidic models, aerosol engineering, and regulatory assessment). It focused on the technical details for the development and preliminary assessment of the relevance and reliability of an in vitro test to predict the development of pulmonary fibrosis in cells co-cultured at the air-liquid interface following exposure to aerosolized multi-walled carbon nanotubes (MWCNTs). During the workshop, experts made recommendations on cell types, exposure systems, endpoints and dosimetry considerations required to develop the in vitro model for hazard identification of MWCNTs.

The method is intended to be included in a non-animal test battery to reduce and eventually replace the use of animals in studies to assess the inhalation toxicity of engineered NMs. The long-term vision is to develop a battery of in silico and in vitro assays that can be used in an integrated testing strategy, providing comprehensive information on biological endpoints relevant to inhalation exposure to NMs which could be used in the hazard ranking of substances in the risk assessment process.

A September 1, 2015 news item on Azonano provides an update,

The PETA International Science Consortium Ltd. announced today the winners of a $200,000 award for the design of an in vitro test to predict the development of lung fibrosis in humans following exposure to nanomaterials, such as multi-walled carbon nanotubes.

Professor Dr. Barbara Rothen-Rutishauser of the Adolphe Merkle Institute at the University of Fribourg, Switzerland and Professor Dr. Vicki Stone of the School of Life Sciences at Heriot-Watt University, Edinburgh, U.K. will jointly develop the test method. Professor Rothen-Rutishauser co-chairs the BioNanomaterials research group at the Adolphe Merkle Institute, where her research is focused on the study of nanomaterial-cell interactions in the lung using three-dimensional cell models. Professor Vicki Stone is the Director of the Nano Safety Research Group at Heriot-Watt University and the Director of Toxicology for SAFENANO.

The Science Consortium is also funding MatTek Corporation for the development of a three-dimensional reconstructed primary human lung tissue model to be used in Professors Rothen-Rutishauser and Stone’s work. MatTek Corporation has extensive expertise in manufacturing human cell-based, organotypic in vitro models for use in regulatory and basic research applications. The work at MatTek will be led by Dr. Patrick Hayden, Vice President of Scientific Affairs, and Dr. Anna Maione, head of MatTek’s airway models research group.

I was curious about MatTek Corporation and found this on the company’s About Us webpage,

MatTek Corporation was founded in 1985 by two chemical engineering professors from MIT. In 1991 the company leveraged its core polymer surface modification technology into the emerging tissue engineering market.

MatTek Corporation is at the forefront of tissue engineering and is a world leader in the production of innovative 3D reconstructed human tissue models. Our skin, ocular, and respiratory tissue models are used in regulatory toxicology (OECD, EU guidelines) and address toxicology and efficacy concerns throughout the cosmetics, chemical, pharmaceutical and household product industries.

EpiDerm™, MatTek’s first 3D human cell based in vitro model, was introduced in 1993 and became an immediate technical and commercial success.

I wish them good luck in their research on developing better ways to test toxicity.

Carbon nanotubes as sensors in the body

Rachel Ehrenberg has written an Aug. 21, 2015 news item about the latest and greatest carbon nanotube-based biomedical sensors for the journal Nature,

The future of medical sensors may be going down the tubes. Chemists are developing tiny devices made from carbon nanotubes wrapped with polymers to detect biologically important compounds such as insulin, nitric oxide and the blood-clotting protein fibrinogen. The hope is that these sensors could simplify and automate diagnostic tests.

Preliminary experiments in mice, reported by scientists at a meeting of the American Chemical Society in Boston, Massachusetts, this week [Aug. 16 – 20, 2015], suggest that the devices are safe to introduce into the bloodstream or implant under the skin. Researchers also presented data showing that the nanotube–polymer complexes could measure levels of large molecules, a feat that has been difficult for existing technologies.

Ehrenberg focuses on one laboratory in particular (Note: Links have been removed),

“Anything the body makes, it is meant to degrade,” says chemical engineer Michael Strano, whose lab at the Massachusetts Institute of Technology (MIT) in Cambridge is behind much of the latest work1. “Our vision is to make a sensing platform that can monitor a whole range of molecules, and do it in the long term.”

To design one sensor, MIT researchers coated nanotubes with a mix of polymers and nucleotides and screened for configurations that would bind to the protein fibrinogen. This large molecule is important for building blood clots; its concentration can indicate bleeding disorders, liver disease or impending cardiovascular trouble. The team recently hit on a material that worked — a first for such a large molecule, according to MIT nanotechnology specialist Gili Bisker. Bisker said at the chemistry meeting that the fibrinogen-detecting nanotubes could be used to measure levels of the protein in blood samples, or implanted in body tissue to detect changing fibrinogen levels that might indicate a clot.

The MIT team has also developed a sensor that can be inserted beneath the skin to monitor glucose or insulin levels in real time, Bisker reported. The team imagines putting a small patch that contains a wireless device on the skin just above the embedded sensor. The patch would shine light on the sensor and measure its fluorescence, then transmit that data to a mobile phone for real-time monitoring.

Another version of the sensor, developed at MIT by biomedical engineer Nicole Iverson and colleagues, detects nitric oxide. This signalling molecule typically indicates inflammation and is associated with many cancer cells. When embedded in a hydrogel matrix, the sensor kept working in mice for more than 400 days and caused no local inflammation, MIT chemical engineer Michael Lee reported. The nitric oxide sensors also performed well when injected into the bloodstreams of mice, successfully passing through small capillaries in the lungs, which are an area of concern for nanotube toxicity. …

There’s at least one corporate laboratory (Google X) working on biosensors, although its focus is a little different. From a Jan. 9, 2015 article by Brian Womack and Anna Edney for BloombergBusiness,

Google Inc. sent employees with ties to its secretive X research group to meet with U.S. regulators who oversee medical devices, raising the possibility of a new product that may involve biosensors from the unit that developed computerized glasses.

The meeting included at least four Google workers, some of whom have connections with Google X — and have done research on sensors, including contact lenses that help wearers monitor their biological data. Google staff met with those at the Food and Drug Administration who regulate eye devices and diagnostics for heart conditions, according to the agency’s public calendar. [emphasis mine]

This approach from Google is considered noninvasive,

“There is actually one interface on the surface of the body that can literally provide us with a window of what happens inside, and that’s the surface of the eye,” Parviz [Babak Parviz, … was involved in the Google Glass project and has talked about putting displays on contact lenses, including lenses that monitor wearers’ health] said in a video posted on YouTube. “It’s a very interesting chemical interface.”

Of course, the assumption is that all this monitoring will result in healthier people, but I can’t help thinking of an old saying: ‘a little knowledge can be a dangerous thing’. For example, we once lived in a world where bacteria roamed free; then we learned how to make them visible, determined they were disease-causing, and began campaigns to kill them off. Now it turns out that at least some bacteria are good for us and, moreover, we’ve created other, more dangerous bacteria that are drug-resistant. Based on the bacteria example, is it possible that with these biosensors we will observe new phenomena and make similar mistakes?

Scaling graphene production up to industrial strength

If graphene is going to be a ubiquitous material in the future, production methods need to change. An Aug. 7, 2015 news item on Nanowerk announces a new technique to achieve that goal,

Producing graphene in bulk is critical when it comes to the industrial exploitation of this exceptional two-dimensional material. To that end, [European Commission] Graphene Flagship researchers have developed a novel variant on the chemical vapour deposition process which yields high quality material in a scalable manner. This advance should significantly narrow the performance gap between synthetic and natural graphene.

An Aug. 7, 2015 European Commission Graphene Flagship press release by Francis Sedgemore, which originated the news item, describes the problem,

Media-friendly Nobel laureates peeling layers of graphene from bulk graphite with sticky tape may capture the public imagination, but as a manufacturing process the technique is somewhat lacking. Mechanical exfoliation may give us pristine graphene, but industry requires scalable and cost-effective production processes with much higher yields.

On to the new method (from the press release),

Flagship-affiliated physicists from RWTH Aachen University and Forschungszentrum Jülich have, together with colleagues in Japan, devised a method for peeling graphene flakes from a CVD [chemical vapour deposition] substrate with the help of intermolecular forces. …

Key to the process is the strong van der Waals interaction that exists between graphene and hexagonal boron nitride, another 2d material within which it is encapsulated. The van der Waals force is the attractive sum of short-range electric dipole interactions between uncharged molecules.
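The press release’s one-line definition of the van der Waals force can be illustrated with the Lennard-Jones potential, a standard textbook model of van der Waals-type interactions between neutral particles. The parameter values below are illustrative placeholders, not measured values for graphene on boron nitride:

```python
# Lennard-Jones potential: a textbook model of the van der Waals-type
# interaction between two neutral particles at separation r.
#   V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)
#   eps   - depth of the attractive well (binding strength)
#   sigma - separation at which the potential crosses zero

def lennard_jones(r, eps=1.0, sigma=0.34):
    """Pair interaction energy at separation r (same length units as sigma)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 ** 2 - sr6)

# The attractive minimum sits at r = 2**(1/6) * sigma, where V = -eps.
# This short-range "stickiness" is what lets one 2d layer grip another.
r_min = 2 ** (1 / 6) * 0.34
print(round(r_min, 3), round(lennard_jones(r_min), 3))
```

The key qualitative point for the transfer process is that the attraction is strong only at very short range, so the boron nitride grips the graphene it contacts without disturbing anything further away.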

Thanks to strong van der Waals interactions between graphene and boron nitride, CVD graphene can be separated from the copper and transferred to an arbitrary substrate. The process allows for re-use of the catalyst copper foil in further growth cycles, and minimises contamination of the graphene due to processing.

Raman spectroscopy and transport measurements on the graphene/boron nitride heterostructures reveal high electron mobilities comparable with those observed in similar assemblies based on exfoliated graphene. Furthermore – and this comes as something of a surprise to the researchers – no noticeable performance changes are detected between devices developed in the first and subsequent growth cycles. This confirms the copper as a recyclable resource in the graphene fabrication process.

“Chemical vapour deposition is a highly scalable and cost-efficient technology,” says Christoph Stampfer, head of the 2nd Institute of Physics A in Aachen, and co-author of the technical article. “Until now, graphene synthesised this way has been significantly lower in quality than that obtained with the scotch-tape method, especially when it comes to the material’s electronic properties. But no longer. We demonstrate a novel fabrication process based on CVD that yields ultra-high quality synthetic graphene samples. The process is in principle suitable for industrial-scale production, and narrows the gap between graphene research and its technological applications.”

With their dry-transfer process, Banszerus and his colleagues have shown that the electronic properties of CVD-grown graphene can in principle match those of ultrahigh-mobility exfoliated graphene. The key is to transfer CVD graphene from its growth substrate in such a way that chemical contamination is avoided. The high mobility of pristine graphene is thus preserved, and the approach allows for the substrate material to be recycled without degradation.

Here’s a link to and citation for the paper,

Ultrahigh-mobility graphene devices from chemical vapor deposition on reusable copper by Luca Banszerus, Michael Schmitz, Stephan Engels, Jan Dauber, Martin Oellers, Federica Haupt, Kenji Watanabe, Takashi Taniguchi, Bernd Beschoten, and Christoph Stampfer. Science Advances 31 Jul 2015: Vol. 1, no. 6, e1500222. DOI: 10.1126/sciadv.1500222

This article appears to be open access.

For those interested in finding out more about chemical vapour deposition (CVD), David Chandler has written a June 19, 2015 article for the Massachusetts Institute of Technology (MIT) titled: Explained: chemical vapor deposition (Technique enables production of pure, uniform coatings of metals or polymers, even on contoured surfaces.)

Nanoscale imaging of a mouse brain

Researchers have developed a new brain imaging tool they would like to use as a founding element for a national brain observatory. From a July 30, 2015 news item on Azonano,

A new imaging tool developed by Boston scientists could do for the brain what the telescope did for space exploration.

In the first demonstration of how the technology works, published July 30 in the journal Cell, the researchers look inside the brain of an adult mouse at a scale previously unachievable, generating images at a nanoscale resolution. The inventors’ long-term goal is to make the resource available to the scientific community in the form of a national brain observatory.

A July 30, 2015 Cell Press news release on EurekAlert, which originated the news item, expands on the theme,

“I’m a strong believer in bottom up-science, which is a way of saying that I would prefer to generate a hypothesis from the data and test it,” says senior study author Jeff Lichtman, of Harvard University. “For people who are imagers, being able to see all of these details is wonderful and we’re getting an opportunity to peer into something that has remained somewhat intractable for so long. It’s about time we did this, and it is what people should be doing about things we don’t understand.”

The researchers have begun the process of mining their imaging data by looking first at an area of the brain that receives sensory information from mouse whiskers, which help the animals orient themselves and are even more sensitive than human fingertips. The scientists used a program called VAST, developed by co-author Daniel Berger of Harvard and the Massachusetts Institute of Technology, to assign different colors and piece apart each individual “object” (e.g., neuron, glial cell, blood vessel cell, etc.).

“The complexity of the brain is much more than what we had ever imagined,” says study first author Narayanan “Bobby” Kasthuri, of the Boston University School of Medicine. “We had this clean idea of how there’s a really nice order to how neurons connect with each other, but if you actually look at the material it’s not like that. The connections are so messy that it’s hard to imagine a plan to it, but we checked and there’s clearly a pattern that cannot be explained by randomness.”

The researchers see great potential in the tool’s ability to answer questions about what a neurological disorder actually looks like in the brain, as well as what makes the human brain different from other animals and different between individuals. Who we become is very much a product of the connections our neurons make in response to various life experiences. To be able to compare the physical neuron-to-neuron connections in an infant, a mathematical genius, and someone with schizophrenia would be a leap in our understanding of how our brains shape who we are (or vice versa).

The cost and data storage demands for this type of research are still high, but the researchers expect expenses to drop over time (as has been the case with genome sequencing). To facilitate data sharing, the scientists are now partnering with Argonne National Laboratory with the hopes of creating a national brain laboratory that neuroscientists around the world can access within the next few years.

“It’s bittersweet that there are many scientists who think this is a total waste of time as well as a big investment in money and effort that could be better spent answering questions that are more proximal,” Lichtman says. “As long as data is showing you things that are unexpected, then you’re definitely doing the right thing. And we are certainly far from being out of the surprise element. There’s never a time when we look at this data that we don’t see something that we’ve never seen before.”

Here’s a link to and a citation for the paper,

Saturated Reconstruction of a Volume of Neocortex by Narayanan Kasthuri, Kenneth Jeffrey Hayworth, Daniel Raimund Berger, Richard Lee Schalek, José Angel Conchello, Seymour Knowles-Barley, Dongil Lee, Amelio Vázquez-Reina, Verena Kaynig, Thouis Raymond Jones, Mike Roberts, Josh Lyskowski Morgan, Juan Carlos Tapia, H. Sebastian Seung, William Gray Roncal, Joshua Tzvi Vogelstein, Randal Burns, Daniel Lewis Sussman, Carey Eldin Priebe, Hanspeter Pfister, Jeff William Lichtman. Cell Volume 162, Issue 3, p648–661, 30 July 2015 DOI: http://dx.doi.org/10.1016/j.cell.2015.06.054

This appears to be an open access paper.

IBM and its working 7nm test chip

I wrote about IBM and its plans for a 7nm computer chip in a July 11, 2014 posting, which also mentioned HP Labs’ and other companies’ plans for shrinking their computer chips. Almost one year later, IBM has announced a working 7nm test chip in a July 9, 2015 IBM news release on PRnewswire.com,

An alliance led by IBM Research (NYSE: IBM) today announced that it has produced the semiconductor industry’s first 7nm (nanometer) node test chips with functioning transistors. The breakthrough, accomplished in partnership with GLOBALFOUNDRIES and Samsung at SUNY Polytechnic Institute’s Colleges of Nanoscale Science and Engineering (SUNY Poly CNSE), could result in the ability to place more than 20 billion tiny switches — transistors — on the fingernail-sized chips that power everything from smartphones to spacecraft.

To achieve the higher performance, lower power and scaling benefits promised by 7nm technology, researchers had to bypass conventional semiconductor manufacturing approaches. Among the novel processes and techniques pioneered by the IBM Research alliance were a number of industry-first innovations, most notably Silicon Germanium (SiGe) channel transistors and Extreme Ultraviolet (EUV) lithography integration at multiple levels.

Industry experts consider 7nm technology crucial to meeting the anticipated demands of future cloud computing and Big Data systems, cognitive computing, mobile products and other emerging technologies. Part of IBM’s $3 billion, five-year investment in chip R&D (announced in 2014), this accomplishment was made possible through a unique public-private partnership with New York State and joint development alliance with GLOBALFOUNDRIES, Samsung and equipment suppliers. The team is based at SUNY Poly’s NanoTech Complex in Albany [New York state].

“For business and society to get the most out of tomorrow’s computers and devices, scaling to 7nm and beyond is essential,” said Arvind Krishna, senior vice president and director of IBM Research. “That’s why IBM has remained committed to an aggressive basic research agenda that continually pushes the limits of semiconductor technology. Working with our partners, this milestone builds on decades of research that has set the pace for the microelectronics industry, and positions us to advance our leadership for years to come.”

Microprocessors utilizing 22nm and 14nm technology power today’s servers, cloud data centers and mobile devices, and 10nm technology is well on the way to becoming a mature technology. The IBM Research-led alliance achieved close to 50 percent area scaling improvements over today’s most advanced technology, introduced SiGe channel material for transistor performance enhancement at 7nm node geometries, process innovations to stack them below 30nm pitch and full integration of EUV lithography at multiple levels. These techniques and scaling could result in at least a 50 percent power/performance improvement for next generation mainframe and POWER systems that will power the Big Data, cloud and mobile era.
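The “close to 50 percent area scaling” figure lines up with simple geometric scaling between node names, if we assume feature dimensions shrink in proportion to the node label (a rough convention, since node names are partly marketing rather than literal dimensions):

```python
# Rough node-to-node scaling estimate. If linear dimensions shrink in
# proportion to the node name, area shrinks with the square of the ratio.
def area_scaling(old_node_nm, new_node_nm):
    """Fraction of the old area a given circuit needs at the new node (0..1)."""
    return (new_node_nm / old_node_nm) ** 2

# 10nm -> 7nm: (7/10)^2 = 0.49, i.e. close to 50% area scaling,
# or roughly double the transistor density in the same footprint.
print(round(area_scaling(10, 7), 2))
```

That near-halving of area per node is also why the quoted 20-billion-transistor figure becomes plausible on a fingernail-sized die.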

“Governor Andrew Cuomo’s trailblazing public-private partnership model is catalyzing historic innovation and advancement. Today’s [July 8, 2015] announcement is just one example of our collaboration with IBM, which furthers New York State’s global leadership in developing next generation technologies,” said Dr. Michael Liehr, SUNY Poly Executive Vice President of Innovation and Technology and Vice President of Research.  “Enabling the first 7nm node transistors is a significant milestone for the entire semiconductor industry as we continue to push beyond the limitations of our current capabilities.”

“Today’s announcement marks the latest achievement in our long history of collaboration to accelerate development of next-generation technology,” said Gary Patton, CTO and Head of Worldwide R&D at GLOBALFOUNDRIES. “Through this joint collaborative program based at the Albany NanoTech Complex, we are able to maintain our focus on technology leadership for our clients and partners by helping to address the development challenges central to producing a smaller, faster, more cost efficient generation of semiconductors.”

The 7nm node milestone continues IBM’s legacy of historic contributions to silicon and semiconductor innovation. They include the invention or first implementation of the single cell DRAM, the Dennard Scaling Laws, chemically amplified photoresists, copper interconnect wiring, Silicon on Insulator, strained engineering, multi core microprocessors, immersion lithography, high speed SiGe, High-k gate dielectrics, embedded DRAM, 3D chip stacking and Air gap insulators.

In 2014, they were talking about carbon nanotubes with regard to the 7nm chip; this shift to silicon germanium is interesting.

Sebastian Anthony in a July 9, 2015 article for Ars Technica offers some intriguing insight into the accomplishment and the technology (Note: A link has been removed),

… While it should be stressed that commercial 7nm chips remain at least two years away, this test chip from IBM and its partners is extremely significant for three reasons: it’s a working sub-10nm chip (this is pretty significant in itself); it’s the first commercially viable sub-10nm FinFET logic chip that uses silicon-germanium as the channel material; and it appears to be the first commercially viable design produced with extreme ultraviolet (EUV) lithography.

Technologically, SiGe and EUV are both very significant. SiGe has higher electron mobility than pure silicon, which makes it better suited for smaller transistors. The gap between two silicon nuclei is about 0.5nm; as the gate width gets ever smaller (about 7nm in this case), the channel becomes so small that the handful of silicon atoms can’t carry enough current. By mixing some germanium into the channel, electron mobility increases, and adequate current can flow. Silicon generally runs into problems at sub-10nm nodes, and we can expect Intel and TSMC to follow a similar path to IBM, GlobalFoundries, and Samsung (aka the Common Platform alliance).
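Anthony’s atom-counting argument can be made concrete with the figures quoted above, taking the article’s ~0.5nm nucleus-to-nucleus gap at face value:

```python
# Using the article's numbers: with ~0.5nm between silicon nuclei,
# a 7nm-wide channel spans only a handful of atomic spacings.
channel_nm = 7.0   # channel width at the "7nm" node (from the article)
spacing_nm = 0.5   # approximate gap between silicon nuclei (from the article)

atoms_across = channel_nm / spacing_nm
print(atoms_across)  # only ~14 atomic spacings across the channel --
                     # few enough that pure silicon struggles to carry
                     # adequate current, motivating the SiGe channel
```

With so few atoms in play, bulk-material assumptions break down, which is the intuition behind mixing germanium in to boost electron mobility.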

EUV lithography is a more interesting innovation. Basically, as chip features get smaller, you need a narrower beam of light to etch those features accurately, or you need to use multiple patterning (which we won’t go into here). The current state of the art for lithography is a 193nm ArF (argon fluoride) laser; that is, the wavelength is 193nm wide. Complex optics and multiple painstaking steps are required to etch 14nm features using a 193nm light source. EUV has a wavelength of just 13.5nm, which will handily take us down into the sub-10nm realm, but so far it has proven very difficult and expensive to deploy commercially (it has been just around the corner for quite a few years now).
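The wavelength argument maps onto the Rayleigh resolution criterion, CD = k1 × λ / NA. The k1 and numerical-aperture values below are typical published figures for immersion ArF and early EUV tools, not numbers from Anthony’s article:

```python
# Rayleigh criterion for the minimum printable feature (critical dimension):
#   CD = k1 * wavelength / NA
#   k1 - process-dependent factor; NA - numerical aperture of the optics
def min_feature_nm(wavelength_nm, na, k1):
    return k1 * wavelength_nm / na

# 193nm ArF immersion (NA ~1.35, aggressive k1 ~0.3):
# single-exposure limit ~43nm, hence the multiple-patterning steps.
arf = min_feature_nm(193, 1.35, 0.3)

# 13.5nm EUV (early tools, NA ~0.33, k1 ~0.4): ~16nm in a single
# exposure, which comfortably supports sub-10nm-node feature pitches.
euv = min_feature_nm(13.5, 0.33, 0.4)

print(round(arf, 1), round(euv, 1))
```

The shorter wavelength buys roughly an order of magnitude in single-exposure resolution, which is why EUV matters despite its cost and deployment difficulties.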

If you’re interested in the nuances, I recommend reading Anthony’s article in its entirety.

One final comment: there was no discussion of electrodes or other metallic components associated with computer chips. The metallic components are a topic of some interest to me (anyway), given some research published by scientists at the Massachusetts Institute of Technology (MIT) last year. From my Oct. 14, 2014 posting,

Research from the Massachusetts Institute of Technology (MIT) has revealed a new property of metal nanoparticles, in this case, silver. From an Oct. 12, 2014 news item on ScienceDaily,

A surprising phenomenon has been found in metal nanoparticles: They appear, from the outside, to be liquid droplets, wobbling and readily changing shape, while their interiors retain a perfectly stable crystal configuration.

The research team behind the finding, led by MIT professor Ju Li, says the work could have important implications for the design of components in nanotechnology, such as metal contacts for molecular electronic circuits. [my emphasis added]

This discovery and others regarding materials and phase changes at ever diminishing sizes hint that a computer with a functioning 7nm chip might be a bit further off than IBM is suggesting.