Tag Archives: University of Illinois

Detecting COVID-19 in under five minutes with paper-based sensor made of graphene

A Dec. 7, 2020 news item on Nanowerk announced a new technology for rapid COVID-19 testing (Note: A link has been removed),

As the COVID-19 pandemic continues to spread across the world, testing remains a key strategy for tracking and containing the virus. Bioengineering graduate student Maha Alafeef has co-developed a rapid, ultrasensitive test using a paper-based electrochemical sensor that can detect the presence of the virus in less than five minutes.

The team led by professor Dipanjan Pan reported their findings in ACS Nano (“Rapid, Ultrasensitive, and Quantitative Detection of SARS-CoV-2 Using Antisense Oligonucleotides Directed Electrochemical Biosensor Chip”).

“Currently, we are experiencing a once-in-a-century life-changing event,” said Alafeef. “We are responding to this global need from a holistic approach by developing multidisciplinary tools for early detection and diagnosis and treatment for SARS-CoV-2.”

I wonder why they didn’t think to provide a caption for the graphene substrate (the square surface) underlying the gold electrode (the round thing) or provide a caption for the electrode. Maybe they assumed anyone knowledgeable about graphene would be able to identify it?

Caption: COVID-19 electrochemical sensing platform. Credit: University of Illinois

A Dec. 7, 2020 University of Illinois Grainger College of Engineering news release (also on EurekAlert) by Huan Song, which originated the news item, provides more technical detail including a description of the graphene substrate and the gold electrode, which make up the cheaper, faster COVID-19 sensing platform,

There are two broad categories of COVID-19 tests on the market. The first category uses reverse transcriptase real-time polymerase chain reaction (RT-PCR) and nucleic acid hybridization strategies to identify viral RNA. Current FDA [US Food and Drug Administration]-approved diagnostic tests use this technique. Some drawbacks include the amount of time it takes to complete the test, the need for specialized personnel and the availability of equipment and reagents.

The second category of tests focuses on the detection of antibodies. However, there could be a delay of a few days to a few weeks after a person has been exposed to the virus for them to produce detectable antibodies.

In recent years, researchers have had some success with creating point-of-care biosensors using 2D nanomaterials such as graphene to detect diseases. The main advantages of graphene-based biosensors are their sensitivity, low cost of production and rapid detection turnaround. “The discovery of graphene opened up a new era of sensor development due to its properties. Graphene exhibits unique mechanical and electrochemical properties that make it ideal for the development of sensitive electrochemical sensors,” said Alafeef. The team created a graphene-based electrochemical biosensor with an electrical read-out setup to selectively detect the presence of SARS-CoV-2 genetic material.

There are two components [emphasis mine] to this biosensor: a platform to measure an electrical read-out and probes to detect the presence of viral RNA. To create the platform, researchers first coated filter paper with a layer of graphene nanoplatelets to create a conductive film [emphasis mine]. Then, they placed a gold electrode with a predefined design on top of the graphene [emphasis mine] as a contact pad for electrical readout. Both gold and graphene have high sensitivity and conductivity which makes this platform ultrasensitive to detect changes in electrical signals.

Current RNA-based COVID-19 tests screen for the presence of the N-gene (nucleocapsid phosphoprotein) on the SARS-CoV-2 virus. In this research, the team designed antisense oligonucleotide (ASO) probes to target two regions of the N-gene. Targeting two regions ensures the reliability of the sensor in case one region undergoes gene mutation. Furthermore, gold nanoparticles (AuNPs) capped with these single-stranded nucleic acids (ssDNA) serve as an ultrasensitive sensing probe for the SARS-CoV-2 RNA.

The researchers previously showed the sensitivity of the developed sensing probes in their earlier work published in ACS Nano. The hybridization of the viral RNA with these probes causes a change in the sensor's electrical response. The AuNP caps accelerate the electron transfer and, when broadcast over the sensing platform, produce an increase in the output signal that indicates the presence of the virus.

The team tested the performance of this sensor by using COVID-19 positive and negative samples. The sensor showed a significant increase in the voltage of positive samples compared to the negative ones and confirmed the presence of viral genetic material in less than five minutes. Furthermore, the sensor was able to differentiate viral RNA loads in these samples. Viral load is an important quantitative indicator of the progress of infection and a challenge to measure using existing diagnostic methods.
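The read-out logic described above (a voltage increase over a negative-control baseline, with the size of the increase tracking viral load) can be sketched in a few lines of code. This is purely an illustration: the baseline, threshold, and load calculation are hypothetical values of my own, not numbers from the paper.

```python
# Illustrative sketch of the sensor read-out logic described in the release:
# a positive sample raises the output voltage above a negative-control
# baseline, and the size of the increase tracks viral load.
# All numbers are hypothetical, not values from the paper.

BASELINE_MV = 120.0            # hypothetical negative-control output (mV)
POSITIVE_THRESHOLD_MV = 30.0   # hypothetical minimum increase to call positive

def classify_sample(output_mv: float) -> dict:
    """Classify a reading and return a crude relative viral-load score."""
    delta = output_mv - BASELINE_MV
    positive = delta >= POSITIVE_THRESHOLD_MV
    # Relative load grows with the signal increase beyond the threshold.
    load = delta - POSITIVE_THRESHOLD_MV if positive else 0.0
    return {"positive": positive, "delta_mv": delta, "relative_load": load}

print(classify_sample(180.0))  # well above baseline: positive, nonzero load
print(classify_sample(125.0))  # near baseline: negative
```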

This platform has far-reaching applications due to its portability and low cost. The sensor, when integrated with microcontrollers and LED screens or with a smartphone via Bluetooth or wifi, could be used at the point-of-care in a doctor’s office or even at home. Beyond COVID-19, the research team also foresees the system being adapted for the detection of many different diseases.

“The unlimited potential of bioengineering has always sparked my utmost interest with its innovative translational applications,” Alafeef said. “I am happy to see my research project has an impact on solving a real-world problem. Finally, I would like to thank my Ph.D. advisor professor Dipanjan Pan for his endless support, research scientist Dr. Parikshit Moitra, and research assistant Ketan Dighe for their help and contribution toward the success of this study.”

Here’s a link to and a citation for the paper,

Rapid, Ultrasensitive, and Quantitative Detection of SARS-CoV-2 Using Antisense Oligonucleotides Directed Electrochemical Biosensor Chip by Maha Alafeef, Ketan Dighe, Parikshit Moitra, and Dipanjan Pan. ACS Nano 2020, 14, 12, 17028–17045. DOI: https://doi.org/10.1021/acsnano.0c06392. Publication date: October 20, 2020. Copyright © 2020 American Chemical Society.

I’m not sure where I found this notice but it is most definitely from the American Chemical Society: “This paper is freely accessible, at this time, for unrestricted RESEARCH re-use and analyses in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic.”

It’s about sound: (1) Earable computing? (2) The end of the cacophony in hospitals?

I have two items, both concerning sound but in very different ways.

Phones in your ears

Researchers at the University of Illinois are working on smartphones you can put in your ears like you do an earbud. The work is in its very earliest stages as they are trying to establish a new field of research. There is a proposed timeline,

Caption: Earable computing timeline, according to SyNRG. Credit: Romit Roy Choudhury, The Grainger College of Engineering

Here’s more from a December 2, 2020 University of Illinois Grainger College of Engineering news release (also on EurekAlert but published on December 15, 2020),

CSL’s [Coordinated Science Laboratory] Systems and Networking Research Group (SyNRG) is defining a new sub-area of mobile technology that they call “earable computing.” The team believes that earphones will be the next significant milestone in wearable devices, and that new hardware, software, and apps will all run on this platform.

“The leap from today’s earphones to ‘earables’ would mimic the transformation that we had seen from basic phones to smartphones,” said Romit Roy Choudhury, professor in electrical and computer engineering (ECE). “Today’s smartphones are hardly a calling device anymore, much like how tomorrow’s earables will hardly be a smartphone accessory.”

Instead, the group believes tomorrow’s earphones will continuously sense human behavior, run acoustic augmented reality, have Alexa and Siri whisper just-in-time information, track user motion and health, and offer seamless security, among many other capabilities.

The research questions that underlie earable computing draw from a wide range of fields, including sensing, signal processing, embedded systems, communications, and machine learning. The SyNRG team is on the forefront of developing new algorithms while also experimenting with them on real earphone platforms with live users.

Computer science PhD student Zhijian Yang and other members of the SyNRG group, including his fellow students Yu-Lin Wei and Liz Li, are leading the way. They have published a series of papers in this area, starting with one on the topic of hollow noise cancellation that was published at ACM SIGCOMM 2018. Recently, the group had three papers published at the 26th Annual International Conference on Mobile Computing and Networking (ACM MobiCom) on three different aspects of earables research: facial motion sensing, acoustic augmented reality, and voice localization for earphones.

In Ear-AR: Indoor Acoustic Augmented Reality on Earphones, the group looks at how smart earphone sensors can track human movement, and, depending on the user’s location, play 3D sounds in the ear.

“If you want to find a store in a mall,” says Zhijian, “the earphone could estimate the relative location of the store and play a 3D voice that simply says ‘follow me.’ In your ears, the sound would appear to come from the direction in which you should walk, as if it’s a voice escort.”
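The "voice escort" effect Zhijian describes boils down to two steps: compute the target's bearing relative to the user's heading, then render the sound so it seems to come from that bearing. Here is a minimal sketch; the function names, the simple constant-power panning, and the geometry conventions are my own illustrative assumptions, not the Ear-AR implementation.

```python
import math

# Hypothetical sketch of the "voice escort" idea: given the user's position,
# heading, and a target location, compute the direction the sound should
# appear to come from, then derive crude left/right ear gains (a simple
# stand-in for real 3D/binaural rendering).

def azimuth_to_target(user_xy, heading_deg, target_xy):
    """Angle of the target relative to the user's facing direction, in degrees.
    Convention (assumed): 0 = straight ahead, positive = to the right;
    heading 0 means facing the +y axis."""
    dx = target_xy[0] - user_xy[0]
    dy = target_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dx, dy))  # 0 deg = +y axis
    return (bearing - heading_deg + 180) % 360 - 180

def stereo_gains(azimuth_deg):
    """Constant-power pan: a target to the right boosts the right ear."""
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))
    theta = (pan + 1) * math.pi / 4  # 0 (full left) .. pi/2 (full right)
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# A target directly to the right of a user facing +y:
print(azimuth_to_target((0.0, 0.0), 0.0, (10.0, 0.0)))  # 90.0
```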

The second paper, EarSense: Earphones as a Teeth Activity Sensor, looks at how earphones could sense facial and in-mouth activities such as teeth movements and taps, enabling a hands-free modality of communication to smartphones. Moreover, various medical conditions manifest in teeth chatter, and the proposed technology would make it possible to identify them by wearing earphones during the day. In the future, the team is planning to look into analyzing facial muscle movements and emotions with earphone sensors.

The third publication, Voice Localization Using Nearby Wall Reflections, investigates the use of algorithms to detect the direction of a sound. This means that if Alice and Bob are having a conversation, Bob’s earphones would be able to tune into the direction Alice’s voice is coming from.
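A basic building block behind any such direction-finding is the time difference of arrival (TDOA) between two microphones. The published system goes much further by exploiting wall reflections; the sketch below shows only the textbook far-field TDOA-to-angle step, with assumed constants of my own choosing.

```python
import math

# Hedged sketch of the two-microphone starting point for direction-of-arrival
# estimation: the delay between the ears constrains where a voice comes from.
# Far-field model: tdoa = (d / c) * sin(theta). Constants are assumptions.

SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature
EAR_SPACING = 0.18      # m, assumed distance between earphone microphones

def doa_from_tdoa(tdoa_s: float) -> float:
    """Angle from straight ahead, in degrees, given the inter-ear delay."""
    s = max(-1.0, min(1.0, tdoa_s * SPEED_OF_SOUND / EAR_SPACING))
    return math.degrees(math.asin(s))

# Example: a delay of about 0.26 ms corresponds to roughly 30 degrees off-axis.
print(doa_from_tdoa(0.18 * 0.5 / 343.0))
```

A single delay leaves a front/back ambiguity, which is one reason the paper's use of wall reflections is interesting.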

“We’ve been working on mobile sensing and computing for 10 years,” said Wei. “We have a lot of experience to define this emerging landscape of earable computing.”

Haitham Hassanieh, assistant professor in ECE, is also involved in this research. The team has been funded by both NSF [US National Science Foundation] and NIH [National Institutes of Health], as well as companies like Nokia and Google. See more at the group’s Earable Computing website.

Noise hurts hospital caregivers and patients

A December 11, 2020 Canadian Broadcasting Corporation (CBC) article features one of the corporation’s Day 6 Radio programmes. This one was about proposed sound design in hospitals as imagined by musician Yoko Sen and, on a converging track, professor of applied psychology, Judy Edworthy,

As an ambient electronic musician, Yoko Sen spends much of her time building intricate, soothing soundscapes. 

But when she was hospitalized in 2012, she found herself immersed in a very different sound environment.

Already panicked by her health condition, she couldn’t tune out the harsh tones of the medical machinery in her hospital room.

Instead, she zeroed in on two machines — a patient monitor and a bed fall alarm. Their piercing tones had blended together to create a diminished fifth, a musical interval so offensive that it was banned in medieval churches.

Sen went on to start Sen Sound, a Washington, D.C.-based social enterprise dedicated to improving the sound of hospitals. 

‘Alarms are ignored, missed’

The volume of noise in today’s hospitals isn’t just unpleasant. It can also put patients’ health at risk.

According to Judy Edworthy, a professor of applied psychology at the University of Plymouth, the sheer number of alarms going off each day can spark a sort of auditory burnout among doctors and nurses.

“Alarms are ignored, missed, or generally just not paid attention to,” says Edworthy.

In a hospital environment that’s also inundated with announcements from overhead speakers, ringing phones, trolleys, and all other manner of insidious background sound, it can be difficult for staff to accurately locate and differentiate between the alarms.

Worse yet, in many hospitals, many of the alarms ringing out across the ward are false. Studies have shown that as many as 99 per cent of clinical alarms are inaccurate.

The resulting problem has become so widespread that a term has been coined to describe it: alarm fatigue.

Raising the alarm

Sen’s company, launched in 2016, has partnered with hospitals and design incubators and even collaborates directly with medical device companies seeking to redesign their alarms.

Over the years, Sen has interviewed countless patients and hospital staff who share her frustration with noise. 

But when she first sat down with the engineers responsible for the devices’ design, she found that they tended to treat the sound of their devices as an “afterthought.” 

“When people first started to develop medical devices … people thought it was a good idea to have one or two sounds to demonstrate or to indicate when, let’s say for example, the patient’s temperature … exceeded some kind of range,” she [Edworthy] said.

“There wasn’t really any design put into this; it was just a sound that people thought would get your attention by being very loud and aversive and so on.”

Edworthy, who has spent decades studying medical alarm design, took things one step further this summer. In July, the International Organization for Standardization approved a new set of alarm designs, created by Edworthy, that mimic the natural hospital environment.

The standards, which are accepted by Health Canada, include an electronic heartbeat sound for alarms related to cardiac issues; and a rattling pillbox for drug administration. 

Her [Sen’s] team continues to work with companies to improve the sound of existing medical devices. But she has also begun to think more deeply about the long-term future of hospital sound — especially as it relates to the end-of-life experience.

“A study shows that hearing can be the last sense to go when we die,” says Sen. 

“It’s really beyond upsetting to think that many people end up dying in acute care hospitals and there are all these medical devices.”

As part of her interviews with patients, Sen has asked what sounds they would most like to hear at the end of their lives — and she discovered a common theme in their responses.

“I asked this question in many different countries, but they are all sounds that symbolize life,” said Sen. 

“Sounds of nature, sound of water, voices of loved ones. It’s all the sounds of life that people say they want to hear.”

As the pandemic continues to affect hospitals around the world, those efforts have taken on a new resonance — and Sen hopes the current crisis might serve as an opportunity to help usher in a more healing soundscape.

“My own health crisis almost gave me a new pathway in life,” she said.

There’s an embedded audio file from the Day 6 radio programme featuring this material on sound design and hospitals, as well as an embedded video of Sen delivering a talk about her work, in the December 11, 2020 Canadian Broadcasting Corporation (CBC) article.

Sen and Edworthy give hope for a less noisy future and better care in hospitals for the living and the dying. Here’s a link to Sen Sound.

Quantum Rhapsodies

“Quantum Rhapsodies” combines a narrative script, video images and live music by the Jupiter String Quartet to explore the world of quantum physics. The performance will premiere April 10 [2019] at the Beckman Institute for Advanced Science and Technology. Courtesy Beckman Institute for Advanced Science and Technology

Here’s more about Quantum Rhapsodies, a free public art/science music performance at the University of Illinois on April 10, 2019, from an April 5, 2019 University of Illinois news release (also here) by Jodi Heckel,

A new performance that explores the world of quantum physics will feature the music of the Jupiter String Quartet, a fire juggler and a fantastical “Alice in Quantumland” scene.

“Quantum Rhapsodies,” the vision of physics professor Smitha Vishveshwara, looks at the foundational developments in quantum physics, the role it plays in our world and in technology such as the MRI, and the quantum mysteries that remain unanswered.

“The quantum world is a world that inspires awe, but it’s also who we are and what we are made of,” said Vishveshwara, who wrote the piece and guided the visuals.

The performance will premiere April 10 [2019] as part of the 30th anniversary celebration of the Beckman Institute for Advanced Science and Technology. The event begins with a 5 p.m. reception, followed by the performance at 6 p.m. and a meet-and-greet with the show’s creators at 7 p.m. The performance will be in the atrium of the Beckman Institute, 405 N. Mathews Ave., Urbana, [emphases mine] and it is free and open to the public. While the available seating is filling up, the atrium space will allow for an immersive experience in spite of potentially restricted viewing.

The production is a sister piece to “Quantum Voyages,” a performance created in 2018 by Vishveshwara and theatre professor Latrelle Bright to illustrate the basic concepts of quantum physics. It was performed at a quantum physics conference celebrating Nobel Prize-winning physicist Anthony Leggett’s 80th birthday in 2018.

While “Quantum Voyages” was a live theater piece, “Quantum Rhapsodies” combines narration by Bright, video images and live music from the Jupiter String Quartet. It ponders the wonder of the cosmos, the nature of light and matter, and the revolutionary ideas of quantum physics. A central part of the narrative involves the theory of Nobel Prize-winning French physicist Louis de Broglie that matter, like light, can behave as a wave.

The visuals – a blend of still images, video and animation – were created by a team consisting of the Beckman Visualization Laboratory; Steven Drake, a video producer at Beckman; filmmaker Nic Morse of Protagonist Pizza Productions; and members of a class Vishveshwara teaches, Where the Arts Meet Physics.

The biggest challenge in illustrating the ideas in the script was conveying the scope of the piece, from the galactic scale of the cosmos to the subatomic scale of the quantum world, Drake said. The concepts of quantum physics “are not something you can see. It’s theoretical or so small you can’t put it under a microscope or go out into the real world and film it,” he said.

Much of the work involved finding images, both scientific and artistic, that would help illustrate the concepts of the piece and complement the poetic language that Vishveshwara used, as well as the music.

Students and teaching assistant Danielle Markovich from Vishveshwara’s class contributed scientific images and original paintings. Drake used satellite images from the Hubble Space Telescope and other satellites, as well as animation created by the National Center for Supercomputing Applications in its work with NASA, for portions of the script talking about the cosmos. The Visualization Laboratory provided novel scientific visualizations.

“What we’re good at doing and have done for years is taking research content and theories and visualizing that information. We do that for a very wide variety of research and data. We’re good at coming up with images that represent these invisible worlds, like quantum physics,” said Travis Ross, the director of the lab.

Some ideas required conceptual images, such as footage by Morse of a fire juggler at Allerton Park to represent light and of hands moving to depict the rotational behavior of water-based hydrogen within a person in an MRI machine.

Motion was incorporated into a painting of a lake to show water rippling and light flickering across it to illustrate light waves. In the “Alice in Quantumland” sequence, a Mad Hatter’s tea party filmed at the Illini Union was blended with cartoonlike animated elements into the fantasy sequence by Jose Vazquez, an illustrator and concept artist who works in the Visualization Lab.

“Our main objective is making sure we’re representing it in a believable way that’s also fun and engaging,” Ross said. “We’ve never done anything quite like this. It’s pretty unique.”

In addition to performing the score, members of the Jupiter String Quartet were the musical directors, creating the musical narrative to mesh with the script. The music includes contemplative compositions by Beethoven to evoke the cosmos and playful modern compositions that summon images of the movements of particles and waves.

“I was working with such talented people and creative minds, and we had fun and came up with these seemingly absurd ideas. But then again, it’s like that with the quantum world as well,” Vishveshwara said.

“My hope is not necessarily for people to understand everything, but to infuse curiosity and to feel the grandness and the beauty that is part of who we are and the cosmos that we live in,” she said.

Here’s a preview of this free public performance,

How to look at SciArt (also known as, art/science depending on your religion)

There’s an intriguing April 8, 2019 post on the Science Borealis blog by Katrina Vera Wong and Raymond Nakamura titled: How to look at (and appreciate) SciArt,

….

The recent #SciArt #TwitterStorm, in which participants tweeted their own sciart and retweeted that of others, illustrated the diversity of approaches to melding art and science. With all this work out there, what can we do, as advocates of art and science, to better appreciate sciart? We’d like to foster interest in, and engagement with, sciart so that its value goes beyond how much it costs or how many likes it gets.

An article by Kit Messham-Muir based on the work of art historian Erwin Panofsky outlines a three-step strategy for looking at art: Look. See. Think. Looking is observing what the elements are. Seeing draws meaning from it. Thinking links personal experience and accessible information to the piece at hand.

Looking and seeing is also part of the Visual Thinking Strategies (VTS) method originally developed for looking at art and subsequently applied to science and other subjects as a social, object-oriented learning process. It begins by asking, “What is going on here?”, followed by “What do you see that makes you think that?” This allows learners of different backgrounds to participate and encourages the pursuit of evidence to back up opinions.

Let’s see how these approaches might work on your own or in conversation. Take, for example, the following work by natural history illustrator Julius Csotonyi:

I hope some of our Vancouver-based (Canada) art critics get a look at some of this material. I read a review a few years ago and the critic seemed intimidated by the idea of looking at work that explicitly integrated and reflected on science. Since that time (Note: there aren’t that many art reviewers here), I have not seen another attempt by an art critic.

Nanotechnology and Pakistan

I don’t often get information about nanotechnology in Pakistan, so this March 6, 2017 news article by Myra Imran on the TheNews.com website provides some welcome insight,

Pakistan has the right level of expert human resource and scientific activity in the field of nanotechnology. A focused national strategy and sustainable funding can make Pakistan one of the leaders in this sector.

These views were expressed by Dr Munir H. Nayfeh, professor of physics at the University of Illinois and founder and president of NanoSi Advanced Technology, Inc. Dr Nayfeh, along with Dr. Irfan Ahmad, executive director of the Centre for Nanoscale Science and Technology and research faculty in the Department of Agricultural and Biological Engineering, University of Illinois, and Dr. Bulent Aydogan, associate professor and director of the Medical Physics Programme at the Pritzker School of Medicine, University of Chicago, was invited by the COMSATS Institute of Information Technology (CIIT) to deliver lectures on nanotechnology research and entrepreneurship with special focus on cancer nanomedicine.

The objective of the visit was to motivate and mentor faculty and students at COMSATS and also to provide feedback to campus administration and the Federal Ministry of Science and Technology on strategic initiatives to help develop the next generation of science and engineering workforce in Pakistan.

A story of success for the Muslim youth from areas affected by conflict and war, Dr Nayfeh, a Palestinian by origin, was brought up in a conflict area by a mother who did not know how to read and write. For him, the environment was actually a motivator to work hard and study. “My mother was uneducated but she always wanted her children to get the highest degree possible and both my parents supported us in whatever way possible to achieve our dreams,” he recalled.

Comparing Pakistan with other developing countries in scientific research enterprise, he said that despite a lack of resources, he has observed a decent amount of research output from the existing setups. About their visits to different labs, he said that they found faculty members and researchers in need of more and more funds. “I don’t blame them as I am also looking for more and more funds even in America. This is a positive sign which shows that these setups are alive and want to do more.”

Dr. Nayfeh is greatly impressed with the number of women researchers and students in Pakistan. “In Tunisia and Algeria, there were a decent number of women in this field, but Pakistan has the most, and there are more publications coming out of Pakistan as compared to other developing countries.”

If you have the time, I suggest you read the article in its entirety.

Skin-like electronic bandage

Not sure how I feel about an electronic bandage; presumably it won’t electrocute me should it encounter my blood. If the Nov. 17, 2016 news item on phys.org is to be believed, it’s more sensor than bandage,

A skin-like biomedical technology that uses a mesh of conducting nanowires and a thin layer of elastic polymer might bring new electronic bandages that monitor biosignals for medical applications and provide therapeutic stimulation through the skin.

The biomedical device mimics the human skin’s elastic properties and sensory capabilities.

“It can intimately adhere to the skin and simultaneously provide medically useful biofeedback such as electrophysiological signals,” said Chi Hwan Lee, an assistant professor of biomedical engineering and mechanical engineering at Purdue University. “Uniquely, this work combines high-quality nanomaterials into a skin-like device, thereby enhancing the mechanical properties.”

The device could be likened to an electronic bandage and might be used to treat medical conditions using thermotherapeutics, where heat is applied to promote vascular flow for enhanced healing, said Lee, who worked with a team that includes Purdue graduate student Min Ku Kim.

A November 17, 2016 Purdue University news release by Emil Venere, which originated the news item, provides more insight into the work,

Traditional approaches to developing such a technology have used thin films made of ductile metals such as gold, silver and copper.

“The problem is that these thin films are susceptible to fractures by over-stretching and cracking,” Lee said. “Instead of thin films we use nanowire mesh film, which makes the device more resistive to stretching and cracking than otherwise possible. In addition, the nanowire mesh film has very high surface area compared to conventional thin films, with more than 1,000 times greater surface roughness. So once you attach it to the skin the adhesion is much higher, reducing the potential of inadvertent delamination.”

Findings are detailed in a research publication appearing online in October [2016] in Advanced Materials. The paper is also available online at http://onlinelibrary.wiley.com/doi/10.1002/adma.201603878/full and was authored by Kim; postdoctoral researcher Seungyong Han at the University of Illinois, Urbana-Champaign; Purdue graduate student Dae Seung Wie; Oklahoma State University assistant professor Shuodao Wang and postdoctoral researcher Bo Wang; and Lee.

The conducting nanowires are around 50 nanometers in diameter and more than 150 microns long and are embedded inside a thin layer of elastomer, or elastic polymer, about 1.5 microns thick. To demonstrate its utility in medical diagnostics, the device was used to record electrophysiological signals from the heart and muscles. A YouTube video about the research is available at https://youtu.be/tYRebHNi6p4.

“Recording the electrophysiological signals from the skin can provide wearers and clinicians with quantitative measures of the heart’s activity or the muscle’s activity,” Lee said.
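As a toy example of how such a recording becomes a "quantitative measure," here is a naive peak-detection sketch that estimates heart rate from an ECG-like trace. It is a generic illustration on a synthetic signal, not the Purdue team's processing pipeline.

```python
# Naive illustration: estimate heart rate from an ECG-like trace by counting
# prominent peaks (R-peaks). Real devices use far more robust pipelines;
# the signal below is synthetic and the threshold is arbitrary.

def detect_peaks(signal, threshold):
    """Indices of local maxima that rise above the threshold."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] > threshold and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]:
            peaks.append(i)
    return peaks

def heart_rate_bpm(signal, fs_hz, threshold=0.5):
    """Average beats per minute from inter-peak intervals."""
    peaks = detect_peaks(signal, threshold)
    if len(peaks) < 2:
        return None
    intervals = [(b - a) / fs_hz for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic "ECG": one sharp beat per second, sampled at 100 Hz for 10 s.
fs = 100
sig = [1.0 if i % fs == 0 else 0.0 for i in range(10 * fs)]
print(heart_rate_bpm(sig, fs))  # 60.0
```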

Much of the research was performed in the Birck Nanotechnology Center in Purdue’s Discovery Park.

“The nanowire mesh film was initially formed on a conventional silicon wafer with existing micro- and nano-fabrication technologies. Our unique technique, called a crack-driven transfer printing technique, allows us to controllably peel off the device layer from the silicon wafer, and then apply it onto the skin,” Lee said.

The Oklahoma State researchers contributed theoretical simulations related to the underlying mechanics of the devices, and Seungyong Han synthesized and provided the conducting nanowires.

Future research will be dedicated to developing a transdermal drug-delivery bandage that would transport medications through the skin in an electronically controlled fashion. Such a system might include built-in sensors to detect the level of injury and autonomously deliver the appropriate dose of drugs.

Here’s a link to and a citation for the paper mentioned in the news release,

Mechanically Reinforced Skin-Electronics with Networked Nanocomposite Elastomer by Seungyong Han, Min Ku Kim, Bo Wang, Dae Seung Wie, Shuodao Wang, and Chi Hwan Lee. Advanced Materials DOI: 10.1002/adma.201603878 Version of Record online: 7 OCT 2016

© 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall but you can watch the video mentioned in the news release,

It seems that liquids won’t be a problem with regard to electrocution and I notice they keep calling it a biopatch not a bandaid.

Not exactly ‘Prey’: self-organizing materials that can mimic swarm behaviour

Prey, a 2002 novel by Michael Crichton, focused on nanotechnology and other emerging technologies and how their development could lead to unleashing swarms of nanobots with agendas of their own. Crichton’s swarms had collective artificial intelligence and could mass themselves together to take on different macroscale shapes to achieve their own ends. This latest development has nowhere near that potential, not yet and probably never. From a July 21, 2016 news item on ScienceDaily,

A new study by an international team of researchers affiliated with the Ulsan National Institute of Science and Technology (UNIST) [Korea] has announced that they have succeeded in demonstrating control over the interactions occurring among microscopic spheres, which cause them to self-propel into swarms, chains, and clusters.

The research published in the current online edition of Nature Materials takes lessons from cooperation in nature, including that observed in honey bee swarms and bacterial clusters. In the study, the team has successfully demonstrated the self-organizing pattern formation in active materials at microscale by modifying only one parameter.

A July 21, 2016 UNIST press release, which originated the news item, expands on the theme,

This breakthrough comes from research conducted by Dr. Steve Granick (School of Natural Science, UNIST) of the IBS Center for Soft and Living Matter in collaboration with Dr. Erik Luijten from Northwestern University. Ming Han, a PhD student in Luijten’s laboratory, and Jing Yan, a former graduate student at the University of Illinois, served as co-first authors of the paper.

Researchers expect that such active particles could open a new class of technologies with applications in medicine, chemistry, and engineering as well as advance scientists’ fundamental understanding of collective, dynamic behavior in systems.

According to the research team, both Dr. Luijten and Dr. Granick stressed the significance of teamwork, as this breakthrough grows out of a longtime partnership built around a class of soft-matter particles known as Janus colloids, which Dr. Granick had earlier created in his laboratory. Dr. Luijten’s team completed the theoretical computer simulations, while Dr. Granick’s group used the colloids to test the collective, dynamic behavior experimentally in the laboratory.

The micron-sized spheres, typically suspended in solution, were named after the Roman god with two faces as they have attractive interactions on one side and negative charges on the other side.

The electrostatic interactions between the two sides of the self-propelled spheres could be manipulated by subjecting the colloids to an electric field. Some particles experienced stronger repulsions between their forward-facing sides, others experienced the opposite, and a third set remained electrically neutral. This imbalance caused the self-propelled particles to swim and self-organize into one of four patterns: swarms, chains, clusters, or isotropic gases.

To avoid head-to-head collisions, head-repulsive particles swam side-by-side, forming into swarms. Depending on the electric-field frequency, tail-repulsive particles positioned their tails apart, thus encouraging them to face each other to form jammed clusters of high local density. Also, swimmers with equal-and-opposite charges attracted one another into connected chains.
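The swarming pattern described above can be illustrated with a toy alignment model in Python. This is a Vicsek-style sketch, not the UNIST team’s actual model, and every parameter here is invented: each particle self-propels at constant speed and steers toward the average heading of its neighbours, which is enough to produce collective swarming.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(pos, theta, speed=0.03, radius=0.1, noise=0.1, box=1.0):
    """One update of a Vicsek-style swarm: each particle steers toward
    the mean heading of neighbours within `radius`, plus angular noise,
    then moves forward at constant speed (periodic boundaries)."""
    d = pos[:, None, :] - pos[None, :, :]
    d -= box * np.round(d / box)                    # minimum-image distances
    neigh = (d ** 2).sum(-1) < radius ** 2          # neighbour mask (incl. self)
    mean_sin = (neigh * np.sin(theta)[None, :]).sum(1)
    mean_cos = (neigh * np.cos(theta)[None, :]).sum(1)
    theta = np.arctan2(mean_sin, mean_cos) + noise * rng.uniform(-np.pi, np.pi, len(pos))
    pos = (pos + speed * np.column_stack([np.cos(theta), np.sin(theta)])) % box
    return pos, theta

n = 200
pos = rng.uniform(0.0, 1.0, (n, 2))
theta = rng.uniform(-np.pi, np.pi, n)
for _ in range(300):
    pos, theta = step(pos, theta)

# polar order parameter: ~0 for random headings, -> 1 for a single swarm
order = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
print(f"polar order after 300 steps: {order:.2f}")
```

The order parameter is a standard way to quantify whether the particles have organized into a swarm or remain an isotropic gas.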

Dr. Granick states, “This truly is a joint work of the technological know-how by the Korean IBS and the University of Illinois, as well as the computer simulations technology by Northwestern University.” He expects that this breakthrough could find application in sensing, drug delivery, or even microrobotics.

With this discovery, a drug could be placed within particles, for instance, that cluster into the delivery spot. Moreover, alterations in the environment could be perceived if the system unexpectedly switches from swarming to forming chains.

Here’s a link to and a citation for the paper,

Reconfiguring active particles by electrostatic imbalance by Jing Yan, Ming Han, Jie Zhang, Cong Xu, Erik Luijten, & Steve Granick. Nature Materials (2016)  doi:10.1038/nmat4696 Published online 11 July 2016

This paper is behind a paywall.

Swallow your technology and wear it inside (wearable tech: 2 of 3)

While there are a number of wearable and fashionable pieces of technology that monitor heart rate and breathing, they are all worn on the outside of your body. Researchers are working on an alternative that can be swallowed and will monitor vital signs from within the gastrointestinal tract. I believe this is a prototype of the device,

This ingestible electronic device invented at MIT can measure heart rate and respiratory rate from inside the gastrointestinal tract. Image: Albert Swiston/MIT Lincoln Laboratory Courtesy: MIT

From a Nov. 18, 2015 news item on phys.org,

This type of sensor could make it easier to assess trauma patients, monitor soldiers in battle, perform long-term evaluation of patients with chronic illnesses, or improve training for professional and amateur athletes, the researchers say.

The new sensor calculates heart and breathing rates from the distinctive sound waves produced by the beating of the heart and the inhalation and exhalation of the lungs.

“Through characterization of the acoustic wave, recorded from different parts of the GI tract, we found that we could measure both heart rate and respiratory rate with good accuracy,” says Giovanni Traverso, a research affiliate at MIT’s Koch Institute for Integrative Cancer Research, a gastroenterologist at Massachusetts General Hospital, and one of the lead authors of a paper describing the device in the Nov. 18 issue of the journal PLOS One.

A Nov. 18, 2015 Massachusetts Institute of Technology (MIT) news release by Anne Trafton, which originated the news item, further explains the research,

Doctors currently measure vital signs such as heart and respiratory rate using techniques including electrocardiograms (ECG) and pulse oximetry, which require contact with the patient’s skin. These vital signs can also be measured with wearable monitors, but those are often uncomfortable to wear.

Inspired by existing ingestible devices that can measure body temperature, and others that take internal digestive-tract images, the researchers set out to design a sensor that would measure heart and respiratory rate, as well as temperature, from inside the digestive tract.

The simplest way to achieve this, they decided, would be to listen to the body using a small microphone. Listening to the sounds of the chest is one of the oldest medical diagnostic techniques, practiced by Hippocrates in ancient Greece. Since the 1800s, doctors have used stethoscopes to listen to these sounds.

The researchers essentially created “an extremely tiny stethoscope that you can swallow,” Swiston says. “Using the same sensor, we can collect both your heart sounds and your lung sounds. That’s one of the advantages of our approach — we can use one sensor to get two pieces of information.”

To translate these acoustic data into heart and breathing rates, the researchers had to devise signal processing systems that distinguish the sounds produced by the heart and lungs from each other, as well as from background noise produced by the digestive tract and other parts of the body.
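The separation the researchers describe amounts to band-filtering the microphone signal: heart sounds occupy tens of hertz, while respiration shows up below about 1 Hz. Here is a hedged sketch of that idea on a synthetic signal; the rates, filter bands, and thresholds are invented for illustration and are not taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs, T = 200, 30                      # sample rate (Hz) and duration (s)
t = np.arange(0, T, 1 / fs)

# Synthetic stand-in for the capsule's microphone signal:
# breathing as a slow 0.25 Hz swing (15 breaths/min), heart sounds as
# short 40 Hz bursts repeating at 1.2 Hz (72 bpm), plus noise.
resp = np.sin(2 * np.pi * 0.25 * t)
bursts = (np.sin(2 * np.pi * 1.2 * t) > 0.8).astype(float)
heart = bursts * np.sin(2 * np.pi * 40 * t)
mic = resp + heart + 0.05 * np.random.default_rng(1).standard_normal(len(t))

# Separate the two sources by frequency band.
b_h, a_h = butter(4, [20, 80], btype="bandpass", fs=fs)  # heart sounds
b_e, a_e = butter(4, 3.0, btype="lowpass", fs=fs)        # beat envelope
b_r, a_r = butter(4, 1.0, btype="lowpass", fs=fs)        # respiration

envelope = filtfilt(b_e, a_e, np.abs(filtfilt(b_h, a_h, mic)))
resp_band = filtfilt(b_r, a_r, mic)

# Count peaks and convert to per-minute rates.
hr_peaks, _ = find_peaks(envelope, height=envelope.max() * 0.3, distance=fs // 2)
rr_peaks, _ = find_peaks(resp_band, distance=2 * fs)

print(f"estimated heart rate: {len(hr_peaks) * 60 / T:.0f} bpm")
print(f"estimated respiratory rate: {len(rr_peaks) * 60 / T:.0f} breaths/min")
```

The real device additionally has to reject digestive-tract noise, which is why the paper’s signal processing is considerably more involved than this two-band sketch.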

The entire sensor is about the size of a multivitamin pill and consists of a tiny microphone packaged in a silicone capsule, along with electronics that process the sound and wirelessly send radio signals to an external receiver, with a range of about 3 meters.

In tests along the GI tract of pigs, the researchers found that the device could accurately pick up heart rate and respiratory rate, even when conditions such as the amount of food being digested were varied.

“The authors introduce some interesting and radically different approaches to wearable physiological status monitors, in which the devices are not worn on the skin or on clothing, but instead reside, in a transient fashion, inside the gastrointestinal tract. The resulting capabilities provide a powerful complement to those found in wearable technologies as traditionally conceived,” says John Rogers, a professor of materials science and engineering at the University of Illinois who was not part of the research team.

Better diagnosis

The researchers expect that the device would remain in the digestive tract for only a day or two, so for longer-term monitoring, patients would swallow new capsules as needed.

For the military, this kind of ingestible device could be useful for monitoring soldiers for fatigue, dehydration, tachycardia, or shock, the researchers say. When combined with a temperature sensor, it could also detect hypothermia, hyperthermia, or fever from infections.

In the future, the researchers plan to design sensors that could diagnose heart conditions such as abnormal heart rhythms (arrhythmias), or breathing problems including emphysema or asthma. Currently doctors require patients to wear a harness (Holter) monitor for up to a week to detect such problems, but these often fail to produce a diagnosis because patients are uncomfortable wearing them 24 hours a day.

“If you could ingest a device that would listen for those pathological sounds, rather than wearing an electrical monitor, that would improve patient compliance,” Swiston says.

The researchers also hope to create sensors that would not only diagnose a problem but also deliver a drug to treat it.

“We hope that one day we’re able to detect certain molecules or a pathogen and then deliver an antibiotic, for example,” Traverso says. “This development provides the foundation for that kind of system down the line.”

MIT has provided a video with two of the researchers describing their work and plans for its future development,

Here’s a link to and a citation for the paper,

Physiologic Status Monitoring via the Gastrointestinal Tract by G. Traverso, G. Ciccarelli, S. Schwartz, T. Hughes, T. Boettcher, R. Barman, R. Langer, & A. Swiston. PLOS ONE DOI: 10.1371/journal.pone.0141666 Published: November 18, 2015

This paper is open access.

Note added Nov. 25, 2015 at 1625 hours PDT: US National Public Radio (NPR) has a story on this research. You can find the Nov. 23, 2015 podcast (about six minutes) and a series of textual excerpts featuring Albert Swiston, biomaterials scientist at MIT, and Stephen Shankland, senior writer for CNET covering digital technology, from the podcast here.

Center for Sustainable Nanotechnology or how not to poison and make the planet uninhabitable

I received notice of the Center for Sustainable Nanotechnology’s newest deal with the US National Science Foundation in an August 31, 2015 University of Wisconsin-Madison (UW-Madison) news release sent by email,

The Center for Sustainable Nanotechnology, a multi-institutional research center based at the University of Wisconsin-Madison, has inked a new contract with the National Science Foundation (NSF) that will provide nearly $20 million in support over the next five years.

Directed by UW-Madison chemistry Professor Robert Hamers, the center focuses on the molecular mechanisms by which nanoparticles interact with biological systems.

Nanotechnology involves the use of materials at the smallest scale, including the manipulation of individual atoms and molecules. Products that use nanoscale materials range from beer bottles and car wax to solar cells and electric and hybrid car batteries. If you read your books on a Kindle, a semiconducting material manufactured at the nanoscale underpins the high-resolution screen.

While there are already hundreds of products that use nanomaterials in various ways, much remains unknown about how these modern materials and the tiny particles they are composed of interact with the environment and living things.

“The purpose of the center is to explore how we can make sure these nanotechnologies come to fruition with little or no environmental impact,” explains Hamers. “We’re looking at nanoparticles in emerging technologies.”

In addition to UW-Madison, scientists from UW-Milwaukee, the University of Minnesota, the University of Illinois, Northwestern University and the Pacific Northwest National Laboratory have been involved in the center’s first phase of research. Joining the center for the next five-year phase are Tuskegee University, Johns Hopkins University, the University of Iowa, Augsburg College, Georgia Tech and the University of Maryland, Baltimore County.

At UW-Madison, Hamers leads efforts in synthesis and molecular characterization of nanomaterials. Soil science professor Joel Pedersen and chemistry professor Qiang Cui lead groups exploring the biological and computational aspects of how nanomaterials affect life.

Much remains to be learned about how nanoparticles affect the environment and the multitude of organisms – from bacteria to plants, animals and people – that may be exposed to them.

“Some of the big questions we’re asking are: How is this going to impact bacteria and other organisms in the environment? What do these particles do? How do they interact with organisms?” says Hamers.

For instance, bacteria, the vast majority of which are beneficial or benign organisms, tend to be “sticky” and nanoparticles might cling to the microorganisms and have unintended biological effects.

“There are many different mechanisms by which these particles can do things,” Hamers adds. “The challenge is we don’t know what these nanoparticles do if they’re released into the environment.”

To get at the challenge, Hamers and his UW-Madison colleagues are drilling down to investigate the molecular-level chemical and physical principles that dictate how nanoparticles interact with living things.

Pedersen’s group, for example, is studying the complexities of how nanoparticles interact with cells and, in particular, their surface membranes.

“To enter a cell, a nanoparticle has to interact with a membrane,” notes Pedersen. “The simplest thing that can happen is the particle sticks to the cell. But it might cause toxicity or make a hole in the membrane.”

Pedersen’s group can make model cell membranes in the lab using the same lipids and proteins that are the building blocks of nature’s cells. By exposing the lab-made membranes to nanomaterials now used commercially, Pedersen and his colleagues can see how the membrane-particle interaction unfolds at the molecular level – the scale necessary to begin to understand the biological effects of the particles.

Such studies, Hamers argues, promise a science-based understanding that can help ensure the technology leaves a minimal environmental footprint by identifying issues before they manifest themselves in the manufacturing, use or recycling of products that contain nanotechnology-inspired materials.

To help fulfill that part of the mission, the center has established working relationships with several companies to conduct research on materials in the very early stages of development.

“We’re taking a look-ahead view. We’re trying to get into the technological design cycle,” Hamers says. “The idea is to use scientific understanding to develop a predictive ability to guide technology and guide people who are designing and using these materials.”

What with this initiative and the LCnano Network at Arizona State University (my April 8, 2014 posting; scroll down about 50% of the way), it seems that environmental and health and safety studies of nanomaterials are kicking into a higher gear as commercialization efforts intensify.

Microbattery from the University of Illinois

Caption: This is an image of the holographically patterned microbattery. Credit: University of Illinois

Hard to believe that’s a battery, but the researchers at the University of Illinois assure us this is so, according to a May 11, 2015 news item on Nanowerk,

By combining 3D holographic lithography and 2D photolithography, researchers from the University of Illinois at Urbana-Champaign have demonstrated a high-performance 3D microbattery suitable for large-scale on-chip integration with microelectronic devices.

“This 3D microbattery has exceptional performance and scalability, and we think it will be of importance for many applications,” explained Paul Braun, a professor of materials science and engineering at Illinois. “Micro-scale devices typically utilize power supplied off-chip because of difficulties in miniaturizing energy storage technologies. A miniaturized high-energy and high-power on-chip battery would be highly desirable for applications including autonomous microscale actuators, distributed wireless sensors and transmitters, monitors, and portable and implantable medical devices.”

A May 11, 2015 University of Illinois news release on EurekAlert, which originated the news item, provides some insight into and detail about the research,

“Due to the complexity of 3D electrodes, it is generally difficult to realize such batteries, let alone the possibility of on-chip integration and scaling. In this project, we developed an effective method to make high-performance 3D lithium-ion microbatteries using processes that are highly compatible with the fabrication of microelectronics,” stated Hailong Ning, a graduate student in the Department of Materials Science and Engineering and first author of the article, “Holographic Patterning of High Performance on-chip 3D Lithium-ion Microbatteries,” appearing in Proceedings of the National Academy of Sciences.

“We utilized 3D holographic lithography to define the interior structure of electrodes and 2D photolithography to create the desired electrode shape.” Ning added. “This work merges important concepts in fabrication, characterization, and modeling, showing that the energy and power of the microbattery are strongly related to the structural parameters of the electrodes such as size, shape, surface area, porosity, and tortuosity. A significant strength of this new method is that these parameters can be easily controlled during lithography steps, which offers unique flexibility for designing next-generation on-chip energy storage devices.”

Enabled by a 3D holographic patterning technique, in which multiple optical beams interfere inside the photoresist to create a desirable 3D structure, the battery possesses well-defined, periodically structured porous electrodes that facilitate fast transport of electrons and ions inside the battery, offering supercapacitor-like power.
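The interference idea is easy to reproduce numerically: summing a few plane waves and squaring the field gives the periodic 3D intensity pattern that exposes the resist. The sketch below assumes an invented four-beam “umbrella” geometry and wavelength; it illustrates the principle, not the authors’ actual optical setup.

```python
import numpy as np

wavelength = 0.532                   # um; an assumed visible laser line
k0 = 2 * np.pi / wavelength

# One central beam plus three beams tilted 30 degrees around it
# (an invented "umbrella" geometry).
tilt = np.radians(30)
dirs = [np.array([0.0, 0.0, 1.0])]
for phi in (0.0, 2 * np.pi / 3, 4 * np.pi / 3):
    dirs.append(np.array([np.sin(tilt) * np.cos(phi),
                          np.sin(tilt) * np.sin(phi),
                          np.cos(tilt)]))

# Sample the total field on a small 3D grid inside the "photoresist".
x = np.linspace(0, 3, 48)            # um
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
field = sum(np.exp(1j * k0 * (d[0] * X + d[1] * Y + d[2] * Z)) for d in dirs)
intensity = np.abs(field) ** 2       # exposure dose seen by the resist

# Thresholding the dose picks out the periodic 3D lattice that survives
# development -- the porous electrode scaffold.
solid = intensity > intensity.mean()
print(f"fill fraction of exposed region: {solid.mean():.2f}")
```

Varying the beam directions, amplitudes, or phases reshapes the lattice, which is the “unique flexibility” Ning describes for tuning porosity and tortuosity.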

“Although accurate control on the interfering optical beams is required to construct 3D holographic lithography, recent advances have significantly simplified the required optics, enabling creation of structures via a single incident beam and standard photoresist processing. This makes it highly scalable and compatible with microfabrication,” stated John Rogers, a professor of materials science and engineering, who has worked with Braun and his team to develop the technology.

“Micro-engineered battery architectures, combined with high energy material such as tin, offer exciting new battery features including high energy capacity and good cycle lives, which provide the ability to power practical devices,” stated William King, a professor of mechanical science and engineering, who is a co-author of this work.

Here’s a link to and a citation for the paper,

Holographic patterning of high-performance on-chip 3D lithium-ion microbatteries by Hailong Ning, James H. Pikul, Runyu Zhang, Xuejiao Li, Sheng Xu, Junjie Wang, John A. Rogers, William P. King, and Paul V. Braun. PNAS doi: 10.1073/pnas.1423889112

This paper is behind a paywall.

Bring ‘jazzy’ molecules to life on an app

The app accompanies book two (Molecules: The Elements and the Architecture of Everything) in a proposed trilogy about chemical elements in all their glory. A Dec. 10, 2014 news item on phys.org describes the app,

Although molecules make up everything around us, most people encounter these groups of atoms held together by chemical bonds in the pages of a textbook. They read text and see a drawing of chemical symbols or colorful circles—a one-dimensional view of the microscopic structures. Other representations have drawbacks as well: three-dimensional models are made of materials that can’t replicate the rapid and continuous molecular movement. Molecules are wiggling and jiggling in a never-ending dance, but you can’t see it, not even with the most powerful microscope.

“What is the best thing you could do to present (a molecule) to somebody?” asked Theo Gray, a scientist and author of “Molecules: The Elements and the Architecture of Everything.” “What’s the closest that you can come to actually handing it to them, so they could pick it up and look at it themselves?”

How about an app? In collaboration with the Theoretical and Computational Biophysics Group (TCBG) at the Beckman Institute at the University of Illinois, Gray’s company, Touchpress, has created an app for the Apple operating system (iOS) that brings molecules to life in a handheld device. Through the app, people can use up to eleven [?] fingers to examine in great detail more than 350 molecules, which they can also twist, turn, and tie into knots.

Gray has produced an entertaining and ‘jazzy’ video promoting his app (from theodoregray.com’s Molecules webpage where a link to the iTunes Download is provided),

A Dec. 10, 2014 Beckman Institute (University of Illinois) news release (also on EurekAlert) which originated the news item on phys.org, describes the objectives and the app itself more thoroughly,

“Every student who learns about typical molecules can do it now in a playful manner and realize that molecules are not dead and frozen, but that they move,” said Klaus Schulten, head of TCBG, and professor of physics at Illinois.

The app also allows users to vary the temperature and time scale in order to make the molecules move more quickly or more slowly. If the temperature is warm the molecules will move rapidly, while cold temperatures turn them sluggish, and absolute zero freezes them solid.
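That temperature slider maps directly onto how molecular-dynamics thermostats work: in a Langevin scheme, the random kicks scale with the square root of temperature, so hotter systems jiggle harder. Here is a minimal sketch (free particles, reduced units with k_B = 1; none of this comes from the app’s actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_step(x, v, temperature, dt=0.01, gamma=1.0, mass=1.0):
    """One Langevin-thermostat update for free particles: friction drains
    kinetic energy while random kicks (scaled by sqrt(T)) feed it back."""
    kick = np.sqrt(2 * gamma * temperature * dt / mass)
    v = v - gamma * v * dt + kick * rng.standard_normal(v.shape)
    return x + v * dt, v

def mean_kinetic(temperature, steps=4000, n=400):
    """Evolve n particles to equilibrium and return mean kinetic energy."""
    x, v = np.zeros(n), np.zeros(n)
    for _ in range(steps):
        x, v = langevin_step(x, v, temperature)
    return 0.5 * (v ** 2).mean()

# Equipartition: <KE> per particle ~ T/2, so doubling the temperature
# slider doubles how vigorously the "atoms" jiggle.
print(f"KE at T=1: {mean_kinetic(1.0):.2f}, KE at T=4: {mean_kinetic(4.0):.2f}")
```

Setting the temperature to zero removes the random kicks entirely, which is the “frozen solid” limit the app demonstrates at absolute zero.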

Getting molecules into everyone’s hands has been a goal for Gray. Even though Molecules is a beautifully illustrated book, Gray knows that the most stunning color photos and detailed descriptions can’t show the actual nature of molecules as well as looking at moving images of the material and the atomic motions themselves.

“In the case of molecules, you really can’t get a sense of what this stuff is like if you’re just looking at a picture: how goopy is it? Is it very runny or is it very thick? Is it like molasses or more like oil or more like water?” explained Gray.

“There’s also the fact that you can’t see molecules: they’re too small to see, and they’re too fast to perceive, but by providing an interactive simulation, you can give people really quite a good intuitive feeling for what a molecule is like and how it moves and how it behaves, and translate that into human scale.”

A chance meeting at a 2013 New Year’s Eve party between Gray and Barry Isralewitz, a TCBG research programmer, led to a discussion of molecular dynamics simulation. Isralewitz’s work with the TCBG involves simulating biological structures down to the atomic level. The group has created the software packages VMD, which creates the visualizations, and NAMD, which simulates the movement of the structures. The two packages run in conjunction on powerful computing systems and are freely available to researchers around the world, who use them to model and simulate structures at detailed levels. Recently, with the help of Blue Waters, a petascale supercomputer at the University of Illinois, TCBG unraveled one of the largest structures ever simulated—the HIV capsid, made up of 64 million atoms.

For the app, Touchpress created the visualizations, and TCBG provided the NAMD software. Taking the software into iOS, which can be used on iPhones and iPads, was not an easy task. The TCBG staff, including Jim Phillips, John Stone, and Christopher Mayne, among others, consulted regularly with Richard Zito, the main programmer from Touchpress, who lives in London.

“When Theo Gray came to us, he was full of enthusiasm, and we were actually a little hesitant. We didn’t know how well it would work out,” said Schulten. “But it worked very well, and in the course of putting our program onto this device, some technical challenges had to be met and in the wave of enthusiasm of doing it, we actually met those challenges.”

“We learned that our scientific software, which cost around $20 million to develop for the world’s best computers, can actually serve children and their parents in acquainting themselves with flexibility of molecules,” said Schulten.

“Now between education and entertainment we can think of using it for teaching. VMD is actually already used at many colleges for teaching, but now with this approach and having just a tablet computer, not even a laptop or a desktop workstation, we can penetrate much further with utilizing our tools for teaching than we ever did before.”

According to Gray, the app can make significant inroads into bringing molecules to the masses.

“It was particularly the combination of molecular dynamic simulation with a touch screen that makes it into sort of a magical experience that you don’t have when you’re doing it with a mouse,” said Gray. “Touch devices make things much more immediate and you have a personal connection to it. Combined with the fact that you can use multiple fingers to grab onto and move a molecule, like you would if you were actually holding it in your hands, it makes it quite a different experience and because it’s an iPad app, it’s available to anybody. I think it’s a pretty significant step toward getting the general public to have a better intuitive grasp as to what molecules are like.”

Schulten believes that the entertainment the app provides will help educate the next generation of scientists.

“Interacting with molecules makes them fun and natural, and that is a very powerful aspect of becoming familiar with the world of molecules,” said Schulten. “This is a wonderful tool that fits the landscape of the computing world that anybody can become familiar with through a cell phone and with a tablet, and we can utilize this big science for teaching the next generation.”

Molecules is the second volume in a proposed trilogy; The Elements: A Visual Exploration of Every Known Atom in the Universe was the first. Gray hopes that his next book Reactions and accompanying app can be as successful as Molecules.

“I think the most important thing, really,” said Gray, “is the fact that this technology has existed for quite some time, a couple of decades, but it’s really been locked up in labs, as it were—not because it wasn’t possible to bring it out to a more wide accessibility, but just because no one had thought of a good context to do that in, and maybe have the idea that it was possible to port them to a touch screen device.”

The app is available on iTunes for $13.99.

What a great idea! I wish Gray and his collaborators all the best with this project.

One last question: is there an Android or PC desktop app in the works?