Tag Archives: University of Wisconsin-Madison

See-through medical sensors from the University of Wisconsin-Madison

This is quite the week for see-through medical devices based on graphene. A second team has developed a transparent sensor which could allow scientists to make observations of brain activity that are now impossible, according to an Oct. 20, 2014 University of Wisconsin-Madison news release (also on EurekAlert),

Neural researchers study, monitor or stimulate the brain using imaging techniques in conjunction with implantable sensors that allow them to continuously capture and associate fleeting brain signals with the brain activity they can see.

However, it’s difficult to see brain activity when there are sensors blocking the view.

“One of the holy grails of neural implant technology is that we’d really like to have an implant device that doesn’t interfere with any of the traditional imaging diagnostics,” says Justin Williams, the Vilas Distinguished Achievement Professor of biomedical engineering and neurological surgery at UW-Madison. “A traditional implant looks like a square of dots, and you can’t see anything under it. We wanted to make a transparent electronic device.”

The researchers chose graphene, a material gaining wider use in everything from solar cells to electronics, because of its versatility and biocompatibility. And in fact, they can make their sensors incredibly flexible and transparent because the electronic circuit elements are only 4 atoms thick—an astounding thinness made possible by graphene’s excellent conductive properties. “It’s got to be very thin and robust to survive in the body,” says Zhenqiang (Jack) Ma, the Lynn H. Matthias Professor and Vilas Distinguished Achievement Professor of electrical and computer engineering at UW-Madison. “It is soft and flexible, and a good tradeoff between transparency, strength and conductivity.”
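The release doesn't put numbers on the transparency, but graphene's optical absorption follows a well-known textbook result: a single layer transmits about 97.7% of visible light, T = (1 + πα/2)⁻², with α the fine-structure constant. As a rough, unofficial sketch (treating the "4 atoms thick" circuit elements as four stacked graphene layers is my assumption, and the substrate is ignored):

```python
# Rough transmittance of few-layer graphene, using the universal
# single-layer result T = (1 + pi*alpha/2)**-2 (alpha = fine-structure
# constant); N stacked layers transmit roughly T**N.
import math

ALPHA = 1 / 137.036  # fine-structure constant

def transmittance(n_layers: int) -> float:
    """Approximate fraction of visible light transmitted by n stacked graphene layers."""
    single_layer = (1 + math.pi * ALPHA / 2) ** -2  # ~0.977 per layer
    return single_layer ** n_layers

if __name__ == "__main__":
    print(f"1 layer:  {transmittance(1):.1%} transmitted")
    print(f"4 layers: {transmittance(4):.1%} transmitted")
```

Even at four layers the stack would still pass roughly 91% of visible light, consistent with the "transparent electronic device" the team describes.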

Drawing on his expertise in developing revolutionary flexible electronics, he, Williams and their students designed and fabricated the micro-electrode arrays, which—unlike existing devices—work in tandem with a range of imaging technologies. “Other implantable micro-devices might be transparent at one wavelength, but not at others, or they lose their properties,” says Ma. “Our devices are transparent across a large spectrum—all the way from ultraviolet to deep infrared.”

The transparent sensors could be a boon to neuromodulation therapies, which physicians increasingly are using to control symptoms, restore function, and relieve pain in patients with diseases or disorders such as hypertension, epilepsy, Parkinson’s disease, or others, says Kip Ludwig, a program director for the National Institutes of Health neural engineering research efforts. “Despite remarkable improvements seen in neuromodulation clinical trials for such diseases, our understanding of how these therapies work—and therefore our ability to improve existing or identify new therapies—is rudimentary.”

Currently, he says, researchers are limited in their ability to directly observe how the body generates electrical signals, as well as how it reacts to externally generated electrical signals. “Clear electrodes in combination with recent technological advances in optogenetics and optical voltage probes will enable researchers to isolate those biological mechanisms. This fundamental knowledge could be catalytic in dramatically improving existing neuromodulation therapies and identifying new therapies.”

The advance aligns with bold goals set forth in President Barack Obama’s BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative. Obama announced the initiative in April 2013 as an effort to spur innovations that can revolutionize understanding of the brain and unlock ways to prevent, treat or cure such disorders as Alzheimer’s and Parkinson’s disease, post-traumatic stress disorder, epilepsy, traumatic brain injury, and others.

The UW-Madison researchers developed the technology with funding from the Reliable Neural-Interface Technology program at the Defense Advanced Research Projects Agency.

While the researchers centered their efforts around neural research, they already have started to explore other medical device applications. For example, working with researchers at the University of Illinois-Chicago, they prototyped a contact lens instrumented with dozens of invisible sensors to detect injury to the retina; the UIC team is exploring applications such as early diagnosis of glaucoma.

Here’s an image of the see-through medical implant,

Caption: A blue light shines through a clear implantable medical sensor onto a brain model. See-through sensors, which have been developed by a team of University of Wisconsin-Madison engineers, should help neural researchers better view brain activity. Credit: Justin Williams research group


Here’s a link to and a citation for the paper,

Graphene-based carbon-layered electrode array technology for neural imaging and optogenetic applications by Dong-Wook Park, Amelia A. Schendel, Solomon Mikael, Sarah K. Brodnick, Thomas J. Richner, Jared P. Ness, Mohammed R. Hayat, Farid Atry, Seth T. Frye, Ramin Pashaie, Sanitta Thongpang, Zhenqiang Ma, & Justin C. Williams. Nature Communications 5, Article number: 5258. doi:10.1038/ncomms6258 Published 20 October 2014

This is an open access paper.

DARPA (the US Defense Advanced Research Projects Agency), which funds this work at the University of Wisconsin-Madison, has also provided an Oct. 20, 2014 news release (also published as an Oct. 27, 2014 news item on Nanowerk) describing this research from the military perspective, which may not be what you expect. First, here’s a description of the DARPA funding programme underwriting this research, from DARPA’s Reliable Neural-Interface Technology (RE-NET) webpage,

Advances in technology for military uniforms, body armor and equipment have saved countless lives of our servicemembers injured on the battlefield. Unfortunately, many of those survivors are seriously and permanently wounded, with unprecedented rates of limb loss and traumatic brain injury among our returning soldiers. This crisis has motivated great interest in the science of and technology for restoring sensorimotor functions lost to amputation and injury of the central nervous system. For a decade now, DARPA has been leading efforts aimed at ‘revolutionizing’ the state-of-the-art in prosthetic limbs, recently debuting 2 advanced mechatronic limbs for the upper extremity. These new devices are truly anthropomorphic and capable of performing dexterous manipulation functions that finally begin to approach the capabilities of natural limbs. However, in the absence of a high bandwidth, intuitive interface for the user, these limbs will never achieve their full potential in improving the quality of life for the wounded soldiers that could benefit from this advanced technology.

DARPA created the Reliable Neural-Interface Technology (RE-NET) program in 2010 to directly address the need for high performance neural interfaces to control dexterous functions made possible with advanced prosthetic limbs.  Specifically, RE-NET seeks to develop the technologies needed to reliably extract information from the nervous system, and to do so at a scale and rate necessary to control many degree-of-freedom (DOF) machines, such as high-performance prosthetic limbs. Prior to the DARPA RE-NET program, all existing methods to extract neural control signals were inadequate for amputees to control high-performance prostheses, either because the level of extracted information was too low or the functional lifetime was too short. However, recent technological advances create new opportunities to solve both of these neural-interface problems. For example, it is now feasible to develop high-resolution peripheral neuromuscular interfaces that increase the amount of information obtained from the peripheral nervous system.  Furthermore, advances in cortical microelectrode technologies are extending the durability of neural signals obtained from the brain, making it possible to create brain-controlled prosthetics that remain useful over the full lifetime of the patient.

Canada’s Situating Science in Fall 2014

Canada’s Situating Science cluster (a network of humanities and social science researchers focused on the study of science) has a number of projects mentioned in its Fall 2014 newsletter,

1. Breaking News
It’s been yet another exciting spring and summer with new developments for the Situating Science SSHRC Strategic Knowledge Cluster team and HPS/STS [History and Philosophy of Science/Science and Technology Studies] research. And we’ve got even more good news coming down the pipeline soon…. For now, here’s the latest.

1.1. New 3 yr. Cosmopolitanism Partnership with India and Southeast Asia
We are excited to announce that the Situating Science project has helped to launch a new 3 yr. $200,000 SSHRC Partnership Development Grant on ‘Cosmopolitanism and the Local in Science and Nature’ with institutions and scholars in Canada, India and Singapore. Built upon relations that the Cluster has helped establish over the past few years, the project will closely examine the actual types of negotiations that go into the making of science and its culture within an increasingly globalized landscape. A recent workshop on Globalizing History and Philosophy of Science at the Asia Research Institute at the National University of Singapore helped to mark the soft launch of the project (see more in this newsletter).

ARI, along with Manipal University, Jawaharlal Nehru University, University of King’s College, Dalhousie University, York University, University of Toronto, and University of Alberta, forms the partnership from which the team will seek new connections and longer term collaborations. The project’s website will feature a research database, bibliography, syllabi, and event information for the project’s workshops, lecture series, summer schools, and artifact work. When possible, photos, blogs, podcasts and videos from events will be posted online as well. The project will have its own mailing list so be sure to subscribe to that too. Check it all out: www.CosmoLocal.org

2.1. Globalizing History and Philosophy of Science workshop in Singapore, August 21-22, 2014
On August 21 and 22, scholars from across the globe gathered at the Asia Research Institute at the National University of Singapore to explore key issues in global histories and philosophies of the sciences. The setting next to the iconic Singapore Botanical Gardens provided a welcome atmosphere to examine how and why globalizing the humanities and social studies of science generates intellectual and conceptual tensions that require us to revisit, and possibly rethink, the leading notions that have hitherto informed the history, philosophy and sociology of science.

The keynote by Sanjay Subrahmanyam (UCLA) helped to situate discussions within a larger issue of paradigms of civilization. Workshop papers explored commensurability, translation, models of knowledge exchange, indigenous epistemologies, commercial geography, translation of math and astronomy, transmission and exchange, race, and data. Organizer Arun Bala and participants will seek out possibilities for publishing the proceedings. The event partnered with La Trobe University and Situating Science, and it helped to launch a new 3 yr. Cosmopolitanism project. For more information visit: www.CosmoLocal.org

2.2. Happy Campers: The Summer School Experience

We couldn’t help but feel like we were little kids going to summer camp while our big yellow school bus kicked up dust driving down a dirt road on a hot summer’s day. In this case it would have been a geeky science camp. We were about to dive right into day-long discussions of key pieces from Science and Technology Studies and History and Philosophy of Science and Technology.

Over four and a half days at one of the Queen’s University Biology Stations at the picturesque Elbow Lake Environmental Education Centre, 18 students from across Canada explored the four themes of the Cluster. Each day targeted a Cluster theme, which was introduced by organizer Sergio Sismondo (Sociology and Philosophy, Queen’s). Daryn Lehoux (Classics, Queen’s) explained key concepts in Historical Epistemology and Ontology. Using ancient references to the anti-magnetic properties of garlic (or garlic’s antipathy with the lodestone), Lehoux discussed the importance of situating the meaning of a thing within specific epistemological contexts. Kelly Bronson (STS, St. Thomas University) explored modes of science communication and the development of the Public Engagement with Science and Technology model from the deficit model of Public Understanding of Science and Technology during sessions on Science Communication and its Publics. Nicole Nelson (University of Wisconsin-Madison) explained Material Culture and Scientific/Technological Practices by dissecting the meaning of animal bodies and other objects as scientific artifacts. Gordon McOuat wrapped up the last day by examining the nuances of the circulation and translation of knowledge and ‘trading zones’ during discussions of Geographies and Sites of Knowledge.

2.3. Doing Science in and on the Oceans
From June 14 to June 17, U. King’s College hosted an international workshop on the place and practice of oceanography in celebration of the work of Dr. Eric Mills, Dalhousie Professor Emeritus in Oceanography and co-creator of the History of Science and Technology program. Leading ocean scientists, historians and museum professionals came from the United States, Europe and across Canada for “Place and Practice: Doing Science in and on the Ocean 1800-2012”. The event successfully connected different generations of scholars, explored methodologies of material culture analysis and incorporated them into mainstream historical work. There were presentations and discussions of 12 papers, an interdisciplinary panel discussion with keynote lecture by Dr. Mills, and a presentation at the Maritime Museum of the Atlantic by Canada Science and Technology Museum curator, David Pantalony. Paper topics ranged from exploring the evolving methodology of oceanographic practice to discussing ways that the boundaries of traditional scientific writing have been transcended. The event was partially organized and supported by the Atlantic Node, with primary support from a SSHRC Connection Grant.

2.4. Evidence Dead or Alive: The Lives of Evidence National Lecture Series

The 2014 national lecture series on The Lives of Evidence wrapped up on a high note with an interdisciplinary panel discussion of Dr. Stathis Psillos’ exploration of the “Death of Evidence” controversy and the underlying philosophy of scientific evidence. The Canada Research Chair in Philosophy of Science spoke at the University of Toronto with panelists from law, philosophy and HPS. “Evidence: Wanted Dead or Alive” followed on the heels of his talk at the Institute for Science, Society and Policy, “From the ‘Bankruptcy of Science’ to the ‘Death of Evidence’: Science and its Value”.

In 6 parts, The Lives of Evidence series examined the cultural, ethical, political, and scientific role of evidence in our world. The series was formed in response to recent warnings about the “Death of Evidence” and “War on Science”: it explored what is meant by “evidence”, how evidence is interpreted, represented and communicated, how trust is created in research, and what the relationships are between research, funding and policy, and between evidence, explanations and expertise. It attracted collaborations from such groups as Evidence for Democracy, the University of Toronto Evidence Working Group, Canadian Centre for Ethics in Public Affairs, Dalhousie University Health Law Institute, Rotman Institute of Philosophy and many more.

A December [2013] symposium, “Hype in Science”, marked the soft launch of the series. In the all-day public event in Halifax, leading scientists, publishers and historians and philosophers of science discussed several case studies of how science is misrepresented and over-hyped in top science journals. Organized by the recent winner of the Gerhard Herzberg Canada Gold Medal for Science and Engineering, Ford Doolittle, the interdisciplinary talks in “Hype” explored issues of trustworthiness in science publications, scientific authority, science communication, and the place of research in the broader public.

The series then continued to explore issues from the creation of the HIV-Crystal Meth connection (Cindy Patton, SFU), Psychiatric Research Abuse (Carl Elliott, U. Minnesota), Evidence, Accountability and the Future of Canadian Science (Scott Findlay, Evidence for Democracy), Patents and Commercialized Medicine (Jim Brown, UofT), and Clinical Trials (Joel Lexchin, York).

All 6 parts are available to view on the Situating Science YouTube channel. You can read a few blogs from the events on our website too. Some of those involved are currently discussing possibilities of following up on some of the series’ issues.

2.5. Other Past Activities and Events
The Frankfurt School: The Critique of Capitalist Culture (July, UBC)

De l’exclusion à l’innovation théorique: le cas de l’éconophysique ; Prosocial attitudes and patterns of academic entrepreneurship (April, UQAM)

Critical Itineraries Technoscience Salon – Ontologies (April, UofT)

Technologies of Trauma: Assessing Wounds and Joining Bones in Late Imperial China (April, UBC)

For more, check out: www.SituSci.ca

You can find some of the upcoming talks and the complete Fall 2014 Situating Science newsletter here.

About one week after receiving the newsletter, I got this notice (Sept. 11, 2014),

We are ecstatic to announce that the Situating Science SSHRC Strategic Knowledge Cluster is shortlisted for a highly competitive SSHRC Partnership Impact Award!

And what an impact we’ve had over the past seven years: Organizing and supporting over 20 conferences and workshops, 4 national lecture series, 6 summer schools, and dozens of other events. Facilitating the development of 4 new programs of study at partner institutions. Leveraging more than one million dollars from Nodal partner universities plus more than one million dollars from over 200 supporting and partnering organizations. Hiring over 30 students and 9 postdoctoral fellows. Producing over 60 videos and podcasts as well as dozens of student blogs and over 50 publications. Launching a new Partnership Development Grant between Canada, India and Southeast Asia. Developing a national consortium…And more!

The winners will be presented with their awards at a ceremony in Ottawa on Monday, November 3, 2014.

From the Sept. 11, 2014 Situating Science press release:

University of King’s College [Nova Scotia, Canada] professor Dr. Gordon McOuat has been named one of three finalists for the Social Sciences and Humanities Research Council of Canada’s (SSHRC) Partnership Award, one of five Impact Awards annually awarded by SSHRC.

Congratulations on the nomination, and I wish Gordon McOuat and Situating Science good luck in the competition.

A labradoodle, gold nanoparticles, and cancer treatment for dogs and cats

Here’s the labradoodle,

Caption: Dr. Shawna Klahn, an assistant professor of oncology at the Virginia-Maryland College of Veterinary Medicine, performs a checkup on "Grayton" four weeks after the animal's experimental cancer treatment involving gold nanoparticles and a targeted laser therapy. Credit: Virginia Tech


An Aug. 6, 2014 news item on Azonano outlines ‘Grayton’s’ story and how gold nanoparticles will factor in,

When Michael and Sandra Friedlander first came to the Virginia-Maryland College of Veterinary Medicine three years ago with their dog, Grayton, they learned some bad news: Grayton had nasal adenocarcinoma, a form of cancer with a short life expectancy.

“Most dogs with this form of cancer are with their owners no more than a few months after the diagnosis, but here Grayton is three years later,” said Michael Friedlander, who is the executive director of the Virginia Tech Carilion Research Institute and senior dean at the Virginia Tech Carilion School of Medicine.

No stranger to medical research, Friedlander was referred by Veterinary Teaching Hospital clinicians to an experimental treatment at the University of Florida called stereotactic radiation therapy, which delivers precise, high dosages of radiation to a tumor and can only be performed once.

“That shrunk the tumor down to almost nothing,” said Friedlander, who is also the associate provost for health sciences at Virginia Tech. “We knew when Grayton had the procedure that we couldn’t do it again, but now the cancer is back.”

An Aug. 4, 2014 Virginia Tech news release (also on EurekAlert) by Michael Sutphin, which originated the news item, explains what occasioned the release and how gold nanoparticles are being used in veterinary treatment for cancer,

Today [Aug. 4, 2014], the 11-year-old Labradoodle is the first patient at the Virginia-Maryland College of Veterinary Medicine in a new clinical trial that is testing the use of gold nanoparticles and a targeted laser treatment for solid tumors in dogs and cats. The study is one of several on new treatments for client-owned companion animals at the college. In January [2014], the college established the Veterinary Clinical Research Office to help facilitate this work.

“Clinical research at the veterinary college involves both primary research focused on advancing the treatment and diagnosis of veterinary diseases and translational research in which spontaneous diseases in animals can be used as models of human disease,” said Dr. Greg Daniel, head of the Department of Small Animal Clinical Sciences. “In the latter situation, we can provide our companion animal patients with treatment and diagnostic options that are not yet available in mainstream human medicine.”

Although medical researchers have tested gold nanoparticles with targeted laser treatments on human patients with some success, the treatment is still new to both human and veterinary medicine. The college is one of four current veterinary schools around the country testing the AuroLase therapy developed by Nanospectra Biosciences Inc., a startup company based in Houston, Texas. The others are Texas A&M University, the University of Wisconsin-Madison, and the University of Georgia.

Dr. Nick Dervisis, assistant professor of oncology in the Department of Small Animal Clinical Sciences, is leading the Nanospectra-funded study. Following a rhinoscopy performed on Grayton by Dr. David Grant, associate professor of internal medicine, Dervisis began the one-time, experimental therapy.

“The treatment involves two phases,” Dervisis said. “First, we infuse the patient with the gold nanoparticles. Although the nanoparticles distribute throughout the body, they tend to concentrate around blood vessels associated with tumors. Within 36 hours, they have cleared the bloodstream except for tumors. The gold nanoparticles are small enough to circulate freely in the bloodstream and become temporarily captured within the incomplete blood vessel walls common in solid tumors. Then, we use a non-ablative laser on the patient.”

Dervisis explained that a non-ablative laser is not strong enough to harm the skin or normal tissue, but “it does cause the remaining nanoparticles to absorb the laser energy and convert it into heat so that they damage the tumor cells.”
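To get an order-of-magnitude feel for the photothermal step Dervisis describes, there is a standard single-particle estimate: the steady-state temperature rise at a nanoparticle's surface is ΔT = σ_abs·I/(4πκR). A hedged sketch follows; every input value is an illustrative assumption on my part (the release gives no operating numbers), and the tiny single-particle answer is itself instructive: therapeutic heating comes from millions of particles warming the tissue collectively.

```python
# Steady-state temperature rise at the surface of one illuminated
# nanoparticle: dT = sigma_abs * I / (4 * pi * kappa * R).
# All numbers are illustrative assumptions, not values from the AuroLase trial.
import math

def single_particle_rise_k(sigma_abs_m2: float, intensity_w_m2: float,
                           kappa_w_per_mk: float, radius_m: float) -> float:
    """Temperature rise (kelvin) at the particle surface, steady state."""
    return sigma_abs_m2 * intensity_w_m2 / (4 * math.pi * kappa_w_per_mk * radius_m)

if __name__ == "__main__":
    dT = single_particle_rise_k(
        sigma_abs_m2=4e-14,    # assumed absorption cross-section, ~150 nm gold nanoshell
        intensity_w_m2=4e4,    # assumed 4 W/cm^2 near-infrared illumination
        kappa_w_per_mk=0.6,    # thermal conductivity of water-like tissue
        radius_m=75e-9,        # assumed particle radius
    )
    print(f"one particle alone heats its surface by ~{dT * 1e3:.1f} mK")
```

A single particle under these assumptions raises the local temperature by only a few millikelvin, which is why the treatment depends on the nanoparticles concentrating densely around tumor vasculature before the laser is applied.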

Like all clinical trials, the study involves many unknowns, including the treatment’s usefulness and effectiveness. One month after the AuroLase treatment, the nosebleeds that initially brought Grayton back to the Veterinary Teaching Hospital had stopped, and Grayton had no other side effects.

“I’m delighted with the care and service that Grayton has received at the veterinary college,” said Friedlander, who explained that the treatment appears to be safe even though researchers do not know whether it is effective yet. “Grayton recently came with us on our annual vacation at the beach. We didn’t know if he would be able to come again, so it was great to have him with us swimming, catching fish and crabs, and doing what dogs do.”

Current clinical trials at the veterinary college range from the use of MRI to distinguish between benign and cancerous lymph nodes in dogs with oral melanoma, to a new chemotherapy drug for dogs with brain tumors, to the treatment of invasive skin cancer in horses with high-voltage, high-frequency electrical pulses. A complete list of current trials can be found at the college’s new clinical trials website.

Mindy Quigley, who oversees the college’s Veterinary Clinical Research Office, explained that veterinary trials, which follow a four-phase process and a variety of regulations similar to human medicine, have another layer of complexity that human trials do not.

“Variation among species means that a therapy that has proven safe and effective in, for example, humans or dogs, may not work for horses,” said Quigley, who comes to the college from the University of Edinburgh’s College of Medicine and Veterinary Medicine, where she helped set up a new neurology research clinic with funding from author J.K. Rowling. “Many veterinary clinical trials must therefore take therapies that have worked in one species and test them in other species with similar conditions. This is a necessary step to determine if a proposed treatment is safe and effective for our companion animals.”

Grayton may be the first companion animal in the AuroLase study at the veterinary college, but he certainly won’t be the last. Dervisis is continuing to enroll patients in the study and is seeking dogs and cats of a certain size with solid tumors who have not recently received radiation therapy or chemotherapy.

Interested parties can check this site for current clinical trials, including the AuroLase study, being held by the Virginia-Maryland Regional College of Veterinary Medicine.

Nanojuice in your gut

A July 7, 2014 news item on Azonano features a new technique that could help doctors better diagnose problems in the intestines (guts),

Located deep in the human gut, the small intestine is not easy to examine. X-rays, MRIs and ultrasound images provide snapshots but each suffers limitations. Help is on the way.

University at Buffalo [State University of New York] researchers are developing a new imaging technique involving nanoparticles suspended in liquid to form “nanojuice” that patients would drink. Upon reaching the small intestine, doctors would strike the nanoparticles with a harmless laser light, providing an unparalleled, non-invasive, real-time view of the organ.

A July 5, 2014 University at Buffalo news release (also on EurekAlert) by Cory Nealon, which originated the news item, describes some of the challenges associated with medical imaging of the small intestine,

“Conventional imaging methods show the organ and blockages, but this method allows you to see how the small intestine operates in real time,” said corresponding author Jonathan Lovell, PhD, UB assistant professor of biomedical engineering. “Better imaging will improve our understanding of these diseases and allow doctors to more effectively care for people suffering from them.”

The average human small intestine is roughly 23 feet long and 1 inch thick. Sandwiched between the stomach and large intestine, it is where much of the digestion and absorption of food takes place. It is also where symptoms of irritable bowel syndrome, celiac disease, Crohn’s disease and other gastrointestinal illnesses occur.

To assess the organ, doctors typically require patients to drink a thick, chalky liquid called barium. Doctors then use X-rays, magnetic resonance imaging and ultrasounds to assess the organ, but these techniques are limited with respect to safety, accessibility and lack of adequate contrast, respectively.

Also, none are highly effective at providing real-time imaging of movement such as peristalsis, which is the contraction of muscles that propels food through the small intestine. Dysfunction of these movements may be linked to the previously mentioned illnesses, as well as side effects of thyroid disorders, diabetes and Parkinson’s disease.

The news release goes on to describe how the researchers manipulated dyes that are usually unsuitable for the purpose of imaging an organ in the body,

Lovell and a team of researchers worked with a family of dyes called naphthalcyanines. These small molecules absorb large portions of light in the near-infrared spectrum, which is the ideal range for biological contrast agents.

They are unsuitable for the human body, however, because they don’t disperse in liquid and they can be absorbed from the intestine into the blood stream.

To address these problems, the researchers formed nanoparticles called “nanonaps” that contain the colorful dye molecules and added the abilities to disperse in liquid and move safely through the intestine.

In laboratory experiments performed with mice, the researchers administered the nanojuice orally. They then used photoacoustic tomography (PAT), in which pulsed laser light generates pressure waves that, when measured, provide a real-time and more nuanced view of the small intestine.
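The principle behind PAT can be captured in one line: a short laser pulse deposits energy in an absorber, and thermoelastic expansion launches an ultrasound wave whose initial pressure is p₀ = Γ·μ_a·F (Grüneisen parameter × absorption coefficient × fluence). A minimal sketch, with generic textbook values rather than numbers from this study:

```python
# Initial photoacoustic pressure p0 = Gamma * mu_a * F.
# The input values are generic textbook assumptions, not from the nanojuice paper.

def initial_pressure_pa(grueneisen: float, mu_a_per_cm: float,
                        fluence_mj_per_cm2: float) -> float:
    """Initial acoustic pressure in pascals from one laser pulse."""
    absorbed_j_per_cm3 = mu_a_per_cm * fluence_mj_per_cm2 * 1e-3  # mJ -> J
    return grueneisen * absorbed_j_per_cm3 * 1e6  # J/cm^3 -> J/m^3 (= Pa)

if __name__ == "__main__":
    # Soft tissue Gamma ~0.2; a strong near-infrared absorber might reach
    # mu_a ~10 /cm; a safety-limit-scale fluence is ~10 mJ/cm^2.
    p0 = initial_pressure_pa(0.2, 10.0, 10.0)
    print(f"initial pressure ~{p0 / 1e3:.0f} kPa")
```

Tens of kilopascals is squarely in the range ultrasound detectors handle, which is why a strongly absorbing contrast agent such as the nanonaps makes the small intestine stand out.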

The researchers plan to continue to refine the technique for human trials, and move into other areas of the gastrointestinal tract.

Here’s an image of the nanojuice in the guts of a mouse,

The combination of "nanojuice" and photoacoustic tomography illuminates the intestine of a mouse. (Credit: Jonathan Lovell)


This is an international collaboration both from a research perspective and a funding perspective (from the news release),

Additional authors of the study come from UB’s Department of Chemical and Biological Engineering, Pohang University of Science and Technology in Korea, Roswell Park Cancer Institute in Buffalo, the University of Wisconsin-Madison, and McMaster University in Canada.

The research was supported by grants from the National Institutes of Health, the Department of Defense and the Korean Ministry of Science, ICT and Future Planning.

Here’s a link to and a citation for the paper,

Non-invasive multimodal functional imaging of the intestine with frozen micellar naphthalocyanines by Yumiao Zhang, Mansik Jeon, Laurie J. Rich, Hao Hong, Jumin Geng, Yin Zhang, Sixiang Shi, Todd E. Barnhart, Paschalis Alexandridis, Jan D. Huizinga, Mukund Seshadri, Weibo Cai, Chulhong Kim, & Jonathan F. Lovell. Nature Nanotechnology (2014) doi:10.1038/nnano.2014.130 Published online 06 July 2014

This paper is behind a paywall.

Good lignin, bad lignin: Florida researchers use plant waste to create lignin nanotubes while researchers in British Columbia develop trees with less lignin

An April 4, 2014 news item on Azonano describes some nanotube research at the University of Florida that reaches past carbon to a new kind of nanotube,

Researchers with the University of Florida’s [UF] Institute of Food and Agricultural Sciences took what some would consider garbage and made a remarkable scientific tool, one that could someday help to correct genetic disorders or treat cancer without chemotherapy’s nasty side effects.

Wilfred Vermerris, an associate professor in UF’s department of microbiology and cell science, and Elena Ten, a postdoctoral research associate, created from plant waste a novel nanotube, one that is much more flexible than rigid carbon nanotubes currently used. The researchers say the lignin nanotubes – about 500 times smaller than a human eyelash – can deliver DNA directly into the nucleus of human cells in tissue culture, where this DNA could then correct genetic conditions. Experiments with DNA injection are currently being done with carbon nanotubes, as well.

“That was a surprising result,” Vermerris said. “If you can do this in actual human beings you could fix defective genes that cause disease symptoms and replace them with functional DNA delivered with these nanotubes.”

An April 3, 2014 University of Florida’s Institute of Food and Agricultural Sciences news release, which originated the news item, describes the lignin nanotubes (LNTs) and future applications in more detail,

The nanotube is made up of lignin from plant material obtained from a UF biofuel pilot facility in Perry, Fla. Lignin is an integral part of the secondary cell walls of plants and enables water movement from the roots to the leaves, but it is not used to make biofuels and would otherwise be burned to generate heat or electricity at the biofuel plant. The lignin nanotubes can be made from a variety of plant residues, including sorghum, poplar, loblolly pine and sugar cane. [emphasis mine]

The researchers first tested to see if the nanotubes were toxic to human cells and were surprised to find that they were less so than carbon nanotubes. Thus, they could deliver a higher dose of medicine to the human cell tissue. They then tested whether the nanotubes could deliver plasmid DNA to the same cells, and that was successful, too. A plasmid is a small DNA molecule that is physically separate from, and can replicate independently of, chromosomal DNA within a cell.

“It’s not a very smooth road because we had to try different experiments to confirm the results,” Ten said. “But it was very fruitful.”

In cases of genetic disorders, the nanotube would be loaded with a functioning copy of a gene, and injected into the body, where it would target the affected tissue, which then makes the missing protein and corrects the genetic disorder.

Although Vermerris cautioned that treatment in humans is many years away, among the conditions these gene-carrying nanotubes could correct are cystic fibrosis and muscular dystrophy. He added, though, that patients would have to take the corrective DNA via nanotubes on a continuing basis.

Another application under consideration is to use the lignin nanotubes for the delivery of chemotherapy drugs in cancer patients. The nanotubes would ensure the drugs only get to the tumor without affecting healthy tissues.

Vermerris said they created different types of nanotubes, depending on the experiment. They could also adapt nanotubes to a patient’s specific needs, a process called customization.

“You can think about it as a chest of drawers and, depending on the application, you open one drawer or use materials from a different drawer to get things just right for your specific application,” he said.  “It’s not very difficult to do the customization.”

The next step in the research process is for Vermerris and Ten to begin experiments on mice. They are in the application process for those experiments, which would take several years to complete.  If those are successful, permits would need to be obtained for their medical school colleagues to conduct research on human patients, with Vermerris and Ten providing the nanotubes for that research.

“We are a long way from that point,” Vermerris said. “That’s the optimistic long-term trajectory.”

I hope they have good luck with this work. I have emphasized the plant waste the University of Florida scientists studied due to the inclusion of poplar, which is featured in the University of British Columbia research work also being mentioned in this post.

Getting back to Florida for a moment, here’s a link to and a citation for the paper,

Lignin Nanotubes As Vehicles for Gene Delivery into Human Cells by Elena Ten, Chen Ling, Yuan Wang, Arun Srivastava, Luisa Amelia Dempere, and Wilfred Vermerris. Biomacromolecules, 2014, 15 (1), pp 327–338 DOI: 10.1021/bm401555p Publication Date (Web): December 5, 2013
Copyright © 2013 American Chemical Society

This is an open access paper.

Meanwhile, researchers at the University of British Columbia (UBC) are trying to limit the amount of lignin in trees (specifically poplars, which are not mentioned in this excerpt but in the next). From an April 3, 2014 UBC news release,

Researchers have genetically engineered trees that will be easier to break down to produce paper and biofuel, a breakthrough that will mean using fewer chemicals, less energy and creating fewer environmental pollutants.

“One of the largest impediments for the pulp and paper industry as well as the emerging biofuel industry is a polymer found in wood known as lignin,” says Shawn Mansfield, a professor of Wood Science at the University of British Columbia.

Lignin makes up a substantial portion of the cell wall of most plants and is a processing impediment for pulp, paper and biofuel. Currently the lignin must be removed, a process that requires significant chemicals and energy and causes undesirable waste.

Researchers used genetic engineering to modify the lignin to make it easier to break down without adversely affecting the tree’s strength.

“We’re designing trees to be processed with less energy and fewer chemicals, and ultimately recovering more wood carbohydrate than is currently possible,” says Mansfield.

Researchers had previously tried to tackle this problem by reducing the quantity of lignin in trees by suppressing genes, which often resulted in trees that were stunted in growth or susceptible to wind, snow, pests and pathogens.

“It is truly a unique achievement to design trees for deconstruction while maintaining their growth potential and strength.”

The study, a collaboration between researchers at the University of British Columbia, the University of Wisconsin-Madison and Michigan State University, was funded by the Great Lakes Bioenergy Research Center and was published today in Science.

Here’s more about lignin and how a decrease would free up more material for biofuels in a more environmentally sustainable fashion, from the news release,

The structure of lignin naturally contains ether bonds that are difficult to degrade. Researchers used genetic engineering to introduce ester bonds into the lignin backbone that are easier to break down chemically.

The new technique means that the lignin may be recovered more effectively and used in other applications, such as adhesives, insulation, carbon fibres and paint additives.

Genetic modification

The genetic modification strategy employed in this study could also be used on other plants like grasses to be used as a new kind of fuel to replace petroleum.

Genetic modification can be a contentious issue, but there are ways to ensure that the genes do not spread to the forest. These techniques include growing crops away from native stands so cross-pollination isn’t possible; introducing genes to make both the male and female trees or plants sterile; and harvesting trees before they reach reproductive maturity.

In the future, genetically modified trees could be planted like an agricultural crop, not in our native forests. Poplar is a potential energy crop for the biofuel industry because the tree grows quickly and on marginal farmland. [emphasis mine] Lignin makes up 20 to 25 per cent of the tree.

“We’re a petroleum reliant society,” says Mansfield. “We rely on the same resource for everything from smartphones to gasoline. We need to diversify and take the pressure off of fossil fuels. Trees and plants have enormous potential to contribute carbon to our society.”

As noted earlier, the researchers in Florida mention poplars in their paper (Note: Links have been removed),

Gymnosperms such as loblolly pine (Pinus taeda L.) contain lignin that is composed almost exclusively of G-residues, whereas lignin from angiosperm dicots, including poplar (Populus spp.) contains a mixture of G- and S-residues. [emphasis mine] Due to the radical-mediated addition of monolignols to the growing lignin polymer, lignin contains a variety of interunit bonds, including aryl–aryl, aryl–alkyl, and alkyl–alkyl bonds.(3) This feature, combined with the association between lignin and cell-wall polysaccharides, which involves both physical and chemical interactions, make the isolation of lignin from plant cell walls challenging. Various isolation methods exist, each relying on breaking certain types of chemical bonds within the lignin, and derivatizations to solubilize the resulting fragments.(5) Several of these methods are used on a large scale in pulp and paper mills and biorefineries, where lignin needs to be removed from woody biomass and crop residues(6) in order to use the cellulose for the production of paper, biofuels, and biobased polymers. The lignin is present in the waste stream and has limited intrinsic economic value.(7)

Since hydroxyl and carboxyl groups in lignin facilitate functionalization, its compatibility with natural and synthetic polymers for different commercial applications have been extensively studied.(8-12) One of the promising directions toward the cost reduction associated with biofuel production is the use of lignin for low-cost carbon fibers.(13) Other recent studies reported development and characterization of lignin nanocomposites for multiple value-added applications. For example, cellulose nanocrystals/lignin nanocomposites were developed for improved optical, antireflective properties(14, 15) and thermal stability of the nanocomposites.(16) [emphasis mine] Model ultrathin bicomponent films prepared from cellulose and lignin derivatives were used to monitor enzyme binding and cellulolytic reactions for sensing platform applications.(17) Enzymes/“synthetic lignin” (dehydrogenation polymer (DHP)) interactions were also investigated to understand how lignin impairs enzymatic hydrolysis during the biomass conversion processes.(18)

The synthesis of lignin nanotubes and nanowires was based on cross-linking a lignin base layer to an alumina membrane, followed by peroxidase-mediated addition of DHP and subsequent dissolution of the membrane in phosphoric acid.(1) Depending upon monomers used for the deposition of DHP, solid nanowires, or hollow nanotubes could be manufactured and easily functionalized due to the presence of many reactive groups. Due to their autofluorescence, lignin nanotubes permit label-free detection under UV radiation.(1) These features make lignin nanotubes suitable candidates for numerous biomedical applications, such as the delivery of therapeutic agents and DNA to specific cells.

The synthesis of LNTs in a sacrificial template membrane is not limited to a single source of lignin or a single lignin isolation procedure. Dimensions of the LNTs and their cytotoxicity to HeLa cells appear to be determined primarily by the lignin isolation procedure, whereas the transfection efficiency is also influenced by the source of the lignin (plant species and genotype). This means that LNTs can be tailored to the application for which they are intended. [emphasis mine] The ability to design LNTs for specific purposes will benefit from a more thorough understanding of the relationship between the structure and the MW of the lignin used to prepare the LNTs, the nanomechanical properties, and the surface characteristics.

We have shown that DNA is physically associated with the LNTs and that the LNTs enter the cytosol, and in some case the nucleus. The LNTs made from NaOH-extracted lignin are of special interest, as they were the shortest in length, substantially reduced HeLa cell viability at levels above approximately 50 mg/mL, and, in the case of pine and poplar, were the most effective in the transfection [penetrating the cell with a bacterial plasmid to leave genetic material in this case] experiments. [emphasis mine]

As I see the issues presented with these two research efforts, there are environmental and energy issues with extracting the lignin while there seem to be some very promising medical applications possible with lignin ‘waste’. These two research efforts aren’t necessarily antithetical but they do raise some very interesting issues as to how we approach our use of resources and future policies.

ETA May 16, 2014: The beat goes on, with the Georgia (US) Institute of Technology issuing a roadmap for making money from lignin. From a Georgia Tech May 15, 2014 news release on EurekAlert,

When making cellulosic ethanol from plants, one problem is what to do with a woody agricultural waste product called lignin. The old adage in the pulp industry has been that one can make anything from lignin except money.

A new review article in the journal Science points the way toward a future where lignin is transformed from a waste product into valuable materials such as low-cost carbon fiber for cars or bio-based plastics. Using lignin in this way would create new markets for the forest products industry and make ethanol-to-fuel conversion more cost-effective.

“We’ve developed a roadmap for integrating genetic engineering with analytical chemistry tools to tailor the structure of lignin and its isolation so it can be used for materials, chemicals and fuels,” said Arthur Ragauskas, a professor in the School of Chemistry and Biochemistry at the Georgia Institute of Technology. Ragauskas is also part of the Institute for Paper Science and Technology at Georgia Tech.

The roadmap was published May 15 [2014] in the journal Science. …

Here’s a link to and citation for the ‘roadmap’,

Lignin Valorization: Improving Lignin Processing in the Biorefinery by Arthur J. Ragauskas, Gregg T. Beckham, Mary J. Biddy, Richard Chandra, Fang Chen, Mark F. Davis, Brian H. Davison, Richard A. Dixon, Paul Gilna, Martin Keller, Paul Langan, Amit K. Naskar, Jack N. Saddler, Timothy J. Tschaplinski, Gerald A. Tuskan, and Charles E. Wyman. Science 16 May 2014: Vol. 344 no. 6185 DOI: 10.1126/science.1246843

This paper is behind a paywall.

Cleaning up oil* spills with cellulose nanofibril aerogels

Given the ever-expanding scope of oil and gas production, as previously impossible-to-reach sources are breached, previously unusable contaminated sources are purified for use, and major pipelines and mega tankers are built to transport all this product, it’s good to see that research into cleaning up oil spills is taking place. A Feb. 26, 2014 news item on Azonano features a project at the University of Wisconsin–Madison,

Cleaning up oil spills and metal contaminates in a low-impact, sustainable and inexpensive manner remains a challenge for companies and governments globally.

But a group of researchers at the University of Wisconsin–Madison is examining alternative materials that can be modified to absorb oil and chemicals without absorbing water. If further developed, the technology may offer a cheaper and “greener” method to absorb oil and heavy metals from water and other surfaces.

Shaoqin “Sarah” Gong, a researcher at the Wisconsin Institute for Discovery (WID) and associate professor of biomedical engineering, graduate student Qifeng Zheng, and Zhiyong Cai, a project leader at the USDA Forest Products Laboratory in Madison, have recently created and patented the new aerogel technology.

The Feb. 25, 2014 University of Wisconsin–Madison news release, which originated the news item, explains a little bit about aerogels and about what makes these cellulose nanofibril-based aerogels special,

Aerogels, which are highly porous materials and the lightest solids in existence, are already used in a variety of applications, ranging from insulation and aerospace materials to thickening agents in paints. The aerogel prepared in Gong’s lab is made of cellulose nanofibrils (sustainable wood-based materials) and an environmentally friendly polymer. Furthermore, these cellulose-based aerogels are made using an environmentally friendly freeze-drying process without the use of organic solvents.

It’s the combination of this “greener” material and its high performance that got Gong’s attention.

“For this material, one unique property is that it has superior absorbing ability for organic solvents — up to nearly 100 times its own weight,” she says. “It also has strong absorbing ability for metal ions.”

Treating the cellulose-based aerogel with specific types of silane after it is made through the freeze-drying process is a key step that gives the aerogel its water-repelling and oil-absorbing properties.

The researchers have produced a video showing their aerogel in operation,

For those who don’t have the time for a video, the news release describes some of the action taking place,

“So if you had an oil spill, for example, the idea is you could throw this aerogel sheet in the water and it would start to absorb the oil very quickly and efficiently,” she says. “Once it’s fully saturated, you can take it out and squeeze out all the oil. Although its absorbing capacity reduces after each use, it can be reused for a couple of cycles.”

In addition, this cellulose-based aerogel exhibits excellent flexibility as demonstrated by compression mechanical testing.
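To put those figures in perspective, here’s a minimal back-of-envelope sketch in Python. The roughly 100-times-by-weight capacity comes from the news release above; the 20 per cent capacity loss per squeeze-out cycle is an invented, purely illustrative number, since the release only says the absorbing capacity “reduces after each use.”

```python
# Back-of-envelope model of the aerogel's oil uptake over reuse cycles.
# The 100x-by-weight capacity is quoted in the news release; the 20%
# loss per cycle is a made-up illustrative figure, not a measured value.

def oil_recovered(aerogel_mass_g, capacity_ratio=100, loss_per_cycle=0.2, cycles=3):
    """Return a list of grams of oil absorbed in each reuse cycle."""
    recovered = []
    capacity = aerogel_mass_g * capacity_ratio  # grams of oil on first use
    for _ in range(cycles):
        recovered.append(capacity)
        capacity *= (1 - loss_per_cycle)  # capacity shrinks after each squeeze
    return recovered

print(oil_recovered(5))  # 5 g of aerogel over three cycles
```

With those assumptions, 5 g of aerogel would soak up about 500 g of organic solvent on first use, then progressively less on each reuse.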

Though much work needs to be done before the aerogel can be mass-produced, Gong says she’s eager to share the technology’s potential benefits beyond the scientific community.

“We are living in a time where pollution is a serious problem — especially for human health and for animals in the ocean,” she says. “We are passionate to develop technology to make a positive societal impact.”

Here’s a link to and a citation for the research paper,

Green synthesis of polyvinyl alcohol (PVA)–cellulose nanofibril (CNF) hybrid aerogels and their use as superabsorbents by Qifeng Zheng, Zhiyong Cai, and Shaoqin Gong.  J. Mater. Chem. A, 2014,2, 3110-3118 DOI: 10.1039/C3TA14642A First published online 16 Dec 2013

This paper is behind a paywall. I last wrote about oil-absorbing nanosponges in an April 17, 2012 posting. Those sponges were based on carbon nanotubes (CNTs).

* ‘oils’ in headline changed to ‘oil’ on May 6, 2014.

Tweet your nano

Researchers at the University of Wisconsin-Madison have published a study titled, “Tweeting nano: how public discourses about nanotechnology develop in social media environments,” which analyzes, for the first time, nanotechnology discourse on the social media platform Twitter. From the Life Sciences Communication University of Wisconsin-Madison research webpage,

The study, “Tweeting nano: how public discourses about nanotechnology develop in social media environments,” mapped social media traffic about nanotechnology, finding that Twitter traffic expressing opinion about nanotechnology is more likely to originate from states with a federally-funded National Nanotechnology Initiative center or network than states without such centers.

Runge [Kristin K. Runge, doctoral student] and her co-authors used computational linguistic software to analyze a census of all English-language nanotechnology-related tweets expressing opinion posted on Twitter over one calendar year. In addition to mapping tweets by state, the team coded sentiment along two axes: certain vs. uncertain, and optimistic-neutral-pessimistic. They found 55% of nanotechnology-related opinions expressed certainty, 41% expressed pessimistic outlooks and 32% expressed neutral outlooks.

In addition to shedding light on how social media is used in communicating about an emerging technology, this is believed to be the first published study to use a census of social media messages rather than a sample.

“We likely wouldn’t have captured these results if we had to rely on a sample rather than a complete census,” said Runge. “That would have been unfortunate, because the distinct geographic origins of the tweets and the tendency toward certainty in opinion expression will be useful in helping us understand how key online influencers are shaping the conversation around nanotechnology.”
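For readers curious what that two-axis coding looks like in practice, here is a minimal sketch in Python. The tweet records and labels below are invented placeholders; the actual study used computational linguistic software over a full census of opinionated English-language tweets.

```python
from collections import Counter

# Each coded tweet carries two independent labels, following the study's
# two-axis scheme: certainty (certain/uncertain) and outlook
# (optimistic/neutral/pessimistic). These records are invented examples.
coded_tweets = [
    {"state": "WI", "certainty": "certain",   "outlook": "optimistic"},
    {"state": "WI", "certainty": "certain",   "outlook": "pessimistic"},
    {"state": "CA", "certainty": "uncertain", "outlook": "neutral"},
    {"state": "NY", "certainty": "certain",   "outlook": "pessimistic"},
]

def axis_shares(tweets, axis):
    """Return each label's share of tweets (as a percentage) along one axis."""
    counts = Counter(t[axis] for t in tweets)
    total = sum(counts.values())
    return {label: 100 * n / total for label, n in counts.items()}

print(axis_shares(coded_tweets, "certainty"))  # e.g. certain: 75.0, uncertain: 25.0
print(axis_shares(coded_tweets, "outlook"))
```

Because a census (rather than a sample) was used, percentages like the study’s 55% certainty figure describe the whole population of opinionated tweets, with no sampling error attached.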

It’s not obvious from this notice or the title of the study but it is stated clearly in the study that the focus is the world of US nano, not the English-language world of nano. After reading the study (very quickly), I can say it’s interesting and, hopefully, will stimulate more work about public opinion that takes social media into account. (I’d love to know how they limited their study to US tweets only and how they determined the region that spawned each tweet.)

The one thing which puzzles me is they don’t mention retweets (RTs) specifically. Did they consider only original tweets? If not, did they take into account the possibility that someone might RT an item that does not reflect their own opinion? I occasionally RT something that doesn’t reflect my opinion when there isn’t sufficient space to include a comment indicating otherwise, because I want to promote discussion, and that doesn’t necessarily take place on Twitter or in Twitter’s public space. This leads to another question: did the researchers include direct messages in their study? Unfortunately, there’s no mention of these issues in the two sections of the conclusion (Discussion and Implications for future research).

For those who would like to see the research for themselves (Note: The article is behind a paywall),

Tweeting nano: how public discourses about nanotechnology develop in social media environments by Kristin K. Runge, Sara K. Yeo, Michael Cacciatore, Dietram A. Scheufele, Dominique Brossard, Michael Xenos, Ashley Anderson, Doo-hun Choi, Jiyoun Kim, Nan Li, Xuan Liang, Maria Stubbings, and Leona Yi-Fan Su. Journal of Nanoparticle Research: An Interdisciplinary Forum for Nanoscale Science and Technology, Springer. DOI: 10.1007/s11051-012-1381-8. Published online Jan. 4, 2013

It’s no surprise to see Dietram Scheufele and Dominique Brossard, who are both located at the University of Wisconsin-Madison and publish steadily on the topic of nanotechnology and public opinion, listed as authors.

Unintended consequences of reading science news online

University of Wisconsin-Madison researchers Dominique Brossard and  Dietram Scheufele have written a cautionary piece for the AAAS’s (American Association for the Advancement of Science) magazine, Science, according to a Jan. 3, 2013 news item on ScienceDaily,

A science-inclined audience and wide array of communications tools make the Internet an excellent opportunity for scientists hoping to share their research with the world. But that opportunity is fraught with unintended consequences, according to a pair of University of Wisconsin-Madison life sciences communication professors.

Dominique Brossard and Dietram Scheufele, writing in a Perspectives piece for the journal Science, encourage scientists to join an effort to make sure the public receives full, accurate and unbiased information on science and technology.

“This is an opportunity to promote interest in science — especially basic research, fundamental science — but, on the other hand, we could be missing the boat,” Brossard says. “Even our most well-intended effort could backfire, because we don’t understand the ways these same tools can work against us.”

The Jan. 3, 2013 University of Wisconsin-Madison news release by Chris Barncard (which originated the news item) notes,

Recent research by Brossard and Scheufele has described the way the Internet may be narrowing public discourse, and new work shows that a staple of online news presentation — the comments section — and other ubiquitous means to provide endorsement or feedback can color the opinions of readers of even the most neutral science stories.

Online news sources pare down discussion or limit visibility of some information in several ways, according to Brossard and Scheufele.

Many news sites use the popularity of stories or subjects (measured by the numbers of clicks they receive, or the rate at which users share that content with others, or other metrics) to guide the presentation of material.

The search engine Google offers users suggested search terms as they make requests, offering up “nanotechnology in medicine,” for example, to those who begin typing “nanotechnology” in a search box. Users often avail themselves of the list of suggestions, making certain searches more popular, which in turn makes those search terms even more likely to appear as suggestions.

Brossard and Scheufele have published an earlier study about the ‘narrowing’ effects of search engines such as Google, using the example of the topic ‘nanotechnology’, as per my May 19, 2010 posting. The researchers appear to be building on this earlier work,
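The suggestion feedback loop described in the news release (popular suggestions attract more clicks, which makes them still more likely to be suggested) is a classic rich-get-richer process. Here is a toy simulation in Python, with invented search terms and starting counts, that shows how the loop behaves:

```python
import random

# Toy rich-get-richer simulation of search suggestions. On each round a
# suggestion is shown (and clicked) with probability proportional to its
# current popularity, and the click raises that popularity further.
# The terms and starting counts are invented for illustration.
random.seed(42)

popularity = {"nanotechnology in medicine": 55, "nanotechnology risks": 50}

for _ in range(1000):
    terms = list(popularity)
    weights = [popularity[t] for t in terms]
    clicked = random.choices(terms, weights=weights)[0]
    popularity[clicked] += 1  # each click makes the term more suggestible

print(popularity)
```

Because each click feeds back into the weights, early random fluctuations get locked in rather than averaged away, which is the narrowing effect Brossard and Scheufele describe.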

The consequences become more daunting for the researchers as Brossard and Scheufele uncover more surprising effects of Web 2.0.

In their newest study, they show that independent of the content of an article about a new technological development, the tone of comments posted by other readers can make a significant difference in the way new readers feel about the article’s subject. The less civil the accompanying comments, the more risk readers attributed to the research described in the news story.

“The day of reading a story and then turning the page to read another is over,” Scheufele says. “Now each story is surrounded by numbers of Facebook likes and tweets and comments that color the way readers interpret even truly unbiased information. This will produce more and more unintended effects on readers, and unless we understand what those are and even capitalize on them, they will just cause more and more problems.”

If even some of the for-profit media world and advocacy organizations are approaching the digital landscape from a marketing perspective, Brossard and Scheufele argue, scientists need to turn to more empirical communications research and engage in active discussions across disciplines of how to most effectively reach large audiences.

“It’s not because there is not decent science writing out there. We know all kinds of excellent writers and sources,” Brossard says. “But can people be certain that those are the sites they will find when they search for information? That is not clear.”

It’s not about preparing for the future. It’s about catching up to the present. And the present, Scheufele says, includes scientific subjects — think fracking, or synthetic biology — that need debate and input from the public.

Here’s a citation and link for the Science article,

Science, New Media, and the Public by Dominique Brossard and Dietram A. Scheufele in Science 4 January 2013: Vol. 339 no. 6115 pp. 40-41 DOI: 10.1126/science.1232329

This article is behind a paywall.