Tag Archives: University of Maryland

Watching a nanosized space rocket under a microscope

That is a silent video depicting the research. For anyone who may be puzzled, there’s an Aug. 8, 2016 news item on Nanowerk featuring the research announcement from Michigan Technological University (Note: A link has been removed),

Researchers at the University of Maryland and Michigan Technological University have operated a tiny proposed satellite ion rocket under a microscope to see how it works (Nanotechnology, “Radiation-induced solidification of ionic liquid under extreme electric field”).

The rocket, called an electrospray thruster, is a drop of molten salt. When electricity is applied, it creates a field on the tip of the droplet, until ions begin streaming off the end. The force created by the rocket is less than the weight of a human hair, but in the vacuum of space it is enough to push a small object forward with a constant acceleration. Many of these tiny thrusters packed together could propel a spacecraft over great distances, maybe even to the nearest exoplanet, and they are particularly useful for Earth-orbiting nanosatellites, which can be as small as a shoe box. These thrusters are currently being tested on the European Space Agency’s LISA Pathfinder, which hopes to poise objects in space so precisely that they would only be disturbed by gravitational waves.
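
For readers who like numbers, here’s a rough sense of how a force smaller than the weight of a human hair can add up. This is a back-of-envelope sketch in Python; the thrust per emitter, emitter count, and spacecraft mass are my own assumed values, not figures from the research.

```python
# Back-of-envelope illustration (not from the paper): how micronewton-scale
# thrust adds up over time. All values below are assumed for illustration.

SECONDS_PER_YEAR = 3.156e7

thrust_per_emitter = 1e-6  # newtons; assumed -- roughly the weight of a human hair
num_emitters = 1000        # assumed size of a packed thruster array
spacecraft_mass = 10.0     # kg; assumed shoebox-class nanosatellite

total_thrust = thrust_per_emitter * num_emitters     # 1e-3 N
acceleration = total_thrust / spacecraft_mass        # 1e-4 m/s^2
delta_v_one_year = acceleration * SECONDS_PER_YEAR   # m/s at constant thrust

print(f"acceleration: {acceleration:.1e} m/s^2")
print(f"velocity gained after one year: {delta_v_one_year:.0f} m/s")
# ~3,200 m/s after a year of continuous thrust -- tiny forces, big effect in vacuum.
```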

An Aug. 8, 2016 Michigan Technological University news release on EurekAlert, which originated the news item, explains further,

These droplet engines have a problem: sometimes they form needle-like spikes that disrupt the way the thruster works – they get in the way of the ions flowing outward and turn the liquid to a gel. Lyon B. King and Kurt Terhune, mechanical engineers at Michigan Tech, wanted to find out how this actually happens.

“The challenge is making measurements of features as small as a few molecules in the presence of a strong electric field, which is why we turned to John Cumings at the University of Maryland,” King says, explaining Cumings is known for his work with challenging materials and that they needed to look for a needle in a haystack. “Getting a close look at these droplets is like looking through a straw to find a penny somewhere on the floor of a room–and if that penny moves out of view, like the tip of the molten salt needles do–then you have to start searching for it all over again.”

At the Advanced Imaging and Microscopy Lab at the University of Maryland, Cumings put the tiny thruster in a transmission electron microscope – an advanced scope that can resolve features down to billionths of a meter. They watched as the droplet elongated and sharpened to a point, and then started emitting ions. Then the tree-like defects began to appear.

The researchers say that figuring out why these branched structures grow could help prevent them from forming. The problem occurs when high-energy electrons, like those used in the microscope’s imaging beam, strike the fluid and damage the molecules they hit. This disrupts the molten salt’s molecular structure, so it thickens into a gel and no longer flows properly.

“We were able to watch the dendritic structures accumulate in real time,” says Kurt Terhune, a mechanical engineering graduate student and the study’s lead author. “The specific mechanism still needs to be investigated, but this could have importance for spacecraft in high-radiation environments.”

He adds that the microscope’s electron beam is more powerful than natural settings, but the gelling effect could affect the lifetime of electrospray thrusters in low-Earth and geosynchronous orbit.

Here’s a link to and a citation for the paper,

Radiation-induced solidification of ionic liquid under extreme electric field by Kurt J Terhune, Lyon B King, Kai He, and John Cumings. Nanotechnology, Volume 27, Number 37 DOI: http://dx.doi.org/10.1088/0957-4484/27/37/375701 Published 3 August 2016

© 2016 IOP Publishing Ltd

This paper is behind a paywall.

Curbing police violence with machine learning

A rather fascinating Aug. 1, 2016 article by Hal Hodson about machine learning and curbing police violence has appeared in New Scientist (Note: Links have been removed),

None of their colleagues may have noticed, but a computer has. By churning through the police’s own staff records, it has caught signs that an officer is at high risk of initiating an “adverse event” – racial profiling or, worse, an unwarranted shooting.

The Charlotte-Mecklenburg Police Department in North Carolina is piloting the system in an attempt to tackle the police violence that has become a heated issue in the US in the past three years. A team at the University of Chicago is helping them feed their data into a machine learning system that learns to spot risk factors for unprofessional conduct. The department can then step in before risk transforms into actual harm.

The idea is to prevent incidents in which officers who are stressed behave aggressively, such as one in Texas where an officer pulled his gun on children at a pool party after responding to two suicide calls earlier that shift. Ideally, early warning systems would be able to identify individuals who had recently been deployed on tough assignments, and divert them from other sensitive calls.

According to Hodson, there are already systems, both human and algorithmic, in place but the goal is to make them better,

The system being tested in Charlotte is designed to include all of the records a department holds on an individual – from details of previous misconduct and gun use to their deployment history, such as how many suicide or domestic violence calls they have responded to. It retrospectively caught 48 out of 83 adverse incidents between 2005 and now – 12 per cent more than Charlotte-Mecklenburg’s existing early intervention system.

More importantly, the false positive rate – the fraction of officers flagged as being under stress who do not go on to act aggressively – was 32 per cent lower than the existing system’s. “Right now the systems that claim to do this end up flagging the majority of officers,” says Rayid Ghani, who leads the Chicago team. “You can’t really intervene then.”
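
To make the quoted metrics concrete, here’s a minimal Python sketch. The 48-of-83 figure comes from the article; the flagged-officer counts used for the false positive rate are hypothetical numbers I’ve made up for illustration.

```python
# A minimal sketch of the metrics quoted above. The 48-of-83 figure is from
# the article; the flagged-officer counts are hypothetical, for illustration.

caught = 48           # adverse incidents the system retrospectively flagged (article)
total_incidents = 83  # adverse incidents, 2005 to publication (article)
recall = caught / total_incidents
print(f"recall: {recall:.0%}")  # ~58% of incidents caught in advance

# The article defines the false positive rate as the fraction of flagged
# officers who do not go on to an adverse event.
flagged = 200              # hypothetical officers flagged
flagged_no_incident = 120  # hypothetical flagged officers with no later incident
false_positive_rate = flagged_no_incident / flagged
print(f"false positive rate: {false_positive_rate:.0%}")
# The article reports this rate was 32 per cent lower than the legacy system's.
```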

There is some cautious optimism about this new algorithm (Note: Links have been removed),

Frank Pasquale, who studies the social impact of algorithms at the University of Maryland, is cautiously optimistic. “In many walks of life I think this algorithmic ranking of workers has gone too far – it troubles me,” he says. “But in the context of the police, I think it could work.”

Pasquale says that while such a system for tackling police misconduct is new, it’s likely that older systems created the problem in the first place. “The people behind this are going to say it’s all new,” he says. “But it could be seen as an effort to correct an earlier algorithmic failure. A lot of people say that the reason you have so much contact between minorities and police is because the CompStat system was rewarding officers who got the most arrests.”

CompStat, short for Computer Statistics, is a police management and accountability system that was used to implement the “broken windows” theory of policing, which proposes that coming down hard on minor infractions like public drinking and vandalism helps to create an atmosphere of law and order, bringing serious crime down in its wake. Many police researchers have suggested that the approach has led to the current dangerous tension between police and minority communities.

Ghani has not forgotten the human dimension,

One thing Ghani is certain of is that the interventions will need to be decided on and delivered by humans. “I would not want any of those to be automated,” he says. “As long as there is a human in the middle starting a conversation with them, we’re reducing the chance for things to go wrong.”

h/t Terkko Navigator

I have written about police and violence here in the context of the Dallas Police Department and its use of a robot in a violent confrontation with a sniper, in a July 25, 2016 posting titled: Robots, Dallas (US), ethics, and killing.

D-PLACE: an open access database of places, language, culture, and environment

In an attempt to be a bit more broad in my interpretation of the ‘society’ part of my commentary I’m including this July 8, 2016 news item on ScienceDaily (Note: A link has been removed),

An international team of researchers has developed a website at d-place.org to help answer long-standing questions about the forces that shaped human cultural diversity.

D-PLACE — the Database of Places, Language, Culture and Environment — is an expandable, open access database that brings together a dispersed body of information on the language, geography, culture and environment of more than 1,400 human societies. It comprises information mainly on pre-industrial societies that were described by ethnographers in the 19th and early 20th centuries.

A July 8, 2016 University of Toronto news release (also on EurekAlert), which originated the news item, expands on the theme,

“Human cultural diversity is expressed in numerous ways: from the foods we eat and the houses we build, to our religious practices and political organisation, to who we marry and the types of games we teach our children,” said Kathryn Kirby, a postdoctoral fellow in the Departments of Ecology & Evolutionary Biology and Geography at the University of Toronto and lead author of the study. “Cultural practices vary across space and time, but the factors and processes that drive cultural change and shape patterns of diversity remain largely unknown.

“D-PLACE will enable a whole new generation of scholars to answer these long-standing questions about the forces that have shaped human cultural diversity.”

Co-author Fiona Jordan, senior lecturer in anthropology at the University of Bristol and one of the project leads said, “Comparative research is critical for understanding the processes behind cultural diversity. Over a century of anthropological research around the globe has given us a rich resource for understanding the diversity of humanity – but bringing different resources and datasets together has been a huge challenge in the past.

“We’ve drawn on the emerging big data sets from ecology, and combined these with cultural and linguistic data so researchers can visualise diversity at a glance, and download data to analyse in their own projects.”

D-PLACE allows users to search by cultural practice (e.g., monogamy vs. polygamy), environmental variable (e.g. elevation, mean annual temperature), language family (e.g. Indo-European, Austronesian), or region (e.g. Siberia). The search results can be displayed on a map, a language tree or in a table, and can also be downloaded for further analysis.
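
For anyone who downloads those search results, analysis in a tool like Python’s pandas is straightforward. What follows is a hypothetical sketch only; the file name and the column names (“society”, “language_family”, “variable”, “value”) are my assumptions, so check the actual export for its schema.

```python
# A hypothetical sketch of analysing a D-PLACE download with pandas. The file
# name and column names ("society", "language_family", "variable", "value")
# are assumptions; check the actual export from d-place.org for its schema.
import pandas as pd

df = pd.read_csv("dplace_export.csv")  # a search-result download from the site

# e.g., societies in the Austronesian language family, by marriage practice
subset = df[(df["language_family"] == "Austronesian") &
            (df["variable"] == "marriage_system")]
print(subset.groupby("value")["society"].nunique())  # society counts per practice
```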

It aims to enable researchers to investigate the extent to which patterns in cultural diversity are shaped by different forces, including shared history, demographics, migration/diffusion, cultural innovations, and environmental and ecological conditions.

D-PLACE was developed by an international team of scientists interested in cross-cultural research. It includes researchers from the Max Planck Institute for the Science of Human History in Jena, Germany, the University of Auckland, Colorado State University, University of Toronto, University of Bristol, Yale, Human Relations Area Files, Washington University in Saint Louis, University of Michigan, American Museum of Natural History, and City University of New York.

The diverse team included linguists, anthropologists, biogeographers, data scientists, ethnobiologists, and evolutionary ecologists, who employ a variety of research methods, including field-based primary data collection, compilation of cross-cultural data sources, and analyses of existing cross-cultural datasets.

“The team’s diversity is reflected in D-PLACE, which is designed to appeal to a broad user base,” said Kirby. “Envisioned users range from members of the public world-wide interested in comparing their cultural practices with those of other groups, to cross-cultural researchers interested in pushing the boundaries of existing research into the drivers of cultural change.”

Here’s a link to and a citation for the paper,

D-PLACE: A Global Database of Cultural, Linguistic and Environmental Diversity by Kathryn R. Kirby, Russell D. Gray, Simon J. Greenhill, Fiona M. Jordan, Stephanie Gomes-Ng, Hans-Jörg Bibiko, Damián E. Blasi, Carlos A. Botero, Claire Bowern, Carol R. Ember, Dan Leehr, Bobbi S. Low, Joe McCarter, William Divale, Michael C. Gavin.  PLOS ONE, 2016; 11 (7): e0158391 DOI: 10.1371/journal.pone.0158391 Published July 8, 2016.

This paper is open access.

You can find D-PLACE here.

While it might not seem that there would be a close link between anthropology and physics in the 19th and early 20th centuries, that information can be mined for more contemporary applications. For example, someone who wants to make a case for a more diverse scientific community may want to develop a social science approach to the discussion. The situation in my June 16, 2016 post titled: Science literacy, science advice, the US Supreme Court, and Britain’s House of Commons, could be extended into a discussion and educational process using data from D-PLACE and other sources to make the point,

Science literacy may not be just for the public; it would seem that US Supreme Court judges may not have a basic understanding of how science works. David Bruggeman’s March 24, 2016 posting (on his Pasco Phronesis blog) describes a then-current case before the Supreme Court (Justice Antonin Scalia has since died), Note: Links have been removed,

It’s a case concerning aspects of the University of Texas admissions process for undergraduates and the case is seen as a possible means of restricting race-based considerations for admission. While I think the arguments in the case will likely revolve around factors far removed from science and/or technology, there were comments raised by two Justices that struck a nerve with many scientists and engineers.

Both Justice Antonin Scalia and Chief Justice John Roberts raised questions about the validity of having diversity where science and scientists are concerned [emphasis mine]. Justice Scalia seemed to imply that diversity wasn’t essential for the University of Texas as most African-American scientists didn’t come from schools at the level of the University of Texas (considered the best university in Texas). Chief Justice Roberts was a bit more plain about not understanding the benefits of diversity. He stated, “What unique perspective does a black student bring to a class in physics?”

To that end, Dr. S. James Gates, theoretical physicist at the University of Maryland, and member of the President’s Council of Advisors on Science and Technology (and commercial actor) has an editorial in the March 25 [2016] issue of Science explaining that the value of having diversity in science does not accrue *just* to those who are underrepresented.

Dr. Gates relates his personal experience as a researcher and teacher of how people’s backgrounds inform their practice of science, and how two different people may use the same scientific method but think about the problem differently.

I’m guessing that both Scalia and Roberts and possibly others believe that science is the discovery and accumulation of facts. In this worldview science facts such as gravity are waiting for discovery and formulation into a ‘law’. They do not recognize that most science is a collection of beliefs and may be influenced by personal beliefs. For example, we believe we’ve proved the existence of the Higgs boson but no one associated with the research has ever stated unequivocally that it exists.

More generally, with D-PLACE and the recently announced Trans-Atlantic Platform (see my July 15, 2016 post about it), it seems Canada’s humanities and social sciences communities are taking strides toward greater international collaboration and a more profound investment in digital scholarship.

Science literacy, science advice, the US Supreme Court, and Britain’s House of Commons

This ‘think’ piece is going to cover a fair bit of ground including science literacy in the general public and in the US Supreme Court, and what that might mean for science advice and UK Members of Parliament (MPs).

Science literacy generally and in the US Supreme Court

A science literacy report for the US National Academy of Sciences (NAS), due sometime between early and mid-2017, is being crafted with an eye to capturing a different perspective, according to a March 24, 2016 University of Wisconsin-Madison news release by Terry Dewitt,

What does it mean to be science literate? How science literate is the American public? How do we stack up against other countries? What are the civic implications of a public with limited knowledge of science and how it works? How is science literacy measured?

These and other questions are under the microscope of a 12-member National Academy of Sciences (NAS) panel — including University of Wisconsin—Madison Life Sciences Communication Professor Dominique Brossard and School of Education Professor Noah Feinstein — charged with sorting through the existing data on American science and health literacy and exploring the association between knowledge of science and public perception of and support for science.

The committee — composed of educators, scientists, physicians and social scientists — will take a hard look at the existing data on the state of U.S. science literacy, the questions asked, and the methods used to measure what Americans know and don’t know about science and how that knowledge has changed over time. Critically for science, the panel will explore whether a lack of science literacy is associated with decreased public support for science or research.

Historically, policymakers and leaders in the scientific community have fretted over a perceived lack of knowledge among Americans about science and how it works. A prevailing fear is that an American public unequipped to come to terms with modern science will ultimately have serious economic, security and civic consequences, especially when it comes to addressing complex and nuanced issues like climate change, antibiotic resistance, emerging diseases, environment and energy choices.

While the prevailing wisdom, inspired by past studies, is that Americans don’t stack up well in terms of understanding science, Brossard is not so convinced. Much depends on what kinds of questions are asked, how they are asked, and how the data is analyzed.

It is very easy, she argues, to do bad social science and past studies may have measured the wrong things or otherwise created a perception about the state of U.S. science literacy that may or may not be true.

“How do you conceptualize scientific literacy? What do people need to know? Some argue that scientific literacy may be as simple as an understanding of how science works, the nature of science, [emphasis mine]” Brossard explains. “For others it may be a kind of ‘civic science literacy,’ where people have enough knowledge to be informed and make good decisions in a civics context.”

Science literacy may not be just for the public; it would seem that US Supreme Court judges may not have a basic understanding of how science works. David Bruggeman’s March 24, 2016 posting (on his Pasco Phronesis blog) describes a then-current case before the Supreme Court (Justice Antonin Scalia has since died), Note: Links have been removed,

It’s a case concerning aspects of the University of Texas admissions process for undergraduates and the case is seen as a possible means of restricting race-based considerations for admission. While I think the arguments in the case will likely revolve around factors far removed from science and/or technology, there were comments raised by two Justices that struck a nerve with many scientists and engineers.

Both Justice Antonin Scalia and Chief Justice John Roberts raised questions about the validity of having diversity where science and scientists are concerned [emphasis mine]. Justice Scalia seemed to imply that diversity wasn’t essential for the University of Texas as most African-American scientists didn’t come from schools at the level of the University of Texas (considered the best university in Texas). Chief Justice Roberts was a bit more plain about not understanding the benefits of diversity. He stated, “What unique perspective does a black student bring to a class in physics?”

To that end, Dr. S. James Gates, theoretical physicist at the University of Maryland, and member of the President’s Council of Advisors on Science and Technology (and commercial actor) has an editorial in the March 25 [2016] issue of Science explaining that the value of having diversity in science does not accrue *just* to those who are underrepresented.

Dr. Gates relates his personal experience as a researcher and teacher of how people’s backgrounds inform their practice of science, and how two different people may use the same scientific method but think about the problem differently.

I’m guessing that both Scalia and Roberts and possibly others believe that science is the discovery and accumulation of facts. In this worldview science facts such as gravity are waiting for discovery and formulation into a ‘law’. They do not recognize that most science is a collection of beliefs and may be influenced by personal beliefs. For example, we believe we’ve proved the existence of the Higgs boson but no one associated with the research has ever stated unequivocally that it exists.

For judges who are under the impression that scientific facts are out there somewhere waiting to be discovered, diversity must seem irrelevant. It is not. Who you are affects the questions you ask and how you approach science. The easiest example is to look at how women were viewed when they were subjects in medical research. The fact that women’s physiology is significantly different (and not just in child-bearing ways) was never considered relevant when reporting results. Today, researchers consider not only gender, but age (to some extent), ethnicity, and more when examining results. It’s still not perfect, but it was a step forward.

So when Brossard included “… an understanding of how science works, the nature of science …” as an aspect of science literacy, the judges seemed to present a good example of how not understanding science can have a major impact on how others live.

I’d almost forgotten this science literacy piece as I’d started the draft some months ago but then I spotted a news item about a science advice/MP ‘dating’ service in the UK.

Science advice and UK MPs

First, the news; then, the speculation (from a June 6, 2016 news item on ScienceDaily),

MPs have expressed an overwhelming willingness to use a proposed new service to swiftly link them with academics in relevant areas to help ensure policy is based on the latest evidence.

A June 6, 2016 University of Exeter press release, which originated the news item, provides more detail about the proposed service and the research providing the supporting evidence (Note: A link has been removed),

The government is pursuing a drive towards evidence-based policy, yet policy makers still struggle to incorporate evidence into their decisions. One reason for this is limited easy access to the latest research findings or to academic experts who can respond to questions about evidence quickly.

Researchers at Cardiff University, the University of Exeter and University College London have today published results of the largest study to date reporting MPs’ attitudes to evidence in policy making and their reactions to a proposed Evidence Information Service (EIS) – a rapid match-making advisory service that would work alongside existing systems to put MPs in touch with relevant academic experts.

Dr Natalia Lawrence, of the University of Exeter, said: “It’s clear from our study that politicians want to ensure their decisions incorporate the most reliable evidence, but it can sometimes be very difficult for them to know how to access the latest research findings. This new matchmaking service could be a quick and easy way for them to seek advice from cutting-edge researchers and to check their understanding and facts. It could provide a useful complement to existing highly-valued information services.”

The research, published today in the journal Evidence and Policy, reports the findings of a national consultation exercise between politicians and the public. The researchers recruited members of the public to interview their local parliamentary representative. In total, 86 politicians were contacted, with 56 interviews completed. The MPs indicated an overwhelming willingness to use a service such as the EIS, with 85% supporting the idea, but noted a number of potential reservations related to the logistics of the EIS, such as response time and familiarity with the service. Yet the MPs indicated that their logistical reservations could be overcome by accessing the EIS via existing highly-valued parliamentary information services such as those provided by the House of Commons and Lords Libraries. Furthermore, prior to rolling out the EIS on a nationwide basis, it would first need to be piloted.
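
Out of curiosity, I ran the headline figure through a quick sanity check. This is a simple normal-approximation confidence interval in Python, my own calculation rather than anything reported in the paper.

```python
# A quick, unofficial sanity check of the headline figure: 85% support among
# 56 completed interviews. Simple normal approximation, not from the paper.
from math import sqrt

n = 56    # completed MP interviews (press release)
p = 0.85  # proportion supporting the EIS (press release)

se = sqrt(p * (1 - p) / n)
low, high = p - 1.96 * se, p + 1.96 * se
print(f"supporters: ~{p * n:.0f} of {n} MPs")
print(f"95% CI (normal approx.): {low:.0%} to {high:.0%}")  # roughly 76% to 94%
# Strong support, though the sample is small.
```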

Developing the proposed EIS in line with feedback from this consultation of MPs would offer the potential to provide policy makers with rapid, reliable and confidential evidence from willing volunteers from the research community.

Professor Chris Chambers, of Cardiff University, said: “The government has given a robust steer that MPs need to link in more with academics to ensure decisions shaping the future of the country are evidence-based. It’s heartening to see that there is a will to adopt this system and we now need to move into a phase of developing a service that is both simple and effective to meet this need.”

The next steps for the project are parallel consultations of academics and members of the public and a pilot of the EIS, using funding from the GW4 alliance of universities, made up of Bath, Bristol, Cardiff and Exeter.

What this study shows:
• The consultation shows that politicians recognise the importance of evidence-based policy making and agree on the need for an easier and more direct linkage between academic experts and policy makers.
• Politicians would welcome the creation of the EIS as a provider of rapid, reliable and confidential evidence.

What this study does not show:
• This study does not show how academics would provide evidence. This was a small-scale study which consulted politicians and has not attempted to give voice to the academic community.
• This study does not detail the mechanism of an operational EIS. Instead it indicates the need for a service such as the EIS and suggests ways in which the EIS can be operationalized.

Here’s a link to and a citation for the paper,

Service as a new platform for supporting evidence-based policy: a consultation of UK parliamentarians by Natalia Lawrence, Jemma Chambers, Sinead Morrison, Sven Bestmann, Gerard O’Grady, Christopher Chambers, Andrew Kythreotis. Evidence & Policy: A Journal of Research, Debate and Practice DOI: http://dx.doi.org/10.1332/174426416X14643531912169 Appeared or available online: June 6, 2016

This paper is open access. *Corrected June 17, 2016.*

It’s an interesting idea and I can understand the appeal. However, operationalizing this ‘dating’ or ‘matchmaking’ service could prove quite complex. I appreciate the logistics issues but I’m a little more concerned about the MPs’ science literacy. Are they going to be like the two US justices who believe that science is the pursuit of immutable facts? What happens if two MPs are matched up with different scientists and those two scientists don’t agree about what the evidence says? Or, what happens if one scientist is more cautious than the other? There are all kinds of pitfalls. I’m not arguing against the idea but it’s going to require a lot of careful consideration.

University of Maryland looks into transparent wood

Is transparent wood becoming the material du jour? Following on the heels of my April 1, 2016 post about transparent wood and the KTH Royal Institute of Technology (Sweden), there’s a May 6, 2016 news item on ScienceDaily about the material and a team at the University of Maryland,

Researchers at the University of Maryland have made a block of linden wood transparent, which they say will be useful in fancy building materials and in light-based electronics systems.

Materials scientist Liangbing Hu and his team at the University of Maryland, College Park, have removed the molecule in wood, lignin, that makes it rigid and dark in color. They left behind the colorless cellulose cell structures, filled them with epoxy, and came up with a version of the wood that is mostly see-thru.

I wonder if this is the type of material that might be used in structures like the proposed Center of Nanoscience and Nanotechnology building at Tel Aviv University (see my May 9, 2016 posting about a building design that features no doors or windows)?

Regardless, there’s more about this latest transparent wood in a May 5, 2016 University of Maryland news release, which originated the news item,

Remember “xylem” and “phloem” from grade-school science class? These structures pass water and nutrients up and down the tree. Hu and his colleagues see these as vertically aligned channels in the wood, a naturally-grown structure that can be used to pass light along, after the wood has been treated.

The resulting three-inch block of wood had both high transparency—the quality of being see-thru—and high haze—the quality of scattering light. This would be useful, said Hu, in making devices comfortable to look at. It would also help solar cells trap light; light could easily enter through the transparent function, but the high haze would keep the light bouncing around near where it would be absorbed by the solar panel.
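
For the optics-minded: transparency and haze are two separate measurements, and optical haze is commonly defined as the diffusely scattered fraction of the transmitted light. Here’s a tiny Python illustration with made-up transmittance values; the paper’s actual numbers may differ.

```python
# Transparency vs. haze in miniature. Haze is commonly defined as the diffuse
# (scattered) fraction of the transmitted light; these values are made up.

total_transmittance = 0.90    # fraction of incident light that gets through (assumed)
diffuse_transmittance = 0.80  # transmitted light that emerges scattered (assumed)

haze = diffuse_transmittance / total_transmittance
print(f"transparency (total transmittance): {total_transmittance:.0%}")
print(f"haze: {haze:.0%}")
# High transmittance plus high haze: lots of light gets through, but it
# scatters -- handy for trapping light near a solar cell's absorbing layer.
```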

They compared how the materials performed and how light worked its way through the wood when they sliced it two ways: with the grain, so that the channels passed through the longest dimension of the block, and against the grain, so that the channels passed through the shortest dimension of the block.

The short channel wood proved slightly stronger and a little less brittle. But though the natural component making the wood strong had been removed, the addition of the epoxy made the wood four to six times tougher than the untreated version.

Then they investigated how the different directions of the wood affected the way the light passed through it. When laid down on top of a grid, both kinds of wood showed the lines clearly. When lifted just a touch above the grid, the long-channel wood still showed the grid, just a little bit more blurry. But the short channel wood, when lifted those same few millimeters, made the grid completely invisible.

Here’s a link to and a citation for the paper,

Highly Anisotropic, Highly Transparent Wood Composites by Mingwei Zhu, Jianwei Song, Tian Li, Amy Gong, Yanbin Wang, Jiaqi Dai, Yonggang Yao, Wei Luo, Doug Henderson, and Liangbing Hu. Advanced Materials DOI: 10.1002/adma.201600427 Article first published online: 4 MAY 2016

© 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim

This paper is behind a paywall.

New ABCs of research: seminars and a book

David Bruggeman has featured a new book and mentioned its attendant seminars in an April 19, 2016 post on his Pasco Phronesis blog (Note: A link has been removed),

Ben Shneiderman, Professor of Computer Science at the University of Maryland at College Park, recently published The New ABCs of Research: Achieving Breakthrough Collaborations.  It’s meant to be a guide for students and researchers about the various efforts to better integrate different kinds of research and design to improve research outputs and outcomes. …

David has embedded a video of Shneiderman discussing the principles espoused in his book. There are some upcoming seminars including one on Thursday, April 21, 2016 (today) at New York University (NYU) at 12:30 pm at 44 West 4th St, Kaufman Management Center, Room 3-50. From the description on the NYU event page,

Solving the immense problems of the 21st century will require ambitious research teams that are skilled at producing practical solutions and foundational theories simultaneously – that is the ABC Principle: Applied & Basic Combined.  Then these research teams can deliver high-impact outcomes by applying the SED Principle: Blend Science, Engineering and Design Thinking, which encourages use of the methods from all three disciplines.  These guiding principles (ABC & SED) are meant to replace Vannevar Bush’s flawed linear model from 1945 that has misled researchers for 70+ years.  These new guiding principles will enable students, researchers, business leaders, and government policy makers to accelerate discovery and innovation.

Oxford University Press:  http://ukcatalogue.oup.com/product/9780198758839.do

Book website:  http://www.cs.umd.edu/hcil/newabcs

There is another seminar on Wednesday, April 27, 2016 at 3:00 pm in the Pepco Room, #1105 Kim Engineering Building at the University of Maryland which is handy for anyone in the Washington, DC area.

‘Beleafing’ in magic; a new type of battery

A Jan. 28, 2016 news item on ScienceDaily announces the ‘beleaf’,

Scientists have a new recipe for batteries: Bake a leaf, and add sodium. They used a carbonized oak leaf, pumped full of sodium, as a demonstration battery’s negative terminal, or anode, according to a paper published yesterday in the journal ACS Applied Materials & Interfaces.

Scientists baked a leaf to demonstrate a battery. Credit: Image courtesy of Maryland NanoCenter

A Jan. ??, 2016 Maryland NanoCenter (University of Maryland) news release, which originated the news item, provides more information about the nature (pun intended) of the research,

“Leaves are so abundant. All we had to do was pick one up off the ground here on campus,” said Hongbian Li, a visiting professor at the University of Maryland’s department of materials science and engineering and one of the main authors of the paper. Li is a member of the faculty at the National Center for Nanoscience and Technology in Beijing, China.

Other studies have shown that melon skin, banana peels and peat moss can be used in this way, but a leaf needs less preparation.

The scientists are trying to make a battery using sodium where most rechargeable batteries sold today use lithium. Sodium would hold more charge, but can’t handle as many charge-and-discharge cycles as lithium can.

One of the roadblocks has been finding an anode material that is compatible with sodium, which is slightly larger than lithium. Some scientists have explored graphene, dotted with various materials to attract and retain the sodium, but these are time-consuming and expensive to produce. In this case, they simply heated the leaf for an hour at 1,000 degrees C (don’t try this at home) to burn off all but the underlying carbon structure.

The lower side of the maple [?] leaf is studded with pores for the leaf to absorb water. In this new design, the pores absorb the sodium electrolyte. At the top, the layers of carbon that made the leaf tough become sheets of nanostructured carbon to absorb the sodium that carries the charge.

“The natural shape of a leaf already matches a battery’s needs: a low surface area, which decreases defects; a lot of small structures packed closely together, which maximizes space; and internal structures of the right size and shape to be used with sodium electrolyte,” said Fei Shen, a visiting student in the department of materials science and engineering and the other main author of the paper.

“We have tried other natural materials, such as wood fiber, to make a battery,” said Liangbing Hu, an assistant professor of materials science and engineering. “A leaf is designed by nature to store energy for later use, and using leaves in this way could make large-scale storage environmentally friendly.”

The next step, Hu said, is “to investigate different types of leaves to find the best thickness, structure and flexibility” for electrical energy storage.  The researchers have no plans to commercialize at this time.

Here’s a link to and a citation for the paper,

Carbonized-leaf Membrane with Anisotropic Surfaces for Sodium-ion Battery by Hongbian Li, Fei Shen, Wei Luo, Jiaqi Dai, Xiaogang Han, Yanan Chen, Yonggang Yao, Hongli Zhu, Kun Fu, Emily Hitz, and Liangbing Hu. ACS Appl. Mater. Interfaces, 2016, 8 (3), pp 2204–2210 DOI: 10.1021/acsami.5b10875 Publication Date (Web): January 4, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

US National Institute of Standards and Technology and molecules made of light (lightsabres anyone?)

As I recall, lightsabres are a Star Wars invention. I gather we’re a long way from running around with lightsabres but there is hope, if that should be your dream, according to a Sept. 9, 2015 news item on Nanowerk,

… a team including theoretical physicists from JQI [Joint Quantum Institute] and NIST [US National Institute of Standards and Technology] has taken another step toward building objects out of photons, and the findings hint that weightless particles of light can be joined into a sort of “molecule” with its own peculiar force.

Here’s an artist’s conception of the light “molecule” provided by the researchers,

Researchers show that two photons, depicted in this artist’s conception as waves (left and right), can be locked together at a short distance. Under certain conditions, the photons can form a state resembling a two-atom molecule, represented as the blue dumbbell shape at center. Credit: E. Edwards/JQI

A Sept. 8, 2015 NIST news release (also available on EurekAlert*), which originated the news item, provides more information about the research (Note: Links have been removed),

The findings build on previous research that several team members contributed to before joining NIST. In 2013, collaborators from Harvard, Caltech and MIT found a way to bind two photons together so that one would sit right atop the other, superimposed as they travel. Their experimental demonstration was considered a breakthrough, because no one had ever constructed anything by combining individual photons—inspiring some to imagine that real-life lightsabers were just around the corner.

Now, in a paper forthcoming in Physical Review Letters, the NIST and University of Maryland-based team (with other collaborators) has shown theoretically that by tweaking a few parameters of the binding process, photons could travel side by side, a specific distance from each other. The arrangement is akin to the way that two hydrogen atoms sit next to each other in a hydrogen molecule.

“It’s not a molecule per se, but you can imagine it as having a similar kind of structure,” says NIST’s Alexey Gorshkov. “We’re learning how to build complex states of light that, in turn, can be built into more complex objects. This is the first time anyone has shown how to bind two photons a finite distance apart.”

While the new findings appear to be a step in the right direction—if we can build a molecule of light, why not a sword?—Gorshkov says he is not optimistic that Jedi Knights will be lining up at NIST’s gift shop anytime soon. The main reason is that binding photons requires extreme conditions difficult to produce with a roomful of lab equipment, let alone fit into a sword’s handle. Still, there are plenty of other reasons to make molecular light—humbler than lightsabers, but useful nonetheless.

“Lots of modern technologies are based on light, from communication technology to high-definition imaging,” Gorshkov says. “Many of them would be greatly improved if we could engineer interactions between photons.”

For example, engineers need a way to precisely calibrate light sensors, and Gorshkov says the findings could make it far easier to create a “standard candle” that shines a precise number of photons at a detector. Perhaps more significant to industry, binding and entangling photons could allow computers to use photons as information processors, a job that electronic switches in your computer do today.

Not only would this provide a new basis for creating computer technology, but it also could result in substantial energy savings. Phone messages and other data that currently travel as light beams through fiber optic cables have to be converted into electrons for processing—an inefficient step that wastes a great deal of electricity. If both the transport and the processing of the data could be done with photons directly, it could reduce these energy losses.

Gorshkov says it will be important to test the new theory in practice for these and other potential benefits.

“It’s a cool new way to study photons,” he says. “They’re massless and fly at the speed of light. Slowing them down and binding them may show us other things we didn’t know about them before.”

Here are links and citations for the paper. First, there’s an early version on arXiv.org and, then, there’s the peer-reviewed version, which is not yet available,

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, A. V. Gorshkov. arXiv:1505.03859 [quant-ph]

Coulomb bound states of strongly interacting photons by M. F. Maghrebi, M. J. Gullans, P. Bienias, S. Choi, I. Martin, O. Firstenberg, M. D. Lukin, H. P. Büchler, and A. V. Gorshkov.
Phys. Rev. Lett. forthcoming in September 2015.

The first version (arXiv) is open access and I’m not sure whether the Physical Review Letters version will be behind a paywall or available as an open access paper.

*EurekAlert link added 10:34 am PST on Sept. 11, 2015.

Northwestern University’s (US) International Institute for Nanotechnology (IIN) rakes in some cash

Within less than a month, Northwestern University’s International Institute for Nanotechnology (IIN) has been awarded two grants by the US Department of Defense.

4D printing

The first grant, for 4D printing, was announced in a June 11, 2015 Northwestern news release by Megan Fellman (Note: A link has been removed),

Northwestern University’s International Institute for Nanotechnology (IIN) has received a five-year, $8.5 million grant from the U.S. Department of Defense’s competitive Multidisciplinary University Research Initiative (MURI) program to develop a “4-dimensional printer” — the next generation of printing technology for the scientific world.

Once developed, the 4-D printer, operating on the nanoscale, will be used to construct new devices for research in chemistry, materials sciences and U.S. defense-related areas that could lead to new chemical and biological sensors, catalysts, microchip designs and materials designed to respond to specific materials or signals.

“This research promises to bring transformative advancement to the development of biosensors, adaptive optics, artificially engineered tissues and more by utilizing nanotechnology,” said IIN director and chemist Chad A. Mirkin, who is leading the multi-institution project. Mirkin is the George B. Rathmann Professor of Chemistry in the Weinberg College of Arts and Sciences.

The award, issued by the Air Force Office of Scientific Research, supports a team of experts from Northwestern, the University of Miami, the University of California, San Diego, and the University of Maryland.

In science, “printing” encodes information at specific locations on a material’s surface, similar to how we print words on paper with ink. The 4-dimensional printer will consist of millions of tiny elastomeric “pens” that can be used individually and independently to create nanometer-size features composed of hard or soft materials.

The information encoded can be in the form of materials with a defined set of chemical and physical properties. The printing speed and resolution determine the amount and complexity of the information that can be encoded.

Progress in fields ranging from biology to chemical sensing to computing currently is limited by the lack of low-cost equipment that can perform high-resolution printing and 3-dimensional patterning on hard materials (e.g., metals and semiconductors) and soft materials (e.g., organic and biological materials) at nanometer resolution (approximately 1,000 times smaller than the width of a human hair).

“Ultimately, the 4-D printer will provide a foundation for a new generation of tools to develop novel architectures, wherein the hard materials that form the functional components of electronics can be merged with biological or soft materials,” said Milan Mrksich, a co-principal investigator on the grant.

Mrksich is the Henry Wade Rogers Professor of Biomedical Engineering, Chemistry and Cell and Molecular Biology, with appointments in the McCormick School of Engineering and Applied Science, Weinberg and Northwestern University Feinberg School of Medicine.

A July 10, 2015 article about the ‘4D printer’ grant by Madeline Fox for the Daily Northwestern features a description of 4D printing from Milan Mrksich, a co-principal investigator on the grant,

Milan Mrksich, one of the project’s five senior participants, said that while most people are familiar with the three dimensions of length, width and depth, there are often misconceptions about the fourth property of a four-dimensional object. Mrksich used Legos as an analogy to describe 4D printing technology.

“If you take Lego blocks, you can basically build any structure you want by controlling which Lego is connected to which Lego and controlling all their dimensions in space,” Mrksich said. “Within an object made up of nanoparticles, we’re controlling the placement — as we use a printer to control the placement of every particle, our fourth dimension lets us choose which nanoparticle with which property would be at each position.”

Thank you Dr. Mrksich and Ms. Fox for that helpful analogy.

Designing advanced bioprogrammable nanomaterials

The second grant, announced in a July 6, 2015 Northwestern news release by Megan Fellman, is apparently the only one of its kind in the US (Note: A link has been removed),

Northwestern University’s International Institute for Nanotechnology (IIN) has been awarded a U.S. Air Force Center of Excellence grant to design advanced bioprogrammable nanomaterials for solutions to challenging problems in the areas of energy, the environment, security and defense, as well as for developing ways to monitor and mitigate human stress.

The five-year, $9.8 million grant establishes the Center of Excellence for Advanced Bioprogrammable Nanomaterials (C-ABN), the only one of its kind in the country. After the initial five years, the grant potentially could be renewed for an additional five years.

“Northwestern University was chosen to lead this Center of Excellence because of its investment in infrastructure development, including new facilities and instrumentation; its recruitment of high-caliber faculty members and students; and its track record in bio-nanotechnology and cognitive sciences,” said Timothy Bunning, chief scientist at the U.S. Air Force Research Laboratory (AFRL) Materials and Manufacturing Directorate.

Led by IIN director Chad A. Mirkin, C-ABN will support collaborative, discovery-based research projects aimed at developing bioprogrammable nanomaterials that will meet both military and civilian needs and facilitate the efficient transition of these new technologies from the laboratory to marketplace.

Bioprogrammable nanomaterials are structures that typically contain a biomolecular component, such as nucleic acids or proteins, which give the materials a variety of novel capabilities. [emphasis mine] Nanomaterials can be designed to assemble into large 3-D structures, to interface with biological structures inside cells or tissues, or to interface with existing macroscale devices, for example. These new bioprogrammable nanomaterials and the fundamental knowledge gained through their development will ultimately lead to the creation of wearable, portable and/or human-interactive devices with extraordinary capabilities that will significantly impact both civilian and Air Force needs.

In one research area, scientists will work to understand the molecular underpinnings of vulnerability and resilience to stress. They will use bioprogrammable nanomaterials to develop ultrasensitive sensors capable of detecting and quantifying biomarkers for human stress in biological fluids (e.g., saliva, perspiration or blood), providing means to easily monitor the soldier during times of extreme stress. Ultimately, these bioprogrammable materials may lead to methods to increase human cellular resilience to the effects of stress and/or to correct genetic mutations that decrease cellular resilience of susceptible individuals.

Other research projects, encompassing a wide variety of nanotechnology-enabled goals, include:

Developing hybrid wearable energy-storage devices;
Developing devices to identify chemical and biological targets in a field environment;
Developing flexible bio-electronic circuits;
Designing a new class of flat optics; and
Advancing understanding of design rules between 2-D and 3-D architectures.

The analysis of these nanostructures also will extend fundamental knowledge in the fields of materials science and engineering, human performance, chemistry, biology and physics.

The center will be housed under the IIN, providing researchers with access to IIN’s strong entrepreneurial community and its close ties with Northwestern’s renowned Kellogg School of Management.

This second news release provides an interesting contrast to a recent news release from Sweden’s Karolinska Institute, where the writer was careful to note that the enzymes and organic electronic ion pumps were not living, as noted in my June 26, 2015 posting. It seems nucleic acids (as in RNA and DNA) can be mentioned without a proviso in the US, as there seems to be little worry about anti-GMO (genetically modified organisms) and similar backlashes affecting biotechnology research.

I sing the body cyber: two projects funded by the US National Science Foundation

Points to anyone who recognized the reference to Walt Whitman’s poem, “I sing the body electric,” from his classic collection, Leaves of Grass (1867 edition; h/t Wikipedia entry). I wonder if the cyber physical systems (CPS) work being funded by the US National Science Foundation (NSF) in the US will occasion poetry too.

More practically, a May 15, 2015 news item on Nanowerk, describes two cyber physical systems (CPS) research projects newly funded by the NSF,

Today [May 12, 2015] the National Science Foundation (NSF) announced two five-year, center-scale awards totaling $8.75 million to advance the state-of-the-art in medical and cyber-physical systems (CPS).

One project will develop “Cyberheart”–a platform for virtual, patient-specific human heart models and associated device therapies that can be used to improve and accelerate medical-device development and testing. The other project will combine teams of microrobots with synthetic cells to perform functions that may one day lead to tissue and organ re-generation.

CPS are engineered systems that are built from, and depend upon, the seamless integration of computation and physical components. Often called the “Internet of Things,” CPS enable capabilities that go beyond the embedded systems of today.

“NSF has been a leader in supporting research in cyber-physical systems, which has provided a foundation for putting the ‘smart’ in health, transportation, energy and infrastructure systems,” said Jim Kurose, head of Computer & Information Science & Engineering at NSF. “We look forward to the results of these two new awards, which paint a new and compelling vision for what’s possible for smart health.”

Cyber-physical systems have the potential to benefit many sectors of our society, including healthcare. While advances in sensors and wearable devices have the capacity to improve aspects of medical care, from disease prevention to emergency response, and synthetic biology and robotics hold the promise of regenerating and maintaining the body in radical new ways, little is known about how advances in CPS can integrate these technologies to improve health outcomes.

These new NSF-funded projects will investigate two very different ways that CPS can be used in the biological and medical realms.

A May 12, 2015 NSF news release (also on EurekAlert), which originated the news item, describes the two CPS projects,

Bio-CPS for engineering living cells

A team of leading computer scientists, roboticists and biologists from Boston University, the University of Pennsylvania and MIT have come together to develop a system that combines the capabilities of nano-scale robots with specially designed synthetic organisms. Together, they believe this hybrid “bio-CPS” will be capable of performing heretofore impossible functions, from microscopic assembly to cell sensing within the body.

“We bring together synthetic biology and micron-scale robotics to engineer the emergence of desired behaviors in populations of bacterial and mammalian cells,” said Calin Belta, a professor of mechanical engineering, systems engineering and bioinformatics at Boston University and principal investigator on the project. “This project will impact several application areas ranging from tissue engineering to drug development.”

The project builds on previous research by each team member in diverse disciplines and early proof-of-concept designs of bio-CPS. According to the team, the research is also driven by recent advances in the emerging field of synthetic biology, in particular the ability to rapidly incorporate new capabilities into simple cells. Researchers so far have not been able to control and coordinate the behavior of synthetic cells in isolation, but the introduction of microrobots that can be externally controlled may be transformative.

In this new project, the team will focus on bio-CPS with the ability to sense, transport and work together. As a demonstration of their idea, they will develop teams of synthetic cell/microrobot hybrids capable of constructing a complex, fabric-like surface.

Vijay Kumar (University of Pennsylvania), Ron Weiss (MIT), and Douglas Densmore (BU) are co-investigators of the project.

Medical-CPS and the ‘Cyberheart’

CPS such as wearable sensors and implantable devices are already being used to assess health, improve quality of life, provide cost-effective care and potentially speed up disease diagnosis and prevention. [emphasis mine]

Extending these efforts, researchers from seven leading universities and centers are working together to develop far more realistic cardiac and device models than currently exist. This so-called “Cyberheart” platform can be used to test and validate medical devices faster and at a far lower cost than existing methods. CyberHeart also can be used to design safe, patient-specific device therapies, thereby lowering the risk to the patient.

“Innovative ‘virtual’ design methodologies for implantable cardiac medical devices will speed device development and yield safer, more effective devices and device-based therapies, than is currently possible,” said Scott Smolka, a professor of computer science at Stony Brook University and one of the principal investigators on the award.

The group’s approach combines patient-specific computational models of heart dynamics with advanced mathematical techniques for analyzing how these models interact with medical devices. The analytical techniques can be used to detect potential flaws in device behavior early on during the device-design phase, before animal and human trials begin. They also can be used in a clinical setting to optimize device settings on a patient-by-patient basis before devices are implanted.

“We believe that our coordinated, multi-disciplinary approach, which balances theoretical, experimental and practical concerns, will yield transformational results in medical-device design and foundations of cyber-physical system verification,” Smolka said.

The team will develop virtual device models which can be coupled together with virtual heart models to realize a full virtual development platform that can be subjected to computational analysis and simulation techniques. Moreover, they are working with experimentalists who will study the behavior of virtual and actual devices on animals’ hearts.

Co-investigators on the project include Edmund Clarke (Carnegie Mellon University), Elizabeth Cherry (Rochester Institute of Technology), W. Rance Cleaveland (University of Maryland), Flavio Fenton (Georgia Tech), Rahul Mangharam (University of Pennsylvania), Arnab Ray (Fraunhofer Center for Experimental Software Engineering [Germany]) and James Glimm and Radu Grosu (Stony Brook University). Richard A. Gray of the U.S. Food and Drug Administration is another key contributor.

It is fascinating to observe how terminology is shifting from pacemakers and deep brain stimulators as implants to “CPS such as wearable sensors and implantable devices … .” A new category has been created, CPS, which conjoins medical devices with other sensing devices such as wearable fitness monitors found in the consumer market. I imagine it’s an attempt to quell fears about injecting strange things into or adding strange things to your body—microrobots and nanorobots partially derived from synthetic biology research which are “… capable of performing heretofore impossible functions, from microscopic assembly to cell sensing within the body.” They’ve also sneaked in a reference to synthetic biology, an area of research where some concerns have been expressed, from my March 19, 2013 post about a poll and synthetic biology concerns,

In our latest survey, conducted in January 2013, three-fourths of respondents say they have heard little or nothing about synthetic biology, a level consistent with that measured in 2010. While initial impressions about the science are largely undefined, these feelings do not necessarily become more positive as respondents learn more. The public has mixed reactions to specific synthetic biology applications, and almost one-third of respondents favor a ban “on synthetic biology research until we better understand its implications and risks,” while 61 percent think the science should move forward.

I imagine that for scientists, 61% in favour of more research is not particularly comforting given how easily and quickly public opinion can shift.