Category Archives: military

Robots, Dallas (US), ethics, and killing

I’ve waited a while before posting this piece in the hope that the situation would calm. Sadly, it took longer than hoped, as there was an additional shooting of police officers in Baton Rouge on July 17, 2016. (There’s more about that shooting in a July 18, 2016 news posting by Steve Visser for CNN.)

Finally: Robots, Dallas, ethics, and killing: In the wake of the Thursday, July 7, 2016 shooting in Dallas (Texas, US) and the subsequent use of a robot armed with a bomb to kill the suspect, a discussion about ethics has arisen.

This discussion comes at a difficult period. In the same week as the targeted shooting of white police officers in Dallas, two African-American men were shot and killed in two apparently unprovoked shootings by police. The victims were Alton Sterling in Baton Rouge, Louisiana on Tuesday, July 5, 2016 and Philando Castile in Minnesota on Wednesday, July 6, 2016. (There’s more detail about the shootings prior to Dallas in a July 7, 2016 news item on CNN.) The suspect in Dallas, Micah Xavier Johnson, a 25-year-old African-American man, had served in the US Army Reserve and been deployed in Afghanistan (there’s more in a July 9, 2016 news item by Emily Shapiro, Julia Jacobo, and Stephanie Wash for abcnews.go.com). All of this has taken place within the context of Black Lives Matter, a movement started in 2013 in the US.

Getting back to robots, most of the material I’ve seen about ‘killing or killer’ robots has so far involved industrial accidents (very few to date) and ethical issues for self-driven cars (see a May 31, 2016 posting by Noah J. Goodall on the IEEE [Institute of Electrical and Electronics Engineers] Spectrum website).

The incident in Dallas is apparently the first time a US police organization has used a robot to deliver a lethal bomb, although it has been an occasional practice by US Armed Forces in combat situations. Rob Lever in a July 8, 2016 Agence France-Presse piece on phys.org focuses on the technology aspect,

The “bomb robot” killing of a suspected Dallas shooter may be the first lethal use of an automated device by American police, and underscores the growing role of technology in law enforcement.

Regardless of the methods in Dallas, the use of robots is expected to grow, to handle potentially dangerous missions in law enforcement and the military.


Researchers at Florida International University meanwhile have been working on a TeleBot that would allow disabled police officers to control a humanoid robot.

The robot, described in some reports as similar to the “RoboCop” in films from 1987 and 2014, was designed “to look intimidating and authoritative enough for citizens to obey the commands,” but with a “friendly appearance” that makes it “approachable to citizens of all ages,” according to a research paper.

Robot developers downplay the potential for the use of automated lethal force by the devices, but some analysts say debate on this is needed, both for policing and the military.

A July 9, 2016 Associated Press piece by Michael Liedtke and Bree Fowler on phys.org focuses more closely on ethical issues raised by the Dallas incident,

When Dallas police used a bomb-carrying robot to kill a sniper, they also kicked off an ethical debate about technology’s use as a crime-fighting weapon.

The strategy opens a new chapter in the escalating use of remote and semi-autonomous devices to fight crime and protect lives. It also raises new questions over when it’s appropriate to dispatch a robot to kill dangerous suspects instead of continuing to negotiate their surrender.

“If lethally equipped robots can be used in this situation, when else can they be used?” says Elizabeth Joh, a University of California at Davis law professor who has followed U.S. law enforcement’s use of technology. “Extreme emergencies shouldn’t define the scope of more ordinary situations where police may want to use robots that are capable of harm.”

In approaching the question about the ethics, Mike Masnick’s July 8, 2016 posting on Techdirt provides a surprisingly sympathetic reading of the Dallas Police Department’s actions, as well as asking some provocative questions about how robots might be better employed by police organizations (Note: Links have been removed),

The Dallas Police have a long history of engaging in community policing designed to de-escalate situations, rather than encourage antagonism between police and the community, and have been handling all of this with astounding restraint, frankly. Many other police departments would be lashing out, and yet the Dallas Police Dept, while obviously grieving for a horrible situation, appear to be handling this tragic situation professionally. And it appears that they did everything they could in a reasonable manner. They first tried to negotiate with Johnson, but after that failed and they feared more lives would be lost, they went with the robot + bomb option. And, obviously, considering he had already shot many police officers, I don’t think anyone would question the police justification if they had shot Johnson.

But, still, at the very least, the whole situation raises a lot of questions about the legality of police using a bomb offensively to blow someone up. And, it raises some serious questions about how other police departments might use this kind of technology in the future. The situation here appears to be one where people reasonably concluded that this was the most effective way to stop further bloodshed. And this is a police department with a strong track record of reasonable behavior. But what about other police departments where they don’t have that kind of history? What are the protocols for sending in a robot or drone to kill someone? Are there any rules at all?

Furthermore, it actually makes you wonder, why isn’t there a focus on using robots to de-escalate these situations? What if, instead of buying military surplus bomb robots, there were robots being designed to disarm a shooter, or detain him in a manner that would make it easier for the police to capture him alive? Why should the focus of remote robotic devices be to kill him? This isn’t faulting the Dallas Police Department for its actions last night. But, rather, if we’re going to enter the age of robocop, shouldn’t we be looking for ways to use such robotic devices in a manner that would help capture suspects alive, rather than dead?

Gordon Corera’s July 12, 2016 article on the BBC’s (British Broadcasting Corporation) news website provides an overview of the use of automation and of ‘killing/killer robots’,

Remote killing is not new in warfare. Technology has always been driven by military application, including allowing killing to be carried out at distance – prior examples might be the introduction of the longbow by the English at Crecy in 1346, then later the Nazi V1 and V2 rockets.

More recently, unmanned aerial vehicles (UAVs) or drones such as the Predator and the Reaper have been used by the US outside of traditional military battlefields.

Since 2009, the official US estimate is that about 2,500 “combatants” have been killed in 473 strikes, along with perhaps more than 100 non-combatants. Critics dispute those figures as being too low.

Back in 2008, I visited the Creech Air Force Base in the Nevada desert, where drones are flown from.

During our visit, the British pilots from the RAF deployed their weapons for the first time.

One of the pilots visibly bristled when I asked him if it ever felt like playing a video game – a question that many ask.

The military uses encrypted channels to control its ordnance disposal robots, but – as any hacker will tell you – there is almost always a flaw somewhere that a determined opponent can find and exploit.

We have already seen cars being taken control of remotely while people are driving them, and the nightmare of the future might be someone taking control of a robot and sending a weapon in the wrong direction.

The military is at the cutting edge of developing robotics, but domestic policing is also a different context in which greater separation from the community being policed risks compounding problems.

The balance between risks and benefits of robots, remote control and automation remain unclear.

But Dallas suggests that the future may be creeping up on us faster than we can debate it.

The excerpts here do not do justice to the articles. If you’re interested in this topic and have the time, I encourage you to read all the articles cited here in their entirety.

*(ETA: July 25, 2016 at 1405 hours PDT: There is a July 25, 2016 essay by Carrie Sheffield for Salon.com which may provide some insight into the Black Lives Matter movement and some of the generational issues within the US African-American community as revealed by the movement.)*

Wireless, wearable carbon nanotube-based gas sensors for soldiers

Researchers at MIT (Massachusetts Institute of Technology) are hoping to make wireless toxic-gas detectors the size of badges. From a June 30, 2016 news item on Nanowerk,

MIT researchers have developed low-cost chemical sensors, made from chemically altered carbon nanotubes, that enable smartphones or other wireless devices to detect trace amounts of toxic gases.

Using the sensors, the researchers hope to design lightweight, inexpensive radio-frequency identification (RFID) badges to be used for personal safety and security. Such badges could be worn by soldiers on the battlefield to rapidly detect the presence of chemical weapons — such as nerve gas or choking agents — and by people who work around hazardous chemicals prone to leakage.

A June 30, 2016 MIT news release (also on EurekAlert), which originated the news item, describes the technology further,

“Soldiers have all this extra equipment that ends up weighing way too much and they can’t sustain it,” says Timothy Swager, the John D. MacArthur Professor of Chemistry and lead author on a paper describing the sensors that was published in the Journal of the American Chemical Society. “We have something that would weigh less than a credit card. And [soldiers] already have wireless technologies with them, so it’s something that can be readily integrated into a soldier’s uniform that can give them a protective capacity.”

The sensor is a circuit loaded with carbon nanotubes, which are normally highly conductive but have been wrapped in an insulating material that keeps them in a highly resistive state. When exposed to certain toxic gases, the insulating material breaks apart, and the nanotubes become significantly more conductive. This sends a signal that’s readable by a smartphone with near-field communication (NFC) technology, which allows devices to transmit data over short distances.

The sensors are sensitive enough to detect less than 10 parts per million of target toxic gases in about five seconds. “We are matching what you could do with benchtop laboratory equipment, such as gas chromatographs and spectrometers, that is far more expensive and requires skilled operators to use,” Swager says.

Moreover, the sensors each cost about a nickel to make; roughly 4 million can be made from about 1 gram of the carbon nanotube materials. “You really can’t make anything cheaper,” Swager says. “That’s a way of getting distributed sensing into many people’s hands.”

The paper’s other co-authors are from Swager’s lab: Shinsuke Ishihara, a postdoc who is also a member of the International Center for Materials Nanoarchitectonics at the National Institute for Materials Science, in Japan; and PhD students Joseph Azzarelli and Markrete Krikorian.

Wrapping nanotubes

In recent years, Swager’s lab has developed other inexpensive, wireless sensors, called chemiresistors, that have detected spoiled meat and the ripeness of fruit, among other things [go to the end of this post for links to previous posts about Swager’s work]. All are designed similarly, with carbon nanotubes that are chemically modified, so their ability to carry an electric current changes when exposed to a target chemical.

This time, the researchers designed sensors highly sensitive to “electrophilic,” or electron-loving, chemical substances, which are often toxic and used for chemical weapons.

To do so, they created a new type of metallo-supramolecular polymer, a material made of metals binding to polymer chains. The polymer acts as an insulation, wrapping around each of the sensor’s tens of thousands of single-walled carbon nanotubes, separating them and keeping them highly resistant to electricity. But electrophilic substances trigger the polymer to disassemble, allowing the carbon nanotubes to once again come together, which leads to an increase in conductivity.

In their study, the researchers drop-cast the nanotube/polymer material onto gold electrodes, and exposed the electrodes to diethyl chlorophosphate, a skin irritant and reactive simulant of nerve gas. Using a device that measures electric current, they observed a 2,000 percent increase in electrical conductivity after five seconds of exposure. Similar conductivity increases were observed for trace amounts of numerous other electrophilic substances, such as thionyl chloride (SOCl2), a reactive simulant in choking agents. Conductivity was significantly lower in response to common volatile organic compounds, and exposure to most nontarget chemicals actually increased resistivity.
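For the curious, the detection criterion in that paragraph can be sketched in a few lines of Python. The threshold and the readings below are my own illustrative numbers, not the researchers’ calibration; the point is only the shape of the logic: target gases spike conductivity, while nontargets tend to raise resistivity instead.

```python
# Sketch of the detection criterion described above: a large jump in
# conductivity relative to baseline signals an electrophilic target gas.
# The 1,000 percent threshold and the readings below are illustrative
# values only, not the researchers' actual calibration.

def conductivity_increase_pct(baseline_siemens: float, exposed_siemens: float) -> float:
    """Percent increase in conductivity after exposure."""
    return (exposed_siemens - baseline_siemens) / baseline_siemens * 100.0

def is_target_detected(baseline: float, exposed: float, threshold_pct: float = 1000.0) -> bool:
    """Flag a detection when conductivity rises past the threshold.

    Nontarget chemicals tend to lower conductivity (raise resistivity),
    so they fall well below any positive threshold.
    """
    return conductivity_increase_pct(baseline, exposed) >= threshold_pct

# A 2,000 percent jump (21x baseline) clears the bar; a nontarget that
# increases resistivity does not.
print(is_target_detected(1e-6, 2.1e-5))  # target-like response
print(is_target_detected(1e-6, 8e-7))    # nontarget-like response
```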

Creating the polymer was a delicate balancing act but critical to the design, Swager says. As a polymer, the material needs to hold the carbon nanotubes apart. But as it disassembles, its individual monomers need to interact more weakly, letting the nanotubes regroup. “We hit this sweet spot where it only works when it’s all hooked together,” Swager says.

Resistance is readable

To build their wireless system, the researchers created an NFC tag that turns on when its electrical resistance dips below a certain threshold.

Smartphones send out short pulses of electromagnetic fields that resonate with an NFC tag at radio frequency, inducing an electric current, which relays information to the phone. But smartphones can’t resonate with tags that have a resistance higher than 1 ohm.

The researchers applied their nanotube/polymer material to the NFC tag’s antenna. When exposed to 10 parts per million of SOCl2 for five seconds, the material’s resistance dropped to the point that the smartphone could ping the tag. Basically, it’s an “on/off indicator” to determine if toxic gas is present, Swager says.
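Here’s a minimal sketch of that on/off readout, assuming (per the release) a readability threshold of roughly 1 ohm; the resistance values are invented for illustration.

```python
# Sketch of the on/off NFC readout described above: the tag only answers
# a phone's ping once the sensor material's resistance has dropped below
# the readable threshold (roughly 1 ohm, per the news release).

READ_THRESHOLD_OHMS = 1.0

def tag_is_readable(sensor_resistance_ohms: float) -> bool:
    """The phone can only resonate with (and read) a low-resistance tag."""
    return sensor_resistance_ohms < READ_THRESHOLD_OHMS

def poll_tag(resistance_log):
    """Return the index of the first reading at which gas exposure has
    lowered resistance enough for the phone to ping the tag, or None."""
    for i, reading in enumerate(resistance_log):
        if tag_is_readable(reading):
            return i
    return None

# Resistance falling over five seconds of exposure to the target gas
# (illustrative values):
print(poll_tag([120.0, 45.0, 9.0, 0.6, 0.2]))  # readable at index 3
```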

According to the researchers, such a wireless system could be used to detect leaks in Li-SOCl2 (lithium thionyl chloride) batteries, which are used in medical instruments, fire alarms, and military systems.

The next step, Swager says, is to test the sensors on live chemical agents, outside of the lab, which are more dispersed and harder to detect, especially at trace levels. In the future, there’s also hope for developing a mobile app that could make more sophisticated measurements of the signal strength of an NFC tag: Differences in the signal will mean higher or lower concentrations of a toxic gas. “But creating new cell phone apps is a little beyond us right now,” Swager says. “We’re chemists.”

Here’s a link to and a citation for the paper,

Ultratrace Detection of Toxic Chemicals: Triggered Disassembly of Supramolecular Nanotube Wrappers by Shinsuke Ishihara, Joseph M. Azzarelli, Markrete Krikorian, and Timothy M. Swager. J. Am. Chem. Soc., Article ASAP DOI: 10.1021/jacs.6b03869 Publication Date (Web): June 23, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

Here are links to other posts about Swager’s work featured here previously:

Carbon nanotubes sense spoiled food (April 23, 2015 post)

Smart suits for US soldiers—an update of sorts from the Lawrence Livermore National Laboratory (Feb. 25, 2014 post)

Come, see my etchings … they detect poison gases (Oct. 9, 2012 post)

Soldiers sniff overripe fruit (May 1, 2012 post)

Beating tactical experts in combat simulation—AI with the processing power of a Raspberry Pi

It looks like one day combat may come down to who has the best artificial intelligence (AI) if a June 27, 2016 University of Cincinnati news release (also on EurekAlert) by M. B. Reilly is to be believed (Note: Links have been removed),

Artificial intelligence (AI) developed by a University of Cincinnati doctoral graduate was recently assessed by subject-matter expert and retired United States Air Force Colonel Gene Lee — who holds extensive aerial combat experience as an instructor and Air Battle Manager with considerable fighter aircraft expertise — in a high-fidelity air combat simulator.

The artificial intelligence, dubbed ALPHA, was the victor in that simulated scenario, and according to Lee, is “the most aggressive, responsive, dynamic and credible AI I’ve seen to date.”

Details on ALPHA – a significant breakthrough in the application of what’s called genetic-fuzzy systems – are published in the most recent issue of the Journal of Defense Management, as this application is specifically designed for use with Unmanned Combat Aerial Vehicles (UCAVs) in simulated air-combat missions for research purposes.

The tools used to create ALPHA as well as the ALPHA project have been developed by Psibernetix, Inc., recently founded by UC College of Engineering and Applied Science 2015 doctoral graduate Nick Ernest, now president and CEO of the firm; as well as David Carroll, programming lead, Psibernetix, Inc.; with supporting technologies and research from Gene Lee; Kelly Cohen, UC aerospace professor; Tim Arnett, UC aerospace doctoral student; and Air Force Research Laboratory sponsors.

The news release goes on to provide an overview of ALPHA’s air-combat fighting and strategy skills,

ALPHA is currently viewed as a research tool for manned and unmanned teaming in a simulation environment. In its earliest iterations, ALPHA consistently outperformed a baseline computer program previously used by the Air Force Research Lab for research.  In other words, it defeated other AI opponents.

In fact, it was only after early iterations of ALPHA bested other computer program opponents that Lee then took to manual controls against a more mature version of ALPHA last October. Not only was Lee not able to score a kill against ALPHA after repeated attempts, he was shot out of the air every time during protracted engagements in the simulator.

Since that first human vs. ALPHA encounter in the simulator, this AI has repeatedly bested other experts as well, and is even able to win out against these human experts when its (the ALPHA-controlled) aircraft are deliberately handicapped in terms of speed, turning, missile capability and sensors.

Lee, who has been flying in simulators against AI opponents since the early 1980s, said of that first encounter against ALPHA, “I was surprised at how aware and reactive it was. It seemed to be aware of my intentions and reacting instantly to my changes in flight and my missile deployment. It knew how to defeat the shot I was taking. It moved instantly between defensive and offensive actions as needed.”

He added that with most AIs, “an experienced pilot can beat up on it (the AI) if you know what you’re doing. Sure, you might have gotten shot down once in a while by an AI program when you, as a pilot, were trying something new, but, until now, an AI opponent simply could not keep up with anything like the real pressure and pace of combat-like scenarios.”

But, now, it’s been Lee, who has trained with thousands of U.S. Air Force pilots, flown in several fighter aircraft and graduated from the U.S. Fighter Weapons School (the equivalent of earning an advanced degree in air combat tactics and strategy), as well as other pilots who have been feeling pressured by ALPHA.

And, anymore [sic], when Lee flies against ALPHA in hours-long sessions that mimic real missions, “I go home feeling washed out. I’m tired, drained and mentally exhausted. This may be artificial intelligence, but it represents a real challenge.”

New goals have been set for ALPHA according to the news release,

Explained Ernest, “ALPHA is already a deadly opponent to face in these simulated environments. The goal is to continue developing ALPHA, to push and extend its capabilities, and perform additional testing against other trained pilots. Fidelity also needs to be increased, which will come in the form of even more realistic aerodynamic and sensor models. ALPHA is fully able to accommodate these additions, and we at Psibernetix look forward to continuing development.”

In the long term, teaming artificial intelligence with U.S. air capabilities will represent a revolutionary leap. Air combat as it is performed today by human pilots is a highly dynamic application of aerospace physics, skill, art, and intuition to maneuver a fighter aircraft and missiles against adversaries, all moving at very high speeds. After all, today’s fighters close in on each other at speeds in excess of 1,500 miles per hour while flying at altitudes above 40,000 feet. Microseconds matter, and the cost for a mistake is very high.

Eventually, ALPHA aims to lessen the likelihood of mistakes since its operations already occur significantly faster than do those of other language-based consumer product programming. In fact, ALPHA can take in the entirety of sensor data, organize it, create a complete mapping of a combat scenario and make or change combat decisions for a flight of four fighter aircraft in less than a millisecond. Basically, the AI is so fast that it could consider and coordinate the best tactical plan and precise responses, within a dynamic environment, over 250 times faster than ALPHA’s human opponents could blink.

So it’s likely that future air combat, requiring reaction times that surpass human capabilities, will integrate AI wingmen – Unmanned Combat Aerial Vehicles (UCAVs) – capable of performing air combat and teamed with manned aircraft wherein an onboard battle management system would be able to process situational awareness, determine reactions, select tactics, manage weapons use and more. So, AI like ALPHA could simultaneously evade dozens of hostile missiles, take accurate shots at multiple targets, coordinate actions of squad mates, and record and learn from observations of enemy tactics and capabilities.

UC’s Cohen added, “ALPHA would be an extremely easy AI to cooperate with and have as a teammate. ALPHA could continuously determine the optimal ways to perform tasks commanded by its manned wingman, as well as provide tactical and situational advice to the rest of its flight.”

Happily, insight is provided into the technical aspects (from the news release),

It would normally be expected that an artificial intelligence with the learning and performance capabilities of ALPHA, applicable to incredibly complex problems, would require a super computer in order to operate.

However, ALPHA and its algorithms require no more than the computing power available in a low-budget PC in order to run in real time and quickly react and respond to uncertainty and random events or scenarios.

According to a lead engineer for autonomy at AFRL, “ALPHA shows incredible potential, with a combination of high performance and low computational cost that is a critical enabling capability for complex coordinated operations by teams of unmanned aircraft.”

Ernest began working with UC engineering faculty member Cohen to resolve that computing-power challenge about three years ago while a doctoral student. (Ernest also earned his UC undergraduate degree in aerospace engineering and engineering mechanics in 2011 and his UC master’s, also in aerospace engineering and engineering mechanics, in 2012.)

They tackled the problem using language-based control (vs. numeric based) and using what’s called a “Genetic Fuzzy Tree” (GFT) system, a subtype of what’s known as fuzzy logic algorithms.

States UC’s Cohen, “Genetic fuzzy systems have been shown to have high performance, and a problem with four or five inputs can be solved handily. However, boost that to a hundred inputs, and no computing system on planet Earth could currently solve the processing challenge involved – unless that challenge and all those inputs are broken down into a cascade of sub decisions.”

That’s where the Genetic Fuzzy Tree system and Cohen and Ernest’s years’ worth of work come in.

According to Ernest, “The easiest way I can describe the Genetic Fuzzy Tree system is that it’s more like how humans approach problems.  Take for example a football receiver evaluating how to adjust what he does based upon the cornerback covering him. The receiver doesn’t think to himself: ‘During this season, this cornerback covering me has had three interceptions, 12 average return yards after interceptions, two forced fumbles, a 4.35 second 40-yard dash, 73 tackles, 14 assisted tackles, only one pass interference, and five passes defended, is 28 years old, and it’s currently 12 minutes into the third quarter, and he has seen exactly 8 minutes and 25.3 seconds of playtime.’”

That receiver – rather than standing still on the line of scrimmage before the play trying to remember all of the different specific statistics and what they mean individually and combined to how he should change his performance – would just consider the cornerback as ‘really good.’

The cornerback’s historic capability wouldn’t be the only variable. Specifically, his relative height and relative speed should likely be considered as well. So, the receiver’s control decision might be as fast and simple as: ‘This cornerback is really good, a lot taller than me, but I am faster.’

At the very basic level, that’s the concept involved in terms of the distributed computing power that’s the foundation of a Genetic Fuzzy Tree system wherein, otherwise, scenarios/decision making would require too high a number of rules if done by a single controller.

Added Ernest, “Only considering the relevant variables for each sub-decision is key for us to complete complex tasks as humans. So, it makes sense to have the AI do the same thing.”

In this case, the programming involved breaking up the complex challenges and problems represented in aerial fighter deployment into many sub-decisions, thereby significantly reducing the required “space” or burden for good solutions. The branches or subdivisions of this decision-making tree consist of high-level tactics, firing, evasion and defensiveness.

That’s the “tree” part of the term “Genetic Fuzzy Tree” system.
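To make the cascade idea concrete, here’s a toy sketch in Python. The branch names (high-level tactics, firing, evasion) follow the release, but the rules themselves are invented for illustration and bear no relation to ALPHA’s actual rule base.

```python
# Toy sketch of the "tree" idea: instead of one controller weighing every
# input at once, decisions cascade through small sub-controllers, each
# seeing only the variables relevant to it. All rules are invented.

def tactics_branch(threat_level: str) -> str:
    # High-level tactics: pick a posture from a coarse linguistic input.
    return "defensive" if threat_level == "high" else "offensive"

def firing_branch(target_distance: str) -> bool:
    # Firing sub-decision: considers distance and nothing else.
    return target_distance == "close"

def evasion_branch(missile_inbound: bool) -> str:
    return "evade" if missile_inbound else "hold course"

def decide(threat_level: str, target_distance: str, missile_inbound: bool) -> dict:
    """Cascade: the posture chosen at the top gates which sub-decisions run."""
    posture = tactics_branch(threat_level)
    if posture == "offensive":
        return {"posture": posture, "fire": firing_branch(target_distance)}
    return {"posture": posture, "action": evasion_branch(missile_inbound)}

print(decide("low", "close", False))  # offensive posture, take the shot
print(decide("high", "far", True))    # defensive posture, evade
```

Each sub-controller sees only a handful of inputs, which is what keeps the rule count manageable compared with a single monolithic controller.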

Programming that’s language based, genetic and generational

Most AI programming uses numeric-based control and provides very precise parameters for operations. In other words, there’s not a lot of leeway for any improvement or contextual decision making on the part of the programming.

The AI algorithms that Ernest and his team ultimately developed are language based, with if/then scenarios and rules able to encompass hundreds to thousands of variables. This language-based control or fuzzy logic, while much less about complex mathematics, can be verified and validated.

Another benefit of this linguistic control is the ease with which expert knowledge can be imparted to the system. For instance, Lee worked with Psibernetix to provide tactical and maneuverability advice which was directly plugged into ALPHA. (That “plugging in” occurs via inputs into a fuzzy logic controller. Those inputs consist of defined terms, e.g., close vs. far in distance to a target; if/then rules related to the terms; and inputs of other rules or specifications.)
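A toy example of that linguistic control might look like this in Python; the membership breakpoints and the single if/then rule are invented for illustration.

```python
# Toy illustration of linguistic (fuzzy) control: a crisp input is mapped
# onto overlapping linguistic terms ("close"/"far") rather than exact
# numbers, and if/then rules fire on those terms. Breakpoints are invented.

def membership_close(distance_km: float) -> float:
    """Degree (0..1) to which a distance counts as 'close'."""
    if distance_km <= 2.0:
        return 1.0
    if distance_km >= 10.0:
        return 0.0
    return (10.0 - distance_km) / 8.0  # linear ramp between 2 km and 10 km

def membership_far(distance_km: float) -> float:
    return 1.0 - membership_close(distance_km)

def rule_fire_weapon(distance_km: float) -> float:
    """IF target is close THEN fire -- returns the rule's firing strength."""
    return membership_close(distance_km)

print(rule_fire_weapon(1.0))   # fully 'close'
print(rule_fire_weapon(6.0))   # partially 'close'
print(rule_fire_weapon(12.0))  # fully 'far'
```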

Finally, the ALPHA programming is generational. It can be improved from one generation to the next, from one version to the next. In fact, the current version of ALPHA is only that – the current version. Subsequent versions are expected to perform significantly better.

Again, from UC’s Cohen, “In a lot of ways, it’s no different than when air combat began in W.W. I. At first, there were a whole bunch of pilots. Those who survived to the end of the war were the aces. Only in this case, we’re talking about code.”

To reach its current performance level, ALPHA’s training has occurred on a $500 consumer-grade PC. This training process started with numerous and random versions of ALPHA. These automatically generated versions of ALPHA proved themselves against a manually tuned version of ALPHA. The successful strings of code are then “bred” with each other, favoring the stronger, or highest performance versions. In other words, only the best-performing code is used in subsequent generations. Eventually, one version of ALPHA rises to the top in terms of performance, and that’s the one that is utilized.

This is the “genetic” part of the “Genetic Fuzzy Tree” system.
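The generational scheme can be sketched as a bare-bones genetic algorithm: random variants compete, the strongest are “bred” via crossover and mutation, and only the best performers seed the next generation. The encoding and fitness function below are toy stand-ins, not Psibernetix’s.

```python
# Minimal sketch of the generational scheme described above. The "genome"
# is a toy parameter vector and the fitness function is a stand-in; only
# the select/breed/repeat structure mirrors the description.
import random

random.seed(1)

TARGET = [1.0] * 8  # toy "ideal" parameter vector the GA should approach

def fitness(genome):
    # Higher is better: negative squared distance from the toy target.
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def breed(parent_a, parent_b, mutation=0.1):
    # Crossover: each gene comes from one parent; then mutate one gene.
    child = [random.choice(pair) for pair in zip(parent_a, parent_b)]
    i = random.randrange(len(child))
    child[i] += random.uniform(-mutation, mutation)
    return child

def evolve(generations=100, pop_size=20):
    pop = [[random.uniform(-1, 2) for _ in range(8)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 4]  # only the best performers survive
        pop = elite + [breed(random.choice(elite), random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

best = evolve()
print(round(fitness(best), 3))  # close to 0 after selection pressure
```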

Said Cohen, “All of these aspects are combined, the tree cascade, the language-based programming and the generations. In terms of emulating human reasoning, I feel this is to unmanned aerial vehicles what the IBM/Deep Blue vs. Kasparov was to chess.”

Here’s a link to and a citation for the paper,

Genetic Fuzzy based Artificial Intelligence for Unmanned Combat Aerial Vehicle Control in Simulated Air Combat Missions by Nicholas Ernest, David Carroll, Corey Schumacher, Matthew Clark, Kelly Cohen, and Gene Lee. J Def Manag [Journal of Defense Management] 6:144. doi:10.4172/2167-0374.1000144. Published: March 22, 2016.

This is an open access paper.

Segue

The University of Cincinnati’s president, Santa Ono, recently accepted a job as president of the University of British Columbia (UBC), which is located in the region where I live. Nassif Ghoussoub, professor of mathematics at UBC, writes about Ono and his new appointment in a June 13, 2016 posting on his blog (Note: A link has been removed),

By the time you read this, UBC communications will already have issued the mandatory press release [the official announcement was made June 13, 2016] describing Santa Ono’s numerous qualifications for the job, including that he is a Canuck in the US, born in Vancouver, McGill PhD, a highly accomplished medical researcher, who is the President of the University of Cincinnati.

So, I shall focus here on what UBC communications may not be enclined [sic] to tell you, yet may be quite consequential for UBC’s future direction. After all, life experiences, gender, race, class, and character are what shape leadership.

President Ono seems to have had battles with mental illness, and to have been courageous enough to deal with it and to publicly disclose it – as recently as May 24 [2016] – so as to destigmatize struggles that many people go through. It is interesting to note the two events that led the president to have suicidal thoughts: …

The post is well worth reading if you have any interest in Ono, UBC, and/or insight into some of the struggles even some of the most accomplished academics can encounter.

Nanotechnology Molecular Tagging for sniffing out explosives

A nifty technology for sniffing out explosives is described in a June 22, 2016 news item in Government Security News magazine. I do think they might have eased up on the Egypt Air disaster reference and the implication that it might have been avoided with the use of this technology,

The crash of Egypt Air Flight 804 recently again raised concerns over whether a vulnerability in pre-flight security led to another deadly terrorist attack. Officials haven’t found a cause for the crash yet, but news reports indicate that officials believe either a bomb or a fire brought the plane down [link included from press release].

Regardless of the cause, the Chief Executive Officer of British-based Ancon Technologies said that the incident shows the compelling need for more versatile and affordable explosive detection technology.

“There are still too many vulnerabilities in transportation systems around the world,” said CEO Dr. Robert Muir. “That’s why our focus has been on developing explosive detection technology that is highly efficient, easily deployable and economically priced.”

A June 21, 2016 Ancon Technologies press release on PR Web, which originated the news item, describes the technology in a little more detail,

Using nanotechnology to scan sensitive vapour readings, Ancon Technologies has developed unique security devices with exceptional sensitivity to detect explosive chemicals and materials. Called Nanotechnology Molecular Tagging, the technology is used to look for specific molecular markers that are emitted from the chemicals used in explosive compounds. An NMT device can then be programmed to look for these compounds and gauge concentrations.

“The result is unprecedented sensitivity for a device that is portable and versatile,” Dr. Muir said. “The technology is also highly selective, meaning it can distinguish the molecules it is testing for against the backdrop of other chemicals and readings in the air.”

If terrorism is responsible for the crash of the Egypt Air flight en route to Cairo from Paris’ Charles de Gaulle Airport, the incident further shows the need for heightened screening processes, Muir said. Concerns about air travel’s vulnerabilities to terrorism were further raised in October when a Russian plane flying out of Egypt crashed in what several officials believe was a terrorist bombing.

Both cases show the need for improved security measures in airports around the world, especially those related to early explosive detection, Muir said. CNN reported that the Egypt Air crash would likely generate even more attention to airport security while Egypt has already been investing in new security measures following the October attack.

“An NMT device can bring laboratory-level sensitivity to the airport screening procedure, adding another level of safety in places where it’s needed most,” Muir said. “By being able to detect a compound at concentrations as small as a single molecule, NMT can pinpoint a threat and provide security teams with the early warning they need.”

The NMT device’s sensitivity and accuracy can also help balance another concern with airport security: long waits. Already, the Transportation Security Administration is coming under fire this summer for extended airport security screening lines, reports USA Today.

“An NMT device can produce results from test samples in minutes, meaning screenings can proceed at a reasonable pace without jeopardizing security,” Muir said.

Ancon Technologies has working arrangements with military and security agencies in both the United Kingdom and the United States, Muir said, following a recent round of investments. The company is headquartered in Canterbury, Kent and has an office in the U.S. in Bloomington, Minnesota.

So this is a sensing device and I believe this particular type can also be described as an artificial nose.

US Army offers course on nanotechnology

As you might expect, the US Army course on nanotechnology stresses the importance of nanotechnology for the military, according to a June 16, 2016 news item on Nanowerk,

If there is one lesson to glean from Picatinny Arsenal’s new course in nanomaterials, it’s this: never underestimate the power of small.

Nanotechnology is the study of manipulating matter on an atomic, molecular, or supramolecular scale. The end result can be found in our everyday products, such as stained glass [This is a reference to the red glass found in churches from the Middle Ages. More about this later in the posting], sunscreen, cellphones, and pharmaceutical products.

Other examples are in U.S. Army items such as vehicle armor, Soldier uniforms, power sources, and weaponry. All living things also can be considered united forms of nanotechnology produced by the forces of nature.

“People tend to think that nanotechnology is all about these little robots roaming around, fixing the environment or repairing damage to your body, and for many reasons that’s just unrealistic,” said Rajen Patel, a senior engineer within the Energetics and Warheads Manufacturing Technology Division, or EWMTD.

The division is part of the U.S. Army Armament Research, Development and Engineering Center or ARDEC.

A June 15, 2016 ARDEC news release by Cassandra Mainiero, which originated the news item, expands on the theme,

“For me, nanotechnology means getting materials to have these properties that you wouldn’t expect them to have.” [Patel]

The subject can be separated into multiple types (nanomedicine, nanomachines, nanoelectronics, nanocomposites, nanophotonics and more), which can benefit areas, such as communications, medicine, environment remediation, and manufacturing.

Nanomaterials are defined as materials that have at least one dimension in the 1-100 nm range (there are 25,400,000 nanometers in one inch). To provide some size perspective: comparing a nanometer to a meter is like comparing a soccer ball to the Earth.
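As an aside, the scale comparisons in the news release can be sanity-checked with a few lines of arithmetic. The soccer-ball and Earth diameters below are my own approximate figures, not from the release:

```python
# Sanity-check the scale comparisons in the ARDEC news release.
# Assumed figures (not from the release): soccer ball ~0.22 m across,
# Earth ~12,742 km in diameter.
NM_PER_M = 1e9
INCH_IN_M = 0.0254

nm_per_inch = INCH_IN_M * NM_PER_M
print(f"nanometers per inch: {nm_per_inch:,.0f}")  # 25,400,000

nm_to_m_ratio = 1 / NM_PER_M             # 1e-09
ball_to_earth = 0.22 / 12_742_000        # ~1.7e-08
print(f"nm : m       = {nm_to_m_ratio:.1e}")
print(f"ball : Earth = {ball_to_earth:.1e}")
# The analogy is rough: the two ratios differ by a factor of roughly 17,
# but they land in the same neighborhood on a logarithmic scale.
```

So the quoted 25,400,000 nanometers-per-inch figure checks out exactly, and the soccer-ball analogy is, as such analogies go, good to within about an order of magnitude.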

Picatinny’s nanomaterials class focuses on nanomaterials’ distinguishing qualities, such as their optical, electronic, thermal and mechanical properties–and teaches how manipulating them in a weapon can benefit the warfighter [soldier].

While you could learn similar information in a college course, Patel argues that Picatinny’s nanomaterials class is nothing like a university class.

This is because Picatinny’s nanomaterials class focuses on applied, rather than theoretical nanotechnology, using the arsenal as its main source of examples.

“We talk about things like what kind of properties you get, how to make materials, places you might expect to see nanotechnology within the Army,” explained Patel.

The class is taught at the Armament University. Each class lasts three days. The last one was held in February.

Each class includes approximately 25 students and provides an overview of nanotechnology, covering topics, such as its history, early pioneers in the field, and everyday items that rely on nanotechnology.

Additionally, the course covers how those same concepts apply at Picatinny (for electronics, sensors, energetics, robotics, insensitive munitions, and more) and the major difficulties with experimenting and manufacturing nanotechnology.

Moreover, the class involves guest talks from Picatinny engineers and scientists, such as Dan Kaplan, Christopher Haines, and Venkataraman Swaminathan as well as tours of Picatinny facilities like the Nanotechnology Center and the Explosives Research Laboratory.

It also includes lectures from guest speakers, such as Gordon Thomas from the New Jersey Institute of Technology (NJIT), who spoke about nanomaterials and diabetes research.

A CLASSROOM COINCIDENCE

Relatively new, the nanomaterials class launched in January 2015. It was pioneered by Patel after he attended an instructional course on teaching at the Armament University, where he met Erin Williams, a technical training analyst at the university.

“At the Armament University, we’re always trying to think of, ‘What new areas of interest should we offer to help our workforce? What forward reaching technologies are needed?’ One topic that came up was nanotechnology,” said Williams about how the nanomaterials class originated.

“I started to do research on the subject, how it might be geared toward Picatinny, and trying to think of ways to organize the class. Then, I enrolled in the instructional course on teaching, where I just so happen to be sitting across from Dr. Rajen Patel, who not only knew about nanotechnology, but taught a few seminars at NJIT, where he did his doctorate,” explained Williams. “I couldn’t believe the coincidence! So, I asked him if he would be interested in teaching a class and he said ‘Yes!'”

“After the first [nanomaterials] class, one of the students came up to me and said ‘This was the best course I’ve ever been to on this arsenal,'” added Williams. “…This is really how Picatinny shines as a team: when you meet people and utilize your knowledge to benefit the organization.”

The success of the first nanomaterials course encouraged Patel to expand his class into specialty fields, designing a two-day nanoenergetics class taught by himself and Victor Stepanov, a senior scientist at EWMTD.

Stepanov works with nano-organic energetics (RDX, HMX, CL-20) and inorganic materials (metals). He is responsible for creating the first nano-organic energetic, known as nano-RDX. He is involved in research aimed at understanding the various properties of nanoenergetics, including sensitivity, performance, and mechanical characteristics. He and Patel teach the nanoenergetics class, which was first offered last fall and, due to high demand, is expected to be offered annually. The next one will be held in September.

“We always ask for everyone’s feedback. And, after our first class, everyone said ‘[Picatinny] is the home of the Army’s lethality–why did we not talk about nanoenergetics?’ So, in response to the students’ feedback, we implemented that nanoenergetics course,” said Patel. “Besides, in the long run, you’ll probably replace most energetics with nano-energetics, as they have far too many advantages.”

TECHNOLOGY EVOLUTION

Since all living things are a form of nanotechnology manipulated by the forces of nature, the history of nanotechnology dates back to the emergence of life. However, a more concrete example can be traced back to ancient times, when nanomaterials were manipulated to create gold and silver art such as the Lycurgus Cup, a 4th century Roman glass [I’ve added more about the Lycurgus Cup later in this post].

According to Stepanov, ARDEC’s interest in nanotechnology gained significant momentum approximately 20 years ago. The initiative at ARDEC was directly tied to the emergence of advanced technologies needed for production and characterization of nanomaterials, and was concurrent with adoption of nanotechnologies in other fields such as pharmaceuticals.

In 2010, an article in The Picatinny Voice titled “Tiny particles, big impact: Nanotechnology to help warfighters” discussed Picatinny’s ongoing research on nanopowders.

It noted that Picatinny’s Nanotechnology Lab is the largest facility in North America to produce nanopowders and nanomaterials, which are used to create nanoexplosives.

It also mentioned how using nanomaterials helped to develop lightweight composites as an alternative to traditional steel.

The more recent heightened study is due to the evolution of technology, which has allowed engineers and scientists to be more productive and made nanotechnology more ubiquitous throughout the military.

“Not too long ago making milligram quantities of nanoexplosives was challenging. Now, we have technologies that allow us to make pounds of nanoexplosives per hour at low cost,” said Stepanov.

Pilot scale production of nanoexplosives is currently being performed at ARDEC, led by Ashok Surapaneni of the Explosives Development Branch.

The broad interest in developing nanoenergetics such as nano-RDX and nano-HMX stems from their remarkably low initiation sensitivity.

These materials can thus be crucial in the development of safer next generation munitions that are much less vulnerable to accidental initiation.

SMALL CHANGES, BIG RESULTS

Working with nanotechnology can have various payoffs, such as enhancing the performance of military products, said Patel. For instance, by manipulating nanomaterials, an engineer could make a weapon stronger or lighter, or increase its reactivity or durability.

“Generally, if you make something more safe, you make it less powerful,” said Stepanov. “But, with nanomaterials, you can make a product more safe and, in many cases, more powerful.”

There are two basic approaches to studying nanomaterials: bottom-up (building a large object atom by atom) and top-down (deconstructing a larger material). Both approaches have been successfully employed in the development of nanoenergetics at ARDEC.

One of the challenges with manufacturing nanomaterials can be coping with shockwaves.

A shockwave initiates an explosive as it travels through a weapon’s main fill or the booster. When a shockwave travels through an energetic charge, it can hit small regions of defects, or voids, which heat up quickly and build pressure until the explosive reaches detonation. By using nanoenergetics, one could adjust the size and quantity of the defects and voids, so that the pressure isn’t as strong and ultimately prevent accidental detonation.

Nanomaterials also are difficult to process because they tend to agglomerate (stick together) and are also prone to Ostwald Ripening, or spontaneous growth of the crystals, which is especially pronounced at the nano-scale. This effect is commonly observed with ice cream, where ice can re-crystallize, resulting in a gritty texture.

“It’s a major production challenge because if you want to process nanomaterials–if you want to coat it with some polymer for explosives–any kind of medium that can dissolve these types of materials can promote ripening and you can end up with a product which no longer has the nanomaterial that you began with,” explained Stepanov.

However, nanotechnology research continues to grow at Picatinny as the research advances in the U.S. Army.

This ongoing development and future applicability encourages Patel and Stepanov to teach the nanomaterials and nanoenergetics course at Picatinny.

“I’m interested in making things better for the warfighter,” said Patel. “Nano-materials give you so many opportunities to do so. Also, as a scientist, it’s just a fascinating realm because you always get these little interesting surprises.

“You can know all the material science and equations, but then you get in the nano-world, and there’s something like a wrinkle–something you wouldn’t expect,” Patel added.

“It satisfies three deep needs: getting the warfighter technology, producing something of value, and it’s fun. You always see something new.”

Medieval church windows and the Lycurgus Cup

The shade of red in medieval church window glass is said to have been achieved by the use of gold nanoparticles. There is a source which claims the colour is due to copper rather than gold. I have not had the time to pursue the controversy, such as it is, but do have a November 1, 2010 posting about stained glass and medieval churches which may prove of interest.

As for the Lycurgus Cup, it’s from the 4th century (CE or AD) and is an outstanding example of Roman art and craft. The glass in the cup is dichroic (it looks green or red depending on how the light catches it). The effect was achieved with the presence of gold and silver nanoparticles in the glass. I have a more extensive description and pictures in a Sept. 21, 2010 posting.

Final note

There is an army initiative involving an educational institution, the Massachusetts Institute of Technology (MIT). The initiative is the MIT Institute for Soldier Nanotechnologies.

English ivy’s stickiness may be useful

Researchers have discovered the secret to English ivy’s stickiness and they hope that secret will lead to improved wound healing and more according to a May 24, 2016 news item on Nanowerk,

English ivy’s natural glue might hold the key to new approaches to wound healing, stronger armor for the military and maybe even cosmetics with better staying power.

New research from The Ohio State University illuminates the tiny particles responsible for ivy’s ability to latch on so tight to trees and buildings that it can withstand hurricanes and tornadoes. (Not to mention infuriate those trying to rid their homes of the vigorous green climber.)

The researchers pinpointed the spherical particles within English ivy’s adhesive and identified the primary protein within them.

A May 23, 2016 Ohio State University news release (also on EurekAlert) by Misti Crane, which originated the news item, expands on the theme,

“By understanding the proteins that give rise to ivy’s strength, we can give rise to approaches to engineer new bio-inspired adhesives for medical and industry products,” said Mingjun Zhang, the biomedical engineering professor who led the work.

“It’s a milestone to resolve this mystery. We now know the secret of this adhesive and the underlying molecular mechanism,” said Zhang, who focuses his work on finding answers in nature for vexing problems in medicine.

Like many scientists before him, Charles Darwin among them, Zhang found himself captivated by English ivy – the physics of it, the sheer strength of it. The study appears today in the journal Proceedings of the National Academy of Sciences.

“Ivy has these very tiny hairy structures that have a wonderful interaction with the surface as the plant climbs. One day I was looking at the ivy in the backyard and I was amazed at the force,” Zhang said.

“It’s very difficult to tear down, even in a natural disaster. It’s one of the strongest adhesive forces in nature.”

When he and his team took a look at the ivy’s glue with a powerful atomic-force microscope, they were able to identify a previously unknown element in its adhesive.

The tiny particles inside the glue on their laboratory slides turned out to be primarily made up of arabinogalactan proteins. And when the scientists investigated further, they discovered that the driving force behind the curing of the glue was a calcium-mediated interaction between the proteins and pectin in the gelatinous liquid that oozes from ivy as it climbs.

Zhang said particles rich in those proteins have exceptional adhesive abilities – abilities that could be used to the advantage of many, from biomedical engineers to paint makers.

Zhang, a member of Ohio State’s Davis Heart and Lung Research Institute, is particularly interested in bioadhesives that could aid in wound healing after injury or surgeries. Others, notably the U.S. military, are interested in surface-coating applications for purposes that include strengthening armor systems, he said.

Many plants are excellent climbers, but scientists have had limited information about the adhesives that enable those plants to affix themselves to walls, fences and just about anything in their way, he said.

“When climbing, ivy secretes these tiny nanoparticles which make initial surface contact. Due to their high uniformity and low viscosity, they can attach to large areas on various surfaces,” Zhang said.

After the water evaporates, a chemical bond forms, Zhang said.

“It’s really a nature-made amazing mechanism for high-strength adhesion,” he said.

The glue doesn’t just sit on the surface of the object that the ivy is clinging to, he said. It finds its way into openings invisible to the naked eye, further solidifying its bond.

To confirm what they found, Zhang and his collaborators used the nanoparticles to reconstruct a simple glue that mimics ivy adhesive. Advanced bioadhesives based on this research will take more time and research.

In addition to its strength, ivy adhesive has other properties that make it appealing to scientists looking for answers to engineering quandaries, Zhang said.

“Under moisture or high or low temperatures, it’s not easily damaged,” he said. “Ivy is very resistant to various environmental conditions, which makes the adhesive a particularly interesting candidate for the development of armor coatings.”

Ivy also is considered a pest because it can be destructive to buildings and bridges. Knowing what’s at the heart of its sticking ability could help scientists unearth approaches to resist the plant, Zhang said.

Zhang and his work have been featured here before in a Jan. 7, 2013 posting about flesh-eating fungus and in a July 22, 2010 posting about English ivy and sunscreens.

Here’s a link to and a citation for Zhang’s latest paper,

Nanospherical arabinogalactan proteins are a key component of the high-strength adhesive secreted by English ivy by Yujian Huang, Yongzhong Wang, Li Tan, Leming Sun, Jennifer Petrosino, Mei-Zhen Cui, Feng Hao, and Mingjun Zhang. PNAS [Proceedings of the National Academy of Sciences] 2016 doi: 10.1073/pnas.1600406113 Published ahead of print May 23, 2016.

This paper is behind a paywall.

Sensing fuel leaks and fuel-based explosives with a nanofibril composite

A March 28, 2016 news item on Nanowerk highlights some research from the University of Utah (US),

Alkane fuel is a key ingredient in combustible material such as gasoline, airplane fuel, oil — even a homemade bomb. Yet it’s difficult to detect and there are no portable scanners available that can sniff out the odorless and colorless vapor.

But University of Utah engineers have developed a new type of fiber material for a handheld scanner that can detect small traces of alkane fuel vapor, a valuable advancement that could be an early-warning signal for leaks in an oil pipeline, an airliner, or for locating a terrorist’s explosive.

A March 25, 2016 University of Utah news release, which originated the news item, provides a little more detail,

Currently, there are no small, portable chemical sensors to detect alkane fuel vapor because it is not chemically reactive. The conventional way to detect it is with a large oven-sized instrument in a lab.

“It’s not mobile and very heavy,” Zang [Ling Zang, University of Utah materials science and engineering professor] says of the larger instrument. “There’s no way it can be used in the field. Imagine trying to detect the leak from a gas valve or on the pipelines. You ought to have something portable.”

So Zang’s team developed a type of fiber composite that involves two nanofibers transferring electrons from one to the other.

That kind of interaction would then signal the detector that the alkane vapor is present. Vaporsens, a University of Utah spinoff company, has designed a prototype of the handheld detector with an array of 16 sensor materials that will be able to identify a broad range of chemicals including explosives.  This new composite material will be incorporated into the sensor array to include the detection of alkanes. Vaporsens plans to introduce the device on the market in about a year and a half, says Zang, who is the company’s chief science officer.

Such a small sensor device that can detect alkane vapor will benefit three main categories:

  • Oil pipelines. If leaks from pipelines are not detected early enough, the resulting leaked oil could contaminate the local environment and water sources. Typically, only large leaks in pipelines can be detected if there is a drop in pressure. Zang’s portable sensor — when placed along the pipeline — could detect much smaller leaks before they become bigger.
  • Airplane fuel tanks. Fuel for aircraft is stored in removable “bladders” made of flexible fabric. The only way a leak can be detected is by seeing the dyed fuel seeping from the plane and then removing the bladder to inspect it. Zang’s sensors could be placed around the bladder to warn a pilot if a leak is occurring in real time and where it is located.
  • Security. The scanner will be designed to locate the presence of explosives such as bombs at airports or in other buildings. Many explosives, such as the bomb used in the Oklahoma City bombing in 1995, use fuel oils like diesel as one of their major components. These fuel oils are forms of alkane.

The research was funded by the Department of Homeland Security, National Science Foundation and NASA. The lead author of the paper is University of Utah materials science and engineering doctoral student Chen Wang, and [Benjamin] Bunes is the co-author.

Here’s a link to and a citation for the paper,

Interfacial Donor–Acceptor Nanofibril Composites for Selective Alkane Vapor Detection by Chen Wang, Benjamin R. Bunes, Miao Xu, Na Wu, Xiaomei Yang, Dustin E. Gross, and Ling Zang. ACS Sens DOI: 10.1021/acssensors.6b00018 Publication Date (Web): March 09, 2016

Copyright © 2016 American Chemical Society

This paper is behind a paywall.

Getting back to incandescent light (recycling the military way)

MIT (Massachusetts Institute of Technology) issued two news releases about this research into reclaiming incandescent light or as they call it “recycling light.” First off, there’s the Jan. 11, 2016 MIT Institute of Soldier Nanotechnologies news release by Paola Rebusco on EurekAlert,

Humanity started recycling relatively early in its evolution: there is evidence that trash recycling was taking place as early as 500 BC. What about light recycling? Consider light bulbs: more than one hundred and thirty years ago Thomas Edison patented the first commercially viable incandescent light bulb, so that “none but the extravagant” would ever “burn tallow candles”, paving the way for more than a century of incandescent lighting. In fact, the emergence of electric lighting was the main motivating factor for the deployment of electricity into every home in the world. The incandescent bulb is an example of a high temperature thermal emitter. It is very useful, but only a small fraction of the emitted light (and therefore energy) is used: most of the light is emitted in the infrared, invisible to the human eye, and in this context wasted.

Now, in a study published in Nature Nanotechnology on January 11th 2016 (online), a team of MIT researchers describes another way to recycle light emitted at unwanted infrared wavelengths while optimizing the emission at useful visible wavelengths. …

“For a thermal emitter at moderate temperatures one usually nano-patterns its surface to alter the emission,” says Ilic [postdoc Ognjen Ilic], the lead author of the study. “At high temperatures” – a light bulb filament reaches 3000K! – “such nanostructures deteriorate and it is impossible to alter the emission spectrum by having a nanostructure directly on the surface of the emitter.” The team solved the problem by surrounding the hot object with special nanophotonic structures that spectrally filter the emitted light, meaning that they let the light reflect or pass through based on its color (i.e. its wavelength). Because the filters are not in direct physical contact with the emitter, temperatures can be very high.

To showcase this idea, the team picked one of the highest temperature thermal emitters available – an incandescent light bulb. The authors designed nanofilters to recycle the infrared light, while allowing the visible light to go through. “The key advance was to design a photonic structure that transmits visible light and reflects infrared light for a very wide range of angles,” explains Ilic. “Conventional photonic filters usually operate for a single incidence angle. The challenge for us was to extend the desired optical properties across all directions,” a feat the authors achieved using special numerical optimization techniques.

However, for this scheme to work, the authors had to redesign the incandescent filament from scratch. “In a regular light bulb, the filament is a long and curly piece of tungsten wire. Here, the filament is laser-machined out of a flat sheet of tungsten: it is completely planar,” says Bermel [professor Peter Bermel now at Purdue University]. A planar filament has a large area, and is therefore very efficient in re-absorbing the light that was reflected by the filter. In describing how the new device differs from previously suggested concepts, Soljačić [professor Marin Soljačić], the project lead, emphasizes that “it is the combination of the exceptional properties of the filter and the shape of the filament that enabled substantial recycling of unwanted radiated light.”

In the new-concept light bulb prototype built by the authors, the efficiency approaches that of some fluorescent and LED bulbs. Nonetheless, the theoretical model predicts plenty of room for improvement. “This experimental device is a proof-of-concept, at the low end of performance that could be ultimately achieved by this approach,” argues Celanovic [principal research scientist Ivan Celanovic]. There are other advantages of this approach: “An important feature is that our demonstrated device achieves near-ideal rendering of colors,” notes Ilic, referring to the requirement of light sources to faithfully reproduce surrounding colors. That is precisely the reason why incandescent lights remained dominant for so long: their warm light has remained preferable to drab fluorescent lighting for decades.

Some practical questions need to be addressed before this technology can be widely adopted. “We will work closely with our mechanical engineering colleagues at MIT to try to tackle the issues of thermal stability and long-lifetime,” says Soljačić. The authors are particularly excited about the potential for producing these devices cheaply. “The materials we need are abundant and inexpensive,” Joannopoulos [professor John Joannopoulos] notes, “and the filters themselves–consisting of stacks of commonly deposited materials–are amenable to large-scale deposition.”

Chen [professor Gang Chen] comments further: “The lighting potential of this technology is exciting, but the same approach could also be used to improve the performance of energy conversion schemes such as thermo-photovoltaics.” In a thermo-photovoltaic device, external heat causes the material to glow, emitting light that is converted into an electric current by an absorbing photovoltaic element.

The last point captures the main motivation behind the work. “Light radiated from a hot object can be quite useful, whether that object is an incandescent filament or the Sun,” Ilic says. At its core, this work is about recycling thermal light for a specific application; “a 3000-degree filament is one of the hottest and the most challenging sources to work with,” Ilic continues. “It’s also what makes it a crucial test of our approach.”

There are a few more details in the 2nd Jan. 11, 2016 MIT news release on EurekAlert,

Light recycling

The key is to create a two-stage process, the researchers report. The first stage involves a conventional heated metal filament, with all its attendant losses. But instead of allowing the waste heat to dissipate in the form of infrared radiation, secondary structures surrounding the filament capture this radiation and reflect it back to the filament to be re-absorbed and re-emitted as visible light. These structures, a form of photonic crystal, are made of Earth-abundant elements and can be made using conventional material-deposition technology.

That second step makes a dramatic difference in how efficiently the system converts energy into light. The efficiency of conventional incandescent lights is between 2 and 3 percent, while that of fluorescents (including CFLs) is currently between 7 and 13 percent, and that of LEDs between 5 and 13 percent. In contrast, the new two-stage incandescents could reach efficiencies as high as 40 percent, the team says.

The first proof-of-concept units made by the team do not yet reach that level, achieving about 6.6 percent efficiency. But even that preliminary result matches the efficiency of some of today’s CFLs and LEDs, they point out. And it is already a threefold improvement over the efficiency of today’s incandescents.
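Lining up the quoted efficiency figures makes the “threefold improvement” claim easy to verify. The numbers below are the ones in the MIT release; the layout is simply my own framing of them:

```python
# Luminous efficiencies quoted in the MIT news release (percent).
efficiency_pct = {
    "conventional incandescent": (2.0, 3.0),
    "fluorescent / CFL": (7.0, 13.0),
    "LED": (5.0, 13.0),
}
prototype = 6.6          # two-stage proof-of-concept unit
theoretical_max = 40.0   # predicted ceiling for the approach

low, high = efficiency_pct["conventional incandescent"]
improvement = prototype / low  # 6.6 / 2.0 = 3.3, i.e. roughly threefold
print(f"prototype vs. conventional incandescent: {improvement:.1f}x")

headroom = theoretical_max / prototype  # ~6.1x still untapped, per the model
print(f"remaining theoretical headroom: {headroom:.1f}x")
```

In other words, even against the most generous baseline (2 percent), the 6.6 percent prototype is a bit more than a threefold gain, with the theoretical 40 percent ceiling leaving roughly another sixfold on the table.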

The team refers to their approach as “light recycling,” says Ilic, since their material takes in the unwanted, useless wavelengths of energy and converts them into the visible light wavelengths that are desired. “It recycles the energy that would otherwise be wasted,” says Soljačić.

Bulbs and beyond

One key to their success was designing a photonic crystal that works for a very wide range of wavelengths and angles. The photonic crystal itself is made as a stack of thin layers, deposited on a substrate. “When you put together layers, with the right thicknesses and sequence,” Ilic explains, you can get very efficient tuning of how the material interacts with light. In their system, the desired visible wavelengths pass right through the material and on out of the bulb, but the infrared wavelengths get reflected as if from a mirror. They then travel back to the filament, adding more heat that then gets converted to more light. Since only the visible ever gets out, the heat just keeps bouncing back in toward the filament until it finally ends up as visible light.

I appreciate the care both MIT news release writers took with their phrasing. From Rebusco: “Thomas Edison patented the first commercially viable incandescent light bulb.” And from the unidentified writer of the 2nd MIT news release: “Incandescent bulbs, commercially developed by Thomas Edison (and still used by cartoonists as the symbol of inventive insight) … .” Edison did not invent the light bulb. BTW, the emphases are mine.

For interested parties, here’s a link to and a citation for the paper,

Tailoring high-temperature radiation and the resurrection of the incandescent source by Ognjen Ilic, Peter Bermel, Gang Chen, John D. Joannopoulos, Ivan Celanovic, & Marin Soljačić. Nature Nanotechnology (2016) doi:10.1038/nnano.2015.309 Published online 11 January 2016

This paper is behind a paywall.

Nanotechnology and cybersecurity risks

Gregory Carpenter has written a gripping (albeit somewhat exaggerated) piece for Signal, a publication of the Armed Forces Communications and Electronics Association (AFCEA), about cybersecurity issues and nanomedicine endeavours. From Carpenter’s Jan. 1, 2016 article titled, When Lifesaving Technology Can Kill; The Cyber Edge,

The exciting advent of nanotechnology that has inspired disruptive and lifesaving medical advances is plagued by cybersecurity issues that could result in the deaths of people that these very same breakthroughs seek to heal. Unfortunately, nanorobotic technology has suffered from the same security oversights that afflict most other research and development programs.

Nanorobots, or small machines [or nanobots], are vulnerable to exploitation just like other devices.

At the moment, the issue of cybersecurity exploitation is secondary to making nanobots, or nanorobots, dependably functional. As far as I’m aware, there is no such nanobot. Even nanoparticles meant to function as packages for drug delivery have not been perfected (see one of the controversies with nanomedicine drug delivery described in my Nov. 26, 2015 posting).

That said, Carpenter’s point about cybersecurity is well taken since security features are often overlooked in new technology. For example, automated banking machines (ABMs) had woefully poor (inadequate, almost nonexistent) security when they were first introduced.

Carpenter outlines some of the problems that could occur, assuming some of the latest research could be reliably brought to market,

The U.S. military has joined the fray of nanorobotic experimentation, embarking on revolutionary research that could lead to a range of discoveries, from unraveling the secrets of how brains function to figuring out how to permanently purge bad memories. Academia is making amazing advances as well. Harnessing progress by Harvard scientists to move nanorobots within humans, researchers at the University of Montreal, Polytechnique Montreal and Centre Hospitalier Universitaire Sainte-Justine are using mobile nanoparticles inside the human brain to open the blood-brain barrier, which protects the brain from toxins found in the circulatory system.

A different type of technology presents a risk similar to the nanoparticles scenario. A DARPA-funded program known as Restoring Active Memory (RAM) addresses post-traumatic stress disorder, attempting to overcome memory deficits by developing neuroprosthetics that bridge gaps in an injured brain. In short, scientists can wipe out a traumatic memory, and they hope to insert a new one—one the person has never actually experienced. Someone could relish the memory of a stroll along the French Riviera rather than a terrible firefight, even if he or she has never visited Europe.

As an individual receives a disruptive memory, a cyber criminal could manage to hack the controls. Breaches of the brain could become a reality, putting humans at risk of becoming zombie hosts [emphasis mine] for future virus deployments. …

At this point, the ‘zombie’ scenario Carpenter suggests seems a bit over-the-top but it does hearken back to the roots of the zombie myth, where the undead aren’t mindlessly searching for brains but are humans whose wills have been overcome. Mike Mariani in an Oct. 28, 2015 article for The Atlantic has presented a thought-provoking history of zombies,

… the zombie myth is far older and more rooted in history than the blinkered arc of American pop culture suggests. It first appeared in Haiti in the 17th and 18th centuries, when the country was known as Saint-Domingue and ruled by France, which hauled in African slaves to work on sugar plantations. Slavery in Saint-Domingue under the French was extremely brutal: Half of the slaves brought in from Africa were worked to death within a few years, which only led to the capture and import of more. In the hundreds of years since, the zombie myth has been widely appropriated by American pop culture in a way that whitewashes its origins—and turns the undead into a platform for escapist fantasy.

The original brains-eating fiend was a slave not to the flesh of others but to his own. The zombie archetype, as it appeared in Haiti and mirrored the inhumanity that existed there from 1625 to around 1800, was a projection of the African slaves’ relentless misery and subjugation. Haitian slaves believed that dying would release them back to lan guinée, literally Guinea, or Africa in general, a kind of afterlife where they could be free. Though suicide was common among slaves, those who took their own lives wouldn’t be allowed to return to lan guinée. Instead, they’d be condemned to skulk the Hispaniola plantations for eternity, an undead slave at once denied their own bodies and yet trapped inside them—a soulless zombie.

I recommend reading Mariani’s article although I do have one nit to pick. I can’t find a reference to brain-eating zombies until George Romero’s introduction of the concept in his movies. This Zombie Wikipedia entry seems to be in agreement with my understanding (if I’m wrong, please do let me know and, if possible, provide a link to the corrective text).

Getting back to Carpenter and cybersecurity with regard to nanomedicine, while his scenarios may seem a trifle extreme, it’s precisely the kind of thinking you need when attempting to anticipate problems. I do wish he’d made clear that the technology still has a ways to go.

DARPA (US Defense Advanced Research Projects Agency) ‘Atoms to Product’ program launched

It took over a year after announcing the ‘Atoms to Product’ program in 2014 for DARPA (US Defense Advanced Research Projects Agency) to select 10 proponents for three projects. Before moving on to the latest announcement, here’s a description of the ‘Atoms to Product’ program from its Aug. 27, 2014 announcement on Nanowerk,

Many common materials exhibit different and potentially useful characteristics when fabricated at extremely small scales—that is, at dimensions near the size of atoms, or a few ten-billionths of a meter. These “atomic scale” or “nanoscale” properties include quantized electrical characteristics, glueless adhesion, rapid temperature changes, and tunable light absorption and scattering that, if available in human-scale products and systems, could offer potentially revolutionary defense and commercial capabilities. Two as-yet insurmountable technical challenges, however, stand in the way: Lack of knowledge of how to retain nanoscale properties in materials at larger scales, and lack of assembly capabilities for items between nanoscale and 100 microns—slightly wider than a human hair.

DARPA has created the Atoms to Product (A2P) program to help overcome these challenges. The program seeks to develop enhanced technologies for assembling atomic-scale pieces. It also seeks to integrate these components into materials and systems from nanoscale up to product scale in ways that preserve and exploit distinctive nanoscale properties.


A Dec. 29, 2015 news item on Nanowerk features the latest about the project,

DARPA recently selected 10 performers to tackle this challenge: Zyvex Labs, Richardson, Texas; SRI, Menlo Park, California; Boston University, Boston, Massachusetts; University of Notre Dame, South Bend, Indiana; HRL Laboratories, Malibu, California; PARC, Palo Alto, California; Embody, Norfolk, Virginia; Voxtel, Beaverton, Oregon; Harvard University, Cambridge, Massachusetts; and Draper Laboratory, Cambridge, Massachusetts.

A Dec. 29, 2015 DARPA news release, which originated the news item, offers more information and an image illustrating the type of advances already made by one of the successful proponents,

DARPA recently launched its Atoms to Product (A2P) program, with the goal of developing technologies and processes to assemble nanometer-scale pieces—whose dimensions are near the size of atoms—into systems, components, or materials that are at least millimeter-scale in size. At the heart of that goal was a frustrating reality: Many common materials, when fabricated at nanometer-scale, exhibit unique and attractive “atomic-scale” behaviors including quantized current-voltage behavior, dramatically lower melting points and significantly higher specific heats—but they tend to lose these potentially beneficial traits when they are manufactured at larger “product-scale” dimensions, typically on the order of a few centimeters, for integration into devices and systems.

“The ability to assemble atomic-scale pieces into practical components and products is the key to unlocking the full potential of micromachines,” said John Main, DARPA program manager. “The DARPA Atoms to Product Program aims to bring the benefits of microelectronic-style miniaturization to systems and products that combine mechanical, electrical, and chemical processes.”

The program calls for closing the assembly gap in two steps: From atoms to microns and from microns to millimeters. Performers are tasked with addressing one or both of these steps and have been assigned to one of three working groups, each with a distinct focus area.

Image caption: Microscopic tools such as this nanoscale “atom writer” can be used to fabricate minuscule light-manipulating structures on surfaces. DARPA has selected 10 performers for its Atoms to Product (A2P) program whose goal is to develop technologies and processes to assemble nanometer-scale pieces—whose dimensions are near the size of atoms—into systems, components, or materials that are at least millimeter-scale in size. (Image credit: Boston University)

Here’s more about the projects and the performers (proponents) from the A2P performers page on the DARPA website,

Nanometer to Millimeter in a Single System – Embody, Draper and Voxtel

Current methods to treat ligament injuries in warfighters [also known as, soldiers]—which account for a significant portion of reported injuries—often fail to restore pre-injury performance, due to surgical complexities and an inadequate supply of donor tissue. Embody is developing reinforced collagen nanofibers that mimic natural ligaments and replicate the biological and biomechanical properties of native tissue. Embody aims to create a new standard of care and restore pre-injury performance for warfighters and sports injury patients at a 50% reduction compared to current costs.

Radio Frequency (RF) systems (e.g., cell phones, GPS) have performance limits due to alternating current loss. In lower frequency power systems this is addressed by braiding the wires, but this is not currently possible in cell phones due to an inability to manufacture sufficiently small braided wires. Draper is developing submicron wires that can be braided using DNA self-assembly methods. If successful, portable RF systems will be more power efficient and able to send 10 times more information in a given channel.

For seamless control of structures, physics and surface chemistry—from the atomic-level to the meter-level—Voxtel Inc. and partner Oregon State University are developing an efficient, high-rate, fluid-based manufacturing process designed to imitate nature’s ability to manufacture complex multimaterial products across scales. Historically, challenges relating to the cost of atomic-level control, production speed, and printing capability have been effectively insurmountable. This team’s new process will combine synthesis and delivery of materials into a massively parallel inkjet operation that draws from nature to achieve a DNA-like mediated assembly. The goal is to assemble complex, 3-D multimaterial mixed organic and inorganic products quickly and cost-effectively—directly from atoms.

Optical Metamaterial Assembly – Boston University, University of Notre Dame, HRL and PARC.

Nanoscale devices have demonstrated nearly unlimited power and functionality, but there hasn’t been a general-purpose, high-volume, low-cost method for building them. Boston University is developing an atomic calligraphy technique that can spray paint atoms with nanometer precision to build tunable optical metamaterials for the photonic battlefield. If successful, this capability could enhance the survivability of a wide range of military platforms, providing advanced camouflage and other optical illusions in the visual range much as stealth technology has enabled in the radar range.

The University of Notre Dame is developing massively parallel nanomanufacturing strategies to overcome the requirement today that most optical metamaterials must be fabricated in “one-off” operations. The Notre Dame project aims to design and build optical metamaterials that can be reconfigured to rapidly provide on-demand, customized optical capabilities. The aim is to use holographic traps to produce optical “tiles” that can be assembled into a myriad of functional forms and further customized by single-atom electrochemistry. Integrating these materials on surfaces and within devices could provide both warfighters and platforms with transformational survivability.

HRL Laboratories is working on a fast, scalable and material-agnostic process for improving infrared (IR) reflectivity of materials. Current IR-reflective materials have limited use, because reflectivity is highly dependent on the specific angle at which light hits the material. HRL is developing a technique for allowing tailorable infrared reflectivity across a variety of materials. If successful, the process will enable manufacturable materials with up to 98% IR reflectivity at all incident angles.

PARC is working on building the first digital MicroAssembly Printer, where the “inks” are micrometer-size particles and the “image” outputs are centimeter-scale and larger assemblies. The goal is to print smart materials with the throughput and cost of laser printers, but with the precision and functionality of nanotechnology. If successful, the printer would enable the short-run production of large, engineered, customized microstructures, such as metamaterials with unique responses for secure communications, surveillance and electronic warfare.

Flexible, General Purpose Assembly – Zyvex, SRI, and Harvard.

Zyvex aims to create nano-functional micron-scale devices using customizable and scalable manufacturing that is top-down and atomically precise. These high-performance electronic, optical, and nano-mechanical components would be assembled by SRI micro-robots into fully-functional devices and sub-systems such as ultra-sensitive sensors for threat detection, quantum communication devices, and atomic clocks the size of a grain of sand.

SRI’s Levitated Microfactories will seek to combine the precision of MEMS [micro-electromechanical systems] flexures with the versatility and range of pick-and-place robots and the scalability of swarms [an idea Michael Crichton used in his 2002 novel Prey to induce horror] to assemble and electrically connect micron and millimeter components to build stronger materials, faster electronics, and better sensors.

Many high-impact, minimally invasive surgical techniques are currently performed only by elite surgeons due to the lack of tactile feedback at such small scales relative to what is experienced during conventional surgical procedures. Harvard is developing a new manufacturing paradigm for millimeter-scale surgical tools using low-cost 2D layer-by-layer processes and assembly by folding, resulting in arbitrarily complex meso-scale 3D devices. The goal is for these novel tools to restore the necessary tactile feedback and thereby nurture a new degree of dexterity to perform otherwise demanding micro- and minimally invasive surgeries, and thus expand the availability of life-saving procedures.

Sidebar

‘Sidebar’ is my way of indicating these comments have little to do with the matter at hand but could be interesting factoids for you.

First, Zyvex Labs was last mentioned here in a Sept. 10, 2014 posting titled: OCSiAL will not be acquiring Zyvex. Notice that this announcement was made shortly after DARPA’s A2P program was announced and that OCSiAL is one of RUSNANO’s (a Russian funding agency focused on nanotechnology) portfolio companies (see my Oct. 23, 2015 posting for more).

HRL Laboratories, mentioned here in an April 19, 2012 posting mostly concerned with memristors (nanoscale devices that mimic neural or synaptic plasticity), has its roots in Howard Hughes’s research laboratories as noted in the posting. In 2012, HRL was involved in another DARPA project, SyNAPSE.

Finally and minimally, PARC, also known as Xerox PARC, was made famous by Steve Jobs and Steve Wozniak when they set up their own company (Apple), basing their products on innovations that PARC had rejected. There are other versions of the story, including one by Malcolm Gladwell in the May 16, 2011 issue of the New Yorker, which presents a more complicated and, at times, contradictory version of that particular ‘origins’ story.