Tag Archives: Institute of Electrical and Electronics Engineers

Chart junk: rethinking science data visualization

Which of these visualizations will you remember later? (Images courtesy of Michelle Borkin, Harvard SEAS.)

This chart of data visualization images accompanies an Oct. 16, 2013 news item on ScienceDaily concerning some research into what makes some charts more memorable than others,

It’s easy to spot a “bad” data visualization — one packed with too much text, excessive ornamentation, gaudy colors, and clip art. Design guru Edward Tufte derided such decorations as redundant at best, useless at worst, labeling them “chart junk.” Yet a debate still rages among visualization experts: Can these reviled extra elements serve a purpose?

Taking a scientific approach to design, researchers from Harvard University and Massachusetts Institute of Technology are offering a new take on that debate. The same design elements that attract so much criticism, they report, can also make a visualization more memorable.

Detailed results were presented this week at the IEEE Information Visualization (InfoVis) conference in Atlanta, hosted by the Institute of Electrical and Electronics Engineers.

The Oct. 16, 2013 Harvard University School of Engineering and Applied Sciences (SEAS) news release (also on EurekAlert), which originated the news item, details some of the ways in which the researchers attempted to study data visualizations and memorability (Note: Links from the version of the news release on the SEAS website have been removed),

For lead author Michelle Borkin, a doctoral student at the Harvard School of Engineering and Applied Sciences (SEAS), memorability has a particular importance:

“I spend a lot of my time reading these scientific papers, so I have to wonder, when I walk away from my desk, what am I going to remember? Which of the figures and visualizations in these publications are going to stick with me?”

But it’s more than grad-school anxiety. Working at the interface of computer science and psychology, Borkin specializes in the visual representation of data, looking for the best ways to communicate and interpret complex information. The applications of her work have ranged from astronomy to medical diagnostics and may already help save lives.

Her adviser, Hanspeter Pfister, An Wang Professor of Computer Science at Harvard SEAS, was intrigued by the chart junk debate, which has flared up on design blogs and at visualization conferences year after year.

Together, they turned to Aude Oliva, a principal research scientist at MIT’s Computer Science and Artificial Intelligence Lab, and a cognitive psychologist by training. Oliva’s lab has been studying visual memory for about six years now. Her team has found that in photographs, faces and human-centric scenes are typically easy to remember; landscapes are not.

“All of us are sensitive to the same kinds of images, and we forget the same kind as well,” Oliva says. “We like to believe our memories are unique, that they’re like the soul of a person, but in certain situations it’s as if we have the same algorithm in our heads that is going to be sensitive to a particular type of image. So when you find a result like this in photographs, you want to know: is it generalizable to many types of materials—words, sound, images, graphs?”

“Speaking with [Pfister] and his group, it became very exciting, the idea that we could study what makes a visualization memorable or not,” Oliva recalls. “If it turned out to be the same for everyone, we thought this would be a win-win result.”

For Oliva’s group, it would provide more evidence of cognitive similarities in the brain’s visual processing, from person to person. For Pfister’s group, it could suggest that certain design principles make visualizations inherently more memorable than others.

With Harvard students Azalea A. Vo ’13 and Shashank Sunkavalli SM ’13, as well as MIT graduate students Zoya Bylinskii and Phillip Isola, the team designed a large-scale study—in the form of an online game—to rigorously measure the memorability of a wide variety of visualizations. They collected more than 5,000 charts and graphics from scientific papers, design blogs, newspapers, and government reports and manually categorized them by a wide range of attributes. Serving them up in brief glimpses—just one second each—to participants via Amazon Mechanical Turk, the researchers tested the influence of features like color, density, and content themes on users’ ability to recognize which ones they had seen before.
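
The news release doesn’t say how “memorability” was actually scored, so here is a minimal sketch of one plausible scoring scheme for this kind of repeat-detection game, assuming each participant presses a key when they believe an image has already been shown. The data layout and names are mine, not the authors’.

```python
from collections import defaultdict

def memorability_scores(responses):
    """Toy scoring for a repeat-detection game (a sketch, not the study's code).

    `responses` holds (image_id, was_repeat, flagged) tuples: was_repeat is True
    when the image had already been shown to that participant, and flagged is
    True when the participant pressed the "seen it before" key. A chart's
    memorability is its hit rate on genuine repeats; the false-alarm rate on
    first showings indicates guessing and can be used to discount the score.
    """
    hits, repeats = defaultdict(int), defaultdict(int)
    false_alarms, first_showings = defaultdict(int), defaultdict(int)
    for image_id, was_repeat, flagged in responses:
        if was_repeat:
            repeats[image_id] += 1
            hits[image_id] += flagged
        else:
            first_showings[image_id] += 1
            false_alarms[image_id] += flagged
    return {
        img: (hits[img] / repeats[img] if repeats[img] else None,
              false_alarms[img] / first_showings[img] if first_showings[img] else None)
        for img in set(repeats) | set(first_showings)
    }
```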

The results meshed well with Oliva’s previous results, but added several new insights.

“A visualization will be instantly and overwhelmingly more memorable if it incorporates an image of a human-recognizable object—if it includes a photograph, people, cartoons, logos—any component that is not just an abstract data visualization,” says Pfister. “We learned that any time you have a graphic with one of those components, that’s the most dominant thing that affects the memorability.”

Visualizations that were visually dense proved memorable, as did those that used many colors. Other results were more surprising.

“You’d think the types of charts you’d remember best are the ones you learned in school—the bar charts, pie charts, scatter plots, and so on,” Borkin says. “But it was the opposite.”

Unusual types of charts, like tree diagrams, network diagrams, and grid matrices, were actually more memorable.

“If you think about those types of diagrams—for example, tree diagrams that show relationships between species, or diagrams that explain a molecular chemical process—every one of them is going to be a little different, but the branching structures feel very natural to us,” explains Borkin. “That combination of the familiar and the unique seems to influence the memorability.”

The best type of chart to use will always depend on the data, but for designers who are required to work within a certain style—for example, to achieve a recognizable consistency within a magazine—the results may be reassuring.

“A graph can be simple or complex, and they both can be memorable,” explains Oliva. “You can make something familiar either by keeping it simple or by having a little story around it. It’s not really that you should choose to use one color or many, or to include additional ornaments or not. If you need to keep it simple because it’s the style your boss likes or the style of your publication, you can still find a way to make it memorable.”

At this stage, however, the team hesitates to issue any sweeping design guidelines for an obvious reason: memorability isn’t the only thing that matters. Visualizations must also be accurate, easy to comprehend, aesthetically pleasing, and appropriate to the context.

“A memorable visualization is not necessarily a good visualization,” Borkin cautions. “As a community we need to keep asking these types of questions: What makes a visualization engaging? What makes it comprehensible?”

As for the chart junk, she says diplomatically, “I think it’s going to be an ongoing debate.”

I believe Michelle Borkin is the lead author of an unpublished (as yet) paper submitted to the 2013 IEEE Information Visualization (InfoVis) conference, which means I can’t offer a link to or a citation for the paper.

Memristors have always been with us

Sprightly, a word not often used in conjunction with technology of any kind, is the best way of describing the approach that researchers Varun Aggarwal and Gaurav Gandhi, along with Dr. Leon Chua, have taken towards their discovery that memristors are all around us. (For anyone not familiar with the concept, I suggest reading the Wikipedia essay on memristors as it includes information about the various critiques of the memristor definition, as well as the definition itself.)

It was Dexter Johnson in his June 6, 2013 post on the IEEE (Institute of Electrical and Electronics Engineers) Nanoclast blog who alerted me to this latest memristor work (Note: Links have been removed),

Two researchers from mLabs in India, along with Prof. Leon Chua at the University of California Berkeley, who first postulated the memristor in a paper back in 1971, have discovered the simplest physical implementation for the memristor, which can be built by anyone and everyone.

In two separate papers, one published in arXiv (“Bipolar electrical switching in metal-metal contacts”) and the other in the IEEE’s own Circuits and Systems Magazine (“The First Radios Were Made Using Memristors!”), Chua and the researchers, Varun Aggarwal and Gaurav Gandhi, discovered that simple imperfect point contacts all around us act as memristors.

“Our arXiv paper talks about the coherer, which comprises an imperfect metal-metal contact in embodiments such as a point contact between two metallic balls, granular media or a metal-mercury interface,” Gandhi explained to me via e-mail. “On the other hand, the CAS paper comprises an imperfect metal-semiconductor contact (Cat’s Whisker) which was also the first solid-state diode. Both the systems have as their signature an imperfect point contact between two conducting/partially-conducting elements. Both act like memristor.”

I’ll get to the articles in a minute; first, let’s look at the researchers’ website, the mLabs home page (splash page). BTW, I have a soft spot for websites that are easy to navigate and don’t irritate me with movement or pop-ups (thank you, mLabs). I think this description of the researchers (Aggarwal and Gandhi) and how they came to develop mLabs (excerpted from the About us page) explains why I described their approach as sprightly,

As they say, anything can happen over a cup of coffee and this story is no different! Gaurav and Varun were friends for over a decade, and one fine day they were sitting at a coffee house discussing Gaurav’s trip to the Second Memristor and Memristive Symposium at Berkeley. Gaurav shared the exciting work around memristor that he witnessed at Berkeley. Varun, who has been an evangelist of Jagadish Chandra Bose’s work thought there was some correlation between the research work of Bose and memristor. He convinced Gaurav to look deeper into these aspects. Soon, a plan was put forth, they wore their engineering gloves and mLabs was born. Gaurav quit his job for full time involvement at mLabs, while Varun assisted and advised throughout.

Three years of curiosity, experimentation, discussions and support from various researchers and professors from different parts of the world, led us to where we are today.

We are also sincerely grateful to Prof. Leon Chua for his continuous support, mentorship and indispensable contribution to our work.

As Dexter notes, Aggarwal and Gandhi have written papers about two different ways to create memristors. The arXiv paper, Bipolar electrical switching in metal-metal contacts, describes how coherers* could be used to create simple memristors for research purposes. This paper also makes the argument that the memristor is a fundamental circuit element (a claim which is a matter of considerable debate, as the Wikipedia memristor essay notes briefly),

Our new results show that bipolar switching can be observed in a large class of metals by a simple construction in form of a point-contact or granular media. It does not require complex construction, particular materials or small geometries. The signature of all our devices is an imperfect metal-metal contact and the physical mechanism for the observed behavior needs to be further studied. That the electrical behavior of these simple, naturally-occurring physical constructs can be modeled by a memristor, but not the other three passive elements, is an indication of its fundamental nature. By providing the canonic physical implementation for memristor, the present work not only fills an important gap in the study of switching devices, but also brings them into the realm of immediate practical use and implementation.
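
For readers who want a feel for what “can be modeled by a memristor” means, here is a minimal sketch of the textbook state-dependent-resistance model (the linear ion-drift picture popularized by the HP work), not the mLabs devices themselves; every constant below is made up for illustration. Driving the model with a sine wave and plotting current against voltage produces the pinched hysteresis loop that is the memristor’s signature.

```python
import math

# Toy memristor (textbook linear ion-drift form; not the mLabs point contacts).
# Resistance depends on a state variable w that integrates the current history;
# that history dependence is what separates a memristor from a plain resistor,
# capacitor or inductor. All constants are invented for illustration.
R_ON, R_OFF = 100.0, 16_000.0   # low/high resistance limits in ohms (assumed)
K_DRIFT = 10_000.0              # assumed state-drift coefficient, 1/(A*s)

def drive_with_sine(amplitude=1.0, freq=1.0, steps=20_000, t_end=2.0):
    dt = t_end / steps
    w = 0.1                                   # state variable, 0 (off) .. 1 (on)
    trace = []
    for k in range(steps):
        v = amplitude * math.sin(2 * math.pi * freq * k * dt)
        r = R_ON * w + R_OFF * (1.0 - w)      # state-dependent resistance
        i = v / r
        w = min(1.0, max(0.0, w + K_DRIFT * i * dt))   # state follows the current
        trace.append((v, i))
    return trace  # plotting i against v gives a pinched hysteresis loop
```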

Because the second article, the one in the IEEE-published Circuits and Systems Magazine, is behind a paywall, I can’t do much more than offer the title and the first paragraph,

The First Radios Were Made Using Memristors!

In 2008, Williams et al. reported the discovery of the fourth fundamental passive circuit element, memristor, which exhibits electrically controllable state-dependent resistance [1]. We show that one of the first wireless radio detector, called cat’s whisker, also the world’s first solid-state diode, had memristive properties. We have identified the state variable governing the resistance state of the device and can program it to switch between multiple stable resistance states. Our observations and results are valid for a larger class of devices called coherers, which include the cat’s whisker. These devices constitute the missing canonical physical implementations for a memristor (ref. Fig. 1).

It’s fascinating when you consider that up until now researching memristors meant having high tech equipment. I wonder how many backyard memristor labs are going to spring up?

On a somewhat related note, Dexter mentions that HP Labs’ ‘memristor’ products will be available in 2014. This latest date represents two postponements. Originally meant to be on the market in the summer of 2013, the new products were then supposed to be brought to market in late 2013 (as per my Feb. 7, 2013 posting; scroll down about 75% of the way).

*’corherers’ corrected to ‘coherers’ Oct. 16, 2015 1345 hours PST.

Life-cycle assessment for electric vehicle lithium-ion batteries and nanotechnology is a risk analysis

A May 29, 2013 news item on Azonano features a new study for the US Environmental Protection Agency (EPA) on nanoscale technology and lithium-ion (li-ion) batteries for electric vehicles,

Lithium-ion (Li-ion) batteries used to power plug-in hybrid and electric vehicles show overall promise to “fuel” these vehicles and reduce greenhouse gas emissions, but there are areas for improvement to reduce possible environmental and public health impacts, according to a “cradle to grave” study of advanced Li-ion batteries recently completed by Abt Associates for the U.S. Environmental Protection Agency (EPA).

“While Li-ion batteries for electric vehicles are definitely a step in the right direction from traditional gasoline-fueled vehicles and nickel metal-hydride automotive batteries, some of the materials and methods used to manufacture them could be improved,” said Jay Smith, an Abt senior analyst and co-lead of the life-cycle assessment.

Smith said, for example, the study showed that the batteries that use cathodes with nickel and cobalt, as well as solvent-based electrode processing, show the highest potential for certain environmental and human health impacts. The environmental impacts, Smith explained, include resource depletion, global warming, and ecological toxicity—primarily resulting from the production, processing and use of cobalt and nickel metal compounds, which can cause adverse respiratory, pulmonary and neurological effects in those exposed.

There are viable ways to reduce these impacts, he said, including cathode material substitution, solvent-less electrode processing and recycling of metals from the batteries.

The May 28, 2013 Abt Associates news release, which originated the news item, describes some of the findings,

Among other findings, Shanika Amarakoon, an Abt associate who co-led the life-cycle assessment with Smith, said global warming and other environmental and health impacts were shown to be influenced by the electricity grids used to charge the batteries when driving the vehicles.

“These impacts are sensitive to local and regional grid mixes,” Amarakoon said. “If the batteries in use are drawing power from the grids in the Midwest or South, much of the electricity will be coming from coal-fired plants. If it’s in New England or California, the grids rely more on renewables and natural gas, which emit less greenhouse gases and other toxic pollutants. However,” she added, “impacts from the processing and manufacture of these batteries should not be overlooked.”

In terms of battery performance, Smith said that “the nanotechnology applications that Abt assessed were single-walled carbon nanotubes (SWCNTs), which are currently being researched for use as anodes as they show promise for improving the energy density and ultimate performance of the Li-ion batteries in vehicles. What we found, however, is that the energy needed to produce the SWCNT anodes in these early stages of development is prohibitive. Over time, if researchers focus on reducing the energy intensity of the manufacturing process before commercialization, the environmental profile of the technology has the potential to improve dramatically.”

Abt’s Application of Life-Cycle Assessment to Nanoscale Technology: Lithium-ion Batteries for Electric Vehicles can be found here, all 126 pp.

This assessment was performed under the auspices of an interesting assortment of agencies (from the news release),

The research for the life-cycle assessment was undertaken through the Lithium-ion Batteries and Nanotechnology for Electric Vehicles Partnership, which was led by EPA’s Design for the Environment Program in the Office of Chemical Safety and Pollution Prevention and Toxics, and EPA’s National Risk Management Research Laboratory in the Office of Research and Development.  [emphasis mine] The Partnership also included industry partners (i.e., battery manufacturers, recyclers, and suppliers, and other industry groups), the Department of Energy’s Argonne National Lab, Arizona State University, and the Rochester Institute of Technology

I highlighted the National Risk Management Research Laboratory as it reminded me of the lithium-ion battery fires in airplanes reported in January 2013. I realize that cars and planes are not the same thing, but lithium-ion batteries have some well-defined problems, especially since the summer of 2006 when there was a series of li-ion battery laptop fires. From Tracy V. Wilson’s What causes laptop batteries to overheat? article for HowStuffWorks.com (Note: A link has been removed),

In conjunction with the United States Consumer Product Safety Commission (CPSC), Dell and Apple Computer announced large recalls of laptop batteries in the summer of 2006, followed by Toshiba and Lenovo. Sony manufactured all of the recalled batteries, and in October 2006, the company announced its own large-scale recall. Under the right circumstances, these batteries could overheat, potentially causing burns, an explosion or a fire.

Larry Greenemeier in a Jan. 17, 2013 article for Scientific American offers some details about the lithium-ion battery fires in airplanes and elsewhere,

Boeing’s Dreamliner has likely become a nightmare for the company, its airline customers and regulators worldwide. An inflight lithium-ion battery fire broke out Wednesday [Jan. 16, 2013] on an All Nippon Airways 787 over Japan, forcing an emergency landing. And another battery fire occurred last week aboard a Japan Airlines 787 at Boston’s Logan International Airport. Both battery failures resulted in release of flammable electrolytes, heat damage and smoke on the aircraft, according to the U.S. Federal Aviation Administration (FAA).

Lithium-ion batteries—used to power mobile phones, laptops and electric vehicles—have summoned plenty of controversy during their relatively brief existence. Introduced commercially in 1991, by the mid 2000s they had become infamous for causing fires in laptop computers.

More recently, the plug-in hybrid electric Chevy Volt’s lithium-ion battery packs burst into flames following several National Highway Traffic Safety Administration (NHTSA) tests to measure the vehicle’s ability to protect occupants from injury in a side collision. The NHTSA investigated and concluded in January 2012 that Chevy Volts and other electric vehicles do not pose a greater risk of fire than gasoline-powered vehicles.

Philip E. Ross in his Jan. 18, 2013 article about the airplane fires for IEEE’s (Institute of Electrical and Electronics Engineers) Spectrum provides some insight into the fires,

It seems that the batteries heated up in a self-accelerating pattern called thermal runaway. Heat from the production of electricity speeds up the production of electricity, and… you’re off. This sort of thing happens in a variety of reactions, not just in batteries, let alone the Li-ion kind. But thermal runaway is particularly grave in Li-ion batteries because they pack a lot more power than the tried-and-true metal-hydride ones, not to speak of Ye Olde lead-acid.

It’s because of this very quality that Li-ion batteries found their first application in small mobile devices, where power is critical and fires won’t cost anyone his life. It’s also why it took so long for the new tech to find its way into electric and hybrid-electric cars.

Perhaps it would have been wiser of Boeing to go for the safest possible Li-ion design, even if it didn’t have quite as much oomph as possible. That’s what today’s main-line electric-drive cars do, as our colleague, John Voelcker, points out.

“The cells in the 787 [Dreamliner], from Japanese company GS Yuasa, use a cobalt oxide (CoO2) chemistry, just as mobile-phone and laptop batteries do,” he writes in greencarreports.com. “That chemistry has the highest energy content, but it is also the most susceptible to overheating that can produce “thermal events” (which is to say, fires). Only one electric car has been built in volume using CoO2 cells, and that’s the Tesla Roadster. Only 2,500 of those cars will ever exist.” Most of today’s electric cars, Voelcker adds, use chemistries that trade some energy density for safety.

The Dreamliner (Boeing 787) is designed to be the lightest of airplanes, and a less energy-dense but safer lithium-ion battery chemistry seems not to have been an acceptable trade-off. Interestingly, according to Ross, Boeing still had a backlog of orders after the fires.
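
Ross’s description of thermal runaway boils down to a positive feedback loop: heat generation climbs steeply (roughly exponentially) with temperature while cooling climbs only linearly, so above some threshold the cell’s temperature accelerates away. Here is a toy numerical sketch of that idea; the constants are invented and have nothing to do with the 787’s actual cells.

```python
import math

# Toy positive-feedback model of thermal runaway (invented constants).
AMBIENT = 25.0   # ambient temperature, deg C
Q0 = 0.05        # baseline self-heating, deg C per minute (assumed)
K_HEAT = 0.05    # assumed exponential sensitivity of heating to temperature
K_COOL = 0.1     # assumed linear cooling coefficient, per minute

def temperature_trace(t_start, minutes=240):
    temps = [t_start]
    for _ in range(minutes):
        t = temps[-1]
        heating = Q0 * math.exp(K_HEAT * (t - AMBIENT))  # grows steeply with temperature
        cooling = K_COOL * (t - AMBIENT)                 # grows only linearly
        t_next = t + heating - cooling
        temps.append(t_next)
        if t_next > 1000:    # the cell has destroyed itself; stop the toy model
            break
    return temps

# With these made-up numbers a warm start relaxes back toward ambient,
# while a hot start tips into runaway:
print(round(temperature_trace(60.0)[-1], 1))   # settles near ambient
print(round(temperature_trace(160.0)[-1], 1))  # shoots past 1000 deg C
```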

I find some of the discussion about risk and nanotechnology-enabled products oddly disconnected. There are concerns about what happens at the nanoscale (environmental implications, etc.), but that discussion is divorced from some macroscale issues such as battery fires. Taken to absurd lengths, technology at the nanoscale could be considered safe while macroscale issues are completely ignored. It’s as if our institutions are not yet capable of managing multiple scales at once.

For more about an emphasis on scale and other minutiae (pun intended), there’s my May 28, 2013 posting about Steffen Foss Hansen’s plea to revise current European Union legislation to create more categories for nanotechnology regulation, amongst other things.

For more about airplanes and efforts to make them more energy efficient, there’s my May 27, 2013 posting about a biofuel study in Australia.

University of Alberta (Canada) student nanorobotics team demonstrates potential medical technology in competition

A University of Alberta (Canada) nanorobotics team has entered its nanobot system in the International Mobile Micro/nanorobotics Competition, part of the Robot Challenges at the IEEE (Institute of Electrical and Electronics Engineers) International Conference on Robotics and Automation (ICRA) being held May 6 – 10, 2013 in Karlsruhe, Germany. From the May 6, 2013 news item on Nanowerk,

A team of engineering students is putting a twist on robotics, developing a nano-scale robotics system that could lead to new medical therapies.

In less than a year, the U of A team has assembled a working system that manipulates nano-scale ‘robots’. The team uses magnets to manipulate a droplet filled with iron oxide nanoparticles. Barely visible to the naked eye, the droplet measures 400-500 micrometres.

The May 3, 2013 University of Alberta news release by Richard Cairney, which originated the news item, describes the system,

Using a joystick, team members control the robot, making it travel along a specific route, navigate an obstacle course or push micro-sized objects from one point to another.

The challenge is simple in concept but highly technical and challenging to execute: the team first injects a water droplet with iron oxide nanoparticles into oil. The droplet holds its shape because it is encased in a surfactant—a soap-like formula that repels water on one side and attracts water on the other.

“It’s like a capsule,” said team member Yang Gao, who is working on her master’s degree in chemical engineering. “It’s a vehicle for the nanoparticles.”

The iron-filled droplet is placed in a playing ‘field’ measuring 2 x 3 millimetres. The team uses four magnets mounted on each side of the rectangular field to move the droplet in a figure-8, manoeuvring it through four gates built into the field.

“We use the magnets to pull the droplet,” explains electrical engineering PhD student Remko van den Hurk.

In a second challenge, the team will be required to use the droplet as a bulldozer of sorts, to arrange micro-scale objects that measure 200 x 300 micrometres into a particular order on an even smaller playing field.

The competition has its serious side: these nanobots could one day be used in medical applications.

In the meantime, there’s the competition. Good luck!

Skills training: get ready for the robots

If the boffins at the Massachusetts Institute of Technology (MIT) are right, soon we may be learning alongside robots and using the same techniques.  Helen Knight’s Feb. 11, 2013 news release for MIT highlights a recent study showing that robots, like humans, learn better if they cross-train. From the news release,

Robots are increasingly being used in the manufacturing industry to perform tasks that bring them into closer contact with humans. But while a great deal of work is being done to ensure robots and humans can operate safely side-by-side, more effort is needed to make robots smart enough to work effectively with people, says Julie Shah, an assistant professor of aeronautics and astronautics at MIT and head of the Interactive Robotics Group in the Computer Science and Artificial Intelligence Laboratory (CSAIL).

“People aren’t robots, they don’t do things the same way every single time,” Shah says. “And so there is a mismatch between the way we program robots to perform tasks in exactly the same way each time and what we need them to do if they are going to work in concert with people.”

Most existing research into making robots better team players is based on the concept of interactive reward, in which a human trainer gives a positive or negative response each time a robot performs a task.

However, human studies carried out by the military have shown that simply telling people they have done well or badly at a task is a very inefficient method of encouraging them to work well as a team.

Here’s the experiment Shah and her student performed,

So Shah and PhD student Stefanos Nikolaidis began to investigate whether techniques that have been shown to work well in training people could also be applied to mixed teams of humans and robots. One such technique, known as cross-training, sees team members swap roles with each other on given days. “This allows people to form a better idea of how their role affects their partner and how their partner’s role affects them,” Shah says.

In a paper to be presented at the International Conference on Human-Robot Interaction in Tokyo in March [2013], Shah and Nikolaidis will present the results of experiments they carried out with a mixed group of humans and robots, demonstrating that cross-training is an extremely effective team-building tool.

More specifically,

To allow robots to take part in the cross-training experiments, the pair first had to design a new algorithm to allow the devices to learn from their role-swapping experiences. So they modified existing reinforcement-learning algorithms to allow the robots to take in not only information from positive and negative rewards, but also information gained through demonstration. In this way, by watching their human counterparts switch roles to carry out their work, the robots were able to learn how the humans wanted them to perform the same task.
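
The release doesn’t give the actual formulation, but the general idea (reward-driven learning supplemented by demonstrations gathered while the human plays the robot’s role) can be sketched with a plain tabular Q-learner. Everything below, including the bonus term, the constants and the function names, is my own illustration, not the MIT algorithm.

```python
import random
from collections import defaultdict

ALPHA, GAMMA = 0.1, 0.9   # learning rate and discount factor (assumed)
DEMO_BONUS = 0.5          # assumed credit given to demonstrated actions

Q = defaultdict(float)    # Q[(state, action)] -> estimated value

def update_from_reward(state, action, reward, next_state, actions):
    # Standard Q-learning step, driven by "good robot"/"bad robot" style rewards.
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

def update_from_demonstration(state, demonstrated_action):
    # Cross-training shows the robot what the human does in its place;
    # crediting those pairs nudges the learned policy toward the human's preference.
    Q[(state, demonstrated_action)] += ALPHA * DEMO_BONUS

def choose_action(state, actions, epsilon=0.1):
    if random.random() < epsilon:            # occasional exploration
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])
```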

Each human-robot team then carried out a simulated task in a virtual environment, with half of the teams using the conventional interactive reward approach, and half using the cross-training technique of switching roles halfway through the session. Once the teams had completed this virtual training session, they were asked to carry out the task in the real world, but this time sticking to their own designated roles.

Shah and Nikolaidis found that the period in which human and robot were working at the same time — known as concurrent motion — increased by 71 percent in teams that had taken part in cross-training, compared to the interactive reward teams. They also found that the amount of time the humans spent doing nothing — while waiting for the robot to complete a stage of the task, for example — decreased by 41 percent.

What’s more, when the pair studied the robots themselves, they found that the learning algorithms recorded a much lower level of uncertainty about what their human teammate was likely to do next — a measure known as the entropy level — if they had been through cross-training.
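
The “entropy level” here is just Shannon entropy computed over the robot’s predicted distribution of the human’s next actions: lower entropy means the robot is more certain about what its teammate will do. A tiny example (with invented probabilities) makes the measure concrete.

```python
import math

def entropy(probabilities):
    # Shannon entropy in bits; lower means less uncertainty about the next action.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: the robot has no idea
print(entropy([0.90, 0.05, 0.03, 0.02]))  # ~0.62 bits: the robot is fairly sure
```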

Finally, when responding to a questionnaire after the experiment, human participants in cross-training were far more likely to say the robot had carried out the task according to their preferences than those in the reward-only group, and reported greater levels of trust in their robotic teammate. “This is the first evidence that human-robot teamwork is improved when a human and robot train together by switching roles, in a manner similar to effective human team training practices,” Nikolaidis says.

Shah believes this improvement in team performance could be due to the greater involvement of both parties in the cross-training process. “When the person trains the robot through reward it is one-way: The person says ‘good robot’ or the person says ‘bad robot,’ and it’s a very one-way passage of information,” Shah says. “But when you switch roles the person is better able to adapt to the robot’s capabilities and learn what it is likely to do, and so we think that it is adaptation on the person’s side that results in a better team performance.”

The work shows that strategies that are successful in improving interaction among humans can often do the same for humans and robots, says Kerstin Dautenhahn, a professor of artificial intelligence at the University of Hertfordshire in the U.K. “People easily attribute human characteristics to a robot and treat it socially, so it is not entirely surprising that this transfer from the human-human domain to the human-robot domain not only made the teamwork more efficient, but also enhanced the experience for the participants, in terms of trusting the robot,” Dautenhahn says.

The paper (Human-Robot Cross-Training: Computational Formulation, Modeling and Evaluation of a Human Team Training Strategy) written by Nikolaidis and Shah can be found here and the website for the conference (International Conference on Human-Robot Interaction [HRI]; 8th ACM [Association of Computing Machinery]/IEEE [Institute of Electrical and Electronics Engineers] Conference on Human-Robot Interaction) where it will be presented is here.

FrogHeart’s 2012, a selective roundup of my international online colleagues, and other bits

This blog will be five years old in April 2013 and, sometime in January or February, the 2000th post will be published.

Statistics-wise, it’s been a tumultuous year for FrogHeart with ups and downs, thankfully ending on an up note. According to my AW stats, I started with 54,920 visits in January (which was a bit of an increase over December 2011). The numbers rose right through to March 2012 when the blog registered 68,360 visits and then the numbers fell and continued to fall. At the low point, this blog registered 45,972 visits in June 2012 and managed to rise and fall through to Oct. 2012 when the visits rose to 54,520 visits. November 2012 was better with 66,854 visits and in December 2012 the blog will have received over 75,000 visits. (ETA Ja.2.13: This blog registered 81,036 visits in December 2012 and an annual total of 681,055 visits.) Since I have no idea why the numbers fell or why they rose again, I have absolutely no idea what 2013 will bring in terms of statistics (the webalizer numbers reflect similar trends).

Interestingly and for the first time since I’ve activated the AW statistics package in Feb. 2009, the US ceased to be the primary source for visitors. As of April 2012, the British surged ahead for several months until November 2012 when the US regained the top spot only to lose it to China in December 2012.

Favourite topics according to the top 10 key terms included: nanocrystalline cellulose for Jan. – Oct. 2012 when for the first time in almost three years the topic fell out of the top 10; Jackson Pollock and physics also popped up in the top 10 in various months throughout the year; Clipperton Island (a sci/art project) has made intermittent appearances; SPAUN (Semantic Pointer Architecture Unified Network; a project at the University of Waterloo) has made the top 10 in the two months since it was announced; weirdly, frogheart.ca has appeared in the top 10 these last few months; the Lycurgus Cup, nanosilver, and literary tattoos also made appearances in the top 10 in various months throughout the year, while the memristor and Québec nanotechnology made appearances in the fall.

Webalizer tells a similar but not identical story. The numbers started with 83,133 visits in January 2012, rising to a dizzying height of 119,217 in March. These statistics fell too, but July 2012 was another six-figure month with 101,087 visits and then down again to five figures until Oct. 2012 with 108,266 and 136,161 visits in November 2012. The December 2012 visits number appears to be dipping down slightly with 130,198 visits counted to 5:10 am PST, Dec. 31, 2012. (ETA Ja.2.13: In December 2012, 133,351 visits were tallied with an annual total of 1,660,771 visits.)

Thanks to my international colleagues who inspire and keep me apprised of the latest information on nanotechnology and other emerging technologies:

  • Pasco Phronesis, owned by David Bruggeman, focuses more on science policy and science communication (via popular media) than on emerging technology per se but David provides excellent analysis and a keen eye for the international scene. He kindly dropped by frogheart.ca some months ago to challenge my take on science and censorship in Canada and I have not finished my response. I’ve posted part 1 in the comments but have yet to get to part 2. His latest posting on Dec. 30, 2012 features this title, For Better Science And Technology Policing, Don’t Forget The Archiving.
  • Nanoclast is on the IEEE (Institute of Electrical and Electronics Engineers) website and features Dexter Johnson’s writing on nanotechnology government initiatives, technical breakthroughs, and, occasionally, important personalities within the field. I notice Dexter, who’s always thoughtful and thought-provoking, has cut back to a weekly posting. I encourage you to read his work as he fills in an important gap in a lot of nanotechnology reporting with his intimate understanding of the technology itself.  Dexter’s Dec. 20, 2012 posting (the latest) is titled, Nanoparticle Coated Lens Converts Light into Sound for Precise Non-invasive Surgery.
  • Insight (formerly TNTlog) is Tim Harper’s (CEO of Cientifica) blog, featuring an international perspective (with a strong focus on the UK scene) on emerging technologies and the business of science. His writing style is quite lively (at times, trenchant) and it reflects his long experience with nanotechnology and other emerging technologies. I don’t know how he finds the time and here’s his latest, a Dec. 4, 2012 posting titled, Is Printable Graphene The Key To Widespread Applications?
  • 2020 Science is Dr. Andrew Maynard’s (director of University of Michigan’s Risk Science Center) more or less personal blog. An expert on nanotechnology (he was the Chief Science Adviser for the Project on Emerging Nanotechnologies, located in Washington, DC), Andrew writes extensively about risk, uncertainty, nanotechnology, and the joys of science. Over time his blog has evolved to include the occasional homemade but science-oriented video, courtesy of one of his children. I usually check Andrew’s blog when there’s a online nanotechnology kerfuffle as he usually has the inside scoop. His latest posting on Dec. 23, 2012 features this title, On the benefits of wearing a hat while dancing naked, and other insights into the science of risk.
  • Andrew also produces and manages the Mind the Science Gap blog, which is a project encouraging MA students in the University of Michigan’s Public Health Program to write. Andrew has posted a summary of the last semester’s triumphs titled, Looking back at another semester of Mind The Science Gap.
  • NanoWiki is, strictly speaking, not a blog but the authors provide the best compilation of stories on nanotechnology issues and controversies that I have found yet. Here’s how they describe their work, “NanoWiki tracks the evolution of paradigms and discoveries in nanoscience and nanotechnology field, annotates and disseminates them, giving an overall view and feeds the essential public debate on nanotechnology and its practical applications.” There are also Spanish, Catalan, and mobile versions of NanoWiki. Their latest posting, dated  Dec. 29, 2012, Nanotechnology shows we can innovate without economic growth, features some nanotechnology books.
  • In April 2012, I was contacted by Dorothée Browaeys about a French blog, Le Meilleur Des Nanomondes. Unfortunately, there doesn’t seem to have been much action there since Feb. 2010 but I’m delighted to hear from my European colleagues and hope to hear more from them.

Sadly, there was only one interview here this year but I think they call these things ‘a big get’ as the interview was with Vanessa Clive who manages the nanotechnology portfolio at Industry Canada. I did try to get an interview with Dr. Marie D’Iorio, the new Executive Director of Canada’s National Institute of Nanotechnology (NINT; BTW, the National Research Council has a brand new website and, since the NINT is a National Research Council agency, so does the NINT), and experienced the same success I had with her predecessor, Dr. Nils Petersen.

I attended two conferences this year. The first was the S.NET (Society for the Study of Nanoscience and Emerging Technologies) 2012 meeting in Enschede, Holland, where I presented on my work on memristors, artificial brains, and pop culture. The second conference I attended was in Calgary, where I moderated a panel I’d organized on the topic of Canada’s science culture and policy for the 2012 Canadian Science Policy Conference.

There are a few items of note which appeared on the Canadian science scene. ScienceOnlineVancouver emerged in April 2012. From the About page,

ScienceOnlineVancouver is a monthly discussion series exploring how online communication and social media impact current scientific research and how the general public learns about it. ScienceOnlineVancouver is an ongoing discussion about online science, including science communication and available research tools, not a lecture series where scientists talk about their work. Follow the conversation on Twitter at @ScioVan, hashtag is #SoVan.

The concept of these monthly meetings originated in New York with SoNYC @S_O_NYC, brought to life by Lou Woodley (@LouWoodley, Communities Specialist at Nature.com) and John Timmer (@j_timmer, Science Editor at Ars Technica). With the success of that discussion series, participation in Scio2012, and the 2012 annual meeting of the AAAS in Vancouver, Catherine Anderson, Sarah Chow, and Peter Newbury were inspired to bring it closer to home, leading to the beginning of ScienceOnlineVancouver.

ScienceOnlineVancouver is part of the ScienceOnlineNOW community that includes ScienceOnlineBayArea, @sciobayarea and ScienceOnlineSeattle, @scioSEA. Thanks to Brian Glanz of the Open Science Federation and SciFund Challenge and thanks to Science World for a great venue.

I have mentioned the arts/engineering festival coming up in Calgary, Beakerhead, a few times but haven’t had occasion to mention Science Rendezvous before. This festival started in Toronto in 2008 and became a national festival in 2012 (?). Their About page doesn’t describe the genesis of the ‘national’ aspect of this festival as clearly as I would like. They seem to be behind with their planning as there’s no mention of the 2013 festival, which should be coming up in May.

The Twitter (@frogheart) feed continues to grow in both directions (followed and following), albeit slowly. I have to give special props to @carlacap, @cientifica, & @timharper for their mentions, retweets, and more.

As for 2013, there are likely to be some changes here; I haven’t yet decided what changes but I will keep you posted. Have a lovely new year and I wish you all the best in 2013.

Breakthroughs with self-assembling DNA-based nanoscaled structures

With all the talk about self-assembling DNA nanotechnology, it’s possible to misunderstand the stage of development this endeavour occupies as the title, Reality check for DNA Nanotechnology, for a Dec. 13, 2012 news release on EurekAlert suggests,

… This emerging technology employs DNA as a programmable building material for self-assembled, nanometer-scale structures. Many practical applications have been envisioned, and researchers recently demonstrated a synthetic membrane channel made from DNA. Until now, however, design processes were hobbled by a lack of structural feedback. Assembly was slow and often of poor quality.

In fact, the news release is touting two breakthroughs,

Now researchers led by Prof. Hendrik Dietz of the Technische Universitaet Muenchen (TUM) have removed these obstacles.

One barrier holding the field back was an unproven assumption. Researchers were able to design a wide variety of discrete objects and specify exactly how DNA strands should zip together and fold into the desired shapes. They could show that the resulting nanostructures closely matched the designs. Still lacking, though, was the validation of the assumed subnanometer-scale precise positional control. This has been confirmed for the first time through analysis of a test object designed specifically for the purpose. A technical breakthrough based on advances in fundamental understanding, this demonstration has provided a crucial reality check for DNA nanotechnology.

In a separate set of experiments, the researchers discovered that the time it takes to make a batch of complex DNA-based objects can be cut from a week to a matter of minutes, and that the yield can be nearly 100%. They showed for the first time that at a constant temperature, hundreds of DNA strands can fold cooperatively to form an object — correctly, as designed — within minutes. Surprisingly, they say, the process is similar to protein folding, despite significant chemical and structural differences. “Seeing this combination of rapid folding and high yield,” Dietz says, “we have a stronger sense than ever that DNA nanotechnology could lead to a new kind of manufacturing, with a commercial, even industrial future.” And there are immediate benefits, he adds: “Now we don’t have to wait a week for feedback on an experimental design, and multi-step assembly processes have suddenly become so much more practical.”

Dexter Johnson comments in his Dec. 18, 2012 posting (which includes an embedded video) on the Nanoclast blog (located on the Institute of Electrical and Electronics Engineers [IEEE] website),

The field of atomically precise manufacturing—or molecular manufacturing—has taken a big step towards realizing its promise with this latest research.  We may still be a long way from realizing the “nanotech rapture”  but certainly knowing that the objects built meet their design specifications and can be produced in minutes rather than weeks has to be recognized as a significant development.

Three papers have been published on these breakthroughs; here are the citations,

Xiao-chen Bai, Thomas G. Martin, Sjors H. W. Scheres, Hendrik Dietz. Cryo-EM structure of a 3D DNA-origami object. Proceedings of the National Academy of Sciences of the USA, Dec. 4, 2012, 109 (49) 20012-20017; on-line in PNAS Early Edition, Nov. 19, 2012. DOI: 10.1073/pnas.1215713109

Jean-Philippe J. Sobczak, Thomas G. Martin, Thomas Gerling, Hendrik Dietz. Rapid folding of DNA into nanoscale shapes at constant temperature. Science, vol. 338, issue 6113, pp. 1458-1461. DOI: 10.1126/science.1229919

See also: Martin Langecker, Vera Arnaut, Thomas G. Martin, Jonathan List, Stephan Renner, Michael Mayer, Hendrik Dietz, and Friedrich C. Simmel. Synthetic lipid membrane channels formed by designed DNA nanostructures. Science, vol. 338, issue 6109, pp. 932-936. DOI: 10.1126/science.1225624

Mad about Madder in lithium-ion batteries

It hasn’t happened yet but it looks like the future might hold greener lithium-ion (Li-ion) batteries. According to the Dec. 11, 2012 news release on EurekAlert,

Scientists at Rice University and the City College of New York have discovered that the madder plant, aka Rubia tinctorum, is a good source of purpurin, an organic dye that can be turned into a highly effective, natural cathode for lithium-ion batteries. The plant has been used since ancient times to create dye for fabrics.

The goal, according to lead author Arava Leela Mohana Reddy, a research scientist in the Rice lab of materials scientist Pulickel Ajayan, is to create environmentally friendly batteries that solve many of the problems with lithium-ion batteries in use today.

Purpurin, left, extracted from madder root, center, is chemically lithiated, right, for use as an organic cathode in batteries. The material was developed as a less expensive, easier-to-recycle alternative to cobalt oxide cathodes now used in lithium-ion batteries. Credit: Ajayan Lab/Rice University

The Dec. 11, 2012 Rice University news release by Mike Williams, the origin for the one on EurekAlert, describes why the researchers are so interested in a more environmentally-friendly cathode,

While lithium-ion batteries have become standard in conventional electronics since their commercial introduction in 1991, the rechargeable units remain costly to manufacture, Reddy said. “They’re not environmentally friendly. They use cathodes of lithium cobalt oxide, which are very expensive. You have to mine the cobalt metal and manufacture the cathodes in a high-temperature environment. There are a lot of costs.

“And then, recycling is a big issue,” he said. “In 2010, almost 10 billion lithium-ion batteries had to be recycled, which uses a lot of energy. Extracting cobalt from the batteries is an expensive process.”

Reddy and his colleagues came across purpurin while testing a number of organic molecules for their ability to electrochemically interact with lithium and found purpurin most amenable to binding lithium ions. With the addition of 20 percent carbon to add conductivity, the team built a half-battery cell with a capacity of 90 milliamp hours per gram after 50 charge/discharge cycles. The cathodes can be made at room temperature, he said.

“It’s a new mechanism we are proposing with this paper, and the chemistry is really simple,” Reddy said. He suggested agricultural waste may be a source of purpurin, as may other suitable molecules, which makes the process even more economical.

Innovation in the battery space is needed to satisfy future demands and counter environmental issues like waste management, “and hence we are quite fascinated by the ability to develop alternative electrode technologies to replace conventional inorganic materials in lithium-ion batteries,” said Ajayan, Rice’s Benjamin M. and Mary Greenwood Anderson Professor in Mechanical Engineering and Materials Science and of chemistry.

“We’re interested in developing value-added chemicals, products and materials from renewable feedstocks as a sustainable technology platform,” said co-lead author George John, a professor of chemistry at the City College of New York-CUNY and an expert on bio-based materials and green chemistry. “The point has been to understand the chemistry between lithium ions and the organic molecules. Now that we have that proper understanding, we can tap other molecules and improve capacity.”

For anyone who’s interested, you can read the researchers’ article (open access),

Lithium storage mechanisms in purpurin based organic lithium ion battery electrodes by Arava Leela Mohana Reddy, Subbiah Nagarajan, Porramate Chumyim, Sanketh R. Gowda, Padmanava Pradhan, Swapnil R. Jadhav, Madan Dubey, George John & Pulickel M. Ajayan in Scientific Reports 2, Article number: 960, doi:10.1038/srep00960

You might also want to check out Dexter Johnson’s Nov. 26, 2012 posting (on Nanoclast, an IEEE [Institute of Electrical and Electronics Engineers] blog) where he mentions a technical deficiency (recharging becomes increasingly difficult) with current Li-ion batteries in the context of his description of a new imaging technique.

Christmas-tree shaped ‘4-D’ nanowires

This Dec. 5, 2012 news item on Nanowerk features a seasonal approach to a study about ‘4-D’ nanowires,

A new type of transistor shaped like a Christmas tree has arrived just in time for the holidays, but the prototype won’t be nestled under the tree along with the other gifts.

“It’s a preview of things to come in the semiconductor industry,” said Peide “Peter” Ye, a professor of electrical and computer engineering at Purdue University.

Researchers from Purdue and Harvard universities created the transistor, which is made from a material that could replace silicon within a decade. Each transistor contains three tiny nanowires made not of silicon, like conventional transistors, but from a material called indium-gallium-arsenide. The three nanowires are progressively smaller, yielding a tapered cross section resembling a Christmas tree.

Sadly, Purdue University (Indiana, US) will not be releasing any images to accompany their Dec. 4, 2012 news release (which originated the news item) about the ‘4-D’ transistor  until Saturday, Dec. 8, 2012.  So here’s an image of a real Christmas tree from the National Christmas Tree Organization’s Common Tree Characteristics webpage,

Douglas Fir Christmas tree from http://www.realchristmastrees.org/dnn/AllAboutTrees/TreeCharacteristics.aspx

The Purdue University news release written by Emil Venere provides more detail about the work,

“A one-story house can hold so many people, but more floors, more people, and it’s the same thing with transistors,” Ye said. “Stacking them results in more current and much faster operation for high-speed computing. This adds a whole new dimension, so I call them 4-D.”

The work is led by Purdue doctoral student Jiangjiang Gu and Harvard postdoctoral researcher Xinwei Wang.

The newest generation of silicon computer chips, introduced this year, contain transistors having a vertical 3-D structure instead of a conventional flat design. However, because silicon has a limited “electron mobility” – how fast electrons flow – other materials will likely be needed soon to continue advancing transistors with this 3-D approach, Ye said.

Indium-gallium-arsenide is among several promising semiconductors being studied to replace silicon. Such semiconductors are called III-V materials because they combine elements from the third and fifth groups of the periodic table.

Transistors contain critical components called gates, which enable the devices to switch on and off and to direct the flow of electrical current. Smaller gates make faster operation possible. In today’s 3-D silicon transistors, the length of these gates is about 22 nanometers, or billionths of a meter.

The 3-D design is critical because gate lengths of 22 nanometers and smaller do not work well in a flat transistor architecture. Engineers are working to develop transistors that use even smaller gate lengths; 14 nanometers are expected by 2015, and 10 nanometers by 2018.

However, size reductions beyond 10 nanometers and additional performance improvements are likely not possible using silicon, meaning new materials will be needed to continue progress, Ye said.

Creating smaller transistors also will require finding a new type of insulating, or “dielectric” layer that allows the gate to switch off. As gate lengths shrink smaller than 14 nanometers, the dielectric used in conventional transistors fails to perform properly and is said to “leak” electrical charge when the transistor is turned off.

Nanowires in the new transistors are coated with a different type of composite insulator, a 4-nanometer-thick layer of lanthanum aluminate with an ultrathin, half-nanometer layer of aluminum oxide. The new ultrathin dielectric allowed researchers to create transistors made of indium-gallium-arsenide with 20-nanometer gates, which is a milestone, Ye said.

This work will be presented at the IEEE (Institute of Electrical and Electronics Engineers) 2012 International Electron Devices Meeting in San Francisco, California, Dec. 10 – 12, 2012 (as per the information on the registration page), and the two papers written by the team will be published in the proceedings.

I have a full list of the authors, from the news release,

The authors of the research papers are Gu [Jiangjiang Gu]; Wang [Xinwei Wang]; Purdue doctoral student H. Wu; Purdue postdoctoral research associate J. Shao; Purdue doctoral student A. T. Neal; Michael J. Manfra, Purdue’s William F. and Patty J. Miller Associate Professor of Physics; Roy Gordon, Harvard’s Thomas D. Cabot Professor of Chemistry; and Ye [Peide “Peter” Ye].

More questions about whether nanoparticles penetrate the skin

The research from the University of Bath about nanoparticles not penetrating the skin has drawn some interest. In addition to the mention here yesterday, in this Oct. 3, 2012 posting, there was this Oct. 2, 2012 posting by Dexter Johnson at the Nanoclast blog on the IEEE [Institute of Electrical and Electronics Engineers] website. I have excerpted the first and last paragraphs of Dexter’s posting as they neatly present the campaign to regulate the use of nanoparticles in cosmetics and the means by which science progresses, i.e., this study is not definitive,

For at least the last several years, NGO’s like Friends of the Earth (FoE) have been leveraging preliminary studies that indicated that nanoparticles might pass right through our skin to call for a complete moratorium on the use of any nanomaterials in sunscreens and cosmetics.

This latest UK research certainly won’t put this issue to rest. These experiments will need to be repeated and the results duplicated. That’s how science works. We should not be jumping to any conclusions that this research proves nanoparticles are absolutely safe any more than we should be jumping to the conclusion that they are a risk. Science cuts both ways.

Meanwhile, a writer in Australia, Sarah Berry, takes a different approach in her Oct. 4, 2012 article for the Australian newspaper, the Sydney Morning Herald,

“Breakthrough” claims by cosmetic companies aren’t all they’re cracked up to be, according to a new study.

Nanotechnology — the science of super-small particles — has featured in cosmetic formulations since the late ’80s. Brands claim the technology delivers the “deep-penetrating action” of vitamins and other “active ingredients”.

You may think you know what direction Berry is going to pursue but she swerves,

Dr Gregory Crocetti, a nanotechnology campaigner with Friends of the Earth Australia, was scathing of the study. “To conclude that nanoparticles do not penetrate human skin based on a short-term study using excised pig skin is highly irresponsible,” he said. “This is yet another example of short-term, in-vitro research that doesn’t reflect real-life conditions like skin flexing, and the fact that penetration enhancers are used in most cosmetics. There is an urgent need for more long-term studies that actually reflect realistic conditions.”

Professor Brian Gulson, from Macquarie University in NSW, was similarly critical. The geochemist’s own study, from 2010 and in conjunction with CSIRO [Australia’s national science agency, the Commonwealth Scientific and Industrial Research Organization], found that small amounts of zinc particles in sunscreen “can pass through the protective layers of skin exposed to the sun in a real-life environment and be detected in blood and urine”.

Of the latest study he said: “Even though they used a sophisticated method of laser scanning confocal microscopy, their results only reinforced earlier studies [and had] no relevance to ‘real life’, especially to cosmetics, because they used polystyrene nanoparticles, and because they used excised (that is, ‘dead’) pig’s skin.”

I missed the fact that this study was an in vitro test, which is always less convincing than in vivo testing. In my Nov. 29, 2011 posting about some research into nano zinc oxide I mentioned in vitro vs. in vivo testing and Brian Gulson’s research,

I was able to access the study and while I’m not an expert by any means I did note that the study was ‘in vitro’, in this case, the cells were on slides when they were being studied. It’s impossible to draw hard and fast conclusions about what will happen in a body (human or otherwise) since there are other systems at work which are not present on a slide.

… here’s what Brian Gulson had to say about nano zinc oxide concentrations in his work and about a shortcoming in his study (from an Australian Broadcasting Corporation [ABC] Feb. 25, 2010 interview with Ashley Hall),

BRIAN GULSON: I guess the critical thing was that we didn’t find large amounts of it getting through the skin. The sunscreens contain 18 to 20 per cent zinc oxide usually and ours was about 20 per cent zinc. So that’s an awful lot of zinc you’re putting on the skin but we found tiny amounts in the blood of that tracer that we used.

ASHLEY HALL: So is it a significant amount?

BRIAN GULSON: No, no it’s really not.

ASHLEY HALL: But Brian Gulson is warning people who use a lot of sunscreen over an extended period that they could be at risk of having elevated levels of zinc.

BRIAN GULSON: Maybe with young children where you’re applying it seven days a week, it could be an issue but I’m more than happy to continue applying it to my grandchildren.

ASHLEY HALL: This study doesn’t shed any light on the question of whether the nano-particles themselves played a part in the zinc absorption.

BRIAN GULSON: That was the most critical thing. This isotope technique cannot tell whether or not it’s a zinc oxide nano-particle that got through skin or whether it’s just zinc that was dissolved up in contact with the skin and then forms zinc ions or so-called soluble ions. So that’s one major deficiency of our study.

Of course, I have a question about Gulson’s conclusion  that very little of the nano zinc oxide was penetrating the skin based on blood and urine samples taken over the course of the study. Is it possible that after penetrating the skin it was stored in the cells  instead of being eliminated?

It seems it’s not yet time to press the panic button since more research is needed for scientists to refine their understanding of nano zinc oxide and possible health effects from its use.

What I found most interesting in Berry’s article was the advice from the Friends of the Earth,

The contradictory claims about sunscreen can make it hard to know what to do this summer. Friends of the Earth Australia advise people to continue to be sun safe — seeking shade, wearing protective clothing, a hat and sunglasses and using broad spectrum SPF 30+ sunscreen.

This is a huge change in tone for that organization, which until now has been relentless in its anti-nanosunscreen stance. Here they advise using a sunscreen and they don’t qualify it, as they usually would, by saying you should avoid nanosunscreens. I guess after the debacle earlier this year (mentioned in this Feb. 9, 2012 posting titled: Unintended consequences: Australians not using sunscreens to avoid nanoparticles?), they have reconsidered the intensity of their campaign.

For anyone interested in some of the history of the Friends of the Earth’s campaign and the NGO (non-governmental organization) which went against the prevailing sentiment against nanosunscreens, I suggest reading Dexter’s posting in full, and for those interested in the response from Australian scientists about this latest research, do read Berry’s article.