Ouch, my brain hurts! Information overload in the neurosciences

Alcino Silva, a professor of neurobiology at the David Geffen School of Medicine at UCLA and of psychiatry at the Semel Institute for Neuroscience and Human Behavior, has been working on the information overload problem in neuroscience for almost 30 years. In Silva’s latest effort, he and his team are designing and testing research maps, from the Aug. 8, 2013 news item on ScienceDaily,

Before the digital age, neuroscientists got their information in the library like the rest of us. But the field’s explosion has created nearly 2 million papers — more data than any researcher can read and absorb in a lifetime.

That’s why a UCLA [University of California at Los Angeles] team has invented research maps. Equipped with an online app, the maps help neuroscientists quickly scan what is already known and plan their next study. The Aug. 8 edition of Neuron describes the findings.

The Aug. 8, 2013 UCLA news release written by Elaine Schmidt, which originated the news item, provides details about the team’s strategy for developing and testing this new tool,

Silva collaborated with Anthony Landreth, a former UCLA postdoctoral fellow, to create maps that offer simplified, interactive and unbiased summaries of research findings designed to help neuroscientists in choosing what to study next. As a testing ground for their maps, the team focused on findings in molecular and cellular cognition.

UCLA programmer Darin Gilbert Nee also created a Web-based app to help scientists expand and interact with their field’s map.

“We founded research maps on a crowd-sourcing strategy in which individual scientists add papers that interest them to a growing map of their fields,” said Silva, who started working on the problem nearly 30 years ago as a graduate student and who wrote, along with Landreth, an upcoming Oxford Press book on the subject. “Each map is interactive and searchable; scientists see as much of the map as they query, much like an online search.”
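Neither the news release nor the paper includes any code but, to make the crowd-sourcing idea concrete, here’s a minimal Python sketch of a map that scientists add papers to and then query, search-engine style (the class, method names, and sample entries are my own invention, not Silva and Landreth’s),

```python
# Hypothetical sketch of a crowd-sourced research map: scientists add
# papers that interest them, then query the map like an online search.
# Names and sample data are mine, not Silva and Landreth's.

class ResearchMap:
    def __init__(self):
        self.papers = []  # entries contributed by individual scientists

    def add_paper(self, title, authors, keywords, findings):
        """Crowd-sourced entry point: any scientist can add a paper."""
        self.papers.append({"title": title, "authors": authors,
                            "keywords": set(keywords), "findings": findings})

    def query(self, *terms):
        """Return only as much of the map as the query touches."""
        wanted = {t.lower() for t in terms}
        return [p for p in self.papers
                if wanted & {k.lower() for k in p["keywords"]}]

# Usage: two contributors, one query.
m = ResearchMap()
m.add_paper("CREB and memory", ["Silva"], ["CREB", "memory"], ["CREB -> LTM"])
m.add_paper("Synaptic tagging", ["Frey", "Morris"], ["LTP", "memory"],
            ["tag -> capture"])
print([p["title"] for p in m.query("memory")])  # both papers match
```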

According to Silva, the map allows scientists to zero in on areas that interest them. By tracking published findings, researchers can determine what’s missing and pinpoint worthwhile experiments to pursue.

“Just as a GPS map offers different levels of zoom, a research map would allow a scientist to survey a specific research area at different levels of resolution — from coarse summaries to fine-grained accounts of experimental results,” Silva said. “The map would display no more and no less detail than is necessary for the researcher’s purposes.”

Each map encodes information by classifying it into categories and scoring the weight of its evidence based on key criteria, such as reproducibility and “convergence” — when different experiments point to a single conclusion.
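The scoring idea can be sketched the same way. In the hypothetical fragment below, each finding carries a category and an evidence score that rises with independent replications (reproducibility) and with distinct experiment types pointing to the same conclusion (convergence); the field names and weighting formula are mine, not the paper’s,

```python
# Hypothetical evidence-weighting sketch. The category labels, fields,
# and weighting formula are illustrative, not the paper's own criteria.

def evidence_score(replications, experiment_types):
    """More independent replications (reproducibility) and more distinct
    experiment types agreeing (convergence) yield a higher weight."""
    reproducibility = replications
    convergence = len(set(experiment_types))
    return reproducibility + 2 * convergence  # assumed weighting

finding = {
    "claim": "Protein X is required for memory consolidation",
    "category": "necessity",  # vs. e.g. a sufficiency claim
    "replications": 3,
    "experiment_types": ["knockout", "pharmacology", "RNAi"],
}
finding["score"] = evidence_score(finding["replications"],
                                  finding["experiment_types"])
print(finding["score"])  # 3 + 2 * 3 = 9
```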

The team’s next step will be to automate the map-creation process. As scientists publish papers, their findings will automatically be added to the research map representing their field.

According to Silva, automation could be achieved by using journals’ existing publication process to divide an article’s findings into smaller chapters and build “nano-publications.” Publishers would use a software plug-in to render future papers machine-readable.
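Nano-publications are an existing idea in semantic publishing, essentially one small machine-readable record per finding. A rough sketch of what splitting an article apart might look like (the field names and sample findings are my assumptions, not any publisher’s actual format),

```python
# Hypothetical sketch of "nano-publications": one small machine-readable
# record per finding, split out of a full article. Field names are assumed.

import json

def to_nanopubs(article):
    """Emit one machine-readable record per finding in an article."""
    return [{"source_doi": article["doi"],
             "authors": article["authors"],
             "finding": finding}
            for finding in article["findings"]]

article = {
    "doi": "10.1016/j.neuron.2013.07.024",  # the Landreth & Silva paper
    "authors": ["Landreth", "Silva"],
    "findings": ["Research maps aid experiment planning",
                 "Published findings need navigable summaries"],
}
for pub in to_nanopubs(article):
    print(json.dumps(pub))  # each line is one machine-readable finding
```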

Here’s a link to and a citation for the published paper,

The Need for Research Maps to Navigate Published Work and Inform Experiment Planning by Anthony Landreth and Alcino J. Silva. Neuron, Volume 79, Issue 3, 411-415, 7 August 2013. doi: 10.1016/j.neuron.2013.07.024

Copyright © 2013 Elsevier Inc. All rights reserved.

I have provided a link to the HTML (with thumbnail images) version of the paper, which appears to be open access (at least for now). I found this paper to be quite readable; from the Introduction,

The amount of published research in neuroscience has grown to be massive. The past three decades have accumulated more than 1.6 million articles alone. The rapid expansion of the published record has been accompanied by an unprecedented widening of the range of concepts, approaches, and techniques that individual neuroscientists are expected to be familiar with. The cutting edge of neuroscience is increasingly defined by studies demanding researchers in one area (e.g., molecular and cellular neuroscience) to have more than a passing familiarity with the tools, concepts, and literature of other areas (e.g., systems or behavioral neuroscience). [emphasis mine] As research relevant to a topic expands, it becomes increasingly more likely that researchers will be either overwhelmed or unaware of relevant results (or both).

Interestingly, neither author nor any of the other team members mentioned (in addition to Nee, there’s John Bickle, who isn’t named in the news release but who co-wrote the forthcoming book with Silva and Landreth) seems to have any background in library or archival science, information architecture, or records management, all fields where people deal with massive amounts of information and accessibility issues. For example, the Human-Computer Interaction Laboratory at the University of Maryland has developed a data visualization tool (Action Science Explorer; my Dec. 9, 2011 posting profiles this project) to address some very similar issues to those faced in the neuroscience community.

Action Science Explorer (data visualization tool)

There’s a lot of data being generated, and we need to find new ways to manage and navigate through it. The Dec. 8, 2011 news item by Ellen Ferrante and Lisa-Joy Zgorski on physorg.com describes a data visualization tool designed by the Human-Computer Interaction Laboratory (HCIL) at the University of Maryland,

The National Science Foundation (NSF)-funded Action Science Explorer (ASE) allows users to simultaneously search through thousands of academic papers, using a visualization method that determines how papers are connected, for instance, by topic, date, authors, etc. The goal is to use these connections to identify emerging scientific trends and advances.

“We are creating an early warning system for scientific breakthroughs,” said Ben Shneiderman, a professor at the University of Maryland (UM) and founding director of the UM Human-Computer Interaction Lab.

“Such a system would dramatically improve the capability of academic researchers, government program managers and industry analysts to understand emerging scientific topics so as to recognize breakthroughs, controversies and centers of activity,” said Shneiderman. “This would enable appropriate allocation of funds, encourage new collaborations among groups that unknowingly were working on similar topics and accelerate research progress.”
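ASE’s pipeline is far more sophisticated, but the basic move the news item describes, linking papers by shared attributes such as authors or topics, can be sketched in a few lines of Python (the linking rule and sample data below are my simplification, not ASE’s actual method),

```python
# Simplified sketch of linking papers by shared authors or topics, the
# kind of connection ASE visualizes. This is not ASE's actual method.

from itertools import combinations

papers = [
    {"id": "A", "authors": {"Shneiderman", "Dunne"}, "topics": {"networks"}},
    {"id": "B", "authors": {"Dunne", "Gove"}, "topics": {"citations"}},
    {"id": "C", "authors": {"Dorr"}, "topics": {"networks"}},
]

edges = []
for p, q in combinations(papers, 2):
    if p["authors"] & q["authors"]:    # co-authorship link
        edges.append((p["id"], q["id"], "shared author"))
    elif p["topics"] & q["topics"]:    # topical link
        edges.append((p["id"], q["id"], "shared topic"))

print(edges)  # [('A', 'B', 'shared author'), ('A', 'C', 'shared topic')]
```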

I went to the HCIL website to find out more about the ASE project here, where I also located a screen capture of the graphical interface,

A large-screen window layout of the overall interface of ASE. Credit: Cody Dunne, Robert Gove, Ben Shneiderman, Bonnie Dorr and Judith Klavans. University of Maryland

There’s also a video explaining some aspects of ASE,


For those who can’t get enough data, there’s a technical report here.

I expect we will be seeing more of these kinds of tools, and not just for science research. There was this April 6, 2011 news item by Aaron Dubrow on physorg.com describing the US National Archives and Records Administration’s (NARA) new data visualization tools,

At the end of President George W. Bush’s administration in 2009, NARA received roughly 35 times the amount of data as previously received from the administration of President Bill Clinton, which itself was many times that of the previous administration. With the federal government increasingly using social media, cloud computing and other technologies to contribute to open government, this trend is not likely to decline. By 2014, NARA is expecting to accumulate more than 35 petabytes (quadrillions of bytes) of data in the form of electronic records.

“The National Archives is a unique national institution that responds to requirements for preservation, access and the continued use of government records,” said Robert Chadduck, acting director for the National Archives Center for Advanced Systems and Technologies.

After consulting with NARA about its needs, members of TACC’s [Texas Advanced Computing Center] Data and Information Analysis group developed a multi-pronged approach that combines different data analysis methods into a visualization framework. The visualizations act as a bridge between the archivist and the data by interactively rendering information as shapes and colors to facilitate an understanding of the archive’s structure and content.
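To give a sense of what rendering information as shapes and colors can mean in practice, here’s a deliberately simple, hypothetical sketch that draws the relative sizes of parts of an archive as colored bars; the counts are invented and the real TACC framework is interactive and far richer than this,

```python
# Hypothetical sketch in the spirit of rendering archive structure as
# shapes and colors: relative record counts drawn as colored bars.
# The record groups and counts are invented, purely for illustration.

import matplotlib.pyplot as plt

record_groups = {"email": 120_000, "photos": 45_000,
                 "documents": 300_000, "web pages": 80_000}

fig, ax = plt.subplots()
ax.bar(list(record_groups), list(record_groups.values()),
       color=["tab:blue", "tab:orange", "tab:green", "tab:red"])
ax.set_ylabel("records")
ax.set_title("Archive structure at a glance (illustrative data)")
plt.show()
```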

I’d best get ready to develop new literacy skills as these data visualization tools come into play.