The Fred Hutchinson Cancer Research Center in Seattle, Washington, has announced a proposed cancer treatment using nanoparticle-programmed T cells, according to an April 12, 2017 news release (received via email; also on EurekAlert). Note: A link has been removed,
Researchers at Fred Hutchinson Cancer Research Center have developed biodegradable nanoparticles that can be used to genetically program immune cells to recognize and destroy cancer cells — while the immune cells are still inside the body.
In a proof-of-principle study to be published April 17 in Nature Nanotechnology, the team showed that nanoparticle-programmed immune cells, known as T cells, can rapidly clear or slow the progression of leukemia in a mouse model.
“Our technology is the first that we know of to quickly program tumor-recognizing capabilities into T cells without extracting them for laboratory manipulation,” said Fred Hutch’s Dr. Matthias Stephan, the study’s senior author. “The reprogrammed cells begin to work within 24 to 48 hours and continue to produce these receptors for weeks. This suggests that our technology has the potential to allow the immune system to quickly mount a strong enough response to destroy cancerous cells before the disease becomes fatal.”
Cellular immunotherapies have shown promise in clinical trials, but challenges remain to making them more widely available and to being able to deploy them quickly. At present, it typically takes a couple of weeks to prepare these treatments: the T cells must be removed from the patient and genetically engineered and grown in special cell processing facilities before they are infused back into the patient. These new nanoparticles could eliminate the need for such expensive and time-consuming steps.
Although his T-cell programming method is still several steps away from the clinic, Stephan imagines a future in which nanoparticles transform cell-based immunotherapies — whether for cancer or infectious disease — into an easily administered, off-the-shelf treatment that’s available anywhere.
“I’ve never had cancer, but if I did get a cancer diagnosis I would want to start treatment right away,” Stephan said. “I want to make cellular immunotherapy a treatment option the day of diagnosis and have it able to be done in an outpatient setting near where people live.”
The body as a genetic engineering lab
Stephan created his T-cell homing nanoparticles as a way to bring the power of cellular cancer immunotherapy to more people.
In his method, the laborious, time-consuming T-cell programming steps all take place within the body, creating a potential army of “serial killers” within days.
As reported in the new study, Stephan and his team developed biodegradable nanoparticles that turned T cells into CAR T cells, a particular type of cellular immunotherapy that has delivered promising results against leukemia in clinical trials.
The researchers designed the nanoparticles to carry genes that encode for chimeric antigen receptors, or CARs, that target and eliminate cancer. They also tagged the nanoparticles with molecules that make them stick like burrs to T cells, which engulf the nanoparticles. The cell’s internal traffic system then directs the nanoparticle to the nucleus, and it dissolves.
The study provides proof-of-principle that the nanoparticles can educate the immune system to target cancer cells. Stephan and his team designed the new CAR genes to integrate into chromosomes housed in the nucleus, making it possible for T cells to begin decoding the new genes and producing CARs within just one or two days.
Once the team determined that their CAR-carrying nanoparticles reprogrammed a noticeable percentage of T cells, they tested their efficacy. Using a preclinical mouse model of leukemia, Stephan and his colleagues compared their nanoparticle-programming strategy against chemotherapy followed by an infusion of T cells programmed in the lab to express CARs, which mimics current CAR-T-cell therapy.
The nanoparticle-programmed CAR-T cells held their own against the infused CAR-T cells. Treatment with either the nanoparticles or the infused CAR-T cells extended average survival to 58 days, up from a median survival of about two weeks.
The study was funded by Fred Hutch’s Immunotherapy Initiative, the Leukemia & Lymphoma Society, the Phi Beta Psi Sorority, the National Science Foundation and the National Cancer Institute.
Next steps and other applications
Stephan’s nanoparticles still have to clear several hurdles before they get close to human trials. He’s pursuing new strategies to make the gene-delivery-and-expression system safe in people and working with companies that have the capacity to produce clinical-grade nanoparticles. Additionally, Stephan has turned his sights to treating solid tumors and is collaborating to this end with several research groups at Fred Hutch.
And, he said, immunotherapy may be just the beginning. In theory, nanoparticles could be modified to serve the needs of patients whose immune systems need a boost, but who cannot wait for several months for a conventional vaccine to kick in.
“We hope that this can be used for infectious diseases like hepatitis or HIV,” Stephan said. This method may be a way to “provide patients with receptors they don’t have in their own body,” he explained. “You just need a tiny number of programmed T cells to protect against a virus.”
This artificial synapse is apparently an improvement on the standard memristor-based artificial synapse, but that doesn’t become clear until you read the abstract for the paper. First, there’s a Feb. 20, 2017 Stanford University news release by Taylor Kubota (dated Feb. 21, 2017 on EurekAlert). Note: Links have been removed,
For all the improvements in computer technology over the years, we still struggle to recreate the low-energy, elegant processing of the human brain. Now, researchers at Stanford University and Sandia National Laboratories have made an advance that could help computers mimic one piece of the brain’s efficient design – an artificial version of the space over which neurons communicate, called a synapse.
“It works like a real synapse but it’s an organic electronic device that can be engineered,” said Alberto Salleo, associate professor of materials science and engineering at Stanford and senior author of the paper. “It’s an entirely new family of devices because this type of architecture has not been shown before. For many key metrics, it also performs better than anything that’s been done before with inorganics.”
The new artificial synapse, reported in the Feb. 20 issue of Nature Materials, mimics the way synapses in the brain learn through the signals that cross them. This is a significant energy savings over traditional computing, which involves separately processing information and then storing it into memory. Here, the processing creates the memory.
This synapse may one day be part of a more brain-like computer, which could be especially beneficial for computing that works with visual and auditory signals. Examples of this are seen in voice-controlled interfaces and driverless cars. Past efforts in this field have produced high-performance neural networks supported by artificially intelligent algorithms but these are still distant imitators of the brain that depend on energy-consuming traditional computer hardware.
Building a brain
When we learn, electrical signals are sent between neurons in our brain. The most energy is needed the first time a synapse is traversed. Every time afterward, the connection requires less energy. This is how synapses efficiently facilitate both learning something new and remembering what we’ve learned. The artificial synapse, unlike most other versions of brain-like computing, also fulfills these two tasks simultaneously, and does so with substantial energy savings.
“Deep learning algorithms are very powerful but they rely on processors to calculate and simulate the electrical states and store them somewhere else, which is inefficient in terms of energy and time,” said Yoeri van de Burgt, former postdoctoral scholar in the Salleo lab and lead author of the paper. “Instead of simulating a neural network, our work is trying to make a neural network.”
The artificial synapse is based on a battery design. It consists of two thin, flexible films with three terminals, connected by an electrolyte of salty water. The device works as a transistor, with one of the terminals controlling the flow of electricity between the other two.
Like a neural path in a brain being reinforced through learning, the researchers program the artificial synapse by discharging and recharging it repeatedly. Through this training, they have been able to predict, to within 1 percent uncertainty, the voltage required to get the synapse to a specific electrical state and, once there, it remains at that state. In other words, unlike a common computer, where you save your work to the hard drive before you turn it off, the artificial synapse can recall its programming without any additional actions or parts.
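To make that training loop concrete, here is a minimal sketch in Python. Everything in it (the function name, pulse size, and starting state) is my own illustrative assumption; the real device is set electrochemically, not by software, but the logic mirrors the description above: pulse toward a target state until the error falls within 1 percent, after which the state simply persists.

```python
def program_synapse(target, state=0.0, pulse=0.001, tol=0.01):
    """Nudge the synapse's state toward `target` with fixed-size
    charge/discharge pulses until it is within `tol` (1 percent)
    of the target. Returns the final state and pulse count."""
    pulses = 0
    while abs(state - target) > tol * abs(target):
        state += pulse if state < target else -pulse
        pulses += 1
    return state, pulses

final_state, n_pulses = program_synapse(0.5)
# The device then holds final_state with no refresh or save step,
# unlike volatile computer memory.
```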
Testing a network of artificial synapses
Only one artificial synapse has been produced, but researchers at Sandia used 15,000 measurements from experiments on that synapse to simulate how an array of them would work in a neural network. They tested the simulated network’s ability to recognize handwriting of digits 0 through 9. Tested on three datasets, the simulated array was able to identify the handwritten digits with an accuracy of between 93 and 97 percent.
Although this task would be relatively simple for a person, traditional computers have a difficult time interpreting visual and auditory signals.
“More and more, the kinds of tasks that we expect our computing devices to do require computing that mimics the brain because using traditional computing to perform these tasks is becoming really power hungry,” said A. Alec Talin, distinguished member of technical staff at Sandia National Laboratories in Livermore, California, and senior author of the paper. “We’ve demonstrated a device that’s ideal for running these types of algorithms and that consumes a lot less power.”
This device is extremely well suited for the kind of signal identification and classification that traditional computers struggle to perform. Whereas digital transistors can be in only two states, such as 0 and 1, the researchers successfully programmed 500 states in the artificial synapse, which is useful for neuron-type computation models. In switching from one state to another they used about one-tenth as much energy as a state-of-the-art computing system needs in order to move data from the processing unit to the memory.
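For a rough feel of what 500 analog states mean relative to a binary transistor, here is an illustrative Python sketch. The 500-state count comes from the article; the weight range and the idea of mapping states onto quantized neural-network weights are my own assumptions, not the researchers' implementation.

```python
import math

N_STATES = 500  # distinct non-volatile states programmed into the device

def quantize(w, w_min=-1.0, w_max=1.0, n_states=N_STATES):
    """Snap a continuous neural-network weight onto the nearest of
    n_states evenly spaced levels, as an array of these synapses
    might store it."""
    w = min(max(w, w_min), w_max)          # clip to the storable range
    step = (w_max - w_min) / (n_states - 1)
    return w_min + round((w - w_min) / step) * step

# Information per device: a binary transistor stores 1 bit;
# 500 distinguishable states store log2(500), close to 9 bits.
bits_per_device = math.log2(N_STATES)
```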
This, however, means they are still using about 10,000 times as much energy as the minimum a biological synapse needs in order to fire. The researchers are hopeful that they can attain neuron-level energy efficiency once they test the artificial synapse in smaller devices.
Every part of the device is made of inexpensive organic materials. These aren’t found in nature but they are largely composed of hydrogen and carbon and are compatible with the brain’s chemistry. Cells have been grown on these materials and they have even been used to make artificial pumps for neural transmitters. The voltages applied to train the artificial synapse are also the same as those that move through human neurons.
All this means it’s possible that the artificial synapse could communicate with live neurons, leading to improved brain-machine interfaces. The softness and flexibility of the device also lends itself to being used in biological environments. Before any applications to biology, however, the team plans to build an actual array of artificial synapses for further research and testing.
Additional Stanford co-authors of this work include co-lead author Ewout Lubberman, also of the University of Groningen in the Netherlands, Scott T. Keene and Grégorio C. Faria, also of Universidade de São Paulo, in Brazil. Sandia National Laboratories co-authors include Elliot J. Fuller and Sapan Agarwal in Livermore and Matthew J. Marinella in Albuquerque, New Mexico. Salleo is an affiliate of the Stanford Precourt Institute for Energy and the Stanford Neurosciences Institute. Van de Burgt is now an assistant professor in microsystems and an affiliate of the Institute for Complex Molecular Studies (ICMS) at Eindhoven University of Technology in the Netherlands.
This research was funded by the National Science Foundation, the Keck Faculty Scholar Funds, the Neurofab at Stanford, the Stanford Graduate Fellowship, Sandia’s Laboratory-Directed Research and Development Program, the U.S. Department of Energy, the Holland Scholarship, the University of Groningen Scholarship for Excellent Students, the Hendrik Muller National Fund, the Schuurman Schimmel-van Outeren Foundation, the Foundation of Renswoude (The Hague and Delft), the Marco Polo Fund, the Instituto Nacional de Ciência e Tecnologia/Instituto Nacional de Eletrônica Orgânica in Brazil, the Fundação de Amparo à Pesquisa do Estado de São Paulo and the Brazilian National Council.
Here’s an abstract for the researchers’ paper (link to paper provided after abstract) and it’s where you’ll find the memristor connection explained,
The brain is capable of massively parallel information processing while consuming only ~1–100 fJ per synaptic event [1, 2]. Inspired by the efficiency of the brain, CMOS-based neural architectures [3] and memristors [4, 5] are being developed for pattern recognition and machine learning. However, the volatility, design complexity and high supply voltages for CMOS architectures, and the stochastic and energy-costly switching of memristors complicate the path to achieve the interconnectivity, information density, and energy efficiency of the brain using either approach. Here we describe an electrochemical neuromorphic organic device (ENODe) operating with a fundamentally different mechanism from existing memristors. ENODe switches at low voltage and energy (<10 pJ for 10³ μm² devices), displays >500 distinct, non-volatile conductance states within a ~1 V range, and achieves high classification accuracy when implemented in neural network simulations. Plastic ENODes are also fabricated on flexible substrates enabling the integration of neuromorphic functionality in stretchable electronic systems [6, 7]. Mechanical flexibility makes ENODes compatible with three-dimensional architectures, opening a path towards extreme interconnectivity comparable to the human brain.
Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel/Berkeley Lab
The image of the iron-platinum nanoparticle (referenced in the headline) reminds me of foetal ultrasound images. A Feb. 1, 2017 news item on ScienceDaily tells us more,
In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA [University of California at Los Angeles] physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.
The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level.
Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.
“No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.
Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.
By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.
“For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.
The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.
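As a toy illustration of the bookkeeping behind per-grain composition (the data layout and function here are hypothetical, not the authors' analysis code), tallying each grain's iron-to-platinum ratio from a labeled atom list might look like this:

```python
from collections import Counter

def grain_composition(atoms):
    """atoms: iterable of (grain_id, species) pairs with species
    'Fe' or 'Pt'. Returns a dict mapping each grain to a Counter
    of its elemental makeup."""
    composition = {}
    for grain_id, species in atoms:
        composition.setdefault(grain_id, Counter())[species] += 1
    return composition

demo = [(1, "Fe"), (1, "Pt"), (1, "Pt"), (2, "Fe"), (2, "Pt")]
comp = grain_composition(demo)
# Grain 1 is Pt-rich (2 Pt : 1 Fe); grain 2 is an even split.
```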
“Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”
The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.
“This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.
In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.
Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.
That means that radiation-sensitive objects can be imaged with lower doses of radiation.
The US Dept. of Energy (DOE) Lawrence Berkeley National Laboratory issued its own Feb. 1, 2017 news release (also on EurekAlert) about the work, with a focus on how their equipment made this breakthrough possible (it repeats a little of the info. from the UCLA news release),
Scientists used one of the world’s most powerful electron microscopes to map the precise location and chemical type of 23,000 atoms in an extremely small particle made of iron and platinum.
The 3-D reconstruction reveals the arrangement of atoms in unprecedented detail, enabling the scientists to measure chemical order and disorder in individual grains, which sheds light on the material’s properties at the single-atom level. Insights gained from the particle’s structure could lead to new ways to improve its magnetic performance for use in high-density, next-generation hard drives.
What’s more, the technique used to create the reconstruction, atomic electron tomography (which is like an incredibly high-resolution CT scan), lays the foundation for precisely mapping the atomic composition of other useful nanoparticles. This could reveal how to optimize the particles for more efficient catalysts, stronger materials, and disease-detecting fluorescent tags.
Microscopy data was obtained and analyzed by scientists from the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) at the Molecular Foundry, in collaboration with Foundry users from UCLA, Oak Ridge National Laboratory, and the United Kingdom’s University of Birmingham. …
Atoms are the building blocks of matter, and the patterns in which they’re arranged dictate a material’s properties. These patterns can also be exploited to greatly improve a material’s function, which is why scientists are eager to determine the 3-D structure of nanoparticles at the smallest scale possible.
“Our research is a big step in this direction. We can now take a snapshot that shows the positions of all the atoms in a nanoparticle at a specific point in its growth. This will help us learn how nanoparticles grow atom by atom, and it sets the stage for a materials-design approach starting from the smallest building blocks,” says Mary Scott, who conducted the research while she was a Foundry user, and who is now a staff scientist. Scott and fellow Foundry scientists Peter Ercius and Colin Ophus developed the method in close collaboration with Jianwei Miao, a UCLA professor of physics and astronomy.
Their nanoparticle reconstruction builds on an achievement they reported last year in which they measured the coordinates of more than 3,000 atoms in a tungsten needle to a precision of 19 trillionths of a meter (19 picometers), which is many times smaller than a hydrogen atom. Now, they’ve taken the same precision, added the ability to distinguish different elements, and scaled up the reconstruction to include tens of thousands of atoms.
Importantly, their method maps the position of each atom in a single, unique nanoparticle. In contrast, X-ray crystallography and cryo-electron microscopy plot the average position of atoms from many identical samples. These methods make assumptions about the arrangement of atoms, which isn’t a good fit for nanoparticles because no two are alike.
“We need to determine the location and type of each atom to truly understand how a nanoparticle functions at the atomic scale,” says Ercius.
A TEAM approach
The scientists’ latest accomplishment hinged on the use of one of the highest-resolution transmission electron microscopes in the world, called TEAM I. It’s located at the National Center for Electron Microscopy, which is a Molecular Foundry facility. The microscope scans a sample with a focused beam of electrons, and then measures how the electrons interact with the atoms in the sample. It also has a piezo-controlled stage that positions samples with unmatched stability and position-control accuracy.
The researchers began growing an iron-platinum nanoparticle from its constituent elements, and then stopped the particle’s growth before it was fully formed. They placed the “partially baked” particle in the TEAM I stage, obtained a 2-D projection of its atomic structure, rotated it a few degrees, obtained another projection, and so on. Each 2-D projection provides a little more information about the full 3-D structure of the nanoparticle.
They sent the projections to Miao at UCLA, who used a sophisticated computer algorithm to convert the 2-D projections into a 3-D reconstruction of the particle. The individual atomic coordinates and chemical types were then traced from the 3-D density based on the knowledge that iron atoms are lighter than platinum atoms. The resulting atomic structure contains 6,569 iron atoms and 16,627 platinum atoms, with each atom’s coordinates precisely plotted to less than the width of a hydrogen atom.
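A hedged sketch of the tracing idea in Python: platinum (Z = 78) scatters electrons more strongly than iron (Z = 26), so platinum sites appear as brighter peaks in the reconstructed density. The function and the 0.6 threshold below are my own illustrative assumptions; the study's actual tracing procedure is far more sophisticated.

```python
def classify_peaks(peaks, threshold=0.6):
    """peaks: list of (x, y, z, intensity) tuples with intensities
    normalized to [0, 1]. Brighter peaks are assigned to the
    heavier, more strongly scattering element (Pt)."""
    return [(x, y, z, "Pt" if intensity >= threshold else "Fe")
            for x, y, z, intensity in peaks]

demo = [(0.0, 0.0, 0.0, 0.9), (1.2, 0.4, 0.0, 0.3)]
traced = classify_peaks(demo)
# The bright peak is labeled Pt, the dim one Fe.
```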
Translating the data into scientific insights
Interesting features emerged at this extreme scale after Molecular Foundry scientists used code they developed to analyze the atomic structure. For example, the analysis revealed chemical order and disorder in interlocking grains, in which the iron and platinum atoms are arranged in different patterns. This has large implications for how the particle grew and its real-world magnetic properties. The analysis also revealed single-atom defects and the width of disordered boundaries between grains, which was not previously possible in complex 3-D boundaries.
“The important materials science problem we are tackling is how this material transforms from a highly randomized structure, what we call a chemically-disordered structure, into a regular highly-ordered structure with the desired magnetic properties,” says Ophus.
To explore how the various arrangements of atoms affect the nanoparticle’s magnetic properties, scientists from DOE’s Oak Ridge National Laboratory ran computer calculations on the Titan supercomputer at ORNL, using the coordinates and chemical type of each atom, to simulate the nanoparticle’s behavior in a magnetic field. This allowed the scientists to see patterns of atoms that are very magnetic, which is ideal for hard drives. They also saw patterns with poor magnetic properties that could sap a hard drive’s performance.
“This could help scientists learn how to steer the growth of iron-platinum nanoparticles so they develop more highly magnetic patterns of atoms,” says Ercius.
Adds Scott, “More broadly, the imaging technique will shed light on the nucleation and growth of ordered phases within nanoparticles, which isn’t fully theoretically understood but is critically important to several scientific disciplines and technologies.”
The folks at the Berkeley Lab have created a video (notice where the still image from the beginning of this post appears),
The Oak Ridge National Laboratory (ORNL), not wanting to be left out, has been mentioned in a Feb. 3, 2017 news item on ScienceDaily,
… researchers working with magnetic nanoparticles at the University of California, Los Angeles (UCLA), and the US Department of Energy’s (DOE’s) Lawrence Berkeley National Laboratory (Berkeley Lab) approached computational scientists at DOE’s Oak Ridge National Laboratory (ORNL) to help solve a unique problem: to model magnetism at the atomic level using experimental data from a real nanoparticle.
“These types of calculations have been done for ideal particles with ideal crystal structures but not for real particles,” said Markus Eisenbach, a computational scientist at the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.
Eisenbach develops quantum mechanical electronic structure simulations that predict magnetic properties in materials. Working with Paul Kent, a computational materials scientist at ORNL’s Center for Nanophase Materials Sciences, the team collaborated with researchers at UCLA and Berkeley Lab’s Molecular Foundry to combine world-class experimental data with world-class computing to do something new–simulate magnetism atom by atom in a real nanoparticle.
Using the new data from the research teams on the West Coast, Eisenbach and Kent were able to precisely model the measured atomic structure, including defects, from a unique iron-platinum (FePt) nanoparticle and simulate its magnetic properties on the 27-petaflop Titan supercomputer at the OLCF.
Electronic structure codes take atomic and chemical structure and solve for the corresponding magnetic properties. However, these structures are typically derived from many 2-D electron microscopy or x-ray crystallography images averaged together, resulting in a representative, but not true, 3-D structure.
“In this case, researchers were able to get the precise 3-D structure for a real particle,” Eisenbach said. “The UCLA group has developed a new experimental technique where they can tell where the atoms are–the coordinates–and the chemical resolution, or what they are — iron or platinum.”
The ORNL news release goes on to describe the work from the perspective of the people who ran the supercomputer simulations,
A Supercomputing Milestone
Magnetism at the atomic level is driven by quantum mechanics — a fact that has shaken up classical physics calculations and called for increasingly complex, first-principle calculations, or calculations working forward from fundamental physics equations rather than relying on assumptions that reduce computational workload.
For magnetic recording and storage devices, researchers are particularly interested in magnetic anisotropy, or what direction magnetism favors in an atom.
“If the anisotropy is too weak, a bit written to the nanoparticle might flip at room temperature,” Kent said.
To solve for magnetic anisotropy, Eisenbach and Kent used two computational codes to compare and validate results.
To simulate a supercell of about 1,300 atoms from strongly magnetic regions of the 23,000-atom nanoparticle, they used the Linear Scaling Multiple Scattering (LSMS) code, a first-principles density functional theory code developed at ORNL.
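To suggest how a roughly 1,300-atom supercell might be carved out of the full 23,000-atom coordinate list before a first-principles run, here is an illustrative sketch with made-up names and a simple cubic cut; the study's actual criterion (selecting strongly magnetic regions) is the authors' own.

```python
def extract_supercell(atoms, center, half_width):
    """atoms: list of (x, y, z, species) tuples. Keep only atoms
    inside a cube of side 2 * half_width centered on `center`."""
    cx, cy, cz = center
    return [a for a in atoms
            if abs(a[0] - cx) <= half_width
            and abs(a[1] - cy) <= half_width
            and abs(a[2] - cz) <= half_width]

demo = [(0.0, 0.0, 0.0, "Fe"), (5.0, 0.0, 0.0, "Pt")]
cell = extract_supercell(demo, center=(0.0, 0.0, 0.0), half_width=1.0)
# Only the atom at the origin falls inside the cube.
```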
“The LSMS code was developed for large magnetic systems and can tackle lots of atoms,” Kent said.
As principal investigator on 2017, 2016, and previous INCITE program awards, Eisenbach has scaled the LSMS code to Titan for a range of magnetic materials projects, and the in-house code has been optimized for Titan’s accelerated architecture, speeding up calculations more than 8 times on the machine’s GPUs. Exceptionally capable of crunching large magnetic systems quickly, the LSMS code received an Association for Computing Machinery Gordon Bell Prize in high-performance computing achievement in 1998 and 2009, and developments continue to enhance the code for new architectures.
Working with Renat Sabirianov at the University of Nebraska at Omaha, the team also ran VASP, a simulation package that is better suited for smaller atom counts, to simulate regions of about 32 atoms.
“With both approaches, we were able to confirm that the local VASP results were consistent with the LSMS results, so we have a high confidence in the simulations,” Eisenbach said.
Computer simulations revealed that grain boundaries have a strong effect on magnetism. “We found that the magnetic anisotropy energy suddenly transitions at the grain boundaries. These magnetic properties are very important,” Miao said.
In the future, researchers hope that advances in computing and simulation will make a full-particle simulation possible — as first-principles calculations are currently too intensive to solve small-scale magnetism for regions larger than a few thousand atoms.
Also, future simulations like these could show how different fabrication processes, such as the temperature at which nanoparticles are formed, influence magnetism and performance.
“There’s a hope going forward that one would be able to use these techniques to look at nanoparticle growth and understand how to optimize growth for performance,” Kent said.
Finally, here’s a link to and a citation for the paper,
A Jan. 31, 2017 news item on ScienceDaily announces a new report from the US National Science Foundation’s (NSF) National Center for Science and Engineering Statistics (NCSES),
The National Center for Science and Engineering Statistics (NCSES) today [Jan. 31, 2017] announced the release of the 2017 Women, Minorities, and Persons with Disabilities in Science and Engineering (WMPD) report, the federal government’s most comprehensive look at the participation of these three demographic groups in science and engineering education and employment.
The report shows the degree to which women, people with disabilities and minorities from three racial and ethnic groups — black, Hispanic and American Indian or Alaska Native — are underrepresented in science and engineering (S&E). Women have reached parity with men in educational attainment but not in S&E employment. Underrepresented minorities account for disproportionately smaller percentages in both S&E education and employment.
Congress mandated the biennial report in the Science and Engineering Equal Opportunities Act as part of the National Science Foundation’s (NSF) mission to encourage and strengthen the participation of underrepresented groups in S&E.
A Jan. 31, 2017 NSF news release (also on EurekAlert), which originated the news item, explains why the report is issued every two years and offers highlights from the 2017 edition,
“An important part of fulfilling our mission to further the progress of science is producing current, accurate information about the U.S. STEM workforce,” said NSF Director France Córdova. “This report is a valuable resource to the science and engineering policy community.”
NSF maintains a portfolio of programs aimed at broadening participation in S&E, including ADVANCE: Increasing the Participation and Advancement of Women in Academic Science and Engineering Careers; LSAMP: the Louis Stokes Alliances for Minority Participation; and NSF INCLUDES, which focuses on building networks that can scale up proven approaches to broadening participation.
The digest provides highlights and analysis in five topic areas: enrollment, field of degree, occupation, employment status and early career doctorate holders. That last topic area includes analysis of pilot study data from the Early Career Doctorates Survey, a new NCSES product. NCSES also maintains expansive WMPD data tables, updated periodically as new data become available, which present the latest S&E education and workforce data available from NCSES and other agencies. The tables provide the public access to detailed, field-by-field information that includes both percentages and the actual numbers of people involved in S&E.
“WMPD is more than just a single report or presentation,” said NCSES Director John Gawalt. “It is a vast and unique information resource, carefully curated and maintained, that allows anyone (from the general public to highly trained researchers) ready access to data that facilitate and support their own exploration and analyses.”
Key findings from the new digest include:
The types of schools where students enroll vary among racial and ethnic groups. Hispanics, American Indians or Alaska Natives and Native Hawaiians or Other Pacific Islanders are more likely to enroll in community colleges. Blacks and Native Hawaiians or Other Pacific Islanders are more likely to enroll in private, for-profit schools.
Since the late 1990s, women have earned about half of S&E bachelor’s degrees. But their representation varies widely by field, ranging from 70 percent in psychology to 18 percent in computer sciences.
At every level — bachelor’s, master’s and doctorate — underrepresented minority women earn a higher proportion of degrees than their male counterparts. White women, in contrast, earn a smaller proportion of degrees than their male counterparts.
Despite two decades of progress, a wide gap in educational attainment remains between underrepresented minorities and whites and Asians, two groups that have higher representation in S&E education than they do in the U.S. population.
White men constitute about one-third of the overall U.S. population, yet they comprise half of the S&E workforce. Blacks, Hispanics and people with disabilities are underrepresented in the S&E workforce.
Women’s participation in the workforce varies greatly by field of occupation.
In 2015, scientists and engineers had a lower unemployment rate than the general U.S. population (3.3 percent versus 5.8 percent), although the rate varied among groups. For example, it was 2.8 percent among white women in S&E but 6.0 percent for underrepresented minority women.
For more information, including access to the digest and data tables, see the updated WMPD website.
Caption: In 2015, women and some minority groups were represented in science and engineering (S&E) occupations at lower rates than in the US general population. Credit: NSF
I hinted in a Jan. 27, 2017 posting (scroll down about 15% of the way) that advice from Canadians with regard to an ‘American war on science’ might not be such a good idea. It seems that John Dupuis (mentioned in the Jan. 27, 2017 posting) has yet more advice for our neighbours to the south in his Feb. 5, 2017 posting (on the Confessions of a Science Librarian blog; Note: A link has been removed),
My advice? Don’t bring a test tube to a Bunsen burner fight. Mobilize, protest, form partnerships, write op-eds and blog posts and books and articles, speak about science at every public event you get a chance, run for office, help out someone who’s a science supporter run for office.
Don’t want your science to be seen as political or for your “objectivity” to be compromised? Too late, the other side made it political while you weren’t looking. And you’re the only one that thinks you’re objective. What difference will it make?
Don’t worry about changing the other side’s mind. Worry about mobilizing and energizing your side so they’ll turn out to protest and vote and send letters and all those other good things.
Worried that you will ruin your reputation and that when the good guys come back into power your “objectivity” will be forever compromised? Worry first about getting the good guys back in power. They will understand what you went through and why you had to mobilize. And they never thought you were “objective” to begin with.
Proof? The Canadian experience. After all, even the Guardian wants to talk about “How science helped to swing the Canadian election.” Two or four years from now, you want them to be writing articles about how science swung the US mid-term or presidential elections.
Dupuis goes on to offer a good set of links to articles about the Canadian experience written for media outlets from across the world.
The thing is, Stephen Harper is not Donald Trump. So, all this Canadian experience may not be as helpful as we or our neighbours to the south might like.
This Feb. 6, 2017 article by Daniel Engber for Slate.com gives a perspective that I think has been missed in this ‘Canadian’ discussion about the latest US ‘war on science’ (Note: Links have been removed),
An army of advocates for science will march on Washington, D.C. on April 22, according to a press release out last Thursday. The show of force aims to “draw attention to dangerous trends in the politicization of science,” the organizers say, citing “threats to the scientific community” and the need to “safeguard” researchers from a menacing regime. If Donald Trump plans to escalate his apparent assault on scientific values, then let him be on notice: Science will fight back.
We’ve been through this before. Casting opposition to a sitting president as resistance to a “war on science” likely helped progressives 10 or 15 years ago, when George W. Bush alienated voters with his apparent disrespect for climate science and embryonic stem-cell research (among other fields of study). The Bush administration’s meddling in research and disregard for expertise turned out to be a weakness, as the historian Daniel Sarewitz described in an insightful essay from 2009. Who could really argue with the free pursuit of knowledge? Democratic challengers made a weapon of their support for scientific progress: “Americans deserve a president who believes in science,” said John Kerry during the 2004 campaign. “We will end the Bush administration’s war on science, restore scientific integrity and return to evidence-based decision-making,” the Democratic Party platform stated four years later.
But what had been a sharp-edged political strategy may now have lost its edge. I don’t mean to say that the broad appeal of science has been on the wane; overall, Americans are about as sanguine on the value of our scientific institutions as they were before. Rather, the electorate has reorganized itself, or has been reorganized by Trump, in such a way that fighting on behalf of science no longer cuts across party lines, and it doesn’t influence as many votes beyond the Democratic base.
The War on Science works for Trump because it’s always had more to do with social class than politics. A glance at data from the National Science Foundation shows how support for science tracks reliably with socioeconomic status. As of 2014, 50 percent of Americans in the highest income quartile and more than 55 percent of those with college degrees reported having great confidence in the nation’s scientific leaders. Among those in the lowest income bracket or with very little education, that support drops to 33 percent or less. Meanwhile, about five-sixths of rich or college-educated people—compared to less than half of poor people or those who never finished high school—say they believe that the benefits of science outweigh the potential harms. To put this in crude, horse-race terms, the institution of scientific research consistently polls about 30 points higher among the elites than it does among the uneducated working class.
Ten years ago, that distinction didn’t matter quite so much for politics. …
… with the battle lines redrawn, the same approach to activism now seems as though it could have the opposite effect. In the same way that fighting the War on Journalism delegitimizes the press by making it seem partisan and petty, so might the present fight against the War on Science sap scientific credibility. By confronting it directly, science activists may end up helping to consolidate Trump’s support among his most ardent, science-skeptical constituency. If they’re not careful where and how they step, the science march could turn into an ambush.
I think Engber is making an important point and the strategies and tactics being employed need to be carefully reviewed.
As for the Canadian situation, things are indeed better now but my experience is that while we rarely duplicate the situation in the US, we often find ourselves echoing their cries, albeit years later and more faintly. The current leadership race for the Conservative party has at least one Trump admirer (Kelly Leitch; see the section titled ‘Controversy’) fashioning her campaign in light of his perceived successes. Our next so-called ‘war on science’ could echo in some ways the current situation in the US and we’d best keep that in mind.
In the future, our health may be monitored and maintained by tiny sensors and drug dispensers, deployed within the body and made from graphene—one of the strongest, lightest materials in the world. Graphene is composed of a single sheet of carbon atoms, linked together like razor-thin chicken wire, and its properties may be tuned in countless ways, making it a versatile material for tiny, next-generation implants.
But graphene is incredibly stiff, whereas biological tissue is soft. Because of this, any power applied to operate a graphene implant could precipitously heat up and fry surrounding cells.
Now, engineers from MIT [Massachusetts Institute of Technology] and Tsinghua University in Beijing have precisely simulated how electrical power may generate heat between a single layer of graphene and a simple cell membrane. While direct contact between the two layers inevitably overheats and kills the cell, the researchers found they could prevent this effect with a very thin, in-between layer of water.
By tuning the thickness of this intermediate water layer, the researchers could carefully control the amount of heat transferred between graphene and biological tissue. They also identified the critical power to apply to the graphene layer, without frying the cell membrane. …
Co-author Zhao Qin, a research scientist in MIT’s Department of Civil and Environmental Engineering (CEE), says the team’s simulations may help guide the development of graphene implants and their optimal power requirements.
“We’ve provided a lot of insight, like what’s the critical power we can accept that will not fry the cell,” Qin says. “But sometimes we might want to intentionally increase the temperature, because for some biomedical applications, we want to kill cells like cancer cells. This work can also be used as guidance [for those efforts].”
Typically, heat travels between two materials via vibrations in each material’s atoms. These atoms are always vibrating, at frequencies that depend on the properties of their materials. As a surface heats up, its atoms vibrate even more, causing collisions with other atoms and transferring heat in the process.
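The picture above can be caricatured at a coarser, continuum level. The toy one-dimensional sketch below shows how heat applied on one side of a two-material stack diffuses across an interface; it is not the paper's atomistic model, and all material values are made-up, order-of-magnitude placeholders.

```python
# Illustrative 1-D heat-conduction sketch (explicit finite differences).
# Not the paper's molecular dynamics model; values are arbitrary.

def simulate(steps=20000, n=50, dt=1e-4):
    # Temperature field with a hot boundary on the left (the heated side).
    T = [300.0] * n
    T[0] = 400.0
    # Two regions with different thermal diffusivities (arbitrary units):
    # a conductive layer on the left, a poorly conducting one on the right.
    alpha = [0.5 if i < n // 2 else 0.05 for i in range(n)]
    dx = 1.0 / n
    for _ in range(steps):
        new = T[:]
        for i in range(1, n - 1):
            new[i] = T[i] + alpha[i] * dt / dx**2 * (T[i-1] - 2*T[i] + T[i+1])
        new[0], new[-1] = 400.0, 300.0  # boundaries held fixed
        T = new
    return T

T = simulate()
# Heat has crossed the interface but falls off toward the cold side.
assert T[1] > T[-2] > 300.0
```

The stability condition for this explicit scheme (alpha * dt / dx² ≤ 0.5) is satisfied by the chosen parameters; in the real system the interesting physics happens at the atomic scale, which is why the researchers needed molecular dynamics rather than a continuum model like this one.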
The researchers sought to accurately characterize the way heat travels, at the level of individual atoms, between graphene and biological tissue. To do this, they considered the simplest interface, comprising a small, 500-nanometer-square sheet of graphene and a simple cell membrane, separated by a thin layer of water.
“In the body, water is everywhere, and the outer surface of membranes will always like to interact with water, so you cannot totally remove it,” Qin says. “So we came up with a sandwich model for graphene, water, and membrane, that is a crystal clear system for seeing the thermal conductance between these two materials.”
Qin’s colleagues at Tsinghua University had previously developed a model to precisely simulate the interactions between atoms in graphene and water, using density functional theory — a computational modeling technique that considers the structure of an atom’s electrons in determining how that atom will interact with other atoms.
However, to apply this modeling technique to the group’s sandwich model, which comprised about half a million atoms, would have required an incredible amount of computational power. Instead, Qin and his colleagues used classical molecular dynamics — a mathematical technique based on a “force field” potential function, or a simplified version of the interactions between atoms — that enabled them to efficiently calculate interactions within larger atomic systems.
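The "force field" idea Qin describes, replacing expensive electronic-structure calculations with a simplified pairwise potential, can be illustrated with the classic Lennard-Jones potential. This is a generic textbook potential in reduced units, not the group's actual graphene-water force field.

```python
# Minimal sketch of a classical force field: atoms interact through a
# simple pairwise potential (here Lennard-Jones), far cheaper to evaluate
# than density functional theory. Parameters are illustrative only.
EPS, SIGMA = 1.0, 1.0  # LJ well depth and length scale (reduced units)

def lj_energy_force(r):
    """Lennard-Jones energy and force magnitude at separation r."""
    sr6 = (SIGMA / r) ** 6
    energy = 4 * EPS * (sr6**2 - sr6)
    force = 24 * EPS * (2 * sr6**2 - sr6) / r  # force = -dU/dr
    return energy, force

# The potential minimum sits at r = 2^(1/6) * sigma: energy is -epsilon
# and the net force is zero, so a pair of atoms would rest there.
r_min = 2 ** (1 / 6) * SIGMA
e, f = lj_energy_force(r_min)
assert abs(e + EPS) < 1e-12
assert abs(f) < 1e-12
```

A molecular dynamics code evaluates such pairwise terms over all neighboring atoms each timestep, which is what makes half-million-atom systems like the group's sandwich model tractable.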
The researchers then built an atom-level sandwich model of graphene, water, and a cell membrane, based on the group’s simplified force field. They carried out molecular dynamics simulations in which they changed the amount of power applied to the graphene, as well as the thickness of the intermediate water layer, and observed the amount of heat that carried over from the graphene to the cell membrane.
Because the stiffness of graphene and biological tissue is so different, Qin and his colleagues expected that heat would conduct rather poorly between the two materials, building up steeply in the graphene before flooding and overheating the cell membrane. However, the intermediate water layer helped dissipate this heat, easing its conduction and preventing a temperature spike in the cell membrane.
Looking more closely at the interactions within this interface, the researchers made a surprising discovery: Within the sandwich model, the water, pressed against graphene’s chicken-wire pattern, morphed into a similar crystal-like structure.
“Graphene’s lattice acts like a template to guide the water to form network structures,” Qin explains. “The water acts more like a solid material and makes the stiffness transition from graphene and membrane less abrupt. We think this helps heat to conduct from graphene to the membrane side.”
The group varied the thickness of the intermediate water layer in simulations, and found that a 1-nanometer-wide layer of water helped to dissipate heat very effectively. In terms of the power applied to the system, they calculated that about a megawatt of power per meter squared, applied in tiny, microsecond bursts, was the most power that could be applied to the interface without overheating the cell membrane.
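The quoted figures are easy to sanity-check: at roughly a megawatt per square meter, a 500-nanometer-square sheet receives only a fraction of a microwatt, so each burst deposits a minuscule amount of energy. In the back-of-the-envelope calculation below, the one-microsecond burst duration is my own illustrative assumption.

```python
# Back-of-the-envelope check of the quoted numbers: ~1 MW/m^2 applied in
# microsecond bursts to a 500-nm-square graphene sheet. The burst length
# of 1 microsecond is an assumed round number, not from the paper.
side = 500e-9            # sheet edge length, m
area = side ** 2         # 2.5e-13 m^2
power_density = 1e6      # W/m^2, the reported threshold
burst = 1e-6             # s, assumed burst duration

power = power_density * area          # watts delivered to the sheet
energy_per_burst = power * burst      # joules per burst

assert abs(power - 2.5e-7) < 1e-12           # ~0.25 microwatts
assert abs(energy_per_burst - 2.5e-13) < 1e-18  # ~0.25 picojoules
```

A quarter of a picojoule per burst helps explain why such a seemingly enormous power density can be tolerated at the cellular scale.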
Qin says going forward, implant designers can use the group’s model and simulations to determine the critical power requirements for graphene devices of different dimensions. As for how they might practically control the thickness of the intermediate water layer, he says graphene’s surface may be modified to attract a particular number of water molecules.
“I think graphene provides a very promising candidate for implantable devices,” Qin says. “Our calculations can provide knowledge for designing these devices in the future, for specific applications, like sensors, monitors, and other biomedical applications.”
This research was supported in part by the MIT International Science and Technology Initiative (MISTI): MIT-China Seed Fund, the National Natural Science Foundation of China, DARPA [US Defense Advanced Research Projects Agency], the Department of Defense (DoD) Office of Naval Research, the DoD Multidisciplinary Research Initiatives program, the MIT Energy Initiative, and the National Science Foundation.
An upcoming (Oct. 19, 2016) webinar from the US National Nanotechnology Initiative (NNI) is the first of a new series (from an Oct. 7, 2016 news item on Nanowerk),
“Water Sustainability through Nanotechnology: A Federal Perspective” – This webinar is the first in a series exploring the confluence of nanotechnology and water. This event will introduce the Nanotechnology Signature Initiative (NSI): Water Sustainability through Nanotechnology and highlight the activities of several participating Federal agencies. …
Panelists include Nora Savage (National Science Foundation), Daniel Barta (National Aeronautics and Space Administration), Paul Shapiro (U.S. Environmental Protection Agency), Jim Dobrowolski (USDA National Institute of Food and Agriculture), and Hongda Chen (USDA National Institute of Food and Agriculture).
Webinar viewers will be able to submit questions for the panelists to answer during the Q&A period. Submitted questions will be considered in the order received and may be posted on the NNI website. A moderator will identify relevant questions and pose them to the speakers. Due to time constraints, not all questions may be addressed during the webinar. The moderator reserves the right to group similar questions and to skip questions, as appropriate.
There will be more in this series according to the webinar event page,
Water is essential to all life, and its significance bridges many critical areas for society: food, energy, security, and the environment. Projected population growth in the coming decades and associated increases in demands for water exacerbate the mounting pressure to address water sustainability. Yet, only 2.5% of the world’s water is fresh water, and some of the most severe impacts of climate change are on our country’s water resources. For example, in 2012, droughts affected about two-thirds of the continental United States, impacting water supplies, tourism, transportation, energy, and fisheries – costing the agricultural sector alone $30 billion. In addition, the ground water in many of the Nation’s aquifers is being depleted at unsustainable rates, which necessitates drilling ever deeper to tap groundwater resources. Finally, water infrastructure is a critically important but sometimes overlooked aspect of water treatment and distribution. Both technological and sociopolitical solutions are required to address these problems.
The text also goes on to describe how nanotechnology could assist with this challenge.
WHIZ! POW! BAM! BOOM! Today [Oct. 5, 2016], the National Science Foundation (NSF) and the National Nanotechnology Initiative (NNI) announce the opening of the second annual Generation Nano: Small Science, Superheroes! competition. The contest invites U.S. high school and home-schooled students to create a superhero that uses nanotechnology — science and technology on the scale of a nanometer, or one billionth of a meter — to solve crimes and meet today’s challenges.
By challenging students to think big (or small, in this case) to create superheroes with nanotechnology-inspired gear or powers, NSF and NNI aim to promote an early interest in science, technology, engineering and mathematics (STEM).
“An increasing number of students are drawn to the fascinating field of nanotechnology, which allows us to do things not possible before in computing, mobile communication, medicine and the environment,” said NSF Senior Advisor for Science and Engineering Mihail Roco. “The younger generation will carry on future progress in this exciting field. The Generation Nano competition gives students an opportunity to creatively combine this scientific interest with their artistic side.”
An Oct. 5, 2016 NSF news release, which originated the news item, describes the first competition and provides information for students wanting to enter this second one,
Last year’s first-ever Generation Nano competition inspired entries from more than 115 students across the U.S. The winning superhero creations included Nanoman, who battled a malignant crab-monster named Cancer; Radio Blitz, who helped dispose of local waste; and Nine, a rising superhero who used his nanosuit to defeat a pair of kidnappers.
Actor Wil Wheaton hosted the awards ceremony at the 2016 USA Science & Engineering Festival in Washington, D.C., where legendary comic book creator Stan Lee made a surprise virtual appearance to congratulate the finalists.
“The number and quality of the submissions to the Generation Nano contest last year were fantastic,” said Lisa Friedersdorf, deputy director of the National Nanotechnology Coordination Office, which provides public outreach for NNI. “I’m very excited by the four key societal missions identified for this year’s contest as nanotechnology can play a critical role in addressing each of these needs. I can’t wait to see the creative and imaginative ways the student teams take on this challenge!”
This year, participants’ superhero creations must tackle one of the following societal issues:
Justice — Using nanotechnology to fight criminals, bullies, supervillains and other wrongdoers.
Relief — Using nanotechnology to aid victims of famine, drought and other disasters.
Health — Using nanotechnology to heal the sick and injured.
Environment — Using nanotechnology to generate clean energy, control pollution and create a sustainable future.
NSF will promote the opening of the competition this week at New York Comic Con, the East Coast’s largest popular culture convention. A panel will bring together NSF-funded scientists and storytellers to talk about their imagined worlds.
Student contestants are encouraged to submit their superhero creations to the Generation Nano competition website for an opportunity to compete for prizes. A panel will review the submissions and select 15 semifinalists, and then a first and second place winner. Submissions from all semifinalists will also be posted to the Generation Nano website to allow the public to vote for their favorite superhero, which will receive a People’s Choice award.
Additional competition details
U.S. high school and home-schooled students should submit a written entry explaining how their superhero uses nanotechnology to do good, along with a two-to-three-page comic and 90-second video.
Competition opens Oct. 5, 2016, and submissions are due by midnight EST on Jan. 31, 2017.
Three rounds of judging will take place, with winners announced in the spring.
Prizes: $1,500 for first place; $1,000 for second place; and $750 for the People’s Choice award.
Visit the Generation Nano competition website for full eligibility criteria, entry guidelines, timeline and prize information. For additional questions about the contest, contact the Generation Nano team at email@example.com.
As noted in the news release, the competition opened Oct. 5, 2016, and entries can be submitted until Jan. 31, 2017; each entry needs a written piece, a two-to-three-page comic, and a short video.
According to a Sept. 2, 2016 news item on phys.org, researchers at the University of Wisconsin-Madison have produced carbon nanotube transistors that outperform state-of-the-art silicon transistors,
For decades, scientists have tried to harness the unique properties of carbon nanotubes to create high-performance electronics that are faster or consume less power—resulting in longer battery life, faster wireless communication and faster processing speeds for devices like smartphones and laptops.
But a number of challenges have impeded the development of high-performance transistors made of carbon nanotubes, tiny cylinders made of carbon just one atom thick. Consequently, their performance has lagged far behind semiconductors such as silicon and gallium arsenide used in computer chips and personal electronics.
Now, for the first time, University of Wisconsin-Madison materials engineers have created carbon nanotube transistors that outperform state-of-the-art silicon transistors.
Led by Michael Arnold and Padma Gopalan, UW-Madison professors of materials science and engineering, the team’s carbon nanotube transistors achieved current that’s 1.9 times higher than silicon transistors. …
“This achievement has been a dream of nanotechnology for the last 20 years,” says Arnold. “Making carbon nanotube transistors that are better than silicon transistors is a big milestone. This breakthrough in carbon nanotube transistor performance is a critical advance toward exploiting carbon nanotubes in logic, high-speed communications, and other semiconductor electronics technologies.”
This advance could pave the way for carbon nanotube transistors to replace silicon transistors and continue delivering the performance gains the computer industry relies on and that consumers demand. The new transistors are particularly promising for wireless communications technologies that require a lot of current flowing across a relatively small area.
As some of the best electrical conductors ever discovered, carbon nanotubes have long been recognized as a promising material for next-generation transistors.
Carbon nanotube transistors should be able to perform five times faster or use five times less energy than silicon transistors, according to extrapolations from single nanotube measurements. The nanotube’s ultra-small dimension makes it possible to rapidly change a current signal traveling across it, which could lead to substantial gains in the bandwidth of wireless communications devices.
But researchers have struggled to isolate purely semiconducting carbon nanotubes, which are crucial, because metallic nanotube impurities act like copper wires and disrupt the semiconducting properties — like a short in an electronic device.
The UW–Madison team used polymers to selectively sort out the semiconducting nanotubes, achieving a solution of ultra-high-purity semiconducting carbon nanotubes.
“We’ve identified specific conditions in which you can get rid of nearly all metallic nanotubes, where we have less than 0.01 percent metallic nanotubes,” says Arnold.
Placement and alignment of the nanotubes is also difficult to control.
To make a good transistor, the nanotubes need to be aligned in just the right order, with just the right spacing, when assembled on a wafer. In 2014, the UW–Madison researchers overcame that challenge when they announced a technique, called “floating evaporative self-assembly,” that gives them this control.
The nanotubes must make good electrical contacts with the metal electrodes of the transistor. Because the polymer the UW–Madison researchers use to isolate the semiconducting nanotubes also acts like an insulating layer between the nanotubes and the electrodes, the team “baked” the nanotube arrays in a vacuum oven to remove the insulating layer. The result: excellent electrical contacts to the nanotubes.
The researchers also developed a treatment that removes residues from the nanotubes after they’re processed in solution.
“In our research, we’ve shown that we can simultaneously overcome all of these challenges of working with nanotubes, and that has allowed us to create these groundbreaking carbon nanotube transistors that surpass silicon and gallium arsenide transistors,” says Arnold.
The researchers benchmarked their carbon nanotube transistor against a silicon transistor of the same size, geometry and leakage current in order to make an apples-to-apples comparison.
They are continuing to work on adapting their device to match the geometry used in silicon transistors, which get smaller with each new generation. Work is also underway to develop high-performance radio frequency amplifiers that may be able to boost a cellphone signal. While the researchers have already scaled their alignment and deposition process to 1 inch by 1 inch wafers, they’re working on scaling the process up for commercial production.
Arnold says it’s exciting to finally reach the point where researchers can exploit the nanotubes to attain performance gains in actual technologies.
“There has been a lot of hype about carbon nanotubes that hasn’t been realized, and that has kind of soured many people’s outlook,” says Arnold. “But we think the hype is deserved. It has just taken decades of work for the materials science to catch up and allow us to effectively harness these materials.”
The researchers have patented their technology through the Wisconsin Alumni Research Foundation.
Interestingly, at least some of the research was publicly funded according to the news release,
Funding from the National Science Foundation, the Army Research Office and the Air Force supported their work.
Will the public ever benefit financially from this research?
The US has embarked on a number of what are called “Grand Challenges.” I first came across the concept when reading about the Bill and Melinda Gates (of Microsoft fame) Foundation. I gather these challenges are intended to provide funding for research that advances bold visions.
There is the US National Strategic Computing Initiative established on July 29, 2015 and its first anniversary results were announced one year to the day later. Within that initiative a nanotechnology-inspired Grand Challenge for Future Computing was issued and, according to a July 29, 2016 news item on Nanowerk, a white paper on the topic has been issued (Note: A link has been removed),
Today [July 29, 2016], Federal agencies participating in the National Nanotechnology Initiative (NNI) released a white paper (pdf) describing the collective Federal vision for the emerging and innovative solutions needed to realize the Nanotechnology-Inspired Grand Challenge for Future Computing.
The grand challenge, announced on October 20, 2015, is to “create a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.” The white paper describes the technical priorities shared by the agencies, highlights the challenges and opportunities associated with these priorities, and presents a guiding vision for the research and development (R&D) needed to achieve key technical goals. By coordinating and collaborating across multiple levels of government, industry, academia, and nonprofit organizations, the nanotechnology and computer science communities can look beyond the decades-old approach to computing based on the von Neumann architecture and chart a new path that will continue the rapid pace of innovation beyond the next decade.
“Materials and devices for computing have been and will continue to be a key application domain in the field of nanotechnology. As evident by the R&D topics highlighted in the white paper, this challenge will require the convergence of nanotechnology, neuroscience, and computer science to create a whole new paradigm for low-power computing with revolutionary, brain-like capabilities,” said Dr. Michael Meador, Director of the National Nanotechnology Coordination Office. …
The white paper was produced as a collaboration by technical staff at the Department of Energy, the National Science Foundation, the Department of Defense, the National Institute of Standards and Technology, and the Intelligence Community. …
A new materials base may be needed for future electronic hardware. While most of today’s electronics use silicon, this approach is unsustainable if billions of disposable and short-lived sensor nodes are needed for the coming Internet-of-Things (IoT). To what extent can the materials base for the implementation of future information technology (IT) components and systems support sustainability through recycling and bio-degradability? More sustainable materials, such as compostable or biodegradable systems (polymers, paper, etc.) that can be recycled or reused, may play an important role. The potential role for such alternative materials in the fabrication of integrated systems needs to be explored as well. [p. 5]
The basic architecture of computers today is essentially the same as those built in the 1940s—the von Neumann architecture—with separate compute, high-speed memory, and high-density storage components that are electronically interconnected. However, it is well known that continued performance increases using this architecture are not feasible in the long term, with power density constraints being one of the fundamental roadblocks. Further advances in the current approach using multiple cores, chip multiprocessors, and associated architectures are plagued by challenges in software and programming models. Thus, research and development is required in radically new and different computing architectures involving processors, memory, input-output devices, and how they behave and are interconnected. [p. 7]
Neuroscience research suggests that the brain is a complex, high-performance computing system with low energy consumption and incredible parallelism. A highly plastic and flexible organ, the human brain is able to grow new neurons, synapses, and connections to cope with an ever-changing environment. Energy efficiency, growth, and flexibility occur at all scales, from molecular to cellular, and allow the brain, from early to late stage, to never stop learning and to act with proactive intelligence in both familiar and novel situations. Understanding how these mechanisms work and cooperate within and across scales has the potential to offer tremendous technical insights and novel engineering frameworks for materials, devices, and systems seeking to perform efficient and autonomous computing. This research focus area is the most synergistic with the national BRAIN Initiative. However, unlike the BRAIN Initiative, where the goal is to map the network connectivity of the brain, the objective here is to understand the nature, methods, and mechanisms for computation, and how the brain performs some of its tasks. Even within this broad paradigm, one can loosely distinguish between neuromorphic computing and artificial neural network (ANN) approaches. The goal of neuromorphic computing is oriented towards a hardware approach to reverse engineering the computational architecture of the brain. On the other hand, ANNs include algorithmic approaches arising from machine learning, which in turn could leverage advancements and understanding in neuroscience as well as novel cognitive, mathematical, and statistical techniques. Indeed, the ultimate intelligent systems may well be the result of merging existing ANN (e.g., deep learning) and bio-inspired techniques. [p. 8]
As government documents go, this is quite readable.
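For readers wondering what the “artificial neural network (ANN) approaches” mentioned in the excerpt actually look like in practice, the smallest possible illustration is a single artificial neuron trained by gradient descent. The white paper itself contains no code; this is just an illustrative sketch (plain Python, standard library only, all names my own) of the brain-inspired idea of adjusting connection weights from examples:

```python
import math

def sigmoid(x):
    """Smooth, differentiable stand-in for a neuron's firing response."""
    return 1.0 / (1.0 + math.exp(-x))

def train_neuron(samples, epochs=5000, lr=0.5):
    """Fit one sigmoid neuron (two weights + bias) by gradient descent
    on squared error — the core loop behind ANN 'learning'."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = sigmoid(w[0] * x1 + w[1] * x2 + b)
            # Gradient of 0.5*(out - target)^2 through the sigmoid
            delta = (out - target) * out * (1.0 - out)
            w[0] -= lr * delta * x1
            w[1] -= lr * delta * x2
            b    -= lr * delta
    return w, b

# Teach the neuron the logical OR function from four examples
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_neuron(data)

def predict(x1, x2):
    return round(sigmoid(w[0] * x1 + w[1] * x2 + b))
```

A deep-learning system is, very roughly, millions of such units stacked in layers; the neuromorphic approach the paper contrasts with this would instead build the neuron-like behavior directly into low-power hardware rather than simulating it in software.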