Tag Archives: Intel

IBM weighs in with plans for a 7nm computer chip

On the heels of Intel’s announcement about a deal utilizing their 14nm low-power manufacturing process and speculations about a 10nm computer chip (my July 9, 2014 posting), IBM makes an announcement about a 7nm chip as per this July 10, 2014 news item on Azonano,

IBM today [July 10, 2014] announced it is investing $3 billion over the next 5 years in two broad research and early stage development programs to push the limits of chip technology needed to meet the emerging demands of cloud computing and Big Data systems. These investments will push IBM’s semiconductor innovations from today’s breakthroughs into the advanced technology leadership required for the future.

A very comprehensive July 10, 2014 news release lays out the company’s plans for this $3B investment, which represents about 10% of IBM’s annual research budget,

The first research program is aimed at so-called “7 nanometer and beyond” silicon technology that will address serious physical challenges that are threatening current semiconductor scaling techniques and will impede the ability to manufacture such chips. The second is focused on developing alternative technologies for post-silicon era chips using entirely different approaches, which IBM scientists and other experts say are required because of the physical limitations of silicon based semiconductors.

Cloud and big data applications are placing new challenges on systems, just as the underlying chip technology is facing numerous significant physical scaling limits.  Bandwidth to memory, high speed communication and device power consumption are becoming increasingly challenging and critical.

The teams will comprise IBM Research scientists and engineers from Albany and Yorktown, New York; Almaden, California; and Europe. In particular, IBM will be investing significantly in emerging areas of research that are already underway at IBM such as carbon nanoelectronics, silicon photonics, new memory technologies, and architectures that support quantum and cognitive computing. [emphasis mine]

These teams will focus on providing orders of magnitude improvement in system level performance and energy efficient computing. In addition, IBM will continue to invest in the nanosciences and quantum computing–two areas of fundamental science where IBM has remained a pioneer for over three decades.

7 nanometer technology and beyond

IBM researchers and other semiconductor experts predict that while challenging, semiconductors show promise to scale from today’s 22 nanometers down to 14 and then 10 nanometers in the next several years. However, scaling to 7 nanometers, and perhaps below, by the end of the decade will require significant investment and innovation in semiconductor architectures as well as the invention of new tools and techniques for manufacturing.

“The question is not if we will introduce 7 nanometer technology into manufacturing, but rather how, when, and at what cost?” said John Kelly, senior vice president, IBM Research. “IBM engineers and scientists, along with our partners, are well suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data, and cognitive systems. This new investment will ensure that we produce the necessary innovations to meet these challenges.”

“Scaling to 7nm and below is a terrific challenge, calling for deep physics competencies in processing nano materials affinities and characteristics. IBM is one of a very few companies who has repeatedly demonstrated this level of science and engineering expertise,” said Richard Doherty, technology research director, The Envisioneering Group.

Bridge to a “Post-Silicon” Era

Silicon transistors, tiny switches that carry information on a chip, have been made smaller year after year, but they are approaching a point of physical limitation. Their increasingly small dimensions, now reaching the nanoscale, will prohibit any gains in performance due to the nature of silicon and the laws of physics. Within a few more generations, classical scaling and shrinkage will no longer yield the sizable benefits of lower power, lower cost and higher speed processors that the industry has become accustomed to.

With virtually all electronic equipment today built on complementary metal–oxide–semiconductor (CMOS) technology, there is an urgent need for new materials and circuit architecture designs compatible with this engineering process as the technology industry nears physical scalability limits of the silicon transistor.

Beyond 7 nanometers, the challenges dramatically increase, requiring a new kind of material to power systems of the future, and new computing platforms to solve problems that are unsolvable or difficult to solve today. Potential alternatives include new materials such as carbon nanotubes, and non-traditional computational approaches such as neuromorphic computing, cognitive computing, machine learning techniques, and the science behind quantum computing.

As the leader in advanced schemes that point beyond traditional silicon-based computing, IBM holds over 500 patents for technologies that will drive advancements at 7nm and beyond silicon — more than twice the nearest competitor. These continued investments will accelerate the invention and introduction into product development of IBM’s highly differentiated computing systems for cloud and big data analytics.

Several exploratory research breakthroughs could lead to major advancements in delivering dramatically smaller, faster and more powerful computer chips; they include quantum computing, neurosynaptic computing, silicon photonics, carbon nanotubes, III-V technologies, low-power transistors and graphene:

Quantum Computing

The most basic piece of information that a typical computer understands is a bit. Much like a light that can be switched on or off, a bit can have only one of two values: “1” or “0.” A quantum bit, or qubit, by contrast, can hold a “1,” a “0,” or both values at the same time. Described as superposition, this special property of qubits enables quantum computers to weed through millions of solutions all at once, while desktop PCs would have to consider them one at a time.
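For anyone who wants to see the “both values at once” idea in action, here’s a toy state-vector sketch in plain Python. This is my own illustration of textbook quantum mechanics (amplitudes and a Hadamard gate), not anything specific to IBM’s superconducting devices,

```python
import math

# Toy state-vector picture of one qubit. Its state is a pair of amplitudes
# (a0, a1); measuring it yields "0" with probability a0**2 and "1" with
# probability a1**2 (real amplitudes are enough for this sketch).
a0, a1 = 1.0, 0.0                       # a definite "0", like a classical bit

# A Hadamard gate rotates the state into an equal superposition of "0" and "1".
a0, a1 = (a0 + a1) / math.sqrt(2), (a0 - a1) / math.sqrt(2)

probs = (a0 ** 2, a1 ** 2)              # Born rule: measurement probabilities
print(probs)                            # ~0.5 each: both answers held at once

# With n qubits the state holds 2**n amplitudes simultaneously -- the root of
# the "weed through millions of solutions at once" claim (n = 20 is ~a million).
print(2 ** 20)
```

The exponential growth in the last line is the whole point: twenty qubits track about a million amplitudes at once, where a classical register tracks one value.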

IBM is a world leader in superconducting qubit-based quantum computing science and is a pioneer in the field of experimental and theoretical quantum information, fields that are still in the category of fundamental science – but one that, in the long term, may allow the solution of problems that are today either impossible or impractical to solve using conventional machines. The team recently demonstrated the first experimental realization of parity check with three superconducting qubits, an essential building block for one type of quantum computer.

Neurosynaptic Computing

Bringing together nanoscience, neuroscience, and supercomputing, IBM and university partners have developed an end-to-end ecosystem including a novel non-von Neumann architecture, a new programming language, as well as applications. This novel technology allows for computing systems that emulate the brain’s computing efficiency, size and power usage. IBM’s long-term goal is to build a neurosynaptic system with ten billion neurons and a hundred trillion synapses, all while consuming only one kilowatt of power and occupying less than two liters of volume.

Silicon Photonics

IBM has been a pioneer in the area of CMOS integrated silicon photonics for over 12 years, a technology that integrates functions for optical communications on a silicon chip, and the IBM team has recently designed and fabricated the world’s first monolithic silicon photonics based transceiver with wavelength division multiplexing.  Such transceivers will use light to transmit data between different components in a computing system at high data rates, low cost, and in an energetically efficient manner.

Silicon nanophotonics takes advantage of pulses of light for communication rather than traditional copper wiring and provides a super highway for large volumes of data to move at rapid speeds between computer chips in servers, large datacenters, and supercomputers, thus alleviating the limitations of congested data traffic and high-cost traditional interconnects.

Businesses are entering a new era of computing that requires systems to process and analyze, in real time, huge volumes of information known as Big Data. Silicon nanophotonics technology provides answers to Big Data challenges by seamlessly connecting various parts of large systems, whether a few centimeters or a few kilometers apart from each other, and moving terabytes of data via pulses of light through optical fibers.
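The multiplicative payoff of wavelength division multiplexing is easy to sketch. The channel count and per-channel rate below are illustrative assumptions of mine, not figures from IBM’s announcement,

```python
# Why wavelength division multiplexing (WDM) matters: one fiber carries many
# independent wavelengths, so total bandwidth is channels x per-channel rate.
# Both numbers below are illustrative assumptions, not IBM figures.
channels = 8                  # distinct wavelengths multiplexed on one fiber
gbps_per_channel = 25         # line rate per wavelength, in Gb/s

aggregate_gbps = channels * gbps_per_channel
print(aggregate_gbps, "Gb/s on a single fiber")

# Time to move one terabyte at that rate (1 TB = 8e12 bits):
seconds_per_tb = 8e12 / (aggregate_gbps * 1e9)
print(seconds_per_tb, "seconds per terabyte")
```

Adding wavelengths scales the fiber’s capacity without laying more cable, which is the “super highway” framing in the release.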

III-V technologies

IBM researchers have demonstrated the world’s highest transconductance on a self-aligned III-V channel metal-oxide semiconductor (MOS) field-effect transistor (FET) device structure that is compatible with CMOS scaling. These materials and structural innovations are expected to pave the path for technology scaling at 7nm and beyond. With more than an order of magnitude higher electron mobility than silicon, integrating III-V materials into CMOS enables higher performance at lower power density, allowing for an extension of power/performance scaling to meet the demands of cloud computing and big data systems.

Carbon Nanotubes

IBM Researchers are working in the area of carbon nanotube (CNT) electronics and exploring whether CNTs can replace silicon beyond the 7 nm node.  As part of its activities for developing carbon nanotube based CMOS VLSI circuits, IBM recently demonstrated — for the first time in the world — 2-way CMOS NAND gates using 50 nm gate length carbon nanotube transistors.

IBM also has demonstrated the capability for purifying carbon nanotubes to 99.99 percent, the highest (verified) purities demonstrated to date, and transistors at 10 nm channel length that show no degradation due to scaling–this is unmatched by any other material system to date.

Carbon nanotubes are single atomic sheets of carbon rolled up into a tube. The carbon nanotubes form the core of a transistor device that will work in a fashion similar to the current silicon transistor, but will be better performing. They could be used to replace the transistors in chips that power data-crunching servers, high performing computers and ultra fast smart phones.

Carbon nanotube transistors can operate as excellent switches at molecular dimensions of less than ten nanometers – 10,000 times thinner than a strand of human hair and less than half the size of the leading silicon technology. Comprehensive modeling of the electronic circuits suggests that a five- to ten-fold improvement in performance compared to silicon circuits is possible.

Graphene

Graphene is pure carbon in the form of a one atomic layer thick sheet.  It is an excellent conductor of heat and electricity, and it is also remarkably strong and flexible.  Electrons can move in graphene about ten times faster than in commonly used semiconductor materials such as silicon and silicon germanium. Its characteristics offer the possibility to build faster switching transistors than are possible with conventional semiconductors, particularly for applications in the handheld wireless communications business where it will be a more efficient switch than those currently used.

Recently in 2013, IBM demonstrated the world’s first graphene based integrated circuit receiver front end for wireless communications. The circuit consisted of a 2-stage amplifier and a down converter operating at 4.3 GHz.

Next Generation Low Power Transistors

In addition to new materials like CNTs, new architectures and innovative device concepts are required to boost future system performance. Power dissipation is a fundamental challenge for nanoelectronic circuits. To explain the challenge, consider a leaky water faucet — even after closing the valve as far as possible water continues to drip — this is similar to today’s transistor, in that energy is constantly “leaking” or being lost or wasted in the off-state.

A potential alternative to today’s power-hungry silicon field-effect transistors is the so-called steep-slope device, which could operate at much lower voltage and thus dissipate significantly less power. IBM scientists are researching tunnel field-effect transistors (TFETs). In this type of transistor, the quantum-mechanical effect of band-to-band tunneling is used to drive the current flow through the transistor. TFETs could achieve a 100-fold power reduction over CMOS transistors, so integrating TFETs with CMOS technology could improve low-power integrated circuits.
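The steep-slope advantage can be sketched with the textbook subthreshold relation: below threshold, current falls by one decade for every “SS” millivolts of gate swing, and a conventional transistor cannot beat roughly 60 mV per decade at room temperature. The numbers below are my own illustrative choices, not IBM measurements,

```python
# Textbook subthreshold picture: below threshold, drain current drops by one
# decade for every "SS" millivolts of gate swing. A conventional MOSFET is
# thermally limited to ~60 mV/decade at room temperature; a tunnel FET (TFET)
# can be steeper. All numbers here are illustrative, not IBM data.
def decades_of_onoff(swing_mv, ss_mv_per_decade):
    """On/off current ratio, in decades, for a given gate-voltage swing."""
    return swing_mv / ss_mv_per_decade

swing = 300.0                          # mV of gate swing at a low supply voltage
print(decades_of_onoff(swing, 60.0))   # at the MOSFET limit: 5.0 decades
print(decades_of_onoff(swing, 30.0))   # a steeper TFET: 10.0 decades

# Flip it around: the same 5-decade on/off ratio needs only half the swing on
# the TFET, and dynamic (switching) power scales roughly with voltage squared.
print((150.0 / 300.0) ** 2)            # ~0.25, i.e. roughly 4x less dynamic power
```

That voltage-squared scaling is why a steeper switch translates into large power savings rather than just a nicer transfer curve.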

Recently, IBM has developed a novel method to integrate III-V nanowires and heterostructures directly on standard silicon substrates and built the first ever InAs/Si tunnel diodes and TFETs using InAs as source and Si as channel with wrap-around gate as steep slope device for low power consumption applications.

“In the next ten years computing hardware systems will be fundamentally different as our scientists and engineers push the limits of semiconductor innovations to explore the post-silicon future,” said Tom Rosamilia, senior vice president, IBM Systems and Technology Group. “IBM Research and Development teams are creating breakthrough innovations that will fuel the next era of computing systems.”

IBM’s historic contributions to silicon and semiconductor innovation include the invention and/or first implementation of: the single cell DRAM, the “Dennard scaling laws” underpinning “Moore’s Law”, chemically amplified photoresists, copper interconnect wiring, Silicon on Insulator, strained engineering, multi core microprocessors, immersion lithography, high speed silicon germanium (SiGe), High-k gate dielectrics, embedded DRAM, 3D chip stacking, and Air gap insulators.

IBM researchers also are credited with initiating the era of nano devices following the Nobel prize winning invention of the scanning tunneling microscope which enabled nano and atomic scale invention and innovation.

IBM will also continue to fund and collaborate with university researchers to explore and develop future technologies for the semiconductor industry. In particular, IBM will continue to support and fund university research through private-public partnerships such as the NanoElectronics Research Initiative (NRI), the Semiconductor Advanced Research Network (STARnet), and the Global Research Consortium (GRC) of the Semiconductor Research Corporation.

I highlighted ‘new memory technologies’ as this brings to mind HP Labs and their major investment in ‘memristive’ technologies noted in my June 26, 2014 posting,

… During a two-hour presentation held a year and a half ago, they laid out how the computer might work, its benefits, and the expectation that about 75 percent of HP Labs personnel would be dedicated to this one project. “At the end, Meg [Meg Whitman, CEO of HP] turned to [Chief Financial Officer] Cathie Lesjak and said, ‘Find them more money,’” says John Sontag, the vice president of systems research at HP, who attended the meeting and is in charge of bringing the Machine to life. “People in Labs see this as a once-in-a-lifetime opportunity.”

The Machine is based on the memristor and other associated technologies.

Getting back to IBM, there’s this analysis of the $3B investment ($600M/year for five years) by Alex Konrad in a July 10, 2014 article for Forbes (Note: A link has been removed),

When IBM … announced a $3 billion commitment to even tinier semiconductor chips that no longer depended on silicon on Wednesday, the big news was that IBM’s putting a lot of money into a future for chips where Moore’s Law no longer applies. But on second glance, the move to spend billions on more experimental ideas like silicon photonics and carbon nanotubes shows that IBM’s finally shifting large portions of its research budget into more ambitious and long-term ideas.

… IBM tells Forbes the $3 billion isn’t additional money being added to its R&D spend, an area where analysts have told Forbes they’d like to see more aggressive cash commitments in the future. IBM will still spend about $6 billion a year on R&D, 6% of revenue. Ten percent of that research budget, however, now has to come from somewhere else to fuel these more ambitious chip projects.

Neal Ungerleider’s July 11, 2014 article for Fast Company focuses on the neuromorphic computing and quantum computing aspects of this $3B initiative (Note: Links have been removed),

The new R&D initiatives fall into two categories: Developing nanotech components for silicon chips for big data and cloud systems, and experimentation with “post-silicon” microchips. This will include research into quantum computers which don’t know binary code, neurosynaptic computers which mimic the behavior of living brains, carbon nanotubes, graphene tools and a variety of other technologies.

IBM’s investment is one of the largest for quantum computing to date; the company is one of the biggest researchers in the field, along with a Canadian company named D-Wave which is partnering with Google and NASA to develop quantum computer systems.

The curious can find D-Wave Systems here. There’s also a January 19, 2012 posting here which discusses D-Wave’s situation at that time.

Final observation: these are fascinating developments, especially for the insight they provide into the worries troubling HP Labs, Intel, and IBM as they jockey for position.

ETA July 14, 2014: Dexter Johnson has a July 11, 2014 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website) about the IBM announcement, which features some responses he received from IBM officials to his queries,

While this may be a matter of fascinating speculation for investors, the impact on nanotechnology development  is going to be significant. To get a better sense of what it all means, I was able to talk to some of the key figures of IBM’s push in nanotechnology research.

I conducted e-mail interviews with Tze-Chiang (T.C.) Chen, vice president science & technology, IBM Fellow at the Thomas J. Watson Research Center and Wilfried Haensch, senior manager, physics and materials for logic and communications, IBM Research.

Silicon versus Nanomaterials

First, I wanted to get a sense for how long IBM envisioned sticking with silicon and when they expected the company would permanently make the move away from CMOS to alternative nanomaterials. Unfortunately, as expected, I didn’t get solid answers, except for them to say that new manufacturing tools and techniques need to be developed now.

He goes on to ask about carbon nanotubes and graphene. Interestingly, IBM does not have a wide range of electronics applications in mind for graphene.  I encourage you to read Dexter’s posting as Dexter got answers to some very astute and pointed questions.

Intel to produce Panasonic SoCs (system-on-chips) using 14nm low-power process

A July 8, 2014 news item on Azonano describes a manufacturing agreement between Intel and Panasonic,

Intel Corporation today announced that it has entered into a manufacturing agreement with Panasonic Corporation’s System LSI Business Division. Intel’s custom foundry business will manufacture future Panasonic system-on-chips (SoCs) using Intel’s 14nm low-power manufacturing process.

Panasonic’s next-generation SoCs will target audio visual-based equipment markets, and will enable higher levels of performance, power and viewing experience for consumers.

A July 7, 2014 Intel press release, which originated the news item, reveals more details,

“Intel’s 14nm Tri-Gate process technology is very important to develop the next- generation SoCs,” said Yoshifumi Okamoto, director, Panasonic Corporation SLSI Business Division. “We will deliver highly improved performance and power advantages with next-generation SoCs by leveraging Intel’s 14nm Tri-Gate process technology through our collaboration.”

Intel’s leading-edge 14nm low-power process technology, which includes the second generation of Tri-Gate transistors, is optimized for low-power applications. This will enable Panasonic’s SoCs to achieve high levels of performance and functionality at lower power levels than was possible with planar transistors.

“We look forward to collaborating with the Panasonic SLSI Business Division,” said Sunit Rikhi, vice president and general manager, Intel Custom Foundry. “We will work hard to deliver the value of power-efficient performance of our 14nm LP process to Panasonic’s next-generation SoCs. This agreement with Panasonic is an important step in the buildup of Intel’s foundry business.”

Five other semiconductor companies have announced agreements with Intel’s custom foundry business, including Altera, Achronix Semiconductor, Tabula, Netronome and Microsemi.

Rick Merritt in a July 7, 2014 article for EE Times provides some insight,

“We are doing extremely well getting customers who can use our technology,” Sunit Rikhi, general manager of Intel’s foundry group, said in a talk at Semicon West, though he would not provide details. …

He suggested that the low-power variant of Intel’s 14nm process is relatively new. Intel uses a general-purpose 22nm process but supports multiple flavors of its 32nm process.

Intel expects to make 10nm chips without extreme ultraviolet (EUV) lithography, he said, reiterating comments from Intel’s Mark Bohr. …

This news provides an update of sorts to my October 21, 2010 posting,

Paul Otellini, Chief Executive Officer of Intel, just announced that the company will invest $6B to $8B for new and upgraded manufacturing facilities to produce 22 nanometre (nm) computer chips.

Now, almost four years later, they’re talking about 10 nm chips. I wonder what 2018 will bring?

Memristor, memristor! What is happening? News from the University of Michigan and HP Laboratories

Professor Wei Lu (whose work on memristors has been mentioned here a few times [an April 15, 2010 posting and an April 19, 2012 posting]) has made a discovery about memristors with significant implications (from a June 25, 2014 news item on Azonano),

In work that unmasks some of the magic behind memristors and “resistive random access memory,” or RRAM—cutting-edge computer components that combine logic and memory functions—researchers have shown that the metal particles in memristors don’t stay put as previously thought.

The findings have broad implications for the semiconductor industry and beyond. They show, for the first time, exactly how some memristors remember.

A June 24, 2014 University of Michigan news release, which originated the news item, includes Lu’s perspective on this discovery and more details about it,

“Most people have thought you can’t move metal particles in a solid material,” said Wei Lu, associate professor of electrical and computer engineering at the University of Michigan. “In a liquid and gas, it’s mobile and people understand that, but in a solid we don’t expect this behavior. This is the first time it has been shown.”

Lu, who led the project, and colleagues at U-M and the Electronic Research Centre Jülich in Germany used transmission electron microscopes to watch and record what happens to the atoms in the metal layer of their memristor when they exposed it to an electric field. The metal layer was encased in the dielectric material silicon dioxide, which is commonly used in the semiconductor industry to help route electricity.

They observed the metal atoms becoming charged ions, clustering with up to thousands of others into metal nanoparticles, and then migrating and forming a bridge between the electrodes at the opposite ends of the dielectric material.

They demonstrated this process with several metals, including silver and platinum. And depending on the materials involved and the electric current, the bridge formed in different ways.

The bridge, also called a conducting filament, stays put after the electrical power is turned off in the device. So when researchers turn the power back on, the bridge is there as a smooth pathway for current to travel along. Further, the electric field can be used to change the shape and size of the filament, or break it altogether, which in turn regulates the resistance of the device, or how easily current can flow through it.

Computers built with memristors would encode information in these different resistance values, which are in turn based on different arrangements of conducting filaments.
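The filament picture maps neatly onto the simple “state variable” memristor models in the literature. Here’s a toy sketch loosely patterned on the well-known HP linear-drift model; the parameters are mine and purely illustrative, not values from the Michigan paper,

```python
# Toy "state variable" memristor, loosely after the HP linear-drift model:
# resistance interpolates between R_ON and R_OFF according to an internal
# state w in [0, 1] that drifts with the charge pushed through the device.
# All parameters are illustrative, not values from the paper discussed here.
R_ON, R_OFF = 100.0, 16000.0     # ohms: filament formed vs. filament broken

def step(w, current, dt, mu=0.01):
    """Drift the state by mu * (charge passed); clamp to the physical limits."""
    return min(1.0, max(0.0, w + mu * current * dt))

def resistance(w):
    return w * R_ON + (1.0 - w) * R_OFF

w = 0.0                          # start in the high-resistance "0" state
for _ in range(100):             # write: push current one way for a while
    w = step(w, current=1.0, dt=1.0)
print(resistance(w))             # low resistance: the device now stores a "1"

# Cut the power and restore it: w is untouched, so the value is remembered.
for _ in range(100):             # erase: reverse the current direction
    w = step(w, current=-1.0, dt=1.0)
print(resistance(w))             # back near 16000 ohms: a "0" again
```

The key behavior is in the middle comment: nothing resets `w` when power is removed, which is exactly the non-volatility the filament provides in the real device.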

Memristor researchers like Lu and his colleagues had theorized that the metal atoms in memristors moved, but previous results had yielded different shaped filaments and so they thought they hadn’t nailed down the underlying process.

“We succeeded in resolving the puzzle of apparently contradicting observations and in offering a predictive model accounting for materials and conditions,” said Ilia Valov, principal investigator at the Electronic Materials Research Centre Jülich. “Also the fact that we observed particle movement driven by electrochemical forces within dielectric matrix is in itself a sensation.”

The implications for this work (from the news release),

The results could lead to a new approach to chip design—one that involves using fine-tuned electrical signals to lay out integrated circuits after they’re fabricated. And it could also advance memristor technology, which promises smaller, faster, cheaper chips and computers inspired by biological brains in that they could perform many tasks at the same time.

As is becoming more common these days (from the news release),

Lu is a co-founder of Crossbar Inc., a Santa Clara, Calif.-based startup working to commercialize RRAM. Crossbar has just completed a $25 million Series C funding round.

Here’s a link to and a citation for the paper,

Electrochemical dynamics of nanoscale metallic inclusions in dielectrics by Yuchao Yang, Peng Gao, Linze Li, Xiaoqing Pan, Stefan Tappertzhofen, ShinHyun Choi, Rainer Waser, Ilia Valov, & Wei D. Lu. Nature Communications 5, Article number: 4232 doi:10.1038/ncomms5232 Published 23 June 2014

This paper is behind a paywall.

The other party instrumental in the development and, they hope, the commercialization of memristors is HP (Hewlett Packard) Laboratories (HP Labs). Anyone familiar with this blog will likely know I have frequently covered the topic, starting with an essay explaining the basics on my Nanotech Mysteries wiki (or you can check this more extensive and more recently updated entry on Wikipedia), and with subsequent entries here over the years. The most recent entry is a Jan. 9, 2014 posting which featured the then-latest information on the HP Labs memristor situation (scroll down about 50% of the way). This new information is more a revelation of details than an update on its status. Sebastian Anthony’s June 11, 2014 article for extremetech.com lays out the situation plainly (Note: Links have been removed),

HP, one of the original 800lb Silicon Valley gorillas that has seen much happier days, is staking everything on a brand new computer architecture that it calls… The Machine. Judging by an early report from Bloomberg Businessweek, up to 75% of HP’s once fairly illustrious R&D division — HP Labs – are working on The Machine. As you would expect, details of what will actually make The Machine a unique proposition are hard to come by, but it sounds like HP’s groundbreaking work on memristors (pictured top) and silicon photonics will play a key role.

First things first, we’re probably not talking about a consumer computing architecture here, though it’s possible that technologies commercialized by The Machine will percolate down to desktops and laptops. Basically, HP used to be a huge player in the workstation and server markets, with its own operating system and hardware architecture, much like Sun. Over the last 10 years though, Intel’s x86 architecture has rapidly taken over, to the point where HP (and Dell and IBM) are essentially just OEM resellers of commodity x86 servers. This has driven down enterprise profit margins — and when combined with its huge stake in the diminishing PC market, you can see why HP is rather nervous about the future. The Machine, and IBM’s OpenPower initiative, are both attempts to get out from underneath Intel’s x86 monopoly.

While exact details are hard to come by, it seems The Machine is predicated on the idea that current RAM, storage, and interconnect technology can’t keep up with modern Big Data processing requirements. HP is working on two technologies that could solve both problems: Memristors could replace RAM and long-term flash storage, and silicon photonics could provide faster on- and off-motherboard buses. Memristors essentially combine the benefits of DRAM and flash storage in a single, hyper-fast, super-dense package. Silicon photonics is all about reducing optical transmission and reception to a scale that can be integrated into silicon chips (moving from electrical to optical would allow for much higher data rates and lower power consumption). Both technologies can be built using conventional fabrication techniques.

In a June 11, 2014 article by Ashlee Vance for Bloomberg Businessweek, the company’s CTO (Chief Technology Officer), Martin Fink, provides new details,

That’s what they’re calling it at HP Labs: “the Machine.” It’s basically a brand-new type of computer architecture that HP’s engineers say will serve as a replacement for today’s designs, with a new operating system, a different type of memory, and superfast data transfer. The company says it will bring the Machine to market within the next few years or fall on its face trying. “We think we have no choice,” says Martin Fink, the chief technology officer and head of HP Labs, who is expected to unveil HP’s plans at a conference Wednesday [June 11, 2014].

In my Jan. 9, 2014 posting there’s a quote from Martin Fink stating that 2018 would be the earliest date for the company’s StoreServ arrays to be packed with 100TB Memristor drives (the Machine?). The company later clarified the comment by noting that it’s very difficult to set dates for new technology arrivals.

Vance shares what could be a stirring ‘origins’ story of sorts, provided the Machine is successful,

The Machine started to take shape two years ago, after Fink was named director of HP Labs. Assessing the company’s projects, he says, made it clear that HP was developing the needed components to create a better computing system. Among its research projects: a new form of memory known as memristors; and silicon photonics, the transfer of data inside a computer using light instead of copper wires. And its researchers have worked on operating systems including Windows, Linux, HP-UX, Tru64, and NonStop.

Fink and his colleagues decided to pitch HP Chief Executive Officer Meg Whitman on the idea of assembling all this technology to form the Machine. During a two-hour presentation held a year and a half ago, they laid out how the computer might work, its benefits, and the expectation that about 75 percent of HP Labs personnel would be dedicated to this one project. “At the end, Meg turned to [Chief Financial Officer] Cathie Lesjak and said, ‘Find them more money,’” says John Sontag, the vice president of systems research at HP, who attended the meeting and is in charge of bringing the Machine to life. “People in Labs see this as a once-in-a-lifetime opportunity.”

Here is the memristor making an appearance in Vance’s article,

HP’s bet is the memristor, a nanoscale chip that Labs researchers must build and handle in full anticontamination clean-room suits. At the simplest level, the memristor consists of a grid of wires with a stack of thin layers of materials such as tantalum oxide at each intersection. When a current is applied to the wires, the materials’ resistance is altered, and this state can hold after the current is removed. At that point, the device is essentially remembering 1s or 0s depending on which state it is in, multiplying its storage capacity. HP can build these chips with traditional semiconductor equipment and expects to be able to pack unprecedented amounts of memory—enough to store huge databases of pictures, files, and data—into a computer.

New memory and networking technology requires a new operating system. Most applications written in the past 50 years have been taught to wait for data, assuming that the memory systems feeding the main computer’s chips are slow. Fink has assigned one team to develop the open-source Machine OS, which will assume the availability of a high-speed, constant memory store. …

Peter Bright in his June 11, 2014 article for Ars Technica opens his article with a controversial statement (Note: Links have been removed),

In 2008, scientists at HP invented a fourth fundamental component to join the resistor, capacitor, and inductor: the memristor. [emphasis mine] Theorized back in 1971, memristors showed promise in computing as they can be used to both build logic gates, the building blocks of processors, and also act as long-term storage.

Whether or not the memristor is a fourth fundamental component has been a matter of some debate as you can see in this Memristor entry (section on Memristor definition and criticism) on Wikipedia.

Bright goes on to provide a 2016 delivery date for some type of memristor-based product and additional technical insight about the Machine,

… By 2016, the company plans to have memristor-based DIMMs, which will combine the high storage densities of hard disks with the high performance of traditional DRAM.

John Sontag, vice president of HP Systems Research, said that The Machine would use “electrons for processing, photons for communication, and ions for storage.” The electrons are found in conventional silicon processors, and the ions are found in the memristors. The photons are because the company wants to use optical interconnects in the system, built using silicon photonics technology. With silicon photonics, photons are generated on, and travel through, “circuits” etched onto silicon chips, enabling conventional chip manufacturing to construct optical parts. This allows the parts of the system using photons to be tightly integrated with the parts using electrons.

The memristor story has proved to be even more fascinating than I thought in 2008 and I was already as fascinated as could be, or so I thought.

Canon-Molecular Imprints deal and its impact on shrinking chips (integrated circuits)

There’s quite an interesting April 20, 2014 essay on Nanotechnology Now which provides some insight into the nanoimprinting market. I recommend reading it but for anyone who is not intimately familiar with the scene, here are a few excerpts along with my attempts to decode this insider’s (from Martini Tech) view,

About two months ago, important news shook the small but lively Japanese nanoimprint community: Canon has decided to acquire, making it a wholly-owned subsidiary, Texas-based Molecular Imprints, a strong player in the nanotechnology industry and one of the main makers of nanoimprint devices such as the Imprio 450 and other models.

So, Canon, a Japanese company, has made a move into the nanoimprinting sector by purchasing Molecular Imprints, a US company based in Texas, outright.

This next part concerns the expiration of Moore’s Law (i.e., the observation that the number of transistors on a chip doubles roughly every two years, making chips smaller, faster, and cheaper) and is why the major chip makers are searching for new solutions as per the fifth paragraph in this excerpt,

Molecular Imprints’ devices are aimed at the IC [integrated circuits, aka chips, I think] patterning market and not just at the relatively smaller applications market to which nanoimprint is usually confined: patterning of bio culture substrates, thin film applications for the solar industry, anti-reflection films for smartphone and LED TV screens, patterning of surfaces for microfluidics among others.

While each one of the markets listed above has the potential of explosive growth in the medium-long term future, at the moment none of them is worth more than a few percentage points, at best, of the IC patterning market.

The mainstream technology behind IC patterning is still optical stepper lithography and the situation is not likely to change in the near term future.

However, optical lithography has its limitations, the main challenge to its 40-year dominance not coming only from technological and engineering issues, but mostly from economical ones.

While from a strictly technological point of view it may still be possible for the major players in the chip industry (Intel, GF, TSMC, Nvidia among others) to go ahead with optical steppers and reach the 5nm node using multi-patterning and immersion, the cost increases associated with each die shrink are becoming staggeringly high.

A top-of-the-notch stepper in the early 90s could have been bought for a few millions of dollars, now the price has increased to some tens of millions for the top machines

The essay describes the market impact this acquisition may have for Canon,

Molecular Imprints has been a company on the forefront of commercialization of nanoimprint-based solutions for IC manufacturing, but so far their solutions have yet to become a viable alternative in the HVM [high-volume manufacturing] IC manufacturing market.

The main stumbling blocks for IC patterning using nanoimprint technology are: the occurrence of defects on the mask that inevitably replicates them on each substrate and the lack of alignment precision between the mold and the substrate needed to pattern multi-layered structures.

Therefore, applications for nanoimprint have been limited to markets where no non-periodical structure patterning is needed and where one-layered patterning is sufficient.

But the big market where everyone is aiming for is, of course, IC patterning and this is where much of the R&D effort goes.

While logic patterning with nanoimprint may still be years away, simple patterning of NAND structures may be feasible in the near future, and the purchase of Molecular Imprints by Canon is a step in this direction

Patterning of NAND structures may still require multi-layered structures, but the alignment precision needed is considerably lower than logic.

Moreover, NAND requirements for defectivity are more relaxed than for logic due to the inherent redundancy of the design, therefore, NAND manufacturing is the natural first step for nanoimprint in the IC manufacturing market and, if successful, it may open a whole new range of opportunities for the whole sector.

Assuming I’ve read the rest of this essay rightly, here’s my summary: there are a number of techniques being employed to make chips smaller and more efficient. Canon has purchased a company that is versed in a technique that creates NAND (you can find definitions here) structures in the hope that this technique can be commercialized so that Canon becomes dominant in the sector because (1) they got there first and/or because (2) NAND manufacturing becomes a clear leader, crushing competition from other technologies. This could cover short-term goals and, I imagine Canon hopes, long-term goals.

It was a real treat coming across this essay as it’s an insider’s view. So, thank you to the folks at Martini Tech who wrote this. You can find Molecular Imprints here.

Extending memristive theory

This is kind of fascinating. A German research team based at JARA (Jülich Aachen Research Alliance) is suggesting that memristive theory be extended beyond passive components in their paper about Resistive Memory Cells (ReRAM) which was recently published in Nature Communications. From the Apr. 26, 2013 news item on Azonano,

Resistive memory cells (ReRAM) are regarded as a promising solution for future generations of computer memories. They will dramatically reduce the energy consumption of modern IT systems while significantly increasing their performance.

Unlike the building blocks of conventional hard disk drives and memories, these novel memory cells are not purely passive components but must be regarded as tiny batteries. This has been demonstrated by researchers of Jülich Aachen Research Alliance (JARA), whose findings have now been published in the prestigious journal Nature Communications. The new finding radically revises the current theory and opens up possibilities for further applications. The research group has already filed a patent application for their first idea on how to improve data readout with the aid of battery voltage.

The Apr. 23, 2013 JARA news release, which originated the news item, provides some background information about data memory before going on to discuss the ReRAMs,

Conventional data memory works on the basis of electrons that are moved around and stored. However, even by atomic standards, electrons are extremely small. It is very difficult to control them, for example by means of relatively thick insulator walls, so that information will not be lost over time. This does not only limit storage density, it also costs a great deal of energy. For this reason, researchers are working feverishly all over the world on nanoelectronic components that make use of ions, i.e. charged atoms, for storing data. Ions are some thousands of times heavier than electrons and are therefore much easier to ‘hold down’. In this way, the individual storage elements can almost be reduced to atomic dimensions, which enormously improves the storage density.

Here’s how the ions behave in ReRAMs (from the news release),

In resistive switching memory cells (ReRAMs), ions behave on the nanometre scale in a similar manner to a battery. The cells have two electrodes, for example made of silver and platinum, at which the ions dissolve and then precipitate again. This changes the electrical resistance, which can be exploited for data storage. Furthermore, the reduction and oxidation processes also have another effect. They generate electric voltage. ReRAM cells are therefore not purely passive systems – they are also active electrochemical components. Consequently, they can be regarded as tiny batteries whose properties provide the key to the correct modelling and development of future data storage.

In complex experiments, the scientists from Forschungszentrum Jülich and RWTH Aachen University determined the battery voltage of typical representatives of ReRAM cells and compared them with theoretical values. This comparison revealed other properties (such as ionic resistance) that were previously neither known nor accessible. “Looking back, the presence of a battery voltage in ReRAMs is self-evident. But during the nine-month review process of the paper now published we had to do a lot of persuading, since the battery voltage in ReRAM cells can have three different basic causes, and the assignment of the correct cause is anything but trivial,” says Dr. Ilia Valov, the electrochemist in Prof. Rainer Waser’s research group.

This discovery could lead to optimizing ReRAMs and exploiting them in new applications (from the news release),

“The new findings will help to solve a central puzzle of international ReRAM research,” says Prof. Rainer Waser, deputy spokesman of the collaborative research centre SFB 917 ‘Nanoswitches’ established in 2011. In recent years, these puzzling aspects include unexplained long-term drift phenomena or systematic parameter deviations, which had been attributed to fabrication methods. “In the light of this new knowledge, it is possible to specifically optimize the design of the ReRAM cells, and it may be possible to discover new ways of exploiting the cells’ battery voltage for completely new applications, which were previously beyond the reach of technical possibilities,” adds Waser, whose group has been collaborating for years with companies such as Intel and Samsung Electronics in the field of ReRAM elements.

The part I found most interesting, given my interest in memristors, is this bit about extending the memristor theory, from the news release,

The new finding is of central significance, in particular, for the theoretical description of the memory components. To date, ReRAM cells have been described with the aid of the concept of memristors – a portmanteau word composed of “memory” and “resistor”. The theoretical concept of memristors can be traced back to Leon Chua in the 1970s. It was first applied to ReRAM cells by the IT company Hewlett-Packard in 2008. It aims at the permanent storage of information by changing the electrical resistance. The memristor theory leads to an important restriction. It is limited to passive components. “The demonstrated internal battery voltage of ReRAM elements clearly violates the mathematical construct of the memristor theory. This theory must be expanded to a whole new theory – to properly describe the ReRAM elements,” says Dr. Eike Linn, the specialist for circuit concepts in the group of authors. [emphases mine] This also places the development of all micro- and nanoelectronic chips on a completely new footing.
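For anyone who wants to see the violation concretely, it can be sketched in a few lines. In Chua’s standard formulation an ideal memristor obeys v = M(q)·i, so zero current forces zero voltage; a measurable voltage at zero current (the nanobattery emf the JARA team reports) therefore cannot fit the passive model and needs an extra term. The function names and numbers below are mine, purely for illustration, not the authors’ model.

```python
# Illustrative contrast between the passive memristor model and a cell
# with an internal battery (emf). In the passive theory, v = M * i, so
# an open-circuit measurement (i = 0) must read zero volts. A cell that
# behaves like a tiny battery adds an emf term, so the open-circuit
# voltage is nonzero. All numbers are hypothetical placeholders.

def passive_memristor_voltage(memristance_ohms, current_amps):
    # Classic (passive) memristor: v = M(q) * i, hence i = 0 -> v = 0.
    return memristance_ohms * current_amps

def nanobattery_cell_voltage(memristance_ohms, current_amps, emf_volts):
    # ReRAM cell regarded as a tiny battery: an emf adds to the ohmic term.
    return memristance_ohms * current_amps + emf_volts

# Open-circuit measurement (no current flowing):
assert passive_memristor_voltage(1e4, 0.0) == 0.0        # passive theory: zero
assert nanobattery_cell_voltage(1e4, 0.0, 0.05) == 0.05  # measurable emf
```

The second assertion is, in miniature, the experimental observation: a voltage where the passive theory says there can be none.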

Here’s a link to and a citation for the paper,

Nanobatteries in redox-based resistive switches require extension of memristor theory by I. Valov, E. Linn, S. Tappertzhofen, S. Schmelzer, J. van den Hurk, F. Lentz, & R. Waser. Nature Communications 4, Article number: 1771. doi:10.1038/ncomms2784 Published 23 April 2013

This paper is open access (as of this writing).

Here’s a list of my 2013 postings on memristors and memristive devices,

2.5M Euros for Ireland’s John Boland and his memristive nanowires (Apr. 4, 2013 posting)

How to use a memristor to create an artificial brain (Feb. 26, 2013 posting)

CeNSE (Central Nervous System of the Earth) and billions of tiny sensors from HP plus a memristor update (Feb. 7, 2013 posting)

For anyone who cares to search the blog, there are several more.

15-year-old Jack Andraka and his nanotechnology-enabled test for pancreatic cancer

We’re led to believe that good ideas can come from anyone, anywhere, at any time and that they will be recognized as such. Every once in a while it’s nice to see evidence that there’s some truth to that notion. Jack Andraka, 15 years old, has invented a test for pancreatic cancer that appears to be both highly accurate and cheap, making it far superior to any other such test currently available. (H/T Foresight Institute, Mar.6.13 posting)

The Jan. 29, 2013 article by Damien Gayle for the UK’s Daily Mail highlights these points and goes on to describe Jack’s accomplishments at more length (there are also videos embedded in the article),

  • Jack Andraka’s new test detects pancreatic cancer earlier than any other
  • Deadly disease currently kills 19 out of 20 within five years
  • He claims his invention could raise survival rates to ‘close to 100 per cent’

… Jack’s invention, for which he was last month awarded the grand prize of $75,000 in scholarship funds at the 2012 Intel Science Fair, means that patients now have a simple method to detect pancreatic cancer before it becomes invasive.

His novel patent-pending sensor has proved to be 28 times faster, 28 times less expensive, and over 100 times more sensitive than current tests.[emphasis mine]

The test works in a similar way to diabetic testing strips, with his paper strips using only a drop of blood to determine whether patients carry the mesothelin biomarker.

It is said to be over 90 per cent accurate, practically instant – and costs only 3 cents.

And what’s more, his simple test can also be used to detect ovarian and lung cancer, and it could be easily altered to detect the biomarkers of a range of other conditions.

‘What’s so cool about that is its applicability to other diseases…for example other forms of cancer, tuberculosis, HIV, environmental contaminants like E Coli, salmonella,’ Jack told Take Part.

Andraka is also profiled in a December 2012 article by Abigail Tucker for the Smithsonian Institution. It reads more like a profile for a fan magazine (in parts) than one might expect from the Smithsonian but all that’s mixed in with some science and a discussion about product availability,

It’s first period digital arts class, and the assignment is to make Photoshop monsters. Sophomore Jack Andraka considers crossing a velociraptor with a Brazilian wandering spider, while another boy grafts butterfly wings onto a rhinoceros. Meanwhile, the teacher lectures on the deranged genius of Doctor Moreau and Frankenstein, “a man who created something he didn’t take responsibility for.”

“You don’t have to do this, Jack!” somebody in back shouts.

The silver glint of a retainer: Andraka grins. Since he won the $75,000 grand prize at this past spring’s Intel International Science and Engineering Fair, one of the few freshmen ever to do so, he’s become a North County High School celebrity to rival any soccer star or homecoming queen.

That’s exactly what Andraka may have invented: A small dipstick probe that uses just a sixth of a drop of blood appears to be much more accurate than existing approaches and takes five minutes to complete. It’s still preliminary, but drug companies are interested, and word is spreading. “I’ve gotten these Facebook messages asking, ‘Can I have the test?’” Andraka says. “I am heartbroken to say no.” [emphasis mine]

According to the Jan. 27, 2013 article by Andri Antoniades for Take Part, Andraka has been talking to companies such as LabCorp and QuestDiagnostics,

He has big plans to turn the medical community on its ear by mass marketing his work, making it widely available. He says, “Essentially what I’m envisioning here is that this could be on your shelf at your Walgreens, your Kmart. Let’s say you suspect you have a condition…you buy the test for that. And you can see immediately if you have it. Instead of your doctor being the doctor, you’re the doctor.” The teenager reports that he’s already in talks with major corporations like LabCorp and QuestDiagnostics to bring his kits to store shelves “as soon as possible,” though how long that may actually take isn’t yet known.

John Nosta’s interview with Andraka, which highlights some of the difficulties associated with science research, was published in a Feb. 1, 2013 posting on Forbes.com,

–Was your discovery easy?  Did the innovation come in a flash…then the details worked out?

I like to read a lot of journals and articles about different topics and then lie on the couch or take a walk and just let all the information settle. Then all of a sudden I can get an idea and connect some dots. Then it’s back to reading so I can fill in missing pieces. With this sensor I had put in a lot of time learning about nanoparticles for my previous research on the effects of bulk and nano metal oxides on marine and freshwater organisms. I felt that single walled carbon nano tubes were like the super heroes of material science and I wanted to work with them some more. Then when I was reading a paper about them in biology class, the teacher was explaining about antibodies. All of a sudden I made a connection and wondered what would happen if I dispersed single wall carbon nanotubes with an antibody to a protein over-expressed in pancreatic cancer. Then of course there was a lot of reading, learning and planning in front of me!

It seemed so easy so I stalked the internet and found the names and professional emails of lots of professors in my area who were working on pancreatic cancer. Then I just figured I’d sit back and wait for the acceptances to roll in! Week after week I’d receive endless rejections. The most helpful one was actually from a researcher who took the time to point out every flaw and reason why my project was impossible. I began to despair!

… Finally, after 199 rejections, I received one email from Dr Maitra at Johns Hopkins School of Medicine. He invited me to come for a meeting. My mom drove me there and dropped me off. It was pretty exhilarating yet scary to walk in to the interview! Luckily I was really prepared and even had the cost and catalog numbers of the material I needed. He said it was like reading a grant proposal. I still had a great deal of basic lab routine to learn and I appreciate the time and patience of both Dr Maitra [Anirban Maitra] and Dr Chenna [V. Chenna], the post- doc who supported me.

There’s a brief description of Andraka’s test in an article (published June 16, 2012 online) by Devin Powell for Science News, 181 (12),

Searching for a better detector for mesothelin, Andraka coated paper with tiny tubes of atom-thick carbon. Antibodies stuck to the carbon nanotubes can grab the telltale protein and spread the tubes apart. The carbon’s resistance to the flow of electricity drops measurably as more protein attaches. Tests of the paper using blood samples from 100 people with cancer at different stages of the disease identified the presence of cancer every time, Andraka reported.
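The readout principle Powell describes, resistance dropping measurably as more protein attaches, can be sketched as a simple threshold test, much like reading a glucose strip. To be clear, the baseline resistance and threshold below are hypothetical placeholders of my own, not Andraka’s calibration values.

```python
# Illustrative sketch of the sensor readout described above: the
# nanotube-coated strip's electrical resistance drops as mesothelin
# binds to the antibodies, so comparing a measured resistance against
# a calibrated baseline yields a simple positive/negative call.
# BASELINE_OHMS and THRESHOLD_DROP are invented placeholders.

BASELINE_OHMS = 100_000.0  # strip resistance with no bound protein (hypothetical)
THRESHOLD_DROP = 0.05      # fractional drop treated as a positive (hypothetical)

def strip_reads_positive(measured_ohms):
    """Return True if resistance fell enough to indicate the biomarker."""
    fractional_drop = (BASELINE_OHMS - measured_ohms) / BASELINE_OHMS
    return fractional_drop >= THRESHOLD_DROP

assert strip_reads_positive(90_000.0)       # 10% drop -> biomarker detected
assert not strip_reads_positive(99_500.0)   # 0.5% drop -> below threshold
```

The cheapness of the test follows from the same idea: measuring a resistance takes only the kind of electronics found in a multimeter.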

It’s quite a story on any number of levels. It’s not just Andraka’s age. There’s the simplicity of the idea, the difficulty of getting anyone to pay attention (199 rejections, that number seems suspiciously poetic), and what was undoubtedly a lot of painstaking, boring, hard work. Finally, the reference to a patent seems very much in the tenor of the times. I wish Andraka good luck with his work and I hope the test is available soon.

Canada’s Queen’s University strikes again with its ‘paper’ devices

Roel Vertegaal at Queen’s University (Ontario, Canada) has released a ‘paper’ tablet. Like the bendable, flexible ‘paper’ phone he presented at the CHI 2011 meeting in Vancouver, Canada (my May 12, 2011 posting), this tablet offers some intriguing possibilities but is tethered. The Jan. 9, 2013 news item on phys.org provides more information about the new ‘paper’ device (Note: Links have been removed),

Watch out tablet lovers – a flexible paper computer developed at Queen’s University in collaboration with Plastic Logic and Intel Labs will revolutionize the way people work with tablets and computers.

The PaperTab tablet looks and feels just like a sheet of paper. However, it is fully interactive with a flexible, high-resolution 10.7-inch plastic display developed by Plastic Logic and a flexible touchscreen. It is powered by a second-generation Intel Core i5 processor.

Vertegaal and his team have produced a video demonstrating their ‘paper’ tablet/computer:

The Jan. 8, 2013 Queen’s University news release, which originated the news item, provides descriptions (for those who don’t have time to watch the video),

“Using several PaperTabs makes it much easier to work with multiple documents,” says Roel Vertegaal, Director of Queen’s University’s Human Media Lab. “Within five to ten years, most computers, from ultra-notebooks to tablets, will look and feel just like these sheets of printed color paper.”

“We are actively exploring disruptive user experiences. The ‘PaperTab’ project, developed by the Human Media Lab at Queen’s University and Plastic Logic, demonstrates novel interactions powered by Intel processors that could potentially delight tablet users in the future,” says Intel’s Experience Design Lead Research Scientist, Ryan Brotman.

PaperTab’s intuitive interface allows users to create a larger drawing or display surface by placing two or more PaperTabs side by side. PaperTab emulates the natural handling of multiple sheets of paper. It can file and display thousands of paper documents, replacing the need for a computer monitor and stacks of papers or printouts.

Unlike traditional tablets, PaperTabs keep track of their location relative to each other, and to the user, providing a seamless experience across all apps, as if they were physical computer windows.

“Plastic Logic’s flexible plastic displays allow a natural human interaction with electronic paper, being lighter, thinner and more robust compared with today’s standard glass-based displays. This is just one example of the innovative revolutionary design approaches enabled by flexible displays,” explains Indro Mukerjee, CEO of Plastic Logic.

The partners are saying that ‘paper’ tablets may be on the market in the foreseeable future, according to Emma Wollacott’s Jan. 8, 2013 article for TG Daily,

The bendy tablet has been coming for quite a while now, but a version to be shown off today at CES [Consumer Electronics Show] could be ready for the market within three years, say its creators.

You can find out more about the Human Media Lab at Queen’s University here, Plastic Logic here, and Intel Core i5 processors here.

Better night vision goggles for the military

I remember a military type, a friend who served as a Canadian peacekeeper (Infantry) in the Balkans, describing night-vision goggles and mentioning they are loud. After all, it’s imaging equipment, and the power source that equipment requires is also a source of noise. The Dec. 29, 2012 news item on Nanowerk about improved imaging for night vision goggles doesn’t mention noise but hopefully, the problem has been addressed or mitigated (assuming this technology is meant to be worn),

Through some key breakthroughs in flexible semiconductors, electrical and computer engineering Professor Zhenqiang “Jack” Ma has created two imaging technologies that have potential applications beyond the 21st century battlefield.

With $750,000 in support from the Air Force Office of Scientific Research (AFOSR), Ma has developed curved night-vision goggles using germanium nanomembranes.

The Dec. 28, 2012 University of Wisconsin-Madison news release, which originated the news item, describes the Air Force project and another night vision project for the US Department of Defense,

Creating night-vision goggles with a curved surface allows a wider field of view for pilots, but requires highly photosensitive materials with mechanical bendability; the silicon used in conventional image sensors doesn’t cut it.

…  Ma’s design employs flexible germanium nanomembranes: a transferrable flexible semiconductor that until now has been too challenging to use in imagers due to a high dark current, the background electrical current that flows through photosensitive materials even when they aren’t exposed to light.

“Because of their higher dark current, the image often comes up much noisier on germanium-based imagers,” says Ma. “We solved that problem.”
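To see why a high dark current translates directly into a noisy image, here’s a back-of-the-envelope sketch: both the photo-generated electrons and the dark-current electrons arrive randomly (shot noise), so the dark contribution adds noise without adding signal. The electron counts below are invented for illustration; they are not figures from Ma’s work.

```python
# Hedged sketch of dark-current noise in an imager pixel. In the
# shot-noise-limited case, the signal-to-noise ratio is the signal
# divided by the square root of ALL collected electrons, including
# the dark-current background. All counts are illustrative.
import math

def pixel_snr(signal_electrons, dark_electrons):
    # Shot-noise-limited SNR: dark electrons add to the noise floor
    # (under the square root) but contribute nothing to the signal.
    return signal_electrons / math.sqrt(signal_electrons + dark_electrons)

low_dark = pixel_snr(1000, 50)     # modest dark current (silicon-like)
high_dark = pixel_snr(1000, 5000)  # high dark current (untreated germanium-like)
assert low_dark > high_dark        # same light, noisier image
```

This is why “we solved that problem” matters: cutting the dark current shrinks the term under the square root, recovering the SNR that germanium’s photosensitivity promises.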

Ma’s dark current reduction technology has also been recently licensed to Intel.

In another imaging project, the U.S. Department of Defense has provided Ma with $750,000 in support of development of imagers for military surveillance that span multiple spectra, combining infrared and visible light into a single image.

“The reason they are interested in IR is because visible light can be blocked by clouds, dust, smoke,” says Ma. “IR can go through, so simultaneous visible and IR imaging allows them to see everything.”

Inexpensive silicon makes production of visible light imagers a simple task, but IR relies on materials incompatible with silicon.

The current approach involves a sensor for IR images and a sensor for visible light, combining the two images in post-processing, which requires greater computing power and hardware complexity. Instead, Ma will employ a heterogeneous semiconductor nanomembrane, stacking the two incompatible materials in each pixel of the new imager to layer IR and visible images on top of one another in a single image.

The result will be imagers that can seamlessly shift between IR and visible images, allowing the picture to be richer and more quickly utilized for strategic decision-making.

It’s impossible to tell from the description whether this particular technology will be worn by foot soldiers or other military personnel but, in the event it is worn, it’s worth remembering that it will need a power source. Interestingly, the average soldier already carries a lot of weight in batteries (up to 35 pounds!) as per my May 9, 2012 posting about energy-harvesting textiles and the military.

Machine Wilderness: ISEA 2012 in Albuquerque, New Mexico

The 2012 ISEA (International Symposium on Electronic Arts) is being held in Albuquerque, New Mexico from Sept. 19 – 24, 2012. From the ISEA 2012 home page,

The Eighteenth International Symposium on Electronic Art, ISEA2012 Albuquerque: Machine Wilderness is a symposium and series of events exploring the discourse of global proportions on the subject of art, technology and nature. The ISEA symposium is held every year in a different location around the world, and has a 30-year history of significant acclaim. Albuquerque is the first host city in the U.S. in six years.

The ISEA2012 symposium will consist of a conference September 19 – 24, 2012 based in Albuquerque with outreach days along the state’s “Cultural Corridor” in Santa Fe and Taos, and an expansive, regional collaboration throughout the fall of 2012, including art exhibitions, public events, performances and educational activities. This project will bring together a wealth of leading creative minds from around the globe, and engage the local community through in-depth partnerships.

Machine Wilderness references the New Mexico region as an area of rapid growth and technology alongside wide expanses of open land, and aims to present artists’ and technologists’ ideas for a more humane interaction between technology and wilderness in which “machines” can take many forms to support life on Earth. Machine Wilderness focuses on creative solutions for how technology and the natural world can sustainably co-exist.

The program will include: a bilingual [English/Spanish] focus, an indigenous thread, and a focus on land and skyscape. Because of our vast resource of land in New Mexico, proposals from artists are being sought that will take ISEA participants out into the landscape. The Albuquerque Balloon Museum offers a unique opportunity for artworks to extend into the sky as well.

Final decisions are being made now so the lists of programs and speakers aren’t complete yet but there is a sampling of some of what you’ll find in New Mexico this coming September (excerpted from the sampling on the Artworks/Performances page),

Eve Andrée Laramée & Tom Jennings (USA)
Invisible Landscape
at 516 ARTS
Invisible Landscape is a collaborative installation concerning the Cold War, “atomic” legacy; uranium mining and radioactive waste from the nuclear power industry and its “Parent machine” the nuclear weapons complex. The installation includes video projections and sculptures, digital photography, and light-box and sound sculptures. It is a mash-up of works by Laramée & Jennings, and includes components from Jennings’ installation Rocks and Code and Laramée’s installations Halfway to Invisible and Slouching Yucca Mountain.

Agnes Chavez (USA/Cuba) & Alessandro Saccoia (Italy)
(x)trees
at The Albuquerque Museum
(x)trees is a collaborative experiment in open source data visualization, video mapping and participatory art. Multi-disciplinary artist Agnes Chavez created the project in collaboration with open source net artist Jared Tarbell to write the open source video mapping code which captures data live from twitter, converts it into branches of trees and allows it to be projected onto walls and buildings as part of a socially interactive art piece. Chavez has collaborated with a team in Buenos Aires, Argentina; Creative Coder Jeff Milton, actionscript programmer Joe Roth, and videographer Matia Legaria, to realize a live event in BsAs. For ISEA2012, Chavez and collaborators will push the boundaries of the new medium to create a socially interactive virtual forest. New forms such as leaves and flowers will emerge around most used topics/key words, visualizing the “buzz” around the conference. (x)tree helps raise awareness to the importance of preserving linguistic, cultural and ecological diversity around the world.

Fred Paulino & Lucas Mafra (Brazil)
Gambiocycle
at 516 ARTS
Gambiocycle is a mobile broadcast unit. It is a tricycle containing electronic gear for interactive video projection and digital graffiti in public space. The vehicle is inspired by anonymous ambulant salesmen who ride on wheels through Brazilian cities, mostly selling products or doing political advertisement. Gambiocycle, however, subverts this logic by gathering elements of performance, happening, electronic art, graffiti and “gambiarra” (makeshift, kludge): what it advertises is only a new era of straight democratic dialogue between the people who participate in the interventions and their city.

Ivan Puig & Andrés Padilla Domené (Mexico)
SEFT-1
at The Albuquerque Museum
SEFT-1, by Mexican artists Ivan Puig and Andrés Padilla Domené, is one of the most important projects working at the intersection of art, technology and society in Mexico. This “Manned Railway Exploration Probe” is a vehicle equipped with a Hi-Rail system, a metal wheel mechanism that enables it to move on rails. Mexico’s trains once formed a network of connections between big cities and tiny pueblos throughout the country. This exploratory probe travels abandoned railways using photography, video, audio and text to record contemporary people, landscape and infrastructure in largely remote areas of the country, creating a futuristic exploration of Mexico’s past. The information recorded is continuously uploaded to the project’s website, where the public can follow the SEFT’s progress. For ISEA2012, the SEFT will make a historic journey from the U.S./Mexico border to Albuquerque. The vehicle will be displayed as part of the ISEA2012 exhibition, and the artists will speak at the Latin American Forum. The journey of the SEFT-1 to El Paso for pre-conference activities is sponsored by The Stanlee and Gerald Rubin Center for the Visual Arts, University of Texas, El Paso.

Sampling of Performances

Idris Goodwin (USA)
Instant Messages
performed during ISEA2012 Intel Education Day
Hip Hop playwright Idris Goodwin will create an original, collaborative, multi-media performance work built entirely from public conversations and debates sampled from various social networking sites. Youth participants from the National Hispanic Cultural Center’s Voces program will cross-reference more than 500 Facebook statuses, comments and Twitter feeds based on specific generic dramatic tropes. The project will interweave hundreds of digital dialogues to dramatize the human interactions of a virtual society. Youth, being the key pioneers of the virtual landscape, are integral to the process of creation.

Miguel Palma (Portugal)
remote Desert Exploration Vehicle
performed at the Downtown Block Party
In collaboration with engineers, robotics experts, geographers, car enthusiasts, military historians and others, Portuguese artist Miguel Palma will convert a former military vehicle into a remote exploration vehicle that will explore desert surroundings during the day and return to urban areas in the evening to project the desert imagery onto buildings and other spaces at night. This project is sponsored by ASU Art Museum and the Desert Initiative.

Here’s a sampling from the Speakers & Panels page,

Public Dialogue: A Conversation with Prominent Brazilian artists and curators
For the ISEA2012 Latin American Forum, artist Giselle Beiguelman and curator Priscila Arantes, mediated by Simone Osthoff, will speak on the international art scene, offering the public a chance to see dynamic dialogues about contemporary media art from first-hand perspectives and experiences. Giselle Beiguelman, guest juror of ISEA2012, is an international new media artist and multimedia essayist born and based in São Paulo, Brazil. She received a PhD in History from the University of São Paulo and is a former fellow of the VITAE Foundation. Priscila Arantes, Adjunct Director of MIS [Museum of Image and Sound] São Paulo and, since 2010, director of the Paço das Artes, also in São Paulo, is a researcher and curator in the field of media art. Simone Osthoff is a Brazilian-born artist and writer based in the U.S. since 1988. She is Associate Professor of Critical Studies in the School of Visual Arts at the Pennsylvania State University and the author of Performing the Archive: The Transformation of the Archive in Contemporary Art From Repository of Documents to Art Medium (Atropos Press, 2009).

Lea Rekow & Marc Schmitz
Mapping Contested Territory
For The Cosmos: Radical Cosmologies theme, theme leader Lea Rekow and artist Marc Schmitz will present a dialogue that brings together critical arts practice and action geography, describing an aerial and walking survey conducted with the Navajo community of Churchrock, New Mexico. Their journey maps radioactive accidents, abandoned uranium mines, dams and mills that lie un-reclaimed and continue to ravage Navajo land, families and culture in the region. For the ISEA2012 conference, Rekow and Schmitz will offer a co-presentation/Skype panel with the Land Art Mongolia Biennial that simultaneously looks at the impact of mining on the indigenous cultures of Mongolia and elsewhere.

Caroline Woolard
For the Creative Economies: Ecotopias theme, OurGoods.org co-founder Caroline Woolard will give a talk about the problems and possibilities of non-monetary exchange. If resource sharing is a paradigm of the 21st century, how do we build trust and communicate effectively at intimate-distance? This talk will explore the subjectivities made (im)possible by alternative economies, both analog and digital. Culled from three years of research and development as a co-founder of OurGoods.org and Trade School, two barter networks for cultural producers, Woolard’s talk reflects upon a contemporary fumbling for sharing relationships. Caroline Woolard is a Brooklyn based, post-media artist exploring civic engagement and communitarianism. Her work is collaborative and often takes the form of sculptures, websites and workshops.

There are a number of residencies and special projects,

ISEA2012 includes an array of residencies and special projects hosted by partnering organizations around New Mexico and the region. They include artist-scientist residencies, site projects, artworks, performances and presentations, with schools, arts organizations, environmental organizations and the scientific and technological community. Some of the residencies and off-site projects feature a gallery component as part of the main ISEA2012 exhibition and/or a presentation at the conference.

Amongst other residencies, I noticed one for e-poetry, which I believe is still open for submissions. Here’s more about the residency (from the e-poetry residency [Local Poets’ Guild] page),

Local Poets’ Guild (LPG) is offering a two-week residency, from September 4 – 18, 2012, to a poet re-envisioning art, technology and nature. LPG is specifically looking for poetry using electronic art forms with at least one component that will be accessible on the web. The writer selected will stay in a house on 3.6 acres in the high desert, located down three miles of dirt roads near the town of Moriarty, New Mexico, about 35 miles from Albuquerque. The residency may be extended for up to two weeks at no additional expense.

Project resources:
The poet who receives the residency will be offered a $400 honorarium from the Local Poets’ Guild and invited to share their work as an Internet-present e-poem and in a reading at 516 ARTS as part of the ISEA2012 conference.

The modest cabin is furnished and has a full kitchen, bath, laundry, bedroom and workspace. The structure is nestled amid piñon and juniper trees, abuts an old windmill, and backs onto 11,000 acres of forested ranchland accessible for hiking. Expect coyotes, owls, nighthawks, deer, the occasional javelina or porcupine, plus great sunlight and better stars. Writers will be expected to provide their own transportation. Couples and/or collaborators are also eligible.

Application requirements:
Please submit a 300-word bio with a 500-word project statement and a link to a prior e-poetry project. Poets who don’t have a prior e-poetry project, or who prefer to show new work, should submit a “.doc” Microsoft Word file or a PDF including all information, plus five pages of poetry.

Description of sponsoring organization:
The Local Poets’ Guild’s mission is to advocate for poetry, develop audiences, engage poets and foster the creative process, from conception and craft to publication and performance. The Local Poets’ Guild offers programs including a rural writers’ residency, craft talks and workshops, featured readings, showcases, publication of books and CDs, writing-to-heal and writing-nonviolence workshops, plus an online information hub, all completely community driven and requiring the best efforts of the poets involved. For more information, visit http://localpoetsguild.wordpress.com/

Good luck!

Registration for the conference opened March 2, 2012. Early bird fees apply until July 25, 2012.