Tag Archives: TSMC

Aliens wreak havoc on our personal electronics

The aliens in question are subatomic particles and the havoc they wreak is low-grade according to the scientist who was presenting on the topic at the AAAS (American Association for the Advancement of Science) 2017 Annual Meeting (Feb. 16 – 20, 2017) in Boston, Massachusetts. From a Feb. 17, 2017 news item on ScienceDaily,

You may not realize it but alien subatomic particles raining down from outer space are wreaking low-grade havoc on your smartphones, computers and other personal electronic devices.

When your computer crashes and you get the dreaded blue screen or your smartphone freezes and you have to go through the time-consuming process of a reset, most likely you blame the manufacturer: Microsoft or Apple or Samsung. In many instances, however, these operational failures may be caused by the impact of electrically charged particles generated by cosmic rays that originate outside the solar system.

“This is a really big problem, but it is mostly invisible to the public,” said Bharat Bhuva, professor of electrical engineering at Vanderbilt University, in a presentation on Friday, Feb. 17 at a session titled “Cloudy with a Chance of Solar Flares: Quantifying the Risk of Space Weather” at the annual meeting of the American Association for the Advancement of Science in Boston.

A Feb. 17, 2017 Vanderbilt University news release (also on EurekAlert), which originated the news item, expands on the theme,

When cosmic rays traveling at fractions of the speed of light strike the Earth’s atmosphere they create cascades of secondary particles including energetic neutrons, muons, pions and alpha particles. Millions of these particles strike your body each second. Despite their numbers, this subatomic torrent is imperceptible and has no known harmful effects on living organisms. However, a fraction of these particles carry enough energy to interfere with the operation of microelectronic circuitry. When they interact with integrated circuits, they may alter individual bits of data stored in memory. This is called a single-event upset or SEU.

Since it is difficult to know when and where these particles will strike and they do not do any physical damage, the malfunctions they cause are very difficult to characterize. As a result, determining the prevalence of SEUs is not easy or straightforward. “When you have a single bit flip, it could have any number of causes. It could be a software bug or a hardware flaw, for example. The only way you can determine that it is a single-event upset is by eliminating all the other possible causes,” Bhuva explained.

There have been a number of incidents that illustrate how serious the problem can be, Bhuva reported. For example, in 2003 in the town of Schaerbeek, Belgium, a bit flip in an electronic voting machine added 4,096 extra votes to one candidate. The error was only detected because it gave the candidate more votes than were possible, and it was traced to a single bit flip in the machine’s register. In 2008, the avionics system of a Qantas passenger jet flying from Singapore to Perth appeared to suffer a single-event upset that caused the autopilot to disengage. As a result, the aircraft dove 690 feet in only 23 seconds, injuring about a third of the passengers seriously enough that the aircraft diverted to the nearest airstrip. In addition, there have been a number of unexplained glitches in airline computers – some of which experts believe must have been caused by SEUs – that have resulted in the cancellation of hundreds of flights and significant economic losses.
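The Schaerbeek figure is a textbook signature of a single-event upset: adding exactly 4,096 votes is what a flip of bit 12 (2^12 = 4096) in a binary counter would produce. A toy sketch (the vote tally below is a made-up number, chosen only for illustration):

```python
def flip_bit(value: int, bit: int) -> int:
    """Flip one bit of an integer, mimicking a single-event upset.

    XOR with a one-hot mask inverts exactly the chosen bit.
    """
    return value ^ (1 << bit)

true_count = 1500                  # hypothetical vote tally in the register
corrupted = flip_bit(true_count, 12)   # a particle strike flips bit 12
print(corrupted - true_count)      # 4096 extra votes, as in Schaerbeek
```

The same operation applied twice restores the original value, which is why an SEU leaves no physical trace: only the stored data changes.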

An analysis of SEU failure rates for consumer electronic devices performed by Ritesh Mastipuram and Edwin Wee at Cypress Semiconductor on a previous generation of technology shows how prevalent the problem may be. Their results were published in 2004 in Electronic Design News and provided the following estimates:

  • A simple cell phone with 500 kilobytes of memory should only have one potential error every 28 years.
  • A router farm like those used by Internet providers, with only 25 gigabytes of memory, may experience one potential networking error that interrupts its operation every 17 hours.
  • A person flying in an airplane at 35,000 feet (where radiation levels are considerably higher than they are at sea level) who is working on a laptop with 500 kilobytes of memory may experience one potential error every five hours.

Bhuva is a member of Vanderbilt’s Radiation Effects Research Group, which was established in 1987 and is the largest academic program in the United States that studies the effects of radiation on electronic systems. The group’s primary focus was on military and space applications. Since 2001, the group has also been analyzing radiation effects on consumer electronics in the terrestrial environment. They have studied this phenomenon in the last eight generations of computer chip technology, including the current generation that uses 3D transistors (known as FinFET) that are only 16 nanometers in size. The 16-nanometer study was funded by a group of top microelectronics companies, including Altera, ARM, AMD, Broadcom, Cisco Systems, Marvell, MediaTek, Renesas, Qualcomm, Synopsys, and TSMC.

“The semiconductor manufacturers are very concerned about this problem because it is getting more serious as the size of the transistors in computer chips shrink and the power and capacity of our digital systems increase,” Bhuva said. “In addition, microelectronic circuits are everywhere and our society is becoming increasingly dependent on them.”

To determine the rate of SEUs in 16-nanometer chips, the Vanderbilt researchers took samples of the integrated circuits to the Irradiation of Chips and Electronics (ICE) House at Los Alamos National Laboratory. There they exposed them to a neutron beam and analyzed how many SEUs the chips experienced. Experts measure the failure rate of microelectronic circuits in a unit called a FIT, which stands for failure in time. One FIT is one failure per transistor in one billion hours of operation. That may seem infinitesimal but it adds up extremely quickly with billions of transistors in many of our devices and billions of electronic systems in use today (the number of smartphones alone is in the billions). Most electronic components have failure rates measured in the hundreds to thousands of FITs.
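To get a feel for the arithmetic, a FIT rate can be converted to a mean time between failures for one part, and scaled up to a whole fleet of devices. The numbers below (a 1,000 FIT component and a billion-device fleet) are assumptions chosen for illustration, not figures from the Vanderbilt study:

```python
FIT_PER_BILLION_HOURS = 1e9  # one FIT = one failure per 10^9 device-hours

def mtbf_hours(fit_rate: float) -> float:
    """Mean time between failures, in hours, for one part at the given FIT rate."""
    return FIT_PER_BILLION_HOURS / fit_rate

chip_fit = 1000    # assumed: high end of the "hundreds to thousands" range
fleet_size = 1e9   # assumed: rough order of magnitude of smartphones in use

print(mtbf_hours(chip_fit))   # 1e6 hours per part, i.e. over a century
print(fleet_size * chip_fit / FIT_PER_BILLION_HOURS)  # ~1000 failures/hour fleet-wide
```

This is the "adds up extremely quickly" effect: a failure rate that is negligible for any single device still produces a steady stream of failures across billions of them.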

[Chart] Trends in single event upset failure rates at the individual transistor, integrated circuit and system or device level for the three most recent manufacturing technologies. (Bharat Bhuva, Radiation Effects Research Group, Vanderbilt University)

“Our study confirms that this is a serious and growing problem,” said Bhuva. “This did not come as a surprise. Through our research on radiation effects on electronic circuits developed for military and space applications, we have been anticipating such effects on electronic systems operating in the terrestrial environment.”

Although the details of the Vanderbilt studies are proprietary, Bhuva described the general trend that they have found in the last three generations of integrated circuit technology: 28-nanometer, 20-nanometer and 16-nanometer.

As transistor sizes have shrunk, they have required less and less electrical charge to represent a logical bit. So the likelihood that one bit will “flip” from 0 to 1 (or 1 to 0) when struck by an energetic particle has been increasing. This has been partially offset by the fact that as the transistors have gotten smaller they have become smaller targets so the rate at which they are struck has decreased.

More significantly, the current generation of 16-nanometer circuits uses a 3D architecture that replaced the previous 2D architecture and has proven to be significantly less susceptible to SEUs. Although this improvement has been partially offset by the increase in the number of transistors in each chip, the failure rate at the chip level has still dropped slightly. However, the increase in the total number of transistors being used in new electronic systems has meant that the SEU failure rate at the device level has continued to rise.

Unfortunately, it is not practical to simply shield microelectronics from these energetic particles. For example, it would take more than 10 feet of concrete to keep a circuit from being zapped by energetic neutrons. However, there are ways to design computer chips to dramatically reduce their vulnerability.

For cases where reliability is absolutely critical, you can simply design the processors in triplicate and have them vote. Bhuva pointed out: “The probability that SEUs will occur in two of the circuits at the same time is vanishingly small. So if two circuits produce the same result it should be correct.” This is the approach that NASA used to maximize the reliability of spacecraft computer systems.
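The triplicated-processor scheme Bhuva describes is the classic triple modular redundancy (TMR) pattern: run three copies and accept the majority answer. A minimal sketch in Python (purely illustrative; real systems vote in hardware, at the register or signal level):

```python
def majority_vote(a, b, c):
    """Return the value that at least two of three redundant units agree on."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    # More than one unit disagrees: a single SEU cannot explain this state.
    raise RuntimeError("no majority -- multiple simultaneous faults")

# One processor's result is corrupted by an SEU; the vote masks the fault.
print(majority_vote(42, 42, 34))  # 42
```

The scheme works precisely because, as Bhuva notes, the probability of two simultaneous upsets in independent circuits is vanishingly small, so a single corrupted result is always outvoted.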

The good news, Bhuva said, is that the aviation, medical equipment, IT, transportation, communications, financial and power industries are all aware of the problem and are taking steps to address it. “It is only the consumer electronics sector that has been lagging behind in addressing this problem.”

The engineer’s bottom line: “This is a major problem for industry and engineers, but it isn’t something that members of the general public need to worry much about.”

That’s fascinating and I hope the consumer electronics industry catches up with this ‘alien invasion’ issue. Finally, the ‘bit flips’ made me think of the 1956 movie ‘Invasion of the Body Snatchers’.

Book announcement: Atomistic Simulation of Quantum Transport in Nanoelectronic Devices

For anyone who’s curious about where we go after creating chips at the 7nm size, this may be the book for you. Here’s more from a July 27, 2016 news item on Nanowerk,

In the year 2015, Intel, Samsung and TSMC began to mass-market the 14nm technology called FinFETs. In the same year, IBM, working with Global Foundries, Samsung, SUNY, and various equipment suppliers, announced their success in fabricating 7nm devices. A 7nm silicon channel is about 50 atomic layers and these devices are truly atomic! It is clear that we have entered an era of atomic scale transistors. How do we model the carrier transport in such atomic scale devices?

One way is to improve existing device models by including more and more parameters. This is called the top-down approach. However, as device sizes shrink, the number of parameters grows rapidly, making the top-down approach more and more sophisticated and challenging. Most importantly, to continue Moore’s law, electronic engineers are exploring new electronic materials and new operating mechanisms. These efforts are beyond the scope of well-established device models — hence significant changes are necessary to the top-down approach.

An alternative way is called the bottom-up approach. The idea is to build up nanoelectronic devices atom by atom on a computer, and predict the transport behavior from first principles. By doing so, one is allowed to go inside atomic structures and see what happens from there. The elegance of the approach comes from its unification and generality. Everything comes out naturally from the very basic principles of quantum mechanics and nonequilibrium statistics. The bottom-up approach is complementary to the top-down approach, and is extremely useful for testing innovative ideas of future technologies.

A July 27, 2016 World Scientific news release on EurekAlert, which originated the news item, delves into the topics covered by the book,

In recent decades, several device simulation tools using the bottom-up approach have been developed in universities and software companies. Some examples are McDcal, Transiesta, Atomistic Tool Kit, Smeagol, NanoDcal, NanoDsim, OpenMX, GPAW and NEMO-5. These software tools are capable of predicting electric current flowing through a nanostructure. Essentially the input is the atomic coordinates and the output is the electric current. These software tools have been applied extensively to study emerging electronic materials and devices.

However, developing such a software tool is extremely difficult. It takes years of experience and requires knowledge of, and techniques from, condensed matter physics, computer science, electronic engineering, and applied mathematics. In a library, one can find books on density functional theory, books on quantum transport, books on computer programming, books on numerical algorithms, and books on device simulation. But one can hardly find a book integrating all these fields for the purpose of nanoelectronic device simulation.

“Atomistic Simulation of Quantum Transport in Nanoelectronic Devices” (With CD-ROM) fills the chasm. Authors Yu Zhu and Lei Liu have experience in both academic research and software development. Yu Zhu is the project manager of NanoDsim, and Lei Liu is the project manager of NanoDcal. The content of the book is based on Zhu and Liu’s combined R&D experience of more than forty years.

In this book, the authors experiment with a “paradigm” approach. Instead of organizing material by field, they focus on the development of one particular software tool, NanoDsim, and provide the relevant knowledge and techniques wherever needed. The black box of NanoDsim is opened, and the complete procedure from theoretical derivation, to numerical implementation, all the way to device simulation is illustrated. The affiliated source code of NanoDsim also provides an open platform for new researchers.

I’m not recommending the book as I haven’t read it but it does seem intriguing. For anyone who wishes to purchase it, you can do that here.

I wrote about IBM and its 7nm chip in a July 15, 2015 post.

Canon-Molecular Imprints deal and its impact on shrinking chips (integrated circuits)

There’s quite an interesting April 20, 2014 essay on Nanotechnology Now which provides some insight into the nanoimprinting market. I recommend reading it but for anyone who is not intimately familiar with the scene, here are a few excerpts along with my attempts to decode this insider’s (from Martini Tech) view,

About two months ago, important news shook the small but lively Japanese nanoimprint community: Canon has decided to acquire Texas-based Molecular Imprints, making it a wholly-owned subsidiary. Molecular Imprints is a strong player in the nanotechnology industry and one of the main makers of nanoimprint devices such as the Imprio 450 and other models.

So, Canon, a Japanese company, has made a move into the nanoimprinting sector by purchasing Molecular Imprints, a US company based in Texas, outright.

This next part concerns the expiration of Moore’s Law (the observation that the number of transistors on a chip, and with it chip performance, doubles roughly every 18 to 24 months) and explains why the major chip makers are searching for new solutions, as per the fifth paragraph in this excerpt,

Molecular Imprints` devices are aimed at the IC [integrated circuits, aka chips, I think] patterning market and not just at the relatively smaller applications market to which nanoimprint is usually confined: patterning of bio culture substrates, thin film applications for the solar industry, anti-reflection films for smartphone and LED TV screens, patterning of surfaces for microfluidics among others.

While each one of the markets listed above has the potential of explosive growth in the medium-long term future, at the moment none of them is worth more than a few percentage points, at best, of the IC patterning market.

The mainstream technology behind IC patterning is still optical stepper lithography and the situation is not likely to change in the near term future.

However, optical lithography has its limitations; the main challenge to its 40-year dominance comes not only from technological and engineering issues, but mostly from economic ones.

While from a strictly technological point of view it may still be possible for the major players in the chip industry (Intel, GF, TSMC, Nvidia among others) to go ahead with optical steppers and reach the 5nm node using multi-patterning and immersion, the cost increases associated with each die shrink are becoming staggeringly high.

A top-of-the-line stepper in the early 90s could have been bought for a few million dollars; now the price has increased to some tens of millions for the top machines.

The essay describes the market impact this acquisition may have for Canon,

Molecular Imprints has been a company at the forefront of commercializing nanoimprint-based solutions for IC manufacturing, but so far their solutions have yet to become a viable alternative in the high-volume manufacturing (HVM) IC market.

The main stumbling blocks for IC patterning using nanoimprint technology are: the occurrence of defects on the mask, which are inevitably replicated on each substrate, and the lack of alignment precision between the mold and the substrate needed to pattern multi-layered structures.

Therefore, applications for nanoimprint have been limited to markets where only periodic structures need to be patterned and where single-layer patterning is sufficient.

But the big market where everyone is aiming for is, of course, IC patterning and this is where much of the R&D effort goes.

While logic patterning with nanoimprint may still be years away, simple patterning of NAND structures may be feasible in the near future, and the purchase of Molecular Imprints by Canon is a step in this direction.

Patterning of NAND structures may still require multi-layered structures, but the alignment precision needed is considerably lower than for logic.

Moreover, NAND requirements for defectivity are more relaxed than for logic due to the inherent redundancy of the design, therefore, NAND manufacturing is the natural first step for nanoimprint in the IC manufacturing market and, if successful, it may open a whole new range of opportunities for the whole sector.

Assuming I’ve read the rest of this essay correctly, here’s my summary: there are a number of techniques being employed to make chips smaller and more efficient. Canon has purchased a company versed in a technique that creates NAND structures (you can find definitions here), in the hope that this technique can be commercialized so that Canon becomes dominant in the sector, either because (1) it got there first and/or because (2) NAND manufacturing becomes a clear leader, crushing competition from other technologies. This could cover short-term goals and, I imagine Canon hopes, long-term goals.

It was a real treat coming across this essay as it’s an insider’s view. So, thank you to the folks at Martini Tech who wrote this. You can find Molecular Imprints here.

IBM, Intel, and New York state

$4.4B is quite the investment (especially considering the current international economic gyrations), and it’s the amount that IBM (International Business Machines), Intel, and three other companies announced they are investing to “create the next generation of computer chip technology.” From the Sept. 28, 2011 news item on Nanowerk,

The five companies involved are Intel, IBM, GLOBALFOUNDRIES, TSMC and Samsung. New York State secured the investments in competition with countries in Europe, Asia and the Middle East. The agreements mark an historic level of private investment in the nanotechnology sector in New York. [emphasis mine]

Research and development facilities will be located in Albany, Canandaigua, Utica, East Fishkill and Yorktown Heights. In addition, Intel separately agreed to establish its 450mm East Coast Headquarters to support the overall project management in Albany. [emphasis mine]

The money is being spent on two projects,

The investment in the state is made up of two projects. The first project, which will be led by IBM and its partners, will focus on making the next two generations of computer chips. These new chips will power advanced systems of all sizes, including, among other things, computers and national security applications. This new commitment by IBM brings its total investment in chip technology in New York to more than $10 billion in the last decade.

The second project, which is a joint effort by Intel, IBM, TSMC, Global Foundries and Samsung, will focus on transforming existing 300mm technology into the new 450mm technology. [emphasis mine] The new technology will produce more than twice the number of chips processed on today’s 300 mm wafers thus lowering costs to deliver future generations of technology with greater value and lower environmental impact.
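The “more than twice” figure follows from geometry: 450mm and 300mm are wafer diameters, and the number of chips cut from a wafer scales roughly with wafer area. A quick back-of-the-envelope sketch (the 100 mm² die size is an arbitrary placeholder; it cancels out of the ratio):

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> float:
    """Crude die count: wafer area over die area (ignores edge loss and scribe lines)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return wafer_area / die_area_mm2

# The assumed die area cancels out of the comparison between wafer sizes.
ratio = dies_per_wafer(450, 100) / dies_per_wafer(300, 100)
print(ratio)  # 2.25, i.e. "more than twice" the chips per wafer
```

In practice the gain is slightly better than the pure area ratio, because rectangular dies waste proportionally less area at the edge of a larger round wafer.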

I had to read that bit about increasing sizes a few times since the news items I come across usually crow about decreasing the size. The 450mm figure, though, refers to the diameter of the silicon wafers from which chips are cut, not to the chips themselves; larger wafers yield more chips per production run.

I have been intermittently following news about the nanotechnology sector in New York state for some time (scroll about 1/2 way down my January 29, 2010 posting). In 2008, IBM announced a $1.5B investment toward the nanotechnology sector in that state.

I wish there had been some description of the investments in the nanotechnology sector as opposed to the generalized statements about jobs, purchasing ‘Made in NY’ technology, and the reference to 450mm wafers. As for the “450mm East Coast Headquarters,” they may want to rethink that name.