
Atomic force microscope (AFM) shrunk down to a dime-sized device?

Before getting to the announcement, here’s a little background from Dexter Johnson’s Feb. 21, 2017 posting on his NanoClast blog (on the IEEE [Institute of Electrical and Electronics Engineers] website; Note: Links have been removed),

Ever since the 1980s, when Gerd Binnig of IBM first heard that “beautiful noise” made by the tip of the first scanning tunneling microscope (STM) dragging across the surface of an atom, and he later developed the atomic force microscope (AFM), these microscopy tools have been the bedrock of nanotechnology research and development.

AFMs have continued to evolve over the years, and at one time, IBM even looked into using them as the basis of a memory technology in the company’s Millipede project. Despite all this development, AFMs have remained bulky and expensive devices, costing as much as $50,000 [or more].

Now, here’s the announcement in a Feb. 15, 2017 news item on Nanowerk,

Researchers at The University of Texas at Dallas have created an atomic force microscope on a chip, dramatically shrinking the size — and, hopefully, the price tag — of a high-tech device commonly used to characterize material properties.

“A standard atomic force microscope is a large, bulky instrument, with multiple control loops, electronics and amplifiers,” said Dr. Reza Moheimani, professor of mechanical engineering at UT Dallas. “We have managed to miniaturize all of the electromechanical components down onto a single small chip.”

A Feb. 15, 2017 University of Texas at Dallas news release, which originated the news item, provides more detail,

An atomic force microscope (AFM) is a scientific tool that is used to create detailed three-dimensional images of the surfaces of materials, down to the nanometer scale — that’s roughly on the scale of individual molecules.

The basic AFM design consists of a tiny cantilever, or arm, that has a sharp tip attached to one end. As the apparatus scans back and forth across the surface of a sample, or the sample moves under it, the interactive forces between the sample and the tip cause the cantilever to move up and down as the tip follows the contours of the surface. Those movements are then translated into an image.

“An AFM is a microscope that ‘sees’ a surface kind of the way a visually impaired person might, by touching. You can get a resolution that is well beyond what an optical microscope can achieve,” said Moheimani, who holds the James Von Ehr Distinguished Chair in Science and Technology in the Erik Jonsson School of Engineering and Computer Science. “It can capture features that are very, very small.”

The UT Dallas team created its prototype on-chip AFM using a microelectromechanical systems (MEMS) approach.

“A classic example of MEMS technology are the accelerometers and gyroscopes found in smartphones,” said Dr. Anthony Fowler, a research scientist in Moheimani’s Laboratory for Dynamics and Control of Nanosystems and one of the article’s co-authors. “These used to be big, expensive, mechanical devices, but using MEMS technology, accelerometers have shrunk down onto a single chip, which can be manufactured for just a few dollars apiece.”

The MEMS-based AFM is about 1 square centimeter in size, or a little smaller than a dime. It is attached to a small printed circuit board, about half the size of a credit card, which contains circuitry, sensors and other miniaturized components that control the movement and other aspects of the device.

Conventional AFMs operate in various modes. Some map out a sample’s features by maintaining a constant force as the probe tip drags across the surface, while others do so by maintaining a constant distance between the two.

“The problem with using a constant height approach is that the tip is applying varying forces on a sample all the time, which can damage a sample that is very soft,” Fowler said. “Or, if you are scanning a very hard surface, you could wear down the tip.”

The MEMS-based AFM operates in “tapping mode,” which means the cantilever and tip oscillate up and down perpendicular to the sample, and the tip alternately contacts then lifts off from the surface. As the probe moves back and forth across a sample material, a feedback loop maintains the height of that oscillation, ultimately creating an image.

“In tapping mode, as the oscillating cantilever moves across the surface topography, the amplitude of the oscillation wants to change as it interacts with the sample,” said Dr. Mohammad Maroufi, a research associate in mechanical engineering and co-author of the paper. “This device creates an image by maintaining the amplitude of oscillation.”
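For readers who like to see the control idea in code, here’s a minimal sketch (in Python) of an amplitude-regulating feedback loop of the general kind described above: the controller nudges the probe height until the oscillation amplitude matches a setpoint, and those height corrections become the image. The model, gains, and names are my own illustrative assumptions, not details of the UT Dallas design,

```python
# A toy tapping-mode feedback loop: simple feedback adjusts probe height so
# the measured oscillation amplitude stays at a setpoint; the corrections
# trace the surface. Model, gains, and units are illustrative only.

def surface_height(x):
    """Toy sample topography (nm) versus scan position (um): a 5 nm plateau."""
    return 5.0 if 2.0 < x < 4.0 else 0.0

def measured_amplitude(free_amplitude, z_probe, x):
    """Toy model: amplitude is clipped down as the tip-sample gap closes."""
    gap = z_probe - surface_height(x)
    return max(0.0, min(free_amplitude, gap))

def scan_line(setpoint=8.0, free_amplitude=10.0, gain=0.2, steps=600):
    z, image = 12.0, []
    for i in range(steps):
        x = i * 6.0 / steps                  # scan position (um)
        error = measured_amplitude(free_amplitude, z, x) - setpoint
        z -= gain * error                    # amplitude too big -> move probe down
        image.append((x, z - setpoint))      # recorded height estimate (nm)
    return image

for x, h in scan_line()[::100]:
    print(f"x = {x:4.2f} um   estimated height = {h:5.2f} nm")
```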

Because conventional AFMs require lasers and other large components to operate, their use can be limited. They’re also expensive.

“An educational version can cost about $30,000 or $40,000, and a laboratory-level AFM can run $500,000 or more,” Moheimani said. “Our MEMS approach to AFM design has the potential to significantly reduce the complexity and cost of the instrument.

“One of the attractive aspects about MEMS is that you can mass produce them, building hundreds or thousands of them in one shot, so the price of each chip would only be a few dollars. As a result, you might be able to offer the whole miniature AFM system for a few thousand dollars.”

A reduced size and price tag also could expand the AFMs’ utility beyond current scientific applications.

“For example, the semiconductor industry might benefit from these small devices, in particular companies that manufacture the silicon wafers from which computer chips are made,” Moheimani said. “With our technology, you might have an array of AFMs to characterize the wafer’s surface to find micro-faults before the product is shipped out.”

The lab prototype is a first-generation device, Moheimani said, and the group is already working on ways to improve and streamline the fabrication of the device.

“This is one of those technologies where, as they say, ‘If you build it, they will come.’ We anticipate finding many applications as the technology matures,” Moheimani said.

In addition to the UT Dallas researchers, Michael Ruppert, a visiting graduate student from the University of Newcastle in Australia, was a co-author of the journal article. Moheimani was Ruppert’s doctoral advisor.

So, an AFM that could cost as much as $500,000 for a laboratory has been shrunk to this size and become far less expensive,

A MEMS-based atomic force microscope developed by engineers at UT Dallas is about 1 square centimeter in size (top center). Here it is attached to a small printed circuit board that contains circuitry, sensors and other miniaturized components that control the movement and other aspects of the device. Courtesy: University of Texas at Dallas

Of course, there’s still more work to be done as you’ll note when reading Dexter’s Feb. 21, 2017 posting where he features answers to questions he directed to the researchers.

Here’s a link to and a citation for the paper,

On-Chip Dynamic Mode Atomic Force Microscopy: A Silicon-on-Insulator MEMS Approach by Michael G. Ruppert, Anthony G. Fowler, Mohammad Maroufi, and S. O. Reza Moheimani. IEEE Journal of Microelectromechanical Systems, Volume 26, Issue 1, Feb. 2017. DOI: 10.1109/JMEMS.2016.2628890. Date of publication: 06 December 2016.

This paper is behind a paywall.

Feeling with a bionic finger

From what I understand, one of the most difficult aspects of an amputation is the loss of touch, so bravo to the engineers. From a March 8, 2016 news item on ScienceDaily,

An amputee was able to feel smoothness and roughness in real-time with an artificial fingertip that was surgically connected to nerves in his upper arm. Moreover, the nerves of non-amputees can also be stimulated to feel roughness, without the need of surgery, meaning that prosthetic touch for amputees can now be developed and safely tested on intact individuals.

The technology to deliver this sophisticated tactile information was developed by Silvestro Micera and his team at EPFL (Ecole polytechnique fédérale de Lausanne) and SSSA (Scuola Superiore Sant’Anna) together with Calogero Oddo and his team at SSSA. The results, published today in eLife, provide new and accelerated avenues for developing bionic prostheses, enhanced with sensory feedback.

A March 8, 2016 EPFL press release (also on EurekAlert), which originated the news item, provides more information about Sørensen’s experience and about the other tests the research team performed,

“The stimulation felt almost like what I would feel with my hand,” says amputee Dennis Aabo Sørensen about the artificial fingertip connected to his stump. He continues, “I still feel my missing hand, it is always clenched in a fist. I felt the texture sensations at the tip of the index finger of my phantom hand.”

Sørensen is the first person in the world to recognize texture using a bionic fingertip connected to electrodes that were surgically implanted above his stump.

Nerves in Sørensen’s arm were wired to an artificial fingertip equipped with sensors. A machine controlled the movement of the fingertip over different pieces of plastic engraved with different patterns, smooth or rough. As the fingertip moved across the textured plastic, the sensors generated an electrical signal. This signal was translated into a series of electrical spikes, imitating the language of the nervous system, then delivered to the nerves.
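Here’s a minimal sketch of one generic way to translate an analog fingertip signal into a “series of electrical spikes,” a leaky integrate-and-fire encoder. The EPFL/SSSA team’s actual encoding isn’t spelled out in the release, so the model and every constant below are illustrative assumptions,

```python
# A toy spike encoder: integrate the fingertip signal and fire a spike each
# time a threshold is crossed, loosely imitating "the language of the
# nervous system." Model and constants are illustrative assumptions only.
import math

def fingertip_signal(t):
    """Toy sensor output while sliding over roughly 40 ridges per second."""
    return 0.6 + 0.4 * math.sin(2 * math.pi * 40 * t)

def encode_spikes(signal, duration=0.1, dt=1e-4, threshold=1.0, leak=0.98):
    """Leaky integrate-and-fire: accumulate input, fire and reset at threshold."""
    v, spikes = 0.0, []
    steps = int(duration / dt)
    for i in range(steps):
        t = i * dt
        v = v * leak + signal(t) * dt * 300   # leaky integration of the input
        if v >= threshold:
            spikes.append(t)                  # record the spike time (s)
            v = 0.0                           # reset after firing
    return spikes

spikes = encode_spikes(fingertip_signal)
print(f"{len(spikes)} spikes in 100 ms; a rougher texture changes the pattern")
```

A coarser or finer texture changes the modulation of the sensor signal and, with it, the spike timing; that timing pattern is the information delivered to the nerves.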

Sørensen could distinguish between rough and smooth surfaces 96% of the time.

In a previous study, Sørensen’s implants were connected to a sensory-enhanced prosthetic hand that allowed him to recognize shape and softness. In this new publication about texture in the journal eLife, the bionic fingertip attains a superior level of touch resolution.

Simulating touch in non-amputees

This same experiment testing coarseness was performed on non-amputees, without the need of surgery. The tactile information was delivered through fine needles that were temporarily attached to the arm’s median nerve through the skin. The non-amputees were able to distinguish roughness in textures 77% of the time.

But does this information about touch from the bionic fingertip really resemble the feeling of touch from a real finger? The scientists tested this by comparing brain-wave activity of the non-amputees, once with the artificial fingertip and then with their own finger. The brain scans collected by an EEG cap on the subject’s head revealed that activated regions in the brain were analogous.

The research demonstrates that the needles relay the information about texture in much the same way as the implanted electrodes, giving scientists new protocols for accelerating improvements in touch resolution in prosthetics.

“This study merges fundamental sciences and applied engineering: it provides additional evidence that research in neuroprosthetics can contribute to the neuroscience debate, specifically about the neuronal mechanisms of the human sense of touch,” says Calogero Oddo of the BioRobotics Institute of SSSA. “It will also be translated to other applications such as artificial touch in robotics for surgery, rescue, and manufacturing.”

Here’s a link to and a citation for the paper,

Intraneural stimulation elicits discrimination of textural features by artificial fingertip in intact and amputee humans by Calogero Maria Oddo, Stanisa Raspopovic, Fiorenzo Artoni, Alberto Mazzoni, Giacomo Spigler, Francesco Petrini, Federica Giambattistelli, Fabrizio Vecchio, Francesca Miraglia, Loredana Zollo, Giovanni Di Pino, Domenico Camboni, Maria Chiara Carrozza, Eugenio Guglielmelli, Paolo Maria Rossini, Ugo Faraguna, and Silvestro Micera. eLife 5 (2016). DOI: 10.7554/eLife.09148. Published March 8, 2016.

This paper appears to be open access.

The sense of touch via artificial skin

Scientists have been working for years to allow artificial skin to transmit what the brain would recognize as the sense of touch. For anyone who has lost a limb and gotten a prosthetic replacement, the loss of touch is reputedly one of the more difficult losses to accept. The sense of touch is also vital in robotics if the field is to expand and include activities reliant on the sense of touch, e.g., how much pressure do you use to grasp a cup; how much strength do you apply when moving an object from one place to another?

For anyone interested in the ‘electronic skin and pursuit of touch’ story, I have a Nov. 15, 2013 posting which highlights the evolution of the research into e-skin and what was then some of the latest work.

This posting is a 2015 update of sorts featuring the latest e-skin research from Stanford University and Xerox PARC. (Dexter Johnson in an Oct. 15, 2015 posting on his Nanoclast blog (on the IEEE [Institute of Electrical and Electronics Engineers] site) provides a good research summary.) For anyone with an appetite for more, there’s this from an Oct. 15, 2015 American Association for the Advancement of Science (AAAS) news release on EurekAlert,

Using flexible organic circuits and specialized pressure sensors, researchers have created an artificial “skin” that can sense the force of static objects. Furthermore, they were able to transfer these sensory signals to the brain cells of mice in vitro using optogenetics. For the many people around the world living with prosthetics, such a system could one day allow them to feel sensation in their artificial limbs.

To create the artificial skin, Benjamin Tee et al. developed a specialized circuit out of flexible, organic materials. It translates static pressure into digital signals that depend on how much mechanical force is applied. A particular challenge was creating sensors that can “feel” the same range of pressure that humans can. Thus, on the sensors, the team used carbon nanotubes molded into pyramidal microstructures, which are particularly effective at tunneling the signals from the electric field of nearby objects to the receiving electrode in a way that maximizes sensitivity.

Transferring the digital signal from the artificial skin system to the cortical neurons of mice proved to be another challenge, since conventional light-sensitive proteins used in optogenetics do not stimulate neural spikes for sufficient durations for these digital signals to be sensed. Tee et al. therefore engineered new optogenetic proteins able to accommodate longer intervals of stimulation. Applying these newly engineered optogenetic proteins to fast-spiking interneurons of the somatosensory cortex of mice in vitro sufficiently prolonged the stimulation interval, allowing the neurons to fire in accordance with the digital stimulation pulse. These results indicate that the system may be compatible with other fast-spiking neurons, including peripheral nerves.

And, there’s an Oct. 15, 2015 Stanford University news release on EurekAlert describing this work from another perspective,

The heart of the technique is a two-ply plastic construct: the top layer creates a sensing mechanism and the bottom layer acts as the circuit to transport electrical signals and translate them into biochemical stimuli compatible with nerve cells. The top layer in the new work featured a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake.

Five years ago, Bao’s [Zhenan Bao, a professor of chemical engineering at Stanford,] team members first described how to use plastics and rubbers as pressure sensors by measuring the natural springiness of their molecular structures. They then increased this natural pressure sensitivity by indenting a waffle pattern into the thin plastic, which further compresses the plastic’s molecular springs.

To exploit this pressure-sensing capability electronically, the team scattered billions of carbon nanotubes through the waffled plastic. Putting pressure on the plastic squeezes the nanotubes closer together and enables them to conduct electricity.

This allowed the plastic sensor to mimic human skin, which transmits pressure information as short pulses of electricity, similar to Morse code, to the brain. Increasing pressure on the waffled nanotubes squeezes them even closer together, allowing more electricity to flow through the sensor, and those varied impulses are sent as short pulses to the sensing mechanism. Remove pressure, and the flow of pulses relaxes, indicating light touch. Remove all pressure and the pulses cease entirely.
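For the technically inclined, the pressure-to-pulse-rate behaviour described above can be caricatured in a few lines of Python: squeezing the nanotubes lowers the sensor’s resistance, and a simple oscillator model turns lower resistance into a faster pulse train. The log-linear resistance curve and all constants are invented for illustration and are not the published device parameters,

```python
# A toy model of the pressure-to-pulse-frequency behaviour described above:
# more pressure squeezes the nanotubes together, resistance drops, and the
# output pulse rate rises. All numbers are illustrative assumptions.

def sensor_resistance(pressure_kpa, r_min=1e4, r_max=1e7):
    """Toy model: resistance (ohms) falls steeply as nanotubes are squeezed."""
    squeeze = min(pressure_kpa / 50.0, 1.0)    # saturate at 50 kPa
    return r_max * (r_min / r_max) ** squeeze  # log-linear drop with pressure

def pulse_frequency(pressure_kpa, c=1e-9):
    """RC-oscillator caricature: pulse rate inversely proportional to R."""
    return 1.0 / (sensor_resistance(pressure_kpa) * c)

for p in (0, 5, 15, 30, 50):                   # light tap ... firm handshake
    print(f"{p:2d} kPa -> {pulse_frequency(p):8.0f} pulses/s")
```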

The team then hooked this pressure-sensing mechanism to the second ply of their artificial skin, a flexible electronic circuit that could carry pulses of electricity to nerve cells.

Importing the signal

Bao’s team has been developing flexible electronics that can bend without breaking. For this project, team members worked with researchers from PARC, a Xerox company, which has a technology that uses an inkjet printer to deposit flexible circuits onto plastic. Covering a large surface is important to making artificial skin practical, and the PARC collaboration offered that prospect.

Finally the team had to prove that the electronic signal could be recognized by a biological neuron. It did this by adapting a technique developed by Karl Deisseroth, a fellow professor of bioengineering at Stanford who pioneered a field that combines genetics and optics, called optogenetics. Researchers bioengineer cells to make them sensitive to specific frequencies of light, then use light pulses to switch cells, or the processes being carried on inside them, on and off.

For this experiment the team members engineered a line of neurons to simulate a portion of the human nervous system. They translated the electronic pressure signals from the artificial skin into light pulses, which activated the neurons, proving that the artificial skin could generate a sensory output compatible with nerve cells.

Optogenetics was only used as an experimental proof of concept, Bao said, and other methods of stimulating nerves are likely to be used in real prosthetic devices. Bao’s team has already worked with Bianxiao Cui, an associate professor of chemistry at Stanford, to show that direct stimulation of neurons with electrical pulses is possible.

Bao’s team envisions developing different sensors to replicate, for instance, the ability to distinguish corduroy versus silk, or a cold glass of water from a hot cup of coffee. This will take time. There are six types of biological sensing mechanisms in the human hand, and the experiment described in Science reports success in just one of them.

But the current two-ply approach means the team can add sensations as it develops new mechanisms. And the inkjet printing fabrication process suggests how a network of sensors could be deposited over a flexible layer and folded over a prosthetic hand.

“We have a lot of work to take this from experimental to practical applications,” Bao said. “But after spending many years in this work, I now see a clear path where we can take our artificial skin.”

Here’s a link to and a citation for the paper,

A skin-inspired organic digital mechanoreceptor by Benjamin C.-K. Tee, Alex Chortos, Andre Berndt, Amanda Kim Nguyen, Ariane Tom, Allister McGuire, Ziliang Carter Lin, Kevin Tien, Won-Gyu Bae, Huiliang Wang, Ping Mei, Ho-Hsiu Chou, Bianxiao Cui, Karl Deisseroth, Tse Nga Ng, and Zhenan Bao. Science, 16 October 2015, Vol. 350, No. 6258, pp. 313–316. DOI: 10.1126/science.aaa9306.

This paper is behind a paywall.

A wearable book (The Girl Who Was Plugged In) makes you feel the protagonist’s pain

A team of students taking an MIT (Massachusetts Institute of Technology) course called ‘Science Fiction to Science Fabrication’ has created a new category for books: sensory fiction. John Brownlee in his Feb. 10, 2014 article for Fast Company describes it this way,

Have you ever felt your pulse quicken when you read a book, or your skin go clammy during a horror story? A new student project out of MIT wants to deepen those sensations. They have created a wearable book that uses inexpensive technology and neuroscientific hacking to create a sort of cyberpunk Neverending Story that blurs the line between the bodies of a reader and protagonist.

Called Sensory Fiction, the project was created by a team of four MIT students–Felix Heibeck, Alexis Hope, Julie Legault, and Sophia Brueckner …

Here’s the MIT video demonstrating the book in use (from the course’s sensory fiction page),

Here’s how the students have described their sensory book, from the project page,

Sensory fiction is about new ways of experiencing and creating stories.

Traditionally, fiction creates and induces emotions and empathy through words and images.  By using a combination of networked sensors and actuators, the Sensory Fiction author is provided with new means of conveying plot, mood, and emotion while still allowing space for the reader’s imagination. These tools can be wielded to create an immersive storytelling experience tailored to the reader.

To explore this idea, we created a connected book and wearable. The ‘augmented’ book portrays the scenery and sets the mood, and the wearable allows the reader to experience the protagonist’s physiological emotions.

The book cover animates to reflect the book’s changing atmosphere, while certain passages trigger vibration patterns.

Changes in the protagonist’s emotional or physical state trigger discrete feedback in the wearable, whether by changing the heartbeat rate, creating constriction through air pressure bags, or causing localized temperature fluctuations.

Our prototype story, ‘The Girl Who Was Plugged In’ by James Tiptree, showcases an incredible range of settings and emotions. The main protagonist experiences both deep love and ultimate despair, the freedom of Barcelona sunshine and the captivity of a dark damp cellar.

The book and wearable support the following outputs:

  • Light (the book cover has 150 programmable LEDs to create ambient light based on changing setting and mood)
  • Sound
  • Personal heating device to change skin temperature (through a Peltier junction secured at the collarbone)
  • Vibration to influence heart rate
  • Compression system (to convey tightness or loosening through pressurized airbags)
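To make the scripting concrete, here’s an entirely hypothetical sketch of how passages might be mapped to the wearable’s outputs listed above; the students haven’t published control code, so every name, page number, and value below is invented,

```python
# Hypothetical sketch of the page-to-actuator mapping the project describes:
# the book reports the current page, and keyed passages trigger wearable
# feedback. None of these names or values come from the MIT prototype.
from dataclasses import dataclass

@dataclass
class Feedback:
    heart_rate_bpm: int       # vibration-simulated heartbeat
    skin_temp_delta_c: float  # Peltier heating/cooling at the collarbone
    compression: float        # airbag pressure, 0 (loose) to 1 (tight)
    led_rgb: tuple            # ambient light from the cover's LEDs

# Passages keyed by page number: Barcelona sunshine vs. the dark cellar.
SCRIPT = {
    12: Feedback(heart_rate_bpm=72,  skin_temp_delta_c=+1.5,
                 compression=0.0, led_rgb=(255, 200, 80)),
    47: Feedback(heart_rate_bpm=110, skin_temp_delta_c=-2.0,
                 compression=0.7, led_rgb=(10, 10, 40)),
}

def on_page_turn(page, wearable):
    """Fire the scripted feedback when the reader reaches a keyed passage."""
    if page in SCRIPT:
        wearable.apply(SCRIPT[page])

class PrintWearable:
    """Stand-in for the vest: prints the commands it would actuate."""
    def apply(self, fb):
        print(f"heartbeat {fb.heart_rate_bpm} bpm, temp {fb.skin_temp_delta_c:+.1f} C, "
              f"compression {fb.compression:.0%}, light {fb.led_rgb}")

w = PrintWearable()
for page in (11, 12, 46, 47):
    on_page_turn(page, w)
```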

One of the earliest stories about this project was a Jan. 28, 2014 piece written by Alison Flood for the Guardian where she explains how vibration, etc. are used to convey/stimulate the reader’s sensations and emotions,

MIT scientists have created a ‘wearable’ book using temperature and lighting to mimic the experiences of a book’s protagonist

The book, explain the researchers, senses the page a reader is on, and changes ambient lighting and vibrations to “match the mood”. A series of straps form a vest which contains a “heartbeat and shiver simulator”, a body compression system, temperature controls and sound.

“Changes in the protagonist’s emotional or physical state trigger discrete feedback in the wearable [vest], whether by changing the heartbeat rate, creating constriction through air pressure bags, or causing localised temperature fluctuations,” say the academics.

Flood goes on to illuminate how science fiction has explored the notion of ‘sensory books’ (Note: Links have been removed) and how at least one science fiction novelist is responding to this new type of book,

The Arthur C Clarke award-winning science fiction novelist Chris Beckett wrote about a similar invention in his novel Marcher, although his “sensory” experience comes in the form of a video game:

Adam Roberts, another prize-winning science fiction writer, found the idea of “sensory” fiction “amazing”, but also “infantilising, like reverting to those sorts of books we buy for toddlers that have buttons in them to generate relevant sound-effects”.

Elise Hu in her Feb. 6, 2014 posting on the US National Public Radio (NPR) blog, All Tech Considered, takes a different approach to the topic,

The prototype does work, but it won’t be manufactured anytime soon. The creation was only “meant to provoke discussion,” Hope says. It was put together as part of a class in which designers read science fiction and make functional prototypes to explore the ideas in the books.

If it ever does become more widely available, sensory fiction could have an unintended consequence. When I shared this idea with NPR editor Ellen McDonnell, she quipped, “If these device things are helping ‘put you there,’ it just means the writing won’t have to be as good.”

I hope the students are successful at provoking discussion as so far they seem to have primarily provoked interest.

As for my two cents, I think that in a world where it seems making personal connections is increasingly difficult (i.e., people becoming more isolated), sensory fiction which stimulates people into feeling something as they read a book seems a logical progression. It’s also interesting to me that all of the focus is on the reader with no mention as to what writers might produce (other than McDonnell’s cheeky comment) if they knew their books were going to be given the ‘sensory treatment’. One more musing, I wonder if there might be a difference in how males and females, writers and readers, respond to sensory fiction.

Now for a bit of wordplay. Feeling can be emotional but, in English, it can also refer to touch and researchers at MIT have also been investigating new touch-oriented media.  You can read more about that project in my Reaching beyond the screen with the Tangible Media Group at the Massachusetts Institute of Technology (MIT) posting dated Nov. 13, 2013. One final thought, I am intrigued by how interested scientists at MIT seem to be in feelings of all kinds.

Do you hear what I hear?

It’s coming up to Christmas time and, as my thoughts turn to the music, Stanford University (California, US) researchers are focused on hearing and touch (the two are related), according to a Dec. 4, 2013 news item on Nanowerk,

Much of what is known about sensory touch and hearing cells is based on indirect observation. Scientists know that these exceptionally tiny cells are sensitive to changes in force and pressure. But to truly understand how they function, scientists must be able to manipulate them directly. Now, Stanford scientists are developing a set of tools that are small enough to stimulate an individual nerve or group of nerves, but also fast and flexible enough to mimic a realistic range of forces.

The Dec. 3, 2013 Stanford Report article by Cynthia McKelvey, which originated the news item, provides more detail about hearing and the problem the researchers are attempting to solve,

Our ability to interpret sound is largely dependent on bundles of thousands of tiny hair cells that get their name from the hair-like projections on their top surfaces. As sound waves vibrate the bundles, they force proteins in the cells’ surfaces to open and allow electrically charged molecules, called ions, to flow into the cells. The ions stimulate each hair cell, allowing it to transfer information from the sound wave to the brain. Hair bundles are more sensitive to particular frequencies of sound, which allows us to tell the difference between a siren and a subwoofer.

People with damaged or congenital defects in these delicate hair cells suffer from severe, irreversible hearing loss. Scientists remain unsure how to treat this form of hearing loss because they do not know how to repair or replace a damaged hair cell. Physical manipulation of the cells is key to exploring the fine details of how they function. This new probe is the first tool nimble enough to do it.

The article also goes on to describe the ‘nano’ probe,

The new force probe represents several advantages over traditional glass force probes. At 300 nanometers thick, Pruitt’s [Beth Pruitt, an associate professor of mechanical engineering] probe is just three-thousandths the width of a human hair. Made of flexible silicon, the probe can mimic a much wider range of sound wave frequencies than rigid glass probes, making it more practical for studying hearing. The probe also measures the force it exerts on hair cells as it pushes, a new achievement for high-speed force probes at such small sizes.

Manipulating the probe requires a gentle touch, said Pruitt’s collaborator, Anthony Ricci, a professor of otolaryngology at the Stanford School of Medicine. The tissue samples – in this case, hair cells from a rat’s ear – sit under a microscope on a stage floating on a cushion of air that keeps it isolated from vibrations.

The probe is controlled using three dials that function similarly to an Etch-a-Sketch. The first step of the experiment involves connecting a tiny, delicate glass electrode to the body of a single hair cell.

Using a similar manipulator, Ricci and his team then press the force probe on a single hair cell, and the glass electrode records the changes in the cell’s electrical output. Pruitt and Ricci say that understanding how physical changes prompt electrical responses in hair cells can lead to a better understanding of how people lose their hearing following damage to the hair cells.

The force probe has the potential to catalyze future research on sensory science, Ricci said.

Up to now, limits in technology have held scientists back from understanding important functions such as hearing, touch, and balance. Like hair cells in the ear, cells involved in touch and balance react to the flexing and stretching of their cell membranes. The force probe can be used to study those cells in the same manner that Pruitt and Ricci are using it to study hair cells.

Understanding the mechanics of how cells register these sensory inputs could lead to innovative new treatments and prosthetics. For example, Pruitt and Ricci think their research could help bioengineers build a better hair cell for people with impaired hearing from damage to their natural hair cells.

Stanford has produced a video about this work,

I find it fascinating that hearing and touch are related although I haven’t yet seen anything that describes or explains the relationship. As for anyone hoping for a Christmas carol, I think I’m going to hold off until later in the season.

Touchy feely breakthrough at the nano scale

This first posting back after a three-week hiatus (I’m baaack) concerns a study in Sweden where scientists found that people can discern nano wrinkles with their fingertips. From the Sept. 16, 2013 news item on Nanowerk,

In a ground-breaking study, Swedish scientists have shown that people can detect nano-scale wrinkles while running their fingers upon a seemingly smooth surface. The findings could lead to such advances as touch screens for the visually impaired and other products, says one of the researchers from KTH Royal Institute of Technology.

The study marks the first time that scientists have quantified how people feel, in terms of a physical property. One of the authors, Mark Rutland, Professor of Surface Chemistry, says that the human finger can discriminate between surfaces patterned with ridges as small as 13 nanometres in amplitude and non-patterned surfaces.

The KTH Sept. 16, 2013 news release by David Callahan, which originated the news item, describes the new understanding of touch and its possible applications,

The study highlights the importance of surface friction and wrinkle wavelength – or wrinkle width – in the tactile perception of fine textures.

When a finger is drawn over a surface, vibrations occur in the finger. People feel these vibrations differently on different structures. The friction properties of the surface control how hard we press on the surface as we explore it. A high friction surface requires us to press less to achieve the optimum friction force.

“This is the breakthrough that allows us to design how things feel and are perceived,” says Rutland. “It allows, for example, for a certain portion of a touch screen on a smartphone to be designed to feel differently by vibration.”

The research could inform the development of the sense of touch in robotics and virtual reality. A plastic touch screen surface could be made to feel like another material, such as fabric or wood, for example. The findings also enable differentiation in product packaging, or in the products themselves. A shampoo, for example, can be designed to change the feel of one’s hair.

The news release goes on to describe how the research was conducted,

With the collaboration of National Institute of Standards and Technology (NIST) material science labs, Rutland and his colleagues produced 16 chemically-identical surfaces with wrinkle wavelengths (or wrinkle widths) ranging from 300 nanometres to 90 micrometres, and amplitudes (or wrinkle heights) of between seven nanometres and 4.5 micrometres, as well as two non-patterned surfaces. The participants were presented with random pairs of surfaces and asked to run their dominant index finger across each one in a designated direction, which was perpendicular to the groove, before rating the similarity of the two surfaces.

The smallest pattern that could be distinguished from the non-patterned surface had grooves with a wavelength of 760 nanometres and an amplitude of only 13 nanometres.
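A quick back-of-the-envelope calculation shows why wrinkle wavelength matters to the vibration story: a finger sliding at speed v over wrinkles of wavelength λ is driven at frequency f = v/λ. Assuming an illustrative scanning speed of 10 millimetres per second (my assumption; the study’s actual speeds aren’t quoted in the news release), the 760-nanometre wrinkles would produce,

```latex
f = \frac{v}{\lambda}
  = \frac{1.0 \times 10^{-2}\ \text{m/s}}{7.6 \times 10^{-7}\ \text{m}}
  \approx 1.3 \times 10^{4}\ \text{Hz} \approx 13\ \text{kHz}
```

Slower exploration scales the frequency down proportionally (1 mm/s over the same wrinkles gives about 1.3 kHz), but the point stands: nanometre-scale texture becomes a vibration signal in the finger.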

Rutland says that by bringing together professors and PhD students from two different disciplines – surface chemistry and psychology – the team succeeded in creating “a truly psycho-physical study.”

“The important thing is that touch was previously the unknown sense,” Rutland says. “To make the analogy with vision, it is as if we have just revealed how we perceive colour.

“Now we can start using this knowledge for tactile aesthetics in the same way that colours and intensity can be combined for visual aesthetics.”

Here’s a citation for and link to the researchers’ study,

Feeling Small: Exploring the Tactile Perception Limits by Lisa Skedung, Martin Arvidsson, Jun Young Chung, Christopher M. Stafford, Birgitta Berglund, and Mark W. Rutland. Scientific Reports 3, Article number 2617 (2013). DOI: 10.1038/srep02617. Published 12 September 2013.

This paper is open access.

Reimagining prosthetic arms; touchable holograms and brief thoughts on multimodal science communication; and nanoscience conference in Seattle

Reimagining the prosthetic arm, an article by Cliff Kuang in Fast Company (here), highlights a student design project at New York’s School of Visual Arts. Students were asked to improve prosthetic arms and were given four categories: decorative, playful, utilitarian, and awareness. This one by Tonya Douraghey and Carli Pierce caught my fancy; after all, who hasn’t thought of growing wings? (From the Fast Company website),

Feathered cuff and wing arm

I suggest reading Kuang’s article before heading off to the project website to see more student projects.

At the end of yesterday’s posting about MICA and multidimensional data visualization in spaces with up to 12 dimensions (here) in virtual worlds such as Second Life, I made a comment about multimodal discourse which is something I think will become increasingly important. I’m not sure I can imagine 12 dimensions but I don’t expect that our usual means of visualizing or understanding data are going to be sufficient for the task. Consequently, I’ve been noticing more projects which engage some of our other senses, notably touch. For example, the SIGGRAPH 2009 conference in New Orleans featured a hologram that you can touch. This is another article by Cliff Kuang in Fast Company, Holograms that you can touch and feel. For anyone unfamiliar with SIGGRAPH, the show has introduced a number of important innovations, notably, clickable icons. It’s hard to believe but there was a time when everything was done by keyboard.

My August newsletter from NISE Net (Nanoscale Informal Science Education Network) brings news of a conference in Seattle, WA, at the Pacific Science Center, Sept. 8 – 11, 2009. It will feature (from the NISE Net blog),

Members of the NISE Net Program group and faculty and students at the Center for Nanotechnology in Society at Arizona State University are teaming up to demonstrate and discuss potential collaborations between the social science community and the informal science education community at a conference of the Society for the Study of Nanoscience and Emerging Technologies in Seattle in early September.

There’s more at the NISE Net blog here including a link to the conference site. (I gather the Society for the Study of Nanoscience and Emerging Technologies is in its very early stages of organizing so this is a fairly informal call for registrants.)

The NISE Net nano haiku this month is,

Nanoparticles
Surface plasmon resonance
Silver looks yellow

by Dr. Katie D. Cadwell of the University of Wisconsin-Madison MRSEC.

Have a nice weekend!