
UK’s National Physical Laboratory reaches out to ‘BioTouch’ MIT and UCL

This March 27, 2014 news item on Azonano is an announcement for a new project featuring haptics and self-assembly,

NPL (UK’s National Physical Laboratory) has started a new strategic research partnership with UCL (University College London) and MIT (Massachusetts Institute of Technology) focused on haptic-enabled sensing and micromanipulation of biological self-assembly – BioTouch.

The NPL March 27, 2014 news release, which originated the news item, is accompanied by a rather interesting image,

A computer-operated dexterous robotic hand holding a microscope slide with a fluorescent human cell (not to scale) embedded into a synthetic extracellular matrix. Courtesy: NPL


The news release goes on to describe the BioTouch project in more detail (Note: A link has been removed),

The project will probe sensing and application of force and related vectors specific to biological self-assembly as a means of synthetic biology and nanoscale construction. The overarching objective is to enable the re-programming of self-assembled patterns and objects by directed micro-to-nano manipulation with compliant robotic haptic control.

This joint venture, funded by the European Research Council, EPSRC and NPL’s Strategic Research Programme, is a rare blend of interdisciplinary research bringing together expertise in robotics, haptics and machine vision with synthetic and cell biology, protein design, and super- and high-resolution microscopy. The research builds on the NPL’s pioneering developments in bioengineering and imaging and world-leading haptics technologies from UCL and MIT.

Haptics is an emerging enabling tool for sensing and manipulation through touch, which holds particular promise for the development of autonomous robots that need to perform human-like functions in unstructured environments. However, the path to all such applications is hampered by the lack of a compliant interface between a predictably assembled biological system and a human user. This research will enable human directed micro-manipulation of experimental biological systems using cutting-edge robotic systems and haptic feedback.

Recently the UK government announced ‘eight great technologies’ in which Britain is to become a world leader. Robotics, synthetic biology, regenerative medicine and advanced materials are four of these technologies for which this project serves as a merging point, thus providing an excellent example of how multidisciplinary collaborative research can shape our future.

If I read this rightly, it means they’re trying to design systems where robots work directly with materials in the lab while humans direct the robots’ actions from a remote location. My best example of this (it’s not a laboratory example) is a surgery in which a robot actually performs the work while a human directs the robot’s actions based on haptic (touch) information the human receives from the robot. Surgeons don’t necessarily see what they’re dealing with; they may be feeling it with their fingers (haptic information). In effect, the robot’s hands become an extension of the surgeon’s hands. I imagine using a robot’s ‘hands’ would also allow less invasive procedures to be performed.
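For readers wondering what that kind of haptic feedback looks like in software terms, here is a minimal, purely illustrative sketch of a bilateral teleoperation loop in Python. It is not based on anything the BioTouch project has published; every class and method name below is a hypothetical placeholder, and the scaling factors are invented for illustration. The idea it shows is simply this: the operator’s hand motion is scaled down to the sample’s length scale, and the tiny contact forces measured at the robot’s tool tip are scaled up so the operator can feel them.

```python
# Hypothetical sketch of a bilateral haptic teleoperation loop: the human
# "feels" forces measured at the robot's tool tip while the robot mirrors
# the human's hand motion. All names here are illustrative placeholders,
# not an API from NPL, UCL, or MIT.

import time


class HapticDevice:
    """Stand-in for the operator-side device (e.g. a force-feedback stylus)."""

    def read_position(self):
        # Position of the operator's hand, in metres.
        return (0.0, 0.0, 0.0)

    def render_force(self, force):
        # Push back against the operator's hand with the given force vector (N).
        pass


class RemoteRobot:
    """Stand-in for the robot manipulating the sample under the microscope."""

    def move_to(self, position):
        # Command the end effector to a target position.
        pass

    def read_contact_force(self):
        # Force measured at the tool tip, in newtons.
        return (0.0, 0.0, 0.0)


def teleoperation_loop(device, robot, motion_scale=1e-4, force_scale=1e4, hz=1000):
    """Run the bilateral loop at a fixed rate.

    motion_scale shrinks hand motion (centimetres) down to the micromanipulation
    range; force_scale amplifies tiny contact forces so the operator can feel them.
    """
    period = 1.0 / hz
    while True:
        # Scale the operator's motion down to the sample's length scale.
        hand = device.read_position()
        target = tuple(motion_scale * x for x in hand)
        robot.move_to(target)

        # Scale the measured micro-scale force up so it is perceptible.
        contact = robot.read_contact_force()
        feedback = tuple(force_scale * f for f in contact)
        device.render_force(feedback)

        time.sleep(period)
```

In a real system the loop would run on dedicated hardware at a much higher and more tightly controlled rate, but the basic shape, motion scaled one way and force scaled the other, is what makes the robot’s hands feel like an extension of the operator’s.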