A team of engineers from the University of Glasgow has developed an electronic skin that can learn from feeling ‘pain’. According to the skin’s developers, it could be used to create a new generation of smart robots with human-like sensitivity.
A robot hand incorporating the smart skin shows a remarkable ability to learn to react to external stimuli. The skin uses a new type of processing system built around ‘synaptic transistors’, which mimic the brain’s neural pathways in order to learn.
Scientists have been working for decades to create artificial skin with touch sensitivity. One widely explored method is to spread an array of contact or pressure sensors across the electronic skin’s surface to allow it to detect when it comes into contact with an object. Data from the sensors are then sent to a computer to be processed and interpreted. The sensors typically produce a large volume of data, and processing and responding to it takes time, introducing delays that reduce the skin’s potential effectiveness in real-world tasks.
The researchers took a slightly different approach, drawing inspiration from the way the human peripheral nervous system interprets signals from the skin in order to reduce latency and power consumption. As soon as human skin receives an input, the peripheral nervous system begins to process it at the point of contact, reducing the data to only the vital information before it is sent to the brain. That reduction of sensory data allows for efficient use of the communication channels needed to carry the data to the brain, which then responds almost immediately so that the body can react appropriately.
The researchers created the skin sensor by printing a grid of 168 synaptic transistors made from zinc-oxide nanowires directly onto the surface of a piece of flexible plastic. They then connected the sensor to the palm of a fully articulated, human-shaped robot hand.
When the sensor is touched, it registers a change in its electrical resistance; a small change corresponds to a light touch while a harder touch creates a larger change in resistance. This input is designed to mimic the way in which sensory neurons work in the human body.
In earlier versions of electronic skin, those input data would be sent to a computer to be processed. In the new system, a circuit built into the skin acts as an artificial synapse, reducing the input down into a simple spike of voltage whose frequency varies according to the level of pressure applied to the skin, speeding up the reaction process.
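The encoding described above can be illustrated with a minimal software sketch: a touch produces a resistance change (larger for harder touches), which is converted into a voltage-spike train whose frequency tracks the applied pressure. The specific functions, constants, and the linear mapping below are illustrative assumptions for the sketch, not values from the published device.

```python
# Hypothetical sketch of the skin's event-driven encoding. A touch changes
# the sensor's electrical resistance (a harder touch gives a larger change),
# and that change is reduced to a spike frequency rather than a raw data
# stream. All numbers here are illustrative, not from the paper.

def resistance_change(pressure: float) -> float:
    """Map touch pressure (0..1) to a fractional change in resistance."""
    return 0.5 * pressure  # illustrative linear response


def spike_frequency(delta_r: float, max_hz: float = 100.0) -> float:
    """Encode a resistance change as a voltage-spike frequency in Hz."""
    return max_hz * min(delta_r / 0.5, 1.0)


light_touch = spike_frequency(resistance_change(0.1))  # low-frequency spikes
hard_touch = spike_frequency(resistance_change(0.9))   # high-frequency spikes
```

The point of the sketch is the data reduction: instead of streaming every raw sensor reading to a central computer, the skin forwards only a single frequency value that summarises the stimulus.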
The team used the varying output of the voltage spike to teach the skin appropriate responses to simulated pain, which would trigger the robot hand to react. By setting a threshold of input voltage to cause a reaction, the team could make the robot hand recoil from a sharp jab in the centre of its palm – that is, it learned to move away from a source of simulated discomfort through a process of onboard information processing that mimics how the human nervous system works.
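The threshold behaviour described above can be sketched in a few lines: once the spike frequency for a stimulus crosses a set level, the hand recoils; below it, the touch is ignored. The threshold value and function names are assumptions for illustration, not taken from the research.

```python
# Illustrative threshold rule for the simulated 'pain' response. A spike
# frequency at or above the threshold triggers a recoil; anything weaker
# is treated as a harmless touch. The 60 Hz threshold is an assumption.

RECOIL_THRESHOLD_HZ = 60.0


def react(spike_hz: float) -> str:
    """Return the hand's action for a given input spike frequency."""
    return "recoil" if spike_hz >= RECOIL_THRESHOLD_HZ else "hold"


light_response = react(10.0)  # gentle touch: no reaction
jab_response = react(90.0)    # sharp jab: hand moves away
```

Because this decision is made in the skin itself, no round trip to a central processor is needed, which is the source of the speed-up the researchers describe.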
‘We all learn early on in our lives to respond appropriately to unexpected stimuli like pain in order to prevent us from hurting ourselves again,’ said Professor Ravinder Dahiya from the University of Glasgow’s James Watt School of Engineering. ‘Of course, the development of this new form of electronic skin didn’t really involve inflicting pain as we know it – it’s simply a shorthand way to explain the process of learning from external stimuli.
‘What we’ve been able to create through this process is an electronic skin capable of distributed learning at the hardware level, which doesn’t need to send messages back and forth to a central processor before taking action,’ he continued. ‘Instead, it greatly accelerates the process of responding to touch by cutting down the amount of computation required. We believe that this is a real step forward in our work towards creating large-scale neuromorphic printed electronic skin capable of responding appropriately to stimuli.’
‘In the future, this research could be the basis for a more advanced electronic skin that enables robots capable of exploring and interacting with the world in new ways, or building prosthetic limbs that are capable of near-human levels of touch sensitivity,’ said Fengyuan Liu, a lecturer in the Department of Mechanical Engineering who also worked on the project.
The research has been published in Science Robotics.