Smart glove maps the human grasp

Engineers have found a cheap way to measure the hand’s sense of touch, paving the way to better prosthetics.
31 May 2019


We already have smartphones and smart cars, but now a group of engineers have created the ‘smart glove’ - and it provides the first ever complete map of the human hand’s sense of touch.

The scalable tactile glove (STAG) is an ordinary glove covered with pressure sensors. It can easily be made out of materials that cost under $10.

It’s the brainchild of MIT’s Subramanian Sundaram, who hoped to investigate how humans use our tactile sense to grasp objects - an undertaking that has been “a longstanding quest for robotics.”

The difficulty comes from trying to completely cover the hand with tactile sensors. A force-sensitive film is needed, one that’s both flexible and thin enough to allow a natural grip.

Sundaram’s innovation is a network of conductive threads coated in PDMS, a liquid polymer that cures into a flexible solid. A grid of ‘insulation displacement cables’ runs crossways to connect the threads and read the data. He said it took “a while to come up with this.”
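Reading a crossing grid of threads like this typically means scanning it one intersection at a time: select one line, read each line running across it, and record the pressure-dependent value at the crossing. The sketch below illustrates that scanning pattern only; `read_sensor` is a made-up stand-in for the real readout electronics, and the grid size is arbitrary:

```python
# Illustrative crosspoint scan of a sensor grid.
# `read_sensor` is a hypothetical stand-in for real readout hardware.
def read_sensor(row, col):
    # Synthetic "pressure" value at this crossing.
    return (row * 31 + col * 17) % 100

def scan_grid(n_rows, n_cols):
    """Return one 'frame': the reading at every row/column crossing."""
    return [[read_sensor(r, c) for c in range(n_cols)]
            for r in range(n_rows)]

frame = scan_grid(4, 4)  # a tiny 4x4 frame of synthetic readings
```

Scanning crossings rather than wiring each sensor individually is what lets a dense grid be read with only rows + columns worth of connections.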

The completed smart glove boasts a dense grid of 548 pressure sensors, which record ‘tactile videos’ at around seven readings per second. The whole thing is flexible, stretchable, and made of readily available materials costing around $10, or even less in bulk.
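A ‘tactile video’ can be pictured as a stack of 2-D pressure frames. The snippet below is only a shape sketch with synthetic numbers: the 32×32 grid layout and the exact frame rate are illustrative assumptions, not specifications from the article.

```python
import numpy as np

# Assumed, illustrative dimensions: sensors laid out on a 32x32 grid,
# sampled at roughly 7 frames per second.
GRID = (32, 32)
FPS = 7
SECONDS = 5

rng = np.random.default_rng(0)
# One synthetic "tactile video": a stack of pressure frames.
tactile_video = rng.random((FPS * SECONDS, *GRID))

print(tactile_video.shape)  # (frames, rows, cols) -> (35, 32, 32)
```

Five hours of recording at this rate is what yields the six-figure frame counts described below.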

Sundaram tested the glove’s data-gathering abilities in an experiment where he picked up, held, and lifted 26 different objects. These ranged from ordinary desk items like a pen, to an unusual piece of fruit called a ‘horned melon’ or ‘kiwano’.

He collected over five hours of these ‘tactile videos’, containing more than 135,000 total ‘frames’ of data. He then split the data in half: one half for training and the other for testing.

Using the first half of the data, he trained a neural network to associate patterns with individual objects. Then, to see if it had learned correctly, he tested it on the other half. With enough ‘frames’ of data, the AI could identify each object from just how it felt.
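The train-and-test procedure described above can be sketched in miniature. The snippet below uses entirely synthetic pressure frames and a simple nearest-centroid classifier in place of the actual neural network, purely to illustrate the half-and-half split and the held-out evaluation:

```python
import numpy as np

rng = np.random.default_rng(42)
N_OBJECTS, FRAMES_PER_OBJECT, N_SENSORS = 3, 40, 548

# Synthetic stand-in data: each object gets a distinct mean pressure pattern.
centers = rng.random((N_OBJECTS, N_SENSORS))
frames = np.concatenate(
    [c + 0.05 * rng.standard_normal((FRAMES_PER_OBJECT, N_SENSORS))
     for c in centers]
)
labels = np.repeat(np.arange(N_OBJECTS), FRAMES_PER_OBJECT)

# Shuffle, then split in half: train on one half, test on the other.
order = rng.permutation(len(frames))
mid = len(frames) // 2
train, test = order[:mid], order[mid:]

# "Training": learn one average pressure pattern per object.
means = np.array([frames[train][labels[train] == k].mean(axis=0)
                  for k in range(N_OBJECTS)])

# "Testing": classify each held-out frame by its nearest learned pattern.
dists = np.linalg.norm(frames[test][:, None, :] - means[None, :, :], axis=2)
predicted = dists.argmin(axis=1)
accuracy = (predicted == labels[test]).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

Evaluating on frames the model never saw during training is what shows it has genuinely learned to tell objects apart by feel, rather than memorising its inputs.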

The patterns the neural network picked up on as it learned were a breakthrough in themselves: they show how certain parts of the hand work together when it grips objects in different ways. Thanks to the smart glove, we have new insights into the complex strategies used by the human hand.

In addition, our own brains may identify objects by touch in the same way that the neural network did: using a virtual map of the hand. We know that the brain uses maps like these for sight, and neuroscience suggests that touch may work in the same way.

Sundaram said that one of the most exciting results was showing how different hand regions work together: “the collaborative nature of human grasp”. In the future this could help to build prosthetics with touch sensors, by placing them at the most useful points on the hand; alternatively, the knowledge could help to design advanced arms for robotics.
