
Robots can read your face now, but via wearable device

South Korean researchers are a step ahead of their Japanese counterparts in internet and future-facing technology, especially robotics.

Building on their work in visual sensing technology, they have developed a stretchable, transparent sensor that could let robots read human facial expressions.

The ultrasensitive sensor, worn as a patch on the skin, consists of a carbon nanotube film layered with two types of electrically conductive elastomer. With it, a robot can distinguish expressions such as smiling, frowning, brow-raising, brow-furrowing and eye-rolling. It can also work out where the wearer is looking, registering even a slight shift in gaze.

The sensors, developed by Nae-Eung Lee of Sungkyunkwan University, Seoul, and his colleagues, have numerous other applications: they "could be used to monitor heartbeats, breathing, dysphagia (difficulty swallowing) and other health-related cues," said Lee.

The catch is that humans, not the robots, need to wear the attachable or implantable platform for the interface with the machine to work.

The stretchable, transparent, ultrasensitive and patchable strain sensor, made of a novel sandwich-like stacked piezoresistive nanohybrid film, must be worn to send signals to the robot. That makes the machine more interactive and intuitive with humans, giving it the ability to read its users' emotions and respond with a computerised version of empathy.

This sensor, which can detect small strains on human skin, was made using water-based solution processing. "We attributed the tunability of strain sensitivity, stability, and optical transparency to enhanced formation of percolating networks," the researchers said.

"The mechanical stability, high stretchability of up to 100%, optical transparency of 62%, and gauge factor of 62 suggested that when attached to the skin of the face, this sensor would be able to detect small strains induced by emotional expressions such as laughing and crying, as well as eye movement, and we confirmed this experimentally," the researchers wrote in their abstract.
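For readers unfamiliar with the gauge factor quoted above, a minimal Python sketch below shows how the standard strain-gauge relation, ΔR/R₀ = GF × ε, turns a small skin strain into a relative change in the sensor's electrical resistance. The gauge factor of 62 comes from the article; the strain values are illustrative assumptions, not measurements from the paper.

```python
# Minimal sketch of the standard strain-gauge relation: delta_R / R0 = GF * strain.
# The gauge factor of 62 is the value reported in the article; the strain values
# below are illustrative assumptions, not data from the researchers.

GAUGE_FACTOR = 62  # reported gauge factor of the sensor


def relative_resistance_change(strain: float, gauge_factor: float = GAUGE_FACTOR) -> float:
    """Return the fractional resistance change (delta_R / R0) for a given strain."""
    return gauge_factor * strain


if __name__ == "__main__":
    # Hypothetical skin strains: a subtle eye movement versus a broad smile.
    for label, strain in [("slight gaze shift (~0.2% strain)", 0.002),
                          ("broad smile (~5% strain)", 0.05)]:
        change = relative_resistance_change(strain)
        print(f"{label}: resistance changes by about {change * 100:.1f}%")
```

The point of the high gauge factor is that even the tiny strains produced by an eye movement yield a resistance change large enough to measure reliably.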

The findings, published in ACS Nano, suggest future robots whose tuned-in reactions could stand in for parental or friendly care, changing how people live with machines.
