If humanoid robots are to fully integrate into society, they will need the ability to read human emotional states and respond appropriately. A new wearable from South Korean researchers could help do just that.
Robots are good at so many things. They can lift impressive loads, learn surprisingly quickly, and even fly airplanes.
But when it comes to truly understanding us (our messy emotions, mood swings, and inner neediness), they're still about as good at it as a toaster is at making art (though some would argue that the perfect slice of toast is a kind of art, but I digress). That is slowly changing, however, and a new system announced by researchers at South Korea's Ulsan National Institute of Science and Technology (UNIST) could further accelerate the emotional intelligence of our technology.
The team there has developed a stretchable, wearable facial system that monitors skin friction and vibration to assess human emotions, generating its own power in the process. And yes, it's as weird as it sounds.
The wearable consists of a series of thin, transparent, flexible sensors attached to the left and right sides of the head. The bulk of each sensor sits between the eye and the ear, with branches extending above and below each eye, down along the jaw, and back toward the rear of the head. The researchers say the sensors can be custom-made to fit any face.
Once in place, the sensors connect to an integrated system trained to decipher human emotions from facial tension patterns and voice vibrations. Unlike other systems built on similar technology, this one is completely self-powered, harvesting energy from the stretching and contraction of the sensor material via piezoelectric principles. That means it could be worn all day long (if you were so inclined) without ever needing a charge. The UNIST researchers say this is the first fully self-contained wearable emotion-recognition system to be developed.
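To make the data-fusion idea a little more concrete, here is a minimal, purely illustrative Python sketch. It is not the UNIST team's actual pipeline: it assumes hypothetical, pre-extracted feature vectors for facial strain and voice vibration (represented here by random numbers) and feeds them into a generic off-the-shelf classifier.

```python
# Illustrative sketch only -- not the UNIST pipeline. It assumes we already have
# per-window feature vectors from two hypothetical signal streams:
# facial-strain sensors and voice-vibration sensors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised", "fearful"]

# Synthetic stand-in data: 600 time windows, 16 strain + 16 vibration features each.
n_samples = 600
strain_features = rng.normal(size=(n_samples, 16))     # e.g. per-channel strain statistics
vibration_features = rng.normal(size=(n_samples, 16))  # e.g. spectral bands of voice vibration
labels = rng.integers(len(EMOTIONS), size=n_samples)   # placeholder emotion labels

# Simple feature-level fusion: concatenate the two modalities per window.
X = np.hstack([strain_features, vibration_features])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy on random data (~chance):", clf.score(X_test, y_test))
print("predicted emotion for first test window:", EMOTIONS[clf.predict(X_test[:1])[0]])
```

The point is simply that two sensor modalities can be fused at the feature level before a single model makes the emotion call; the real system's training data, features, and model are of course its own.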
While face-worn stickers are unlikely to catch on as everyday wearables, the UNIST team is incorporating the technology into VR environments, and it's easy to imagine success there. Imagine a more comprehensive VR headset that could monitor our emotions and adjust the virtual world accordingly. In fact, during testing, the researchers used the new emotion-sensing system to provide book, music, and movie recommendations in various virtual settings based on the wearer's mood.
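As a toy illustration of that last idea (again, not the researchers' actual system), a detected emotion label could be mapped to content suggestions inside a hypothetical VR session along these lines:

```python
# Toy example: map a detected emotion label to content recommendations.
# The emotion labels and suggestions below are assumptions for illustration only.
RECOMMENDATIONS = {
    "happy":   {"music": "upbeat pop playlist", "book": "light comedy novel", "movie": "feel-good adventure"},
    "sad":     {"music": "calming ambient set",  "book": "uplifting memoir",   "movie": "gentle drama"},
    "angry":   {"music": "slow instrumental",    "book": "mindfulness guide",  "movie": "nature documentary"},
    "neutral": {"music": "editor's picks",       "book": "bestseller list",    "movie": "trending titles"},
}

def recommend(detected_emotion: str, category: str) -> str:
    """Return a suggestion for the given category, falling back to 'neutral'."""
    options = RECOMMENDATIONS.get(detected_emotion, RECOMMENDATIONS["neutral"])
    return options.get(category, "no suggestion")

print(recommend("sad", "music"))  # -> calming ambient set
```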
You just get me
UNIST's work comes as the latest in a series of efforts aimed at making technology more sensitive to the humans who use it.
We've seen a necklace that reads facial expressions to infer emotional states, a robot head that mirrors human facial expressions, a smart speaker that suggests songs based on the mood it detects through audio analysis, and an AI system that lets self-driving cars predict other drivers' actions based on their personalities. There was even an effort back in 2015 that arguably foreshadowed the new UNIST research: a face sticker designed to help robots understand human emotions. And who can forget the huge success of Pepper, the emotion-reading robot from Japan that launched in 2015 and is now deployed at more than 2,000 companies around the world?
As technology becomes better at understanding our emotional states, not only will androids be able to use our moods against us to take over the world (just kidding), but such advances could also break down some of the remaining walls between humans and robots.
Imagine the impact medical companion robots could have on the elderly. Instead of annoying bots that roll up three times a day and tell you in a flat mechanical voice to take your medication or drink more water, these machines could join in conversations, gauge your mood, and deploy just the right kind of flattery or conversational strategy to overcome stubborn resistance to self-care.
Emotionally smart robots might also help kids deal with bullying at school, whether by vaporizing the bullies (kidding again) or by offering a safe space for children to discuss topics that are difficult to raise with their human peers. Because these bots stay calm and don't get their "buttons pushed," so to speak, they could offer clear-sighted advice in a way that frustrated parents sometimes can't.
In a more sinister vein, emotion-reading technology could function as a kind of sophisticated lie detector, deciphering how humans really feel regardless of what they say they feel.
The potential impact of emotionally intelligent technology on our lives is as limitless as the range of emotions we experience every day as a species. And while sticking adhesive sensors to our faces may not be the way forward, UNIST's efforts certainly add another step on the long climb toward machines that truly "get us."
Or, as lead researcher Jiyun Kim puts it: "For effective interaction between humans and machines, human-machine interface (HMI) devices must be able to collect diverse data types and process complex integrated information. This study illustrates the potential of harnessing emotion, a complex form of human information, in next-generation wearable systems."
The research was published in the journal Nature Communications.
Source: UNIST