Augmented reality puts the squeeze into virtual hugs

AFP
Wednesday 07 April 2010 00:00 BST

Now you really can reach out and touch someone through the Internet, with the help of a wearable robot designed by a husband-and-wife team of scientists based in Japan.

Five years in the making, the device aims to inject a little physicality into online chatter, boosting the emotional quotient of virtual exchanges between flesh-and-blood people.

Forget emoticons, those annoying little smiley :-) or frowning :-( faces added to text messages with keystrokes.

The quickened thump of an angry heartbeat, a spine-tingling chill of fear, or that warm-all-over sensation sparked by true love - all can be felt even as your eyes stay glued to a computer screen.

The proof-of-concept robot, dubbed iFeel_IM! ("I feel therefore I am"), was presented Saturday at the first Augmented Human International Conference, held in the French Alps ski resort of Megeve.

At the two-day gathering, engineers and scientists, many of them from Japan, compared notes on cutting-edge research in a field called augmented reality: the real-time enhancement of experience through virtual, interactive technology.

Smartphones that tell you not just where you are but what you are looking at, and Terminator-like visual overlays of battlefield data for soldiers, are two examples.

Several research teams in Megeve also unveiled breakthroughs in the use of brain waves - captured by electrodes placed on the head - to operate computers or decipher emotions.

But Dzmitry Tsetserukou, an assistant professor at Toyohashi University of Technology in Japan, said his aim was to boost feeling, to add a human-like sense of touch to the incorporeal ether of cyberspace.

"We are steeped in computer-mediated communication - SMS, e-mail, Twitter, Instant Messaging, 3-D virtual worlds - but many people don't connect emotionally," he said in an interview.

"I am looking to create a deep immersive experience, not just a vibration in your shirt triggered by an SMS. Emotion is what give communication life."

For now, his prototype robot is a collection of sensors, small motors, vibrators and speakers woven into a series of straps similar to a parachute harness, minus the parachute.

Connected to a computer, the device can simulate several types of heartbeat, a realistic hug, the tickle of butterflies in the stomach, and a tingling feeling along the spine. It can also generate warmth.

While he could have added a mechanism for sexual arousal, Tsetserukou decided doing so would ultimately distract from his focus on emotion boosting.

Software written by his colleague (and wife) Alena Neviarouskaya, a researcher at the University of Tokyo, ferrets out the emotional messages embedded in written text and triggers the appropriate touch sensation in the robot in real time.

It distinguishes joy, fear, anger and sadness with 90 percent accuracy, and can parse nine emotions - adding shame, guilt, disgust, interest and surprise - nearly four out of five times, according to a peer-reviewed study presented at the conference.
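
In outline, the pipeline the couple describes - chat text analysed for emotional content, then mapped to a physical cue on the wearable - can be sketched in a few lines of Python. The sketch below is purely illustrative: the keyword lexicon, emotion labels and actuator descriptions are assumptions made for demonstration, not the researchers' software, which relies on a far more sophisticated affect-analysis model.

# Illustrative toy version of a text-to-haptics pipeline, loosely modelled
# on the iFeel_IM! description above. Lexicon and cue names are invented.

EMOTION_KEYWORDS = {
    "joy":     {"happy", "glad", "love", "wonderful"},
    "sadness": {"sad", "miss", "lonely"},
    "anger":   {"angry", "furious", "hate"},
    "fear":    {"afraid", "scared", "worried"},
}

# Each detected emotion maps to a haptic cue on the hypothetical wearable.
HAPTIC_CUES = {
    "joy":     "warmth and a gentle squeeze around the waist",
    "sadness": "slow, soft heartbeat",
    "anger":   "fast, heavy heartbeat",
    "fear":    "tingling along the spine",
}

def detect_emotion(message):
    """Return the first emotion whose keywords appear in the message, if any."""
    words = set(message.lower().split())
    for emotion, keywords in EMOTION_KEYWORDS.items():
        if words & keywords:
            return emotion
    return None

def trigger_haptics(message):
    """Show which haptic cue a chat message would fire on the device."""
    emotion = detect_emotion(message)
    if emotion is None:
        print(f"'{message}' -> no haptic response")
    else:
        print(f"'{message}' -> {emotion}: {HAPTIC_CUES[emotion]}")

if __name__ == "__main__":
    trigger_haptics("I am happy to see you")       # warmth and a squeeze
    trigger_haptics("I was so scared in the dark")  # tingling spine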

"This is really state of the art, there is nothing this accurate," said Tsetserukou.

Subjects tested the system in the online, three-dimensional environment known as Second Life, inhabited by avatars manipulated by individuals sitting before their computers.

In a demonstration, two people wearing iFeel_IM! robots communicated at a distance through the medium of their avatars.

The words "I am happy to see you" trigger a warm sensation in the person addressed, and as the avatars hug in their virtual world, the act is mirrored in reality by a squeezing sensation around the waist.

Tsetserukou compared the system to the blockbuster Avatar, and especially the film Surrogates, set in a future when humans stay at home plugged into a cocoon while their healthier, more handsome doppelgangers venture forth into the real world.

"In a few years, this could be a mobile system integrated into a suit or jacket," he said. "It's not that far away."
