This Google Glass app uses facial expressions to tell how you're feeling

SHORE analyses facial expressions in real time to determine the mood a person is in, as well as their gender and age.

Image: FraunhoferIIS/YouTube

WHEN TALKING ABOUT Google Glass, the main concern raised is privacy.

The idea that someone wearing the device is recording your every move is often cited, and while the reality isn’t quite like that, the concern is a real one.

Such fears may not be quelled by the unveiling of a new app that uses facial features to determine a person’s mood.

The SHORE (Sophisticated High-speed Object Recognition Engine) Human Emotion Detector, which was created by Fraunhofer IIS, uses facial expressions to determine the mood a person is in as well as their gender and age.

It analyses emotions like happiness, sadness, anger and surprise and displays this information on screen.

The software was ‘trained’ on a database of more than 10,000 annotated faces; combined with learning algorithms, this gives it high recognition rates.
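To illustrate the general idea of training on annotated faces (this is a minimal sketch, not Fraunhofer’s actual SHORE code), the example below reduces each face to a hypothetical feature vector, say mouth-corner lift and eyebrow raise, and trains a simple nearest-centroid classifier on labelled examples:

```python
# Sketch of supervised emotion classification from annotated examples.
# The feature names and values are invented for illustration only.
from collections import defaultdict
import math

def train(examples):
    """examples: list of (feature_vector, label) -> per-label centroid."""
    sums = defaultdict(list)
    counts = defaultdict(int)
    for features, label in examples:
        if not sums[label]:
            sums[label] = [0.0] * len(features)
        sums[label] = [s + f for s, f in zip(sums[label], features)]
        counts[label] += 1
    return {label: [s / counts[label] for s in total]
            for label, total in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is closest to the feature vector."""
    return min(centroids,
               key=lambda label: math.dist(features, centroids[label]))

# Hypothetical annotated data: [mouth-corner lift, eyebrow raise]
annotated_faces = [
    ([0.9, 0.2], "happy"),     ([0.8, 0.3], "happy"),
    ([0.1, 0.1], "sad"),       ([0.2, 0.0], "sad"),
    ([0.3, 0.9], "surprised"), ([0.2, 0.8], "surprised"),
]
model = train(annotated_faces)
print(classify(model, [0.85, 0.25]))  # prints "happy"
```

A production system like SHORE would extract far richer features from the camera image and use more sophisticated models, but the principle, learning from thousands of labelled faces, is the same.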

However, the makers are keen to stress that it’s not able to determine a person’s identity. The software only analyses emotions and none of the images or information displayed leave the device.

The makers see a number of applications for the technology, such as assisting the visually impaired or people with autism spectrum disorder (ASD).

Source: FraunhoferIIS/YouTube

Read: Everything you ever wanted to know about Google Glass (but were afraid to ask) >

Read: Should you be worried about whether your cloud data is safe? >

About the author:

Quinton O'Reilly
