Can machines understand emotions?

How a visit to a science centre can be an opportunity to influence real ongoing research, and to explore how computers might help us to better understand human emotion. 

Lisa Whiting

Wouldn’t it be useful to have an aid that could translate how other people might be feeling? Or how we are feeling, really?

We all have emotions—powerful indicators of our needs, desires, and goals.  

Being able to identify our feelings accurately can lower their perceived intensity, leading to improvements in mental health, relationships, communication skills and much more.

Often, we can assess other people’s emotions through inbuilt empathy, but the cognitive part of this function relies on things that people notice about the person speaking—the way they stand, the tone they use, the facial expressions they exhibit, as well as what they say. Each of these cues associated with an emotion can be a data point. 

Even humans aren’t that good at understanding emotion all of the time, so what if a computer could help us to do just that?  

A Curiosity Challenge 

‘Can machines understand emotions?’ was the result of a collaboration between the Dynamic Genetics Lab and We The Curious.

We worked with lab directors Dr Oliver Davis and Professor Claire Haworth, and researchers Dr Valerio Maggio and Dr Nina Di Cara to co-develop a programme of fun, interactive activities that would allow the WTC audience to participate in, and influence, research around machine learning and human emotions. 

Lisa Whiting

The project consisted of a programme of activities exploring whether an Artificial Intelligence (AI) could read human emotions from pictures, and what this could mean for our everyday lives. 

This project supports two of our key manifesto pledges here at We The Curious: to open up science in our city, and to enable diverse participation, through which we aim to amplify underrepresented audiences in research.

For further information about these pledges, please visit the Open City Research and Inclusion sections on our website.

 

What did We The Curious visitors do?

In our Open City Lab space, visitors to We The Curious were able to:  

  • Play a Pictionary-style draw-an-emotion game
  • Take part in ‘The Learning Machine’, a touchscreen-based activity where visitors trained an AI by teaching it which facial expressions matched which emotions
  • Share their ideas, questions and comments with the research team behind the activities by writing them a postcard

This project gave participants a chance to put themselves in the AI’s position and think about how humans interpret facial expressions. They were able to see how the AI learned as they taught it, gradually becoming better at sorting images by itself.
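Behind the scenes, this kind of ‘teach the machine’ activity is essentially supervised learning: each time a visitor matches a facial expression to an emotion, that pairing becomes a labelled training example, and the model gets a little better at predicting labels for images it hasn’t seen. The snippet below is not the code behind The Learning Machine, just a minimal sketch of the idea in Python with scikit-learn, where invented feature vectors stand in for real facial-expression images.

```python
# A hypothetical, minimal sketch of the idea behind "The Learning Machine":
# visitors supply labelled examples (expression features -> emotion), and a
# classifier gradually gets better at labelling images on its own. This is
# not the project's actual code; the features and labels here are invented.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
EMOTIONS = ["happy", "sad", "angry", "surprised"]

def fake_expression_features(emotion: str) -> np.ndarray:
    """Stand-in for features that would be extracted from a photo of a face."""
    centre = {"happy": 0.0, "sad": 1.0, "angry": 2.0, "surprised": 3.0}[emotion]
    return rng.normal(loc=centre, scale=0.5, size=8)

# Each "visitor interaction" contributes one labelled training example.
X, y = [], []
for _ in range(200):
    emotion = str(rng.choice(EMOTIONS))
    X.append(fake_expression_features(emotion))
    y.append(emotion)

model = LogisticRegression(max_iter=1000)
model.fit(np.array(X), np.array(y))

# The trained model can now guess the emotion for an expression it has not seen.
print(model.predict([fake_expression_features("sad")]))
```

In the real activity the features would come from the images visitors sorted, and the labels from the choices they made on the touchscreen.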

Nick from We The Curious Live Science Team facilitating the activities 

 

Overall, visitors connected easily with the topic of emotions and were very keen to discuss the real-life applications of this type of technology. They were also able to chat to the team during the project and feed their unique perspectives into the research.

 

What did participants say?

We The Curious visitors had over 4,000 interactions with this programme of activities. ‘Can machines understand emotions?’ gave them the opportunity to explore, contribute to and challenge active, data-driven research into reading and understanding human emotions, and to question whether machines could potentially do it too. This generated a huge number of conversations around the real-life impact of this research.

“It's more important for a computer to be able to tell if someone is sad than when someone is happy”  

One visitor felt that some feelings were more important to recognise than others. This could help to identify deeply buried feelings that need to be acted on, for example by offering a safe space or an intervention.

Postcard for the scientist

A particularly good example of why this type of emotion-identifying aid should exist came from a visually impaired visitor, who told us how they already use technology, such as an app that reads text aloud to them, to help with day-to-day life.

“Something which would tell me how people are feeling would be extremely useful!”

We also sparked a conversation around neurodiversity with a group of teenagers, who said they personally used emotion wheels to help them. They explained that they often have to analyse emotions in a logical way, much as a computer might: scan the eyes, then scan the mouth, then put the information together. Again, they felt an interactive aid would be invaluable.

 

What was the impact on the research?

The activities we developed showed creative, playful ways to encourage more people to engage with machine learning.

Our research partners value this blend of supervised machine learning and active learning. It is particularly important when teaching machines to recognise human traits, as cultural background, family values and many other factors need to be taken into consideration.
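To make that blend a little more concrete: in active learning, the model itself asks a person to label the examples it is least sure about, so each interaction is spent where it helps the model most, which is one common way of doing human-in-the-loop machine learning. The sketch below is a hypothetical, minimal version of such a loop (again in Python with scikit-learn), not the project’s own implementation; the data and the ask_a_visitor function are invented for illustration.

```python
# Hypothetical sketch of a human-in-the-loop active-learning step, not the
# project's actual implementation. The model is trained on a small labelled
# pool, then asks a person to label the images it is least confident about.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Invented stand-ins: 8-dimensional "expression features" and emotion labels.
X_labelled = rng.normal(size=(40, 8))
y_labelled = rng.choice(["happy", "sad", "angry", "surprised"], size=40)
X_unlabelled = rng.normal(size=(500, 8))

def ask_a_visitor(example: np.ndarray) -> str:
    """Placeholder for a real person choosing an emotion on the touchscreen."""
    return str(rng.choice(["happy", "sad", "angry", "surprised"]))

model = LogisticRegression(max_iter=1000).fit(X_labelled, y_labelled)

for _ in range(5):  # five rounds of "ask, label, retrain"
    # Uncertainty sampling: pick the example whose highest predicted
    # probability is lowest, i.e. where the model is least confident.
    probs = model.predict_proba(X_unlabelled)
    most_uncertain = int(np.argmin(probs.max(axis=1)))

    new_label = ask_a_visitor(X_unlabelled[most_uncertain])
    X_labelled = np.vstack([X_labelled, X_unlabelled[most_uncertain]])
    y_labelled = np.append(y_labelled, new_label)
    X_unlabelled = np.delete(X_unlabelled, most_uncertain, axis=0)

    model = LogisticRegression(max_iter=1000).fit(X_labelled, y_labelled)
```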

And, from an ethical point of view, researchers and AI developers have a responsibility to strive to minimise bias in their work and to ensure a diverse range of input. Having the opportunity for the AI to learn from a wide cross-section of the public has therefore influenced this project in significant ways.

Research team evaluation sessions at We The Curious

“I enjoyed the opportunity to think about our research in a different way, and to develop a project in an area I’ve always wanted to explore—human-in-the-loop machine learning. It’s been really inspiring to see how people have responded to it.” Dr Oliver Davis.

The opportunity for public engagement allowed the researchers to test, in the real world, the way that humans and machines can learn together. Thanks to projects like this, we have now developed a way to include diverse perspectives in the research process.

“This experience has influenced a new direction for our research on data-driven public engagement. Seeing participants’ enjoyment in taking part reminds me of how lucky we are to be able to work as scientists making new discoveries!” Prof. Claire Haworth

 

What’s the future of AI and human emotions?

Whilst scientists might not yet be able to teach a machine true emotional empathy, they most certainly can teach a machine to process relevant data and learn from it.

“I particularly enjoyed the engagement with the public this project had: being able to convey what we do, what data science means, and the pros and cons of algorithms working with data.”  Dr Valerio Maggio

 

 

Oliver, Nina, Chris: Curiosity Challenge workshop notes May 2019

 

Important ethical issues, such as bias in the algorithms and whether to trust them, were a common topic raised by visitors. This programme of activities allowed participatory development of research ideas, and our research partners are keen to continue the conversation with our visitors and carry on their engagement with the research.

 

“It’s made me think more ambitiously about how I can involve the public in research and build it into my future ideas. I think it is really important for research in this area to be informed by an ongoing conversation with the public and for data science research to be more accessible and open to critique.” Dr Nina Di Cara 

 

It also opened up conversations around mental health, and the potential for mental health applications in the future—such as an algorithm that can identify people’s emotions from social media feeds.  
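As a rough illustration of what that could look like, emotion detection from text is often framed as a classification problem: a model is trained on posts that people have labelled with emotions, then asked to label new posts. The toy example below uses invented posts and scikit-learn purely to sketch the idea; it is not the lab’s method, and real research in this area would involve far larger datasets, consent and careful ethical oversight.

```python
# Toy illustration only: a tiny text classifier that guesses an emotion from
# a short post. The example posts and labels are invented, and this is not
# the Dynamic Genetics Lab's method.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "had such a lovely day at the science centre",
    "feeling really low and tired today",
    "so annoyed that my train was cancelled again",
    "what a brilliant surprise from my friends",
    "missing everyone a lot right now",
    "can't believe how rude that driver was",
]
labels = ["happy", "sad", "angry", "happy", "sad", "angry"]

# Turn each post into word-frequency features, then fit a simple classifier.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(posts, labels)

print(classifier.predict(["what a wonderful afternoon with the family"]))
```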

 

Besides considering more generally how to incorporate public perspectives into their ongoing research, we’re currently exploring with the Dynamic Genetics Lab team what future projects we could do together, such as the ethics of data science and how data games could help decision-makers in the biomedical sector. Conversations are happening, and project plans are in the pipeline!

  

Thanks  

 

This project was made possible through a partnership with the Jean Golding Institute and the Public Engagement Team at the University of Bristol, and funded by The Alan Turing Institute.

 

We’d like to say a big thanks to the team at the Dynamic Genetics Lab, in particular:  

 

(L to R) Dr Oliver Davis, Prof. Claire Haworth, Dr Nina Di Cara and Dr Valerio Maggio

As well as:

Zoe Reed, Chris Moreno-Stokoe, Helena Davies, Alastair Tanner and Benjamin Woolf, who contributed to the earlier stages of this project.

We The Curious research team: Maca Gomez-Gutierrez, Tom Rodgers, Helen Della Nave and the Live Science Team.

 

If you’d like to know more about our Open City Research programme, please visit our website.


 

 

 
