Bias in facial emotion recognition – Is AI influenced by racial or gender bias?

Stefano Bargagni

Unraveling levels of AI bias: It’s more about class than race or gender.

The common understanding is that gender and racial biases in Facial Emotion Recognition AI can arise during algorithm creation, data training, or decision-making.
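To make this concrete, here is a minimal sketch of a disaggregated evaluation, a routine way to surface bias at the decision-making stage by comparing a model's recognition accuracy across demographic subgroups. The evaluation log, column names, and group labels below are hypothetical, purely for illustration.

```python
import pandas as pd

# Hypothetical evaluation log for an emotion classifier: one row per
# prediction, tagged with the demographic group of the pictured person.
results = pd.DataFrame({
    "group":     ["A", "A", "B", "B", "B", "C", "C", "C"],
    "label":     ["joy", "fear", "joy", "anger", "fear", "joy", "anger", "fear"],
    "predicted": ["joy", "fear", "joy", "fear",  "fear", "joy", "anger", "joy"],
})

# Accuracy per subgroup: a large gap between groups is a red flag that
# bias crept in during algorithm design, data training, or inference.
per_group = (
    results.assign(correct=results["label"] == results["predicted"])
           .groupby("group")["correct"]
           .mean()
)
print(per_group)
print("accuracy gap:", per_group.max() - per_group.min())
```

The same audit works with any grouping variable, including social class, which is exactly the dimension the research below suggests we should not ignore.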

Recent research upsets this assumption

Despite the widely accepted narrative that race and gender biases are a significant obstacle on AI's path to fair decision-making, recent research suggests a less explored story: social class may shape bias in artificial intelligence more than gender or race do. A study titled "Intersectionality in emotion signaling and recognition: The influence of gender, ethnicity, and social class" (Monroy, Cowen, & Keltner, 2022) delves into the interplay of gender, ethnicity, and social class in emotional reporting and recognition.

Neither gender nor ethnicity greatly influences the recognition of emotional expressions

Contrary to popular assumptions, the research revealed that neither gender nor ethnicity significantly influenced the reporting or recognition of emotional expressions.

What should we observe instead?

The spotlight shifts instead to social class, which showed a measurable impact on both the reporting and the interpretation of emotions.

How the research was conducted

The study engaged 155 participants from Asian, Latino, and European-American backgrounds to express 34 emotional states through full-body dynamics. Subsequent analysis of more than 22,000 individual ratings of these expressions painted a compelling picture: participants from lower social classes emerged as more reliable reporters and judges of full-body emotional expressions.
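As a rough illustration of the kind of aggregation such an analysis involves, the sketch below groups individual ratings by the rater's social class and compares average judging accuracy. The table, column names, and values are invented for illustration; they are not the study's actual data.

```python
import pandas as pd

# Hypothetical slice of a ratings table: one row per individual rating,
# with the rater's self-reported social class (values are made up).
ratings = pd.DataFrame({
    "rater_class": ["lower", "lower", "middle", "upper", "upper", "middle"],
    "intended":    ["pride", "shame", "pride", "awe",   "shame", "awe"],
    "judged":      ["pride", "shame", "awe",   "awe",   "pride", "awe"],
})

# Mean recognition accuracy for each social-class group of raters; in the
# study, the lower-class group scored highest on this kind of measure.
accuracy_by_class = (
    ratings.assign(correct=ratings["intended"] == ratings["judged"])
           .groupby("rater_class")["correct"]
           .mean()
           .sort_values(ascending=False)
)
print(accuracy_by_class)
```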

This finding calls for a larger conversation. In our effort to eradicate gender bias in AI, are we overlooking potentially more pervasive class biases? This is not a call to abandon the fight against gender and ethnic bias, but to broaden the lens through which we examine bias in artificial intelligence: a conversation that considers the multifaceted nature of prejudice and extends beyond gender and race to social class and perhaps other socio-economic factors.

A deeper conversation about bias in Facial Emotion Recognition

The findings of this recent study serve as a catalyst for a more nuanced discussion of AI bias. They push us to dig deeper, analyze the layers of bias, and foster a more inclusive AI ecosystem. By redirecting some of our attention toward social class, we have the opportunity to create a more equitable AI landscape, one that reflects the intricate tapestry of human society.

The journey towards impartial, ethical artificial intelligence is a complex and multidimensional undertaking. As we move forward, let’s ensure our narrative is as inclusive and comprehensive as the solutions we seek to create. 

At MorphCast we are deeply committed to improving our algorithms and to cultivating a company culture of accountability, transparency, and clarity.

References

  • Monroy, M., Cowen, A. S., & Keltner, D. (2022). Intersectionality in emotion signaling and recognition: The influence of gender, ethnicity, and social class. Emotion, 22(8), 1980–1988.


About the Author

Stefano Bargagni

An internet serial entrepreneur with a background in computer science (hardware and software), Stefano coded the e-commerce platform and founded the online retailer CHL Spa in early 1993, one year before Amazon. He is the Founder and CEO of MorphCast.