Is Emotion AI Prohibited by the January 24, 2024 – Draft Proposal for AI Act?

Stefano Bargagni

Emotion AI and AI Act Compliance: Understanding Restrictions and Building Emotion AI Ethical Applications

On January 24, 2024, the Council of the European Union released the Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts – Analysis of the final compromise text with a view to agreement (5662/24): https://data.consilium.europa.eu/doc/document/ST-5662-2024-INIT/en/pdf. This article examines the AI Act compliance of projects from my professional experience, offering insights for others.

Introduction

Artificial intelligence (AI) is a rapidly evolving field, and at MorphCast we are at the forefront of exploring its potential. Emotion AI in particular excites us for its ability to revolutionize the way we interact with people and technology. But how well do Emotion AI and AI Act compliance get along?

However, we are also aware of the risks associated with Emotion AI. The potential to manipulate or discriminate against people based on their emotions is a reality we cannot ignore.

That’s why we are closely following the debate on the new EU AI regulation. We believe it is crucial to find a balance between innovation and the protection of individual rights.

As the Founder and CEO of MorphCast, I feel compelled to contribute to this debate. My company is committed to developing ethical and responsible Emotion AI technologies, and the new EU regulation will have a significant impact on our work.

But now let’s dive into the heart of the proposal and understand its impact on Emotion AI. We at Morphcast need to figure out how to provide technology that complies with the AI Act and how to monitor our clients’ use of our technology.

As we read through the draft legislation together, we’ll discover that the primary concerns and restrictions surround the specific applications of this exciting and useful technology. It’s not just about how it’s built and developed by us (though that’s crucial, and something we already consider and execute carefully).

Key Points

  • Emotion Recognition in Specific Contexts: The AI Act includes restrictions on the use of emotion recognition, specifically prohibiting it in the workplace and educational institutions. However, exceptions may exist for safety and medical reasons. ([See Annex III, (6)(a)])
  • Prohibited Practice: The AI Act explicitly prohibits creating or expanding biometric recognition databases by indiscriminately scraping facial images in publicly accessible spaces. ([See Article 5(1)(d)])
  • Biometric Categorization Restrictions: There’s a limited prohibition against biometric categorization of individuals based on characteristics that would reveal ethnicity, gender, political or sexual orientations, and other similar traits. ([See Article 5(1)(e)])
  • Post-Remote Biometric Identification: This practice must have a basis in a criminal act, proceeding, or a clear, defined threat; it may not be used in an untargeted way. This likely has implications for using emotion recognition to profile individuals.

Important Considerations

  • Breakdown of the implementation timeline for the EU AI Act:
    • Provisional Agreement: December 2023
    • Formal Adoption: The EU Parliament and Council need to formally adopt the agreed text for it to become a law. This step could take a few months.
    • Transition Period: Once adopted, there’s likely to be a transition period before the AI Act is directly applicable. The current information suggests this could be around two years after the law comes into force.
    • Earliest Application: Based on the currently available information, 2025 (at the earliest) is when the AI Act would actually start being enforced.
  • Evolving Interpretations: Legal documents like the AI Act are subject to ongoing interpretation. What is deemed compliant or non-compliant can change over time.
  • High-Risk AI Systems: Emotion recognition technology, depending on its implementation and use, could potentially fall under the “high-risk” AI category as outlined in the AI Act. This would entail significant obligations regarding transparency, governance, quality control, and risk management.

Questions that Arise

To properly determine if a specific instance of emotion AI or facial emotion recognition technology could be compliant with the AI Act, several key questions need to be considered:

  1. What is its Purpose? Is emotion recognition being used in a prohibited context (workplace/education) without any justifiable exceptions?
  2. How is Data Collected and Processed? Does the system involve indiscriminate collection of facial images or violate biometrics-related restrictions?
  3. Does it Fall Under the AI Act’s High-Risk Classification? Does it potentially have a significant impact on individuals’ rights and opportunities?

A Direct and Informative Approach

“The AI Act is a living document; interpretations may shift over time. With full implementation in two years, businesses and consumers should monitor updates to ensure continued compliance.”

Use Cases and examples

I apologize for the technical nature of this article, but to assess the law I had to read the entire text, carefully study each passage concerning Emotion AI, and in particular Facial Emotion Recognition, which I deal with professionally, and dissect every individual point.

Next, I analyzed some frequently encountered use cases, or ones currently under development, elaborating on them in detail to examine their potential compliance with the emerging AI Act. Below, I present several cases along with my reviewed compliance considerations, produced with Gemini Advanced by Google. While this isn't a formal audit, and legal expertise is crucial for a full AI Act compliance assessment, this exercise provides valuable insights. Given the law's evolving nature, a comprehensive legal audit, with its accompanying expenses, may have diminishing returns until the legislation stabilizes.

Above all, it helps me to deeply understand the spirit of the law and to better direct my reflections in order to make decisions on the development of MorphCast products on solid foundations and in line with my ethics and those indicated by the European Parliament.

Following these conclusions, you will find a list of use case analyses and related conclusions regarding compliance with the AI Act. I developed them with the intention of using practical examples to explore the spirit of the law, its critical points, and the challenges to be faced.

Beyond Compliance: Towards Responsible Innovation in AI

The development of Emotion AI presents exciting possibilities for innovation and progress in fields like education and human-computer interaction. However, as demonstrated through our analysis with Gemini, we face an interconnected responsibility as technology creators. While crafting secure, reliable, and objective AI solutions is paramount, a true, ethically sound approach extends beyond the products themselves. We must empower our clients to wield these tools in a responsible, compliant, and beneficial manner.

At MorphCast, we recognize this multifaceted obligation. Our commitment reaches beyond technical excellence, encompassing comprehensive guidance and education for our clients. Our goal is to cultivate an understanding of the AI Act and the complexities of ethical AI implementation, promoting informed decision-making that actively mitigates potential discrimination, societal harms, or violations of basic rights. Only through proactive collaboration can we forge a future where AI advancements align with societal values and the principles of human well-being.

It’s vital to keep in mind that the AI landscape is in continuous flux as both regulations and the technology itself evolve. Stay informed about changes in best practices and interpretations of the AI Act to ensure proactive adaptability in your AI offerings. Together, let’s strive to unlock the transformative potential of Emotion AI while upholding the highest ethical standards.

Where to Keep Updated:

This article is an update to the following earlier posts:

June 1, 2023: AI Act: A Step Towards the First Rules on Artificial Intelligence

March 1, 2023: The Emotion AI Debate in 2023: Regulatory Proposals and the Impact of MorphCast Technology. Insights from CEO and Founder Stefano Bargagni

Use Case Analysis #1: Facial Emotion Recognition in a Webinar & AI Act compliance

1. Use Case Description:

  • Use Case Name: Facial Emotion Recognition in a Webinar
  • Use Case Purpose:
    • To measure the attention and engagement of webinar participants.
    • To provide the performer with real-time feedback on the progress of the webinar.
    • To provide the performer with a dashboard summarizing the progress of the webinar in terms of aggregate attention and engagement tracked over time.
  • Technology Used: MorphCast Facial Emotion Recognition (FER)
  • Use Case Process:
    • Webinar participants consent to real-time facial analysis.
    • MorphCast’s AI analyzes frames directly streamed from the user’s webcam within the browser or app.
    • Image frames are NOT stored; they are solely used for on-the-fly analysis.
    • The AI generates two numerical scores per second: attention level and engagement level.
    • The scores are aggregated on the webinar’s central server to generate historical data points for the dashboard.
    • The scores increment anonymous counters representing overall audience metrics.
    • The performer views both a real-time graph and a historical dashboard, showing attention and engagement levels over time.
    • Users can instantly revoke FER analysis, stopping the process on their device and preventing additional data points from being used in the historical graph or dashboard.
  • Actors Involved:
    • Performer
    • Webinar participants
    • MorphCast AI
    • Webinar central server
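The aggregation flow described in the process above can be sketched in a few lines. This is a minimal illustration of the described pipeline under my own assumptions, not MorphCast code; the class and field names are hypothetical. Per-second scores arriving from consenting clients are accumulated into anonymous counters and periodically flushed into historical data points for the dashboard:

```python
from dataclasses import dataclass, field

@dataclass
class AudienceAggregator:
    """Accumulates per-second attention/engagement scores into anonymous
    counters; no participant identifiers or image frames are stored."""
    attention_sum: float = 0.0
    engagement_sum: float = 0.0
    samples: int = 0
    # Each entry: (timestamp, avg_attention, avg_engagement)
    history: list = field(default_factory=list)

    def add_sample(self, attention: float, engagement: float) -> None:
        # Called once per second for each consenting participant.
        self.attention_sum += attention
        self.engagement_sum += engagement
        self.samples += 1

    def flush(self, timestamp: int) -> tuple:
        # Called periodically to emit one historical data point
        # and reset the anonymous counters.
        if self.samples == 0:
            return (timestamp, 0.0, 0.0)
        point = (timestamp,
                 self.attention_sum / self.samples,
                 self.engagement_sum / self.samples)
        self.history.append(point)
        self.attention_sum = self.engagement_sum = 0.0
        self.samples = 0
        return point
```

Because only running sums and counts are kept, revoking consent simply stops a client's contributions; no per-person record exists to delete from the live counters.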

2. Analysis of Compliance with the AI Act Draft:

  • Key Ethical Considerations:
    • Sensitivity of Real-Time Facial Analysis: Real-time facial analysis remains sensitive even though no traditional biometric data is collected directly.
    • Lack of Persistent Image Data Storage: No raw images are stored.
    • Focus on Aggregate, Historical Analytics: Anonymized attention and engagement metrics are used for the historical dashboard.
    • Individual Control: Participants retain strong control through an immediate opt-out capability.
  • Requirements of the AI Act Draft:
    • Free, specific, and informed consent
    • Transparency on the technology’s function, analysis process, and use of historical data for dashboard.
    • Data minimization (images discarded, non-identifiable metrics stored)
    • Security: Secure processing, communication of analytical scores, and storage of historical data points.
    • Right of access, rectification, erasure, and restriction of processing (opt-out addresses restrictions; historical data presents challenges)
  • Compliance Assessment:
    • Consent: Requires full disclosure of FER’s use, anonymization, opt-out controls, and the generation of the historical dashboard.
    • Transparency: Provide details on: no image storage, aggregated nature of scores, FER deactivation capabilities, and how dashboard data is generated.
    • Data Minimization: Achieved through non-storage of frames; historical data minimization could be improved.
    • Security: Measures for secure processing, transmission, and storage of score data, both real-time and historical.
    • User Rights: Opt-out addresses ongoing analysis restrictions; managing historical data for a user opting-out would be complex.

3. Conclusion:

While this use case has strengths in transient data processing, anonymization, and opt-out controls, its use of historical data points for the dashboard requires careful consideration in the context of the AI Act draft.

Recommendations:

  • Emphasize consent regarding dashboard generation and highlight opt-out features.
  • Consider if shorter aggregation periods would reduce sensitivity risks while serving insights.
  • Explore how to handle user requests to remove historically tracked data (challenges present but worth consideration).
  • Implement rigorous security measures.
  • Stay up-to-date on guidance regarding historical data derived from transient use cases under the AI Act.

Challenge

Now, let’s shift perspectives and imagine using Emotion AI in a higher-stakes context. If I have understood the intent of the law correctly, even a slight change to the use case, specifically in how the technology is used, should matter. What if the analysis had the additional purpose of evaluating the performer’s performance based on the analysis data, for example by a jury? Would Gemini’s opinion regarding compliance change? In my opinion, yes, because there could be risks of discrimination linked to the quality of the tracking and a thousand other imponderable variables. Let’s try it and see what Gemini “thinks”:

Query: Focusing on the typical use case #1, can you give me your opinion on compliance with the draft law if a jury uses extracted data to evaluate a performer’s performance?

Key Ethical and Compliance Concerns:

  • Purpose Shift: This represents a major change in the application’s purpose from its original focus on providing feedback to the performer. It also enters the realm of potentially high-stakes decision-making about an individual.
  • Accuracy and Validity: Using FER-derived metrics within a formal evaluation context raises substantial concerns about whether the analysis methods, given their sensitivity to various factors, provide a sufficiently reliable and unbiased measurement of performance. There is a clear risk of misinterpretation if this were the sole or primary source of information.
  • Accountability and Fairness: There needs to be robust transparency on limitations of this technology and safeguards in place to ensure that any jury using this data understands its context and does not unfairly penalize the performer based on factors outside their control that could affect the FER results.
  • Informed Consent: Even if this purpose shift were added to the consent process, many users might not fully understand the potential impact of data use within an evaluative context and the long-term ramifications.

Likely High-Risk Classification Under the AI Act:

If it were implemented as described, this use case would almost certainly fall within the AI Act’s high-risk category, likely within the context of “employment, workers management, and access to self-employment.” As a high-risk system, it would be subject to numerous requirements including:

  • Risk Assessment: Detailed risk analysis covering ethical implications, accuracy, potential for bias, transparency, and human oversight in its use.
  • Security and Data Quality: Measures ensuring security, data provenance, and appropriate handling practices would be required.
  • Governance Structures: Strong oversight to guide and mitigate the risks and ensure ethical and accountable operation.

Potential Challenges in Demonstrating Compliance:

This use case may face obstacles in fulfilling certain AI Act requirements:

  • Prohibition on Sole Reliance on Automated Decision-Making: If juries use FER-derived metrics as the primary driver of decisions, this could contravene AI Act provisions restricting automated decision-making without human oversight.
  • Right to Explanation: It might be hard to offer clear, unbiased explanations of how the automated analysis influenced scores, which would be needed to support any challenge to a jury decision.

Recommendation:

In its current form, this use case poses severe compliance risks due to its focus on evaluating individuals directly within a high-stakes context. To pursue a potentially compliant path, the model might have to be heavily revised, or completely reconsidered in terms of its intended purpose and potential consequences.

Use Case Analysis #2: Psychologist and Patient in Video Conference with MorphCast Facial Emotion Recognition

Introduction:

Emotion AI is opening new frontiers in psychological therapy. MorphCast Facial Emotion Recognition is an innovative technology that allows for real-time analysis of a patient’s emotions during a videoconference, providing the psychologist with valuable data for more accurate assessment and effective treatment.

Features and Workflow:

  • Activation and Consent: The psychologist informs the patient that they will activate the Emotion AI and requests their consent through a customizable notice that includes a URL to the privacy policy. The patient can authorize or revoke consent at any time.
  • Emotion Analysis: The analysis takes place in real time on the patient’s device, ensuring privacy. Arousal and valence are detected, which represent the two fundamental dimensions of emotion according to Russell’s circumplex model.
  • Data Visualization and Interaction: The psychologist can view the data in real time in various ways: Russell’s diagram, bars, lines, and an exportable CSV file. They can filter the data by period or focus on specific emotions.
  • Integration with Other Information: Arousal and valence data can be integrated with other information available to the psychologist (anamnesis, patient notes) by downloading a CSV file.
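For illustration, the arousal/valence samples described above could be mapped onto the quadrants of Russell’s circumplex model and exported as CSV roughly as follows. This is a hedged sketch: the function names, quadrant labels, and CSV columns are my assumptions, not the actual MorphCast export format.

```python
import csv
import io

def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) sample, each in [-1, 1], to a quadrant
    of Russell's circumplex model (labels are illustrative)."""
    if valence >= 0:
        return "excited/elated" if arousal >= 0 else "calm/content"
    return "tense/angry" if arousal >= 0 else "sad/depressed"

def export_csv(samples) -> str:
    """samples: iterable of (timestamp_s, valence, arousal) tuples.
    Returns CSV text the psychologist could download and integrate
    with other patient information."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["timestamp_s", "valence", "arousal", "quadrant"])
    for t, v, a in samples:
        writer.writerow([t, v, a, quadrant(v, a)])
    return buf.getvalue()
```

A filter by period or by quadrant is then a simple selection over these rows before rendering the diagram, bars, or lines.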

Specific Use Cases:

The use of MorphCast can be particularly useful in various types of therapy, such as:

  • Cognitive Behavioral Therapy: Emotion analysis can help identify negative thoughts and beliefs that influence the patient’s mood.
  • Systematic Desensitization Therapy: Emotion analysis can help monitor the patient’s response to phobic stimuli.
  • Psychodiagnostic Assessment: Arousal and valence data can provide useful information for diagnosing mental disorders.
  • Progress Monitoring: Emotion analysis over time can help evaluate the effectiveness of therapy and monitor the patient’s progress.

Ethical and Privacy Considerations:

  • Informed Consent: The patient must be informed about the use of MorphCast and give their explicit consent.
  • Data Handling and Deletion: Arousal and valence data are associated with the patient’s name for therapeutic purposes; they are not anonymized. The psychologist is responsible for deleting them at the patient’s request.
  • Data Security: MorphCast adopts rigorous data security procedures compliant with the GDPR.

Compliance with the AI Act Draft:

MorphCast Facial Emotion Recognition complies with the principles of the AI Act draft, in particular:

  • Risk-Based Approach: The technology is classified as low-risk as it does not affect fundamental rights or vital interests.
  • Transparency: The psychologist is required to inform the patient about the use of MorphCast and its features.
  • Human Control: The psychologist maintains control over the emotion analysis and data usage.
  • Security and Accuracy: MorphCast adopts measures to ensure data security and accuracy.

Conclusions:

MorphCast Facial Emotion Recognition represents an innovative tool for psychology, offering the possibility to:

  • Improve the assessment and diagnosis of mental disorders.
  • Personalize treatment based on the patient’s emotions.
  • Monitor the patient’s progress over time.

The technology complies with the GDPR and the AI Act draft, ensuring data privacy and security.

Use Case Analysis #3: Teacher and Students in Video Conference with MorphCast Engagement & Attention Recognition

Introduction:

AI-powered learning is opening new frontiers for education. MorphCast Engagement & Attention Recognition is an innovative technology that allows for real-time analysis of student engagement and attention during a videoconference, providing teachers with valuable data for more personalized and engaging teaching.

Features and Workflow:

  • Activation and Consent: The teacher informs the students that they will activate the Emotion AI and requests their consent through a customizable notice that includes a URL to the privacy policy. Students can authorize or revoke consent at any time.
  • Engagement and Attention Analysis: The analysis takes place in real time on each student’s device, ensuring privacy. Engagement and attention are detected, two key indicators of the level of participation and learning.
  • Data Visualization and Interaction: The teacher can view the data in real time in various ways: pie charts, colored bars, heatmaps, and an exportable CSV file. They can filter the data by student, time period, or lesson topic.
  • Integration with Other Information: Engagement and attention data can be integrated with other information available to the teacher (gradebook, assignments) by downloading a CSV file.
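The per-student, per-period filtering described above can be sketched as follows. The record layout is a hypothetical assumption for illustration, not the actual CSV schema exported by the product.

```python
def average_by_student(records, start_s=None, end_s=None):
    """records: iterable of (student, timestamp_s, engagement, attention)
    tuples. Returns {student: (avg_engagement, avg_attention)} restricted
    to the optional [start_s, end_s] time window."""
    sums = {}
    for student, t, engagement, attention in records:
        # Skip samples outside the requested window.
        if start_s is not None and t < start_s:
            continue
        if end_s is not None and t > end_s:
            continue
        e_sum, a_sum, n = sums.get(student, (0.0, 0.0, 0))
        sums[student] = (e_sum + engagement, a_sum + attention, n + 1)
    return {s: (e / n, a / n) for s, (e, a, n) in sums.items()}
```

The same aggregation, keyed by lesson topic instead of student, would feed the heatmap and pie-chart views mentioned above.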

Specific Use Cases:

The use of MorphCast can be particularly useful in various teaching situations, such as:

  • Lectures: Engagement and attention analysis can help the teacher monitor comprehension of topics and adjust the lesson based on student needs.
  • Group Activities: Engagement analysis can help the teacher assess each student’s participation in group work.
  • Distance Learning: Engagement and attention analysis can help the teacher keep students’ attention alive in a virtual environment.
  • Support for Students with Special Needs: Engagement and attention analysis can help the teacher identify students who need individualized support.

Ethical and Privacy Considerations:

  • Informed Consent: Students must be informed about the use of MorphCast and give their explicit consent.
  • Data Handling and Deletion: Engagement and attention data are associated with the student’s name for educational purposes; they are not anonymized. The teacher is responsible for deleting them at the student’s request.
  • Data Security: MorphCast adopts rigorous data security procedures compliant with the GDPR.

Compliance with the AI Act Draft:

MorphCast Engagement & Attention Recognition complies with the principles of the AI Act draft, in particular:

  • Risk-Based Approach: The technology is classified as low-risk as it does not affect fundamental rights or vital interests.
  • Transparency: The teacher is required to inform students about the use of MorphCast and its features.
  • Human Control: The teacher maintains control over the analysis of engagement and attention and the use of data.
  • Security and Accuracy: MorphCast adopts measures to ensure data security and accuracy.

However, there are some considerations regarding this use case:

  • Potential Conflict with the AI Act current form: While the technology focuses on engagement and attention, these metrics can indirectly imply emotional states. Since the AI Act lacks a precise definition of “emotion”, interpretations may vary, potentially putting this use case at odds with the regulations.
  • Ethics and Privacy: Using MorphCast in classrooms raises ethical and privacy concerns. Collecting student engagement and attention data could be perceived as intrusive surveillance. It’s crucial that students are fully informed about data collection and that their consent is freely given.
  • Benefits vs. Risks: MorphCast could potentially benefit learning processes through personalized instruction and student progress monitoring. However, potential risks should be carefully weighed, including the stigmatization of students who exhibit lower engagement/attention and the possibility of data misuse.

Recommendations:

  • Clarity is Key: The AI Act would benefit from a clearer definition of “emotion” to prevent ambiguous interpretations about what technologies might be prohibited.
  • Open Dialogue: Educators, tech experts, policymakers, and students should engage in an open discussion about the potential benefits and risks of emotion recognition in educational settings.
  • Ethical Guidelines: The development of ethical guidelines for emotion recognition in classrooms is necessary. These guidelines should prioritize student privacy and the responsible use of this technology.

In Conclusion: The teacher-student video conference use case with MorphCast Engagement & Attention Recognition highlights interesting, potentially beneficial aspects for education. However, the legal implications of the AI Act’s restrictions, particularly those on emotion recognition, must be carefully considered. Striking a balance between potential benefits and the risks of its use within classrooms requires transparency and constructive dialogue among all those involved.

Important Note:

  • Evolving Law: The AI Act is still under development, and its final form could have revised sections relevant to emotion recognition. Continuous monitoring of ongoing updates is essential for future adjustments in how MorphCast might be used.

Conclusions:

MorphCast Engagement & Attention Recognition represents an innovative tool for education, offering the possibility to:

  • Improve the personalization of learning based on the individual needs of students.
  • Increase engagement and participation in lessons.
  • Monitor student progress over time.

The technology complies with the GDPR but may conflict with the draft AI Act.

Use Case Analysis #4: LLM Chat with MorphCast for Emotion Analysis, Including Avatars and Robotic Machines & AI Act compliance

Introduction:

Imagine an LLM (Large Language Model) chat like ChatGPT enhanced with MorphCast technology, seamlessly integrated with avatars and robotic machines. Users interact not just through text, but through real-time emotion analysis based on voice tone, facial expressions, and body language.

  • Chat with Avatars: Avatars display emotions in real-time, reflecting the user’s emotional state, enhancing the feeling of connection and empathy within the chat experience.
  • Chat with Robotic Machines:  Robotic machines can adapt their behavior, movements, and interactions based on the user’s emotional state, creating a more natural and engaging experience.

Features and Workflow:

  1. Activation and Consent: The user informs the LLM of their consent to the use of MorphCast.
  2. Emotion Analysis: MorphCast analyzes the user’s emotions in real time, providing data on:
    • Primary Emotions: Joy, sadness, anger, fear, surprise, and disgust.
    • Secondary Emotions: Interest, engagement, confusion, frustration, and boredom.
  3. Adaptive Interaction: The LLM, avatars, and/or robotic machines adapt in real time, showing:
    • Empathy: Recognizing and understanding the user’s emotions.
    • Comprehension: Adapting language, tone, and expressive behaviors to the emotional context.
    • Personalization: Offering responses, suggestions, and actions tailored to the user’s emotions.
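One plausible way to implement the adaptive interaction described above is to fold the detected emotion scores into the LLM’s system prompt before each turn. This sketch assumes a simple label-to-confidence dictionary and invented function names; it is not an actual MorphCast or LLM vendor API.

```python
def build_system_prompt(base_prompt: str, emotions: dict,
                        threshold: float = 0.5) -> str:
    """Append the user's dominant detected emotions to the system prompt
    so the LLM (or the avatar/robot controller it drives) can adapt tone.
    `emotions` maps labels (e.g. "frustration") to confidences in [0, 1]."""
    dominant = sorted(
        (label for label, score in emotions.items() if score >= threshold),
        key=lambda label: -emotions[label],  # strongest emotion first
    )
    if not dominant:
        # No emotion is confident enough; leave the prompt unchanged.
        return base_prompt
    return (f"{base_prompt}\n"
            f"The user currently appears: {', '.join(dominant)}. "
            f"Adapt your tone and pacing accordingly.")
```

The threshold keeps noisy, low-confidence detections from steering the conversation, which also supports the proportionality concerns discussed elsewhere in this article.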

Specific Use Cases:

  • Chatbots: Chatbots provide more empathetic and engaging conversations by adapting their communication styles to the user’s emotional state.
  • Virtual Assistants: Virtual assistants better understand and anticipate user needs by recognizing emotions like frustration or confusion.
  • Educational Platforms: Personalized learning experiences are created by adapting the difficulty and pace of lessons to the user’s emotional state.
  • Entertainment Applications: More immersive and engaging experiences are generated by adapting virtual environment content, avatars, and gameplay to the user’s emotional state.

Ethical and Privacy Considerations:

  • Informed Consent: The user must be informed about the use of MorphCast and the data collected.
  • Data Anonymization: Emotional data should only be associated with the user for real-time interaction and anonymized for aggregate analysis.
  • Data Security: MorphCast must adopt strict data security measures compliant with GDPR.
  • Adherence to Ethical Guidelines: Since MorphCast has established Ethical Guidelines and an Ethics Code, its technology naturally conforms to these pre-existing principles, protecting user privacy and promoting responsible technology use.

AI Act Draft Compliance:

The use of MorphCast requires careful evaluation against the AI Act draft as emotion analysis could be considered a form of biometric recognition, subject to restrictions. Ensuring MorphCast remains fully compliant with any AI Act updates throughout its development is essential.

Conclusion:

Integrating MorphCast into an LLM chat, along with avatar and robotic machine interactions, represents a transformative development in the field of human-machine interaction. MorphCast’s established Ethical Guidelines and Ethics Code reinforce its commitment to responsible AI development, prioritizing privacy concerns and promoting genuine user trust.

Use Case Analysis #5: Job Interview Training with MorphCast

Introduction:

A-dapt’s job interview training course leverages MorphCast AI technology to enhance your interaction and presentation skills during a job interview. Through simulations with actors and real-time analysis of your facial expressions, the course helps you make a better impression and increase your chances of success, in line with the ethical and responsible principles of the AI Act.

Features and Workflow:

1. Simulations with Actors:

  • Pre-recorded scenarios with professional actors put you to the test in realistic job interviews.
  • Interaction takes place in a safe and controlled environment, fostering anxiety-free learning.

2. Real-time MorphCast AI Analysis:

  • MorphCast analyzes your facial expressions and provides feedback on:
    • Positive Expressions: Smiling, eye contact, and open posture (promoting confident and open communication).
    • Attention and Engagement: Maintaining eye contact and nodding (demonstrating interest and participation).
    • Emotion Management: Controlling anxiety and nervousness (fostering effective and calm communication).

3. Personalized Feedback:

  • An instructor provides tailored advice and suggestions to improve your performance based on the MorphCast AI analysis, with an individualized and respectful approach to your needs.

Additional Features:

  • Skill Levels: MorphCast tracks your progress over time and provides a skill score that reflects your proficiency level in different areas of nonverbal communication.
  • Personalized Tips: In addition to real-time feedback, MorphCast provides personalized tips on how to improve your skills based on your areas of weakness.
  • Report Card: Upon completion of the course, you receive a report card that summarizes your progress and your areas of strength and weakness.

AI Act Compliance:

The addition of skill levels, personalized tips, and a report card requires careful evaluation to ensure compliance with the AI Act. A-dapt is committed to:

  • Transparency: Informing participants about the use of this data and obtaining their informed consent.
  • Data Anonymization: Aggregating data and anonymizing it before using it for analysis or benchmarking purposes.
  • Data Security: Adopting strict measures for the security of participants’ data.
  • Data Minimization: Collecting only the data necessary for the additional features.
  • Proportionality: Ensuring that MorphCast AI analysis remains proportionate to the risk and is not used to make decisions that could have a significant impact on participants’ lives.

Example of Use:

Imagine you have completed a job interview simulation. MorphCast provides you with real-time feedback on your facial expressions. The instructor provides specific advice on how to improve your nonverbal communication. You can then access your report card to see your progress over time and identify areas to work on.

Ethical and Privacy Considerations:

A-dapt is committed to using MorphCast AI data in a responsible and ethical manner, ensuring:

  • Privacy Respect: Participants’ data is treated confidentially and in accordance with GDPR.
  • Non-Discrimination: MorphCast AI analysis is not used to discriminate against participants based on race, gender, religion, or other factors.
  • Fairness and Accessibility: The course is accessible to all participants, regardless of their abilities or assistive technologies.

Conclusion:

A-dapt’s job interview training course with MorphCast represents an innovative and educational method to improve your interaction and presentation skills during a job interview, aligned with the principles of the AI Act. The additional features of skill levels, personalized tips, and a report card offer participants a comprehensive and tailored learning experience, empowering them to confidently navigate the job interview process and increase their chances of success.

Use Case Analysis #6: Applying MorphCast AI for Pre-Release Advertising Evaluation & AI Act Compliance

Introduction:

Using MorphCast AI to evaluate advertising videos offers an innovative method to test the effectiveness and emotional impact of commercials before their release. Analyzing a test group’s facial expressions in real time yields valuable insights into their reactions and engagement levels.

Features and Workflow:

1.  Test Group Recruitment:

  • A group of individuals is voluntarily recruited to participate in the test.
  • Participants provide informed consent to the use of MorphCast AI and the collection of their facial expression data.

2.  Advertisement Viewing:

  • Participants watch the advertisement in a controlled environment.
  • Their facial expressions are recorded and analyzed in real time by MorphCast AI.

3.  Data Analysis:

  • MorphCast AI provides detailed feedback on participants’ emotional responses, including:
    • Positive Expressions: Smiling, laughter, and indicators of happiness and interest.
    • Attention and Engagement: Eye contact, nodding, and concentration levels.
    • Specific Emotions: Recognition of emotions such as surprise, sadness, anger, or disgust.

4.  Report Generation:

  • Customized reports summarize the collected data and insights on the audience’s reaction to the advertisement.
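The report-generation step, combined with the aggregation requirement discussed below under AI Act compliance, can be sketched as follows. This is a minimal illustration under stated assumptions: the function name `aggregateReport` and the sample shape (`emotion`, `intensity`) are hypothetical, not the actual MorphCast output format. The key design choice is that readings from all participants are pooled without identifiers, so the report contains only anonymous aggregates.

```javascript
// Hypothetical sketch of aggregation before analysis: per-participant emotion
// readings are pooled into one anonymous summary, so no individual's time
// series appears in the report. Field names are illustrative.
function aggregateReport(samples) {
  // samples: [{ emotion: "surprise", intensity: 0.8 }, ...] pooled from all
  // participants, with no participant identifiers attached.
  const sums = {};
  for (const { emotion, intensity } of samples) {
    if (!sums[emotion]) sums[emotion] = { total: 0, count: 0 };
    sums[emotion].total += intensity;
    sums[emotion].count += 1;
  }
  // Emit only aggregate statistics per emotion.
  const report = {};
  for (const [emotion, { total, count }] of Object.entries(sums)) {
    report[emotion] = { meanIntensity: total / count, observations: count };
  }
  return report;
}
```

In practice a report like this would also be bucketed by timestamp to locate peaks of interest within the commercial, but the anonymization principle stays the same.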

Benefits:

  • Enhanced Audience Understanding: Data helps better understand how the audience reacts to the advertisement and the emotions it evokes.
  • Message Optimization: Data is used to optimize the messaging and tone of the commercial for maximum impact.
  • Increased ROI: Pre-evaluation with MorphCast AI can improve campaign effectiveness and lead to a higher return on investment (ROI).

AI Act Compliance:

The use of MorphCast AI for advertisement evaluation must comply with the AI Act. Therefore, it’s crucial to ensure:

  • Transparency and Consent: Inform participants about MorphCast AI usage and get their informed consent.
  • Data Anonymization: Collected data must be anonymized and aggregated before analysis or benchmarking.
  • Data Security: Take adequate measures to protect participants’ data.
  • Data Minimization: Collect only the data needed for the specific analysis.
  • Proportionality: MorphCast AI’s analysis must be proportionate to the risk and not used for decisions with a significant impact on participants’ lives.

Example of Use:

A cosmetics company is developing a new mascara commercial. Before releasing it publicly, they use MorphCast AI to test the advertisement on a focus group of women. Facial expression analysis shows that the viewers are captivated by the actress’s performance and the message. It also reveals a peak in interest when the product is prominently displayed. Based on these insights, the company makes minor adjustments to maximize the advertisement’s emotional impact and message efficiency.

Ethical and Privacy Considerations:

Using MorphCast AI for advertising video evaluation raises ethical and privacy concerns:

  • Respect for Privacy: It’s essential to maintain participants’ privacy and use their data responsibly.
  • Non-discrimination: MorphCast AI’s analysis shouldn’t lead to discrimination based on race, gender, religion, or other factors.
  • Fairness and Accessibility: The test should be accessible to all potential participants, regardless of their abilities or assistive technologies.

Conclusion:

In conclusion, using MorphCast AI for pre-release advertising evaluation is an innovative and valuable method for testing commercials’ effectiveness and emotional impact. By analyzing facial expressions in real time, companies can gain valuable insights into audience reactions and optimize their messages for maximum impact and ROI across campaigns. However, it is essential to ensure AI Act compliance and to respect ethical and privacy principles when using this technology.

Use Case Analysis #7: Verizon and MorphCast AI – Tom’s Story, Bringing Emotions to the Edge

Introduction:

Verizon partnered with MorphCast on a series of innovative videos that use artificial intelligence to deliver an interactive and personalized experience for the viewer. The MorphCast AI Interactive Media Platform allows the video’s storyline to be modified based on the viewer’s real-time emotions, analyzed through MorphCast AI.

Objectives:

  • Promote diversity and inclusion in the workplace.
  • Create an engaging and interactive awareness campaign.
  • Reach a wide and diverse audience.

Solution:

Verizon used the MorphCast Interactive Video Platform to create videos that present different narrative branches. MorphCast AI’s artificial intelligence analyzes the viewer’s facial expressions in real time and determines their dominant emotion. Based on this emotion, the video branches in a specific direction, showing a different and personalized story.
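The branching logic described above can be illustrated with a short sketch. This is not the actual MorphCast platform API: the function `pickBranch`, the emotion-probability map, and the branch identifiers are all assumptions made for illustration. It selects the narrative branch mapped to the dominant emotion, with a fallback when no branch matches.

```javascript
// Illustrative sketch (not the actual MorphCast platform API): pick a
// narrative branch from the dominant emotion in a probability map produced
// by client-side emotion analysis.
function pickBranch(emotionProbs, branches, fallback) {
  // emotionProbs: e.g. { happy: 0.6, surprise: 0.3, neutral: 0.1 }
  // branches:     e.g. { happy: "upbeat-cut", surprise: "reveal-cut" }
  let dominant = null;
  let best = -Infinity;
  for (const [emotion, p] of Object.entries(emotionProbs)) {
    if (p > best) {
      best = p;
      dominant = emotion;
    }
  }
  // Fall back to a default branch when no mapping exists for the emotion.
  return branches[dominant] ?? fallback;
}
```

Because the analysis runs in the viewer’s browser, a function like this can drive the branching locally, without sending emotion data to a server.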

Analysis and Insights:

  • MorphCast AI: Using MorphCast AI enables a more engaging and personalized viewing experience. Real-time emotion analysis adapts the video content to the viewer’s reactions, creating a deeper emotional connection.
  • Interactive Video Platform: MorphCast’s platform allows for the creation of interactive videos with multiple narrative branches. This makes the viewing experience more dynamic and allows the viewer to take an active role in the story.
  • Applicability in Other Contexts: This use case demonstrates how artificial intelligence can be used to create interactive and personalized video content. MorphCast’s technology can be applied in various contexts, such as education, training, marketing, and entertainment.

AI Act Compliance Analysis:

The use of emotion analysis technologies like MorphCast AI must comply with the AI Act. Here are some aspects to consider:

  • Transparency and Consent: It is important to inform viewers about the use of MorphCast AI and obtain their informed consent for the analysis of their emotions.
  • Data Anonymization: MorphCast AI does not collect any data on viewers’ emotions; instead, it tailors the experience in real time directly within the user’s browser.
  • Data Security: It is crucial to adopt adequate measures for the security of viewers’ data, protecting it from unauthorized access, breaches, or misuse.
  • Data Minimization: It is important to collect only the data necessary for the specific analysis, avoiding the collection of unnecessary or irrelevant data.
  • Proportionality: MorphCast AI analysis must be proportionate to the risk and should not be used to make decisions that could significantly impact viewers’ lives.

Conclusion:

The use of MorphCast AI and the interactive video platform in this use case has demonstrated significant potential for creating engaging and personalized awareness campaigns. Adapting the content to the viewer’s emotions allows for a deeper and more meaningful learning experience, contributing to promoting diversity and inclusion effectively. However, it is crucial to use this technology in compliance with the AI Act and respecting ethical and privacy principles.

Further Considerations:

  • Impact Measurement: MorphCast AI analysis can be used to measure the awareness campaign’s impact, assessing whether it has led to a change in viewers’ attitudes and behaviors.
  • Ethics and Privacy: It is important to consider the ethical and privacy implications of using emotion analysis technologies. It is crucial that viewers’ data be collected and used responsibly and transparently.

Using innovative technologies like MorphCast’s can contribute to creating more engaging and effective awareness campaigns while ensuring compliance with regulations like the AI Act and respecting ethical and privacy principles.

Concerned about your AI project's compliance with the AI Act and other regulations?

Let's discuss it

Do you have great ideas on how MorphCast can help you reach your goals?

Find the right product

About the Author

Stefano Bargagni

Internet serial entrepreneur with a background in computer science (hardware and software), Stefano coded the e-commerce platform and founded the online retailer CHL Spa in early 1993, one year before Amazon. He is the Founder and CEO of MorphCast.