Last updated: August 22, 2025
Key Principles of the EU Artificial Intelligence Act
The Artificial Intelligence Act (Regulation (EU) 2024/1689), adopted by the European Union, explicitly regulates so-called Emotion AI, that is, systems intended to infer emotional states or intentions from biometric data. The Act applies not only to uses within the territory of the Union but also extraterritorially: it covers providers and deployers who place such systems on the EU market, or whose outputs are directed at individuals located in the Union, regardless of where the provider or user is established. Example: a U.S.-based user who applies the technology to analyze the emotions of EU users falls within the Act's territorial scope and is subject to its restrictions.
The regulation follows a risk-based approach, structured into:
- Absolute prohibitions for “unacceptable” uses;
- Strict obligations for “high-risk” systems.
Source: Official EU text: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32024R1689
Our Approach: Use Prohibited in the EU
MorphCast Inc. is established outside the European Union and does not place its AI systems on the EU market, nor does it offer them to users located in the Union, in accordance with Article 2(1) and 2(2) of Regulation (EU) 2024/1689.
We are currently developing technical safeguards that will automatically disable our on-device Emotion AI engine whenever it detects that the user’s device is operating within EU territory. Because all emotion analysis runs entirely in the end-user’s browser, processing camera streams locally, the engine itself can check where the device is located and, if necessary, block its functionality with a clear notice that redirects users back to this policy page.
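To illustrate in principle how a browser-side safeguard of this kind could work, here is a minimal sketch in JavaScript. Everything in it (the `isLikelyInEU` and `startEmotionEngine` functions, the timezone list, the notice text) is a hypothetical assumption for illustration, not MorphCast’s actual implementation, which is still in development and not public.

```javascript
// Hypothetical sketch only; all names are assumptions for illustration.
// Partial, illustrative list of IANA timezones used in EU member states.
// A timezone is a weak proxy for location (e.g. Europe/Zurich is not in
// the EU), so a real safeguard would combine several signals
// (IP geolocation, browser locale, timezone) rather than rely on this
// heuristic alone.
const EU_TIMEZONES = new Set([
  "Europe/Paris", "Europe/Berlin", "Europe/Madrid", "Europe/Rome",
  "Europe/Amsterdam", "Europe/Brussels", "Europe/Dublin", "Europe/Warsaw",
]);

// True when the reported timezone suggests the device is in the EU.
function isLikelyInEU(timeZone) {
  return EU_TIMEZONES.has(timeZone);
}

// Gate engine start-up behind the check: show a notice and refuse to
// start when the device appears to be in the EU; otherwise start normally.
function startEmotionEngine(timeZone, startFn, showNoticeFn) {
  if (isLikelyInEU(timeZone)) {
    showNoticeFn("Emotion AI is disabled in the EU; see our policy page.");
    return false;
  }
  startFn();
  return true;
}
```

In a real browser context, the timezone argument could be obtained from `Intl.DateTimeFormat().resolvedOptions().timeZone`; passing it in as a parameter keeps the sketch testable outside a browser.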
User Responsibility
Any use by organizations, entities, or individuals located in the European Union, or directed toward them, is strictly prohibited. Such use is the sole responsibility of the user or integrator and may be reported to the competent authorities.
MorphCast reserves the right to monitor usage patterns to detect potential violations of this policy.
Rationale Behind This Policy
MorphCast has conducted an in-depth feasibility assessment of the compliance obligations under the EU AI Act. Our analysis determined that compliance for permitted high-risk applications requires operational adaptations that are prohibitively complex and costly, and that the company is currently not in a position to undertake. Given the significant financial and operational resources required, along with the substantial legal risks arising even from minor compliance deviations, MorphCast has decided to exclude entirely any use within, or targeted toward, the European Union market. This approach preserves our operational integrity and significantly reduces risk for both MorphCast and our clients.
For clarity, we list the adjustments we will be required to make in order to achieve compliance in the EU:
- Ongoing audits to ensure permanent compliance with the regulatory framework (Art. 72 and Art. 74);
- Extensive technical documentation, including detailed information about the functionality, architecture, and safety assessment of the AI system (Art. 11 and Annex IV), which would give competitors an advantage by effectively revealing proprietary trade secrets;
- Dedicated human oversight personnel, required to continuously monitor system operations and intervene when necessary (Art. 14);
- Advanced logging and recording infrastructure, to ensure complete traceability of AI activities and guarantee transparency and control (Art. 12);
- A risk management system, requiring regular evaluations and continuous updates to identify, analyze, and mitigate potential risks associated with the use of AI (Art. 9);
- Quality assurance procedures and conformity assessment, mandatory to ensure that all technical and regulatory standards are continuously met (Art. 17 and Art. 43);
- CE marking obligations, subject to a conformity assessment procedure involving a Notified Body (Art. 43 and Art. 48); in the absence of harmonized standards or common specifications, the Regulation mandates that high-risk systems undergo assessment by an accredited Notified Body (Art. 43, procedure under Annex VII). Once a Notified Body is involved, its identification number must be placed next to the CE marking and also displayed in any promotional material (Art. 48). The list of Notified Bodies for Regulation (EU) 2024/1689 on Artificial Intelligence is publicly available in the European Commission’s NANDO database.
- Registration in the EU database for high-risk AI systems prior to their being placed on the market or put into service (Art. 49 and Art. 71);
- Transparency obligations, including the provision of understandable information to end-users about the system’s operation and limitations (Art. 13);
- Post-market monitoring, with an obligation to continuously monitor the behavior of the system after it has been placed on the market (Art. 72);
- Notification of serious incidents, to be promptly reported to the relevant authorities in the event of significant anomalies (Art. 73).
Contact
For compliance inquiries, please contact MorphCast’s Legal & Compliance Department at legal@morphcast.com
What You Need to Know
Can a European company use MorphCast Emotion AI in video conferences if all participants are located outside the EU?
Answer:
No. Under the EU AI Act, territorial scope can cover uses outside the EU if effects occur in the EU (Art. 2(1)(c)) and an EU-established company that initiates/controls the system may be deemed to be putting it into service in the EU (Art. 3). Until official guidance or case law clarifies otherwise, MorphCast prohibits this use by EU-based entities and does not authorize or support it.
Can a non-EU company use MorphCast Emotion AI if all users and operations are outside the EU?
Answer:
Yes, if and only if there is no EU nexus. Under Art. 2(1)(c), the AI Act applies when use produces effects within the EU. It does not apply where all of the following are true:
- the company is established outside the EU;
- all users (e.g., call participants) are outside the EU;
- no data, outputs, decisions, or services are accessed from, stored in, offered to, or acted upon within the EU.
Note: Any EU link (EU users/customers/staff, EU-hosted servers, EU access to results, or targeting the EU market) can trigger applicability and corresponding obligations.
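The “all of the following” test above can be sketched as a simple checklist function: a single EU link returning true means the Act may apply. This is illustration only, not legal advice, and the `hasEUNexus` function and its field names are assumptions rather than any official API.

```javascript
// Illustrative checklist mirroring the note above: any single EU link is
// enough to trigger potential applicability of the AI Act. The function
// and field names are hypothetical, not an official MorphCast API.
function hasEUNexus(deployment) {
  return Boolean(
    deployment.establishedInEU ||  // company established in the EU
    deployment.usersInEU ||        // users, customers, or staff in the EU
    deployment.euAccessToData ||   // data, outputs, or results accessed,
                                   // stored, or acted upon in the EU
    deployment.euHostedServers ||  // EU-hosted infrastructure
    deployment.targetsEUMarket     // offering directed at the EU market
  );
}
```

Only a deployment where every flag is false corresponds to the “zero EU nexus” case described above.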
What can MorphCast guarantee to its customers regarding AI Regulation compliance, and what is beyond its control?
Answer:
MorphCast can guarantee that:
- Its Emotion AI engine is being equipped with a safeguard, currently in development as described above, that automatically blocks it within EU territory so that it will not operate when geolocation detects an EU IP address.
- It does not store or transmit facial images or biometric identifiers — emotion analysis is performed entirely on-device (in the user’s browser or app).
- It provides product-level configuration alerts (Customers’ Dashboard), clearly marking which products use Emotion AI (🚫) and which may use it (⚠️).
- It offers legal notices and guidance to help customers understand their responsibilities under the EU AI Act.
MorphCast cannot control:
- Whether customers access dashboards or results from within the EU, even if the AI processing happens outside the EU.
- The legal interpretation of complex or ambiguous cases under the AI Act — only the European AI Office or courts can provide definitive rulings.
Conclusion:
MorphCast takes every reasonable technical and legal precaution to ensure compliance with the EU AI Regulation, but ultimate responsibility for the correct use of Emotion AI lies with the customer.
The conclusions and interpretations provided on this page are the result of our internal analysis. While we maintain deep, ongoing expertise on this topic, which directly affects our core business, these insights are not intended to be authoritative or legally definitive. Given the inherent complexity of the Regulation and the many aspects still awaiting clarification from the European Union, we do not claim absolute accuracy. This page is subject to internal audit and will be updated regularly, at least once a month, to reflect any developments on the matter.