The EU Artificial Intelligence Act (AI Act) is the world’s first comprehensive AI regulation, aiming to ensure that AI systems placed on the EU market are safe, trustworthy, and respect fundamental rights. It entered into force in August 2024 and applies in stages: prohibitions take effect from early 2025, most provisions apply from August 2026, and certain obligations for high-risk AI embedded in regulated products (such as medical devices) follow in 2027. The AI Act contains 113 articles and 13 annexes.

Key Elements:
1. Risk-Based Approach:
- Prohibited AI Systems: AI practices deemed to pose unacceptable risk (e.g., social scoring, manipulative subliminal techniques).
- High-Risk AI Systems: Includes AI used in medical devices, critical infrastructure, law enforcement, etc. These are subject to stringent compliance requirements.
- Limited-Risk AI: Requires transparency (e.g., chatbots must disclose they are AI).
- Minimal-Risk AI: The majority of AI systems (e.g., spam filters), which face no additional obligations under the Act.
2. High-Risk AI Obligations (Impacting MedTech & Software):
- Risk management and quality management systems.
- Data governance and dataset quality.
- Logging, traceability, and record-keeping.
- Transparency and user information.
- Human oversight requirements.
- Robustness, accuracy, and cybersecurity.
- CE Marking: for devices covered by existing sectoral legislation (MDR/IVDR), AI Act requirements are assessed within the established conformity assessment procedures, so both frameworks are addressed in a harmonized way.
3. Specific Role Definitions:
- Provider: The entity that develops an AI system (or has one developed) and places it on the market or puts it into service under its own name or trademark.
- Deployer: An entity using an AI system under its authority in a professional context (e.g., hospitals, clinicians).
- Importer/Distributor: Economic operators with supply-chain obligations, such as verifying conformity before making an AI system available on the EU market.
4. AI Regulatory Sandbox and Innovation Support:
- Provides a controlled environment in which AI providers can develop, train, validate, and test innovative AI systems for a limited period under regulatory supervision.
- SME-friendly approach to reduce regulatory burden for startups.
5. Governance and Enforcement:
- National competent authorities, supported by the European AI Office, ensure harmonized enforcement across Member States.
- Significant fines for non-compliance: up to EUR 35 million or 7 % of total worldwide annual turnover, whichever is higher, for the most serious infringements (prohibited practices), with lower tiers for other breaches.
Key take-aways:
- Compliance with EU MDR/IVDR remains mandatory, but additional AI-specific obligations will apply.
- Harmonized standards and MDCG guidance (e.g., MDCG 2025-6) will help manufacturers navigate the overlapping compliance pathways.
Conclusion:
The EU AI Act marks a groundbreaking step in ensuring that AI technologies are developed and deployed in a safe, transparent, and human-centric manner. By introducing a risk-based framework and aligning AI obligations with existing sectoral regulations like the MDR and IVDR, the Act sets a global benchmark for trustworthy AI. For businesses, especially in regulated sectors like healthcare, proactive preparation for compliance will not only ensure market access but also build long-term credibility and user trust in AI-driven innovations.
For more details, contact RN Consulting Solutions’ MedTech compliance experts.
👉 Follow us on LinkedIn for expert insights on medical device compliance, EU MDR, FDA regulations, and quality system strategies. https://www.linkedin.com/company/rn-consulting-solutions/
Stay informed, stay compliant!