Introduction
In a world first, an autonomous AI was recently approved as a medical device.
This means the skin cancer detection system (Skin Analytics’ DERM) is legally authorised to make clinical decisions without human oversight.
This marks a milestone in digital health. For early-stage digital health developers, navigating the regulatory landscape can be one of the most challenging aspects of bringing an AI-driven clinical tool to market.
This blog explores the steps required to achieve UKCA marking under the UK Medical Device Regulations 2002 (UK MDR 2002), offering key insights for those looking to develop AI-powered clinical decision systems.
Understanding Compliance
UKCA marking is essential for medical devices entering the UK market, signifying compliance with health, safety, and performance standards. Following Brexit, the UK Medical Device Regulations 2002 now govern medical device approvals.
If you have developed a piece of software (such as an app) that is not incorporated into a piece of medical hardware, that software might still be classed as a medical device according to the MHRA.
Your software will be considered to have a medical purpose if it has one or more of the following functions:
- Prevention of disease
- Diagnosis of disease, an injury, or handicap
- Monitoring of disease, an injury, or handicap
- Treatment or alleviation of disease, an injury, or handicap
- Compensation for an injury or handicap
- Investigation, replacement, or modification of the anatomy or of a physiological process
- Control of conception
Your software might also deal with data from tests carried out outside the body, known as in vitro data. If it performs one of the functions above together with one of the following, it is considered to have a medical purpose:
- Concerning a physiological or pathological state
- Concerning a congenital abnormality
- To determine safety and compatibility with potential recipients
- To monitor therapeutic measures
There are some devices that can be classed as In Vitro Diagnostic Medical Devices. If you’re not sure, check the latest MHRA guidelines.
The legislation lays out a system for classifying medical devices into different risk classes. The higher the risk, the greater the level of assessment required by UK Approved Bodies.
Under Part II of the UK MDR 2002, ‘general’ medical devices are grouped into four classes as follows:
- Class I – generally regarded as low risk
- Class IIa – generally regarded as lower medium risk
- Class IIb – generally regarded as higher medium risk
- Class III – generally regarded as high risk
Classification of a medical device will depend upon a series of factors, including:
- How long the device is intended to be in continuous use
- Whether or not the device is invasive or surgically invasive
- Whether the device is implantable or active
- Whether or not the device contains a substance which, if used separately, is considered to be a medicinal product and is liable to act on the body with an action ancillary to that of the device
For AI-driven diagnostic systems like DERM, compliance requires:

- Classification of the Device: AI-based diagnostic tools typically fall under Class IIa, IIb, or III, depending on the level of risk associated with automated decision-making (MHRA, 2023)
- Clinical Evaluation: Demonstrating safety and efficacy through clinical studies is crucial, ensuring real-world applicability and minimising bias
- Post-Market Surveillance: AI models continuously evolve, requiring ongoing monitoring and reporting of performance in clinical settings
Key Steps in the Regulatory Journey
1. Define the Intended Use and Risk Classification
Understanding how regulators classify your AI system determines the level of scrutiny it will face. The UK MDR uses the Rule 11 classification framework for software as a medical device (SaMD), meaning most AI-driven diagnostic tools are classified as higher-risk devices (UK MDR 2002).
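To make the Rule 11 logic concrete, here is a minimal sketch of how its decision tree might be expressed in code. This is an illustrative simplification, not a regulatory tool: the function name, parameters, and severity labels are assumptions, and real classification must follow the full legal text.

```python
def classify_samd(informs_decisions: bool,
                  worst_case: str = "minor",
                  monitors_vital_params: bool = False) -> str:
    """Illustrative Rule 11-style logic for software as a medical device.

    worst_case: severity of harm if the software's output is wrong,
    one of "death_or_irreversible", "serious", or "minor".
    """
    if informs_decisions:
        if worst_case == "death_or_irreversible":
            return "Class III"
        if worst_case == "serious":
            return "Class IIb"
        return "Class IIa"
    if monitors_vital_params:
        return "Class IIb"  # variations could put the patient in immediate danger
    return "Class I"        # all other software

# An autonomous skin-cancer triage tool: a missed melanoma could cause
# serious deterioration of health, so it lands in at least Class IIb.
print(classify_samd(informs_decisions=True, worst_case="serious"))  # Class IIb
```

The key point the sketch captures is that autonomy does not lower the class: the worst plausible consequence of a wrong output drives the classification.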
2. Develop a Robust Quality Management System (QMS)
Compliance with ISO 13485 ensures your organisation follows best practices in design, development, and risk management. A QMS helps structure documentation and ensures alignment with regulatory expectations.
3. Generate Clinical Evidence
Clinical validation is mandatory for UKCA marking. This involves:
- Conducting prospective clinical studies to validate AI performance.
- Comparing AI outputs against expert clinicians to measure sensitivity and specificity.
- Addressing potential biases by ensuring a diverse dataset (Topol, 2019).
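Sensitivity and specificity are the two headline numbers in these comparisons. As a minimal sketch, assuming binary labels where 1 means malignant and 0 means benign, they can be computed from a confusion matrix like this (the study data here is made up for illustration):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = proportion of true positives detected;
    specificity = proportion of true negatives correctly cleared."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical study arm: ground-truth biopsy results vs. AI calls.
truth = [1, 1, 1, 0, 0, 0, 0, 1]
ai    = [1, 1, 0, 0, 0, 1, 0, 1]
sens, spec = sensitivity_specificity(truth, ai)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.75 each
```

In a diagnostic context the two errors are not symmetric: a false negative (missed melanoma) is usually far more serious than a false positive, which is why regulators scrutinise sensitivity especially closely.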
4. Engage with an Approved Body
Approved Bodies (ABs) are organisations designated by the Medicines and Healthcare products Regulatory Agency (MHRA) to assess medical device compliance. For autonomous AI tools, working with an experienced AB early in development can smooth the approval process.
5. Post-Market Surveillance and Continuous Learning
Unlike traditional medical devices, AI models can change over time. The UK MDR requires manufacturers to implement:
- Post-market clinical follow-ups (PMCFs) to track real-world performance
- Periodic Safety Update Reports (PSURs) to ensure ongoing regulatory compliance (MHRA, 2023)
Lessons from DERM: Key Takeaways for AI Startups
1. Start Regulatory Planning Early
One of the biggest mistakes early-stage startups make is treating regulatory approval as an afterthought. Regulatory planning should be integrated into product development from day one.
2. Prioritise Transparency and Explainability
Transparency and explainability are essential for earning the trust of users, healthcare professionals, and regulators alike. Explainable AI refers to a set of processes that allow users to understand the outputs produced by machine learning models, including deep learning and neural networks. It also matters within organisations: awareness of how decisions are made is a precondition for accountability. Explainable AI can likewise help address concerns about bias and decision-making opacity (Lipton, 2018).
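One simple, model-agnostic explainability technique is permutation importance: shuffle a single input feature and measure how much performance drops. A large drop means the model leans heavily on that feature. Below is a self-contained sketch with a toy model; the feature names and data are invented for illustration.

```python
import random

def permutation_importance(model, X, y, feature_idx, n_repeats=20, seed=0):
    """Shuffle one feature column and return the mean drop in accuracy.
    `model` is any object exposing a predict(rows) method."""
    rng = random.Random(seed)
    base = sum(p == t for p, t in zip(model.predict(X), y)) / len(y)
    drops = []
    for _ in range(n_repeats):
        col = [row[feature_idx] for row in X]
        rng.shuffle(col)
        Xp = [row[:feature_idx] + [v] + row[feature_idx + 1:]
              for row, v in zip(X, col)]
        acc = sum(p == t for p, t in zip(model.predict(Xp), y)) / len(y)
        drops.append(base - acc)
    return sum(drops) / len(drops)

# Toy "model" that only ever looks at feature 0 (say, lesion diameter),
# ignoring feature 1 entirely.
class Toy:
    def predict(self, rows):
        return [1 if r[0] > 0.5 else 0 for r in rows]

X = [[0.9, 0.1], [0.2, 0.8], [0.7, 0.7], [0.1, 0.2]]
y = [1, 0, 1, 0]
imp0 = permutation_importance(Toy(), X, y, feature_idx=0)
imp1 = permutation_importance(Toy(), X, y, feature_idx=1)
print(f"feature 0: {imp0:.3f}, feature 1: {imp1:.3f}")
```

Here shuffling feature 1 changes nothing (importance 0), while shuffling feature 0 degrades accuracy, correctly revealing what the model actually depends on. Techniques like this give clinicians and regulators a concrete answer to "why did the model say that?"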
3. Build for Real-World Integration
AI tools must fit seamlessly into clinical workflows. DERM’s success was partly due to its ability to integrate with existing dermatology pathways, minimising disruption for clinicians. As with regulatory planning, it is far easier to involve clinicians early and design integration into your product than to retrofit it later.
So What Can We Learn From DERM?
Navigating the regulatory landscape for autonomous AI in healthcare requires early planning, rigorous clinical validation, and continuous compliance with evolving regulations. By following the structured approach outlined above, startups can increase their chances of achieving UKCA marking and bringing innovative AI-driven healthcare solutions to market.
For digital health developers, Skin Analytics’ DERM serves as an inspiring example of how AI can transform clinical decision-making—provided that regulatory hurdles are carefully navigated. If you’re developing a similar product, now is the time to engage with regulatory experts, align with UK MDR requirements, and lay the groundwork for successful clinical adoption.