Artificial intelligence (AI) is a term that refers to a range of software technologies designed to mimic human intelligence and perform tasks autonomously.
Examples of AI include:
- machine learning (ML)
- deep learning
- neural networks
- large language models (LLMs)
- generative AI
- adaptive AI
- natural language processing (NLP)
- computer vision
- robotics
- expert systems.
We regulate software when it meets the definition of a medical device under section 41BD of the Therapeutic Goods Act 1989 (the Act). Developers of AI-enabled medical device software may meet the definition of a manufacturer or sponsor (or both) of a medical device under the Act. It’s important to understand when and how your product is regulated and what your responsibilities are.
When we regulate AI as a medical device
Australia’s medical device regulatory framework is technology agnostic, which means we regulate products based on their intended purpose, not the technology they use. The intended purpose – what the product is used for – is defined by the manufacturer and determines whether the product meets the definition of a medical device under the Act, regardless of the platform used – whether a watch, phone, tablet, cloud service, laptop, or hardware device.
Software or AI products including apps, websites, programs, internet-based services or packages will be regulated as medical devices if they are intended for:
- diagnosis, prevention, monitoring, prediction, prognosis, or treatment of a disease, injury, or disability
- alleviation of, or compensation for, an injury or disability
- investigation of the anatomy or of a physiological process
- control or support of conception.
Examples of medical devices that use AI include:
- apps that help diagnose melanoma from photos taken on a mobile phone
- cloud-based analytics that predict patient deterioration
- chatbots that suggest, deliver or monitor treatment for consumers or health professionals
- clinical decision support tools that use generative AI to provide diagnostic or treatment recommendations
- eye disease screening apps for conditions such as diabetic retinopathy, glaucoma and macular degeneration
- radiology image analysis to aid in diagnosing pneumothorax, pneumonia and tumours.
How we regulate software including AI
To be legally supplied in Australia, medical devices must undergo pre-market assessment and be included in the Australian Register of Therapeutic Goods (ARTG), unless excluded or exempt.
If you develop an AI medical device, you are likely to be considered a manufacturer, sponsor or both, and must comply with the regulatory framework. See our guidance on How we regulate software products for more information.
List of AI-enabled medical devices in the Australian Register of Therapeutic Goods (ARTG)
We have published a list of AI-enabled medical devices that are in the ARTG. The medical devices listed incorporate AI or machine learning (ML) technologies.
Managing scope creep in AI systems
Manufacturers should monitor how system updates affect their product’s functionality. New features or functionality may change the intended purpose and cause the product to meet the definition of a medical device – sometimes referred to as “scope creep” or “feature creep”.
Where new features or functionality change the intended purpose or performance of a medical device, these must not be implemented until the device has received appropriate regulatory approvals. This may include submitting a Device Change Request (DCR) to vary the ARTG inclusion.
It is therefore important for the manufacturer to continuously monitor their software’s performance and functionality to assess the impact of updates or identify unintended scope creep.
Example of scope creep in AI
A developer creates a digital scribe that uses AI to record and summarise clinician-patient conversations. Initially, it does not meet the definition of a medical device. Later, the developer adds a feature that suggests diagnoses or treatments that haven’t been mentioned during the conversation between the clinician and their patient. The addition of this feature changes the intended purpose of the product, and it will now meet the definition of a medical device.
The developer must either:
- not release the update and continue supplying the original version, or
- complete our pre-market process and include the updated version in the ARTG.
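One practical way to manage this risk is to gate unapproved features behind release controls, so a capability that would change the intended purpose cannot ship before approval. The sketch below is illustrative only – the feature names and the simple flag store are assumptions, not TGA-prescribed tooling:

```python
# Illustrative sketch: hypothetical feature flags for a digital scribe.
# Feature names and the approval gate are assumptions, not regulatory requirements.

APPROVED_FEATURES = {
    "record_conversation",   # original, non-device functionality
    "summarise_notes",       # original, non-device functionality
}

PENDING_APPROVAL = {
    "suggest_diagnoses",     # would change the intended purpose -> medical device
}

def is_enabled(feature: str) -> bool:
    """A feature ships only if it is approved and not awaiting regulatory approval."""
    return feature in APPROVED_FEATURES and feature not in PENDING_APPROVAL

def release_build(features: set[str]) -> set[str]:
    """Strip any requested feature that has not cleared the approval gate."""
    return {f for f in features if is_enabled(f)}
```

With this gate in place, a build requesting both `record_conversation` and `suggest_diagnoses` would ship with only the former until the updated device is included in the ARTG.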
Managing off-label use
When a manufacturer becomes aware that their product is being used outside its intended purpose (known as off-label use), they must either:
- take steps to prevent further off-label use, or
- cease supply, revise the intended purpose to include the off-label use, and undergo the appropriate TGA pre-market assessment to include the product as a new kind of medical device in the ARTG.
Example of off-label use
A developer releases a large language model (LLM) designed to provide general information to consumers. They later discover it is being used to provide health advice, including diagnoses and treatment strategies. Since this use likely meets the definition of a medical device, the developer, who will meet the definition of a manufacturer under the Act, must either:
- implement controls to prevent the system from providing health advice, or
- stop supply of the product, seek our approval for the new intended use and include the product in the ARTG.
Evidence requirements for software using AI
In addition to general software requirements, the manufacturer of software that uses AI or machine learning (ML) must hold evidence that is sufficiently transparent to enable evaluation of the safety and performance of the product.
For more information, see our Evidence requirements for software using AI webpage.
Using synthetic data
The term synthetic data is commonly used to refer to data that is artificially generated to augment or replace real-world data for training or validation purposes. Other terms that might be used interchangeably include “artificial data” and “simulated data.”
These terms all describe data that is created through algorithms or simulations to mimic the characteristics of real data. Synthetic data is often used to assist in training and validating AI models where it isn’t possible to rely solely on real-world data.
Synthetic data may be used in place of data from real patients or devices for training or validation of AI systems. Manufacturers must provide a clear rationale for its use, along with a description of how the data was generated and its relevance to the intended use.
For many use cases, synthetic data may not provide sufficient depth and variability to adequately validate a product. Where data is readily available in large volumes, synthetic data is less likely to be considered appropriate.
For clinical validation, synthetic data may supplement, but will generally not replace, clinical data in satisfying clinical evidence requirements.
Acceptable use cases for synthetic data may include:
- rare diseases where real-world data is limited
- scenarios where privacy concerns restrict access to real data.
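As a toy illustration of the augmentation idea, synthetic records for a rare condition can be generated by sampling from summary statistics of the few real samples available. The measurements and parameters below are invented for illustration, and this is not a validated generation method – a real pipeline would need the documented rationale and relevance described above:

```python
import random
import statistics

# Illustrative sketch: augment a small rare-disease dataset by sampling new
# values from a normal distribution fitted to the real samples. The biomarker
# values are hypothetical; real synthetic-data use requires a clear rationale,
# a description of how the data was generated, and relevance to intended use.

real_biomarker_values = [4.1, 3.8, 4.5, 4.0, 3.9]  # hypothetical measurements

def generate_synthetic(real: list[float], n: int, seed: int = 0) -> list[float]:
    """Draw n synthetic values from a normal fit to the real samples."""
    rng = random.Random(seed)
    mu = statistics.mean(real)
    sigma = statistics.stdev(real)
    return [rng.gauss(mu, sigma) for _ in range(n)]

synthetic = generate_synthetic(real_biomarker_values, n=100)
```

A simple parametric fit like this inherits any bias in the small real sample, which is one reason synthetic data alone rarely provides the depth and variability needed for validation.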
Whole of government AI oversight
The Australian Commission on Safety and Quality in Health Care (the Commission) is responsible for e-Health safety. Guidance to support AI safety is in development.
We are working with the Commission and other parts of the Australian Government Department of Health and Aged Care to ensure the safety and performance of AI while minimising regulatory burden.
Our work on responsible AI includes working with the Department of Industry, Science and Resources to ensure alignment with the National AI Plan.
More information
You can access more information about how we regulate software that meets the definition of a medical device in our guidance on How we regulate software products. This can help you:
- determine if your device meets the definition of a medical device
- better understand your obligations.
For clarification after reviewing the information on our site, you can email us at digital.devices@tga.gov.au. You can also request a formal pre-submission meeting with us before submitting an application.