The rapid evolution of artificial intelligence and the growing use of software in healthcare delivery are transforming the medtech landscape, offering unprecedented opportunities for innovation in patient care.
The Therapeutic Goods Administration (TGA) recently published its Clarifying and Strengthening the Regulation of Medical Device Software including Artificial Intelligence (AI) Report, which reviews the TGA’s position on the regulation of software that meets the definition of a medical device (software as a medical device, or SAMD) and of AI. The regulation of SAMD and AI will become increasingly relevant for medtech developers, privacy professionals, and anyone navigating the intersection of technology and the health sector in Australia.
We previously covered the consultation that preceded this review in our post, It’s alive! Safe and responsible AI in therapeutic goods.
A technology-agnostic, risk-based approach
The TGA’s regulatory framework remains fundamentally technology-agnostic and risk-based. Regulation is not tied to specific technology or types of AI, but rather to the risks posed by a device (or software) throughout its lifecycle. This approach is designed to be flexible and responsive to accommodate rapid innovation without the need for constant regulatory or legislative overhaul.
This places an onus on developers and sponsors to proactively identify, mitigate, and monitor risks.
Key legislative and regulatory findings
The 2025 review confirms that the current legislative framework is largely fit for purpose, but highlights several areas for potential refinement:
Definitions and roles: The TGA recognises that terms like ‘manufacturer’ and ‘sponsor’ may not align with the realities of software and AI development, where the roles of ‘developer’ and ‘deployer’ are better understood. The TGA also recognised the need for clearer definitions and guidance to ensure all parties understand their regulatory obligations, especially when software is supplied via online platforms or developed overseas.
The TGA is considering whether to amend the Therapeutic Goods Act 1989 to better reflect these roles, or to clarify them through updated guidance.
Supply and access: The ease of distributing software-based medical devices online challenges traditional notions of ‘supply’. The TGA is considering amendments to ensure that access to devices through digital means is appropriately regulated. This includes reviewing the definition of ‘supply’ to ensure it covers access to technology online, digitally, or virtually.
Assigning responsibility: As AI systems increasingly replace or augment human decision-making, the question of who is responsible for the outputs of deployed AI becomes more complex. The TGA is reviewing whether current language in the legislation adequately assigns responsibility, particularly in cases where AI outputs may lead to regulatory breaches.
This is especially relevant for strict liability offences, where responsibility does not depend on intent or fault.
Compliance and enforcement: There is a recognised need for improved education, guidance, and compliance. Many developers are unaware that their products may meet the definition of a medical device, or misunderstand the implications of being regulated as a therapeutic good. The TGA is prioritising direct engagement with the software sector, developing new resources, and taking action to remove unapproved devices from the market.
A specific review of digital scribes is also underway to determine whether they should be regulated as medical devices.
Review of software exclusions for consumer health
The TGA is re-examining the list of software products that are excluded from therapeutic goods regulation, particularly given the increasing use of AI. Exclusions for digital mental health tools, consumer health products, and certain laboratory information management systems are under urgent review amid concern that they may no longer be appropriate given the increased risk these technologies can pose. The TGA is considering whether to remove or amend these exclusions, or to introduce new exemptions with specific conditions.
As AI and advanced analytics become increasingly embedded in consumer-facing technologies — such as fitness trackers, smartwatches, and mobile health apps — the distinction between general wellness products and regulated medical devices is becoming less clear.
Some apps and wearables previously considered low-risk may soon be subject to stricter regulatory oversight, including requirements for transparency, performance validation, and post-market surveillance. Developers and providers of these technologies will need to continually assess whether their products meet the definition of a medical device (and so are regulated as therapeutic goods), ensure compliance with evolving TGA guidance, and be prepared for increased scrutiny around claims, data use, and user safety.
Guidance for adaptive AI, open datasets, and performance monitoring
Emerging applications of new technology, such as adaptive AI (which can change functionality post-deployment) and the use of open datasets or software of unknown provenance, are at the forefront of regulatory concern. The TGA acknowledges that current processes are based on static models, and that adaptive systems may require new approaches to change control, validation, and ongoing monitoring. Guidance is being developed to address these challenges, likely with a focus on:
- defining what constitutes a ‘significant change’ in adaptive AI, and how those changes should be managed and reported
- providing clarity on the use and validation of open datasets and software of unknown provenance, referencing applicable international standards (such as ISO/IEC 5338:2023 and IEC 62304) and
- enhancing post-market performance monitoring, including real-world data collection and mandatory adverse event reporting.
Transparency and user information
Stakeholders, including clinicians and consumers, are calling for greater transparency about medical device software and AI. This includes clear labelling, information about datasets used in training models, and in-app notifications about risks and updates.
The TGA is reviewing advertising provisions and considering modifications that could be made to the Australian Register of Therapeutic Goods (ARTG) to provide more accessible information about approved devices, their intended use, and AI components. The TGA is also considering the introduction of Unique Device Identification (UDI) systems that include software and AI-related details.
International harmonisation
Australia’s regulatory approach is closely aligned with international standards and practices (which is essential given most medical devices supplied locally are imported and certified overseas). The TGA is actively engaged with global regulators, including the International Medical Device Regulators Forum (IMDRF), to promote harmonisation, reduce regulatory burden, and facilitate timely access to innovative devices. The TGA is also monitoring international developments such as the EU AI Act and the US FDA’s approach to AI-enabled medical devices.
Implications for medtech developers and privacy professionals
For emerging medtech companies, these updates underscore the importance of:
- understanding whether software or an app meets the definition of a ‘medical device’ for the purpose of regulation by the TGA
- clarifying the roles and responsibilities of developers, deployers, and sponsors
- ensuring robust risk management, data governance, and performance monitoring processes are in place, including compliance with relevant international standards
- preparing for increased scrutiny around transparency, user information, and post-market surveillance and
- staying informed about evolving guidance, particularly for adaptive AI, the use of open datasets, and changes to software exclusions.
Looking ahead
The TGA’s 2025 review signals a proactive and consultative approach to regulating medical device software and AI. While the core framework remains stable, we expect further targeted consultations and guidance will follow, particularly in areas where technology is outpacing regulation. Medtech innovators should engage early with the TGA, seek expert advice on compliance, and prioritise transparency and user safety in their product development.
As the regulatory landscape continues to evolve, privacy, IP, and health law professionals will play a critical role in helping companies navigate these complexities — ensuring that innovation in healthcare is both safe and responsible.
Feature image: Dietmar Rabich / Wikimedia Commons