In a country as geographically vast as Australia, delivering high quality health care cost effectively can be difficult. AI-powered large language models (LLMs) have the potential to support the delivery of health care through telehealth services. With their increasing ability to understand and generate natural language in real time, the adoption of LLMs may reduce administrative burden, improve clinical consistency, and extend services to communities that have historically been underserviced.
LLMs and telehealth services
We expect to see the application of LLMs in several key areas, including:
- Automated triage and symptom checking: We are starting to see LLMs embedded within patient-facing chatbots that can record symptoms in plain, consistent language, apply Australian clinical triage protocols, and direct patients to appropriate virtual clinics or emergency services.[1] This use of LLMs frees clinicians from time-consuming data-gathering tasks and enables patients to be directed to the right service efficiently.
- Generation of draft clinical notes: LLMs are increasingly being used to prepare concise, accurate draft clinical notes following the delivery of health services via telehealth. Key clinical details are captured consistently, reducing the administrative workload of health care professionals and enabling patients to be transferred smoothly between providers[2] (a minimal illustrative sketch follows this list).
- Mental health: Under the supervision of registered mental health practitioners, LLMs are being used to deliver cognitive behavioural therapy exercises and to collect mood and behaviour data between sessions.[3] This expands access to mental health support in regions with limited specialist availability, reduces the cost of seeking and obtaining mental health care, and increases the number of patients each practitioner can treat effectively at a time of very high demand and recognised limited availability.
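To make the draft-notes workflow concrete, the Python sketch below shows one way such a feature might be wired up. It is a minimal sketch under stated assumptions: `call_llm` is a hypothetical placeholder for whichever LLM API a telehealth platform actually integrates, and the prompt and note sections are illustrative only, not a clinical standard.

```python
from dataclasses import dataclass

@dataclass
class DraftNote:
    text: str
    reviewed: bool = False  # a clinician must mark the note reviewed before filing

# Hypothetical prompt template; the section headings are illustrative only.
NOTE_PROMPT = """You are drafting a clinical note for a telehealth consultation.
Summarise the transcript into: Presenting complaint, History, Assessment, Plan.
Flag uncertainty explicitly rather than guessing.

Transcript:
{transcript}
"""

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for the platform's actual LLM client."""
    raise NotImplementedError("wire this to the LLM API the platform uses")

def draft_clinical_note(transcript: str) -> DraftNote:
    # The model output is only ever a draft: it enters the patient record
    # solely after a clinician reviews and approves it.
    return DraftNote(text=call_llm(NOTE_PROMPT.format(transcript=transcript)))
```

The `reviewed` flag is the point of the design: the LLM accelerates drafting, but the note is filed only once a clinician has signed it off, consistent with the human oversight obligations discussed below.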
Challenges
LLMs can be deployed effectively within Australian telehealth ecosystems only where material risks are managed proactively.
Clinical safety
The success of LLMs in the provision of clinical care depends on the availability of complete and unbiased training data. To mitigate the risks associated with training data, providers must manage data quality proactively and maintain human oversight of model outputs to safeguard clinical quality.
Regulatory compliance
LLM software that is integrated into a telehealth platform may be classified as a medical device under the Therapeutic Goods Act 1989 (Cth)[4] and regulated by the Therapeutic Goods Administration (TGA).[5] Sponsors of medical devices must hold evidence that enables the TGA to evaluate the device's safety and performance, including supporting clinical data.
Privacy considerations
Telehealth providers must prioritise strong data privacy and security protocols to safeguard sensitive health information, consistent with their obligations under the Privacy Act 1988 (Cth).[6]
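As a concrete, deliberately simplified illustration of one such protocol, the Python sketch below strips obvious direct identifiers from consultation text before it leaves the clinical system. The regex patterns are assumptions for demonstration only; compliant de-identification requires considerably more than pattern matching (see the OAIC's guidance[7]).

```python
import re

# Illustrative patterns only: real de-identification is a governance process,
# not a regex list. These rules are assumptions chosen for demonstration.
REDACTIONS = [
    (re.compile(r"\b[A-Z0-9._%+-]+@[A-Z0-9.-]+\.[A-Z]{2,}\b", re.I), "[EMAIL]"),
    (re.compile(r"\b(?:\+?61|0)4\d{2}\s?\d{3}\s?\d{3}\b"), "[PHONE]"),  # AU mobiles
    (re.compile(r"\b\d{10,11}\b"), "[ID]"),  # e.g. Medicare-style numeric IDs
]

def redact(text: str) -> str:
    """Strip obvious direct identifiers before text leaves the clinical system."""
    for pattern, token in REDACTIONS:
        text = pattern.sub(token, text)
    return text

print(redact("Call Jo on 0412 345 678 or email jo@example.com"))
# -> "Call Jo on [PHONE] or email [EMAIL]"
```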
LLMs are poised to become integral to the delivery of care via telehealth in Australia. These benefits are coupled with significant obligations: human oversight to safeguard clinical quality, conformity with TGA requirements, and rigorous privacy and data security controls. Providers that embrace transparent disclosure and robust risk management will be able to unlock the opportunity that LLMs offer while maintaining the trust that underpins high quality virtual care.
Medical Devices Consultation by the TGA
The regulation of telehealth will evolve. Our team has previously covered the TGA’s consultation paper ‘Clarifying and strengthening the regulation of Artificial Intelligence (AI)’ released on 12 September 2024. Through this consultation, the TGA received feedback from 53 key stakeholders within the healthcare and therapeutic goods sectors on the regulation of AI in healthcare.
A summary of this initial feedback is accessible on the TGA Consultation Hub, with key stakeholders broadly agreeing that the existing framework can effectively regulate AI, but that there is room for further nuance.
We look forward to providing an update on the outcome of this consultation when the TGA's final report is issued to the Australian Government.
Featured image by Chen from Pixabay.
[1] Christopher Pearce et al, ‘Artificial intelligence and the clinical world: a view from the front line’ (2019) 210(S6) Medical Journal of Australia S38, S38 (accessible at: https://onlinelibrary.wiley.com/doi/full/10.5694/mja2.50025).
[2] Tiago Cunha Reis, ‘Artificial intelligence and natural language processing for improved telemedicine: Before, during and after remote consultation’ (2025) 57(8) Atención Primaria 1, 5 (accessible at: https://www.sciencedirect.com/science/article/pii/S0212656725000149).
[3] Bianca Farmakis, ‘My AI therapist won’t stop texting me’ The Australian (online, 23 January 2025) (accessible at: https://www.theaustralian.com.au/health/my-ai-therapist-wont-stop-texting-me/news-story/07d8d195fa9a882861037a6e913afbde?btr=b421507820b2ddfe69e2df4812615df1).
[4] See section 41BD(1) of the Therapeutic Goods Act 1989 (Cth), where the definition of medical device includes software that is used for the purpose of diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of disease.
[5] See TGA’s guidance on AI and medical device software: https://www.tga.gov.au/how-we-regulate/manufacturing/manufacture-medical-device/manufacture-specific-types-medical-devices/artificial-intelligence-ai-and-medical-device-software.
[6] The obligations of the Privacy Act 1988 (Cth) apply to ‘APP entities’, which include all private health service providers per sections 6, 6C and 6D(4)(b).
[7] See OAIC’s guidance on de-identification: https://www.oaic.gov.au/privacy/privacy-guidance-for-organisations-and-government-agencies/handling-personal-information/de-identification-and-the-privacy-act.