AI and accountability in physiotherapy
A recent decision by the Administrative Review Tribunal has highlighted the serious professional risks of using artificial intelligence (AI) in clinical documentation without appropriate oversight.
According to The Australian Financial Review, an NDIS participant’s bid to increase her $202,000 support package was rejected in part because two physiotherapists had used AI tools to help draft reports submitted as evidence.
The tribunal found that the reports contained factual inaccuracies—including describing the participant as living in a ‘rural’ area when she actually resided in suburban Perth—and that the clinicians could not adequately explain the reasoning behind some of their conclusions.
The case underscores an important truth: while AI can streamline clinical documentation, physiotherapists remain fully accountable for every word submitted under their name.
Technology does not transfer responsibility
Under the Health Practitioner Regulation National Law and Ahpra’s Code of conduct, physiotherapists must ensure that all clinical documentation is accurate, contemporaneous and verifiable.
The use of AI—whether to draft, format or summarise—does not shift or dilute that responsibility.
Tribunal senior member Dr Bertus de Villiers emphasised that while he did not object to AI use in principle, practitioners must remain ‘responsible and accountable for the content of a report’.
This aligns with Ahpra’s professional standards, which prioritise accuracy, integrity and clinician accountability for all professional communications.
When AI-generated material is not thoroughly reviewed, practitioners risk breaching these standards—and potentially facing professional, reputational or legal consequences.
What went wrong in the case
The two physiotherapists involved, working via telehealth, reportedly used AI programs to assist in drafting letters that supported the participant’s claim for additional NDIS funding.
While they stated that they reviewed the reports, the tribunal found several inconsistencies that could not be rationally explained.
One physiotherapist could not clarify why her report referred to the participant’s ‘rural’ location; another admitted that phrases such as ‘minimum necessary’ may have been introduced by the AI system.
Ultimately, the tribunal concluded that the reports were unreliable and upheld the NDIA’s decision to deny additional support.
Importantly, the tribunal did not condemn the use of AI itself but the lack of professional oversight and verification.
Professional, ethical and privacy implications
This case highlights several overlapping responsibilities for physiotherapists using AI-assisted documentation:
- the Ahpra Code of conduct—practitioners must ensure that their records are factual and not misleading. Submitting unverified AI-generated content could breach this obligation
- the NDIS Code of Conduct—providers must act with integrity and competence. AI-assisted errors influencing funding decisions could constitute a breach
- the Privacy Act 1988—using AI platforms that process identifiable patient data raises obligations regarding consent, data handling and information storage, especially when using overseas servers.
The message from regulators is clear: AI can assist, but it cannot assume professional accountability.
A framework for responsible AI use in physiotherapy
AI tools can offer real value to physiotherapists by reducing administrative burden, but they must be integrated safely.
The following framework can help mitigate risk:
- verify every output—independently confirm all AI-generated text against source documentation and clinical observations
- maintain documentation trails—retain both original and AI-assisted versions of reports, noting where AI was used and how it was verified
- disclose material AI use—if AI contributed meaningfully to a report, disclose this and ensure informed patient consent for data use
- retain human judgement—limit AI’s role to drafting and structuring, not reasoning, clinical interpretation or justification of funding levels
- select compliant tools—use systems specifically designed for healthcare that comply with Australian privacy and data security requirements.
Trust and the profession’s reputation
This case extends beyond one tribunal matter.
Trust in physiotherapists’ professional judgement underpins the entire allied health sector.
If AI-generated inaccuracies become common, NDIS planners and agencies may respond with heightened scrutiny, audits and documentation demands—penalising the many for the errors of a few.
Conversely, clinicians who use AI transparently and responsibly can lead by example, demonstrating that technology, when governed properly, enhances rather than undermines clinical credibility.
Building literacy and governance
To navigate this evolving landscape, three coordinated elements are essential:
- AI literacy—continuing professional development programs must go beyond basic tool use and teach clinicians how to critically appraise AI outputs
- professional guidance—the APA and Ahpra should continue developing practical standards for AI use in clinical documentation
- organisational policy—practices should implement clear internal policies on data governance, verification and disclosure.
Oversight is non-negotiable
This case is not an indictment of technology; it is a reminder that professional oversight remains essential.
AI can assist with structure and speed, but it cannot discern truth, apply reasoning or uphold ethical responsibility.
When physiotherapists sign their name to a report, they affirm both the accuracy of its content and the integrity of their judgement.
Technology should serve that trust, not replace it.
Disclosure: the views expressed in this article are solely those of the author and do not represent the official position of the APA, its AI Advisory Group or CliniScribe AI.
Barry Nguyen APAM
Barry Nguyen APAM is a physiotherapist, a software engineer, the founder and CEO of CliniScribe AI and a member of the APA’s AI Advisory Group.
© Copyright 2026 by Australian Physiotherapy Association. All rights reserved.
