Informed consent in the age of AI

The APA’s insurance partner BMS, together with Scott Shelly and Alexander Sheridan of health law firm Barry Nilsson, discusses informed consent and physiotherapists’ obligations in the use of AI tools.

AI tools can provide many benefits, from operational efficiency to support with diagnosis and documentation. However, they can also introduce new risks relating to patient privacy and safety, legal regulation and ethics. Obtaining informed consent from the patient before using an AI tool is therefore essential. Informed consent ensures that patients make informed, voluntary decisions about the healthcare they receive. Patients have a right to feel safe and informed when receiving treatment, and the use of AI without consent can damage the trust between a patient and a practitioner.

Requirements of informed consent 

Physiotherapists are encouraged to review their informed consent obligations under the shared Code of conduct. Before physiotherapists use AI within their clinics or with patients, they should inform their patients, gain consent for its use and document this consent in their records. 

At Barry Nilsson, we recommend that physiotherapists take the same approach to informed consent in the use of AI in their practice as they would use for treatment. 

Specifically, physiotherapists should: 

  • explain what AI program is being used
  • explain what the specific AI program is used for (eg, as a scribe or for treatment planning, assessment or other)
  • explain what information the AI will collect and where it will be stored
  • explain the limitations of the AI program and what physiotherapist oversight and supervision of the AI will occur 
  • inform the patient that they can opt out of using the AI program in their treatment
  • provide the patient with opportunities to ask questions regarding the use of AI. 

When using AI, physiotherapists must: 

  • be well informed about the AI tool and its uses, limitations and functionality so they can appropriately inform their patients
  • ensure that they are using language the patient understands when discussing it
  • be prepared for patients who decline the use of AI in their treatments and consultations, and have alternative options available, such as manual note taking in place of an AI scribe
  • document consent in the patient records
  • remember that clinical records made with the assistance of or by AI tools must be reviewed by a physiotherapist for accuracy and appropriateness.

A physiotherapist’s obligations regarding clinical record keeping remain even when using AI tools. 

Physiotherapists should also be mindful of using AI tools that record the conversation between the physiotherapist and the patient.

In some Australian jurisdictions it is unlawful to record a private conversation, even where you may be a party to that conversation, without consent. 

Recording a patient with an AI tool without obtaining their informed consent may therefore leave a physiotherapist open to penalties or civil claims.

Clinical scenario 

While there are examples of physiotherapists inappropriately using AI to generate reports, there has not yet been a published decision relating to a practitioner failing to gain consent to use AI.

At Barry Nilsson, we anticipate that claims regarding informed consent for the use of AI will increase as more clinics begin to implement AI tools. Take the following hypothetical example. 

A new patient attends your practice for advice and treatment regarding rehabilitation from an Achilles tendon repair surgery. 

Your clinic has incorporated a suite of AI tools including: 

  • AI note taking that records the conversation within the consult and transcribes it
  • an AI tool that reviews clinical records and generates correspondence to the patient
  • a new AI tool that reviews the clinical records and recommends appropriate treatment plans. 

After leaving the clinic, the patient discusses the treatment plan with their surgeon, who recommends that the patient query aspects of the plan with the physiotherapist. 

In the next consultation, the patient is upset to find out that the AI tool, not the physiotherapist, has generated the treatment plan based on recordings of the patient assessment in the initial consult. 

Learnings and safeguards 

Physiotherapists should employ a range of safeguards when considering the use of AI tools to:

  • record a conversation with a patient
  • review the patient’s information for the purpose of clinical record keeping, generating correspondence and generating the treatment plan
  • assist the physiotherapist to generate an appropriate treatment plan based on the patient’s information and the AI tool’s clinical database.

At Barry Nilsson, we suggest that the physiotherapist take specific time to discuss with the patient that the AI tool will suggest treatment plans based on the information entered into it, the physiotherapist’s role in reviewing those suggestions and the safeguards that are in place.

There is case law in relation to new and emerging technologies that suggests that informed consent to use these technologies requires a higher level of patient understanding. 

To assist patient understanding of an AI tool, a physiotherapist could: 

  • provide written documentation explaining the AI tool and its functions, allowing adequate time for the patient to comprehend the information before making an informed decision
  • always use clear and straightforward language when speaking with and instructing patients
  • ask patients to confirm what has been said to ensure proper comprehension
  • be aware of their patient’s literacy levels and adapt communication accordingly
  • ensure they understand the AI tool they’re using and are aware of their obligations under the shared Code of conduct.

More information 

Learn more about your professional obligations and informed consent from the following resources. 

‘Meeting your professional obligations when using Artificial Intelligence in healthcare’  

‘AI in healthcare: what are your responsibilities?’

‘AI and accountability in physiotherapy’  

Disclaimer: this article is written by BMS, the APA insurance partner, and Scott Shelly and Alexander Sheridan of health law firm Barry Nilsson. Barry Nilsson communications are intended to provide commentary and general information. They should not be relied upon as legal advice. Formal legal advice should be sought in particular transactions or on matters of interest arising from this communication. ‘APA Member Insurance’ refers to Professional Indemnity and Public & Products Liability Insurance that provides cover for an individual. You must be a current Australian Physiotherapy Association (APA) member to be eligible to register for the APA Member Insurance Program. If your membership ceases you will not be offered renewal when your policy expires. In offering this insurance to our members APA is a distributor of BMS Risk Solutions Pty Ltd (BMS) AFSL 461594, ABN 45161187980. The insurance is issued by BMS on behalf of AAI Limited ABN 48 005 297 807 trading as Vero Insurance (the insurer). BMS acts on behalf of the insurer and not on your behalf. This is general advice only and BMS has not considered whether it was suitable for your personal circumstances, current objectives, needs or financial situation. Please read the Policy Wording and the BMS Terms of Engagement which contains the Financial Services Guide before making a decision about purchasing this policy. APA receives an annual payment from BMS which is used for insurance related marketing and professional development activities to support our members.
 

© Copyright 2026 by Australian Physiotherapy Association. All rights reserved.