Innovation at hand: AI's impact on physiotherapy
Like it or not, artificial intelligence is poised to make an impact on the practice of physiotherapy, both in the short term and in the long term.
ChatGPT, the free artificial intelligence (AI) program, swept the world in 2023 with its apparent ability to answer any question posed, write essays and documents on command and even draft a novel from a few prompts.
Since its widespread launch, people have been wondering how it will affect their own work.
At first glance, physiotherapists might not think that AI has anything to offer them.
After all, physiotherapy is a hands-on profession—patients present with everything from an injured knee to a chronic neurological condition and clinicians rely on their manual skills and knowledge to treat them.
However, AI is likely to impact physiotherapy, allied health and healthcare in many ways.
For some health practices, including radiology and pathology services, the benefits of AI are already becoming clear as it streamlines administrative processes and enables faster, more sensitive detection of anomalous results.
‘AI has been used in healthcare since the mid-1980s.
‘We just haven’t called it that.
‘The bigger question is—where can it go and how pervasive will it be?’ says Professor Enrico Coiera, director of the Centre for Health Informatics at the Australian Institute of Health Innovation, Macquarie University, and co-founder of the Australian Alliance for Artificial Intelligence in Healthcare.
He notes that many instruments used in pathology and radiology incorporate AI technologies and that AI is increasingly being used to power symptom checkers in services such as Healthdirect Australia, the Australian Government’s virtual public health information service.
‘Without exaggeration, I think that in the next 20 to 30 years, almost everything we do will involve engaging with AI in one way or another, both as consumers and as professionals.’
Late last year, the Australian Alliance for Artificial Intelligence in Healthcare released A national policy roadmap for artificial intelligence in healthcare, a plan for the safe and ethical use of AI across the sector (Dorricott 2023).
The roadmap lays the foundation for Australia’s healthcare industry to embrace the opportunities offered by AI, identifying gaps in capabilities and providing guidance on key issues.
The policy recommendations focused on five key areas—safety, quality, ethics and security of AI; workforce training and development; consumer literacy; competitive industry; and research.
‘It’s a whole-of-sector approach,’ Enrico says.
Streamlining administration
Professor Enrico Coiera, from the Australian Alliance for Artificial Intelligence in Healthcare, says that AI is increasingly being used in healthcare.
One of the first uses of AI likely to be implemented widely across healthcare is applications that streamline and simplify paperwork.
‘Everybody writes documents.
‘Physios do. GPs do. Nurses do.
‘So everybody will feel the impact of that,’ Enrico says.
Physiotherapist turned software developer Barry Nguyen agrees.
Last year, at the Physiotherapy Research Foundation’s Physio Pitchfest, he pitched an AI-based tool called CliniScribe, which manages documents and referrals by taking clinical notes and using them to generate reports and referral letters, saving the clinician valuable time.
‘CliniScribe’s AI handles repetitive and menial tasks for physiotherapists, alleviating clinician burnout and enabling them to focus on delivering higher value patient care,’ Barry says.
Dr Ryan Gallagher MACP, a physiotherapist who now works as a data scientist for digital healthcare company Honeysuckle Health, believes that the way clinicians capture information from their patients will change with the introduction of AI-powered tools, removing the need to document every interaction verbatim.
He suggests that AI could also be used to gather information such as the patient’s previous medical history, which is typically collected on paper forms just prior to the appointment, giving the clinician more time to physically assess the patient.
‘I don’t think we’ll ever get to the point of allowing an AI to listen in on our clinical interactions.
‘But documenting everything from start to finish will be redundant if you can provide AI with examples of your existing documentation, indicate the key points from the patient interaction that you want to document and then generate the required documentation in the format you need.
‘I think that’s very realistic in the medium term.’
Decisions, decisions
Another area likely to be affected by the use of AI in the medium term is clinical decision support.
For many conditions, both chronic and acute, there are multiple approaches to treatment and AI can play a role in determining the best approach based on the information to hand.
‘Physiotherapists are always looking for the best treatment that’s going to get a result in a realistic time frame.
‘Highly experienced clinicians are better at doing that; junior clinicians need more time and guidance.
‘AI is going to be able to provide those recommendations, based on the patient’s history and examination, along with a much more detailed insight into how someone’s going to respond to a particular treatment option,’ Ryan says.
By converting clinical prediction rules into algorithms, AI-based apps can start to generate insights and make suggestions or even predictions, guiding clinicians to the best option or treatment plan based on the patient’s age, gender, clinical symptoms and so on.
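To make this concrete, a clinical prediction rule is essentially a set of weighted questions, which makes it straightforward to express as code. The sketch below is a deliberately simplified, hypothetical Python example; the variables, weights and thresholds are invented for illustration and are not taken from the article or from any validated clinical tool. An AI-based system would typically learn such weightings from large volumes of patient data rather than having them hard-coded.

```python
# A minimal sketch of a clinical prediction rule expressed as code.
# All variables, weights and thresholds here are hypothetical and for
# illustration only; they do not come from any validated clinical tool.

from dataclasses import dataclass

@dataclass
class PatientPresentation:
    age: int
    acute_onset: bool        # symptoms began within the last 6 weeks
    pain_score: int          # 0-10 numeric rating scale
    previous_episodes: int   # prior episodes of the same complaint

def suggest_pathway(p: PatientPresentation) -> str:
    """Return an illustrative treatment pathway from simple weighted rules."""
    score = 0
    score += 2 if p.acute_onset else 0
    score += 1 if p.pain_score >= 7 else 0
    score += 1 if p.previous_episodes >= 3 else 0
    score += 1 if p.age >= 65 else 0

    if score >= 4:
        return "escalate: consider referral and closer monitoring"
    if score >= 2:
        return "structured supervised program with early review"
    return "education and a home exercise program"

print(suggest_pathway(PatientPresentation(age=70, acute_onset=True,
                                          pain_score=8, previous_episodes=1)))
```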
Professor Steven McPhail, a physiotherapist who now heads up the Australian Centre for Health Services Innovation, says that using an AI approach to analyse how patients respond to treatments and interventions might allow physiotherapists to further personalise interventions.
‘At the moment the pathway from assessment to intervention selection can be quite crude in many areas of healthcare, including physiotherapy.
‘And a one-size-fits-all approach doesn’t work once we start moving into complex health conditions.
‘It can be tricky to know which intervention to commence and at what intensity, duration and frequency,’ Steve says.
‘As we digitise our systems, including our records, there is an opportunity to gain new insights by using AI to help overcome some of these elements of practice.’
Keeping patients on track
Dr Ryan Gallagher APAM says the way clinicians gather and use information will change with the introduction of AI-powered tools.
AI can also help patients stay on track with treatment plans, providing support and motivation and progressing exercise programs.
Physiotherapist Phebe Liston has developed a chatbot that walks the user through a simple evaluation of an injury.
The app, Physio Phebe, is underpinned by an AI-powered platform that provides a conversational interface for patients to describe their injury and then links them to videos with first aid advice and suggestions, including when to seek professional help.
Phebe says the app in its current form is very simple but the potential is there to create something more powerful.
Another direction for the development of consumer-facing apps lies in generating exercise programs for patients and, perhaps more importantly, keeping them engaged and motivated.
‘With AI we can start to incorporate an array of motivational factors and it can learn whether or not they’re effective for particular patients.
‘We may be able to determine particular approaches for goal setting and for the structure, content and frequency of reminders.
‘This is all fair game for AI to help personalise our care,’ says Steve.
AI in the clinic
The hands-on nature of physiotherapy means that there are very few measurement tools routinely used in the clinic.
Some clinicians will use a goniometer to assess range of motion or linear scales to measure reach; others look at how many repetitions of a simple movement like a calf raise can be done before fatigue sets in or assess balance and gait.
It’s hard to know how, where or even if AI can help with these assessments.
‘We’re starting to see programs pop up that can measure range of motion, monitor gait patterns or provide feedback on exercise program adherence.
‘Meanwhile, patients are using their smartphones to video themselves or record their exercise program to get feedback on the quality of the exercise and whether they’re doing it properly.
‘That’s something that’s going to become more and more prevalent,’ says Ryan.
He notes that there are still questions about the accuracy and clinical benefits of these apps but suggests that physiotherapists might be able to check on patients via telehealth appointments and remote monitoring.
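As a rough illustration of the range-of-motion measurement mentioned above: once a pose-estimation model has located the joints in a video frame, the remaining geometry is simple. The Python sketch below uses invented keypoint coordinates and assumes no particular pose-estimation library; in practice the hard part is the deep learning model that detects the keypoints reliably, not the angle calculation.

```python
# A minimal sketch of estimating a joint angle from pose-estimation keypoints
# (e.g. hip, knee and ankle positions detected in a smartphone video).
# The keypoint values below are invented for illustration; a real system would
# obtain them from a pose-estimation model, frame by frame.

import numpy as np

def joint_angle(a, b, c) -> float:
    """Angle at point b (in degrees) formed by the points a-b-c."""
    a, b, c = np.asarray(a, float), np.asarray(b, float), np.asarray(c, float)
    v1, v2 = a - b, c - b
    cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Hypothetical (x, y) keypoints in image coordinates for one video frame
hip, knee, ankle = (320, 210), (335, 330), (300, 445)
print(f"Estimated knee angle: {joint_angle(hip, knee, ankle):.1f} degrees")
```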
Pattern recognition is one thing that AI can do very well and deep learning is being used in image analysis—for example, in MRIs and other scans—to identify anomalous results, speeding up diagnosis and increasing accuracy.
While the use of pattern recognition may not be as straightforward in physiotherapy applications, it may help with analysing complex movement patterns, says Steve.
Where AI might turn out to be most useful in functional assessment is in recognising subtle changes to normal patterns, such as in gait or balance analyses.
For example, a group of researchers at Victoria University is developing wearable sensors that they hope will predict falls in people with mobility issues before stability is lost, allowing the falls to be prevented (Victoria University 2023).
Another group at the Australian e-Health Research Centre in Queensland is using wearable sensors on infants to see if they can identify cerebral palsy earlier through movement patterns, allowing earlier intervention (Koopman et al 2020).
Barriers and challenges
Professor Steve McPhail APAM says that in the future, clinicians may be able to use AI to decide on the best intervention or treatment pathway.
A number of challenges would need to be overcome before these applications of AI become commonplace in physiotherapy practice.
‘I was chatting with a colleague, who said, “This will be one of the last areas to have its practice defined by algorithms because each therapist treats each patient like a unique person and a unique scenario.”
‘We’re going to need a lot of data, a lot of information, if we want to develop more sophisticated systems,’ Steve says.
‘Physiotherapy is not as amenable to pattern recognition and an algorithm as a CT image or a plain film X-ray would be.
‘Because of that, there are some challenges in developing the technology.’
Steve also notes that just because AI can do a task as well as a human can, it doesn’t mean that it should.
‘In healthcare, it turns out that we’re probably already doing a pretty good job most of the time.
‘It can be more challenging for an AI application to add value to clinical outcomes than people would like to admit,’ he says.
While physiotherapists might be happy to use AI to streamline the administrative side of their practice, developers of AI-based therapeutic applications will need to make a compelling case for their clinical use.
‘Ideally, I want to see clinical trial data from multi-site studies or a meta-analysis showing how effective something is.
‘But at the moment we’re stuck in this land of one case study here or there with questionable methodologies and uncertain outcomes,’ Steve says.
Regulatory approval from the Therapeutic Goods Administration may also be necessary for devices and apps that have a therapeutic benefit.
The regulatory authority recently updated its guidance on the regulation of software-based medical devices, including AI-based products (Therapeutic Goods Administration 2023).
‘We are still learning about the best way to regulate digital health interventions and that includes decision support guided by artificial intelligence.
‘If a digital health system is making treatment recommendations, then the Therapeutic Goods Administration’s going to be interested,’ Steve says.
‘Physiotherapists should be aware that if they subscribe to services that are based overseas, these companies may not be across the requirements for regulation in Australia.
‘We need to be mindful of that risk and ensure that our practices are run in accordance with Australian regulation requirements.’
Privacy and ethics also need to be considered.
Consumers are already wary of organisations collecting their data and using it for unrelated purposes, and may baulk at an AI-powered app asking about their medical issues or collecting their data when they don’t know what it will be used for.
‘People are becoming savvier about what data privacy looks like and the risks that arise in the 21st century when people provide sensitive data.
‘That’s going to be amplified in healthcare,’ Ryan says.
For example, Honeysuckle Health uses healthcare data shared by both the patients and the health funds it interacts with to provide telephonic and digital healthcare.
‘We reach out to patients based on their healthcare interactions to offer the healthcare programs they need.
‘These are all very common and widely available programs, like cardiac rehabilitation, post-discharge support and rehab in the home,’ Ryan says.
The ethical use of data is an aspect of AI that needs to be addressed before it is widely used in healthcare, says Enrico.
‘There are lots of different frameworks around for AI ethics but basically we shouldn’t be shipping data to people who might exploit it.
‘A big worry about, say, ChatGPT is that if I use it with patient data right now, that data may go to the United States and be used by the company for other purposes.
‘There are ways of keeping it in Australia and many people who use generative AI in healthcare make sure that the data stays in Australia to avoid the issue, but consumers may not know their data is being harvested and re-used, which is an ethical problem,’ he says.
Finally, there is the issue of ChatGPT ‘hallucinations’, a phenomenon in which the chatbot generates plausible-sounding but false information rather than indicating that it doesn’t have an answer.
Examples have included citing research papers that don’t exist or basing legal arguments on made-up precedents.
AI users stress the importance of providing specific prompts and double-checking the output to ensure it is correct.
Where do we go from here?
From Enrico’s perspective, AI is something that the healthcare sector, including physiotherapy, needs to keep on top of to avoid falling behind.
‘AI is already out there being used by your patients.
‘Maybe they are self-treating when they shouldn’t or maybe they’re going home and not doing what you asked them to do because they’ve got some information from AI.
‘Right now, bright young physios are dictating their notes into ChatGPT, turning them into text and summarising them because that’s what they do; it’s a generational thing.
‘They may not know that it’s risky,’ he says.
‘You should be having a conversation with other groups—nursing is in much the same position as physiotherapy.
‘Think about what you want to do in terms of training the next generation of physios and bring the technologies into training; bring them into research.
‘There are smart and entrepreneurial physios out there inventing the products that will be bread and butter to everybody in five or 10 years’ time, so support and encourage those people.’
Some useful definitions
Artificial intelligence, commonly shortened to AI, is defined by Wikipedia as ‘the intelligence of machines or software, as opposed to the intelligence of… humans’.
As a concept, it was first defined in the 1950s as a machine’s ability to perform a task that previously would have required human intelligence (Wikipedia n.d.).
Terms you might encounter in articles about AI include the following.
Machine learning and deep learning—fields of AI study that use statistical algorithms and models to analyse and draw inferences from patterns in data; deep learning is a subset of machine learning built on multi-layered artificial neural networks. Technology based on machine and deep learning is now commonly used to identify patterns in radiological images and genetic data (a short code sketch after these definitions illustrates the basic idea).
Artificial neural networks—a form of machine/deep learning inspired by biological neural networks (like the brain) that is often used in predictive models, adaptive control and problem-solving.
Machine perception—the ability to use input from sensors to gather information.
Natural language processing—technology that allows programs to read, write and communicate in human languages. This is critical for generative AI.
Generative AI—AI that takes inputs such as text, images, audio, video or even code and uses them to generate new content in response to a prompt.
ChatGPT is the best known example of this but others specialising in image/video or audio creation abound.
Chatbot—a computer program designed to simulate conversation with human users. A simple chatbot may be based on a flow chart of responses, while a more sophisticated chatbot uses natural language processing, machine learning and generative AI to parse questions and supply a response. Siri and Alexa are examples of chatbots using AI.
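To give a concrete flavour of the machine learning definition above, the following Python sketch fits a simple model to a handful of labelled examples and then makes a prediction for a new case. The data are invented and clinically meaningless, and the scikit-learn library is an assumed dependency; the point is only to show the ‘learn a pattern from data, then predict’ loop.

```python
# A minimal sketch of machine learning: a model learns a pattern from labelled
# examples and then predicts the label for a new case. The dataset is invented
# for illustration and has no clinical validity; scikit-learn is assumed.

from sklearn.linear_model import LogisticRegression

# Each row: [age, baseline pain (0-10), sessions attended]
X = [[25, 3, 6], [71, 8, 2], [45, 5, 5], [66, 7, 3], [30, 2, 8], [58, 6, 4]]
# Label: 1 = recovered within 6 weeks in this made-up dataset, 0 = did not
y = [1, 0, 1, 0, 1, 0]

model = LogisticRegression().fit(X, y)
print(model.predict([[40, 4, 6]]))        # predicted class for a new patient
print(model.predict_proba([[40, 4, 6]]))  # predicted probabilities
```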
Physio Phebe
Phebe Liston APAM has developed a chatbot to guide consumers through basic care for musculoskeletal injuries.
Created by physiotherapist Phebe Liston APAM, Physio Phebe is a chatbot that helps users work out what to do for an injury.
‘I found a platform called Chatfuel—it runs on Facebook Messenger—and I was just having a play around with it,’ Phebe says.
An active social media user, she found that she was being asked a lot of questions about what to do for this or that injury and set up the chatbot as a creative exercise.
In 2019 the Physio Phebe chatbot was one of the finalists in the inaugural Physiotherapy Research Foundation Pitchfest competition.
Physio Phebe is based on a flow chart consisting of blocks of questions and responses.
For example, one of the early questions is whether the injury is in the upper or lower part of the body and the next question narrows it down to, say, the neck, shoulders or torso.
Once the site and nature of the injury have been established, the chatbot directs the user to a video explaining simple first aid measures such as rest, ice or exercises and suggests visiting a physiotherapist.
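For readers curious about what sits behind a flow-chart chatbot of this kind, the Python sketch below walks a user through a tree of questions to a piece of advice. The questions, options and advice text are invented for illustration; the real Physio Phebe bot is built on the Chatfuel platform rather than on code like this.

```python
# A minimal sketch of a flow-chart-style chatbot: each node asks a question and
# the answer selects the next node, until an advice node is reached. All content
# below is invented for illustration only.

FLOW = {
    "start": {
        "question": "Is the injury in the upper or lower part of the body?",
        "options": {"upper": "upper_site", "lower": "lower_site"},
    },
    "upper_site": {
        "question": "Is it the neck, shoulder or torso?",
        "options": {"neck": "advice", "shoulder": "advice", "torso": "advice"},
    },
    "lower_site": {
        "question": "Is it the hip, knee or ankle?",
        "options": {"hip": "advice", "knee": "advice", "ankle": "advice"},
    },
    "advice": {
        "question": None,
        "message": "Here is a video on simple first aid (rest, ice, gentle "
                   "movement). If symptoms persist, see a physiotherapist.",
    },
}

def run_chatbot() -> None:
    node = FLOW["start"]
    while node.get("question"):
        answer = input(node["question"] + " ").strip().lower()
        next_key = node["options"].get(answer)
        if next_key is None:
            print("Sorry, please answer with one of:", ", ".join(node["options"]))
            continue
        node = FLOW[next_key]
    print(node["message"])

if __name__ == "__main__":
    run_chatbot()
```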
‘It was a lot of work to set up but it’s a very user-friendly program—I’m not a tech guru by any means,’ says Phebe.
While this version of Physio Phebe isn’t AI-driven, it is the kind of app that could be adapted to an AI platform—Chatfuel now offers generative AI capabilities.
CliniScribe
Barry Nguyen APAM has developed an AI-based app to help generate clinical reports, referrals and letters.
When Barry Nguyen’s physiotherapy business dried up during the COVID-19 pandemic, he took the opportunity to retrain as a software engineer.
‘From there I started building my own projects and one of them was CliniScribe, a generative AI app that was a Pitchfest finalist last year,’ says Barry.
CliniScribe came out of Barry’s own experience of running a small physiotherapy business and his frustration with the paperwork required to keep up.
‘CliniScribe is an app that converts clinical SOAP [subjective, objective, assessment and plan] notes into reports, patient educational material, treatment plans, referral letters, GP letters, care plan letters and so on,’ he says.
The app uses patient notes to quickly generate required documentation, which the user can edit and customise as necessary, reducing time spent doing administrative work.
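As a general illustration of the technique (and not CliniScribe’s actual code, prompts or model), the Python sketch below sends a fictional SOAP note to a large language model and asks for a draft GP letter. The model name is a placeholder, an OpenAI API key is assumed, and the privacy, data-residency and regulatory considerations raised earlier in the article would need to be addressed before real patient notes were handled this way.

```python
# A minimal sketch of converting a clinical SOAP note into a draft GP letter
# with a large language model. The note is fictional, the model name is a
# placeholder, and this is not CliniScribe's implementation.

from openai import OpenAI  # requires the openai package and an API key

client = OpenAI()

soap_note = """
S: 3/52 right lateral elbow pain, worse with gripping. Office worker.
O: Tender common extensor origin, pain on resisted wrist extension, ROM full.
A: Likely lateral elbow tendinopathy.
P: Isometric loading program, activity modification, review in 2 weeks.
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You draft concise, professional letters to GPs from "
                    "physiotherapy SOAP notes. Do not invent clinical details."},
        {"role": "user",
         "content": f"Draft a GP letter from this SOAP note:\n{soap_note}"},
    ],
)

print(response.choices[0].message.content)  # draft for the clinician to edit
```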
It is free and currently has around 100 users in Australia and New Zealand.
CliniScribe will soon launch a subscription model with additional features, including the ability to directly add information from faxes and emails to patient files and to capture patient consult notes in a standardised SOAP format.
>> Barry Nguyen APAM is a former innovation adviser for the APA.