GPs turn to AI to help with patient workload

2025-01-14 01:19:00

Abstract: AI tools are aiding UK GPs by automating tasks like transcription, saving time and improving patient care. AI can also assess cancer risk and suggest actions. However, caution is advised.

In the UK, the difficulty of booking an appointment with a general practitioner (GP) is a well-known problem. Even if an appointment is successfully made, the increasingly heavy workload of doctors means that consultation times may be shorter than either the doctor or patient would like.

However, Dr. Deepali Misra-Sharp, a GP partner in Birmingham, has found that artificial intelligence (AI) has relieved some of the administrative burden of her work, allowing her to focus more on patients. About four months ago, Dr. Misra-Sharp started using a free AI-assisted medical transcription tool called Heidi Health, which listens to and transcribes patient appointments, and she says it has made a big difference.

“Normally, when I’m with a patient, I have to record all sorts of information, which distracts me during the consultation,” she said. “Now I can spend all my time making eye contact and actively listening to the patient. It makes for a higher-quality consultation.” She said the technology streamlines her workflow, saving "two to three minutes, or even more" per consultation. She also cited other benefits: "It reduces the risk of errors and omissions in my medical records."

With the number of medical practitioners decreasing and the number of patients continuing to grow, GPs are facing tremendous pressure. According to the British Medical Association (BMA), a full-time GP is currently responsible for an average of 2,273 patients, an increase of 17% since September 2015. So, could AI be a solution to help GPs reduce administrative tasks and alleviate burnout?

Some studies suggest that this is possible. A report published by Health Education England in 2019 estimated that new technologies such as AI could save at least one minute per patient, equivalent to 5.7 million hours of work for GPs. Meanwhile, a 2020 study by the University of Oxford found that 44% of the administrative work currently performed in GP practices could be largely or completely automated, freeing up more time to spend with patients.

Danish company Corti is working on exactly this. It has developed AI that can listen to medical consultations, whether by phone or in person, and suggest follow-up questions, prompts, and treatment options, as well as automate record-keeping. Corti says its technology handles about 150,000 patient interactions a day across hospitals, GP practices, and other healthcare facilities in Europe and the US, totaling about 100 million per year. Lars Maaløe, co-founder and CTO of Corti, said: “Our idea is to allow doctors to spend more time with patients.” He said the technology can suggest questions based on previous conversations it has heard in other medical settings.

Mr. Maaløe said: “The AI can access relevant conversations and recognise that, in 10,000 similar conversations, question X was asked most of the time, but it hasn’t been asked here yet.” He added: “I imagine GPs going through consultations one after another, with almost no time to confer with colleagues. This is like giving them advice from a colleague.” He also said the system can draw on patients’ historical data. “For example, it could ask: did you remember to ask whether the patient’s right knee is still hurting?”

But are patients willing to have technology listen to and record their conversations? Mr. Maaløe stated that “the data does not leave the system.” However, he does think it is good practice to inform patients. “If the patient objects, the doctor cannot record. We rarely see this, because patients can see that it results in better record-keeping.” Dr. Misra-Sharp said that she informs patients that she has a listening device to help her take notes. “No one has objected to this so far, but if they did, I wouldn’t use it.”

Meanwhile, 1,400 GP practices in England are currently using the C the Signs platform, which uses AI to analyze patient medical records, check for different signs, symptoms, and risk factors of cancer, and recommend actions to be taken. Dr. Bea Bakshi, CEO and co-founder of C the Signs, who is also a GP, said: “It can capture symptoms such as coughs, colds, and bloating, and basically see within a minute if there is anything relevant in their history.” The AI is trained based on published medical research papers.

Dr. Bakshi said: “For example, it might say that the patient has a risk of pancreatic cancer and should have a pancreatic scan, and then the doctor will decide whether to refer them to the appropriate pathway.” “It doesn’t make a diagnosis, but it can help.” She said the platform has conducted over 400,000 cancer risk assessments in real-world settings, detecting cancer in more than 30,000 patients across over 50 different cancer types.

A report on AI released by the BMA this year found that "AI should be expected to transform, rather than replace, healthcare work by automating routine tasks and improving efficiency." Dr. Katie Bramall-Stainer, chair of the BMA’s General Practitioners Committee, said in a statement: “We recognise the potential for AI to revolutionise care in the NHS – but if it is not implemented safely, it could also cause considerable harm. AI is susceptible to bias and errors, could compromise patient privacy, and is still a work in progress.”

She added: “While AI can act as another tool in the GP’s toolbox to enhance and complement what they can offer, it is not a panacea. We cannot wait for the promise of tomorrow’s AI to deliver the much-needed productivity, consistency and safety improvements today.” Meanwhile, Alison Dennis, partner and co-head of the international life sciences team at law firm Taylor Wessing, warned that GPs need to be cautious when using AI.

Ms. Dennis said: “Generative AI tools are highly likely to fail to provide a full, comprehensive or correct diagnosis or treatment pathway, and may even give incorrect diagnoses or treatment pathways (so-called hallucinations), or outputs based on clinically incorrect training data.” She added: “AI tools that are trained on reliable data sets and fully validated for clinical use – almost certainly for a specific clinical use – are more appropriate for use in clinical practice.” Professional medical products, she said, must be regulated and receive some form of official certification.

“The NHS will also want to ensure that all data inputted into the tools is kept securely within NHS system infrastructure, and not absorbed by the tool provider as further training data without appropriate GDPR safeguards.” For GPs like Dr. Misra-Sharp, it has already changed how they work. “It has allowed me to enjoy consultations again, instead of feeling rushed.”