Meet your doctor’s new assistant — AI
Your doctor may have a new assistant that isn’t human.
Artificial intelligence is not new to medicine or health care delivery. But as large language models, such as ChatGPT, rapidly become more powerful, doctors have increasingly been adopting patient-facing AI tools in their practice.
But AI comes with its own set of risks and challenges, and those can be especially serious in the context of health care.
“Certainly there’s a whole number of ways that AI can be used,” said Lorian Hardcastle, an assistant professor and health policy expert at the University of Calgary who studies AI in health.
“But I think we always need to be considering the trade-offs in terms of quality of care and accuracy.”
Scribe tool piloted in Alberta hospitals
One type of AI tool currently in use in Alberta is the medical “scribe,” which records the conversation between patient and doctor during a consultation and produces a written summary.
Tools like this have grown popular with doctors because they cut down on administrative duties, saving hours that physicians could potentially use to see additional patients.
Numerous commercial examples exist, but Alberta Health Services has developed its own AI scribe in the hopes of mitigating concerns about patient data being shared with private companies.
![Lorian Hardcastle, an assistant professor and health policy expert at the University of Calgary who studies AI in health.](https://i.cbc.ca/1.7450708.1738716446!/fileImage/httpImage/image.jpg_gen/derivatives/original_1180/lorian-hardcastle.jpg?im=)
The original form of the AHS scribe tool was created by Dr. Michael Weldon, a Red Deer emergency physician with a background in engineering.
From there, it evolved into a joint project between the University of Alberta, AHS and the Canadian Medical Association, supported by a $1 million grant. Since November 2024, the AI scribe has been in real-world use as part of a pilot project.
A spokesperson for AHS said about 15 emergency physicians are using the tool at the Red Deer hospital, the University of Alberta Hospital and the Royal Alexandra Hospital. The pilot runs until June 2026, with no time frame for broader availability.
As part of the terms and conditions of using the tool, doctors are required to review the generated summary for accuracy before copying and pasting it into the patient’s medical chart. The data never leaves AHS servers.
“It sounds to me like AHS has gone to some lengths to try to mitigate some of the legal concerns in this space,” said Hardcastle.
“I guess one of the concerns that I might have is, are we really reviewing these AI-generated notes thoroughly before they go in the chart or might some people skim them or not review them especially thoroughly? Are we compromising accuracy by having these notes generated by this tool?”
Nevertheless, physicians themselves have found such tools useful. A 2024 study commissioned by the Ontario government found that doctors using AI scribes reported spending three fewer hours per week on after-hours administrative tasks, along with better work-life balance.
“They’re emailing us, they’re calling us with things like, I’m smiling more today … or I’ve been practising for 30 years and this is the first Saturday I’ve had to myself,” said Meaghan Nolan, co-founder of Calgary-based Mikata Health, which develops AI-based health-care tools, including a scribe.
“It’s really exciting. I think this really can move the needle in terms of the challenges that we’re facing in our health-care system.”
Questions of consent
Physicians must also get patient consent before using the AHS scribe.
An “advice to the profession” document regarding AI from the College of Physicians and Surgeons of Alberta states that “at a minimum, patients should be… made aware of potential risks involving data integrity, bias and privacy.”
While noting that the document represents guidance for doctors rather than a professional standard, a CPSA spokesperson said in an email that it encourages physicians “to exercise their best judgment when using AI tools in their practice, inform patients of any known risks and ensure the care they provide meets the expectations laid out in our standards of practice.”
The CPSA’s standard for informed consent requires that the patient “demonstrates a reasonable understanding of the information provided and the reasonably foreseeable consequences of both a decision and a failure to make a decision.”
When asked whether a patient agreeing to a doctor using an AI scribe without being informed of any potential risks would meet the threshold of informed consent, the CPSA said it “would depend on the facts of the individual situation.”
For the pilot of the AHS scribe, doctors are provided with a short script to use with patients, which CBC News has seen. It mentions that the patient’s information may be used for quality improvement, and that declining consent will not impact care. It does not mention risks such as bias or error.
Hardcastle pointed out that many questions regarding responsible use of AI have not yet been adjudicated.
“We have a lot of judicial interpretation around what informed consent looks like to surgery, to a diagnostic test, and explaining what the courts [call] material risks … but this is different because this is a note-taking device, in essence, and typically we don’t get patient consent on the procedural aspects of health care.”
A spokesperson for Health Minister Adriana LaGrange said in a statement that the government recognizes “the importance of clear and meaningful informed consent” around AI, including “ensuring that patients fully understand how these tools work and how their data will be used.”
Alberta Health did not answer questions about current guidelines on the use of commercial AI tools by physicians, whether it knows how many doctors currently use such tools, or how it would address errors made by AI.
Primary Care Alberta, the new entity intended to take over primary care responsibility from AHS, did not provide answers to questions about AI sent by CBC News.
Few regulations
AHS has plenty of company in the field of health-care AI.
In May 2024, Alberta Innovates — a provincial Crown corporation — announced nearly $10 million in funding for research and development in health-care AI. The University of Alberta received nearly $4 million of that money for projects such as developing a suggestion-making AI tool for Alberta Health Link 811, and evaluating AI-enabled opioid overdose prediction.
Mikata Health received $800,000 toward developing its own AI scribe, which Nolan said they try to approach thoughtfully.
“This emerging concept that’s critical, that I think companies like ours do need to take just as seriously as privacy and security, is this idea of responsible AI development,” said Nolan.
“It’s a bit of a higher bar, both in terms of the fact that you’re incorporating AI into your product and that you serve health care as an industry. That’s how we see it, anyway.”
Before using any of these tools, doctors are required to submit a privacy impact assessment (PIA) to the Office of the Information and Privacy Commissioner (OIPC), detailing any potential risks to individual privacy that may arise.
The OIPC previously listed such assessments publicly, but due to changes in how it processes PIAs, that public registry stopped being updated at the end of September 2024. The OIPC told CBC News that the future of the public registry is under review.
![Meaghan Nolan, co-founder of Calgary-based Mikata Health, which develops AI-based health-care tools.](https://i.cbc.ca/1.7456781.1739323411!/fileImage/httpImage/image.jpg_gen/derivatives/original_1180/meaghan-nolan.jpg?im=)
At present, however, there are no laws specifically regulating the use of AI in Alberta. The provincial government has been working on regulations since early 2024, but a spokesperson did not respond to a request for comment on when those regulations might be finalized.
In response to a question about whether doctors using AI tools have enough regulatory or legal clarity, the Alberta Medical Association said no.
“The environment is generally not clear enough for the average physician to confidently use AI tools, though the risks of AI scribes if the physician diligently reviews and edits the records are potentially less than with other possible applications,” said a spokesperson in an email.
Building trust in Dr. AI
The role of AI in health care is only likely to grow. It is being used in medical research to find potential treatments for disease. A pilot project in Alberta will soon see AI added to existing MRI machines in an attempt to improve efficiency.
Diagnostic AI already exists, and both Hardcastle and Nolan point out that in some cases it has already proven to be more accurate than a human physician. Yet many remain hesitant.
“There’s still this … I don’t know how to describe it other than an ‘ick’ of having an AI do your diagnosis,” said Nolan. “It’s counterintuitive.”
She hopes for sound regulations to guide responsible development of health-care AI. That, along with people getting used to AI over time, might push AI to a place where, she believes, it could save lives.
“I think it’s extremely positive,” said Nolan. “I’m really optimistic about what those opportunities are.”