Emergency doctors estimate AI scribe ‘Heidi’ saving up to 10 minutes per patient

Source: Radio New Zealand

Doctors say the new AI scribe rolling out in EDs around the country is saving them up to 10 minutes per patient, and is particularly helpful for slow typists.

The tool, known as Heidi, was trialled in Hawke’s Bay Hospital’s ED, before the government announced it was being rolled out to all hospitals earlier this month.

The senior doctors’ union, ASMS, said in an update to members there had been no reported resistance from patients and senior medical officers had reported it eased cognitive pressure.

Health New Zealand (HNZ)’s director of digital innovation and AI, Sonny Taite, said clinicians were consistently reporting it reduced the time associated with clinical documentation, allowing them to focus more on patient care.

“Early qualitative feedback from senior medical officers indicates this has helped ease documentation pressure during busy shifts, and there has been no reported resistance from patients to its use in emergency settings.”

But with formal evaluation work ongoing, Health NZ was not attributing specific time-savings figures or quantified burnout outcomes to the tool at this stage.

Emergency physician Dr John Bonning said doctors in EDs were finding it “very helpful”, with its main benefit “speeding up those that are slow typists”.

It would normally take 15 minutes to see one patient and write up their notes, Bonning said, but one colleague had reported writing notes for three patients in 11 minutes – less than four minutes per patient.

Bonning himself had trialled the software a couple of times and was planning to incorporate it more into his work, and feedback among his colleagues had been mostly positive, with only about 10 percent deciding it was not for them.

“We do ask [patients’] consent before every use,” he said. “I don’t think I’ve ever had anybody say no, because it helps you do your job, and it helps you be more efficient.”

The app could summarise a handover with a paramedic, for example, which could then be turned into a referral letter or, later, a discharge note.

The notes could be quite wordy, and did need to be “very carefully edited, and occasionally it hallucinates and puts in false information, but not too much”, Bonning said.

Hallucinating, or adding false or illogical information to a response, is a known phenomenon among many types of AI. Tech giant IBM described it as “similar to how humans sometimes see figures in the clouds or faces on the moon”.

Emergency physician Dr John Bonning. Supplied / ACEM

HNZ’s Taite said feedback from 40 clinicians surveyed showed a need to “further improve accuracy and reduce editing effort, which would enhance trust and preserve time savings, particularly for senior clinicians”.

“Many also saw clear gains from smoothing workflow and device integration and better tailoring functionality to the realities of ED consults. Alongside this, there was interest in clearer guidance, templates, and training to support safe, confident use while reinforcing clinical reasoning and governance.”

Security features include encryption, two-factor authentication

Following hacks at both MediMap and ManageMyHealth in recent months, security is a topic front-of-mind for many in the health sector.

Taite explained Heidi operated as a secure cloud service and had been assessed against Health NZ’s privacy, security, and contractual requirements. “Appropriate safeguards are in place as part of the rollout,” he said.

Yass Omar, head of legal and regulatory affairs at Heidi, explained all data within Heidi was encrypted and de-identified, and the app used two-factor authentication.

Data was stored in the cloud, rather than on the device, unless it was waiting for an internet connection – in those cases, it was held in the app’s secure sandbox (that is, an isolated part of the app not accessible to anybody else) before being uploaded straight to the cloud once it reconnected.

The information collected by listening to conversations was transcribed and summarised in the app, and could then be copied and pasted into the patient’s notes in the hospital’s own IT system, where patient notes had always been stored.

“So you can imagine that [someone] finds an unlocked phone, they see the Heidi app, they press on it, it prompts them for 2FA [two factor authentication], they can’t pass that. And then the next step would be, oh, can I find some files? No, because they’re not actually stored on the phone.”

Yass Omar, head of legal and regulatory affairs at Heidi. Supplied

Heidi had worked with NAIAEAG, Health NZ’s AI group, to make sure its security features were up to scratch – a standard Omar said was “an exceptionally high bar” to meet.

None of the information fed into Heidi was used to train its AI. “Everything we do is about data minimisation,” he said. “We don’t collect any more data than we have to.”

Currently the encrypted, de-identified data was stored in a cloud-based server in Australia, but opening a server in New Zealand was on the cards.

“That’s something that is high in our priority for 2026,” Omar said. “The only thing that limits us is the availability of suitable infrastructure. At the scale that we are, we can’t just kind of use any cloud provider. We have to find ones that can cope with the volume of traffic that we push through.”

Study shows trust in AI will be difficult to repair if broken

According to a new paper, titled “Maintaining patient trust as artificial intelligence’s role in healthcare grows” by Rosie Dobson, Melanie Stowell and Robyn Whittaker, trust around AI could be built and maintained through transparency and good governance – “but if broken or lost, it will be difficult to repair and will have wider implications”.

Through interviews with patients and healthcare workers, the researchers found a few common threads when it came to their concerns:

  • The primary benefit of sharing AI data should be to the New Zealand public – not private companies or those overseas
  • Strong data protection needed to be in place
  • Patients needed choice and to give consent on when to share their data
  • AI should not replace the “human touch” of health professionals
  • There should be Māori representation in work to develop AI tools, and governance over their use
  • Universities and New Zealand-based organisations were seen as more trustworthy AI development partners than commercial companies or overseas institutions

The authors recommended there be a culture of transparency, with health workers well-educated on how their tools work so they could explain them to patients. There also needed to be good governance, with the input of patients and healthcare workers.

GP says patient diagnosis the next step for AI in healthcare

Richard Medlicott, Wellington GP at Island Bay Medical Centre, said the future of AI in healthcare was as a tool for advice, not just a scribe.

Richard Medlicott, GP at Island Bay Medical Centre. RNZ / Karen Brown

Right now, among GPs, AI tools listened to consultations and made notes, which could then be copied and pasted, or even fed back automatically, into the GP’s own patient notes system.

His practice used IntelliTek Health, a company which Medlicott himself had a stake in, rather than Heidi, but any AI software would have the effect of reducing ‘cognitive load’.

“At the end of a consultation, we might have to remember three or four things that were talked about in that fifteen minutes, and then get them all down,” he said.

“I find that quite fatiguing, and the use of scribes over the last two years has been really helpful in that regard.”

He said the scribe also meant he was verbalising more during consultations – “oh, your chest sounds clear, or your tummy’s nice and normal, no signs of an enlarged liver” – for the benefit of the scribe, something patients also appreciated.

And for doctors who preferred to type notes throughout the consult rather than afterwards, it meant they were more present in the conversation rather than at the keyboard, which patients said they appreciated.

It was saving GPs anywhere between two and five minutes per consultation, he said.

The future of AI would move beyond clinical scribes. Around the world already, AI was being used to look at medical records and give medical advice.

“I think we’ll get there, but AI sometimes hallucinate terribly, and just get things wrong,” he said. “That is the next stage, it’s happening now, but it is higher risk than AI scribes.”


– Published by EveningReport.nz and AsiaPacificReport.nz, see: MIL OSI in partnership with Radio New Zealand