
THE NEW YORK TIMES

A.I. May Someday Work Medical Miracles. For Now, It Helps Do Paperwork.

The best use for generative A.I. in health care, doctors say, is to ease the heavy burden of documentation that takes them hours a day and contributes to burnout.

Dr. Matthew Hitchcock, a family physician in Chattanooga, Tenn., has an A.I. helper.

It records patient visits on his smartphone and summarizes them for treatment plans and billing. He does some light editing of what the A.I. produces, and is done with his daily patient visit documentation in 20 minutes or so.

Dr. Hitchcock used to spend up to two hours typing up these medical notes after his four children went to bed. “That’s a thing of the past,” he said. “It’s quite awesome.”

ChatGPT-style artificial intelligence is coming to health care, and the grand vision of what it could bring is inspiring. Every doctor, enthusiasts predict, will have a superintelligent sidekick, dispensing suggestions to improve care.

But first will come more mundane applications of artificial intelligence. A prime target will be to ease the crushing burden of digital paperwork that physicians must produce, typing lengthy notes into electronic medical records required for treatment, billing and administrative purposes.

From leaders at major medical centers to family physicians, there is optimism that health care will benefit from the latest advances in generative A.I. — technology that can produce everything from poetry to computer programs, often with human-level fluency.

But medicine, doctors emphasize, is not a wide-open terrain for experimentation. A.I.’s tendency to occasionally create fabrications, or so-called hallucinations, can be amusing in other settings, but not in the high-stakes realm of health care.

That makes generative A.I., they say, very different from the A.I. algorithms already approved by the Food and Drug Administration for specific applications, like scanning medical images for cell clusters or subtle patterns that suggest the presence of lung or breast cancer. Doctors are also using chatbots to communicate more effectively with some patients.

Physicians and medical researchers say regulatory uncertainty, and concerns about patient safety and litigation, will slow the acceptance of generative A.I. in health care, especially its use in diagnosis and treatment plans.

Those physicians who have tried out the new technology say its performance has improved markedly in the last year. And the medical note software is designed so that doctors can check the A.I.-generated summaries against the words spoken during a patient’s visit, making the output verifiable and helping to build trust.

“At this stage, we have to pick our use cases carefully,” said Dr. John Halamka, president of Mayo Clinic Platform, who oversees the health system’s adoption of artificial intelligence. “Reducing the documentation burden would be a huge win on its own.”

Recent studies show that doctors and nurses report high levels of burnout, prompting many to leave the profession. High on the list of complaints, especially for primary care physicians, is the time spent on documentation for electronic health records. That work often spills over into the evenings, after-office-hours toil that doctors refer to as “pajama time.”

Generative A.I., experts say, looks like a promising weapon to combat the physician workload crisis.

“This technology is rapidly improving at a time health care needs help,” said Dr. Adam Landman, chief information officer of Mass General Brigham, which includes Massachusetts General Hospital and Brigham and Women’s Hospital in Boston.

For years, doctors have used various kinds of documentation assistance, including speech recognition software and human transcribers. But the latest A.I. is doing far more: summarizing, organizing and tagging the conversation between a doctor and a patient.
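In rough outline, the pattern behind these tools is simple to sketch. The snippet below is a minimal illustration only, not any vendor’s actual product: it transcribes a recorded visit with a speech-to-text model, then asks a large language model to organize the transcript into a draft note for the physician to review. The model names, prompt and file name are placeholder assumptions.

    # Minimal sketch of a transcribe-then-summarize pipeline, using the OpenAI
    # Python SDK as a stand-in. Not how Abridge, Nuance or any other vendor
    # necessarily implements it.
    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    def draft_visit_note(audio_path: str) -> str:
        """Transcribe a recorded visit, then draft a structured note for review."""
        # Step 1: speech-to-text on the recorded doctor-patient conversation.
        with open(audio_path, "rb") as audio_file:
            transcript = client.audio.transcriptions.create(
                model="whisper-1",
                file=audio_file,
            )

        # Step 2: ask a large language model to organize the transcript into a
        # draft SOAP note (Subjective, Objective, Assessment, Plan).
        response = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": (
                    "You are a clinical documentation assistant. Summarize this "
                    "visit transcript into a SOAP note and flag anything "
                    "uncertain for the physician to verify.")},
                {"role": "user", "content": transcript.text},
            ],
        )
        return response.choices[0].message.content

    print(draft_visit_note("visit_recording.mp3"))  # hypothetical file name

The draft is only a starting point; as the doctors interviewed here describe, the physician’s review and light editing remain the essential step.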

Companies developing this kind of technology include Abridge, Ambience Healthcare, Augmedix, Nuance, which is part of Microsoft, and Suki.

Ten physicians at the University of Kansas Medical Center have been using generative A.I. software for the last two months, said Dr. Gregory Ator, an ear, nose and throat specialist and the center’s chief medical informatics officer. The medical center plans to eventually make the software available to its 2,200 physicians.

But the Kansas health system is steering clear of using generative A.I. in diagnosis, concerned that its recommendations may be unreliable and that its reasoning is not transparent. “In medicine, we can’t tolerate hallucinations,” Dr. Ator said. “And we don’t like black boxes.”

The University of Pittsburgh Medical Center has been a test bed for Abridge, a start-up led and co-founded by Dr. Shivdev Rao, a practicing cardiologist who was also an executive at the medical center’s venture arm.

Abridge was founded in 2018, when large language models, the technology engine for generative A.I., emerged. The technology, Dr. Rao said, opened a door to an automated solution to the clerical overload he saw all around him in health care, including for his own father.

“My dad retired early,” Dr. Rao said. “He just couldn’t type fast enough.”

Today, the Abridge software is used by more than 1,000 physicians in the University of Pittsburgh medical system.

Dr. Michelle Thompson, a family physician in Hermitage, Pa., who specializes in lifestyle and integrative care, said the software had freed up nearly two hours in her day. Now, she has time to do a yoga class, or to linger over a sit-down family dinner.

Another benefit has been to improve the experience of the patient visit, Dr. Thompson said. Typing, note-taking and other distractions are gone. She simply asks patients for permission to record their conversation on her phone.

“A.I. has allowed me, as a physician, to be 100 percent present for my patients,” she said.

The A.I. tool, Dr. Thompson added, has also helped patients become more engaged in their own care. Immediately after a visit, the patient receives a summary, accessible through the University of Pittsburgh medical system’s online portal.

The software translates any medical terminology into plain English at about a fourth-grade reading level. It also provides a recording of the visit with “medical moments” color-coded for medications, procedures and diagnoses. The patient can click on a colored tag and listen to a portion of the conversation.

Studies show that patients forget up to 80 percent of what physicians and nurses say during visits. The recorded and A.I.-generated summary of the visit, Dr. Thompson said, is a resource her patients can return to for reminders to take medications, exercise or schedule follow-up visits.

After the appointment, physicians receive a clinical note summary to review. There are links back to the transcript of the doctor-patient conversation, so the A.I.’s work can be checked and verified. “That has really helped me build trust in the A.I.,” Dr. Thompson said.
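One simple way to support that kind of checking, sketched below purely as an illustration (it is not how Abridge or any other vendor necessarily does it), is to match each sentence of the draft note against the transcript and surface the closest source segment, so a clinician can jump straight to the words behind a given claim.

    # Toy example: pair each note sentence with its closest transcript segment
    # using rough textual similarity from the Python standard library.
    from difflib import SequenceMatcher

    def link_note_to_transcript(note_sentences, transcript_segments):
        """Pair each note sentence with its best-matching transcript segment."""
        links = []
        for sentence in note_sentences:
            # Score every transcript segment against this sentence.
            scored = [
                (SequenceMatcher(None, sentence.lower(), seg.lower()).ratio(), seg)
                for seg in transcript_segments
            ]
            score, source = max(scored)
            links.append((sentence, source, round(score, 2)))
        return links

    # Hypothetical note sentence and transcript segments.
    note = ["Patient reports two weeks of intermittent chest pain."]
    transcript = [
        "So the chest pain has been coming and going for about two weeks?",
        "Yes, mostly when I climb the stairs.",
    ]
    for sentence, source, score in link_note_to_transcript(note, transcript):
        print(f'{sentence}\n  source: "{source}" (similarity {score})')

Commercial systems presumably do something far more sophisticated, but the underlying idea is the same: every summarized statement should be traceable back to what was actually said.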

In Tennessee, Dr. Hitchcock, who also uses Abridge software, has read the reports of ChatGPT scoring high marks on standard medical tests and heard the predictions that digital doctors will improve care and solve staffing shortages.

Dr. Hitchcock has tried ChatGPT and is impressed. But he would never think of loading a patient record into the chatbot and asking for a diagnosis, for legal, regulatory and practical reasons. For now, he is grateful to have his evenings free, no longer mired in the tedious digital documentation required by the American health care industry.

And he sees no technology cure for the health care staffing shortfall. “A.I. isn’t going to fix that anytime soon,” said Dr. Hitchcock, who is looking to hire another doctor for his four-physician practice.
