Do physicians believe AI will solve doctor burnout?

Healthcare is grappling with a burnout crisis that’s driving physicians out of the field at alarming rates.

In 2021, 33% of medical practices reported losing at least one doctor to burnout, a condition described as a “psychological syndrome” caused by chronic job stress.1 This cycle is intensifying workloads for those who remain, with 37% of physicians struggling to achieve work-life balance and 67% saying their clinics could do more to support it.2

Administrative tasks, which 21% of physicians identify as a major burnout factor, have fueled interest in artificial intelligence (AI) as a potential source of relief. While AI holds promise for reducing these burdens, only 17% of Sermo physicians report using autonomous AI in their practices, and 53% haven’t considered it.3

With these mixed views, one question stands out: Can AI genuinely ease physicians’ workloads, or will its potential be limited by the challenges it faces?

Assessing the potential of AI in reducing burnout

How could doctors use AI?

Many doctors find themselves overwhelmed by their current workload. For example, during 60-hour weeks, one doctor saw 25 patients daily, often working weekends to complete patient notes. She reflected on the toll her work had taken: “Over time, you think, ‘This isn’t how I wanted to practice medicine.’”1

Indeed, for the 21% of physicians who point to administrative tasks as a major contributor to burnout,2 AI offers hope. In a Sermo survey, 78% of respondents expressed optimism that AI could improve clinic efficiency by reducing time spent on documentation and non-clinical duties.3 In particular, AI has the potential to streamline workflows through time management, an approach that 31% of physicians feel would help mitigate burnout.2

An overview of the best AI use cases for easing doctor burnout

As healthcare faces mounting administrative burdens, AI tools for healthcare administration have emerged to help alleviate some of the workload that contributes to physician burnout.

While only 17% of surveyed physicians report seeing autonomous AI fully or partially implemented in their practices, 74% believe it could feasibly be integrated and 70% feel it would be cost-effective.3

Here’s a look at some of the most effective AI tools currently assisting healthcare professionals with administrative and clinical work:

Diagnosis and Imaging Support

AI-powered “smart” stethoscopes can detect heart disease with 90% accuracy,4 enhancing diagnostic speed and reliability, which saves doctors time and improves patient care.

Personalized Treatment

Tools like Aitia use AI to match patients with potentially effective treatments,5 streamlining data analysis and enabling doctors to focus more on patient interaction. As Dr. Michael Hasselberg, Chief Digital Health Officer at UR Medicine, notes, “The machine actually performs better than the human” in triaging patient messages.1

Predictive Analytics

Lightbeam Health uses predictive analytics, assessing over 4,500 factors to anticipate patient needs, which can help doctors prioritize cases and reduce unexpected workload.6

Patient Monitoring

Platforms like Wellframe allow real-time patient monitoring via mobile apps, enabling personalized care and reducing the need for frequent in-person follow-ups.7

Administrative Efficiency

AI automates routine tasks like billing, insurance claims and prescription renewals. Workflow management tools also streamline schedules and documentation, freeing up doctors for patient care.

Drug Discovery and Robotic Surgery

AI aids drug discovery by analyzing vast data and enhances surgical precision through robotics, minimizing fluctuations and providing real-time support for complex procedures.

Electronic Health Records (EHR) Management

AI-driven EHR tools, such as those from Allscripts and Epic, streamline the often-overwhelming process of electronic record management. With one in five patients having an EHR as long as Moby Dick (about 206,000 words),1 AI simplifies the data, allowing for quick, efficient review. This means less time navigating complex records and more time with patients.

Physician burnout and work-life balance in the context of AI

While some view AI as a way to free up time for family and self-care, many others have their doubts.

Does AI turn physicians from writers to editors?

AI’s promise of reducing time on documentation doesn’t always hold in practice.

A recent study from the University of California San Diego and Stanford University found that while generative AI tools could draft responses to patient messages, the final editing process often took more time. Physicians found themselves making substantial changes to the clinical content while keeping the AI’s polite phrasing.

The study revealed that doctors spent nearly 22% more time on these messages as they double-checked the AI’s input to ensure accuracy, underscoring the necessity for physician oversight in using AI.1

AI cannot solve the physician shortage

Despite AI’s administrative support, it can’t address the core problem of physician shortages.

With the U.S. facing a projected shortfall of up to 86,000 doctors by 2036,8 Garrett Adams, vice president of research and development at Epic, explains, “AI is not going to solve the physician shortages across the country…we can help them do more with less, but we can’t solve the fact that there is less.”1

AI may optimize physician time, but it can’t replace the need for trained professionals.

Does AI raise ethical concerns?

Even if AI saves time, there’s concern over how it’ll be used. Will it enable longer patient visits or simply increase patient load?

Some, like a Sermo rheumatologist, worry this efficiency will advantage large hospital systems, disadvantaging smaller rural practices: “This will give the huge hospital systems another advantage and put smaller rural hospitals out of business, which is another reason why health shouldn’t be a business endeavor.”9

Security concerns also persist, as one Sermo member, a stomatologist, warns: “AI will be an uncontrolled weapon in the hands of unscrupulous people.”10

Preserving the human element in patient care

Many, like Dr. Daniel Yang, VP of AI at Kaiser Permanente, highlight AI’s ability to “liberate providers from their keyboards” to focus on patients directly.1 However, 16% of physicians note resistance among colleagues to adopting AI for core clinical tasks due to concerns about its effect on the human element of care.2

AI has limits when making clinical decisions

Physicians in the Sermo community emphasize that AI lacks the nuanced judgment required in clinical settings: “AI may do better in an inpatient setting,” notes a family medicine doctor, “but to translate a patient’s vague description of symptoms into a working diagnosis and differentiating nuances of visuals, such as rashes and lesions is a level that at this point only a trained human physician can perform.”9

Elsewhere, a hematologist observed, “I tested ChatGPT for medical advice, and it was abysmally wrong on factual matters.”11

Indeed, many patients share this worry and want the ‘intelligence’ behind their care to come from a human. As one Sermo member specializing in ophthalmology states, “I had a patient … and I offered her artificial intelligence. She says, ‘I only want the real thing.’”12

This reflects a common sentiment: AI may support clinical work, but it cannot replicate the depth of a physician’s clinical insight.

Is AI outperforming doctors? Not at every task

While AI supports diagnostic and data tasks, physicians stress that empathy, intuition and clinical expertise are irreplaceable in areas like counseling and complex decision-making.

One preventive medicine doctor said, “AI can provide information, but a knowledgeable physician is essential to deliver the best treatment.”13 In support of this, a Sermo member specializing in family medicine states, “On a simpler level, we had an EKG machine that would give us readings and, quite often, my husband, the cardiologist, would disagree.”13

For many, AI complements rather than replaces the human touch in medicine.

Many physicians are also cautious about data privacy and legal accountability. As a general practitioner put it, “AI tools may be useful, but they can never replace doctors due to professional liability. The responsibility will always rest on a human, not AI.”14

This underscores the importance of maintaining ethical standards and accountability in AI-assisted care.

Can AI reduce physician burnout?

AI for doctors offers real promise in reducing administrative burdens, enhancing efficiency and possibly supporting a healthier work-life balance.

However, it’s clear that AI alone can’t solve the complexities of physician burnout.

While some tasks can be streamlined, true relief will likely require AI’s capabilities to be combined with systemic changes and a continued emphasis on human expertise in patient care. As one Sermo rheumatologist sums it up, “Great potential but a little scary.”9

Footnotes

  1. Newsweek
  2. Sermo POTW
  3. Sermo AI
  4. NIHR
  5. Aitia
  6. Lightbeam Health Solutions
  7. Wellframe
  8. McKinsey & Company
  9. Sermo community: How well does the AI perform clinically?
  10. Sermo community: ChatGPT and Healthcare
  11. Sermo community: Is Artificial Intelligence Actually Intelligent?
  12. Sermo community: Artificial Intelligence and Healthcare
  13. Sermo community: Why is AI “good” and everything else artificial “bad”?
  14. Sermo community: Will Pathologists be replaced by Artificial Intelligence in the Future?