AI in Medicine: Technology Must Serve the Patient–Physician Relationship, Not Replace It - Annals of Internal Medicine: Fresh Look Blog


Wednesday, March 18, 2026


Recent statements from both the American Medical Association (AMA) and the American College of Physicians (ACP) underscore a truth that should guide the digital transformation of health care: Artificial intelligence (AI) must be developed and deployed under the leadership of physicians, with the patient–physician relationship at its center. 

The AMA calls for physicians to “lead in developing AI tools to improve patient care and workflow” (1), while the ACP emphasizes in its policy paper that AI should augment rather than replace clinical judgment (2). Both perspectives are vital—and timely. As AI becomes woven into nearly every aspect of clinical practice, from diagnosis to documentation, we must ensure that technology serves human connection rather than eroding it. 

The digital revolution promised efficiency and precision. In some ways, it has delivered: Electronic health records, predictive algorithms, and machine-learning models can synthesize vast amounts of data in seconds. But in other ways, it has created a new kind of distance between doctors and patients. Many of us now spend more time facing screens than the individuals sitting across from us. The “iPatient,” a term popularized by Dr. Abraham Verghese (3), too often receives more attention than the living person behind the chart. 

AI holds great potential to help reverse this trend—but only if it is designed and governed responsibly. The goal should not be to automate the art of medicine, but to free it. Technology should take on tasks that distract from human care—documentation, data entry, and administrative burden—so physicians can return their attention to listening, examining, and empathizing. 

The danger lies in the opposite direction: when technology dictates rather than assists. Imagine a future where an algorithm analyzes a patient’s data and produces a single treatment plan—one that medical insurance will cover and clinicians are expected to follow without question. Such a system would strip medicine of nuance, individuality, and compassion. It would transform the physician from a trusted healer into a data interpreter and implementer, and the patient from a person into a probability score. 

As the ACP wisely notes, the use of AI in health care must be aligned with principles of medical ethics, serving to enhance patient care, clinical decision making, and the patient–physician relationship. Treatment decisions must be made by physicians who understand not just the patient’s medical history, pharmacologic exposure, and testing results but also their values, hopes, fears, and anxieties. Human judgment, informed by evidence and empathy, cannot be reduced to code. 

An air traffic controller once told me that automation works beautifully until the weather changes. When storms arise—when the unpredictable happens—it’s human expertise, not algorithms, that saves lives. Medicine is much the same. AI may guide, predict, and optimize, but it cannot comfort a grieving family, recognize subtle emotional cues, or weigh the ethical complexity of a life-altering choice. 

As physicians, we must insist that the digital tools we help build reflect these truths. AI should serve the patient–doctor relationship, not the other way around. It should reduce burden, not add to it. It should expand possibilities, not narrow them. And it should remind us—through its very design—that the most powerful diagnostic tool in medicine remains the relationship between 2 human beings. 

References

  1. Payerchin R. AMA: physicians must lead in developing AI tools to improve patient care and workflow. Med Econ. 2025.
  2. Daneshvar N, Pandita D, Erickson S, et al; ACP Medical Informatics Committee and the Ethics, Professionalism and Human Rights Committee. Artificial intelligence in the provision of health care: an American College of Physicians policy position paper. Ann Intern Med. 2024;177:964-967. [PMID: 38830215] doi:10.7326/M24-0146
  3. Verghese A. Culture shock—patient as icon, icon as patient. N Engl J Med. 2008;359:2748-2751. [PMID: 19109572] doi:10.1056/NEJMp0807461


