Why Physicians Are Adopting AI (Even When Their Organizations Aren't): Insights from the 2025 Offcall Report
December 17, 2025
Offcall, a physician-led online community, recently published its 2025 Physicians AI Report, a survey that looks at how doctors are using artificial intelligence in their everyday work.
What makes the report compelling is that it focuses on present-day behavior rather than distant forecasts. According to the survey, 89 percent of physicians report using AI at least weekly, and 67 percent say they use it daily. Most are not experimenting casually. They are relying on these tools as part of their routine.
At the same time, the report highlights a clear disconnect. While adoption is widespread, satisfaction with how AI is being introduced and managed is low. Many physicians in larger organizations feel removed from the decisions shaping these tools and unsure how they fit into their clinical workflows. Contrast that with small clinics and private practices, where technology decisions sit much closer to the point of care and physicians retain far more control and autonomy.
TL;DR
Nearly 9 in 10 physicians use AI weekly, and two-thirds use it daily, often filling gaps left by systems that struggle to keep pace with the realities of care
84% say AI helps them work better, yet over 80% are dissatisfied with how these tools are introduced through centralized programs
71% report little or no influence over AI tool selection, a limitation common in larger care settings but one that community clinics are often better positioned to address
Documentation and administrative work remain the biggest operational drains, particularly in small practices where support staff is limited and there is little room for inefficiency
In owner-operated clinics, adoption tends to be practical and selective: tools are kept when they reduce after-hours work, dropped when they add complexity, and judged by their impact on patient access and clinic sustainability
Physicians Are Already Using AI, Even When Systems Lag Behind
One of the strongest signals in the report is that physicians are not waiting for formal programs to begin before adopting new tools. Eighty-four percent of respondents say AI makes them better at their job, yet more than 80 percent report dissatisfaction with how AI is deployed in their organization.
This gap suggests that the issue is not whether AI is useful. It is whether it is being introduced in a way that aligns with real clinical work.
Many physicians report turning to AI outside of official systems because existing workflows feel heavy. Documentation takes too long. Administrative tasks stretch into evenings. Core systems evolve slowly.
AI is filling gaps that have existed for years, often without much guidance or integration.
Why Small and Private Practices Experience This Differently
Much of the frustration captured in the Offcall report reflects life inside large healthcare organizations. Decisions are centralized. Rollouts move slowly. Physicians often have little input.
In fact, 71 percent of physicians surveyed say they have little or no influence over which AI tools are selected or how they are implemented.
Private, community-based practices operate under different constraints.
In a small clinic, time is the most limited resource. If a tool saves time, it is adopted quickly. If it adds friction, it disappears just as fast. There is no buffer between the software and the physician’s day.
This may help explain another finding from the report: 67 percent of physicians say that having more influence over AI decisions would significantly improve their job satisfaction. In independent practices, that influence is often closer to the clinician, which can change how technology is evaluated and used.
Medical Documentation Is Still Where the Time Goes
When physicians were asked where AI could help most, the answers were consistent:
Documentation & Charting (65%): The primary pain point for almost two-thirds of doctors.
Administrative Burden: A close second, highlighting the need for "invisible" AI.
Clinical Decision Support: Ranked lower, suggesting doctors want help with workload before judgment.
This reflects a familiar reality: notes that extend into evenings and follow-ups that are hard to track. At Aeon, we believe these problems aren't caused by a lack of intelligence, but by systems that are difficult to navigate.
Control and Trust Matter More Than Novelty
Another theme that appears throughout the report is concern about control.
The most commonly cited fear among physicians is not that AI will make mistakes, but that it will be used in ways that prioritize efficiency or cost over patient care. Many worry about unclear accountability and tools being introduced without sufficient transparency.
This concern shows up differently in smaller clinics, but it is still present. Clinicians tend to trust systems they understand. They want to know what a tool does, when it acts, and how it affects their workflow.
Novelty alone does not build trust. Clarity does.
Where the EMR Fits Into This Picture
Any conversation about AI in healthcare eventually leads back to the electronic medical record. It is where physicians spend a large portion of their day and where most friction either accumulates or gets resolved.
If the EMR is slow or fragmented, new tools struggle to gain traction. If it is intuitive and well designed, it becomes a stable foundation for change.
This perspective has shaped how we think about EMR design at Aeon.
Rather than treating AI as something to layer on top of existing complexity, we have focused on making the core experience easier to work with first. Clear task ownership. Faster charting. Less mental overhead. Workflows that reflect how small clinics actually operate.
The intention is not to rush automation into practice. It is to build an EMR that works well today and can support more advanced tools as they prove useful and earn trust.
What Still Feels Open
Offcall raises questions that do not have simple answers.
How do we reduce documentation without increasing mental load?
How do we support clinical judgment rather than crowd it out?
How do we move quickly while maintaining trust?
Smaller clinics have the luxury of navigating these questions pragmatically—but how does that translate into action?
The proliferation of AI scribes has shown that small clinics are willing to try new tools to improve their productivity. But the conditions under which those tools were introduced and accessed made adoption easy: scribes can sit outside the EMR and still provide value to the user. Anything requiring deeper integration or a change in workflow quickly becomes too risky, expensive, or disruptive.
Ultimately, AI in healthcare will not succeed on capability alone. It will succeed because it is powerful, fits into real clinical work, and is easy to implement with little to no risk to the clinic.
Getting the fundamentals right is where that work begins.
Aeon Health
Reports like this resonate with us at Aeon because they mirror what we hear from small clinics every day.
Physicians are not asking for more complexity. They are asking for systems that respect their time, reduce administrative weight, and feel reliable at the end of a long day. AI has a role to play in that future, but only if the foundation underneath it is solid.
Our focus has been to start with the fundamentals. Building an electronic medical record that is intuitive, calm to use, and shaped around real clinical work. One that supports physicians today and is capable of evolving responsibly as new tools prove their value.
We believe progress in healthcare technology happens when systems adapt to clinicians, not the other way around. Getting there requires patience, clarity, and a willingness to prioritize usefulness over novelty.
That mindset, more than any specific feature, is what guides how we build.

