
AI Vendor Graveyard: Q2 2025

June 10, 2025

Mark Hait, Contributing Editor

By mid-2024, the flood of AI vendors pitching into healthcare was so vast it seemed to defy gravity. Every niche had a contender: ambient note-taking, triage prediction, prior auth automation, care gap closure, SDoH signal processing. But in the first half of 2025, gravity has caught up. The landscape is shifting from explosive innovation to quiet collapse. Capital has tightened. Pilots have stalled. And a growing number of generative AI startups are exiting the market by way of silent shutdowns, acqui-hires, or integration into legacy platforms that offer no roadmap for their technology.

This AI Vendor Graveyard is the first installment in a quarterly editorial series designed to chronicle the disintegration of healthcare’s AI surplus. We will track who disappeared, why, and what it reveals about the industry’s operational capacity to absorb and govern this class of tools. Each edition will ground itself in verified external data, field quotes, and editorial analysis aimed at provider organizations, investors, and the surviving vendors recalibrating their message.

This is not a postmortem of hype. It is a real-time accounting of market correction.

The Quiet Exit Wave

As of Q2 2025, the digital health sector has entered a phase of accelerated contraction. According to Rock Health, 67 percent of M&A deals tracked this year involved digital health startups acquiring other startups. The implication is stark: many firms are no longer viable as standalone entities and are being absorbed at markdown valuations or fire-sold for engineering teams. AI-first companies—especially those with no regulatory clearance or revenue model—are leading the exits.

A VP at a digital health fund that backed three generative startups reflected bluntly: “We thought the EHR layer was dead. Turns out it’s where all the gravity lives. You can’t build something on top of Epic without being crushed by the integration cost.”

Even larger firms that once courted these AI vendors are retreating. As reported by Business Insider, major acquirers including Big Tech, payers, and retail health platforms have cooled on health AI targets, wary of high burn rates and opaque risk profiles.

Where Ambient AI Failed to Deliver

Nowhere has this correction been more brutal than in ambient documentation. Nearly a dozen companies launched between 2022 and 2023 with variations on the same promise: AI-powered note-taking that would eliminate physician documentation burden. But in practice, ambient tools ran into audio clarity problems, legal ambiguity over real-time PHI transcription, and inconsistent accuracy in environments like emergency departments or surgical floors.

A CMIO at a four-hospital system in the Midwest described the reality: “We trialed two vendors. Neither could handle our trauma bay. Background noise, cross-talk, the AI just lost the thread.”

Even Nuance’s DAX, bolstered by Microsoft, is deploying more slowly than originally forecast. For startups without a cloud-native backbone, enterprise credentials, or integration muscle, the bar was unreachable.

Ambient is not dead—but it is shrinking. Only a few firms have both the technical infrastructure and the workflow legitimacy to survive the scrutiny now being applied by CIOs, CMIOs, and clinical governance teams.

Why Hallucination Risk Still Kills the Deal

A separate cluster of GenAI vendors focused on triage, prior auth prep, or eligibility summaries has hit another wall: hallucination. While large language models can summarize notes, recommend referrals, or prefill forms, they also fabricate data in subtle ways: dropping context, mislabeling codes, or generating plausible but incorrect answers.

One digital health lead at a regional plan explained why a promising pilot was paused indefinitely. “The model invented a CPT code that doesn’t exist. No human caught it before submission. That’s a compliance risk we’re not prepared to own.”
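The failure mode in that quote—a well-formed but nonexistent billing code slipping through to submission—is exactly the kind of error a deterministic guardrail can catch before any human review. Below is a minimal sketch in Python; the code set and the format pattern are illustrative assumptions, not a real CPT reference (the actual code set is licensed and maintained separately):

```python
import re

# Hypothetical allow-list for illustration; in practice this would be
# loaded from the payer's licensed CPT reference data, not hardcoded.
VALID_CPT_CODES = {"99213", "99214", "93000", "0051T"}

# Simplified shape check: five digits, or four digits plus a category
# suffix letter. Real-world validation rules are more involved.
CPT_PATTERN = re.compile(r"^\d{4}[0-9FTU]$")

def validate_cpt_codes(codes):
    """Split model-suggested codes into accepted and rejected lists.

    A code is rejected if it is malformed or absent from the reference
    set -- covering both obviously garbled output and the subtler case
    of a plausible-looking code that simply does not exist.
    """
    accepted, rejected = [], []
    for code in codes:
        if CPT_PATTERN.match(code) and code in VALID_CPT_CODES:
            accepted.append(code)
        else:
            rejected.append(code)
    return accepted, rejected

# "99217Z" fails the format check; "99999" is well-formed but not in
# the reference set. Both are flagged before submission.
ok, flagged = validate_cpt_codes(["99213", "99217Z", "99999"])
```

A check like this does not make the model more accurate, but it converts a silent compliance failure into a visible rejection queue—the kind of observable governance layer buyers in this market are now demanding.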

The FDA’s 2024 guidance on AI/ML devices clarified expectations for transparency and traceability. But many vendors tried to sidestep regulation by branding their tools as “assistive” or “non-clinical.” In legal terms, that may work. In procurement reality, it does not.

According to a recent CB Insights analysis of AI startups, healthcare remains one of the sectors where LLM hallucinations present the greatest commercial risk. For companies that failed to mitigate those risks—or even acknowledge them—the window to prove value has closed.

Who Survives and Why

There is no shortage of AI vendors still in the market. But the survivors fall into a narrow category: those who designed for infrastructure, not attention. Companies like Abridge, Navina, and Hippocratic AI have so far avoided mass attrition by embedding into clinical workflows, demonstrating measurable ROI, and maintaining credibility with both compliance and informatics stakeholders.

An enterprise IT executive at a coastal IDN said, “If a vendor walks in with an AI demo, I stop them. If they walk in with metrics from a live deployment and a FHIR-compatible integration model, I listen.”

What these firms understand is that AI cannot survive in healthcare as a novelty layer. It must function like plumbing: quiet, observable, governed. Flash is a liability.

The Next 90 Days

This series will return in Q3 2025 with a deeper dive into vendor funding status, new market exits, and the emergence of “AI middleware” as a new enterprise category. We will also track which ambient and co-pilot vendors are gaining traction through real-world deployments, not marketing claims.

We continue to accept anonymized reports of AI vendor shutdowns, failed pilots, or sunset technologies via our secure editorial intake form. All identities are protected. If you are a product team member, investor, or buyer and want to contribute insights, we welcome your input.

This is not a hit piece. It is a mirror. The healthcare AI market is not collapsing. It is correcting. The graveyard is growing—but it’s also clearing space for what might actually work.