4 Actions to Close Hospitals’ Predictive AI Gap

As hospitals press ahead with digital transformation, the deployment of predictive artificial intelligence (AI) is increasingly becoming table stakes. But newly released data show a mounting “digital divide”: smaller, rural, independent and critical access hospitals remain far behind their larger, system-affiliated peers. For CEOs and health system C-suite leaders, the implications are twofold: Competitive pressure is rising, and inequities in care and operations may deepen unless the divide is addressed.

What the Data Show

According to the latest analysis from the Assistant Secretary for Technology Policy (ASTP)/Office of the National Coordinator for Health Information Technology based on the AHA Information Technology Supplement survey, 71% of nonfederal acute care hospitals reported using predictive AI integrated into their electronic health records (EHRs) in 2024, up from about 66% in 2023.

Yet this broad figure obscures sharp variation:

  • Systems vs. Independents

    Among hospitals affiliated with multi-hospital systems, 86% reported using predictive AI in 2024, compared with only 37% of independent facilities.

  • Rural vs. Urban

    Rural hospitals reported 56% adoption vs. 81% among urban hospitals.

  • Critical Access

    Critical access hospitals (CAHs), defined as small facilities located at least 35 miles from the nearest hospital, reported just 50% usage, compared with 80% of non-CAHs.

  • Operational Tasks

    Early use cases reflect an operational emphasis: The largest year-over-year growth in predictive AI occurred in simplifying or automating billing (+25 points), facilitating appointment scheduling (+16 points) and identifying high-risk outpatients for follow-up (+9 points). In contrast, usage for monitoring health or making treatment recommendations remains relatively low, likely reflecting a lack of clinical confidence in the accuracy and reliability of these tools.

  • Governance

    On the governance side, among hospitals using predictive AI in 2024, 82% evaluated their models for accuracy, 74% evaluated them for bias and 79% conducted post-implementation monitoring.

The authors describe a “persistent digital divide” in AI uptake across the hospital ecosystem.

Strategic Implications for Hospital C-Suite Leaders

For health system executives and CEOs, the data point to three major strategic takeaways:

  • Operational efficiency and financial viability. Hospitals with mature predictive AI deployments are realizing a measurable upside in operational workflows — billing, scheduling and risk stratification of outpatients — all of which help promote financial sustainability and free clinician time for higher-value work. For hospitals lagging in this area, the gap is growing both operationally and in terms of viability.
  • Care gaps. When hospitals lack access to predictive tools that identify high-risk patients, streamline scheduling or support timely follow-up, they may face downstream gaps in care consistency and patient safety. For rural and independent hospitals that serve vulnerable populations, this technology gap can become a care gap.
  • Governance and risk oversight. With widespread deployment, governance of predictive models is emerging as a board-level priority. Model bias, accuracy drift, integration errors and regulatory scrutiny require structured oversight. The fact that most hospitals already evaluate AI tools is positive — but variation in depth underscores the need for clearer governance. As resources grow, understanding which frameworks better support effective oversight will be essential.

To close adoption gaps and strengthen responsible AI use, hospital leaders can:

1 | Establish multidisciplinary oversight early.

The ASTP Data Brief found that most hospitals use governance teams with multiple responsible parties — three-quarters involve more than one, and more than half involve three or more. Building this cross-functional oversight early helps to ensure that predictive tools align with institutional strategy, safety standards and compliance expectations.

2 | Adopt a “three lines of defense” approach.

Drawing parallels to financial services governance, hospitals can formalize accountability across three layers: front-line operations that use the tools, risk management teams that monitor performance and bias, and internal auditors who verify results. This layered model supports consistent and transparent evaluation of AI applications.

3 | Evaluate models for performance, safety and fairness.

The report underscores the importance of continuous evaluation — not only for accuracy but also for bias and patient safety. Hospitals should standardize evaluation metrics and documentation practices to maintain trust and regulatory readiness.
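
The report does not prescribe specific metrics, but the idea can be made concrete. Below is a minimal sketch of a standardized evaluation report, assuming a hypothetical readmission-risk model, a held-out evaluation table with illustrative y_true, y_score and group columns, and scikit-learn available; a real program would layer on calibration, drift monitoring and clinical safety review at a defined cadence.

```python
import pandas as pd
from sklearn.metrics import roc_auc_score, recall_score

def evaluate_model(df: pd.DataFrame, threshold: float = 0.5) -> pd.DataFrame:
    """Summarize overall and per-subgroup performance for a risk model.

    Expected columns (illustrative names, not from the report):
      y_true  - observed outcome (0/1), e.g., 30-day readmission
      y_score - model-predicted risk probability
      group   - patient subgroup used for the bias check
    """
    rows = []
    for name, sub in [("overall", df)] + list(df.groupby("group")):
        y_true = sub["y_true"]
        y_pred = (sub["y_score"] >= threshold).astype(int)
        rows.append({
            "cohort": name,
            "n": len(sub),
            "auroc": roc_auc_score(y_true, sub["y_score"]),
            # Missed high-risk patients are the key safety concern.
            "sensitivity": recall_score(y_true, y_pred),
            # Large gaps in flag rates across subgroups are one simple bias signal.
            "flag_rate": float(y_pred.mean()),
        })
    return pd.DataFrame(rows)
```

Rerunning the same report at go-live and at each monitoring interval, and archiving the output, provides the documentation trail that supports trust and regulatory readiness.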

4 | Align individual tools with an enterprise AI strategy.

With the rapid expansion of predictive tools, hospitals benefit from ensuring that each implementation fits within a broader, systemwide AI road map. Central governance helps to avoid duplication, ensures resource efficiency and supports long-term scalability.
