Hospital Quality Scatter Explorer

1. About   healthcare hospitals quality dataviz

[Image: hospital-quality-banner.jpeg]

Figure 1: JPEG produced with DALL-E 4o

The federal government rates every US hospital on a 1–5 star scale across five quality domains. This post visualizes the full landscape of 5,400+ hospitals – where they cluster, how ownership type predicts quality, and where your state falls – using the CMS Hospital Compare dataset.

2. TLDR   tldr

The federal government rates every US hospital on a 1–5 star scale across five quality domains: mortality, safety, readmissions, patient experience, and timely care. This post lets you explore the full landscape of 5,400+ hospitals — where they cluster, how ownership type (nonprofit, for-profit, government) predicts quality, and where your state falls. The headline finding: nonprofit hospitals have more 4–5 star ratings than for-profit hospitals, but the variation within each ownership type is enormous.

3. Introduction   healthcare cms hospitals quality

Every hospital in the United States that participates in Medicare is evaluated on the same set of quality metrics — and those evaluations are public. CMS publishes an overall 1–5 star rating for roughly 3,000 acute care hospitals, aggregating performance across five domains: mortality rates, safety, readmissions, patient experience, and timely/effective care delivery.

These ratings are controversial. Hospital trade associations regularly lobby against the methodology. Teaching hospitals complain their complex patient mix inflates apparent mortality. Large urban hospitals with many high-risk patients argue they're penalized relative to suburban hospitals with healthier populations. Some of these critiques are valid — the star rating system doesn't perfectly control for case mix — but the data still represents the most comprehensive public view into comparative hospital quality that exists.

This post visualizes the full distribution: 5,359 hospitals, 2,855 of them with valid overall star ratings, drawn from the CMS Hospital General Information dataset and HCAHPS patient survey data.

3.1. How the star rating is calculated   methodology

The overall star rating is a composite. CMS uses a latent variable model that groups hospitals into five tiers based on weighted performance across the five quality domains. The weights are roughly:

  • Mortality (22%) — risk-adjusted death rates for conditions like heart failure, pneumonia, COPD
  • Safety of care (22%) — healthcare-associated infections, complications, PSIs
  • Readmissions (22%) — 30-day unplanned readmission rates
  • Patient experience (22%) — HCAHPS survey: nurse/doctor communication, pain management, discharge info
  • Timely & effective care (12%) — door-to-balloon time, ED wait times, sepsis bundles

Hospitals need data on at least three domains to receive an overall rating. Critical access hospitals, psychiatric hospitals, and VA facilities are excluded.

4. The Main View: Patient Experience vs. Mortality   dataviz scatter

This scatter plots every hospital with both patient experience data and mortality quality data. The x-axis is the HCAHPS patient experience star rating (1–5); the y-axis shows what percent of a hospital's mortality measures were rated "better than national average." Dot size reflects the overall star rating. Color shows ownership type.

A few things stand out:

  • Patient experience and clinical quality don't always align. Some hospitals score well on patient experience (4–5 stars for how nurses communicated) but poorly on mortality measures. The reverse is also common.
  • Nonprofit hospitals (blue) dominate the upper-right quadrant — high experience, strong mortality performance.
  • For-profit hospitals (orange) show more spread — some high performers, but a larger cluster at lower mortality quality.
  • The middle mass of hospitals — 2–3 star experience, 20–50% better-than-national mortality — represents the typical American community hospital.
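A minimal sketch of how a scatter like this can be assembled with pandas and matplotlib. The rows are made up and the column names (facility_id, mortality_better, and so on) are assumptions about a cleaned extract, not the raw CMS field names.

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend for scripted rendering
import matplotlib.pyplot as plt

# Toy stand-ins for the two CMS files, already reduced to the
# columns this chart needs (names are illustrative).
hospitals = pd.DataFrame({
    "facility_id": ["A", "B", "C"],
    "ownership": ["Nonprofit", "For-Profit", "Government"],
    "overall_stars": [4, 2, 3],
    "mortality_better": [3, 0, 1],  # measures rated better than national
    "mortality_total": [6, 5, 4],   # measures with enough cases to rate
})
hcahps = pd.DataFrame({
    "facility_id": ["A", "B", "C"],
    "experience_stars": [4, 3, 2],
})

# Join the general-information and patient-survey tables, then derive
# the y-axis: percent of mortality measures better than national.
df = hospitals.merge(hcahps, on="facility_id")
df["pct_mortality_better"] = 100 * df["mortality_better"] / df["mortality_total"]

# One scatter call per ownership type gives per-group color and a legend;
# dot area scales with the overall star rating.
fig, ax = plt.subplots()
for ownership, grp in df.groupby("ownership"):
    ax.scatter(grp["experience_stars"], grp["pct_mortality_better"],
               s=30 * grp["overall_stars"], label=ownership)
ax.set_xlabel("HCAHPS patient experience stars (1–5)")
ax.set_ylabel("% mortality measures better than national")
ax.legend()
```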

4.1. Why patient experience and clinical quality diverge   analysis

The partial decoupling of experience scores from clinical outcomes has a straightforward explanation: they measure different things. HCAHPS captures whether your nurse explained your medications clearly and whether the room was quiet at night. Mortality rates capture whether patients with your condition died within 30 days.

Hospitals can invest in hospitality — single rooms, responsive call systems, well-trained patient-facing staff — without improving the clinical protocols that drive mortality outcomes. Some high-mortality hospitals have excellent bedside manner. Some hospitals with sterile, intimidating environments have exceptional clinical results.

5. Who Gets 5 Stars? Ownership and Quality   dataviz ownership

The distribution of star ratings by ownership type reveals a clear pattern:

  • Nonprofit hospitals are more likely to receive 3–4 stars and are overrepresented at 5 stars relative to their share of total hospitals.
  • For-profit hospitals are overrepresented at 1–2 stars, though they also have a meaningful presence at 4–5 stars.
  • Government hospitals (state, county, federal non-VA facilities) cluster heavily at 2–3 stars, partly reflecting that many government hospitals serve high-acuity, underinsured populations in urban areas.

This pattern is consistent with research on hospital ownership and quality: for-profit hospitals tend to have higher margins but lower quality ratings, particularly on measures that don't directly affect revenue (HAIs, 30-day readmissions). Nonprofits, constrained to reinvest surplus into operations, tend to maintain higher quality baselines. But the within-group variation is enormous — plenty of 5-star for-profit hospitals and 1-star nonprofits exist.
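The per-ownership star distribution behind this kind of comparison is a one-line computation once ratings are in a DataFrame. The toy data below is illustrative; a row-normalized crosstab makes each cell read as "share of this ownership type at this star level", which is the right normalization when the groups differ in size.

```python
import pandas as pd

# Toy ratings table; real rows come from Hospital General Information.
df = pd.DataFrame({
    "ownership": ["Nonprofit", "Nonprofit", "Nonprofit", "For-Profit",
                  "For-Profit", "Government", "Government", "Government"],
    "stars": [4, 5, 3, 2, 4, 2, 3, 3],
})

# normalize="index" makes each row sum to 1, so cells are within-group
# shares rather than raw counts.
dist = pd.crosstab(df["ownership"], df["stars"], normalize="index")
```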

6. The Geography of Hospital Quality   dataviz states geography

Average hospital star ratings vary substantially by state. This map shows the mean overall rating among rated hospitals in each state.
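Computing the per-state means is straightforward; the columns below are assumptions about the cleaned extract, and unrated hospitals (missing stars) drop out of the mean automatically because pandas skips NaN.

```python
import pandas as pd

# Assumed columns; the CMS file carries a two-letter state field.
df = pd.DataFrame({
    "state": ["MN", "MN", "TX", "TX", "TX"],
    "overall_stars": [5, 4, 3, None, 2],  # None = hospital not rated
})

# Mean rating among *rated* hospitals only: NaN values are excluded
# from mean() by default.
state_means = df.groupby("state")["overall_stars"].mean()
```

The resulting series keys on state codes, which is the shape a choropleth layer expects (for example plotly's px.choropleth with locationmode="USA-states").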

The geographic pattern correlates with several factors: hospital market consolidation (monopoly hospital markets tend to have lower quality incentives), state Medicaid policy (expansions increase insured volumes and reduce financial stress), the presence of academic medical centers (which both depress ratings due to case mix and improve them through teaching excellence), and rural vs. urban mix.

States in the Mountain West and upper Midwest tend to perform well. States with highly consolidated markets or large proportions of safety-net hospitals tend to score lower.

7. Quality by Domain: Where Ownership Predicts Outcomes   dataviz domains ownership

The star rating aggregates five domains. This chart breaks that out — for each major quality domain, what fraction of measures does each ownership type have rated "better than national"?

The domain breakdown reveals where ownership effects are strongest:

  • On mortality and safety, nonprofits consistently outperform for-profits — these are the clinical quality measures that require systemic investment in infection control, care protocols, and staffing ratios.
  • On readmissions, the gap narrows — readmission reduction requires care coordination infrastructure that all ownership types have invested in, partly because CMS penalizes hospitals financially for excess readmissions.
  • Government hospitals underperform on most domains, again reflecting the challenging patient populations (uninsured, high-acuity, high-complexity) that many public hospitals disproportionately serve.
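One way to compute these per-domain fractions is to pool the measure counts within each ownership group before dividing, so hospitals with more rated measures weigh proportionally more. Column names here are illustrative stand-ins for the CMS measure-count fields.

```python
import pandas as pd

# Counts of measures rated better than national, per hospital and domain
# (toy rows; names are assumptions about a cleaned extract).
df = pd.DataFrame({
    "ownership": ["Nonprofit", "For-Profit", "Nonprofit", "Government"],
    "domain": ["mortality", "mortality", "safety", "safety"],
    "better": [3, 1, 2, 0],   # measures better than national
    "total": [6, 5, 4, 3],    # measures with enough cases to rate
})

# Sum counts within each (ownership, domain) cell, then divide:
# this is a pooled fraction, not a mean of per-hospital fractions.
pooled = df.groupby(["ownership", "domain"])[["better", "total"]].sum()
pooled["frac_better"] = pooled["better"] / pooled["total"]
```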

8. The Quality Landscape   dataviz quadrant

This scatter plots hospitals on two independent quality axes: mortality performance (x) and safety performance (y), colored by overall star rating. The dashed lines separate hospitals above and below the 50% benchmark on each axis.

The upper-right quadrant — better than national on both mortality and safety — is dominated by 4–5 star hospitals. The lower-left quadrant — below benchmark on both — is dominated by 1–2 star hospitals. But there's substantial scatter in the middle bands. Some 3-star hospitals outperform on both; some 4-star hospitals have weakness on one axis.

The hospitals most worth scrutinizing are those in the off-diagonal quadrants: high mortality quality but poor safety (or vice versa). These represent systematic imbalances — a hospital excelling on one dimension while failing on another — that the aggregate star rating can obscure.
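The quadrant assignment can be sketched as below. The 50% benchmark mirrors the dashed lines described above; the column names are assumptions about the cleaned data.

```python
import pandas as pd

def quadrant(row, benchmark=50.0):
    """Label a hospital by its side of the benchmark on each axis."""
    x = "high" if row["pct_mortality_better"] >= benchmark else "low"
    y = "high" if row["pct_safety_better"] >= benchmark else "low"
    return f"mortality-{x}/safety-{y}"

# Toy hospitals: one upper-right, one lower-left, one off-diagonal.
df = pd.DataFrame({
    "pct_mortality_better": [80.0, 20.0, 70.0],
    "pct_safety_better": [75.0, 10.0, 30.0],
})
df["quadrant"] = df.apply(quadrant, axis=1)

# The off-diagonal hospitals are the imbalanced ones the text flags.
off_diagonal = df[df["quadrant"].isin(
    ["mortality-high/safety-low", "mortality-low/safety-high"])]
```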

9. Data and Methods   data cms methodology

All data comes from CMS Provider Data:

  • Hospital General Information (dataset xubh-q36u). 5,426 hospitals with ownership type, hospital type, overall star rating, and quality domain measure counts (better/no different/worse than national for mortality, safety, and readmission domains).
  • HCAHPS Patient Survey (dataset dgck-syfz). Filtered to the H_STAR_RATING measure: the overall patient experience star rating (1–5) derived from the Hospital Consumer Assessment of Healthcare Providers and Systems survey.

"Better than national" means the hospital's performance on a specific measure (e.g., 30-day heart failure mortality rate) was statistically significantly better than the national average. Hospitals with too few cases to evaluate are excluded from that measure's count.

Ownership simplification:

  • Nonprofit: "Voluntary non-profit — Private", "Voluntary non-profit — Church", "Voluntary non-profit — Other"
  • For-Profit: "Proprietary"
  • Government: all "Government —" subtypes
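This mapping can be implemented with simple prefix matching on the raw ownership labels, which also sidesteps dash-variant differences between subtypes. The label strings follow the groupings above; the fallback bucket is an assumption for any label outside them.

```python
def simplify_ownership(raw):
    """Collapse raw CMS ownership labels into the three groups above.

    Prefix matching covers all "Voluntary non-profit" and "Government"
    subtypes without enumerating each one.
    """
    if raw.startswith("Voluntary non-profit"):
        return "Nonprofit"
    if raw == "Proprietary":
        return "For-Profit"
    if raw.startswith("Government"):
        return "Government"
    return "Other"  # assumed catch-all for labels outside the groupings
```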

Not included: Critical access hospitals, psychiatric hospitals, rehabilitation facilities, long-term care facilities. These use different payment and evaluation systems and are not rated on the 1–5 scale.