A Review of CMS' Hospital Compare Website
As we reported, in July CMS released its first-ever hospital quality star ratings on its Hospital Compare website. The overall star ratings are based on 64 quality measures grouped under three process categories—effectiveness of care, efficient use of medical imaging, and timeliness of care—and four outcomes categories: mortality, patient experience, readmissions, and safety of care. Many of the hospitals widely considered to be the nation’s best were unable to achieve a five-star rating.
Unfairly punishes hospitals
According to a recent analysis, this star rating system rewards hospitals that serve mostly affluent patients and punishes those serving the poor. The Bloomberg BNA research compared hospitals' star ratings against the household incomes of the communities they serve, finding that high star ratings correlate with high household income, and low ratings with low income.
Critics of the rating system point out that low-income patients are more likely to have difficulty accessing transportation for both routine primary care and post-discharge follow-up care. That can and does affect readmission rates, which are a key component of quality ratings. Critics of the system also point to anomalous results such as the consistently low ratings of academic medical centers, which are generally considered among the nation's best hospitals and which are often located in low-income urban areas.
Outcome reporting issue
As reported in Health Affairs, CMS calculated and published star ratings for hospitals that had sufficient data to report on as few as three quality domains, including some hospitals that had data from only one clinical outcome domain. The fewer clinical outcome domains a hospital reports, the less that hospital's overall star rating is actually tied to performance on patient outcomes. Based on the July 2016 release of Hospital Compare data, 40 percent of the 102 hospitals that received a five-star rating did not have the minimum data necessary to report on either mortality or readmissions. Of those, 20 performed at only the national average on patient safety. Are all the shining stars here an accurate representation of quality?
This inconsistent reporting threshold leads to a wide performance divide among five-star hospitals. As the Health Affairs research shows, among the 30 five-star hospitals that had sufficient data to report only the minimum number of quality domains required (three out of a total of seven, shown as red bars), 14 performed above the national average on only one quality domain, 15 performed above average on two domains, and just one excelled on all three. Hospitals that reported all seven quality domains (yellow bars), however, needed above-average performance on at least three quality domains to receive a five-star rating.
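The mechanics behind this divide are easier to see with a toy calculation. The sketch below is not CMS's actual methodology (which relies on latent variable modeling); the domain weights are approximations drawn from the 2016 methodology documentation, and the renormalization rule and example scores are illustrative assumptions. The point it demonstrates is real, though: when outcome domains are missing, the weight shifts to whatever the hospital did report.

```python
# Illustrative sketch only -- NOT CMS's actual latent-variable methodology.
# Domain weights approximate the 2016 Hospital Compare documentation
# (four outcome/experience domains at 22%, three process domains at 4%).

DOMAIN_WEIGHTS = {
    "mortality": 0.22,
    "safety": 0.22,
    "readmission": 0.22,
    "patient_experience": 0.22,
    "effectiveness": 0.04,
    "timeliness": 0.04,
    "imaging": 0.04,
}

def summary_score(domain_scores):
    """Weighted average of reported domain scores, with weights
    redistributed proportionally across only the reported domains."""
    reported = {d: s for d, s in domain_scores.items() if s is not None}
    total_w = sum(DOMAIN_WEIGHTS[d] for d in reported)
    return sum(DOMAIN_WEIGHTS[d] / total_w * s for d, s in reported.items())

# A hospital reporting only three domains (no mortality, safety, or
# readmission data): its summary score rests almost entirely on
# patient experience and two low-weight process domains.
few = {"patient_experience": 1.5, "effectiveness": 1.0, "timeliness": 1.0,
       "mortality": None, "safety": None, "readmission": None, "imaging": None}

# A hospital reporting all seven domains: a strong mortality score
# moves the summary far less, because nothing is renormalized away.
full = {d: 0.5 for d in DOMAIN_WEIGHTS}
full["mortality"] = 1.5

print(f"three-domain hospital: {summary_score(few):.3f}")   # 1.367
print(f"seven-domain hospital: {summary_score(full):.3f}")  # 0.720
```

Under these assumed weights, the three-domain hospital's good patient-experience score dominates its summary, while the seven-domain hospital gets far less credit for an equally strong mortality score.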
MedPAC: Change the program
In October, the Medicare Payment Advisory Commission (MedPAC) sent a letter to CMS raising concerns about the agency's methodology for calculating hospital quality star ratings. Specifically, MedPAC is concerned that the ratings do not fully account for the intrinsic health risks patients bring with them to the hospital, so the resulting scores are not an "apples-to-apples" comparison. For example, at one-star hospitals an average of 78 percent of patients were admitted through the emergency department; at five-star hospitals, only 36 percent were. One-star hospitals thus treat a higher share of the more severe cases that arrive through emergency rooms.
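MedPAC's case-mix concern can be illustrated with hypothetical numbers. In the sketch below, the within-group readmission rates are invented for illustration; only the 78 percent and 36 percent emergency-department admission shares come from the figures cited above. Two hospitals with identical quality within each patient group still show different raw rates, purely because of who walks in the door.

```python
# Hypothetical illustration of MedPAC's case-mix concern: two hospitals
# with IDENTICAL readmission rates within each patient group look
# different on raw rates because one admits far more patients through
# the emergency department.

def raw_rate(mix):
    """Overall rate from (share_of_patients, group_rate) pairs."""
    return sum(share * rate for share, rate in mix)

# Assumed within-group rates, the same at both hospitals:
ED_RATE, ELECTIVE_RATE = 0.25, 0.10

one_star = [(0.78, ED_RATE), (0.22, ELECTIVE_RATE)]   # 78% via the ED
five_star = [(0.36, ED_RATE), (0.64, ELECTIVE_RATE)]  # 36% via the ED

print(f"one-star raw rate:  {raw_rate(one_star):.3f}")   # 0.217
print(f"five-star raw rate: {raw_rate(five_star):.3f}")  # 0.154
```

Neither hospital is "better" within any patient group here, yet an unadjusted comparison penalizes the one with the sicker mix, which is the apples-to-apples problem MedPAC describes.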
MedPAC is also concerned about overlapping quality payment and reporting programs.
The Commission asks CMS to align the star rating methodology as closely as possible with existing CMS programs, such as the Hospital Value-Based Purchasing program, which scores hospitals on a set of quality and cost measures and redistributes payments from lower- to higher-performing hospitals. This may be the only way for CMS to save the program and correct the methodological problems that currently call the utility of the hospital star ratings into question.