by Beth Godsey, MBA, MSPA
Vizient Senior Vice President, Data Science and Methodology

“In health care, patients expect reliable, consistent, high-quality and scientifically based care to improve their health and quality of life. Health care providers expect the same when being measured for the care they deliver, as well as when seeking data and insights to drive continuous quality improvement.”

Whether you are a hospital administrator, a physician, a nurse or another member of your hospital’s family, you know well that the true north of our daily purpose is to provide quality health care to patients. And the compass that helps us navigate that course is data.

For physicians and other clinicians, data helps us understand what care patients need to achieve the best outcome, as well as what they want to experience when they interact with their care providers and facilities. Data highlights areas of opportunity to improve the care delivered through process improvements, additional training of staff and even in the choice of supplies and equipment. Data can also be helpful to consumers who are seeking treatment for a specific medical condition.

So where do hospitals and consumers go to get this data? One source is the Overall Hospital Quality Star Ratings, which are determined using methodology developed by the Centers for Medicare & Medicaid Services (CMS) and published annually as part of its Care Compare website.

Vizient has always supported the concept of leveraging contemporary data and clear, actionable methodologies to drive change. However, when ranking methodologies do not support clear insights and action, Vizient has voiced our concerns by working collaboratively with these organizations, including CMS.

Here are several important changes in yesterday’s CMS update to its Overall Hospital Quality Star Ratings methodology that address historical issues, as well as additional recommendations that are still needed to ensure meaningful hospital comparisons.

Changes help create stability for hospitals

Since its launch, the star ratings methodology has included a statistical approach called latent variable modeling. Unfortunately, this aspect of the methodology created difficult-to-interpret and unpredictable measure weights, which limited hospitals’ ability to understand and identify opportunities to improve care.

We’ve provided expert feedback to CMS about the methodology and proposed changes through public comments, through participation on CMS’ Technical Expert Panel and in communications with agency officials. To create more stability for hospitals using the star ratings, we leveraged Vizient’s data analytics capabilities to fully recreate the CMS modeling method and shared the results with CMS leaders to demonstrate that the latent variable modeling was being applied in a way that was not optimal for their approach. We also demonstrated that part of the swing in scores hospitals were experiencing from release to release was driven by this modeling methodology.

In the April 28 release, we are pleased to see that CMS adopted Vizient’s recommendation to replace its latent variable modeling with a simple, equally weighted measure approach. This change will make it easier for hospitals to understand how their performance translates into a star rating and creates consistent evaluation from release to release.

This welcome change provides more reliability for hospitals and makes it easier to predict future performance, since the methodology will remain simple and consistent between releases. Hospitals can now focus on improving care instead of wading through the methodology to understand the fluctuating ratings.
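To illustrate the concept of an equally weighted measure approach, here is a minimal sketch in Python. The measure group names and scores below are hypothetical, and this is not CMS’s actual implementation — only a demonstration of how equal weighting makes a summary score transparent: it is simply the mean of the reported group scores.

```python
def summary_score(group_scores):
    """Average the available measure group scores with equal weights.

    Groups a hospital does not report (None) are excluded, and the
    remaining groups all contribute equally to the summary score.
    """
    available = [s for s in group_scores.values() if s is not None]
    return sum(available) / len(available)

# Hypothetical hospital with four reported measure groups
hospital = {
    "Mortality": 0.62,
    "Safety of Care": 0.55,
    "Readmission": 0.48,
    "Patient Experience": 0.71,
    "Timely & Effective Care": None,  # group not reported
}

print(round(summary_score(hospital), 3))  # → 0.59
```

Because every reported group carries the same weight, a hospital can see directly how a change in any one group moves its summary score — the interpretability benefit described above.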

Changes mark progress, not perfection

As welcome and necessary as the changes to the CMS methodology are for the Overall Hospital Quality Star Ratings, it is important to consider them progress, not perfection. More adjustments are needed to support the core mission of the star ratings: to provide patients and the public with a clear, simple and objective mechanism when seeking care for a specific medical condition, while providing hospitals with the data and insights needed to achieve improved outcomes. And we will continue to help guide that conversation.

While we are excited to see CMS’ exploration of Vizient’s recommendation to group “like hospitals” together, there remain opportunities to refine CMS’s approach.

Up to this point, the way performance was evaluated essentially treated every hospital as equal in its offerings, the patients it saw and the services it provided. We know that is not the case: some hospitals offer a full range of care in their communities, while others offer specific services to support unique community needs. The result was very different hospitals being compared to each other. For example, a 30-bed, low-volume critical access hospital reporting three measure groups for a select range of services would be compared to a 300-bed, high-volume urban hospital reporting five measure groups, creating confusing and often misleading information for consumers.

In yesterday’s release, CMS peer-groups hospitals by the number of measure groups for which they have at least three measures. Specifically, after the minimum reporting thresholds are applied, hospitals are categorized into one of three peer groups: a 3-measure group peer group, a 4-measure group peer group and a 5-measure group peer group.
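The peer-grouping logic described above can be sketched as follows. The hospital data and the exact threshold handling are hypothetical assumptions for illustration; this is not CMS’s published implementation.

```python
MIN_MEASURES_PER_GROUP = 3  # a group counts only with at least three measures

def peer_group(measures_per_group):
    """Assign a peer group by counting measure groups that meet the
    three-measure threshold. Returns None when fewer than three groups
    qualify (assumed here to fall below the minimum for a rating)."""
    qualifying = sum(
        1 for n in measures_per_group.values() if n >= MIN_MEASURES_PER_GROUP
    )
    if qualifying < 3:
        return None
    return f"{min(qualifying, 5)}-measure group peer group"

# Invented examples echoing the comparison above: a small critical access
# hospital versus a large urban hospital.
critical_access = {"Mortality": 4, "Safety": 3, "Readmission": 5,
                   "Patient Experience": 1, "Timely Care": 2}
urban_hospital = {"Mortality": 6, "Safety": 5, "Readmission": 7,
                  "Patient Experience": 8, "Timely Care": 4}

print(peer_group(critical_access))  # → 3-measure group peer group
print(peer_group(urban_hospital))   # → 5-measure group peer group
```

Note that under this scheme the two very different hospitals land in different peer groups, so they are no longer compared head to head — but, as discussed below, the group labels themselves say little to a consumer about what services each hospital provides.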

Vizient shared its concern that the new cohort methodology could be challenging for any general consumer trying to use these star ratings. A consumer would be hard-pressed to distinguish what a 5-measure group hospital is and how it compares to a 4-measure group hospital in terms of services and outcomes. Vizient continues to suggest that hospitals be placed into peer groups based on hospital type and services provided rather than on how many measures they report to CMS. Vizient encourages CMS to utilize criteria, including relevant volume thresholds, that differentiate patient comorbidities and surgical complexity, such as the number of solid organ transplants, cardiac surgery and neurosurgery cases, acute transfers in from other hospitals and trauma service line volume. Leveraging these criteria, hospitals could be split into comprehensive academic medical centers, complex care medical centers and community hospitals. Our most recent comments to CMS provide a more complete proposal for like-hospital peer grouping, which Vizient utilizes within its own Quality & Accountability Hospital ranking today.

Additionally, we have shared with CMS that there is a lag between when data is reported and when it is eventually made public. We have estimated that by the time data is made public and included in the star rating, it is two to five years old and, as a result, may not reflect hospitals’ current performance. Hospitals using these measures and ratings for performance improvement must wait years to see the impact their improvement activities have on the ratings.

We continue to urge CMS to support more timely reporting and inclusion of data to make the star ratings more actionable for patients and hospitals.

Finally, we’re looking at the measures themselves from a patient-centered view and discussing whether the existing provider-centric measures, which CMS identified from a cost perspective, fully address the questions patients want answered, such as the typical length of stay for a procedure. Today’s measures can be difficult for patients to understand, and until we make that pivot to put ourselves in the patient’s shoes, it will be challenging for patients to use the ratings in a meaningful way.


About the author: Beth Godsey oversees analytical modeling, metric development and the hospital ranking/scoring methodology and framework for Vizient members. Additionally, Beth supports member scenario and impact analysis regarding changes to the national landscape, including CMS Pay for Performance Program methods and publicly reported measures and methodology. Prior to coming to Vizient, Beth worked in the Center of Clinical Effectiveness at BJC HealthCare in St. Louis, Missouri. In that role, she was responsible for statistical support, including case-control analysis, risk prediction and multi-factorial analysis, across BJC HealthCare. Beth has 15 years of experience in statistical analysis and modeling. Some of her key accomplishments include providing advanced analytical support for major hospital strategic initiatives and providing system-wide analytical support. Beth has master’s degrees in Predictive Analytics from Northwestern University and Business Administration from Webster University in St. Louis, Missouri. She earned her Bachelor of Science in Statistics from the University of Tennessee at Knoxville and is a Six Sigma Black Belt.

Published: April 29, 2021