PCI ‘Center of Excellence’? Insurers’ Designation May Not Mean Better Outcomes
More transparency is needed, and insurers need to evaluate whether these hospitals truly do better, a researcher says.
Hospitals designated as centers of excellence (COEs) by commercial insurers are not necessarily providing better patient outcomes after PCI, new data from New York state suggest.
Looking at programs from three different insurers, researchers led by Sameed Khatana, MD (University of Pennsylvania, Philadelphia), found that hospitals with a COE designation did not achieve lower mortality or readmission rates—or provide improved patient satisfaction—compared with other centers. Within one program, in fact, mortality was slightly higher at the centers singled out as providing high-quality care.
Khatana told TCTMD that this issue has not been well studied in cardiovascular disease, so the findings—reported in a research letter published online May 20, 2019, ahead of print in JAMA Internal Medicine—highlight the need to delve further into the potential usefulness of COE designations.
Moreover, he said, the results indicate that commercial insurers need to be more open about what goes into these types of programs. “We were able to look into and understand the methodologies that were used. But for the average patient going to a patient portal or a commercial payer’s website, they might just see a logo or an indicator saying that this is a high-quality hospital or a center of excellence hospital [and] might not necessarily understand what that means,” Khatana explained. “So I think payers need to be more transparent about their methodology and then also evaluate whether hospitals that are so-called high performers actually perform better.”
When you do look under the hood, a lot of the criteria and metrics that are used to designate these programs might not make complete sense from a clinical perspective. Sameed Khatana
For the study, Khatana and colleagues explored COE programs from three commercial insurers: Aetna's Institutes of Quality for cardiac medical interventions, Cigna's Centers of Excellence for cardiac catheterization and angioplasty, and Blue Cross Blue Shield's Blue Distinction Centers for cardiac care. Khatana said his team focused on New York because of the state's robust, publicly available database of PCI outcomes.
Criteria for COE designation varied across programs but included factors like procedural volume; achieved rates of mortality, complications, and readmissions; treatment capabilities; participation in quality improvement registries; patient feedback; and cost-efficiency. Of the 62 nonfederal hospitals performing PCI in New York in 2015, 8% had a COE designation from Aetna, 15% from Cigna, and 27% from Blue Cross Blue Shield.
Regardless of program, however, “overall, there wasn’t a consistent association between being a center of excellence-designated hospital and better outcomes,” Khatana said.
Risk-adjusted mortality at 30 days after PCI differed between COE-designated and other hospitals within only one program—Aetna's—and was actually higher at those identified as high performers (1.4% vs 1.1%; P = 0.002).
It’s hard to say what’s going on there, Khatana said. “But what we did find overall for all three of the programs was that when you do look under the hood, a lot of the criteria and metrics that are used to designate these programs might not make complete sense from a clinical perspective,” he said, noting that the mortality threshold used for two of the programs was above the state average and the cutoff used for the third program was above the national average.
Because most hospitals will meet those requirements, “using criteria like that doesn’t end up discriminating hospitals based on quality or performance,” Khatana said. “So it does seem that perhaps most of the designation that is happening—or some of it—is related to nonclinical factors. And so it’s possible for some reason that for that one program we ended up seeing a higher mortality related to that.”
Risk-adjusted 30-day rates of readmission after PCI or mortality after acute MI did not differ based on COE designation in any of the programs. Patient satisfaction was no better or worse at the singled-out hospitals either.
A difference in treatment capability emerged within the Blue Cross Blue Shield program, in which hospitals deemed COEs were more likely to have a cardiac intensive care unit (100% vs 69%) and to have on-site cardiac surgery (100% vs 42%).
The authors conclude, however, that “given the insufficient discrimination provided by these programs, the current system of COE designation may allow for assignment based largely on cost and other nonclinical or patient-related factors. Our findings call into question the usefulness of current COE designations; work is needed to improve criteria that clearly identify hospitals that outperform their peers.”
Khatana said one change that might be considered is to use market-specific metrics comparing hospitals to others in the same area rather than using the same metrics across the country.
It will be important moving forward to look at how these issues are affecting patient care, he added. “Patients may already be making provider choice decisions based on these criteria, but it’s not really known what impact it’s having at the moment.”
Khatana SAM, Nathan AS, Dayoub EJ, et al. Centers of excellence designations, clinical outcomes, and characteristics of hospitals performing percutaneous coronary interventions. JAMA Intern Med. 2019;Epub ahead of print.
- Khatana reports no relevant conflicts of interest.