Good News from ACTION GWTG Registry: Most Hospitals Doing Quite Well at Acute MI Care
Hospitals are equally apt to follow the best processes—and achieve similar outcomes—for acute MI irrespective of whether they participate in the ACTION Registry-Get With The Guidelines (GWTG) program. The findings, from an observational analysis of 2010 data, were published online last week in American Heart Journal.
It is not that registry hospitals were not up to snuff, or that the non-participants were doing unusually well. Rather, according to lead author Robin Mathews, MD, of Duke University Medical Center (Durham, NC), the study shows that performance of these reported metrics is “excellent” on the whole.
“Our results suggest that hospitals collectively have come a long way in achieving high levels of evidence-based management of our patients with acute coronary syndrome,” he told TCTMD in an email. “However, the quality effort is a dynamic process and requires constant evaluation of our performance to look for the next area of potential improvement.”
Mathews et al identified 3,432 hospitals that contributed to the Medicare Hospital Compare database in 2007. Of these, 15% were participating in the ACTION Registry-GWTG, “one of the largest national quality improvement (QI) programs” specific to acute MI. Such hospitals tended to be larger than non-participating institutions (median 288 vs 139 beds) and were more likely to be teaching hospitals (18.8% vs 6.3%) and have cath lab capabilities (85.7% vs 34.0%; P < .0001 for all).
To account for those differences, Mathews et al matched 502 pairs of participating and non-participating hospitals based on teaching status, size, PCI capability, and baseline performance of AMI process measures. These measures included: aspirin on arrival and discharge, beta blocker use on discharge, ACE inhibitor/ARB use in the setting of LV dysfunction, smoking cessation counseling, and (for STEMI patients) PCI within 90 minutes.
In 2010, process measure adherence was “very high,” the matched comparison found, reaching 98% or higher for “almost all” metrics regardless of participation in the registry. ACTION Registry-GWTG hospitals were slightly more likely to meet the goal of performing PCI within 90 minutes compared with non-participating hospitals (92% vs 89%; P = .005). Risk-standardized rates of 30-day mortality (15.8% vs 15.9%) and readmission (19.8% vs 19.9%) did not differ between the 2 groups (P = NS for both).
Baseline performance did, however, predict the relative impact of taking part in ACTION Registry-GWTG. Among hospitals in the lower half of baseline performance, those that participated showed slightly larger improvements in adherence to process measures over time than those that did not. Among high baseline performers, this positive influence was not seen. Thirty-day outcomes were unaffected by this interaction.
New Metrics May Be Needed
“Clinicians and providers are critical in advancing the quality of care rendered at their institutions,” Mathews said. The ACTION Registry-GWTG program provides feedback to sites so that they can know how well they do in comparison with their peers. That information paired with “QI tools—internal feedback, order sets, risk stratification algorithms, etc—can assist providers on where to focus efforts,” he added.
For lower-performing hospitals, these tools may be especially useful in addressing gaps, Mathews and colleagues suggest. Among the higher performers, however, uniformly good adherence to existing metrics may make it difficult to discern which hospitals are continuing to improve. “In addition, the prevalence of QI programs and incentives beyond ACTION Registry-GWTG has increased over time and may contribute [to adherence]. Isolating the effects of any one QI program is challenging, particularly when overlap exists between multiple initiatives,” they say.
If anything, the study captures the challenges inherent in developing QI programs in the modern era, Mathews noted. “The very high achievement, though a great thing, suggests that these metrics are essentially ‘topped out’ and may no longer be the most relevant ones to report on or follow, since most hospitals are doing so well,” he said.
“Interestingly, though performance on quality metrics has improved as a whole over recent years, rates of readmission and mortality have not changed substantially,” Mathews added. “Though this is a complicated issue, it does suggest that we need to continue to focus on ‘measure selection’ to choose those metrics that would most directly influence outcomes.”
Useful measures already being collected include time to diagnostic testing and time to treatment in patients with angina, he said. Quantifying the time period between first medical contact, rather than hospital arrival, and treatment in STEMI patients “allows us to look at [the] continuum of care that begins when a patient first experiences symptoms or contacts EMS, not just when they physically arrive in an emergency room or a catheterization lab.”
Additionally, “the transition from hospital back to the community is also an important opportunity to influence outcomes,” Mathews continued. “For instance, cardiac rehabilitation referrals on discharge after a myocardial infarction or the ability to make follow-up referrals to a primary care physician and/or cardiologist early after discharge have been identified as factors that may reduce downstream events such as readmissions.”
Mathews R, Fonarow GC, Li S, et al. Comparison of performance on hospital compare process measures and patient outcomes between hospitals that do and do not participate in Acute Coronary Treatment and Intervention Outcomes Network Registry-Get With The Guidelines. Am Heart J. 2016;Epub ahead of print.
- This research was supported by the American College of Cardiology Foundation’s National Cardiovascular Data Registry.
- Mathews reports no relevant conflicts of interest.