National Urgent Care Clinical Quality Metrics: ‘This is the Way’

Every system is perfectly designed to get the results it gets.

This quotation, commonly attributed to Dr. W. Edwards Deming,1 has never been more relevant for urgent care (UC) than right now. Considered the original guru of quality improvement, Dr. Deming was explaining why systems must be redesigned if the desired outcomes are not being achieved.

The existing “system” for measuring clinical quality in UC needs an overhaul. It is fragmented and underdeveloped, and it lacks the infrastructure required for data aggregation and analysis at a national level, a prerequisite for any true progress in quality improvement.

By the Urgent Care Association’s count, there are more than 9,000 UC centers in the U.S., collectively experiencing almost 90 million visits annually.2 This is comparable in scale to the nation’s approximately 4,000 emergency departments, which experience roughly 145 million annual visits.3 However, unlike EDs, which use commonly agreed-upon surrogate measures for quality, UC centers do not track metrics or patient outcomes in any systematic way.

And yet, demonstrating our clinical competence has never been more important. COVID-19 has placed UC centers at the “tip of the spear” for testing and treating millions of patients. Nevertheless, UC centers have largely been left out of the discussion around a national vaccine distribution program. To take our place at the table with other ambulatory specialties such as Emergency Medicine and Family Practice, we need to advance how we think about quality.

A recent UCA publication entitled The Quality of Care at Urgent Care Centers outlined some of the challenges UC facilities face because existing measures developed for the ambulatory or hospital setting cannot be easily applied to UC centers. However, the real problem, as the authors note, is that “…46% of UC centers assess quality using measures they have developed themselves, and 16.5% do not measure the quality of the care they provide [at all].”4

EDs have developed agreed-upon national benchmarks for clinical quality for multiple conditions such as heart attacks (ACS), strokes (CVA), sepsis, and unplanned readmissions, to name a few. These have evolved over time to include other measures of high-quality care for serious conditions, efficient use of resources, and diagnostic accuracy. The American College of Emergency Physicians has worked to incorporate several quality measures into the national Physician Quality Reporting System. These comparative data are now widely available to the public and to payers. Prior to this commitment to national standards for clinical quality and transparency, there was no way for those stakeholders, including the clinicians and clinical leaders themselves, to know how they were really doing.

UC, as a burgeoning specialty, would do well to follow the lead of Emergency Medicine. The Institute of Medicine (IOM) has defined six domains of quality:5

  1. Safety of Care (SC)
  2. Effectiveness of Care (EC)
  3. Patient-Centered Care (PCC)
  4. Timeliness of Care (TC)
  5. Care that is Efficient (EFC)
  6. Care that is Equitable (EQC)

As a specialty, we must embrace this framework and look for opportunities to define these metrics for ourselves—before others are allowed to choose the metrics for us.

One of the lessons from the early days of quality metric use in Emergency Medicine is that there can be unintended consequences (ie, metric use can help one population at the expense of others). This was a “side effect” of the community-acquired pneumonia (CAP) metric, where EDs were graded on their ability to draw blood cultures and start antibiotics within 4 hours of arrival for patients who were ultimately admitted for CAP.6

In an attempt to meet this metric, EDs began administering antibiotics to almost any patient with respiratory symptoms, resulting in antibiotic overuse and subsequent resistance without any appreciable improvement in patient outcomes. At the same time, this commonly pulled resources away from the care of other patients whose conditions may actually have been more serious, simply because those patients fell outside an arbitrarily and externally defined cohort.

To kickstart this conversation for UC centers across the US, I would like to propose several clinical quality metrics to consider. This is not intended to be an all-inclusive list, but we need to start the conversation somewhere. Structural, process, and outcomes measures will all be necessary to fulfill the goal of a national comprehensive quality program.

Some of the metrics proposed below are already widely accepted measures of clinical quality in other domains of healthcare; others have yet to be validated by serious research efforts. Some will be harder to measure than others. However, from our UC organization’s experience, many of these metrics can be measured and tracked without excessive effort; we’ve been doing it for years.

Measuring others will be more challenging. What we lack is a consensus set of metrics, which, in turn, would allow for the creation of a national comparative data warehouse for outcomes research. This needs to change. As the expression goes, “If we don’t start somewhere, we’re going to go nowhere.”

So, to begin the brainstorming, I humbly submit a list of proposed quality metrics to consider (with the corresponding domain of quality in parentheses).

  • (SC) Appropriate use of EKGs in patients >35 years of age who present with a chief complaint of chest pain
  • (SC) Appropriate use of urine hCG testing in females between the ages of 12 and 55 with a chief complaint of abdominal pain
  • (SC) Inappropriate use of oral antibiotics in adult (>18 years) and pediatric (<18 years) patients
  • (SC) Inappropriate use of oral steroids in adult (>18 years) and pediatric (<18 years) patients
  • (SC) Percent of patients who leave UC centers with unaddressed abnormal vital signs
  • (PCC) Patient satisfaction measures
  • (PCC) Rate of patients whose care plan is communicated back to their PCP
  • (PCC) Rate of eligible patients who receive smoking-cessation counseling
  • (PCC) Rate of eligible patients who receive obesity counseling
  • (EC) Rate of transfers from the UC center to the ED
  • (EC) Rate of patients seen in UC who present to an ED within 72 hours of the UC visit
  • (EFC) Rate of imaging misreads that result in a change in management
  • (EFC) Appropriate use of urine cultures in patients with UTI
  • (EFC) Appropriate use of throat cultures in patients with acute pharyngitis
  • (EFC) Appropriate use of imaging studies in selected conditions (eg, asthma, low back pain, knee and ankle injury)
  • (TC) Percent of patients seen within 30 minutes of arrival at the UC center
  • (TC) Percent of patients discharged within 60 minutes of arrival
  • (EQC) Rate of analgesic prescriptions by race/ethnicity/socio-economic status
  • (EQC) Rate of seasonal flu vaccine by race/ethnicity/socio-economic status
  • (EQC) Percent of patients with chronic disease (HTN, DM, COPD, CHF) who have a PCP by race/ethnicity/socio-economic status

If we, as an industry, do not pursue continuous quality improvement at a national level, with agreed-upon benchmarks, robust data, structural measures, and outcomes research with full transparency to the public and payers alike, we risk losing our opportunity to take charge of building a better system for UC delivery. Whether you are part of a deeply integrated network of urgent care centers within a large healthcare system or a small independent practice, it is incumbent upon all of us to seek ways to incorporate clinical quality improvement into our business model.

To make quality improvement a priority, we need to pull all the levers we have by engaging the full array of stakeholders: the general public; local, state, and federal regulators; the UC accrediting and certifying bodies; and the owners and operators of our centers. Without a national database into which we can all submit our quality data and set thresholds for performance improvement, this goal will be virtually impossible to achieve. Now is the time to demand this system at a national level; otherwise, we will continue to have “the system” we have and we will continue to “get the results we get.”

Our patients deserve better. We deserve better, too.

References

1. IHI Patient Safety & Quality Healthcare blog. August 3, 2015.

2. Urgent Care Association. UCA 2019 Benchmarking Report.

3. National Hospital Ambulatory Medical Care Survey. Available at: https://www.cdc.gov/nchs/ahcd/factsheets.htm. Accessed January 8, 2021.

4. Weinick RM, Bristol SJ, DesRoches CM. The quality of care at urgent care centers. J Urgent Care Med. Available at: https://www.jucm.com/quality-care-urgent-care-centers/. Accessed January 8, 2021.

5. Institute of Medicine (IOM). Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.

6. Schuur JD, Hsia RY, Burstin H, et al. Quality measurement in the emergency department: past and future. Health Affairs. 2013;32(12):2129-2138.

Neal Shipley, MD, MBA, FACEP

Medical director, Northwell Health – GoHealth Urgent Care