The Emergency Department (ED) Return Visit Quality Program launched in 2016 with the goal of fostering a culture of continuous quality improvement in Ontario’s EDs. Participating EDs receive data reports identifying return visits that resulted in admission and whose initial visit occurred at their site. They audit these return visits to investigate their causes, identify any quality issues or adverse events, and take steps to address them.
The results of the second year of the ED Return Visit Quality Program were submitted to Health Quality Ontario in January of this year (read the report here). Southlake Regional Health Centre’s submission revealed how they have gone above and beyond the program’s requirements, investigating every return visit flagged in their data reports to learn as much as possible from these cases. We spoke with Kim Storey, Director of the Emergency Department and Patient Flow at Southlake Regional Health Centre, about their experience with the ED Return Visit Quality Program and the changes they’ve made as a result of participating.
Could you talk a little bit about your organization’s approach to this program?
This program launched at a time when we were looking into proactive ways to assess and improve quality in our ED. It is easy to fall into a reactive approach where we address issues as they arise (for example, through patient complaints), but this program allowed us to actively search for opportunities for improvement before they reach this point.
When the program began, our ED leadership team got together to determine our approach. Ours is a high-volume site, and we get about 250 return visits per quarter. Doing the minimum number of audits would mean looking into 10-15 of these cases, but we thought: how can we look into those cases and leave the others behind? We instead aimed to look into all of our return visits, because if there’s a learning opportunity there, we want to catch it.
We knew it would be a lot of work to do this. We developed a process where we have two review days each quarter. Leading up to these days, our administrative support identifies the treating physician for each case. Different physicians have different numbers of cases, depending on how many shifts they’ve worked, so each physician is paired with another physician with a similar number of cases, and they review each other’s cases before the meetings. Then, during the meetings, we discuss as a group the cases in which opportunities for improvement were identified. To make sure these learnings are shared, we also touch on them at our monthly ED meetings, where physician attendance is close to 100%, apart from those staffing the ED at the time.
We did consider having physicians review their own cases, but opted for this peer review system to eliminate any potential bias in physicians auditing their own cases. Of course, physicians still have the opportunity to review their own cases as well.
It sounds like you have a really strong culture of quality in your organization.
We are committed to being a high-reliability organization. Our former CEO, Dave Williams, was an astronaut and had a big part in bringing this concept to our organization. We’ve also been influenced by a book called Peak, by Anders Ericsson and Robert Pool, which outlines a process that high-performing people follow – they have a coach, get feedback, and plan how to improve next time. We follow this approach in our organization.
How did you involve your colleagues outside the ED (for example, in internal medicine or diagnostic imaging)?
We invite the director of paramedical services (including laboratory, pharmacy, and diagnostic imaging) as well as other leaders to our review days. Seeing how these issues affect patients really helps buy-in for change, and participating in the meetings also helps everyone to view the process as a collaborative effort to improve rather than pointing fingers.
As an example, one of the changes we have made was to expand access to abdominal CT to 24 hours a day. Previously, we gave these patients the option of returning the next morning for their scan or being admitted overnight, but through the audits we found that some of them actually had appendicitis and required surgery as soon as possible. So our director of paramedical services has been training the overnight technicians to perform abdominal CT scans. This hasn’t been a huge investment of resources, and it has actually smoothed patient flow for imaging, as there is no longer a wave of patients arriving for CT scans in the mornings.
Some of the most common interventions mentioned in this program involved education, rather than interventions such as forcing functions, which may be more effective. Can you discuss your approach to designing interventions to address the issues you find?
Well, the system we have built of continually providing feedback on what is working well and what needs improvement enables our physicians to keep improving. But this only addresses performance issues, which are rare; often, the quality issues we find are system issues. In those situations, we do try to use interventions such as forcing functions to ensure they’re effective.
As an example, we have point-of-care testing available for certain tests, but nurses would often get very busy and send these tests to the lab, which takes more time. Recently, we changed some of our order sets to require nurses to do point-of-care testing. We recognized that this is more work for our nurses, so we added supports to reduce other aspects of their workload. For example, nurses used to do 100% of the phlebotomy work, but we’ve now added technicians to ease that workload. We also added an electrocardiography technician to lighten the workload there as well (in addition to reducing the door-to-electrocardiography time, which was another opportunity for improvement we identified through the program).
What are some other examples of initiatives you’re proud of?
We found that our patients were experiencing delays during triage and registration. One thing we did to address this was implement a quick-look triage IPAC (Infection Prevention and Control) screening. This was actually a significant effort, because we had to designate an enclosure for nurses to conduct this screening. But it allows our volunteers to support patients in the waiting room, since patients who might later be triaged as infectious have already been identified at the screening. Since making this change in March, we have seen a reduction in the rate of patients who leave without being seen: the rate was 0.3%, but on most days when this screening is in place it is 0%.
We also did a review of 4,000 cases to assess our compliance with the Choosing Wisely recommendations for emergency medicine. We were at 80% compliance or above, which shows really good alignment with these best practices. We believe our approach of continually providing feedback (from the chief of the ED) and discussing how we can improve helped us get to this point.