The Cardinal Rule:  Measure Outcomes

The foundation for any performance strategy or program should be outcome data.  Unfortunately, for many on the road to becoming a high reliability organization (HRO) in healthcare (or even to “zero”), the data being used to become more reliable is itself unreliable.  Many organizations rely on voluntary event/incident reporting to identify “serious safety events” and near misses, yet peer-reviewed research and real-world evidence have shown that such reporting identifies only approximately 5% of adverse events.  

Contrariwise, the use of clinically validated adverse event outcomes using real-time EHR data (defined as “AE Outcomes”) has been shown by an overwhelming body of peer-reviewed research and real-world evidence to identify approximately 10x the serious patient harm found by event reporting – an approach that remains necessary but insufficient. 

To be clear, voluntary event reporting remains essential, even if incomplete and – based on research – racially and socially biased.  It is a source of learning, but not a source of measurement; measurement requires outcomes data and, specifically, AE Outcomes.  

Would we fly with peace of mind knowing that the airliner carrying us across the Pacific Ocean relied on an approach that identified only 5% of the airplane’s malfunctions?  Count us out; unfortunately, patients have no such choice.  True, aviation uses voluntary event reporting and prizes communication within a culture of safety, but today’s aviation systems (i.e., cockpit, aircraft, control towers, ground systems, etc.) fundamentally rely on extensive real-time electronic surveillance to identify existing and potential hazards that could result in the loss of life.  

For many years, clinical leaders and teams relied on voluntary event reporting and cultural interventions because, well, that was all that existed.  Today, leading hospitals and health systems have demonstrated not only that 10x the level of serious harm can be found with a Virtual Patient Safety (VPS) solution but also that this greater share of harm can be reduced by over 25% on a sustainable basis.  

The underlying method of VPS relies on AE Outcomes, which are then used to measure and manage all harm, all the time, for all patients.  The use of AE Outcomes also constitutes a superior method for getting to reliability, because AE Outcomes have been demonstrated to be accurate, timely, and actionable.  

So if the cardinal rule in getting to reliability is to establish a foundation of measuring with AE Outcomes in order to assess reliability, let’s explore five reasons why.

5 Reasons Outcomes Get Us to Reliability

Karl Weick and Kathleen Sutcliffe, in their seminal work on high reliability organizations, Managing the Unexpected: Resilient Performance in an Age of Uncertainty, posit five principles that characterize HROs.  

Let’s take a look at each of these principles and suggest why outcomes – and, specifically, an AE Outcomes-driven method – would accelerate, enhance, and even enable each:

1. Preoccupation with failure.  How do we know if we have failed, are failing, or are about to fail if we are not measuring outcomes – and with respect to the safety domain, safety outcomes (i.e. AE Outcomes)?  This principle gives attention to close calls and near misses (i.e. “being lucky versus being good”), but how do we identify events that are antecedents of undesirable outcomes if our approach (of event reporting) is missing 95% of the harm events, underlying patterns, and consequent root causes?  If we are to focus more on failures than successes, how can we ignore potential evidence available through AE Outcomes identifying 10x the patient harm and claim we are closer to high reliability, or even “zero”? 

The data-driven method supports Principle #1 by using AE Outcomes to:

    1. Measure – know that we have failed or are failing by measuring to determine what performance really has been and is currently.
    2. Anticipate – train predictive models with the outcomes we want to predict (a foundational principle in data science) instead of proxy outcomes, and then assist clinical judgment with advanced analytics to anticipate what is likely to happen, i.e. are we likely to fail or to succeed?
    3. Measure to Measure More – those preoccupied with failure want to know not only #1 and #2 above but also the contributing factors, antecedent events, and other variables or values in the clinical context – which can only be measured if we first measure the outcome itself and then investigate it.
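The “Measure” and “Anticipate” points above can be illustrated with a toy simulation.  This is a minimal sketch, assuming a hypothetical 4% true adverse-event rate and the ~5% reporting capture rate cited earlier; every number and variable name here is illustrative, not real data or any vendor’s method:

```python
import random

random.seed(0)

N = 10_000             # hypothetical patient encounters
TRUE_AE_RATE = 0.04    # assumed true adverse-event rate (illustrative)
REPORT_CAPTURE = 0.05  # event reporting captures ~5% of true AEs (per the text)

# "True" outcome labels -- what AE Outcomes from real-time EHR data aim to approximate.
true_ae = [random.random() < TRUE_AE_RATE for _ in range(N)]

# Proxy labels: a voluntary report is filed for only ~5% of true events.
reported = [ae and (random.random() < REPORT_CAPTURE) for ae in true_ae]

true_rate = sum(true_ae) / N
proxy_rate = sum(reported) / N

print(f"true AE rate:             {true_rate:.4f}")
print(f"proxy (reported) AE rate: {proxy_rate:.4f}")
# A model trained on the proxy labels learns to predict *reporting behavior*,
# not patient harm -- hence: train on the outcome you actually want to predict.
```

The gap between the two rates is the measurement foundation the text argues for: a predictive model can only anticipate failure as well as the labels it is trained on.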

2. Reluctance to simplify interpretations.  Safety is oversimplified every day by organizations that make assumptions about harm, causes, and performance by learning disproportionately from the 5% of harm identified by event reporting.  The key point is:  an organization will never get to reliability while missing so much.  Without accounting for the other 95% of harm, root cause analyses will fail to identify the first cause(s) of failure or under-performance – simplifying the interpretation of what’s happening in safety.  Beyond missing the events and patterns of harm in the other 95%, how do we know whether the 5% of adverse events identified are the most frequent, the most severe, or merit the highest-priority use of organizational resources?  We don’t.

The data-driven method supports Principle #2 by using AE Outcomes to conduct:

    1. Evidence-based epidemiology on all harm all of the time for all patients – avoiding the simplification of only a small portion of the problem;
    2. Common cause analysis (CCA) and root cause analysis (RCA) on 10x the serious harm identified by voluntary event reporting, and much sooner; and
    3. Robust clinical investigation practices that include fine-grained data characterizing the care being delivered over time to a patient – rich clinical context is essential to generating interpretations that reduce complexity. 
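Once AE Outcomes exist as structured records, the epidemiology step in the list above is, at its core, counting and ranking across all identified harm.  A minimal sketch, where the record schema, event types, and severities are hypothetical examples rather than a real data model:

```python
from collections import Counter

# Hypothetical AE Outcome records; the schema and event types are illustrative.
ae_outcomes = [
    {"type": "hypoglycemia",    "severity": "moderate"},
    {"type": "pressure_injury", "severity": "severe"},
    {"type": "hypoglycemia",    "severity": "severe"},
    {"type": "medication_adr",  "severity": "mild"},
    {"type": "hypoglycemia",    "severity": "mild"},
]

# Frequency by harm type and by severity, across ALL identified harm --
# not just the slice surfaced by voluntary reports.
by_type = Counter(r["type"] for r in ae_outcomes)
by_severity = Counter(r["severity"] for r in ae_outcomes)

# Rank harm types so CCA/RCA effort targets the most frequent first.
for ae_type, count in by_type.most_common():
    print(ae_type, count)
```

The point of the sketch is the denominator: ranking only makes sense when the counts cover the full population of harm, which is what distinguishes this from epidemiology on the reported 5%.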

3. Sensitivity to operations.  Weick and Sutcliffe underscore the importance of situational awareness and carefully designed change management processes.  These requirements are alien to many patient safety solutions that are manually intensive (demanding inefficient allocations of time), technically independent (creating silos of data), or culturally intrusive (ignoring how things are done). 

The data-driven method supports Principle #3 by using AE Outcomes by:

    1. Efficiently generating an AE Outcomes data stream that is timely and actionable and has been shown to yield ROI, meriting further investment.
    2. Providing a scientifically validated and clinically credible source of measurement used across organizational functions enterprise-wide.
    3. Recognizing that simply sending alerts to the frontline is often distracting or even disruptive, and what’s required is a combination of foreground surveillance and background surveillance in order to characterize, prioritize, route, and escalate information to the right person at the right time in the right way. 
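Point #3 above can be sketched as a small routing function: confirmed, severe harm goes to the foreground; unconfirmed patterns stay in background surveillance for trending.  The signal fields, thresholds, and roles below are illustrative assumptions, not a description of any particular VPS product:

```python
# Hypothetical routing logic: fields, thresholds, and roles are assumptions.

def route_signal(signal: dict) -> str:
    """Decide how a surveillance signal reaches people, rather than
    blasting every alert to the front line."""
    if signal["confirmed_harm"] and signal["severity"] == "severe":
        return "escalate:patient_safety_officer"  # immediate foreground attention
    if signal["confirmed_harm"]:
        return "route:unit_manager"               # actionable, to the right person
    # Unconfirmed patterns stay in background surveillance for trending,
    # without interrupting clinicians at the bedside.
    return "background:trend_analysis"

print(route_signal({"confirmed_harm": True, "severity": "severe"}))
```

The design choice the sketch illustrates is the split itself: most signals never become interruptions, which is what keeps the surveillance culturally non-intrusive.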

4. Commitment to resilience.  HROs continually devote resources to corrective action plans and training.  Correcting what and why?  Training for what and why?  Indeed, how do we know how to prioritize resources for allocation if our intelligence misses 95% of adverse events?  It is inadequate, and perhaps grossly suboptimal, to rely on the calls of third-party national organizations to inform investments in resilience, or to simply repeat what was done the year before.  We need to know how our patients are getting harmed and what the causes are.  Then we are better positioned to invest for resilience.   

The data-driven method supports Principle #4 by using AE Outcomes to inform investments in resilience by:

    1. Knowing which adverse events our patients are suffering – by specific type, frequency, severity, and other clinical context – and doing so using real-time EHR data.
    2. Knowing whether the investments in resilience are working or not when “shocks” to the system occur by having a longitudinal measurement of outcomes. 
    3. Knowing how to target in a more fine-grained way, i.e. what specific types of resource allocation and training are going to be useful for this particular facility, team, or individual versus more blunt approaches often used in patient safety and quality (e.g. gathering a very large group in trainings irrespective of measured needs).
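The longitudinal measurement in point #2 above reduces to tracking a harm rate over time on a consistent denominator.  A minimal sketch, using made-up monthly counts and the common “per 1,000 patient-days” normalization (the figures are invented purely to show the calculation):

```python
# Illustrative longitudinal measurement: counts and patient-days are made up.
monthly = {
    "2024-01": {"ae_count": 42, "patient_days": 18_000},
    "2024-02": {"ae_count": 39, "patient_days": 17_500},
    "2024-03": {"ae_count": 30, "patient_days": 18_200},  # after an intervention
}

# Normalize to a rate so months with different census sizes are comparable.
rates = {
    month: 1000 * m["ae_count"] / m["patient_days"]
    for month, m in monthly.items()
}

for month, rate in rates.items():
    print(f"{month}: {rate:.2f} AEs per 1,000 patient-days")
```

With a series like this in place, whether a resilience investment is working after a “shock” becomes a question about the trend line rather than an impression.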

5. Deference to expertise.  This last principle calls on leaders and managers to listen to their experts on the front lines.  HROs appreciate that authority follows expertise, and therefore that the authority to make the best decisions resides on the “edge,” where the expertise lives.  

The data-driven method supports Principle #5 by using AE Outcomes to:

    1. Empower experts with AE Outcomes to enhance the impact of their expertise; experts without data fail to help the organization as much as they otherwise could. 
    2. Evaluate with AE Outcomes how expertise is contributing to performance, reallocating how the organization defers to expertise in order to optimize. 
    3. Extend expertise to improve reliability when AE Outcomes identify opportunities for improvement or potential failure modes.
    4. Expand the front line’s capacity for delivering care by eliminating event reporting data entry for information already in the EHR.
    5. Energize the front line by allowing them to use event reporting not as a data collection service but as a way to focus and amplify attention on their key concerns.

How Outcomes Will Transform HRO Efforts

Healthcare, following other industries, is moving along an inevitable path – and at an accelerating pace – toward the use of outcomes data to improve performance, specifically in patient safety, quality improvement, and risk management.

Pascal predicts that we will all look back in amazement that we ever thought we could get to reliability by relying on unreliable data and long-cycle, blunt cultural interventions alone.  To be sure, the foundation of a culture of safety is necessary but not sufficient.

Instead, when it comes to adverse events, the standard of care – especially if a delivery system seeks to become an HRO – will come to include:

  1. The reliance on clinically validated adverse event outcomes using real-time EHR data (AE Outcomes); 
  2. The investment in the epidemiology of patient harm that identifies common causes of harm across the “missed” 95% and generates rich clinical context; and 
  3. The still unknown but assuredly inevitable benefits from #1 and #2: the emergence of outcomes-based analytics that will advance the state of the art in the HRO community, covering every nook and cranny of an organization’s operations.  

Today, health systems celebrate the reduction of serious safety events (based on event reporting data).  Tomorrow, they will consider commonplace a constellation of actions, flowing from seemingly insignificant events, validated to avoid injury and death – harm that today is all too commonly either unknown or left unprevented.