I have a free app called LoseIt on my phone; it helps me keep track of the number of calories I eat and the number I burn every day. I enter my desired weight, and LoseIt calculates the total number of calories I should consume daily in order to reach or maintain that weight over a seven-day period. So if you eat too many calories on one day (chocolate chip cookies, anyone?), LoseIt tells you how many fewer calories you need to consume on subsequent days to stay on target.
The last time I looked, I could probably stop eating for the next month — and still miss my weekly goal. This sobering fact is a reflection of a small but crucial guiding principle: “Life is uncertain: eat carbohydrates first.” If I knew that I were going to die tomorrow, I’d have carbs today. Carpe diem; pass the bread.
Jokes aside, on the days when I am over my daily allotment of calories, I find myself analyzing how I missed my target. Where did the excess calories come from, and why did I fail to burn off the ones I should have? Was it that glass of wine with dinner, or the missed run? I ruminate on these questions as often and as thoroughly as I do because even if, at the end of the week, I’ve managed to hit or (better yet) come in under my goal for that period, I know that if I can understand my daily behaviors better (by identifying patterns, say), I can build a plan to manage those extra calories in ways that will spare me the torment of eating far less or exercising far more. Believe it or not, my method has actually worked pretty well (though not as well as I would like; I am a work in progress).
All this strategizing and balancing and compensating got me thinking about work we have been doing recently to analyze and display a client facility’s actual average length of stay (ALOS) as compared to its expected ALOS, including the display of Excess Days.
As does my LoseIt app, the reports we’ve created supply an overall target and associated details: a point estimate of a facility’s ALOS with accompanying details of the underlying data. Often, we also have data about patients’ expected ALOS (based on a predictive algorithm) versus their observed (actual) ALOS; this comparison is expressed as an Observed/Expected (O/E) ratio. (Note: the “observed” ALOS is the sum of each unique patient’s actual length of stay divided by the total number of patients. The “expected” ALOS is the sum of those same patients’ predicted lengths of stay divided by the total number of patients.)
The resulting summary ALOS O/E ratio (sometimes referred to as a Point Estimate) might look like one of the following three examples*:
- The facility both expected a 3-day ALOS and observed a 3-day ALOS. The O/E ratio is 1.0; the ALOS is “as expected.”
- The facility expected an ALOS of 5 days, but observed a 10-day one. The O/E ratio is 2.0; the ALOS is twice as long as expected.
- The facility expected a 4-day and observed a 2-day ALOS. The O/E ratio is 0.5; the ALOS is half as long as expected.
(*For simplicity’s sake, I have not included confidence intervals in these examples; you should. You can find out why in my newsletter of October 11, 2012.)
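To make the arithmetic concrete, here is a minimal Python sketch of the calculation described above. All of the patient data is invented for illustration:

```python
# Hypothetical patients: each has an actual (observed) length of stay and a
# predicted (expected) one from a predictive algorithm. All numbers invented.
patients = [
    {"observed_los": 4, "expected_los": 3},
    {"observed_los": 2, "expected_los": 3},
    {"observed_los": 6, "expected_los": 4},
]

n = len(patients)

# Observed ALOS: sum of actual stays divided by the number of patients.
observed_alos = sum(p["observed_los"] for p in patients) / n

# Expected ALOS: sum of predicted stays divided by the number of patients.
expected_alos = sum(p["expected_los"] for p in patients) / n

# The summary O/E ratio (the point estimate).
oe_ratio = observed_alos / expected_alos

print(observed_alos)            # 4.0
print(round(expected_alos, 2))  # 3.33
print(round(oe_ratio, 2))       # 1.2
```

With these made-up stays, the facility’s ALOS is about 20% longer than predicted (O/E = 1.2).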
There’s a crucial piece of information, though, that often goes missing in a report of this type: even when the O/E ratio is “as expected” (1.0) or less than 1.0, there is still a very high likelihood that some patients were in the facility longer than planned; they had Excess Days.
If we not only keep track of this number but also display it, clearly highlighted, we can increase awareness and encourage an exploration of patterns or trends that might lead to appropriately (note the emphasis!) shorter lengths of stay.
Check out this example display to see what I mean (click on the viz to enlarge).
Each row displays a different group of patients’ O/E ratios by quarter. O/E ratios that are “as expected” (1.0) or “less than expected” (0.9 or less) are marked with a blue point; each O/E ratio where the ALOS is “greater than expected” (1.1 or more) is marked with an orange point. This provides a clear, quickly graspable overall indicator of the different groups’ average lengths of stay.
In the next section, the average numbers of Excess Days — that is, all days where a patient stayed in the facility longer than expected — are shown as a heat map, using a scale ranging from blue for low values to dark orange for higher ones.
This is where it gets interesting. We would expect to see a large number of Excess Days in Group 2, where all O/E ratios are greater than 1.0 (Q1 = 2.4 with 13.44 excess days; Q2 = 5.7 with 23.56 excess days, etc.). But check out the groups where the O/E ratios seem to indicate that there aren’t any Excess Days: in Group 4, say, the Q2 O/E = 1.0, and yet there is an average of 2.02 Excess Days. These days, along with all of the ones highlighted in orange, represent a potential opportunity that our viewers might have missed if we’d shown only the O/E ratios.
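A minimal Python sketch (again with invented lengths of stay) shows how an “as expected” O/E ratio can hide Excess Days: under-stays and over-stays cancel each other in the averages, but Excess Days counts only the over-stays:

```python
# Hypothetical group of patients. One over-stay and one under-stay cancel
# in the averages, so the O/E ratio looks "as expected."
patients = [
    {"observed": 5, "expected": 3},   # 2 excess days
    {"observed": 1, "expected": 3},   # under-stay: contributes no excess days
    {"observed": 3, "expected": 3},   # exactly as expected
]

n = len(patients)

observed_alos = sum(p["observed"] for p in patients) / n
expected_alos = sum(p["expected"] for p in patients) / n
oe_ratio = observed_alos / expected_alos

# Excess Days count only the days beyond each patient's expected stay;
# staying fewer days than expected does not offset them.
avg_excess_days = sum(max(0, p["observed"] - p["expected"]) for p in patients) / n

print(oe_ratio)                     # 1.0, "as expected" overall
print(round(avg_excess_days, 2))    # 0.67 excess days per patient, hidden by the ratio
```

The asymmetry in the `max(0, ...)` term is the whole story: the ratio averages over-stays and under-stays together, while Excess Days never go negative.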
This layering of information, and the need to drill patiently through those layers to find the truth, is similar to what happens with my LoseIt app. I could have a goal of 14,000 calories consumed, see only 12,000 actual for the week, and get a bit complacent: looks pretty good! When I dig deeper, though, I find that on some days I overindulged (okay, pigged out), and on other days I ate very little. Only through such patient, thorough observation and analysis can I hope to learn how to spread out my calories more evenly and save my sanity (and my waistline).
The same concept is at work with the O/E ratio. On a primary level, it indicates how a facility is doing. In a more profound sense, uncovered only by meticulous digging into the underlying data, it can point us toward clear, effective, easy-to-understand-and-implement opportunities to improve.