
Another Strategy for Determining Common Cause


Published on Quality Digest (http://www.qualitydigest.com)



Cut new windows but try not to upset the daily routine
Davis Balestracci

Published: 11/05/2012

Remember the early days of TQM? When I present to healthcare audiences, all I have to do is mention lab turnaround time to get a collective groan and smile. That was always one of the initial forays into what was called total quality management (TQM) or continuous quality improvement (CQI).

Most initial projects like lab turnaround time quickly turned into the project from hell due to the overcollection of lots of vague data, the result of a huge cause-and-effect (Ishikawa) diagram that answered a vague question such as, "What causes long turnaround times?" It can be so tempting to naively jump right to today's strategy before even knowing where to focus. I'd like to revisit this situation using 20/20 hindsight to teach some lessons. (Of course, today you would first connect any project work to a big dot in the boardroom, right?)

Data strategies No. 1, "Exhaust in-house data," and No. 2, "Study the current process," could have been used to help get an approximate baseline of the problem's extent (both in terms of length of time and occurrence), and some good stratification would help isolate and identify the lab procedures, departments, times of day, or days of the week that account for the majority of "too long" times (whatever that means; some good operational definition work would be helpful, too).

These initial studies of the current process would be based on an agreed global definition of overall lab turnaround time. One possibility might be the elapsed time from the test being ordered to the time the result is available to the patient... or should it be the physician? There had better be agreement.

For deeper issues regarding the measurement process input of this process, it would also be interesting to stratify these results by recording any words on the order, such as "STAT," "STAT!!," "Rush," or "Priority." Do they really make any difference, i.e., result in lower turnaround times?
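As a minimal sketch of what this kind of stratification might look like, the hypothetical snippet below groups turnaround times by the priority label written on the order (all department names, labels, and times are invented for illustration; real projects would use the lab's own data):

```python
from statistics import mean

# Hypothetical lab orders: (department, priority label, turnaround in minutes)
orders = [
    ("Chemistry",  "STAT",     42), ("Chemistry",  "Routine",  95),
    ("Hematology", "STAT",     38), ("Hematology", "Routine",  61),
    ("Chemistry",  "Routine", 120), ("Hematology", "STAT",     55),
]

# Stratify: does a "STAT" label actually correspond to lower turnaround times?
by_priority = {}
for dept, priority, minutes in orders:
    by_priority.setdefault(priority, []).append(minutes)

for priority, times in sorted(by_priority.items()):
    print(f"{priority:8s} n={len(times)}  mean={mean(times):.1f} min")
```

The same `setdefault` pattern could stratify by department, time of day, or day of the week; the point is simply to see which slices of the data account for most of the "too long" times before collecting anything new.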
If a major opportunity is isolated, this overall lab turnaround time would now have to be broken down, or disaggregated, into its various components. Such component times might include transportation to the lab, waiting in the work queue, performing the test, communicating the result to the physician (or hall nurse), or time to get the result to the patient. Also, the definition of each individual component would need to be clear to everyone involved.

So, using the first two common cause strategies, you now have a project baseline and have isolated a major opportunity. You have no doubt also learned how to handle that pesky human variation and realized the need to be quite formal about special data collections. What's next?

Common cause strategy No. 3: Cut new windows (process dissection)


Juran called his third data strategy "cut new windows" (Brian Joiner calls it disaggregation or process dissection, interchangeably). This takes strategy No. 2, "study the current process" (i.e., stratification), one step further by gathering data that are, once again, not routinely collected. The purpose is to focus further, within a major identified opportunity, to identify an even deeper source of variation that is hidden in the aggregate overall performance measure.

The process now must be deeply dissected beyond any routine requirements or easily obtainable data. The good news is that this must be done only for the 20 percent of the process that causes 80 percent of the problem (aka the Pareto principle), which offers a high probability of being productive. However, because of the level of detail needed, this task requires much more disturbance in the daily workflow to get at these data. Unlike "study the current process," where one easily records data that are virtually there for the taking, this next step involves breaking a process into its subprocesses, which will take significant time and effort on the part of the data recorders.
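The Pareto focusing step that precedes any dissection might be sketched as follows, assuming hypothetical counts of "too long" results per test type (the test names and counts are invented for illustration):

```python
# Hypothetical counts of excessively long turnarounds, per test type
excessive = {"CBC": 120, "BMP": 95, "Lipid panel": 12,
             "TSH": 9, "Urinalysis": 8, "Culture": 6}

total = sum(excessive.values())
running, vital_few = 0, []
# Walk test types from worst to best, stopping once ~80% of the
# problem is accounted for (the Pareto "vital few").
for test, count in sorted(excessive.items(), key=lambda kv: -kv[1]):
    running += count
    vital_few.append(test)
    if running / total >= 0.8:
        break

print(vital_few)  # only these test types get the expensive dissection
```

Only the test types in `vital_few` would be subjected to the disruptive subprocess-level data collection; everything else stays on the routine measures.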

Deeper focus
The vital processes have now been dissected into their subprocesses. There might even be an additional factor to consider regarding potential collection. For instance, the previous analyses may have exposed that it is only necessary to collect these data for one or several specifically isolated time periods during the day. Or the analyses might have shown that Mondays or weekends had unique, specific problems vs. a more typical day.

Breaking apart the total time is, in effect, cutting a new window into the process, i.e., dissecting it to look at different perspectives and smaller pieces. And note: This level of detail is applied only to the tests that have been identified as a major problem, using Pareto analysis of the data collected in strategies No. 1 and No. 2: the 20 percent of the test types causing 80 percent of the excessive test times. With such a focused data collection, gathering the required information will involve no more personnel than necessary; data collection in other areas would add little value for the effort expended. A further benefit: This study of the process for lab procedures with the most problematic turnaround times might also ultimately provide some benefit for all procedures.

For planning purposes, previous work on an appropriately detailed flowchart would expose the best leverage points for data collection and allow the creation of an effective data collection sheet.

Use process dissection only after a significant source of variation has been exposed and isolated. Realize that it is a very common error for teams to jump directly to this step at the outset. This not only results in too much data but, more important, it inconveniences people who don't need to be inconvenienced, a mistake I have also made in the past more than I care to admit. The data collection has no ultimate benefit for them, and your credibility as a facilitator becomes suspect.
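A minimal sketch of what cutting a new window might mean for the data itself: assuming the collection sheet records a timestamp at each hypothetical step of one order (the step names and times below are invented), the total turnaround decomposes into component times, and the largest component points to where to look next.

```python
from datetime import datetime

# Hypothetical timestamps recorded for one order (all fields assumed)
ts = {
    "ordered":      datetime(2012, 11, 5, 8, 0),
    "arrived_lab":  datetime(2012, 11, 5, 8, 25),
    "test_started": datetime(2012, 11, 5, 8, 55),
    "test_done":    datetime(2012, 11, 5, 9, 10),
    "reported":     datetime(2012, 11, 5, 9, 40),
}

steps = ["ordered", "arrived_lab", "test_started", "test_done", "reported"]
# Each consecutive pair of timestamps yields one component time, in minutes.
components = {
    f"{a} -> {b}": (ts[b] - ts[a]).total_seconds() / 60
    for a, b in zip(steps, steps[1:])
}

for name, minutes in components.items():
    print(f"{name:28s} {minutes:5.1f} min")
# The largest component is where the dissection should focus next.
```

The component times necessarily sum to the overall turnaround time, which is what makes this a dissection of the aggregate measure rather than a new, unrelated metric.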


As you know, this strategy (actually, any strategy) is a major upset to the daily routine. Make sure that its use has been preceded by the only mildly inconveniencing data strategies No. 1 and No. 2 to isolate a major opportunity on which to focus. Make sure that any work the culture perceives as additional is also perceived as ultimately valuable. This will enhance your reputation as well as create better organizational beliefs about quality improvement.

About The Author

Davis Balestracci
Davis Balestracci is a member of the American Society for Quality and past chair of its statistics division. These thoughts are taken from chapter nine of his book, Data Sanity (Medical Group Management Association, 2009), which has a foreword by Dr. Donald Berwick. It offers a new way of thinking via a common organizational language based in process and understanding variation. Balestracci would love to wake up your conferences with his dynamic style and entertaining insights into the places where process, statistics, organizational culture, and quality meet. Visit his website for more information.

© 2012 Quality Digest Magazine. All Rights Reserved.

Source URL (retrieved on 12/19/2012): http://www.qualitydigest.com/inside/quality-insider-column/another-strategy-determining-common-cause.html

http://www.qualitydigest.com/print/22197

12/20/2012
