Part 1: Complexity of the Real World Has Outpaced the Myth of the Linear Method

Dr. Russell Ackoff spoke often of how we were prepared in school to deal with the real world. We were usually presented with a “case study.” We would busy ourselves reading and developing ideas around the case, then turn it in or present it to the teacher for a grade. Ackoff noted that in the real world, problems do not come in the form of a case study, but as a “mess.” Part of our journey is understanding the mess and developing a statement of the challenge. Laurence J. Peter warns us:

“Some problems are so complex that you have to be highly intelligent and well informed just to be undecided about them.”

The challenge before most of us trying to make improvements is the complexity of the environment in which we are working. These systems are not closed but open and dynamic, subject to influences of which we may or may not be aware. Ramo (2016) draws the distinction between complicated and complex systems:

Complicated mechanisms can be designed, predicted, and controlled. Jet engines, artificial hearts, and your calculator are complicated in this sense. They may contain billions of interacting parts, but they can be laid out and repeatedly, predictably made and used. They don’t change. Complex systems, by contrast, can’t be so precisely engineered. They are hard to fully control. Human immunology is complex in this sense. The World Wide Web is complex.

Complicated systems have the property of being closed systems, while complex systems operate in an open environment that is very dynamic. Many of us working to make improvements in our organizations can readily relate to the idea of a complex system. Into this fray, many of us are given problem-solving methods that are linear. We are presented with the idea that if we follow the method, we will be led to a solution. Dr. Jeff Conklin presented some research around the so-called “Waterfall” method commonly used to develop software. Figure 1 describes this method:

Figure 1: Waterfall Method of Problem Solving

 

In this method, we are to gather data, analyze the data, formulate a solution, and implement it. How does the real world react to this linear path? Conklin presented the experience of one designer following the method. Figure 2 describes how the designer’s perception vacillates between problem and solution over the course of using this method:

Figure 2: Waterfall Method with One Designer

 

Conklin referred to this vacillation as a “wicked journey.” If you have worked on an improvement effort of any complexity, you can appreciate this journey. One day you are filled with hope and a solution; the next day, frustrated by an unintended consequence of your change, you face the challenge of adapting to new circumstances. More work to do!

Life would be good if we could handle complex challenges by ourselves (the lone designer) as in Figure 2. But complex challenges usually require the subject matter knowledge of other people. What happens as we add other people? Do they share our perceptions of the problem and solution? Figure 3 describes the journey with two designers:

Figure 3: Waterfall Method with Two Designers

 

From Figure 3, we can readily see that the perceptions of the two designers track together at times and diverge widely at others. Improvement teams usually have three to five members. Conklin refers to this addition of people as “social complexity.” Personality research tells us that people are very different. Their perceptions of the same events, data, etc. may differ greatly depending on how they learn and their subject matter knowledge.

We had an improvement team in an international tech company that used a method called Understand, Develop Changes, Test Changes, and Implement Changes (UDTI). Within each phase, Plan-Do-Study-Act (PDSA) cycles were used. The team referred to their journey as “wicked.” Figure 4 describes this team’s journey:

Figure 4: Using UDTI with PDSA Cycles to Make an Improvement

 

Following the PDSA cycles in the figure, you can imagine the team’s frustration in PDSA 12, when a rush to implementation led to failure and required a visit back to the “Understand” phase. After this learning, testing was always done before implementation. The six implementation cycles at the end were the spread of known changes to other regional groups.

What is the downside of this vacillation between the stages of the linear method? When an improvement team discovers an unintended consequence of a test, they must go back to a prior stage of the linear model; many see this as a failure. People experienced with improvement efforts understand that when addressing complex challenges, learning and unlearning are natural parts of the journey. However, the same organization that used the UDTI method had one team in Europe that eliminated all the failed PDSA cycles from its account of the improvement journey, forcing a perfect match to the method. Unfortunately, this sort of practice, while it may help self-esteem, has nothing to do with the science of improvement.

People who use such linear methods often discuss this vacillation from the hoped-for path. One of my colleagues is looking for the first project of any complexity that actually follows such a method. So far, we have not found one. Margaret Wheatley once commented on why the myth of success with linear methods continues: “After the fact, people usually report their journey by the prescribed method, thereby reinforcing their use.” Dr. Jeff Conklin and the UDTI team have given us some insight into the use of such methods as they encounter a world of complexity. Hopefully, we won’t be surprised when the real world does not cooperate.

In Part 2, we will examine some methods based on the science of improvement, the importance of questions in addressing complex systems, and help in addressing the social consequences of technical change.

 

References:

1. Conklin, Jeff, “Wicked Problems and Social Complexity” (2008). This paper is Chapter 1 of Dialogue Mapping: Building Shared Understanding of Wicked Problems, by Jeff Conklin, Ph.D., Wiley, October 2005. For more information see the CogNexus Institute website http://www.cognexus.org. © 2001-2008 CogNexus Institute. Rev. Oct 2008.

2. Ramo, Joshua Cooper, The Seventh Sense: Power, Fortune, and Survival in the Age of Networks (p. 137), Little, Brown and Company, 2016.

3. Wheatley, Margaret, Leadership and the New Science, Berrett-Koehler Publishers, San Francisco, 1992. Note: In searching this book, we were not able to locate the quote from Wheatley. In communication with her staff, we were told to attribute it to her. The reader may find this reference useful.

 


Time to Retire the 16th Century Root Cause Phrase and Thinking  

Systems thinking has destroyed the idea of single-cause thinking, a holdover from the 16th century. Systems thinking has been on a roll since von Bertalanffy wrote General System Theory in 1968. In spite of systems thinking, the use of the “root cause” phrase persists.

Psychologically, there is an upside and a downside to the phrase “Root Cause Analysis” (RCA). The upside: the illusion of single-cause thinking gives people hope. It sends the message that one thing is going on and they can handle that. The downside: the mental image of a “root cause” leads people to hunt for a single cause. I once watched in horror as a Master Black Belt (MBB) led a group of engineers in a high-tech company through a multi-voting exercise on an Ishikawa Diagram. Once the MBB had all the votes, they focused on the “top cause.” A short plan was put together to investigate this single “cause.” Since I was visiting, I was silent until someone asked me what I thought. I asked a question about the possible covariance of factors for the application being discussed. After one engineer confirmed that the factors do indeed interact, the group got back to reality. Rather than one factor, they needed to consider multiple factors in a designed experiment.

There is hope. People are waking up! In 2015, the National Patient Safety Foundation exposed many of the problems with the myth of “Root Cause Analysis.” From the report:

"RCA itself is problematic and does not describe the activity’s intended purpose. First, the term implies that there is one root cause, which is counter to the fact that health care is complex and that there are generally many contributing factors that must be considered in understanding why an event occurred. In light of this complexity, there is seldom one magic bullet that will address the various hazards and systems vulnerabilities, which means that there generally needs to be more than one corrective action. Second, the term RCA only identifies its purpose as analysis, which is clearly not its only or principal objective, as evidenced by existing regulatory requirements for what an RCA is to accomplish. The ultimate purpose of an RCA is to identify hazards and systems vulnerabilities so that actions can be taken that improve patient safety by preventing future harm.

"The term RCA also seems to violate the Chinese proverb “The beginning of wisdom is to call things by their right names,” and this may itself be part of the underlying reason why the effectiveness of RCAs is so variable. While it might be better not to use the term RCA, it is so imbedded in the patient safety culture that completely renaming the process could cause confusion."

 

The last line is tragic; unlearning is usually the first step in learning for many (some try to avoid it at all costs). With this line, the authors are in effect protecting people from learning. Cognitive dissonance is a natural part of how we learn, adapt, and change. The paper on RCA2 can be found here:


http://c.ymcdn.com/sites/www.npsf.org/resource/resmgr/PDF/RCA2_v2-online-pub_010816.pdf

The effort to restore systems thinking and 21st-century science continued in February 2017 with the publication of Rethinking Root Cause Analysis by Kiran Gupta, MD, MPH, and Audrey Lyndon, PhD. This paper has some great tables that describe the various problems associated with RCA. The authors work with reference to the 2015 paper cited above. Their paper can be found here:

https://psnet.ahrq.gov/perspectives/perspective/216


Six Scaling Mantras

Bob Sutton of Stanford University is a first-rate thinker and author. One of my favorite books by Bob is Hard Facts, Dangerous Half-Truths, and Total Nonsense. Well worth a read. Recently, Bob and his co-author Huggy Rao have turned their talents to scaling up change. The new book is called Scaling Up Excellence: Getting to More Without Settling for Less. In the very first part of the book, Sutton details six mantras of scaling up:

 

  1. Spread a mindset, not just a footprint. Running up the numbers and putting your logo on as many people and places as possible isn’t enough.
  2. Engage all the senses. Bolster the mindset you want to spread with supportive sights, sounds, smells, and other subtle cues that people may barely notice, if at all.
  3. Link short-term realities to long-term dreams. Hound yourself and others with questions about what it takes to link the never-ending now to the sweet dreams you hope to realize later.
  4. Accelerate accountability. Build in the feeling that “I own the place and the place owns me.”
  5. Fear the clusterfug. The terrible trio of illusion, impatience, and incompetence are ever-present risks. Healthy doses of worry and self-doubt are antidotes to these three hallmarks of scaling clusterfugs.
  6. Scaling requires both addition and subtraction. The problem of more is also a problem of less.

I found #5 to be very descriptive of some famous failures in business. Sutton and Rao go into more detail on the “trio” of illusion, impatience, and incompetence: 

  • Illusion: Decision makers believe that what they are scaling up is far better and easier to spread than the facts warrant.

  • Impatience: Decision makers believe that what they are scaling is so good and easy to spread that they rush to roll it out before it is ready, they are ready, and the organization is ready.

  • Incompetence: Decision makers lack the requisite knowledge and skill about what they are spreading and how to spread it.

     

If you have not read Sutton before, I think you will find the read informative and entertaining. A great combination.

Reference: Sutton, Robert I.; Rao, Huggy (2014). Scaling Up Excellence: Getting to More Without Settling for Less (p. 25). Crown Publishing Group. Kindle Edition.
