Information at the Gemba:


Uncertainty is all around us. A lean leader’s main purpose is to develop people to tackle uncertainty. There are two ways to tackle it: one is Genchi Genbutsu (go and see), and the other is the scientific method of PDCA. Claude Shannon, the father of Information Theory, viewed information as the possible reduction in uncertainty in a system. In other words, larger uncertainty presents a larger potential for new information. This can be expressed with the following simple relation:

New information gained = Reduction in uncertainty
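To make this concrete, here is a minimal sketch in Python (the probabilities are made-up numbers for illustration) of Shannon entropy, H = -Σ p·log2(p), and of information gain as the drop in entropy once an observation narrows down the possibilities:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before observing: four equally likely outcomes -> maximum uncertainty.
h_before = entropy([0.25, 0.25, 0.25, 0.25])   # 2.0 bits

# After observing: one outcome is now almost certain -> little uncertainty left.
h_after = entropy([0.97, 0.01, 0.01, 0.01])    # ~0.24 bits

# Information gained = reduction in uncertainty.
print(f"Information gained: {h_before - h_after:.2f} bits")   # ~1.76 bits
```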

Shannon called this uncertainty entropy, on the advice of his friend John von Neumann, a mathematical genius and polymath. The entropy in information theory is not exactly the same as the entropy in thermodynamics. They are similar in that both measure a system’s degree of disorganization. In this regard, information can be viewed as a measure of a system’s degree of organization. Shannon recalled his conversation with von Neumann as follows:

“My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’”

I love von Neumann’s assurance that Shannon would always have the advantage in a debate, since “nobody knows what entropy really is”.

In this post, I am not going into the mathematics of Information Theory. In fact, I am not even going to discuss Information Theory itself, but rather the philosophical lessons from it. From a philosophical standpoint, Information Theory presents a different perspective on problems and failures at the gemba. When you run an experiment and the results simply confirm your hypothesis, you learn very little. However, when the results do not match your hypothesis, there is new information available to you. Thus, failures and similar challenges are opportunities to gain new information about your process.
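In Shannon’s terms, the information carried by a single outcome is its “surprisal”, -log2(p): a result you fully expected carries almost no information, while a result that contradicts your hypothesis carries a great deal. A small illustration (the probabilities are hypothetical):

```python
import math

def surprisal(p):
    """Information content, in bits, of an outcome that had probability p."""
    return -math.log2(p)

# A result you expected with 95% confidence teaches you almost nothing...
print(f"{surprisal(0.95):.2f} bits")   # ~0.07 bits

# ...while a result you thought had only a 5% chance teaches you a lot.
print(f"{surprisal(0.05):.2f} bits")   # ~4.32 bits
```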

Here are seven lessons I have drawn from this perspective:

  • Information Gain ≠ Knowledge Gain:

An important caveat about the information available at the gemba is that information does not automatically translate to knowledge. Information is objective in nature and consists of facts. It gets translated to knowledge when we apply our available mental models to it. This means that, depending on the receiver, much can potentially be lost. A good analogy is Sherlock Holmes and Dr. Watson at the crime scene – they are both looking at the same information, but Holmes is able to deduce more from it.

  • Be Open:

When you assume full knowledge about a process, you are unwilling to gain knowledge from any new information that becomes available. You should be open to possibilities in order to welcome new information, and thus a chance to learn something new. Sometimes, by being open to others’ viewpoints, you can learn new things. They may have far more experience and more opportunities for information than you do.

  • Go to the Gemba:

More often than not, the source of information is the gemba. When you do not go to the source, the information you receive is no longer pure; it has been contaminated with the subjective perspectives of the informer. You should go to the gemba as often as you can. The process is giving out information at all times.

  • Exercise Your Observation Skills:

As I mentioned above in the Holmes and Watson analogy, what you can gain from the information presented depends on your ability to identify it. There is a lot of noise in the information you get, and you have to weed out the noise and look at the core information available. One of my favorite definitions of information is by the famous cybernetician Gregory Bateson. He defined information as “a difference that makes a difference.” The ability to spot the difference that matters depends mostly on your skill set. Go to the gemba more often and sharpen your observation skills. Ask “for what purpose?” and “what is the cause?” more often.

  • Go Outside Your Comfort Zone:

One of the lessons in lean that does not get a lot of attention is “go outside your comfort zone”. This is the essence of Challenge in the Continuous Improvement pillar of the Toyota Way. When you stay inside your comfort zone, you are not willing to gather new information. You get stuck in your ways and trust a degrading mental model rather than challenging and nourishing it so that you can develop yourself. Failure is a good thing once you understand that it represents new information that can help you understand the uncertainties in your process. You will not try new things unless you go outside your comfort zone.

  • Experiment Frequently:

You learn more by exposing yourself to more chances of gaining new information, and you do this by experimenting more often. The scientific process is not a single loop of PDCA (Plan-Do-Check-Act); it is an iterative process, and you need to experiment frequently and learn from the feedback. (A minimal sketch of such a loop follows this list.)

  • Challenge Your Own Perspective:

The Achilles’ heel for a lean leader is confirmation bias. He may go to the gemba often, and he may experiment frequently, but unless he challenges his own perspective, his actions may not be fruitful. My favorite question to challenge my perspective is “What evidence would invalidate my viewpoint right now, and does the information I have hint at it?” Questions like this ensure that your interpretation of the information you are getting is less tainted.
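As promised under “Experiment Frequently” above, here is a minimal sketch of an iterative PDCA loop. The function names and stopping rule are my own illustration of the idea, not a prescription:

```python
def pdca(initial_plan, do, check, act, max_cycles=10):
    """Iterate Plan-Do-Check-Act, feeding each cycle's lessons into the next plan."""
    plan = initial_plan
    for _ in range(max_cycles):
        result = do(plan)           # Do: run the experiment
        gap = check(plan, result)   # Check: compare the outcome with the prediction
        if gap is None:             # No gap means no new information this cycle;
            break                   # standardize and stop for now.
        plan = act(plan, gap)       # Act: revise the plan based on what was learned
    return plan
```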

I will finish off with a funny story I heard about Sherlock Holmes and Watson:

Sherlock Holmes and Dr. Watson decided to go on a camping trip. All the way to the campsite, Holmes gave observation lessons to Dr. Watson and challenged him. After dinner and a bottle of wine, they lay down for the night and went to sleep.

Some hours later, Holmes awoke and nudged his faithful friend.

“Watson, look up at the sky and tell me what you see.”

Watson replied, “I see millions of stars.”

“What does that tell you?” Holmes asked.

Watson pondered for a minute.

“Astronomically, it tells me that there are millions of galaxies and potentially billions of planets.”
“Astrologically, I observe that Saturn is in Leo.”
“Horologically, I deduce that the time is approximately a quarter past three.”
“Theologically, I can see that God is all powerful and that we are small and insignificant.”
“Meteorologically, I suspect that we will have a beautiful day tomorrow.”
“What does it tell you, Holmes?” Watson asked.

Holmes was silent for a minute, then spoke: “Watson, you idiot. Someone has stolen our tent!”

Always keep on learning…

In case you missed it, my last post was The Pursuit of Quality – A Lesser Known Lesson from Ohno.

Confirmation Bias – Colbert and Sagan Edition:


I discussed confirmation bias in an earlier post here. In this post, I hope to bring astrophysicist Carl Sagan and comedian Stephen Colbert together, and end with a Zen story.

Wikipedia defines confirmation bias, also called myside bias, as “the tendency to search for, interpret, favor, and recall information in a way that confirms one’s beliefs or hypotheses while giving disproportionately less attention to information that contradicts it.”

Confirmation bias can put the brakes on your scientific thinking, and avoiding your biases is a daily struggle.

The Colbert Report Edition:

I recently came across a study performed by LaMarre, Landreville, and Beam of Ohio State University, in which the authors investigated the biased message processing of political satire on the famous TV show “The Colbert Report”. For those who do not know it, “The Colbert Report” was a political satire show hosted by Stephen Colbert. Colbert referred to his fictional character as a “well-intentioned, poorly informed, high-status idiot”, a caricature of televised political pundits.

The researchers examined how political ideology influenced perceptions of Stephen Colbert, and they described his style of comedy as “ambiguous deadpan satire”. The study revealed the following:

  • No significant difference existed between conservatives and liberals regarding whether Stephen Colbert was funny.
  • Conservatives reported that Colbert was only pretending to joke and genuinely meant what he said, supporting their conservative ideology. Liberals, on the other hand, reported that Colbert was using satire and was not serious, supporting their liberal ideology.

In other words, liberals and conservatives with extreme viewpoints watched the exact same show and came away with exactly opposite opinions. This is a classic case of confirmation bias!

Carl Sagan and the Fine Art of Baloney Detection:

Carl Sagan was a famous American astrophysicist and a great scientific thinker. In his book The Demon-Haunted World: Science as a Candle in the Dark, Sagan provides us a thinking toolkit that will assist us in detecting baloney, as he puts it. He refers to it as a means to construct, and to understand, a reasoned argument and – especially important – to recognize a fallacious or fraudulent argument. The tools are as follows:

  • Wherever possible there must be independent confirmation of the “facts.”
  • Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  • Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  • Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  • Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  • If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
  • If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  • Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
  • Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.

Surprisingly, the list above is also applicable to detecting and reducing confirmation bias.

A Cup of Tea – A Zen Story:

There once lived a great Zen master, Nan-in. The reputation of his wisdom spread, and a university professor decided to visit him to inquire about Zen.

The professor was welcomed into Nan-in’s room. Nan-in served the professor tea.

The professor’s cup was soon full, and yet Nan-in kept pouring, causing the cup to overflow. Still, Nan-in kept on pouring.

“Master, please stop. The cup is full. There is no more room for more tea.”

“Like this cup,” Nan-in said, “your brain is full of your opinions and biases. There is no more room for Zen unless you first empty it.”

Final Words:

I will finish off with a great piece of wisdom I heard on Quora. Unfortunately, I do not know the source.

“My opinions are not me.  My opinions are just pieces of data that I carry in a box with me.  I can and should change them based on the information available.  If I marry myself to my opinions, I will cling to them regardless of what the information says.  If I want to be right, I need to be prepared to change my mind.” 

Always keep on learning…

Photo credit – Paul H. Byerly

The Greatest Barrier to Scientific Thinking:


If one were to ask me what I am afraid of as an engineer, I would unequivocally declare “confirmation bias”.

“The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it.”

– Francis Bacon, Novum Organum, 1620

Confirmation bias is part of everybody’s thinking process. When confronted with a problem, one has to determine how to solve it. The first step is to analyze the problem, and this requires looking inward and finding a mental model that might explain the problem at hand. If such a pattern is available, one tries to fit the problem into the model, as if it were a suit tailored to fit the body of the problem. This is a form of deductive thinking.

In the absence of a pattern, one tries to gather further information to form a new mental model. The newly created model may fit the problem much better. This is a form of inductive thinking.

Sometimes, in the absence of a pattern, one might try to find multiple mental models and see which model fits the problem the best. This is a form of abductive thinking.

No matter which form of thinking is used, the problem occurs when one tries to find evidence to prove the model and ignores any evidence that might prove it wrong. This is the curse of confirmation bias. It can create a blind spot that is sometimes large enough to hide an elephant!

“When men wish to construct or support a theory, how they torture facts into their service!”

– Charles Mackay, Extraordinary Popular Delusions and the Madness of Crowds, 1852

This creates quite a challenge for any activity involving thinking, such as problem solving or decision making. I have attempted to create a list of steps that one can use to minimize the impact of confirmation bias. I will be the first to tell you that this is a daily struggle for me.

  • Be aware that confirmation bias exists:

The first step is to be aware that confirmation bias is part of who we are. Just being aware of this can help us ask the right questions.

  • Walk the process:

Walking the process allows us to understand the big picture and helps us see the problem from other people’s perspectives. If a problem is identified on the floor during assembly, it helps to walk the process with the component, starting at the receiving dock and going all the way to assembly on the floor. This slows us down, and we may notice things counter to our initial hypothesis that we might otherwise have missed.

  • Can you turn the problem on and off?:

When a problem occurs, either in the field or on the production floor, I always try to see if I can turn the problem on and off. This helps to clarify my mental model and validate my thinking. The cause alone does not result in the effect; the cause, in the presence of enabling conditions, creates the effect. Understanding the enabling conditions helps us turn the problem on and off.

  • Challenge yourself to disprove your model:

Challenging yourself to disprove your own model is probably the most challenging yet most effective way to combat confirmation bias. It is, after all, easier to disprove a theory than to prove it. This helps to purify one’s thinking.

In a recent conversation, my brother-in-law brought up the “tenth man” scene from the movie “World War Z”. In the movie, the whole world is under attack from a zombie virus. Israel had built a wall around the nation that prevented the outbreak, up to a certain point in the movie. This was achieved through a policy referred to as the “tenth man”: if nine out of ten people in a council agree on something, the tenth person must take the opposite side, no matter how improbable it might seem.

  • Understanding the void:

My first blog post here was about understanding the void. This is similar to the idea of negative space in design. The information that is not there, or not obvious, can sometimes be valuable. Looking for the negative space again helps us see the big picture.

In the short story “Silver Blaze”, Sherlock Holmes draws attention to the “curious incident of the dog in the night-time.” Because the dog did not bark, Holmes deduced that the crime was committed by somebody the dog knew.

Gregory (Scotland Yard detective): “Is there any other point to which you would wish to draw my attention?”

Holmes: “To the curious incident of the dog in the night-time.”

Gregory: “The dog did nothing in the night-time.”

Holmes: “That was the curious incident.”

I will finish this post off with a Zen story.

There was a great Zen teacher. Some of his disciples came to him one day to complain about the villagers.

They told him that the villagers were spreading rumors that the teacher was immoral, and that his followers were not good people. They asked him what they should do.

“First, my friends,” he responded, “you should definitely consider whether what they say is true or not.”

Always keep on learning…