Towards or Away – Which Way to Go?

In today’s post I am pondering the question – as a regulator, should you be going towards or away from a target? Are the two things the same? I will use Erik Hollnagel’s ideas here. Hollnagel is a Professor Emeritus at Linköping University who has written extensively on safety management. He challenges the common framing of safety management as driving toward zero accidents. He notes:

The goal of safety management is obviously to improve safety. But for this to be attainable it must be expressed in operational terms, i.e., there must be a set of criteria that can be used to determine when the goal has been reached… the purpose of an SMS is to bring about a significant reduction – or even the absence – of risk, which means that the goal is to avoid or get away from something. An increase in safety will therefore correspond to a decrease in the measured output, i.e., there will be fewer events to count. From a control point of view that presents a problem, since the absence of measurements means that the process becomes uncontrollable.

He identifies this as a problem from a cybernetics standpoint. Cybernetics is the art of steersmanship: a controller identifies a target, and a regulator works on getting to that target. A feedback loop ensures that when the difference between the actual condition and the target exceeds a preset value, the regulator tries to bring the difference down. Take the example of the steersman of a boat – the steersman propels the boat to the required destination by steering it. If there is a strong wind, the steersman adjusts accordingly so that the boat is always moving towards the destination. The steersman is continuously measuring the deviation from the expected path and correcting for it.
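The steersman’s feedback loop can be sketched as a simple proportional controller. This is a minimal illustration of the idea, not something from Hollnagel’s text; the function and parameter names (`heading`, `target`, `wind`, `gain`) are my own:

```python
def steer(heading, target, wind, gain=0.5, steps=20):
    """Proportional feedback loop: at each step the steersman measures
    the deviation from the target course and corrects a fraction of it,
    while the wind keeps pushing the boat off course."""
    for _ in range(steps):
        error = target - heading           # measured deviation
        heading += gain * error + wind     # correction plus disturbance
    return heading

# The heading settles near the target despite the wind (about 94.0 for
# a target of 90.0 -- the small constant offset caused by the steady
# wind is a known trait of purely proportional control).
final_heading = steer(heading=0.0, target=90.0, wind=2.0)
```

With `gain=0.0` (no feedback at all), the wind simply carries the boat further off course every step – which is the point of the loop: regulation depends on continuously measuring the difference and acting on it.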

Hollnagel continues with this idea:

Quantifying safety by measuring what goes wrong will inevitably lead to a paradoxical situation. The paradox is that the safer something (an activity or a system) is, the less there will be to measure. In the end, when the system is perfectly safe – assuming that this is either meaningful or possible – there will be nothing to measure. In control theory, this situation is known as the ‘fundamental regulator paradox’. In plain terms, the fundamental regulator paradox means that if something happens rarely or never, then it is impossible to know how well it works. We may, for instance, in a literal or metaphorical sense, be on the right track but also precariously close to the limits. Yet if there is no indication of how close, it is impossible to improve performance.

The idea of the fundamental regulator paradox was put forward by Gerald Weinberg. He described it as:

The task of a regulator is to eliminate variation, but this variation is the ultimate source of information about the quality of its work. Therefore, the better job a regulator does, the less information it gets about how to improve.

Weinberg noted that the better the regulator gets at what it is doing, the more difficult it becomes to improve. If we go back to the case of the steersman, perfect regulation would mean the steersman making adjustments at superhuman speed so that the boat travels in a straight line from start to end. Weinberg points out that this is not possible. If 100% regulation were achieved, we would also be cutting off contact with the external world – the very source of information that the regulator needs to do its job.
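The mechanism behind the paradox can be made concrete with a small simulation. This is my own sketch, not Weinberg’s: a regulator cancels a fraction (`gain`) of each random disturbance, and we look at the spread of what remains visible afterwards.

```python
import random

def residual_spread(gain, trials=1000, seed=42):
    """Cancel a fraction `gain` of each random disturbance and return
    the standard deviation of what remains visible. That residue is
    the only feedback the regulator has about its own quality."""
    rng = random.Random(seed)
    left = [(1 - gain) * rng.gauss(0, 1) for _ in range(trials)]
    mean = sum(left) / trials
    return (sum((x - mean) ** 2 for x in left) / trials) ** 0.5
```

A sloppy regulator (`gain=0.5`) still sees plenty of variation; a near-perfect one (`gain=0.9`) sees far less; a perfect one (`gain=1.0`) sees exactly zero – and with zero residual variation there is nothing left to learn from.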

Coming back to the original question of “away from” or “towards”, Hollnagel states:

From a control perspective it would make more sense to use a definition of safety such that the output increases when safety improves. In other words, the goal should not be to avoid or get away from something, but rather to achieve or get closer to something.

While pragmatically it seems very reasonable that the number of accidents should be reduced as far as possible, the regulator paradox shows that such a goal is counterproductive in the sense that it makes it increasingly difficult to manage safety… The essence of regulation is that a regulator makes an intervention in order to steer or direct the process in a certain direction. But if there is no response to the intervention, if there is no feedback from the process, then we have no way of knowing whether the intervention had the intended effect.

Hollnagel advises that we should see safety in terms of resilience – not as the absence of something (accidents, missed days, etc.) but rather as the presence of something.

Based on this discussion we can see that “moving towards” is a better approach for a regulator than “moving away” from something. From a management standpoint, we should refrain from enforcing overly strict policies in the hope of perfect regulation. Such policies would lack the variety needed to absorb the external variety thrown at us. We should allow room for some noise in our processes. As the variety of the situation increases, we should stop setting targets and instead provide a direction to move towards. Setting a hard target is again an attempt at perfect regulation, one that can stress the various elements within the organization.

I will finish with some wise words from Weinberg:

The fundamental regulator paradox carries an ominous message for any system that gets too comfortable with its surroundings. It suggests, for instance, that a society that wants to survive for a long time had better consider giving up some of the maximum comfort it can achieve in return for some chance of failure or discomfort.

Please maintain social distance and wear masks. Please get vaccinated, if able. Stay safe and always keep on learning…

In case you missed it, my last post was The Cybernetics of the Two Wittgensteins:


  1. The Trappers’ Return, 1851. George Caleb Bingham
  2. Safety management – looking back or looking forward – Erik Hollnagel, 2008
  3. On the design of stable systems – Gerald Weinberg, 1979

Know Your Edges:

In today’s post I will start with a question: “Do you know your edges?”

Edges are boundaries where a system or a process (depending upon your construction) breaks down or changes structure. Our preference, as the manager or the owner, is to stay in our comfort zone – a place where we know how things work, where we can predict how things will go, where we have the most certainty. Take your daily commute to work as a simple example – chances are high that you always take the same route. This is what you know, and you have high certainty about how long it will take you to get to work. Counterintuitively, the more certainty you have about something, the less information you have to gain from it. Our natural tendency is to seek more certainty, and we hate uncertainty. We think of uncertainty as a bad thing. If I may use a metaphor, uncertainty is like medicine – you need it to stay healthy!
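The claim that certainty carries little information has a precise counterpart in Shannon’s information theory. As a small illustration of my own (not from the post), the information gained from observing an outcome is the surprisal, −log₂(p):

```python
import math

def surprisal(p):
    """Shannon information (in bits) gained by observing an outcome of
    probability p: the more certain the outcome, the less we learn."""
    return -math.log2(p)

# The routine commute (p = 0.99) yields ~0.014 bits of information;
# an unexpected detour (p = 0.01) yields ~6.6 bits. Uncertainty is
# where the information is.
```

A fair coin flip (p = 0.5) yields exactly 1 bit; a near-certain event yields almost none.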

To discuss this further, I will look at the concept of variety from cybernetics. Variety is a concept put forth by William Ross Ashby, a giant in the world of cybernetics. Simply speaking, variety is the number of possible states. A stop light generally has three states (red, yellow and green); in other words, the stop light’s variety is three (ignoring flashing red and no light). With this, it is able to control traffic. When the stop light is able to match the ongoing traffic, everything is smooth. But when the volume of traffic increases, the stop light is not able to keep up, and the system reacts by slowing the traffic down. This illustrates that the variety in the environment is always greater than the variety available internally. The external variety also equates to uncertainty. Scaling up, let’s look at a manufacturing plant. There, the uncertainty comes in the form of the 6Ms (Man, Machine, Method, Material, Measurement and Mother Nature). The manager’s job is to reduce the uncertainty. This is done by filtering the variety imposed from the outside, magnifying the variety that is available internally, or looking at ways to increase the requisite variety. Ashby’s Law of Requisite Variety can be stated as: “only variety can absorb variety.”
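Ashby’s law has a simple quantitative form: a regulator with R distinct responses facing D distinct disturbances can at best collapse the outcomes into ⌈D / R⌉ classes. A minimal sketch of this bound (the function name and the traffic numbers are illustrative):

```python
def best_outcome_variety(disturbance_variety, regulator_variety):
    """Ashby's law in miniature: a regulator with R responses facing D
    distinct disturbances can at best collapse the outcomes into
    ceil(D / R) classes. Only variety can absorb variety."""
    return -(-disturbance_variety // regulator_variety)  # ceiling division

# A 3-state stop light facing 12 distinct traffic situations can at
# best reduce them to 4 outcome classes -- it cannot force one smooth
# outcome unless its own variety grows.
```

Only when the regulator’s variety matches the disturbances (R ≥ D) can the outcome variety be driven down to one – the smooth traffic flow the stop light is meant to produce.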

All organizations are sociotechnical systems. This also means that in order to sustain themselves, they need to be complex adaptive systems, and in order to improve adaptability, the system needs to keep learning. It may be counterintuitive, but uncertainty is required for a complex adaptive system to keep learning and to maintain the requisite variety to sustain itself. Thus, the push to stay away from uncertainty – to stay in the comfort zone – can actually be detrimental. Metaphorically, staying in the comfort zone is staying away from the edges, where there is more uncertainty. After a basic level of stability is achieved, there is not much information available in the center (away from the edges). Since the environment is always changing, the organization has to keep updating its information to adapt and survive. This means that the organization should engage in safe-to-fail experiments and move away from its comfort zone to keep updating its information. The organization has to know where the edges are, and where the structures break down. Safe-to-fail experiments increase the solution space of the organization, making it better suited for challenges. These experiments are fast, small and reversible, and are meant to increase the experience of the organization without undue risk. The organization cannot remain static and has to change with time. Experimentation away from the comfort zone provides direction for growth. It also shows where things can get catastrophic, so that the organization can be better prepared and steer away from that direction.

This leads me to the concept of the “fundamental regulator paradox”, developed by Gerald Weinberg, an American computer scientist. When a system gets really good at what it does and nothing ever goes wrong, it becomes impossible to tell how well it is working. When strict rules and regulations are put in place to maintain “perfect order”, they can actually produce the opposite of what they were originally meant for. The paradox is stated as:

The task of a regulator is to eliminate variation, but this variation is the ultimate source of information about the quality of its work. Therefore, the better job a regulator does, the less information it gets about how to improve.

This concept also tells us that trying to stay in the comfort zone is never good and that we should not shy away from uncertainty. Exploring away from the comfort zone is how we can develop the adaptability and experience needed to survive.

Final Words:

This post is a further expansion from my recent tweet.

Information is most rich at the edges. Information is at its lowest in the center. Equilibrium also lies away from the edges.

The two questions, “How good are you at something?” and “How bad are you at something?” may be logically equivalent. However, there is more opportunity to gain information from the second question since it leads us away from the comfort zone.

I will finish with a lesson from one of my favorite TV detectives, D.I. Richard Poole from Death in Paradise.

Poole noted that solving a murder was like solving a jigsaw puzzle. One has to work from the corners, then the edges, and then move towards the middle. Then everything falls into place and starts to make sense.

Always keep on learning…

In case you missed it, my last post was Bootstrap Kaizen: