April 21, 2020

Systems Thinking episode #3: Mental Models

In this series of episodes, I am focusing on Systems Thinking. In the previous episodes of this series, I elaborated on what Systems Thinking is, on the wicked problems Systems Thinking tries to solve, and on the behaviour of Complex Adaptive Systems (CAS). In this episode, I will show how understanding mental models can significantly improve the abilities of leaders such as Scrum Masters, Agile Coaches and Managers. This is because mental models are at the core of knowledge and learning.

Introduction

We always and continuously use models. Models are the basic building block for knowledge and learning. Some examples of the countless models you are familiar with:

  • our solar system
  • the dinosaurs
  • an org chart
  • road signs
  • cats
mental models

Image by Roland Flemm and Jon Ingham

What is a “Mental Model”?

A mental model is a representation of reality we construct in our brain. In the list of examples above, the cat may seem a bit odd. Cats are on the list because we cannot put a real cat in our head: the concept of “a cat” we construct in our brain is a model. We create an understanding of what we know a cat to be: its properties, abilities, relation to other concepts, etc. Anything relevant you know about cats and store in your brain makes up your mental model of cats.

In summary: our brain stores any piece of information in the form of models. Learning is comparing these mental models with reality or with other models and processing the feedback we get from that comparison. We use this feedback to adapt our mental model.

Why is this important to us?

Everybody most probably has a different mental model of what Scrum actually is. The differences in understanding the Scrum framework are especially noticeable at moments when we are triggered to compare our mental models of Scrum. I trigger such moments in my PSM-I classes when I ask people to draw a picture of what Scrum is to them. The same happens when Scrum Masters get a new team, when a team gets new members, etc. Such moments are great opportunities to align our mental models of Scrum; in other words, to start learning.

George Box once said, “All models are wrong, but some are useful”. This implies everybody is always wrong about everything (and that is a comforting thought). And it gets worse: not only are the models imperfect, everybody also has a different approximation of that model in their head! Understanding the imperfection of mental models makes us aware that no one is simply right or wrong and that everybody works with an approximation of the truth. In other words: this understanding makes us empathic. It increases our ability to observe concepts from multiple angles.

A model changes all the time

solar system

Pearson Scott Foresman

The model of our solar system has changed quite a bit over the years. When it was first conceived, planet Earth was at the centre of the system. Later versions resembled clocks, mirroring the latest invention of their time. In the same way, the Scrum framework has evolved over the years to better represent the dynamics of empirical process control in product development. When many bright minds compare their mental models of a concept (Scrum, for instance), the general perception of the “current, slightly less bad approximation” of reality evolves.

Models and bias

There is a problem if our models do not evolve. This might happen when we are not aware of having mental models (also called “reality bias”). Even if we do have a clear idea of our model, we can get locked into our own ideas and match our observations to our model. This is called “confirmation bias”: seeing only what we want to see because it matches what we already know. What also makes models difficult is that there are often many competing models describing the same subject. Think of agile scaling methodologies, the theories behind the extinction of the dinosaurs, economic models, etc.

Learning

The relationship between Systems Thinking and mental models lies in aligning our current mental models more closely with the real world. Exploring mental models helps us understand the complexity of the world around us. Models also provide a way to convey our ideas and communicate about them.

Processing feedback on your mental model is learning. The real world will provide feedback in the form of information that can:

  • help you to determine the viability of your mental model,
  • select the best mental model among a range of options, or
  • inform you how to adapt your mental model.

Using mental models: the Iceberg

iceberg

The tip of the iceberg is above sea level. This is where we can observe events, and when events occur, we can react to them. The larger part of an iceberg is below sea level, which nicely represents how much is happening around us that is not directly observable. When an event occurs multiple times in a row, we call it a pattern. Seeing patterns allows us to predict behaviour. One level deeper lie the system structures that make these patterns emerge. When we think at the level of system structures, we can design structures (such as organisational policies or frameworks). The lowest level in this model is the mental model level: the ideas and beliefs, the understanding of things, that led to creating the system structures. When we understand mental models and can change them, we can transform the system they feed.
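The four levels can be sketched as a simple data structure, ordered from most visible to deepest leverage. This is only an illustrative summary: the level names follow the iceberg model, but the action verbs and example entries are my own labels, not part of the model.

```python
# The four iceberg levels, ordered from most visible (events) to
# deepest leverage (mental models). Actions and examples are illustrative.
ICEBERG_LEVELS = [
    ("events",            "observe and react",      "a bug is reported"),
    ("patterns",          "anticipate and predict", "stability bugs recur every release"),
    ("system structures", "design and redesign",    "sales reps rewarded per closed deal"),
    ("mental models",     "question and transform", "'pressure makes developers faster'"),
]

def leverage(level_name):
    """Deeper levels offer more leverage: 1 (events) up to 4 (mental models)."""
    names = [name for name, _, _ in ICEBERG_LEVELS]
    return names.index(level_name) + 1

for name, action, example in ICEBERG_LEVELS:
    print(f"{leverage(name)}. {name}: {action} (e.g. {example})")
```

The ordering captures the core idea: reacting to an event changes little, while changing a mental model transforms everything built on top of it.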

Let’s have a look at a concrete example to see how the iceberg model works.

iceberg2

A bug gets reported. This is an event, and we can react to it: we fix the bug! Or the Product Owner gets upset with the team and complains about their sloppiness.

When multiple system stability bugs are reported, we discover a pattern: the same event happens multiple times, and this is not a coincidence. Once we see the pattern, we can predict: if we do nothing, more stability bugs will be reported in the next release!
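The jump from event to pattern is essentially noticing recurrence. A minimal sketch of that idea, with a hypothetical event log and an arbitrary threshold of my own choosing:

```python
from collections import Counter

def detect_patterns(events, threshold=3):
    """Flag event types that recur often enough to be treated as a pattern."""
    counts = Counter(events)
    return {event for event, n in counts.items() if n >= threshold}

# Hypothetical bug reports from one release.
release_events = ["stability bug", "UI glitch", "stability bug",
                  "stability bug", "performance bug"]
print(detect_patterns(release_events))  # {'stability bug'}
```

A single stability bug is just an event; three of them cross the (here arbitrary) threshold and become a pattern worth investigating one level deeper.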

But how do these bugs come about? Developers don't introduce bugs on purpose. It is probably an organisational structure that leads developers to make mistakes. In this case, the company's system structure is that sales reps are rewarded per closed deal. This leads to a flood of custom system changes and custom features. The ever-rising demand for new features, and the fact that the salespeople have already promised them to clients, pressures the development teams, resulting in faulty, poorly designed software. Now that we know this, we could design a system structure that produces different effects. For example, we could reward sales reps for selling standard features rather than custom ones. These changes will alter the system, but it probably won't take long before new software problems pop up.
Why? Because of the underlying mental models: the owners of this company “know” that software development is easy and that more pressure on developers delivers software faster. Only if we change this belief will we be able to deeply transform the system. All decisions taken at higher levels of the iceberg model will then emerge from the new beliefs.

usingiceberg

Image by Roland Flemm

This is what systems thinking is all about: we navigate from observing events to understanding system dynamics, and we discover intervention points that have a big impact on the system as a whole. If we were to intervene at a higher level, say the pattern level, we might decide to send all developers to a coding training so they become faster at coding, but this would not sustainably solve the problems of system instability and faulty code. With this learning, we can deeply change the system. In fact, we are not peeling off layers; we are looking for a larger context, for more “wholeness”, to deeply understand how things work.

Caution

There is a possible caveat in the iceberg model. Understanding that events are ultimately rooted in mental models should not lead us to tightly couple people's behaviour to their mental models; there is no straightforward cause-and-effect relationship. For example, when we work on people's mental models to get them to separate their waste, and we observe them doing so, we might assume their mental model is “I recycle because I want to save the planet for my kids”. There may be a disconnect between the desired mental model and the actual one: it might very well be “I recycle because I want to comply with social norms”. The latter is not a sustainable transformation. We need to validate our assumptions by aligning our mental models before applying changes.

Conclusion

Mental models are at the heart of human learning. Understanding that all models are imperfect makes us humble and empathic, paving the way for constructive dialogue. The iceberg model can help us better understand complex problems so that we can discover intervention points to improve situations sustainably.