Notes by Brendan Foote and Ian De Silva
IEEE Invited Talk: Games Software Architects Play: On Reasoning Fallacies, Cognitive Biases, and Politics
Philippe Kruchten, University of British Columbia
Philippe gained exposure to large and not-so-large companies as a software architecture consultant with Rational in the early part of the century. Everywhere he went, he saw that design really amounts to making decisions, and that everyone uses some process to make them.
Architectural design is about making decisions, and providing a rationale for the design is the argument behind it. “The life of a software architect is a long (and sometimes painful) succession of suboptimal decisions made partially in the dark,” he says. On a positive note, architects can get help by dividing and conquering, bringing in an outsider for additional perspective, and reframing the problem, among many other techniques. But not everything is rosy: there are cognitive biases, reasoning fallacies, and political games.
Cognitive biases occur because designers often rely on intuition, and their intuition is flawed. Reasoning fallacies occur when flawed arguments or incorrect reasoning lead to a potentially wrong decision; beliefs can be presented as facts, though most fallacies are accidental. Political games are sets of arguments, superficially plausible and possibly leading to a design decision, but with a concealed ulterior motive.
Philippe mentions the coincidental overlap between this part of his talk and Mary Poppendieck’s this morning. Regardless, architects rely on their intuition, and it is flawed for various reasons. The most humorous of these is the bias bias, where architects think they are not affected by bias!
Philippe presents a catalog of these biases, fallacies, and games:
- Golden Hammer: when you have a hammer, everything looks like a nail. After developing a deep expertise with some technique/tool/technology, architects tend to favor it, even when it’s not necessarily appropriate.
- Elephant in the Room: all architects are fully aware of some major issue that really must be decided, but everyone keeps busy tackling small items, ignoring the big issue, pretending it does not exist, hoping maybe that it will vanish by magic or that someone else will take care of it.
- Not Invented Here: an architect avoids using or buying something because it comes from another culture or company. Sometimes used jointly with the Golden Hammer.
- Anchoring: relying heavily on one piece of information, to the detriment of other pieces of information, to justify a choice.
But these mental shortcuts aren’t all bad. In his book Blink, Malcolm Gladwell praises the power of snap decisions.
- “Obviously…”: a strong cue to look for a non sequitur.
- “Yes, but…”: can be a delaying tactic.
- Perfection or bust: thinking that we need an optimal solution (the fastest, cheapest, nicest, etc., way to do something).
- Cargo Cult: a group of people who imitate the superficial exterior of a process or system without any understanding of the underlying substance. Kruchten points out that he saw this a lot as people tried to use the Rational Unified Process.
- It has worked before: the conditions when it worked before might have been different, though. This often follows a “blink” decision.
- Sour Grapes: when a tool or solution doesn’t work for you, you decide it is the tool that is the problem.
- Swamped by Evidence: repeating something in public often enough that it eventually becomes familiar and looks more likely to be true in subsequent debate.
- “It’s a Secret”: imposing a solution while withholding any evidence, claiming there are business reasons for it that cannot be disclosed at this point.
- Teacher’s Pet: making a decision a certain way because the boss will like it.
- Groupthink: within a deeply cohesive in-group, members try to minimize conflict and reach consensus without critically testing, analyzing, and evaluating ideas.
- Let us have a vote: substitutes popularity for correctness in decision making.
So what do we do once we’ve become aware of these games? Well, we could always use the knowledge malevolently and exploit them to get what we want. Or we could be the contrarian who debunks them in a group, or at least challenges the premises.
If there is an opposite to operating under these biases, it must be critical thinking: thinking for a purpose, reasoning about data, drawing conclusions, and inferring implications.
Further reading for understanding and mitigating cognitive bias includes Thinking, Fast and Slow by Daniel Kahneman.