Oana Juncu makes a straightforward statement right out of the gate... we make decisions based on unvalidated assumptions that we take for granted as facts. Life may be better if we become aware of this.
Every one of us has biases that we use to create a shorthand for our world view and thinking. Most people are unaware of this, and they feel as though they are being very rational and analytical, when in truth they are reacting via a system that has been honed over eons. The fact is, we all cope day by day by making assumptions. We make decisions based on assumptions that we take to be reality. We act on survival instinct. We can make very quick decisions, and we can seem very forceful and decisive, but in fact we are often behaving in ways that are wholly irrational.
Each of us has had an experience where we have convinced ourselves of a reality, only to be given new information that opens up our view, and once we see the new "reality" we cannot unsee it.
Cognitive dissonance is what happens when an outside "reality" interferes with our own perceived reality. If we are open to the possibility, the dissonance may be very low. It may be much higher if we resist the outside reality. The greater the resistance, the greater the cognitive dissonance.
As testers, we love playing with biases. We try to recognize when they are present, we do our best to de-bias ourselves, and then we feel smugly superior that we have gotten beyond all that. I question that premise, because I think there are several layers of bias that we need to work through. There are the superficial biases, and then there are those that are tied to the core of our being. Some of these biases are not so easy to set aside. Take belief in a higher power. I have no problem with feeling there is one, regardless of how illogical or irrational it may be. I've had many experiences in my life that have convinced me there is one. Other people may have had the exact opposite experience. In a scientific approach, there is a far greater burden of proof on me, in that I cannot prove that there is a higher power. Were I totally rational, I would then say "well, then there isn't one"... and I'm not willing to do that. I understand full well that this is a bias in action, and it is a bias that I can play with and consider, but at the end of the day, as it stands right now, I will not put it down. Not really. That's how powerful that bias is, and that's why I struggle when I see things that contradict my view, and feel at ease when I see situations that confirm my view.
I sometimes feel smug when I discuss cognitive biases surrounding politics, because it's so easy to see which biases are being applied on both sides (I'm from the USA, so for all practical purposes there are two (arguably) separate parties with different views of the world). Some people are partisan for one party or another. I've decided to have no affiliation with either, so I can step back and see both the intelligent and the ridiculous when it comes to political discourse. I can see the biases in action, but more importantly, I am at a point of indifference where I can be dispassionate either way. By having this outlook, I can more readily see the fallacies and biases in action in other people. The more closely I hold a bias, the less likely I am to be objective in my observations.
Think of the old television program "Let's Make a Deal". For those who don't remember this old game show, one of its premises was that a contestant could pick from three doors (Door #1, Door #2, and Door #3). Two of the doors have goats behind them (in other words, no prize). One door has a brand new car behind it. We make a choice, and someone else makes a choice. We see that their choice is wrong. Do we change our decision, or do we hold our place? Status quo bias tells us to hold on to what we have, even though at this point we have a 50/50 chance either way. We will very likely hold onto our original choice and not change it. Why? Because we've already made a choice, and the surety of that choice is more comforting than making a new one. The odds are even, yet most people will not make a change.
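Just to sanity-check the odds, here's a quick simulation sketch in Python of the scenario as described above: we pick a door, a second pick is made at random, and we only count the rounds where that second door turns out to hide a goat. This is purely my own illustration, not something from the talk.

```python
import random

def simulate(trials=100_000):
    # Scenario as described: we pick a door, someone else picks one of the
    # other doors at random, and we only count the rounds where their door
    # turns out to hide a goat.
    stay_wins = switch_wins = counted = 0
    for _ in range(trials):
        car = random.randrange(3)                                    # door hiding the car
        ours = random.randrange(3)                                   # our original pick
        theirs = random.choice([d for d in range(3) if d != ours])   # the other pick
        if theirs == car:
            continue  # their door had the car, so this round doesn't match the story
        counted += 1
        last_door = next(d for d in range(3) if d not in (ours, theirs))
        stay_wins += (ours == car)
        switch_wins += (last_door == car)
    print(f"stay:   {stay_wins / counted:.2f}")    # ~0.50
    print(f"switch: {switch_wins / counted:.2f}")  # ~0.50

simulate()
```

Both strategies come out around 50% here, so the only thing keeping us planted is the bias itself. (The classic Monty Hall setup, where the host knowingly reveals a goat, is a different story; there, switching wins about two-thirds of the time.)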
In the world of finance, we are often affected by loss aversion bias. What impacts us more, receiving something of X value or losing something of X value? Most people react more strongly to loss than they do to gain. Studies have been done (sorry, I don't have them off the top of my head, you'll have to trust me for now ;) ) showing that it takes almost double the level of gain to negate the feelings of loss. Mathematically, it doesn't make sense, but we value the loss more highly than the gain, or perhaps we feel the loss more acutely than we do the gain.
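For what it's worth, this asymmetry is usually modeled with Kahneman and Tversky's prospect-theory value function. Here's a toy sketch in Python; the parameter values are the commonly cited estimates from their work, assumed here for illustration, not figures from the talk.

```python
# Toy sketch of the prospect-theory value function (Kahneman & Tversky).
# Parameter values are commonly cited estimates, assumed here for illustration.
ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses weigh roughly twice as much as gains

def subjective_value(x: float) -> float:
    """Perceived value of gaining (x > 0) or losing (x < 0) an amount x."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

print(subjective_value(100))   # ~57.5   (how good a $100 gain feels)
print(subjective_value(-100))  # ~-129.4 (how bad a $100 loss feels)
```

That loss-aversion coefficient of roughly 2 is the same "almost double" figure mentioned above: a loss stings about twice as much as an equivalent gain pleases.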
Imagine that you have a chair in front of you, and you are asked to draw it. Setting aside the relative artistic skill of the people drawing, what might we expect the outcome to be? Each of us will draw the chair from a slightly different perspective, depending on where we are sitting, but generally we will draw a chair, because we know it's a chair. Now, let's imagine that we can somehow lose all awareness of the fact that the item we are drawing is a chair. It's a form, we can draw it, but we lose all sense of the item being a chair... do we end up with something different? Can we separate our view? Do we make something different? Chances are, we make something simpler or more abstract; it's just a shape. The strange part of this for me is mentally blocking out a shape that I *know* I am seeing. What's interesting is that some people were able to make a more accurate representation of the chair when they tried to defocus from what it actually is, while others went with a simpler, more freeform design.
From Reality to Obvious
There are steps and filters that shape what we consider to be obvious. Reality is what exists, regardless of our feelings about it. The "obvious" is filtered through our experiences, our fears, our frustrations, and our beliefs. Therefore, what is obvious to us may not be obvious to someone else, or at least not in the same way.
“We don't see the world as it is, we see it as we are”
― Anaïs Nin
If you want to test your biases, here's a process to play with:
- State a hypothesis about how a given product should work to meet a customer’s needs
- Define a set of questions to validate or invalidate the hypothesis
- Define metrics with respect to the types of answers received
- Collect facts
- Present results
Does this look a lot like the scientific method? It should, because that's really what it is. If we apply the scientific method appropriately and honestly, the process behind it will help us defocus our biases. It may not eliminate them entirely, but it will very likely make us aware of them.
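To make the process above a little more concrete, here's a minimal sketch of it as a checklist object in Python. The structure and field names are my own, purely for illustration; they are not from Oana's talk.

```python
from dataclasses import dataclass, field

@dataclass
class BiasCheck:
    hypothesis: str                  # how we believe the product should work for the customer
    questions: list                  # what would validate or invalidate that belief
    metrics: dict = field(default_factory=dict)  # how the answers will be measured
    facts: list = field(default_factory=list)    # observations actually collected
    result: str = ""                 # what the facts say, stated plainly

check = BiasCheck(
    hypothesis="First-time users can complete checkout in under two minutes",
    questions=[
        "How long does a first-time user actually take?",
        "Where do they hesitate or backtrack?",
    ],
    metrics={"time_to_complete": "seconds, median over observed sessions"},
)
# Collect facts, fill in check.facts and check.result, and only then present conclusions.
```

The point isn't the code; it's that writing down the hypothesis, questions, and metrics before collecting facts makes it harder to quietly move the goalposts once the results start confirming what we already wanted to believe.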