Cognitive biases in user research

In the late 1800s, a horse in Germany became famous because he could apparently do math. Clever Hans's owner would ask him arithmetic questions, and Hans would tap his hoof the correct number of times. Hans toured Germany, showing off his skills to audiences for over a decade.


The psychologist Oskar Pfungst decided to investigate Clever Hans. He ruled out fraud, because the horse was able to answer questions even when his owner wasn't the one asking.

The investigators later tried asking Hans questions out of his sight. His success rate suddenly dropped from 89% to 6%.

It turned out that Clever Hans couldn’t do math after all. He was just great at reading body language, which is an important social skill for horses. His questioners would unconsciously change their body language as Hans’s taps approached the right answer, hinting at when to stop the tapping.

So what am I getting at?

Experimenter’s bias

The observer-expectancy effect, also known as experimenter's bias, is real. Researchers unknowingly biased their experiments with Clever Hans, which led them to incorrect conclusions. UX researchers are at risk of doing the same.

It's impossible to be completely unbiased as an experimenter. We're all human. However, bias can be reduced by asking open-ended questions instead of closed questions. That prevents you from putting words in people's mouths, and it leads to deeper insights.

For example, think about your answer to the following two prompts:

Do you like sports?

Can you tell me about the last time you participated in sports?

Your answer to the closed question was likely a one-word answer: yes or no. As a researcher, you could sway the answers towards yes or no simply by pronouncing the word “sports” a certain way.

Your answer to the open question may be the start of an insightful story. As a researcher, it would be hard to bias the answer to this question in a particular way because it’s so open-ended.

Asking open questions is one way to counteract the experimenter’s bias.

Curse of knowledge

A 1990 study by a Stanford psychologist asked students to pick a well-known song, then tap out its rhythm with their fingers. The students then had to estimate what percentage of listeners would correctly identify the song from the taps alone. They predicted a success rate of 50%. In reality, only 2.5% of listeners correctly named the tunes.


The “tappers” have a high-fidelity song going through their head, like they’re at a concert, and it’s hard for them to imagine not having this “knowledge”.

The curse of knowledge also applies to those who research, design, and build products. You're too close to the problem to see it clearly. This can lead companies to underfund research because "we already know our users". It can also lead researchers to take answers for granted instead of digging in.

One way to fight the curse of knowledge is to ask stupid questions. As ADP's Jesse Zolna explains, "sometimes I get surprised and it's like the best question I've asked all day."

In everyday human conversation, it's normal to gloss over some of the details. When someone says "you know" or "and so on", it's shorthand for "we both know what I'm about to say, so I'll save my breath and not say it". But often the two people in a conversation don't actually share the same knowledge, understanding, or expectations.

In research, it's important to play dumb and make sure nothing important goes unsaid. Both parties in a research session can suffer from the curse of knowledge, and asking dumb questions helps counteract it.

Bias blind spot

We are blind to our faults, especially when it comes to cognitive biases.

A 2015 study published in Management Science asked more than 600 Americans to rate how biased they thought they were compared to other people. About 15% said they were as biased as others, and 85% said they were less biased. Just one person said more biased. One out of more than 600!

This is the bias blind spot.

“But I’m reading this article,” you say. “I’m not going to be biased now that I know about these biases!”

Wrong.

Another study found that even experts on cognitive biases are unable to prevent or recognize those very biases in themselves.

Luckily, there's a solution: work in pairs. We are blind to our own biases, but we're great at spotting biases in others.

Bias turns out to be relatively easy to recognize in the behaviors of others, but often difficult to detect in one’s own judgments.

The tip of the iceberg

This article touched on three common cognitive biases. There are dozens more that may apply to your work as a researcher.

Do you have a favorite bias that I didn’t mention? Post a comment to let me know!

Sources

Here are the specific pages I used when researching this article.

For a UX-focused list, see the List of selected 52 cognitive biases.

Scrolling through Wikipedia's List of cognitive biases is also fascinating.
