And how it impacts your work
Have you ever presented work, where you did tons of research and sketched a perfect plan of execution, only for your management team to reject it because they didn’t get it?
Or have you ever gotten so deep into researching a user’s needs and habits that you overdelivered on features? And your users don’t understand them, or use them in unintended ways? Or you designed a logo with perfect meaning and metaphors, but your customers don’t like it because they don’t see it?
If yes, then what you experienced in these cases is a bias called “The Curse of Knowledge”. The term was coined by economists in the late 1980s and later popularized in business writing. And it is as common at the highest levels of corporate identity strategy as it is with basic commissioned illustrations.
The curse of knowledge is a cognitive bias that occurs when an individual, communicating with other individuals, unknowingly assumes that the others have the background to understand. For example, in a classroom setting, teachers have difficulty teaching novices because they cannot put themselves in the position of the student. A brilliant professor might no longer remember the difficulties that a young student encounters when learning a new subject. This curse of knowledge also explains the danger behind thinking about student learning based on what appears best to faculty members, as opposed to what has been verified with students — Wikipedia
Overconfidence at the top levels
The bias of overconfidence in one’s knowledge can be seen at all levels of a company, from top management to the bottom. In a 2006 Harvard Business Review article, “The Curse of Knowledge”, Chip and Dan Heath describe it like this:
Top executives have had years of immersion in the logic and conventions of business, so when they speak abstractly, they are simply summarizing the wealth of concrete data in their heads. But frontline employees, who aren’t privy to the underlying meaning, hear only opaque phrases.
Overconfidence in communication
Elizabeth Newton, then a graduate student at Stanford University, conducted a series of experiments for her 1990 dissertation. The experiment was described in the Stanford Social Innovation Review:
College students were asked to take part in an experiment with two roles: “tappers” and “listeners”. Tappers received a list of 25 well-known songs and were asked to tap out the rhythm of a song. Listeners tried to guess the song from the taps. The tappers reported that they could clearly “hear” the lyrics and complete musical accompaniment as they banged away. When asked to predict how many songs the listeners would guess, they predicted 50%. However, listeners heard only a series of seemingly disconnected taps. Indeed, of all the songs tapped out, listeners correctly guessed only 3%.
The conclusion is obvious. If you already know the answer, you tend to overestimate how easily others will reach it. You also tend to underestimate the difficulty of the question or the problem.
And this is an important principle for people who design solutions, because you may become so immersed in the problem itself that you lose touch with reality.
Optimism and overconfidence
In his book Nudge, behavioural economist Richard Thaler writes that people have a tendency to be biased towards optimism and overconfidence.
For example, people are overly optimistic even when the stakes are high. About 50% of marriages end in divorce, and this is a statistic most people have heard. But around the time of the ceremony, almost all couples believe there is a 0% chance that their marriage will end in divorce. Even those who have already been divorced before.
A similar point applies to entrepreneurs starting new businesses, where the failure rate is at least 50%. In one survey, people starting new businesses were asked two questions:
- What do you think is the chance of success for a typical business like yours?
- What is the chance of your success?
The most common answers were 50% for the first question and 90% for the second; many even said 100% for the second. You can see how blind our optimism can be, and how big our egos may get.
In this case, to stop blind optimism, you could use a simple nudge: remind people of a bad event that happened in their specific situation, and they will reassess their optimism straight away. So we have to take these types of factors into account when designing products. We must assume that the user does not know, or does not have enough knowledge.
Context is key
Now let’s take an example of how it works. Let’s say you have been commissioned to design a new logo for a certain company, and the brief says the logo has to represent the company’s mission of innovation and stability. After doing your research, you come up with a symbol: a circle.
The circle above would mean nothing to an average person. Why? Because it is out of context. Looking at it, you would not quite understand what it means; it might as well be a doughnut drawn by a child. But if we add some context to it, things change.
What if you showed the circle and said it represents “a continuous cycle of discovery, creativity and knowledge”. And it’s not a simple circle. It’s “endless possibilities for the future of the company and its commitment to being a strong, stable and enduring ally for our customers around the world”.
Now that the melody’s implanted in your head, you presumably will be better equipped to follow the tune. The customer will be on the same line of thought as you. And this “salesmanship” is a useful way of managing “the curse of knowledge”. By the way, the circle is the logo of Lucent Technologies, designed in 1996 by Landor Associates, and the example comes from Michael Bierut’s book, Now You See It.
Sometimes you may embed a subtle meaning in your logo, illustration, or even a text message, and the user does not get it. It happens because you have so much knowledge of and research on the subject that you assume the other party has it too. You overestimate your message’s clarity and underestimate the gap in the other party’s knowledge.
So now that we have all of this information, what can we do better?
Human beings tend to make mistakes, even in the most obvious scenarios, no matter how intelligent you are. Assuming that someone will make an error in a particular context can make you think a bit differently. So here are some examples from around the world of how designers accounted for (or failed to account for) overconfidence and “the curse of knowledge” bias:
ATMs and forgotten cards
Today, getting your card back before the cash seems like the obvious design. But the initial design wasn’t like that: you received the money first, and once people took it, they tended to leave their cards in the ATM. Most ATMs don’t allow this error anymore. To get the cash, you must remove the card first, so you can no longer forget it. The people who designed this knew that we tend to forget things, even in critical situations.
Gas tank cap
Over the years, cars have become more user-friendly; they are more intuitive in preventing user errors. But take the gas tank cap as an example. Most people tend to forget it and leave it open, as with ATMs, so they sometimes drive with the cap open or lose it. What if we attached the cap with a cheap plastic tether, or prevented the car from starting while the cap is open? Leaving it unattached is one of the pitfalls of assuming that people will always close the cap after refuelling their car.
Gmail’s attachment notice
Richard Thaler, when writing the book Nudge, gave a great example of how Gmail prevented an error. Thaler wanted to send a draft of the book to a friend. In the email he wrote “See the attachment” and hit the send button, and Gmail popped up a message: “It seems like you forgot to attach a file”. At the time, Google was testing an algorithm that scans the message for words like “attachment” and warns you if no file is actually attached. This way you can prevent sending an email without its attachment.
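The idea behind such a check can be sketched in a few lines. This is only an illustration of the general technique; the hint phrases and function names below are assumptions, not Gmail’s actual implementation:

```python
import re

# Phrases suggesting the sender intended to attach a file.
# Illustrative list only; a real detector would be far more sophisticated.
ATTACHMENT_HINTS = re.compile(
    r"\b(see the attachment|attached|attachment|i attach)\b",
    re.IGNORECASE,
)

def forgot_attachment(body: str, attachments: list) -> bool:
    """Return True if the body mentions an attachment but none is present."""
    return bool(ATTACHMENT_HINTS.search(body)) and not attachments

# Thaler's scenario: the body says "See the attachment", no file attached.
print(forgot_attachment("See the attachment", []))             # True -> warn
print(forgot_attachment("See the attachment", ["draft.pdf"]))  # False -> send
```

The design choice mirrors the ATM example: instead of trusting the user’s memory, the system checks for the likely error at the moment it can still be prevented.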
London’s “Look right!”
You’ve heard at least once in your life that in the UK cars drive on the other side of the road. In the US and most of Europe, cars drive on the right, so when you step off the kerb, traffic first approaches from your left. In the UK, cars drive on the left, so traffic first comes from your right. Visitors tend to forget this, and many accidents happen because of it. So the authorities decided there should be signs painted on the pavement that say “Look right!”.
Try to take a step back
In the end, since you are in a specific environment on a daily basis, you tend to get biased. You start believing that people know what you know, but that’s rarely true. So we have to take a step back and look at things from a different perspective, or present our work to customers and management teams from their angle.
Source link https://uxplanet.org/the-curse-of-knowledge-d0d5ce26bd20?source=rss—-819cc2aaeee0—4