There is no shortage of things to occupy our minds these days. While it may feel as though this has never been worse, for as long as our brains have had their current structure, estimated to be only 200 million years old,[1] it has been "the same as it ever was,"[2] or, as noted in the book of Ecclesiastes, "There is nothing new under the sun…" (1:9).
Yet, subject as we always are to the perception of overwhelming amounts of ambiguous, unclear, uncertain, and unreliable material, our clever information-processing brains engage in numerous, seemingly efficient thinking shortcuts that come under the heading of "cognitive heuristics."
Cognitive heuristics are mental shortcuts that let us make judgments quickly under conditions of uncertainty. It is simply too exhausting and inefficient to thoroughly process every piece of new information. There is no shortage of easy-to-read scholarly books on this important topic, and I am partial to Daniel Kahneman's 2011 book, Thinking, Fast and Slow.
Although cognitive heuristics help get things done, they come at a cost. The price of such elegant thinking efficiency is a set of predictable thinking errors, called cognitive biases, automatically built into our processing system. While there are many such biases and, perhaps ironically, no absolute classification system, I would like to address three frequently observed thinking shortcuts that lead to predictable inaccuracies. These were first identified in a landmark 1974 Science article by Tversky and Kahneman, "Judgment Under Uncertainty: Heuristics and Biases,"[3] and they remain relevant to research up to the present. The three biases are availability, representativeness, and anchoring.
- The availability heuristic refers to judging the probability of an event by how easily instances of it come to mind. Many of us, for example, judged the severity of COVID-19 or the safety of the vaccines from the experiences of family, friends, and acquaintances. We may even have discounted large amounts of aggregated, offsetting information that conflicted with our personal experience (i.e., confirmation bias). The hallmark of the availability heuristic is the ease with which recent, dramatic, and seemingly similar situations are recalled, without considering how the facts of your immediate case actually differ from them.
Fortunately, knowing about the availability heuristic is the first step to beating its influence. Deliberately appointing yourself a bona fide, dedicated devil's advocate and walking through your own thinking is a useful way to de-bias reflexive, availability-driven judgments.
- The next commonly observed cognitive bias is called the representativeness heuristic. This occurs when we estimate how likely an individual is to belong to one group or another by how well they fit the group's stereotype, without thinking about actual probabilities. Often we err by assuming two similar things are more strongly related than they really are. For example, is Fred a librarian or a farmer? "Fred is very shy and withdrawn, always helpful, but not a people person. A quiet soul, always keeping things in order, tidy, and focused on detail." When the representativeness heuristic strikes, people assume Fred is a librarian because he fits the "type," that is, because of the degree to which he is representative of, or similar to, the stereotype, without asking how many farmers there are relative to librarians, or whether such characteristics predict occupation at all. Research has consistently borne out that people fail to consider these vital elements when making such judgments from incomplete information, as the quick calculation below illustrates.
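To see why the base rate matters, here is a minimal Bayes' rule sketch. The specific numbers are hypothetical, chosen only for illustration and not drawn from Tversky and Kahneman: suppose farmers outnumber librarians twenty to one, and that Fred's description fits most librarians but few farmers.

```python
# A minimal Bayes' rule sketch with hypothetical numbers.
# Assumption: 20 farmers for every librarian, and a description
# that fits 90% of librarians but only 10% of farmers.
p_librarian = 1 / 21
p_farmer = 20 / 21
p_desc_given_librarian = 0.9  # assumed, for illustration
p_desc_given_farmer = 0.1     # assumed, for illustration

# Bayes' rule: P(librarian | description)
posterior = (p_desc_given_librarian * p_librarian) / (
    p_desc_given_librarian * p_librarian
    + p_desc_given_farmer * p_farmer
)
print(f"P(librarian | description) = {posterior:.2f}")  # ~0.31
```

Even with a description that fits librarians nine times better than farmers, the lopsided base rate means Fred is still more than twice as likely to be a farmer. That base rate is precisely the element the representativeness heuristic ignores.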
It is hard to overcome the influence of the representativeness heuristic since it is hardwired, but knowing about it is an important start. Purposeful, effortful, logical thinking will help, as will consulting peers and friends when making big decisions. Garnering a fresh perspective by willfully freeing oneself from past preconceptions will also yield clearer, more accurate results.
- Finally, anchoring deserves some attention. Anchoring occurs when people given identical information presented in different orders make significantly different initial estimates. As new information is added, those estimates remain weighted toward the first piece of data presented. Tversky and Kahneman (1974, p. 1128) noted that "different starting points yield different estimates, which are biased toward the initial values." In other words, when people are tasked with making judgments under conditions of uncertainty, such as risk assessments, they are greatly influenced, or "anchored," by their starting point in the analysis. The anchor can be the most random of details: think about the last two digits of your social security number before estimating the cost of a bottle of wine, and the higher those digits, the higher your estimate will be.[4] Given the allure of formulaic, actuarial assessments for many tasks, the cognitive bias of anchoring is significant; the original article contains a simple arithmetic demonstration, sketched below.
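In that 1974 article, Tversky and Kahneman asked students to estimate, within five seconds, the product of a sequence of numbers; the first few terms set the anchor, and adjustment from it was too small. A short sketch of the underlying numbers:

```python
# The two sequences from Tversky and Kahneman (1974) are the same
# product written in opposite orders; only the anchor differs.
from math import prod

descending = prod(range(8, 0, -1))  # 8 x 7 x 6 x ... x 1
ascending = prod(range(1, 9))       # 1 x 2 x 3 x ... x 8
assert descending == ascending == 40320

# Reported median estimates: 2,250 for the descending sequence but
# only 512 for the ascending one. Both fall far below the true
# 40,320, each biased toward the size of the first few terms seen.
```

The first numbers encountered set the anchor, and the subsequent adjustment is characteristically insufficient.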
Again, the way to counter this subtle but powerful bias is to acknowledge it is there even if you cannot see it. Slow down your decision-making process and do real research on the matter at hand. Create your own informed anchors as focal points to center yourself.
Clearly, our brains are miracles, but only you can maximize the accuracy of the output!
Dr. Jeffrey Singer maintains an active forensic and clinical practice. He is licensed for independent psychology practice in New Jersey and New York.
[1] Wikipedia, "Evolution of the brain." https://en.wikipedia.org/wiki/Evolution_of_the_brain
[2] Talking Heads. (1980). "Once in a Lifetime." On Remain in Light.
[3] Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
[4] Ariely, D., Loewenstein, G., & Prelec, D. (2003). "Coherent arbitrariness": Stable demand curves without stable preferences. The Quarterly Journal of Economics, 118(1), 73–106. https://doi.org/10.1162/00335530360535153