The hallmarks of critical thinking are intellectual ability coupled with the willingness to engage in objective analysis of one’s thinking. Thinking about how we think is called metacognition by cognitive psychologists. It is what people often mean when they say they are practicing mindfulness.

The challenge for clinicians is not only how to do something, but also how to decide to do something. Clinical decision making fascinates me endlessly and dominates the discussions I have with guests on the Medic Mindset podcast. In lengthy, detailed cognitive autopsies, the guests on the show reflect on their thinking and articulate to listeners how they made clinical decisions. This is no easy task. That level of awareness is hard to access because many human decisions are based on hard-wired, intuitive pattern-processing that never makes it to the prefrontal cortex for deliberate, analytical thought.

In his book Thinking, Fast and Slow, the psychologist and economist Daniel Kahneman differentiates between slow, analytical thinking (System 2 Thinking) and fast, instinctual thinking (System 1 Thinking). Pat Croskerry, an Emergency Medicine physician in Nova Scotia, applies these theories to medicine. He is hesitant to call System 1 a type of “thinking” at all. Instead, he prefers to describe it as decision making that doesn’t involve analytic reasoning. System 1 depends on the instinctive parts of the brain that are older in evolutionary terms. It’s reflexive and reactive. Most cognitive biases live here.

Notes from various works of Dr. Pat Croskerry

I wish I could write this whole piece without using the word “bias.” It is a heavily loaded term that carries negative connotations. It’s so negative that the natural tendency is for people to read accounts of biases and conclude that the phenomenon could not exist within themselves and only manifests in others. This tendency is itself a bias, called the “bias blind spot.” Some have preferred the term “cognitive disposition to respond” to more accurately describe what is happening with cognitive biases. Biases aren’t inherently bad; they just exist. Not respecting that you are biased is what you might call “bad,” or even dangerous. Biases are well entrenched and considered adaptive.

To conserve brain power for other functions, the brain relies heavily on intuitive decision making that is correct most of the time. Cognitive psychologists estimate that humans spend roughly 95 percent of their conscious time in System 1 decision making and attribute this tendency to the cognitive miser function. It is the brain’s default mode for routine, daily life. Without it, humans would spend all their time processing information and never acting.

System 1 decision making is incredibly efficient for routine tasks, but it tends to be more error-prone. What it gains in speed, it sacrifices in accuracy, and this trade-off is particularly pronounced when novel situations arise. During clinical decision making, System 1 must be monitored and modulated by System 2. To effectively police your decisions, the first step is to accept that your brain is, indeed, biased. And the best way to accept this is to appreciate that being biased is not tied to intellectual capacity. Remember, the hallmarks of critical thinking are intellectual ability coupled with the willingness to engage in objective analysis of one’s thinking.

This is an introductory post. In future posts, I invite you to join me as I describe the various cognitive biases and the safeguards we can implement to help override our cognitive dispositions to respond. Bias is a normal operating function of the human brain. You aren’t faulty. You are biased, like the rest of us.

Ginger Locke

I’m honored to be speaking about clinical decision making at these conferences in 2018:

Wisconsin EMS Conference, January 24-27, in Milwaukee, WI

and

FlightBridgeED Air & Surface Transport (FAST) Symposium, March 20-21, in Nashville, TN

References:

Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract. 2009;14 Suppl 1:27-35. PMID: 19669918

Croskerry P. Diagnostic Failure: A Cognitive and Affective Approach. In: Advances in Patient Safety: From Research to Implementation (Volume 2: Concepts and Methodology). Rockville (MD): Agency for Healthcare Research and Quality; 2005.

Croskerry P. From mindless to mindful practice–cognitive bias and clinical decision making. N Engl J Med. 2013;368(26):2445-8. PMID: 23802513

Kahneman D. Thinking, Fast and Slow. Doubleday Canada; 2011.

Tversky A, Kahneman D. Judgment under Uncertainty: Heuristics and Biases. Science. 1974;185(4157):1124-31. PMID: 17835457
