Why I don’t trust my clinical reasoning – and why this matters


“See someone experienced,” I hear people with pain say. “They’ll know what’s wrong with you.”

Well, based on the research I’ve read, I wouldn’t be so sure. In fact, I’m certain my own clinical reasoning is biased, prone to errors I don’t notice, and swayed by factors that most clinicians would be horrified to learn they, too, are influenced by.

Let me give you a few to ponder:

I’m interested in women and pain – and there’s a lot of evidence showing that women’s pain doesn’t get the same kind of diagnostic and management attention as men’s does. Part of this is due to inherent bias in research, where experimental studies often rely on male rats, mice and undergraduates because they don’t have those pesky hormonal fluctuations each month. Even volunteering to take part in a pain study has been found to be biased – people who volunteer have been shown to be more risk-taking and more extraverted (Skinner, 1982) – though to be fair, that is an old study!

But contextual factors such as gender, distress and even the supposed diagnosis do influence judgements about pain intensity (Bernardes & Lima, 2011), including judgements about potentially life-threatening chest pain (Keogh, Hamid, Hamid & Ellery, 2004). A large theory-guided literature review has likewise identified gender bias in healthcare and gendered norms towards people with chronic pain (Samulowitz, Gremyr, Eriksson & Hensing, 2018).

And if you have the misfortune to be judged to have low trustworthiness and you’re a woman, you’re more likely to be thought to have less pain and to be exaggerating it (Schafer, Prkachin, Kaseweter & Williams, 2016). Beware if you’re overweight and a woman, because your pain will likely be judged as less intense, less interfering, more exaggerated and less related to “medical” factors – women’s pain in particular is likely to be judged as “psychological” and offered psychological therapy rather than other treatments (Miller, Allison, Trost, De Ruddere, Wheelis, Goubert & Hirsh, 2018).

The weird thing is that the clinicians involved in these studies were oblivious to their bias. And let’s not even go there with people of colour or so-called “minority” groups such as LGBTQI.

So as clinicians, our initial impressions of a person can lead us astray – and I haven’t even started on the contribution experience makes to clinical reasoning. Let me go there then!

Something that cognitive psychologists have explored for some years now is the type of thinking we draw on for clinical reasoning. System one is “fast reasoning” – where we rapidly, instinctively and emotionally make decisions on the fly. Kahneman and colleagues (1982) first described these two processes and noted that fast thinking gets better with rehearsal; it is especially helpful for skilled clinicians needing to make decisions in pressured contexts, drawing on “pattern recognition” – or, to be precise, on deviation from a recognised pattern (Preisz, 2019). System two is “slow reasoning”, where decisions are made in a considered way, are not influenced by emotional state, and can be thought of as “rational.” Slow thinking is most useful where the situation is complex, where decisions need to weigh multiple pieces of information, where the situation might be novel, or where, for persistent pain in particular, there are multiple disease processes occurring.

OK, so we should choose system two, right? Not so fast! System one is hard to switch from – it’s what underpins “intuition” or “hunches” – and it gets more entrenched the more experienced we are. According to Preisz (2019), system one “seeks to form a coherent, plausible story by relying on association, memories, pattern matching and assumption.”

Why is system one thinking not so great? Well, we’re human. We’re human in the way we respond to any reasoning situation – we anchor on the first and most “plausible” ideas, and these might be unrelated to the actual presentation in front of us. For example, if we’ve been reading a journal article on a new treatment and its indications, it’s amazing how many people will present with those exact same indications in the next week! This is availability bias, a close cousin of anchoring bias. We’re also inclined to believe our own patients and judgements are different from “those people” – especially “those people” who might respond best to clinical guidelines. This means that even in the face of clear-cut research showing the lack of benefit from knee arthroscopy (Brignardello-Petersen, Guyatt, Buchbinder, Poolman et al., 2017), an orthopaedic surgeon I know argued that “we choose our patients very carefully” – essentially arguing that his patients are different, and his approach is the best one.

If experienced clinicians find it hard to “unstick” from old practice, or move quickly to “intuitive” reasoning (even if it’s called “pattern recognition”), and if we all find it hard to recognise when we’re biased, or even that we are biased, what on earth should we do? All us old hands should retire maybe? All follow algorithms and not use “clinical judgement”? Take the “human” out of clinical management and use AI?

Some of these things might work. There is evidence that algorithms and AI can offer effective and (perhaps) less biased diagnosis and management than our unaided human brain (Kadhim, 2018), but there are also studies showing that direct comparisons between decision aids and clinical judgement are rarely made, and those that have been carried out don’t show the decision aids to be superior (Schriger, Elder & Cooper, 2017). But watch this space: AI is a rapidly developing area and I predict greater use of it over time.

The risk with decision aids is – garbage in, garbage out. If we look at existing research we can see that male, pale and potentially stale dominates: this doesn’t bode well for people of colour, for women, for the unique and idiosyncratic combination of diseases a person can have, or for untangling the impact of disease on the person – in other words, disability and illness.
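
To make “garbage in, garbage out” a little more concrete, here’s a toy sketch in Python. The numbers are entirely made up (they’re not drawn from any of the studies above), but they show how a naive decision aid “trained” on records that already under-document women’s pain will happily reproduce that bias for two people reporting exactly the same pain:

```python
# A toy illustration of "garbage in, garbage out" (all numbers invented):
# if the records a decision aid learns from already under-rate women's pain,
# the aid reproduces that under-rating for identical presentations.

# Hypothetical historical records: (gender, patient-reported pain 0-10,
# pain intensity documented by the treating clinician at the time).
historical_records = [
    ("male",   8, 8), ("male",   7, 7), ("male",   6, 6),
    ("female", 8, 6), ("female", 7, 5), ("female", 6, 4),  # systematically under-documented
]

def predicted_documented_pain(gender: str, reported_pain: int) -> float:
    """Naive 'decision aid': predict the documented score from the average
    gap between reported and documented pain in past records of the same gender."""
    gaps = [documented - reported
            for g, reported, documented in historical_records if g == gender]
    return reported_pain + sum(gaps) / len(gaps)

# Two patients reporting identical pain get different predictions, purely
# because the aid has learned the bias already sitting in the records.
print(predicted_documented_pain("male", 8))    # 8.0
print(predicted_documented_pain("female", 8))  # 6.0
```

The point isn’t the arithmetic – it’s that the output looks objective while simply automating the bias that was already in the records.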

So, to summarise: we are all biased, and it’s best to acknowledge this to ourselves, up front and personally. We can then turn to strategies that may reduce the biases. For me, the one I turn to most often is a case formulation, using information gathered from a semi-structured interview and a standard set of questionnaires. These have been developed a priori, so my biases in information gathering are limited. By taking time to follow a case formulation, my thinking is slowed to that more deliberative system two. At least some of the biases I know I’m prone to are mitigated.
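
If you’re wondering what I mean by a template developed in advance, here’s a rough sketch in Python of the general idea (the field names are illustrative placeholders, not my actual formulation or questionnaire set):

```python
# A rough sketch of an a priori case formulation template: the domains are
# fixed before the person walks in, so I can't quietly drop the parts that
# don't fit my first hunch. Field names here are generic placeholders only.
from dataclasses import dataclass, field, fields

@dataclass
class CaseFormulation:
    presenting_problem: str = ""
    pain_history: str = ""
    questionnaire_scores: dict = field(default_factory=dict)   # standard measures agreed in advance
    predisposing_factors: list = field(default_factory=list)
    precipitating_factors: list = field(default_factory=list)
    perpetuating_factors: list = field(default_factory=list)
    protective_factors: list = field(default_factory=list)

def missing_domains(formulation: CaseFormulation) -> list:
    """List every domain still empty – a nudge towards the slower, more
    deliberate pass over the whole template before drawing conclusions."""
    return [f.name for f in fields(formulation) if not getattr(formulation, f.name)]

cf = CaseFormulation(presenting_problem="persistent low back pain")
print(missing_domains(cf))  # everything not yet gathered is flagged, whatever my first hunch says
```

The useful part is the reminder of what hasn’t been gathered yet: the structure keeps pulling me back to domains my “fast” reasoning would otherwise skip.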

And yet, I know I am biased. That’s why I use a supervision relationship to help me identify those biases, to be challenged and to reflect.

Bernardes, S. F., & Lima, M. L. (2011, Dec). A contextual approach on sex-related biases in pain judgements: The moderator effects of evidence of pathology and patients’ distress cues on nurses’ judgements of chronic low-back pain. Psychology & Health, 26(12), 1642-1658.

Brignardello-Petersen, R., Guyatt, G. H., Buchbinder, R., Poolman, R. W., Schandelmaier, S., Chang, Y., Sadeghirad, B., Evaniew, N., & Vandvik, P. O. (2017, May 11). Knee arthroscopy versus conservative management in patients with degenerative knee disease: a systematic review. BMJ Open, 7(5), e016114. https://doi.org/10.1136/bmjopen-2017-016114

Kadhim, M. A. (2018). FNDSB: A fuzzy-neuro decision support system for back pain diagnosis. Cognitive Systems Research, 52, 691-700. https://doi.org/10.1016/j.cogsys.2018.08.021

Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.

Keogh, E., Hamid, R., Hamid, S., & Ellery, D. (2004). Investigating the effect of anxiety sensitivity, gender and negative interpretative bias on the perception of chest pain. Pain, 111(1-2), 209-217.

Miller, M. M., Allison, A., Trost, Z., De Ruddere, L., Wheelis, T., Goubert, L., & Hirsh, A. T. (2018, Jan). Differential Effect of Patient Weight on Pain-Related Judgements About Male and Female Chronic Low Back Pain Patients. The Journal of Pain, 19(1), 57-66. https://doi.org/10.1016/j.jpain.2017.09.001

Preisz, A. (2019, Jun). Fast and slow thinking; and the problem of conflating clinical reasoning and ethical deliberation in acute decision-making. Journal of Paediatrics and Child Health, 55(6), 621-624. https://doi.org/10.1111/jpc.14447

Samulowitz, A., Gremyr, I., Eriksson, E., & Hensing, G. (2018). “Brave Men” and “Emotional Women”: A Theory-Guided Literature Review on Gender Bias in Health Care and Gendered Norms towards Patients with Chronic Pain. Pain Research and Management, 2018.

Schafer, G., Prkachin, K. M., Kaseweter, K. A., & Williams, A. C. (2016, Aug). Health care providers’ judgments in chronic pain: the influence of gender and trustworthiness. Pain, 157(8), 1618-1625. https://doi.org/10.1097/j.pain.0000000000000536

Schriger, D. L., Elder, J. W., & Cooper, R. J. (2017, Sep). Structured Clinical Decision Aids Are Seldom Compared With Subjective Physician Judgment, and Are Seldom Superior. Annals of Emergency Medicine, 70(3), 338-344.e3. https://doi.org/10.1016/j.annemergmed.2016.12.004

Skinner, N. F. (1982). Personality characteristics of volunteers for painful experiments. Bulletin of the Psychonomic Society, 20(6), 299-300. https://doi.org/10.3758/BF03330107

3 comments

  1. Kahneman et al. have a new book, Noise, that gives further support to what you have nicely discussed here. The decisions we make as therapists and clinicians are noisy as well as biased; they vary incredibly from person to person and within a given person under varying conditions. Unfortunately, many are oblivious to this and therefore in a poor position to do much about it.

    1. Are you adding to my book addiction?! I agree with you about how rarely clinicians acknowledge the biases and noise in their clinical reasoning… I wish all clinicians were given a course in this area before beginning practice. And then support to help notice how strong the thoughts of “oh but not me!” are…
