
Intuition and other failings in clinical reasoning


Einstein is credited with saying “The important thing is not to stop questioning”, while Euripides apparently said “Question everything. Learn something. Answer nothing.” I’m sure of the origins of neither quote – but I think I must have inhaled both of them as a toddler, because I have never stopped asking ‘why’!

In clinical reasoning, there is a real risk of our very human cognitive biases kicking in before we can even draw a breath. This isn’t surprising – but it is very, very subtle, and we can fail to identify our biases until well after we’ve made decisions based on hunches, intuition or other incomplete information – if we pick them up at all.

And if we think that years of experience make us immune to these errors – think again! Worse, when the problem is pointed out to us, we’re inclined to think ‘it doesn’t apply to me because I’m so much more thorough, better informed, careful and aware of the problem’. Yeah right, as an infamous advert in NZ says!

Intuition is a term that has been used and abused in health care. A strict definition from Princeton Uni, no less, states: “intuition n. The act or faculty of knowing or sensing without the use of rational processes; immediate cognition.” In other words, intuition is knowing without thinking. Some people use it to mean their ‘knack’ of knowing what is ‘really’ wrong with a patient, or their sense of what someone is feeling, or why something has happened. Other people use it to describe their way of working – coming up with ‘the right thing’ without systematically going through an assessment, hypothesis development and testing process. And still others use it to defend not reading the literature or keeping up with learning.

The truth is, because of our human cognitive biases, intuition could be one of a few things:

    • It could be simply guessing – putting two and two together and occasionally coming up with four. While this happens only on the odd occasion, we’re likely to think it happens more often than it really does, and this can confirm our belief that we have ‘the knack’.
    • It could be over-learned patterns of observation that we are no longer aware of. Remember the trip from home to work this morning? I’ll bet most of us can’t really recall every kilometre of that journey, and in one sense I guess we could call the trip ‘intuitive’. Over time, we can very quickly make observations of a patient and arrive at an impression before we’ve carried out any deliberate assessment process. Some people think that psychics have this ability to very quickly ‘read’ nonverbal behaviour and detect minute changes in facial and body expression that most of us wouldn’t notice.
    • Worse: it could be that we were actually incorrect and the patient simply hasn’t said so (demand characteristics), or the intervention does ‘something’ by happenstance – and we are none the wiser!

      I think the problem of biases in clinical reasoning is only exacerbated by relying on ‘self-reflection’ as a major approach to examining our own clinical practice. This is for many reasons, but one is that we consistently over-estimate our own abilities (the Dunning-Kruger effect). After all, most of us would rate our driving as ‘above average’! So we are likely to look at our therapy and think we’re doing OK, especially if we don’t have another referent to use as a benchmark. It’s one reason I think a useful strategy is to record a session and review it against a systematic checklist to see how well we’ve followed optimal treatment protocols. Another strategy is to ask another clinician to sit in on a treatment session.

      A further problem with cognitive biases that particularly plagues clinicians is that we are very good at reasoning backwards – after the fact! So we may be ‘intuitive’ in-session, then after the session identify the purpose and processes we used – and a final bias really knocks the stuffing out of our reasoning: we’re inclined to convince ourselves of our own ‘rightness’ as part of justifying our treatment. The more we discuss why we used a particular approach, the more convinced we become that it was the right strategy.

      What to do, what to do

      1. The first step is to become aware of the probability that any clinical reasoning we do will be subject to these cognitive biases. No-one is immune: from novice to highly experienced clinician, we’re all inherently vulnerable to the thinking shortcuts that gave us humans such a head start in dominating the world.
      2. The next step is to put some strategies in place to counter the most common biases. For me, this means systematically collecting a lot of clinical information across many domains, and delaying making a decision on ‘what is going on’ until after I have done this.  It means investing a good deal of time in assessment before beginning treatment.  It also means generating several competing hypotheses about what ‘might’ be going on.
      3. It means looking at outcomes dispassionately – using outcome measures that are less subject to demand biases than asking ‘How do you feel now?’. Take at least three measurements: one before treatment, one after treatment and one at follow-up (actually, I’d take the post-treatment measure several weeks after treatment ends, and the follow-up several months later – but this takes buy-in from the funder). A rough sketch of this appears after the list.
      4. It means questioning everything carried out as part of treatment – questioning and challenging and holding our processes up to someone else’s scrutiny, preferably someone who is prepared to challenge and question treatment choices just as strongly. This can take the form of a file review by the whole team, or a random sample of patient treatments reviewed against written protocols. Preferably reviews by someone other than you!
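
To make point 3 concrete, here’s a minimal sketch of tracking one outcome measure across the three timepoints. The scale, the scores and the ‘reliable change’ threshold are all made up for illustration – a real threshold is specific to the instrument you use.

```python
def change_report(pre, post, followup, reliable_change=5.0):
    """Summarise change on one outcome measure across three timepoints.

    reliable_change is the smallest difference treated as real rather
    than measurement noise; 5.0 is a placeholder, not a validated value.
    """
    report = []
    for label, score in (("post-treatment", post), ("follow-up", followup)):
        diff = pre - score  # assumes lower scores mean improvement
        if abs(diff) < reliable_change:
            verdict = "no reliable change"
        elif diff > 0:
            verdict = "reliable improvement"
        else:
            verdict = "reliable deterioration"
        report.append(f"{label}: {score:.1f} ({verdict} from pre of {pre:.1f})")
    return "\n".join(report)

# Hypothetical scores on a 0-50 disability questionnaire:
print(change_report(pre=38.0, post=30.0, followup=26.0))
```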

      While self-reflection is a great approach to use every day, it simply is not enough to ensure good therapy. We all need to be able to hold our work up to others for review, because our biases are like blind spots.

      I’ve linked to Croskerry’s 2002 paper on heuristics and clinical reasoning. While the paper refers to emergency medicine, working in chronic pain management doesn’t mean we make decisions any differently! In fact, we often deal with highly complex interacting variables, with outcomes that are often subtle and develop over a long period, perhaps months – and this makes our reasoning more challenging. I liked this paper because it describes each bias, lists several of its common names, explains its effect, and provides strategies to counter it. Read – and weep! We have to take note of all these biases in some way, or we may be trying to use ‘intuition’ when it doesn’t help one bit.

      Croskerry, P. (2002). Achieving Quality in Clinical Decision Making: Cognitive Strategies and Detection of Bias. Academic Emergency Medicine, 9(11), 1184–1204. DOI: 10.1197/aemj.9.11.1184

      11 comments

      1. I am so glad you bring this up, because I am so tired of clinicians – primary care doctor, surgeon, pain management specialist, etcetera – deciding based on the way I look that “it can’t be that bad” or “well, you look good”. My surgeon looked at me and actually said, “well, you look good, so you can’t be lying around in bed in pain all day” – which was exactly what I had been doing. And he had received many calls from me about my increasing post-surgical pain, and he knew I’d had an epidural steroid injection two weeks before our visit – he referred me for it.
        Just because I am in pain doesn’t mean I am not going to try to pull myself together to go to the doctor – it may be the only outing I have that week! I have lost 17 pounds since my February 4th surgery: 5’8″, down from 135 to 117 pounds. I guess they like skinny women, because they all say the same thing! If they would listen with their ears instead of their eyes, they might actually help me! Phew! Thanks – I’m glad I had a place to express that. When I try to explain to the doctor how much his statements infuriate me, he looks at me like I need a psych eval!

        1. Hi Yvonne – I can feel your frustration from here! Looks can be so deceiving – no-one knows what your pain or my pain feels like except us! If you ever get that feeling, ask them if they really KNOW whether the taste you have in your mouth when you eat chocolate is the same taste they get when they eat chocolate! No-one can tell whether the blue I see as sky blue is the blue you see as sky blue! So when you go in for an appointment, remember that your own experiences are yours alone, and you’re the only one who can tell whether something is helping or not.

      2. There was a study some time ago that showed that computer programs were *far* better at diagnosis than human beings. (A computer program *is* a human being, of course, at one remove, but it’s one that’s not swayed by intuition.) But there’s fierce opposition to using machines for it. I wish I could remember where I saw that study: it must have been twenty years ago, before I was in health care. I know that in my practice, my intuition often tells me that what I should use on this client is whatever worked really well on my last client🙂

        I’d be the last person to denigrate intuition, but it should not be used a) for diagnosis or b) for evaluating outcomes. For those, you want hard-as-nails criteria that don’t budge, and measurements that can’t be fudged. (If you can get them. That’s another conversation :->)

        1. I found quite a few studies demonstrating that human reasoning (whether it’s intuition or cold, hard analysis) is much less accurate at prediction than a ‘machine’ or algorithm. But as you say, people are not very comfortable with leaving the reasoning to the machines! The first study on this appeared in 1938, so it’s not a new phenomenon either.
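
Just to illustrate what an ‘algorithm’ means in these studies – a fixed, explicit rule that combines a few predictors with pre-set weights, applied the same way every time – here’s a minimal sketch. The predictors and weights are invented for illustration, not a validated rule:

```python
from math import exp

# Hypothetical predictors of poor outcome, each coded 0 or 1, with
# made-up weights (a real actuarial rule would estimate these from
# outcome data and validate them before use):
WEIGHTS = {
    "high_baseline_disability": 1.2,
    "low_self_efficacy": 0.9,
    "compensation_claim_open": 0.7,
}
INTERCEPT = -2.0

def predicted_risk(patient):
    """Logistic combination of the fixed weights: same inputs,
    same answer, every time - no intuition, no hindsight."""
    score = INTERCEPT + sum(w * patient.get(k, 0) for k, w in WEIGHTS.items())
    return 1 / (1 + exp(-score))

example = {"high_baseline_disability": 1, "low_self_efficacy": 1}
print(f"Predicted risk of poor outcome: {predicted_risk(example):.2f}")
```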
          I’ll dig out the reference for you and post it up on here.
          cheers
          Bronnie

      3. Well, this is timely! I am in the process of writing a chapter on intuition for the clinical reasoning book. Perhaps we should be doing this together! Have you come across Hamm’s (1988) approach to reasoning along a cognitive continuum? This ranges from systematic to intuitive reasoning and is said to depend on the task itself and the experience of the therapist. This makes sense to me – it does not negate science, but it certainly acknowledges the wealth of experience that can be built up, allowing fast decision-making without apparently thinking about it. The idea of biases is of course a valid criticism of this type of thinking if it is used without any recourse to critically evaluating those decisions. One question: is there a difference between decisions made intuitively based on theory/practice knowledge and those made intuitively based on our personal knowledge (e.g. of how to work with people)? The latter suggests a more empathic alliance and is perhaps an aspect of emotional intelligence. Now I am getting well out of my depth!
