This is an amended version of a talk on metacognition I gave at the Royal North Shore Hospital Grand Round. The clinical scenario has changed but the content remains the same. The purpose of this talk was to help our colleagues in the hospital specialties appreciate the mindset of the Emergency Physician and the difficulties intrinsically associated with the job we do.
Picture the scene
It’s early October and the ED is already busy. You’re on the morning shift and one of your first jobs is to clear out the patients from the short stay ward to make space for the inevitable admissions of the day. One patient is a 19-year-old male student who presented the preceding evening with vomiting and fever. He’s normally well (only taking olanzapine for schizophrenia). The overzealous registrar failed to diagnose Freshers’ ‘flu before doing bloods at 0200 (which were reassuringly normal – WCC 5, CRP 4). For some reason he’s been on the ward all night for a dose of ondansetron and two litres of Hartmann’s. His fever resolved immediately with IV paracetamol and his observations have been normal all night – currently HR 92/min, NIBP 110/68mmHg, RR 18/min, SpO2 97% on RA. The nurses tell you he had some profuse diarrhoea at around 0600 and you dread having the ward closed if it’s norovirus…
As you approach the bed to kick him out of your unit before infection control clocks he’s there, you notice he’s barely visible underneath the mountain of sheets and hospital blankets – apart from a left shoulder which sports a cracking bruise.
You wake him up, explain to him that he will feel much better being ill in his own student flat, and reassure him that the blood tests were all normal and he’ll feel better in a few days. He nods sheepishly.
Then you ask him how he got the bruise.
“Bruise?” he asks, looking for it. “I didn’t know I had a bruise.”
Suddenly alarm bells ring in your head – what if it’s not a bruise? What if it’s purpura? But the blood tests are normal…
What can we do when we are faced with contradictions in our diagnostic processes?
Well, as emergency physicians we spend a great deal of time in this state.
Thanks to the work of eminent EPs like Prof Pat Croskerry (check out these audio and video lectures which cover much of the first part of this talk) we are increasingly aware of our own thinking processes, so-called “metacognition”.
When we make any kind of decision, our brains rely on the information available to them. We think we tend to process this information in one of two ways – dual process theory, as described by Nobel prize winner Daniel Kahneman in his book “Thinking, Fast and Slow”. System one thinking is fast and intuitive – it accounts for around 95% of the decisions we make.
System two thinking is slow, deliberate, reflective, measured – and this is much less common.
To demonstrate, let’s try a small quiz – grab a bit of paper and a pen. There are no tricks here – just answer the questions.
- On a standard fire engine, there are two drivers up front, one at the rear and three additional staff – how many do you need for five standard trucks?
- How many turtle doves did my true love send to me on the 2nd day of Christmas?
- A bat and a ball cost £1.10 in total. The bat costs £1 more than the ball – how much does the ball cost?
- It takes 5 machines 5 minutes to make 5 widgets, how long would it take 100 machines to make 100 widgets?
- In a lake there’s a patch of lily pads and every hour it doubles in size. If it takes 48 hours to cover the entire lake, how many hours would it take to cover half the lake?
The answers to the first two are 30 (5 × 6) and 2 – those questions were just to warm you up. The others are 5p, 5 minutes and 47 hours. Think about them for a while – and feel free to tweet me if, after a bit of a think, you can’t figure out why those are the correct answers.
This is known as a cognitive reflection test: your intuitive response is to get these wrong, but if you take the time to reflect you can get yourself to the right answers. It has little to do with intellect or intelligence – instead it tells us about your ability to suppress the intuitive response, to overcome the System I thinking and engage System II.
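If you want to check the reflective answers against the arithmetic, here’s a minimal sketch (in Python, purely for illustration – there’s nothing clinical in it):

```python
# Bat and ball: if the ball costs x, the bat costs x + 1.00,
# so x + (x + 1.00) = 1.10 gives x = 0.05 – 5p, not the intuitive 10p.
ball = 0.05
bat = ball + 1.00
assert abs((bat + ball) - 1.10) < 1e-9

# Widgets: one machine makes one widget in 5 minutes,
# so 100 machines make 100 widgets in the same 5 minutes.
minutes_needed = 5 * (100 / 100)
assert minutes_needed == 5

# Lily pads: the patch doubles every hour, so one hour before it covers
# the whole lake (hour 48) it covers half of it – hour 47.
half_cover_hour = 48 - 1
assert half_cover_hour == 47
```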
We tend to think in one of these two modes: fast, reflexive and intuitive, or slower, reflective, analytical and deliberate. The fast mode is really useful – if you’ve seen a lot of sick patients, you get good at recognising patterns. Imagine a 55-year-old man with chest pain, grey and sweaty – what are you expecting on the ECG?
Heuristics
But sometimes we do need to overcome the automaticity of System I, otherwise we can make errors. Many of our patients can be approached in a very simple way – when we first start in medicine, and particularly in EM, everything is new and everything is difficult. Thought processes are conscious. But we naturally reduce the complexity of our decision-making to simple pathways: if this, then this. These processes are known as heuristics. Defined as “a practical problem-solving approach not guaranteed to be perfect but sufficient for the immediate goals”, they are mental shortcuts which may be generated subconsciously.
As doctors, we like them. It appeals to us to think we are superhuman and infallible. There’s even a word for “I just know” – the word “gestalt”. Sadly, attempts so far to validate clinician gestalt alone against clinical scoring systems have failed to show that we are really as brilliant as we think.
Dunning & Kruger described this in learners – as we begin to accrue knowledge we enter a phase of blissful ignorance. Educationalists call this “unconscious incompetence”; you might hear it referred to as being “at the top of Mount Stupid”. We certainly see this in doctors who are new to medicine and especially new to Emergency Medicine. In some places it’s the way we teach medicine – Occam’s razor states that the simplest explanation is usually the best. Consider the patient with RUQ pain, fever and jaundice. Acute cholangitis, right? Except that this triad has a specificity of 93.2% and a sensitivity of only 36.3% – a very imperfect test, probably not good enough for ruling in or ruling out in isolation.
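To see why that combination is weak, here’s a minimal sketch that converts those figures into likelihood ratios and applies them to a pre-test probability – the 10% pre-test figure is invented purely for illustration:

```python
# Turning sensitivity/specificity into likelihood ratios and applying
# them via Bayes (odds form) to an assumed pre-test probability.
sens, spec = 0.363, 0.932
lr_pos = sens / (1 - spec)        # ≈ 5.3 – moderately useful for ruling in
lr_neg = (1 - sens) / spec        # ≈ 0.68 – barely lowers the probability

def post_test_probability(pre_test_prob, likelihood_ratio):
    """Post-test odds = pre-test odds × likelihood ratio."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

pre = 0.10  # assumed pre-test probability, for illustration only
print(f"Triad present: {post_test_probability(pre, lr_pos):.0%}")  # ~37%
print(f"Triad absent:  {post_test_probability(pre, lr_neg):.0%}")  # ~7% – hardly ruled out
```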
Eventually, we get burned: we make a horrible error and we get away with it – or we don’t – and we come crashing down the other side. With experience comes wisdom, so we can learn to plug in cognitive stopgaps – for example, I have a programmed cognitive stop that forces me to satisfy myself that any young woman with abdominal pain or syncope does not have an ectopic pregnancy before I accept another diagnosis. When we take shortcuts and rely solely on heuristics and System I thinking, we leave ourselves open to errors in our decision-making.
We have to work hard to deliberately employ System II to ensure we don’t make diagnostic errors. It’s very hard. What’s this picture – a painful, dermatomal rash consisting of numerous tiny blisters?
How long did it take for you to know the diagnosis? Did you even need to click on the link to know it was shingles? There was a time, hard to believe it, when you saw this for the first time and you didn’t know what it was. But when we undertake deliberate practice regularly, it becomes subconscious – we can switch things over to System I, which is why we use simulation and regularly attend life support courses – so that when we don’t have time to sit and slowly employ System II thinking, we are equipped to take action.
Other biases
There are other factors at play here too. We all have intrinsic biases that shape our decision-making, many of which we might not be aware of. Think about our patient – is anyone willing to admit that they thought his symptoms might have a psychiatric cause? Bias against obese patients is particularly common and especially powerful. I strongly recommend that you each undertake the implicit association test online (Harvard’s is freely available). The concept of framing – that is, the context within which information is presented to you, either by the patient themselves or by the referring clinician – will shape the way you receive it. Contextual information matters. We tell ourselves that the risk factors patients have increase the probability of certain disease processes – which is true, but it doesn’t mean that a patient without any risk factors at all can’t have those exact same disease processes. This is Hickam’s dictum, the counterargument to Occam’s razor – patients can have as many diseases as they damn well please (with or without whichever risk factors they have or don’t have).
There are a host of other biases that play into our decision-making processes (see this great summary at First10EM) – things like:
- availability bias (we judge the likelihood of disease by the ease with which examples come to mind – so we are more likely to diagnose diseases we’ve had recent experiences of, or are particularly attuned to [a subtype called significant case bias])
- base rate neglect (we ignore the prevalence of a disease in our diagnostic reasoning, such as the “worst first” approach to diagnostics)
- anchoring (seizing a diagnosis based on early information and failing to adjust as new information becomes available)
- confirmation bias (giving more weight to information which seems to align with what we already think)
- search satisfaction (stopping once we have found something – the reason we miss the second fracture)
- premature closure (stopping too early in the diagnostic process because we think we have an answer)
- hindsight bias (forming opinions about early decision-making based on what happened later: “it was obvious… I can’t believe they missed it”).
I probably also need to mention blind spot bias – the tendency to hear a talk like this, to nod your head in agreement throughout, thinking about all the people this stuff applies to and thanking God or the tooth fairy that you’re exempt (you’re not).
Emergency physicians do a huge amount of decision-making and it’s something we aren’t really trained in – when we ask medical students which areas they think they need to focus on to be good doctors, they say things like “knowledge” and “practical skills”, rarely “decision-making”.
Diagnostic error doesn’t always lead to bad things happening. If we diagnose sinusitis and it was a viral URTI, it doesn’t really matter. The majority of our patients will get better in spite of what we do to them (we know this through placebo-controlled trials).
The answer must be more tests!
We must order more tests: more CT scans, more bloods, troponins and d-dimers and CRPs for everyone!
Well, no – and not just because of cost and radiation exposure. The results of tests depend on the questions we are asking. All tests have false negative and false positive rates, and for many the performance of the test depends on the prevalence of the disease within the tested population – not within the entire population.
We have to play off our own cognitive processes against an understanding of probability, and to understand probability better we have to appreciate Bayesian statistics.
Bayes’ theorem reminds us that:
- The tests are not the event
- Tests are flawed (false positive and negatives occur)
- Tests give us test probabilities, not real probabilities
- False positives skew results (especially for rare diseases)
- People prefer natural numbers
- We need to convert test results into real probabilities for our patients
By working as Bayesian practitioners, we try to add meaningful information to improve our probability of getting the right answer.
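To see how much false positives matter for rare diseases, here’s a minimal sketch using the natural frequencies people find easier to reason with – every number in it is invented for illustration, not taken from any real test:

```python
# How an "excellent" test behaves when the disease is rare.
population = 100_000
prevalence = 1 / 1_000          # a rare disease: 100 cases in 100,000 people
sens, spec = 0.99, 0.95         # a test that sounds superb on paper

diseased = population * prevalence
true_positives = diseased * sens                        # ≈ 99 real cases caught
false_positives = (population - diseased) * (1 - spec)  # ≈ 4,995 healthy people flagged

ppv = true_positives / (true_positives + false_positives)
print(f"Chance a positive result reflects real disease: {ppv:.1%}")  # ≈ 1.9%
```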
One of the helpful components of information we can add is time. We see patients comparatively very early in their disease trajectory; there are very few diseases that fully manifest in an unequivocal way immediately. Most develop slowly; the problem is that patients decide when to come to the ED.
We work in a decision-rich, information-light environment. By nature some of our decisions and diagnoses are going to be incorrect as disease processes manifest themselves, which might be why our excellent general medical colleagues always seem so wise and knowledgeable on the post-take ward round the next day.
Let’s go back to our case
Having thought about all these things, what should we do?
We don’t have a ready diagnosis to tie things together nicely a la Occam’s razor. We have several concerning features – the fever, the “bruise”, the vomiting – and a potentially treatable, potentially fatal possibility of meningococcal sepsis. We can make a decision to push the schizophrenia and the Freshers’ ‘flu, common as it is, out of the picture until we are satisfied there isn’t anything else going on.
We decide to give antibiotics and refer to the inpatient medical team. They repeat his bloods, finding his WCC now more than 20 and CRP in the 200s. We feel justified – ah, the power of confirmation bias.
Two days later, the blood culture taken overnight by the very thorough registrar – at the same time as the WCC of 5 and CRP of 4 – grew N. meningitidis. It’s all so easy in hindsight 🙂
So thanks for indulging me on this journey into metacognition – I thought this was a great case for exploring the way that we as medical practitioners think, and in particular the challenges of thinking in Emergency Medicine. That’s always worth our attention because our job can be pretty tricky. And that’s the message I’d like you to take away from this – Emergency Medicine is decision heavy and information light; we will get things wrong, and sometimes we pass some of that decision-making responsibility on to you as our inpatient colleagues, because we believe that time will help us to make sense of what we see.
Want to know more? Try these
Emergency Medicine: A Risky Business – Part One, Two, Three, Four, Five, Six & Seven
First 10 EM on Cognitive Errors – Part One, Two, Three & Four
Search Satisficing/Satisfaction at LiTFL
Monty Hall problem explained (including simulator)
P.S.
When is a door not a door?
When it’s ajar 🙂
Nat
Thanks Nat
Since you used the door analogy…
Similar to ‘Norman doors’, where design is out of step with our cognitive patterns/heuristics.
http://99percentinvisible.org/article/norman-doors-dont-know-whether-push-pull-blame-design/
However, we can just design the doors better to fit with our heuristics. Unfortunately that’s not an option with the patients 🙂
vb
I read that article recently and think of it every time I pull the door to access our ED (it’s a push door and totally counterintuitive).
Even though I walk through the same door at least 4 times a week, I still have to engage my conscious brain to open it correctly on the first attempt. Overcoming System I is pretty tough 🙂
Nat