Marching Backwards into the Future: Why Healthcare Needs Fewer Modules and More Systems Thinking

By Jesse Spurr from the amazing Five Things Nursing Podcast, which is co-hosted with our very own Dr Liz Crowe. You should definitely subscribe, as it is relevant to everyone in healthcare. You can find it here or on your favourite podcast platform.

“We look at the present through a rear-view mirror. We march backwards into the future.” – Marshall McLuhan

If you have worked in healthcare long enough, you start to notice a pattern. Something goes wrong, whether a near miss, a complaint, or a policy gap, and the organisational response arrives almost instantly.

Add training.
Build a module.
Require a refresher.
Remind everyone to pay more attention.

It is a reflex from a simpler clinical era. When I started nursing on a surgical ward, if you knew ten medicines well, the rest rarely surprised you. Today, we operate in a system built on decades of additive fixes. Each new risk generates a new requirement, another click, another password, another module that must be completed. We keep applying old logic to new complexity and the road ahead becomes harder to navigate.

The truth is straightforward. Systems rarely fail because people do not know enough. They fail because of complexity, constraints, design flaws, and competing priorities. Training used as a default solution is often ineffective, sometimes harmful, and always expensive. It can undermine psychological safety, waste resources, and distract from the system fixes that may actually solve the problem.

This post explores why this reflex persists and why a better default is possible.

Systems Thinking: Errors Are Signals, Not Indicators of Incompetence

Healthcare is a complex adaptive (insert messy human) system. Performance is shaped by the interaction of people, tools, processes, and context. Most errors do not reflect a lack of knowledge. They reflect barriers, friction, and design issues that sit far upstream.

Consider medication administration errors. A training-first response assumes forgetfulness or non-compliance. But when we look closer, the contributing factors are many:

  • alert fatigue
  • time pressure
  • interruptions
  • staffing gaps
  • contradictory policies
  • clunky interfaces
  • paper and digital hybrids that require duplication.

Training cannot correct these influences. At best, it compensates temporarily. At worst, it creates a false sense of action. The phrase “We trained them, so the risk is closed” is one of the most dangerous forms of organisational self-reassurance.

Training as a default also sends a subtle but powerful message. It suggests that clinicians are the problem to be solved rather than key partners in identifying real solutions. This erodes positive regard, which is the belief that staff are competent and acting with good intent. Over time, psychological safety shrinks. People stop raising weak signals or near misses because they expect what will follow. Another module, another checklist, another layer of scrutiny.

When people closest to the work lose influence over solutions, the entire system becomes less adaptive and less safe.

Cognitive Load: Training Competes with Attention, and Attention Is Finite

Clinicians carry enormous cognitive burdens. They manage complex information, highly variable case-mix, constant interruptions, documentation requirements, occupational violence, deteriorating patients, and rapid re-prioritisation. On top of this sits an avalanche of:

  • mandatory modules
  • policy updates
  • annual competencies
  • safety alerts
  • inboxes that spiral
  • duplicated documentation

Cognitive load comes from three main sources:

  1. The complexity of the information
  2. The way that information is delivered
  3. The mental effort required to process and apply it

Most mandatory training adds to the second and third categories. And when training is not given protected time, it ends up squeezed into the cracks of the day. After hours, between patients, during missed breaks. The result is predictable. Learning suffers. Attention is fragmented. People complete modules for compliance, not competence.

When staff predictably fall behind, the language can imply personal failure instead of recognising that the system is structurally overloaded.

When training is genuinely needed, it must reduce cognitive load rather than add to it. It should be short, contextual, clinically relevant, supported at the point of care, and integrated with workflow through micro-coaching, huddles, performance aids, and in situ practice.

In other words, learning should fit the work rather than compete with it.

Emergence: Small Additions Create Big and Often Invisible Costs

Complex systems do not respond in simple, linear ways. A seemingly small addition like a new module can produce disproportionate effects.

Without resourcing or protected time, staff must compensate. They do this by giving something else up. Usually quietly. Usually invisibly. They might reduce supervision, truncate handovers, skip reflection, work through breaks, or sacrifice teaching moments. These small losses accumulate. Over time they lead to more brittle performance, more error drift, reduced teamwork, and thinner support structures.

Training intended to make things safer can unintentionally make things riskier by absorbing cognitive and emotional capacity needed elsewhere.

Fortunately, emergence also works in the positive direction. Streamlining documentation, fixing interfaces, simplifying workflows, or removing unnecessary steps can unlock time, mental space, and energy. These improvements often create broader benefits than any volume of training.

Small system changes can produce large safety gains.

Economics: The ROI and ROE Problem

Let us talk cost and value.

Imagine a one-hour mandatory module for 3,000 staff. With an average hourly cost of $70, the direct expenditure is $210,000. Opportunity cost multiplies this further. That is 3,000 hours not spent on patient care, supervision, improvement, or recovery.
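For leaders who want to make this arithmetic explicit, the back-of-envelope calculation above can be sketched in a few lines. This is a minimal illustration using the figures from the text; the staff count, module length, and hourly cost are assumptions you would replace with your own organisation's numbers.

```python
def training_cost(staff: int, hours_per_person: float, hourly_cost: float) -> dict:
    """Estimate the direct spend and opportunity cost of a mandatory module.

    The opportunity cost is expressed simply as clinical hours lost,
    i.e. time not spent on patient care, supervision, or improvement.
    """
    direct = staff * hours_per_person * hourly_cost
    lost_hours = staff * hours_per_person
    return {"direct_cost": direct, "lost_clinical_hours": lost_hours}

# Illustrative figures from the example: 3,000 staff, a one-hour module,
# average hourly cost of $70.
estimate = training_cost(staff=3000, hours_per_person=1.0, hourly_cost=70.0)
print(estimate)  # {'direct_cost': 210000.0, 'lost_clinical_hours': 3000.0}
```

Even this crude model makes the point: the budget line is only half the story, because every dollar of direct cost carries an hour of clinical time alongside it.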

Most mandatory training delivers poor return on investment because the root cause is usually not knowledge. It also delivers poor return on expectations because it seldom improves the outcomes leaders, patients, and families care about: safer care, better flow, stronger culture, more reliable performance.

Then there is the trust tax.
Every unnecessary training request signals a misunderstanding of real work. Staff disengage. Reporting declines. Cynicism grows. None of this appears as a budget line, but its impact is unmistakable in culture surveys, attrition data, and error rates.

In contrast, system redesign often costs less, lasts longer, reduces cognitive load, and strengthens culture. Better interfaces, clearer workflows, and more usable tools often outperform training by a wide margin.

Training should support system change, not replace it.

Mandated Training Without Resourcing Creates Chaotic Disinvestment

Any new training requirement without protected time introduces risk. Something will give way, but we rarely know what or where. Staff compensate by rushing, skipping fundamentals, abbreviating communication, delaying documentation, reducing supervision, or taking work home.

These compensations hide the true cost. But staff feel the impact immediately.

Culturally, the effect is significant. Training as punishment or correction communicates that individuals are the problem. New graduates feel watched. Experienced clinicians feel patronised. Everyone feels busier but not safer. Everything feels less human. We must need compassion training…

This is how a learning organisation becomes a compliance organisation.

A Better Default: Think Systemically and Respect Staff Time

Before approving training, leaders should ask (and if they are not, workers should question):

  • What is the real problem?
  • Is it knowledge or is it workflow, design, usability, staffing, or culture?
  • What must staff stop doing to make space for this?
  • What is the economic cost, including opportunity cost?
  • How will this affect psychological safety and agency?

If these questions cannot be answered clearly, training is almost certainly the wrong intervention.

A Six Step Guide to Interrupt the Training Reflex

Healthcare is an intricate and demanding system powered by clinicians who already operate at the edge of their cognitive and emotional limits. Training can be transformative when it is the right intervention. Well designed, well resourced, and embedded in real work, it can lift performance and culture. The challenge is making training a deliberate choice rather than a reflex, and these six steps can help.

1. Clarify the Real Problem

Identify whether the issue is knowledge, workflow, usability, or capacity before deciding on a solution.

2. Compare Training with Non-Training Solutions

Consider redesign, defaults, nudges, staffing changes, or interface improvements before defaulting to education.

3. Build an Economic Case

Calculate cost, expected benefit, and opportunity cost. If there is no measurable benefit, the training should not proceed.

4. Protect Time

If you cannot provide protected time or facilitation, the training is not ready.

5. Reduce Cognitive Load

Keep content short, relevant, and connected to real work. Focus on micro learning, coaching, and workflow integrated supports.

6. Evaluate and Adjust

Measure early. If benefits do not appear, stop or redesign.

Final thoughts

If we keep solving system problems with individual solutions, we will continue to march backwards into the future, only faster.


Jesse Spurr

Jesse co-hosts the very successful Five Things Nursing Podcast with Dr Liz Crowe. 100 episodes in 100 countries. You can find it here or on your favourite podcast platform.

Cite this article as: Jesse Spurr, "Marching Backwards into the Future: Why Healthcare Needs Fewer Modules and More Systems Thinking," in St.Emlyn's, April 13, 2026, https://www.stemlynsblog.org/marching-backwards-into-the-future-why-healthcare-needs-fewer-modules-and-more-systems-thinking/.

Thanks so much for following. Viva la #FOAMed
