Are you feeling satisfied? The GMC survey and EM.

[Image: Some of our previous amazing, fantastic, clever and hardworking F2 trainees in Virchester]

If you are a trainer or a trainee in the UK you will no doubt be familiar with the annual GMC survey. This national snapshot gives useful information on the state of training in all specialities across trusts in the UK. The GMC uses this data in many different ways, and it is really keen that everyone has access to it and that trainers use it to improve UK training.

Emergency Medicine in the UK has come under a great deal of scrutiny in the last few years, and the GMC has undertaken specific work to look at training in our speciality. Concerns have been raised about workload and work intensity, and as a result there are well-described difficulties in recruitment and retention. It is therefore interesting to look at the GMC survey to see what it can tell us about satisfaction amongst UK trainees in different specialities, but most of all for those in UK emergency medicine training.

How does the GMC survey work?

In brief, an online questionnaire is sent out in March each year. Survey results are released in the summer and most results are publicly available. You can read more about the survey here, and get access to the results here. The survey measures training in the following domains:

  1. Overall Satisfaction
  2. Clinical Supervision
  3. Handover
  4. Induction
  5. Adequate Experience
  6. Workload
  7. Educational Supervision
  8. Access to Educational Resources
  9. Feedback
  10. Local Teaching
  11. Regional Teaching
  12. Study Leave

These all make sense to me and all are worthwhile, but as a trainer, educational manager and consultant I find the overall satisfaction rating the most helpful in giving a global picture of training in departments, trusts and regions. As a global measure it’s an indicator of other aspects of the post in a holistic way; for example, I know of areas where satisfaction is high despite high (even excessive) workloads, as a result of other positive aspects of the learning environment.

So how did we do in 2014? How do we compare to others, and how do we compare to each other? In all honesty it’s quite tricky to answer those questions, as the survey reports in a myriad of different ways depending on whether you look at the data by grade, geographical area, location or even place of qualification. For edu-data nerds it’s nirvana. For the rest of us it can (at first look) seem confusing, so permit me to take you through some of the data with a focus on overall satisfaction and emergency medicine.

All of this data is free to access from the GMC website and everything below has been taken from it. The reporting tool is OK, but there is one aspect missing that I find useful, and that’s rankings. As an education manager I have found rankings to be a more powerful tool than simply giving out scores. At a recent education meeting a surgical colleague stated that he ‘did not care what his score was’; he just wanted to make sure that his department was ‘better than the same speciality in the hospital down the road’.

I get this. It’s human nature to be competitive and to measure ourselves against our peers, but I also have some concerns: when trainers talk about being ‘better’ they do need to define what that really means. We will discuss this later, but first let’s have a look at the data with a reworked and ranked analysis of overall satisfaction scores across the UK.
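
If you want to build the same sort of ranked table yourself, a minimal sketch of the idea is below. It assumes you have exported results to a CSV file; the file name and the column names (trust, post_specialty, grade, overall_satisfaction) are placeholders I have invented for illustration, not the GMC’s actual export format, so rename them to match whatever your download contains.

```python
import csv

def rank_trusts(path, specialty="Emergency Medicine", grade="F2"):
    """Rank trusts by mean overall satisfaction for one specialty and grade.

    NB: the file layout and column names are illustrative assumptions,
    not the GMC's actual export format - adjust to match your own CSV.
    """
    scores = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["post_specialty"] == specialty and row["grade"] == grade:
                scores.setdefault(row["trust"], []).append(
                    float(row["overall_satisfaction"]))
    # Mean score per trust, best first.
    ranked = sorted(((sum(v) / len(v), t) for t, v in scores.items()),
                    reverse=True)
    for position, (mean, trust) in enumerate(ranked, start=1):
        print(f"{position:>3}. {trust} ({mean:.1f})")

rank_trusts("gmc_survey_2014.csv")  # hypothetical export file
```

Nothing clever, but it turns a wall of scores into the sort of league table that my surgical colleague actually cares about.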

The GMC survey primarily reports on four different groups of doctors training in EM (for our international chums this may be helpful – see page 7).

  1. F1 and F2 docs on four-month rotations in emergency medicine. Very few of these intend to make EM a career, and all are part of balanced two-year rotations that encompass medicine, surgery and speciality placements. It’s great that they get exposed to EM: they learn loads, they are the most junior members of the workforce, and they require significant support and supervision. These doctors are almost always in their first EM placement.
  2. General practice trainees in emergency medicine, usually on a six-month placement in EM. This may be their first or second (rarely more) placement in EM. They are focused on training in EM for all the goodness that it can give to those pursuing a career in primary care.
  3. ACCS trainees, a mixed bag of Acute Medicine, EM and Anaesthetics. It’s tricky to pull out the data on career EM trainees in this group.
  4. Higher EM trainees, who are 3-7 years out of foundation posts and are career emergency physicians. They are the middle grade physicians who are training to be consultants in the speciality.

Let’s ask some questions…

[DDET 1. How does EM compare to other specialities?]

Overall it’s a bit of a mixed bag, but with some worrying trends. Posts in the first two years of training (Foundation 1 & 2) have pretty good levels of satisfaction, but these fall in the higher grades, with Higher trainees in EM scoring (on average) towards the lower end of where I would want them to be.

Now, you can get different results if you put EM up as a grouping (including all grades), but I don’t think that’s as helpful, as it then includes a whole range of different career ambitions and motivations. The same problem arises with ACCS trainees in EM: they are not identified as a separate group and therefore we cannot differentiate them from others. Having said that, ACCS trainees (a mixed bag of EM, Anaesthetics and Acute Medicine) score pretty low too.

[Chart: overall satisfaction scores by specialty and training grade]

[/DDET]

[DDET 2. How do training posts in F2 compare?]

F2 posts are a large part of the UK emergency medicine workforce and are arguably an entry point for those wishing to pursue EM as a career. The range of overall satisfaction scores in the UK is pretty wide, though this may reflect relatively small numbers of respondents in some departments (small samples tend to produce outliers). Barnsley appears to be the highest ranked trust by F2s in the UK.

 

[Chart: ranked overall satisfaction scores for F2 posts in EM]

[/DDET]

[DDET 3. What about GP trainees?]

The picture for GP trainees is similarly diverse, with the Royal Devon Hospital having the highest trainee rating.

[Chart: ranked overall satisfaction scores for GP trainees in EM]

[/DDET]

[DDET 4. Higher trainees in EM?]

This is arguably the group in which I have the most interest, as these are our future consultants in EM. They are the future of the speciality and the people who are going to look after me when I do something crazy. All trainees are equally great, but these trainees are even greater (with apologies to Orwell).

[Chart: ranked overall satisfaction scores for Higher EM trainees]

So, great scores in Brighton and Stoke, both busy trauma centres with some amazing trainers.

[/DDET]

Is this relevant to me?

Gosh yes! Feedback is really important if you want to improve training. This survey is unique in the UK, with data spanning regions, departments and trusts. Most importantly, our trainees have taken the time and effort to fill it in, and it would be jolly rude not to bother to look at what they have to say. We owe it to them to look at it and to use it to support training.

Use the data wisely though. We would all like to be at the top of the rankings, but that’s clearly ludicrous: half of us will always be below the median and there’s not a lot we can do about that. What we can do is use the information to lever change within and outwith our departments. I’ve seen some great examples of education leaders using data from the survey to make change in working patterns (notably handover), shift patterns, access to teaching and pretty much all other aspects of training. Like the surgeon described above, trusts don’t like to feel that they are down the ranking scales, and this can be very helpful in extracting resources to move people out of the red. Similarly, we should look to areas where consistently high scores are achieved. What are they doing, and what can we do to emulate them?

Check your own data!

If I’ve missed something in the data analysis please let me know. It’s a bit tricky extracting the data from the survey to manipulate in this way, so do shout if you spot something that’s gone astray. It’s another good reason to go into the reporting tool and check it yourself!

Musical interlude

This is what the EM workforce will look like if we don’t keep our trainees satisfied.

So what can we infer from these findings?

Although I like the overall satisfaction rating as a global measure, we should be cautious about what it might mean. It is possible that trainees are learning lots in departments that do not score well, and similarly it is possible that they are ‘coasting’ in others, as factors other than the quality of teaching can influence overall satisfaction. For example, staffing issues and workload can lead to low satisfaction scores even where there is great clinical learning, but we should not use that as an excuse. If we are to make EM great in the UK then we need happy and satisfied trainees, and whilst clinical education and working environments are not precisely the same thing, they are inextricably linked. Happy trainees will work hard, learn lots, be a pleasure to have around and stay in the UK. We therefore clearly want and need happy trainees.

My biggest concern is with our ACCS trainees. I can’t pick the EM trainees out in the survey, and yet they are the group about whom I have the greatest concerns. This seems to be the point in training at which we lose them to anaesthetics, primary care and the Antipodes. I’d love to know more about them (obviously I talk to them as well, but I feel they need representation here too). ACCS years 1 and 2 sit WAY down the rankings in terms of overall satisfaction, and whilst I can’t tell whether that’s all down to EM training, I suspect that it may well be. Anecdotally, ACCS trainees seem to enjoy their Anaesthetics/ICU placements, so, with great sadness, I strongly suspect that there is significant dissatisfaction with EM training at that level.

I would strongly encourage everyone to visit the site and delve more deeply into the data on their own department, trust, deanery and training programme. There is good data there that might help us understand what factors determine trainees’ ‘overall satisfaction’ in training.

As an example, here in Virchester we have used the data to look at how we have done, and we have identified some concerns around induction. Does that mean induction is ‘bad’? Well, it might, but it might not. What is important is that we explore this with trainees, trainers, the department and probably with other departments (who score highly in this area) to see if there is anything that we should be doing differently.

I would also exercise caution about sample size in the survey. The GMC will only report data where at least three trainees responded; fewer than that and no figure is given (to protect anonymity). That does mean that scores can be variable and at the whim of a particularly enthusiastic, or grumpy, set of trainees. It also explains why many departments don’t feature in the survey: if fewer than three trainees responded then you will not appear in the tables. I suggest looking for trends in the data; on the GMC website you can look at scores from 2013 and earlier. Does your locality have a trend in satisfaction? If so, that trend is more likely to represent a true reflection of the training environment.
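
To make those two safeguards concrete (a minimum-respondent filter and a multi-year trend check), here is a rough sketch in the same spirit. The data structure is invented for illustration, not the GMC’s actual export format, and the threshold simply mirrors the GMC’s reporting rule.

```python
# Illustrative only: 'results' maps trust -> {year: (mean_score, n_respondents)}
# and is an invented structure, not the GMC's actual export format.

MIN_RESPONDENTS = 3  # the GMC suppresses results below this threshold

def satisfaction_trend(results, trust):
    """Return year-on-year scores for a trust, skipping tiny samples."""
    usable = {year: score
              for year, (score, n) in results[trust].items()
              if n >= MIN_RESPONDENTS}
    years = sorted(usable)
    if len(years) < 2:
        return None  # a single year of data is not a trend
    return [(year, usable[year]) for year in years]

example = {"Virchester": {2012: (74.0, 5), 2013: (71.5, 2), 2014: (69.0, 6)}}
print(satisfaction_trend(example, "Virchester"))
# -> [(2012, 74.0), (2014, 69.0)]  (2013 dropped: only two respondents)
```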

So, in summary: the data is out there and the data is public. If you’re not looking at the data from your department then I can assure you that someone else is. It’s up to you whether you delve into it more deeply, but in life I have always found it valuable to be in possession of the feedback before someone else knocks on the door with it.

Please let us know what you think about what the survey tells us about UK EM training.

vb

S

 

NB. I do work with the GMC as an education associate, but all of the views, thoughts and analysis above are entirely my own interpretation and nothing to do with the superb team at the GMC. However, colleagues within the GMC have consistently said that they want the GMC survey to be looked at, discussed, debated, shared and, most importantly, used. Hopefully this blog will help share the information and help us improve UK EM education.

 

References

Medical Education on the Front Line. GMC 2013.

 

Cite this article as: Simon Carley, "Are you feeling satisfied? The GMC survey and EM.," in St.Emlyn's, July 3, 2014, https://www.stemlynsblog.org/feeling-satisfied-gmc-survey-em/.

3 thoughts on “Are you feeling satisfied? The GMC survey and EM.”

  1. Thank you for the summary. Our department has looked at our data and was very surprised by it. From a trainee point of view, I didn’t realise that only the GMC gets to see the free-text comments, as some of the comments justified the answers. Some of the questions are daft:
    – How often do you do things you don’t feel competent to do? All the time… that’s called learning.
    – How often did you get study leave refused? Often, because it’s a team: we can’t take SL on nights, if lots of people already have leave booked, or if it’s deemed educationally unsuitable. That doesn’t make it bad.

  2. I like your approach to this, Simon. Often more effort is wasted on finding flaws in the questionnaires or coming up with excuses, than actually trying to see what one can learn from a survey like this. Any kind of quality measurement needs to be discussed and analyzed with all stakeholders, so it can be put into context. I would definitely choose the workplace where problems are acknowledged and there is a willingness to do better, over the one where staying at the top of the list is the main priority.

