There has been a lot of activity recently around a case that happened several years ago. A boy with trisomy 21 was admitted to a hospital in the UK with signs of infection. Several signs of his deterioration were present but did not receive an adequate response, and he ultimately suffered a cardiac arrest and died. The senior trainee involved was eventually arrested and put on trial for criminal negligence causing death, a charge that can carry a prison sentence.
Several articles in a recent issue of the BMJ address the issues surrounding the case, which ended with the trainee being found guilty and given a 2-year suspended jail sentence. Her medical licence was initially suspended, but it has now been completely revoked: the General Medical Council in the UK decided that temporary suspension was not a severe enough punishment and successfully appealed it.
There is a very thoughtful article in The Times about the case, entitled “Are Hospitals Doomed to Repeat Their Mistakes?” The author notes that multiple factors in the case increased the likelihood of errors: the doctor involved had just returned from maternity leave, had not been oriented to her new role in an acute assessment unit, and was covering for several physicians who were not available, eventually covering six hospital services spread over four different floors of the hospital. She actually told the covering consultant physician that the child had a pH of 7.07 and a lactate of 11, yet the supervising consultant escaped any criticism because his trainee had not emphasized that the values were worrying.
I hope that, if a similar case occurred in my hospital, the multiple problems would be addressed. I think the processes in place here would, in general, have addressed many of the systemic issues rather than blaming a trainee. In fact, blaming trainees is unlikely here, and certainly a trainee who informed her supervisor of seriously abnormal results would not have been blamed.
Trainees are supposed to be just that: under supervision. If a resident told me that a baby had a lactate that had risen to over 11, with a correspondingly serious metabolic acidosis, then, as soon as I knew it, the response to those abnormalities would become my responsibility. If I failed to respond, there should be processes in place to ensure follow-up, not with the resident, but with me!
The author of The Times article draws parallels, as has often been done before, between the culture of learning from mistakes in aviation and the risks of a blame-based culture, which the courts are designed to promote.
In the aftermath, Bawa-Garba (the trainee involved) talked through the case with her consultant, an important process that enables doctors to reflect on and learn from their experiences. To her credit, she confided in the consultant that she could have done better. This was a tribute to her candour. Everyone can get better. Everyone can improve. Yet that one sentence from a confidential reflection was later used against her in court. Her honesty provided a key plank of the prosecution argument: if she could have done better, doesn’t that imply she was somehow negligent?
It is difficult to exaggerate the significance of this case, not just to Leicester (where the incident occurred) but to the entire NHS. Preventable medical error kills thousands every year. The only way to reduce these figures is to learn from every single one. This cannot happen if professionals are so fearful of being penalised for entirely honest mistakes that they live under a cloud of fear. Why would a doctor be open, a key prerequisite for institutional learning, if they could be put through years of hell for doing their very best in the most trying of circumstances?
Matthew Syed (the author of the piece) notes that there were no fatal jet plane crashes in the entire world last year (that’s right, not one), a result of aggressive quality-control initiatives that seek out correctable failings in the system, and of a culture in which pilots and crew are encouraged to report every minor failing, knowing they will not automatically be individually blamed.
I know that I have made significant errors in my career, one of which (many years ago) led to a child needing a second surgery, because I misinterpreted a new procedure involving a new use of a medication. I prescribed a dose of heparin that was 10 times too high. The dose seemed excessive to me, so I went to check it, but by the time I returned to correct the prescription the nurse had already given the excessive dose (which I did not realize at the time). During the night the patient began to bleed excessively and had to go back to surgery. The next day I made clear to the consultant that the error was mine and not the nurse’s, and safeguards were then put in place to prevent similar events in the future. If I had known there was a risk of a criminal prosecution, one that would have ended my career, I might have been much more reticent about admitting whose fault it was.
We can reduce medical errors if there is a culture of openness and transparency, if everyone is encouraged to report failures without a risk of individual blame, and if the response to an error is to find a solution, rather than to find the individual responsible.