BBC News - Education & Family

Friday, 23 July 2010

Can chance make you a killer?

In his regular column, Michael Blastland invites you to try the deadly Go Figure Chance Calculator.
Imagine you are a hospital doctor. Some patients die. But how many is too many before you or your hospital are labelled killers?
We've devised a chance calculator to simulate this scenario. It is set up so that you are innocent of any failing. But bad luck might convict you all the same.
In the real world all kinds of factors make a difference, like surgical skill. But in the calculator, every patient in every hospital has exactly the same chance of dying and every surgeon is equally good. This is to show what chance alone can do, even when the odds are the same all round.
  • The calculator (below) shows 100 hospitals each performing 100 operations
  • The probability that a patient dies is initially fixed at five in 100
  • The government, meanwhile, says death rates 60% worse than the norm are unacceptable (in red)
  • So any hospital which has eight deaths or more out of 100 ops - when the expected average is only five - is in trouble.
  • We've assigned one hospital to you, with a box around it - it could come out green or red.
Start the calculator by clicking on the slider itself and see whether your hospital - the boxed one - is safe or deadly. Click "recalculate", in the lower right-hand corner of the module, a few times to see how you and others do.
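The calculator itself can't be reproduced on the page, but its logic can. Here is a minimal sketch in Python - my own reconstruction, since the module's code isn't published, so the variable names and the rounding of the threshold are assumptions - using the numbers above: 100 hospitals, 100 operations each, a 5% risk, and a red flag at eight deaths or more.

```python
import random

HOSPITALS = 100   # simulated hospitals
OPS = 100         # operations per hospital
P_DEATH = 0.05    # identical 5-in-100 risk for every patient
THRESHOLD = 8     # 60% worse than the expected five deaths

# Each operation is an independent draw; count the deaths per hospital
deaths = [sum(random.random() < P_DEATH for _ in range(OPS))
          for _ in range(HOSPITALS)]

flagged = sum(d >= THRESHOLD for d in deaths)
print(f"'Unacceptable' hospitals: {flagged} of {HOSPITALS}")
print(f"Worst: {max(deaths)} deaths; best: {min(deaths)}")
```

Run it a few times - the equivalent of clicking "recalculate" - and the count of flagged hospitals jumps around, even though nothing about the hospitals has changed.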
Here's what happened when I ran the calculator a few times.
Chance calculator 1
First, some hospitals look dodgy, others brilliant. In this example (see image, right), one surgeon or hospital had 14 deaths (that's the red H out on its own beneath the big cluster) - 1,300 per cent higher mortality than the hospitals that had only one death. A huge disparity. Mine - boxed - was one of the unacceptables. So sack me.
But remember, these results are pure chance, computer-generated, based on exactly the same risk for every patient. So hospitals are not really good or bad, it's just chance, lucky or unlucky.
That sounds odd. The calculator seems to show fatal incompetence or maybe even - let's speculate what goes through the public mind - murder at one, medical genius at another.
Keep recalculating and sometimes only a few are unacceptable.
Chance calculator 2 and 3
The example above left has five "bad" surgeons. Roll the dice again and it comes up with a shocking 20 that failed to meet the standard.
Chance calculator 4 and 5
Next, move the slider up to, say, a 12% death rate. This imagines a more dangerous operation. But now there are fewer unacceptables as there tends to be relatively less variation around bigger numbers.
Finally, move the slider right down to a 1% death rate. Now, still using the 60% threshold, a huge number of hospitals are often unacceptable. That's because there tends to be relatively more variation around smaller numbers.
The same applies to the number of ops performed. Do more, and the variation tends to be relatively smaller. Do only a few and the results are more likely to be - relatively - all over the shop. The government says it would like to publish results right down to the level of the individual surgeon.
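That pattern can be checked directly with the binomial distribution. The sketch below (assuming, since the calculator's exact rounding rule isn't stated, that a hospital is flagged at 160% of the expected deaths or more, rounded up) prints the chance that a blameless hospital gets flagged at each slider setting:

```python
import math
from scipy.stats import binom

N_OPS = 100
for p in (0.01, 0.05, 0.12):                # slider settings
    threshold = math.ceil(1.6 * N_OPS * p)  # "60% worse than the norm"
    # probability a blameless hospital is flagged purely by luck
    tail = binom.sf(threshold - 1, N_OPS, p)
    print(f"death rate {p:.0%}: flagged at {threshold}+ deaths, "
          f"chance per hospital {tail:.1%}")
```

The chance of a false "unacceptable" comes out far higher at a 1% death rate than at 12%, which is exactly what moving the slider shows.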
Chance v skill
Note that chance does not mean without cause. Every death has a cause, but sometimes these come together more often in certain places at certain times in ways that have nothing to do with anything we know or that can currently be known about the patient or the surgeon.

Think of it like this...

Think of a bag of 100 balls, five of them red. Pull a red ball from the bag and it means a death. What if you pull 100 balls from the bag, each time putting the ball back?
Your chance - and it is only chance - of pulling a deadly ball is 5 in 100, or one in 20, or 5%. But it's easy now to imagine that you might draw 14 red balls or more in 100 attempts, or none, purely by chance.
This is the same as the simulation in our calculator; every time you run it, it is like imagining that 100 hospitals dip into the bag 100 times each.
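You can also put a number on the 14-death hospital from my first run. Assuming the same bag-of-balls model, a couple of lines give both the chance for a single hospital and the chance that at least one of 100 innocent hospitals produces such an extreme:

```python
from scipy.stats import binom

# chance one hospital draws 14 or more red balls in 100 pulls at 5% risk
one = binom.sf(13, 100, 0.05)
# chance that at least one of 100 independent hospitals does
any_of_100 = 1 - (1 - one) ** 100
print(f"single hospital: {one:.3%}; at least one of 100: {any_of_100:.1%}")
```

Very unlikely for any particular hospital; much less unlikely once 100 hospitals are all dipping into the bag, and less unlikely again over repeated runs.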
Does all this mean that every hospital mortality rate is pure chance? Of course not. There can be what's called special-cause variation - in contrast to the common-cause variation in our calculator. Special-cause variation is when something like surgical skill is the real reason for 15 deaths. The big problem - bigger than many people think - is how to tell the difference.
So what about a hospital like Mid Staffs, where hundreds are thought to have died unnecessarily? That's an example - probably - of special-cause variation, not chance. There's to be more of this kind of measurement as the government says the effort must shift from setting targets for how much work is done to judging how well it's done. That probably means more emphasis on counting bodies.
In practice, regulators who make calls about standards perform many hard judgments and calculations to work out what's chance and what's not.
But because chance has such a variable effect, they will often be unsure, even about the kind of huge differences we see in the calculator. Anyone thinking raw data is all we need, beware.
Before naming the good or bad, we need to understand the extent to which chance can make the innocent appear dangerous and make heroes of the ordinary.
Note: For a great but techie illustration of how hard it can be to determine whether a pattern of results tells us anything, there is a fascinating exchange on Ben Goldacre's other blog.
Read some of your comments
I've always thought these sorts of claims based on statistics are fairly dubious, for the reasons above. The "unacceptable" category should only apply to hospitals where fault on behalf of the staff can be proved, or where frequent complaints - e.g. about cleanliness - are made. The statistics should simply serve as a guide to where to concentrate the search for this poor practice, rather than as proof of its existence.
David, Durham
From a personal, anecdotal perspective, may I add this: in 1990 (20 years ago) I underwent a quadruple cardiac bypass and today, in my 80th year, I am in remarkably good health and not on prescription medication. About three years ago, on seeing the track records of the cardiac surgeons still practising, I found that the surgeon who did my operation had the worst record. OK, so I am lucky - but could there be bad luck on the surgeon's side, due principally to the health of the patients allocated to him (and their management of their own aftercare)? While there remains an unnecessarily high level of deaths due to mistakes or carelessness, could it also be that successful surgery is more of a roulette game?
Robert , Kirriemuir, Scotland
It is not only the surgeons who are variable in quality - the clinical state of the individual patients and the difficulty of the individual operations also throw in variables.
JD, Birmingham
Death rates can be very misleading. Hospitals which take on riskier cases where surgery is the final and only option may have a higher rate of deaths than another which does not have clinicians with the right level of skill to perform these often tricky operations. Does this make it a worse hospital? Of course it doesn't.
Andy, London
A good statistical evaluation WILL point out the laggards. This is because a good statistical evaluation will recognize the effects of what you are showing. It will not base the analysis on a single year's data, but on multiple years, and will take into account the danger of a particular operation. When those factors are considered, a good statistical analysis will accurately identify "outliers" - those hospitals or doctors whose performance lies outside of chance.
John Wendt, Kampala, Uganda
Suppose you're one of the failing hospitals above, and you're told to improve. You do nothing at all. Next year, you roll the dice again and, by chance, you're now in the pass category. You are praised. However, the results are nothing more than chance. This is regression to the mean - the tendency for extreme results to become mean results over time. The same would apply to any random "corrective" action you take - if next year's results are better, that corrective action is likely to be praised as effective even if it was totally useless, and your original problem was solely down to chance.
Dan Talmage, Cambridge
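Dan Talmage's point is easy to demonstrate with the same simulation: flag the unlucky hospitals one year, change nothing, and see how many "improve" the next. A sketch, using the calculator's assumed parameters:

```python
import random

def run_year(hospitals=100, ops=100, p=0.05):
    # one year of results: identical risk for every patient everywhere
    return [sum(random.random() < p for _ in range(ops))
            for _ in range(hospitals)]

this_year = run_year()
flagged = [h for h, d in enumerate(this_year) if d >= 8]
next_year = run_year()                     # no intervention at all
improved = [h for h in flagged if next_year[h] < 8]
print(f"{len(improved)} of {len(flagged)} flagged hospitals 'improve' "
      "despite doing nothing differently")
```

Most of the flagged hospitals pass the following year, because their bad year was luck in the first place.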
For anyone who thinks that this is just a bit of harmless fun, go searching for the similar case of Lucia de Berk, also highlighted by Ben Goldacre. She is a Dutch nurse who spent six years in prison, convicted of the murders of child patients which were, in all probability, natural deaths. But because she happened to have been in the wrong place at the wrong time several times, by chance, and helped by a witch-hunt which used some heavily criticised statistics, she got the blame. Even the statistics used to condemn Mid Staffs hospital, and to allow credulous journalists to claim hundreds of excess deaths there, were strongly criticised in an article in the British Medical Journal at the time - criticism which seems to have been largely ignored by both the media and the relevant politicians.
Dr JG, West Yorkshire
It would be equally fascinating to see such an experiment applied to investment firms and banks. Since the 1920s, experiments have constantly shown that their performance is no more than a random distribution. In fact random selection (literally throwing darts at the Wall Street Journal in the Harvard economics staff room) gave better results. Yet the City continues to behave, and charge fees, as though it has some special insight. In reality they depend on the fact that most of us do not understand the sort of statistics Michael has just shown.
Des, London, UK
It's amazing how many situations this can be applied to. A good one (although slightly worrying) is the application in the financial world. After five years, an original pool of 1,000 individuals (could be traders, budding entrepreneurs etc) will diminish to 31 assuming they have a 50:50 chance of success each year and are fired if loss-making. We forget about the 969 individuals who lost and heap praise on the remaining 31 - not realising their success is pure luck and a result of basic probability.
RH, Manchester
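RH's arithmetic is easy to verify - halving 1,000 five times leaves 31:

```python
survivors = 1000
for year in range(5):
    survivors //= 2   # half are loss-making and culled each year
print(survivors)      # 31 'successes' remain by chance alone
```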

Sex education 'could be better'

PSHE is part of the national curriculum in England but is not compulsory
Lessons about sex, relationships and health are not good enough in 25% of schools in England, inspectors suggest.
Teacher embarrassment and lack of knowledge were often to blame, Ofsted said in a report based on findings at 92 primary and 73 secondary schools.
It said in many secondary schools, pupils were taught about the biology of sex but not relationships.

The government says all young people should have high-quality teaching in this area.
It will take Ofsted's findings into consideration in its review of the curriculum.
Ofsted looked at personal, social, health and economic (PSHE) education in 165 schools in England.
The subject has been part of the timetable in most schools for about a decade.

Case study

Holmleigh Primary School in Hackney has enthusiastically built personal, social, health and economic education into its timetable.
Half-hour sessions start at the age of five with issues such as relationships, bereavement and bullying. As children move through the school, more sensitive issues are introduced, such as substance misuse (age six), puberty (age nine) and reproduction and childbirth (age 10).
Staff are given extra training in tackling sensitive subjects.
The hope is that by the time they leave the school the children will have a good understanding of issues they may face. Connacht, 11, said: "I never really knew much about drugs or alcohol but when we were taught them I understood and I have told my uncle to stop smoking."
She says the lessons are helpful in dealing with playground rumours such as "when girls start their periods they bleed to death".
Her classmate Rebecca adds that parents sometimes don't like talking about issues like sex, so it's helpful that children can learn about them in school.
Ofsted found that in more than a third (34%) of the secondary schools visited, students' knowledge of sex and relationships was "no better than satisfactory", while in a further three schools it was rated "inadequate".
The report says: "Students' knowledge and understanding was often good about the biology of sex but weaker about relationships.
"They said that their sex and relationships education was taught too late and there was not enough of it to be useful.
"Discussion was sometimes limited because of the teacher's embarrassment or lack of knowledge. In these schools, the students did not have the opportunity to explore the nature of relationships in any depth. They had not discussed managing risks, saying 'no', negotiation in relationships, divorce and separation, or living in reconstituted families."
In general, Ofsted found PSHE lessons in three-quarters of the schools surveyed were good or outstanding.
But in the remaining quarter, inspectors said the quality of teaching was variable and teachers' subject knowledge and expertise were not good enough.
Chief Inspector Christine Gilbert said: "It is pleasing to see that most of the schools visited were good or outstanding at teaching the subject. However, there were some weaknesses and schools should continue to promote professional development in PSHE education so that teachers strengthen their knowledge and skills in the subject.
"In addition, some schools still struggle to teach their pupils effectively about sensitive but important issues such as the misuse of drugs and alcohol."
In over half of the secondary schools visited, inspectors found students' knowledge about the social risks and physical effects of excessive drinking was "undeveloped".
And it was a common misconception that heroin and cocaine were the drugs responsible for most deaths each year, when in fact smoking and drinking account for many more.
Biological facts
PSHE teaching is not compulsory in England, unlike other parts of the UK, although it is on the national curriculum.
It is only compulsory to teach the biological facts of reproduction, in secondary school science lessons.
Parents have the right to withdraw their children from sex education lessons.
The Labour government had planned to make PSHE a compulsory part of the national curriculum.
A spokeswoman for the Department for Education said: "This report from Ofsted is a useful assessment of PSHE education in schools.
"We want all young people to benefit from high-quality PSHE teaching and we will take this report's findings into consideration as we continue to look at the curriculum across the board."
Read some of your comments
As a secondary school teacher, I am concerned at the way sex education is delivered to pupils. Whilst I agree that there needs to be some sharing of information, a whole day of it either bores the kids or gives them the impression that this is something that all children have to be doing in their teens.
Anon, UK
All children should be educated around major issues such as sex and relationships, drugs and emotional wellbeing.
Carol, Manchester
Since when did the responsibility for children's moral and sexual upbringing become the responsibility of schools? Strangely enough, I thought this would be the responsibility of parents - or is that yet another area that parents have now abdicated?
Paul, Derby
Sex education is not something teachers should have to impart; it should rather fall to parents or guardians. The tendency under Labour is to put all the problems of children and young people down to poor teachers.
Arthur, Chester
This is an area of the curriculum that needs better training and funding, more professional acceptance and a non-judgmental stance. Perhaps it's time this aspect was contracted out to agencies like BPAS and Relate, because their training is extensive, in-depth and brings a level of expertise.
Lizzie, Wales