A front page Telegraph story today reports the surprising finding that the introduction of 20mph zones in Bath and rural Somerset has led to an increase in the number of people killed or seriously injured (KSI). Similar reports can be found in the Mail and Sun if you can be bothered to look them up. From the Telegraph:
Reducing the speed limit to 20mph has caused a rise in death and serious injuries, a council has admitted, but is refusing to reverse the scheme because it will cost too much.
This is surprising, but ok. Reading on, alarm bells start to ring (emphasis mine):
Bath and North East Somerset Council spent £871,000 bringing in the 13 new speed zones just 12 months ago.
But one year on, a report has found that the rate of people killed or seriously injured has gone up in seven out of the 13 new 20mph zones.
There are two problems here. First, seven out of 13 is as close as dammit to 50%, suggesting that in about half the zones the number of KSI has gone down or stayed the same. This hardly seems like a convincing effect of the intervention. Secondly, it gives us no idea how large the change in the number of people KSI actually is, nor whether the same change occurred in areas where 20mph zones haven't been introduced.
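To see just how unremarkable "seven out of 13" is, here's a back-of-envelope sign test (my own calculation, not from the report): if the 20mph zones had no effect at all, each zone's KSI count would be roughly as likely to go up as down, and we can ask how often seven or more of 13 zones would show an increase purely by chance.

```python
from math import comb

# Back-of-envelope sign test: under a "no effect" hypothesis, each of the 13
# zones independently goes up or down with probability 1/2. What is the
# chance that 7 or more zones show an increase anyway?
n, k = 13, 7
p_at_least_k = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
print(p_at_least_k)  # 0.5 -- exactly what a coin toss predicts
```

In other words, seven increases out of 13 is precisely the outcome you'd expect if the zones did nothing at all.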
I decide to go to the original report from Bath and North East Somerset council, assuming that the press have misreported the findings. Jumping to the conclusions, we find that the council report conveys a similar message to the Telegraph, and recommends abandoning the programme:
Casualty severity has worsened marginally in Bath and more so in outlying towns… this is reflective of the national situation for reasons which are not yet clear.
Overall, the speed limit programme in B&NES seems to have provided little in the way of persuasive argument for continuing the programme into the future.
The data on accidents, casualties, and KSI are summed up in a single table. As you can see below, the council looked at nine zones in Bath, four in rural Somerset, and included one “control” zone, where no 20mph restrictions were introduced. There are all sorts of problems with this study design (what can we realistically say from one control zone in a rural area?), but let’s go with it for now.
The report writers have handily coloured increases in accidents/casualties/KSI in yellow, and decreases in blue. Places where no change has occurred are rather confusingly coloured light yellow. Immediately noticeable is that the number of accidents and casualties appears pretty variable, and the change in KSI seems very small indeed. Let's have a look at the data on a graph. This shows two things: firstly, that most of the variation in the number of incidents is explained by differences among areas; and secondly, that the number of KSI is indeed (thankfully) very low. If we perform a statistical test on the data, there is no significant effect of introducing 20mph zones on the number of people KSI. So the report, and the newspapers, are not correct.
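With KSI counts this small, even a simple exact test shows how little evidence a modest rise carries. The sketch below uses hypothetical before/after totals (the report's own figures aren't reproduced here); conditional on the combined total, the "after" count under a no-change hypothesis follows a Binomial(before + after, 0.5) distribution, which gives an exact p-value with nothing but the standard library.

```python
from math import comb

# Illustrative only -- these KSI totals are hypothetical, not the report's.
before, after = 10, 14  # pooled KSI counts across the 20mph zones

# Under "no change", each of the (before + after) incidents is equally likely
# to fall in either period, so the "after" count is Binomial(n, 0.5).
n = before + after
p_upper = sum(comb(n, i) for i in range(after, n + 1)) / 2 ** n
p_value = min(1.0, 2 * p_upper)  # two-sided, by doubling the upper tail
print(round(p_value, 3))
```

Even a rise from 10 to 14 KSI is comfortably within chance variation; the p-value is far above any conventional significance threshold.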
Of course, any increase in the number of people killed or seriously injured is something to be avoided, and we should look at the data in a little more depth. Another important thing to consider is what is going on in the control area, where 20mph zones were not introduced. Again this is not ideal, because the control area seems quite poorly chosen in this case. But it should take out any general changes in overall incident rates over time. Below is a plot of the number of incidents in each test area relative to the control area, before and after the introduction of 20mph zones.
This plot confirms that the control area isn’t great — most of the values are negative, suggesting that the control area has a relatively high rate of accidents and casualties. But, importantly, the plot also shows that any inkling of an effect of the introduction of 20mph zones on the number of people KSI disappears, because the number of people KSI also increased in the control area. You can see this from looking at the previous graph, and again a statistical test confirms that there is no effect of introducing 20mph zones on the number of people KSI.
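The comparison underlying that plot is a simple difference-in-differences: what matters is not each zone's raw change in KSI, but its change relative to the change in the control area over the same period. A minimal sketch, using hypothetical counts and made-up zone names:

```python
# Hypothetical counts for illustration -- not the report's actual figures.
control = {"before": 8, "after": 11}            # control area KSI counts
zones = {
    "Zone A": {"before": 3, "after": 5},        # 20mph zones
    "Zone B": {"before": 4, "after": 3},
}

# Background change with no 20mph zone at all: here +3
control_change = control["after"] - control["before"]

relative_change = {}
for name, counts in zones.items():
    zone_change = counts["after"] - counts["before"]
    # Subtracting the control's change removes the general trend over time
    relative_change[name] = zone_change - control_change
    print(f"{name}: raw change {zone_change:+d}, "
          f"relative to control {relative_change[name]:+d}")
```

Note how Zone A's raw increase of +2 becomes a *decrease* of -1 once the control area's background rise is taken out; this is exactly why an increase that also occurred in the control population tells us nothing about the 20mph zones.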
Most troublingly, if you read the report, the authors are quick to discount decreases in the number of accidents and casualties on the grounds that decreases were also observed in the control area. This is the correct approach (given the caveat of the poor study design), but the report then goes on to make misleading claims and policy recommendations based on increases in the number of KSI, when in fact these changes are not significant and also occurred in the control population. Whether this is simply a matter of statistical incompetence, or whether the report is wilfully misleading, I do not know.
So if there is no evidence that the introduction of 20mph zones has had any effect on road safety, was the scheme a waste of time? I would argue not, and this brings me to arguably the most problematic aspect of this whole story. Neither the report nor the newspapers have considered any of the many potential positive benefits this scheme may have had. Do people feel safer in these areas? Are they walking and cycling more? Have pollution levels been reduced? The answers to these questions may be "no", but they should play an important role in future policy decisions, especially when the policy has had no effect on accident rates.
Transport policy interventions like this really matter. Their implementation should be based on wide-ranging, carefully thought-out and robust evidence, not a cherry-picked, statistically incompetent analysis of a half-arsed, ill-thought-through study entirely lacking in vision.