Apples & Pears
Things annoy me. I’m aware that this is as much my fault as the fault of the things. I’ve tried to be more Taoist about it, but basically I run on a kind of low-grade suppressed rage. Anger, resentment and frustration are bywords for motivation in my tiny, twisted mind. I need some grist to my mill.
With this in mind I’m thankful to the Daily Mail. It may be crazier than a bag full of Yeti Crabs, but its ongoing campaign to clickbait the known world into submission does provide ample amounts of said anger, resentment and frustration.
About four weeks ago they unleashed a piece[i] on MailOnline claiming that ‘6 out of 10 motorists’ are failing the new roadside drug tests, which were introduced in March of this year. The article also points out that this compares poorly with drink-driving, where around 5% of roadside tests result in failure.
The immediate reaction is surely: wow, that’s pretty frightening. After all, the 2013-14 Crime Survey for England and Wales[ii] – broadly considered to be statistically sound, though with limitations (it doesn’t survey prisoners or the homeless) – found that just over a third of people (35.6%) had taken illegal drugs in their lifetime, while 1 in 11 (8.8%) had taken drugs in the last year.
So what potential conclusions can we actually draw here? There are a few possibilities:
- The Crime Survey is massively underestimating the level of drug use in the UK;
- The number of failures is being vastly inflated by other factors, such as the police only stopping drivers they already suspect, or the tests also covering some prescription drugs;
- Pretty much all drug users are taking to the roads with reckless abandon.
So, are we massively underestimating drug use? While the Crime Survey is robust enough to have been cleared by the Office for National Statistics, even they admit that its figures should sit alongside other findings to provide a comprehensive understanding of illicit drug use in England and Wales. So, perhaps. Still, how does this compare to the research presented in the Mail?
This is the tricky part. It’s not clear exactly how the Mail came up with its numbers. They say they ‘uncovered the figures’, but how? The likely answer is government-released stats, research from a thinktank, or a Freedom of Information request. Alas, a little digging brought no joy: no police stats I could locate, no FOI request on whatdotheyknow.com. Only this piece of research by the Institute of Advanced Motorists, which would be fine except that their figures don’t tally with the Mail’s and their article came out almost a fortnight later. Still, at least the IAM have the decency to explain how they arrived at their figures, something the national press seem broadly happy to ignore.
It seems to me – and perhaps I’m drawing conclusions from too flimsy a set of facts (unlike, of course, my ink-barrel-buying friends) – that if I were to trust one of these two sets of data more than the other, I might just pick the official government statistics.
Right then, perhaps the figures are skewed by other factors. It’s certainly possible that the police are stopping people they suspect, as opposed to randomly checking motorists. While this would most likely raise the failure rate, it’s just not clear exactly how the tests are being administered, or whether their administration is consistent across the country.
Next: the drugalyser test also covers eight prescription drugs, and it seems no one is quite sure how much, for example, Valium or temazepam you could take without being over the limit. Still, the Mail tells us that 80% of the failures are due to THC, the active chemical in cannabis, so let’s put that to one side as well.
It is worth noting at this point something which the Mail doesn’t mention. The level at which the government is testing for illegal and prescription drugs is much lower, relative to actual use, than the level at which it tests for alcohol. A breathalyser test is failed if the driver has more than 80 milligrams of alcohol per 100 millilitres of blood (0.08%). You can get there with a couple of pints, though the body metabolises alcohol at roughly 0.015% an hour, so you might well pass after a few hours’ kip. A drugalyser test is failed if you have certain active chemicals in your body, with limits set at 2 micrograms per litre of blood for cannabis, 10 micrograms for cocaine, and so on[iii]. In plain English, if you smoked a joint in the last 24 hours, have taken coke or ecstasy in the last 5 days, or speed in the last 3, you’re going to be caught by the test.
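The alcohol sums above can be sketched out as a quick back-of-the-envelope calculation. This assumes the commonly cited UK drink-drive limit of 80 mg per 100 ml of blood (0.08% BAC) and an average metabolism rate of about 0.015% BAC per hour – rough population averages, not figures for any individual.

```python
# Back-of-the-envelope: hours of waiting before blood alcohol falls back
# below the drink-drive limit. Both constants are commonly cited rough
# averages (assumptions), not precise for any individual drinker.
LIMIT = 0.08       # % BAC - UK drink-drive limit (80 mg per 100 ml)
BURN_RATE = 0.015  # % BAC metabolised per hour (rough average)

def hours_until_legal(bac: float) -> float:
    """Hours until a given blood alcohol level drops below the limit."""
    return max(0.0, (bac - LIMIT) / BURN_RATE)

print(hours_until_legal(0.10))  # a couple of pints over -> about 1.3 hours
print(hours_until_legal(0.15))  # a heavy night -> nearly 5 hours
```

The contrast with the drug limits is the point: alcohol clears at a rate measured in hours, while the drug thresholds catch use from days earlier.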
Now, I’m not here to argue about the rights and wrongs of these tests – though I would point out that while research has suggested there is a problem with drug-driving[iv], it seems odd to introduce this legislation without a complete review of drug laws. What I am saying is that it seems disingenuous of a national newspaper to publish articles which compare different things as like for like (apples and pears, if you will) and then fail to support them with any information about how the statistics were arrived at. Furthermore, there seems to be an increasing urge to produce controversial headlines which are not reflected in the article itself. Take the Mail piece again and the ‘6 in 10’ claim. Their own stats say that of 427 tests, 188 were failed. That’s around 44%, or between 4 and 5 in 10. My head hurts when I think about this.
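The sum in question is short enough to check in a couple of lines, using the figures from the Mail’s own article (427 tests, 188 failures):

```python
# Check the '6 in 10' headline against the article's own reported figures.
tests = 427
failures = 188

failure_rate = failures / tests
print(f"Failure rate: {failure_rate:.1%}")       # -> 44.0%
print(f"Out of ten:   {failure_rate * 10:.1f}")  # -> 4.4

claimed = 6 / 10
print(f"Headline overstates by {claimed - failure_rate:.1%}")  # -> 16.0%
```

Roughly a sixteen-percentage-point gap between the headline and the paper’s own numbers, in other words.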
The arrival of the Internet age means that news stories can wander rapidly from ‘maybe’ to ‘certainly’ in a very short space of time, while the facts become increasingly distorted. Within days of this story it had been reiterated in both the Star[v] and the Express[vi], but in slightly different ways.
The Express, in particular, should take a long, hard look at itself. While the piece appears in an opinion column, which is not required to be especially even-handed, Ann Widdecombe states that ‘it is revealed that 56 per cent of motorists tested have taken cannabis or cocaine.’ Hang on, is that really what was said in the Daily Mail piece? Really?! There has always been a danger that news stories take on a life of their own and that along the way the original point gets warped beyond recognition, but now the web is acting as a catalyst to this process, which is pretty scary.
And so, finally, to my actual point. Is it not past time that we ask our newspapers to give us a greater degree of factual accuracy than they seem to want to? Would it be beyond the pale to ask that they at least explain where their numbers come from, or that the sub-editors try to rein in headlines that are not reflective of the content of their articles? After all, the Editors’ Code of Practice states that ‘the press must take care not to publish inaccurate, misleading or distorted information’ and must ‘clearly distinguish between comment, conjecture and fact.’[vii] I’m not sure, for me, the above meets those standards, and even if I were, I would still hope that the traditional print media would hold themselves to a higher standard.
After all, in an age where news is everywhere, it’s no longer about getting hold of it but about who you trust to give you something which is, broadly speaking, honest and trustworthy. Otherwise you may as well get all your information from crappy blogs like this one.