Ch 9. Statistics

… there is something comfortably definite about police statistics, not least that they can be traced back to actual victims. When the figures are published as tables or graphs they seem so tangible they must be real. Despite long-standing criticisms1 most policy-makers and commentators still take them at face value. The government even insisted that police statistics should be plotted on street maps and posted online so that, in theory, citizens can judge how safe they are.2

1The first rudimentary national crime statistics were published by the Home Office in England and Wales in 1805 and by the government in France in 1827, and, like parochial records before them, they relied on court statistics. The flaw was quickly spotted: ‘All we possess of statistics of crime and misdemeanours would have no utility at all if we did not tacitly assume that there is a nearly invariable relationship between offences known and adjudicated and the total unknown sum of offences committed.’ (Adolphe Quetelet quoted by T. Sellin and M. E. Wolfgang, The Measurement of Delinquency, Wiley, NY, 1964.) Police statistics were an improvement because, in theory, they also counted cases which did not go to court, but until at least the 1980s there were huge discrepancies in how different officers and forces defined what they should record. Incidentally, the US did not introduce the FBI’s Uniform Crime Reports until 1930.

2 Local crime mapping was recommended by an independent review chaired by Adrian Smith, a former President of the Royal Statistical Society, in response to widespread cynicism about national crime statistics; but Professor Smith acknowledged that the underlying data would be partial and biased. Ironically, British ministers had for decades wanted to downgrade the old-fashioned crime stats, partly because they looked so bad for the government, but ran scared because they knew that their political opponents and the press would accuse them of fiddling the figures. No sooner had they won the argument, and supplemented annual reports with a national victimisation survey, than they decided to put police statistics back centre stage.

 


 

The arrival of a new chief constable or borough commander can have a huge impact on how the police operate, whom they target, and what they prosecute.

There has always been a good deal of room for discretion, sometimes creating controversy. In Manchester the arrival of a new homophobic chief constable in 1958 saw a seven-fold rise in prosecutions for male importuning. More recently a chief officer in North Wales realised that far more people were being hurt in road accidents than through crime and so, to the dismay of some traditionalists, focused on road safety.

 


 

Officers have often been given targets, such as two arrests per month, and charges are inflated (from, say, drunkenness to harassment – which counts as violence) to meet the quota. The Police Federation, which represents the rank and file in Britain, has justifiably called it ‘ludicrous’.

Jan Berry, Chairman of the Police Federation, said at their 2007 conference in Blackpool, ‘When people are pushed to show results they will use anything they can to show they are doing a good job.’ In fact the manipulation of police statistics happens on an industrial scale, with several well-established techniques for improving detection rates. They include ‘cuffing’, so called because crimes disappear as if by magic up one’s sleeve: a series of crimes might be recorded as a single offence; they can be downgraded, say from attempted burglary to vandalism; or ignored altogether by implying that the complainant was mistaken. Vulnerable victims, including drug users, are targeted to make them back out of allegations, including complaints of sexual assault. Another technique is ‘nodding’, so called because known offenders are enticed to ‘nod’ to crimes they have not committed. Inducements to false confessions may involve alcohol, food or even sexual favours, or shorter sentences. ‘Skewing’ improves detection rates by shifting resources around to meet targets, such as prioritising drug offences rather than child protection.
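To see why cuffing is so tempting, it helps to look at the arithmetic of detection rates: removing undetected offences from the books raises the rate without a single extra crime being solved. The sketch below uses invented figures purely for illustration; it is not drawn from any real force’s returns.

```python
# Illustrative only: invented numbers showing how 'cuffing' flatters a
# detection rate by shrinking the denominator rather than solving crime.

recorded = 100   # offences recorded by a hypothetical force
detected = 30    # of which 'cleared up'

print(f"Honest detection rate: {detected / recorded:.1%}")            # 30.0%

cuffed = 20      # undetected offences quietly written off or downgraded
recorded_after_cuffing = recorded - cuffed

print(f"After cuffing: {detected / recorded_after_cuffing:.1%}")      # 37.5%
# Nothing extra has been detected; the books simply record fewer crimes.
```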

Details of the ‘endemic’ extent of these manipulations caused consternation in the House of Commons Public Accounts Committee in 2013. Witnesses included Detective Chief Superintendent James Patrick of the Metropolitan Police and Dr Rodger Patrick, former Chief Inspector, West Midlands Police. They testified that fudging of figures was sometimes accepted, and even encouraged, by Police and Crime Commissioners and the Office of the Mayor of London. At the very least, pressure on the police from politicians, including PCCs, was a corrupting influence on statistical probity. [http://www.parliamentlive.tv/Main/Player.aspx?meetingId=14214, 19 November 2013.] This is why crime can only be properly measured by collating a variety of data including hospital figures and, above all, victim surveys.

 


 

According to a poll for the Audit Commission, almost two-thirds of us would walk on by. We can’t be bothered, don’t want to get involved or don’t think the police would do anything anyway.

Victims and Witnesses, Audit Commission, 2003, p. 12. Sixty-one per cent of those polled said they would not report a crime that they witnessed.

 


 

Avon and Somerset Police set up a small experiment in which a plain-clothes officer blatantly brandished bolt cutters to steal bikes, and though at least fifty people clearly saw what he was doing, not one person intervened or rang 999. The video went online and proved extremely popular.

BBC News, 8 November 2012. [http://www.bbc.co.uk/news/uk-england-bristol-20248535]

 


 

When people are asked about crime they’ve suffered and whether or not they asked for help, it turns out that only 44 per cent of personal crimes are reported to the police.1 Even that reporting rate is a big improvement, caused partly by the spread of mobile phones.2 And it doesn’t count many of the 9 million or more business crimes a year, most of which we only hear about through surveys.3

1Audit Commission (see above), p. 5. [http://www.audit-commission.gov.uk/subwebs/publications/studies/studyPDF/3127.pdf]

This under-reporting is by no means confined to crime. No one even knows how many serious road accident casualties there are. In 2017 the UK government estimated there were 30,000–60,000 a year, vastly higher than the 23,000 shown in police data.

2 Police control rooms like that at New Scotland Yard have long taken such a high volume of information via mobile phones that they had to introduce computers to cope with an incessant stream of emergency calls dialled inadvertently.

3 ‘Crime against businesses: Headline findings from the 2012 Commercial Victimisation Survey, January 2013’, Home Office, 2013. [www.homeoffice.gov.uk/publications/science-research-statistics/research-statistics/crime-research/crime-business-prem-2012/crime-business-prem-2012-pdf?view=Binary]

 


 

A pioneering emergency surgeon – we shall meet him later – has systematically checked hospital records over many years and is blunt: ‘Police figures are almost hopeless when it comes to measuring violent crime.’

Professor Jonathan Shepherd, maxillofacial surgeon in Cardiff and coordinator of A&E crime audits, The Truth About Crime, Prog 1, BBC One, June–July 2009. Fewer than half of assaulted patients inform the police, and those who do are not necessarily the most badly injured. See more about Prof Shepherd in Chapter 22.

Cardiff University’s survey of A&E and minor injury units began in 2001 and recorded a steady decline in victims of violence attending for treatment. Some 211,000 people went to hospital in 2014, roughly one in every 260 members of the population, and 10 per cent fewer than in 2013.

 


 

One of the worst aspects of concealed crime is often dismissed as antisocial behaviour: harassment targeted at people with disabilities, causing huge distress and sometimes serious harm.

Only 1,500 such hate crimes are reported to police each year, covering everything from low-level harassment to extreme violence. A comprehensive survey by the Equality and Human Rights Commission called ‘Hidden in Plain Sight’ suggested this was ‘a drop in the ocean’. The Times, 12 September 2011, p. 22.

 


 

In fact, some official theft rates do more to measure changes in insurance penetration than trends in stealing.1 One of the reasons that street crime appeared to rise steeply in the late 1990s was that mobile phone companies were promoting handset insurance. On the other hand, people are cautious if they are insured and don’t want to jeopardise their no-claims bonus, as when a car is vandalised2 or broken into.3

1The BCS in 1984 was the first detailed exploration of why serious crimes went unreported. In 35 per cent of cases victims thought it too trivial, and 34 per cent thought the police would be uninterested or ineffective. Some 5 per cent reported to other authorities, 3 per cent feared reprisal, and 3 per cent ‘dealt with it ourselves’. Conversely the biggest motive for reporting minor crime was to support insurance claims. Ken Pease, ‘Judgements of crime seriousness: Findings from the 1984 British Crime Survey’, Research & Planning Unit Paper 44, Home Office, London, 1984, pp. 23–4. ISBN 0 86252 377 3.

2 Extrapolation from a YouGov poll of 4,000 motorists in December 2007 suggests 4.3 million British motorists were victims of car vandalism in the previous year, nearly a third of all car owners, mostly scratched paintwork or damaged wing mirrors. There is little incentive to report such crimes to the police if the damage is uninsured, below the insurance pay-out threshold or would jeopardise no-claims bonuses. (Source: The Times, 11 January 2008, p. 39.)

3 One ingenious survey found that car break-ins in a single city-centre car park exceeded the total number of reported thefts from vehicles in the entire police division. Colette Felvus, then an undergraduate student at Huddersfield University, counted heaps of toughened auto glass in a shopping centre car park in Sheffield. She discounted minor collisions by concentrating on parking bays, where side windows are vulnerable, and found the vast majority of such heaps in one small section of the car park, which was relatively isolated and opened on to waste land. Source: Jeanette Garwood, Michelle Rogerson and Ken Pease, ‘Sneaky Measurement of Crime and Disorder’, in V. Jupp, P. Francis and P. Davies (eds), Criminology in the Field: The Practice of Criminological Research, Sage, London, 1999.

 

 


 

… under-reporting is rife in stabbings or even shootings, so much so that British police chiefs want the medical profession to break patient confidentiality and report patients treated for knife or gunshot wounds.

The Guardian, London, 29 October 2007, p. 1.

 


 

There are thousands of missing persons and no one knows if they are dead or alive unless a body turns up.

Child welfare charities reckon over 100,000 UK children disappear each year, mostly because they walk out of care homes or after a family row (Gwyther Rees and Jenny Lee, Still Running II: Findings from the second national survey of young runaways, The Children’s Society, London, 2005; Nina Biehal and Jim Wade, Children Who Go Missing: Research, Policy and Practice, Dept of Health, Leeds, 2002). Many come back very quickly, most eventually return, others just vanish. What of child abductions, a crime that captures people’s hearts more than almost any other? In 2007 the British media went into a feeding frenzy when a three-year-old girl was plucked from her family’s rented holiday apartment in Portugal. Top politicians, big businesses and A-list celebrities climbed aboard a mawkish bandwagon, publishing the girl’s picture, wearing yellow ribbons and putting up a huge reward. Every child snatch is tragic, and the sincerity of public distress goes without question, but how often does such a thing occur? Nobody knew. Official figures vary so wildly year on year (from under 400 to over 1,000 in England and Wales) that there are obviously problems of definition. In 2002/03 the Home Office tried to analyse what was going on, and found half the snatched children were taken by a parent as part of a custody battle, of which a third ‘should not actually have been recorded by the police’. (Geoff Newiss and Lauren Fairbrother, ‘Child abductions: understanding police recorded crime statistics’, Home Office Findings 225, London, 2004, p. 2. ISSN 1473-8406.) Other abductions were by relatives, partners or friends, sometimes because of disputes, and 10 per cent were underage girls who had gone away with boyfriends. Of about seventy youngsters taken by strangers, most were found quite quickly, but around twenty-five children remained unaccounted for after twenty-four hours. Two victims were subjected to a serious sexual assault but nobody knows for sure what happened to the others or how many were eventually found.

 


 

Most years in England and Wales about 100 cases that are initially recorded as homicide become ‘no longer recorded’ as homicide because of reclassification.

On average it takes over three months before a suspicious death is formally assigned to homicide and in 5 per cent of cases it takes more than a year. Homicide figures are confounded by adjourned inquests, lengthy investigations, or delayed registration and recoding. (Tim Devis and Cleo Rooney, ‘Recent trends in deaths from homicide in England and Wales’, Health Statistics Quarterly, No. 3, pp. 5–13, ISSN 1465-1645.) In addition, although the term ‘murder’ is used loosely, homicide covers several categories including manslaughter, infanticide and causing death by dangerous driving. Each year countless killings are redefined through plea bargains or other legal devices.

 


 

Long-term trends are even more difficult because of gaps in the records, especially from the age before computers, when information was kept locally on cards or paper.

Review of homicide statistics, National Statistics Quarterly Review No. 25, Home Office, London, 2003.

 


 

A third of all crime reported to the police is not recorded as a crime.

Audit Commission (see above). According to the British Crime Survey, of all the incidents that the police came to know about in 2002/03, approximately 68 per cent resulted in a crime being recorded.

 


 

There will always be a lot of wriggle room. When is a young man with a screwdriver equipped for burglary; when is a small amount of drugs not worth bothering about; when is a discarded handbag indicative of a mugging; when is it best to turn a blind eye in the hope of gaining some intelligence; when is a drunken brawl best dealt with by calming people down; and when, if someone reports a disturbance, should one finish one’s paperwork rather than rush round and intervene?1 Not infrequently these ambiguities are manipulated cynically, with offences being shuffled from one category to another to reflect better on police performance. As one officer famously put it, the books are frequently ‘cooked in ways that would make Gordon Ramsay proud’.2

1I came across one episode in London in which a man was pistol-whipped and the offence (according to a rule that injury trumps all) was classed as actual bodily harm rather than a firearms offence. The detective on the case objected, arguing that carrying a handgun in Britain is of more serious concern, and that officers should know the offender might be armed, but she was overruled.

2 David Copperfield, Wasting Police Time: The Crazy World of the War on Crime, Monday Books, London, 2006, ISBN 0955285410. PC Copperfield also runs ‘The Policeman’s Blog’ [http://coppersblog.blogspot.com/].

 

 


 

… in 2000 the Police Inspectorate found error rates ranging from 15 to 65 per cent, and in 2013 the Office for National Statistics was still sufficiently concerned about big discrepancies that it warned that police may be tinkering with figures to try to fulfil targets.

John Flatley and Jenny Bradley, ‘Analysis of variation in crime trends, Methodological Note’, Office for National Statistics, 24 January 2013, p. 10. [http://www.ons.gov.uk/ons/taxonomy/index.html?nscl=Crime+and+Justice]

 


 

Lawyers, legislators and officials keep changing the rules. Karl Marx came across the problem somewhat before I did, correctly noting in 1859 that an apparently huge decrease in London crime could ‘be exclusively attributed to some technical changes in British jurisdiction’.

‘If we compare the year 1855 with the preceding years, there seems to have occurred a sensible decrease of crime from 1855 to 1858. The total number of people committed for trial, which in 1854 amounted to 29,359, had sunk down to 17,855 in 1858; and the number of convicted had also greatly fallen off, if not quite in the same ratio. This apparent decrease of crime, however, since 1854, is to be exclusively attributed to some technical changes in British jurisdiction; to the Juvenile Offenders’ Act in the first instance, and, in the second instance, to the operation of the Criminal Justice Act of 1855, which authorises the Police Magistrates to pass sentences for short periods, with the consent of the prisoners.’ Karl Marx, New York Daily Tribune, 16 September 1859. [http://www.marxists.org/archive/marx/works/1859/09/16.htm]

 


 

The most blatant example of moving the goalposts was between 1931 and 1932 when indictable offences in London more than doubled because of a decision to re-categorise ‘suspected stolen’ items as ‘thefts known to the police’.1 More recently, changes in counting rules led to an apparent and terrifying surge in violent crime in 1998 and then again in 2002.2

1 Report of the Commissioner of Police of the Metropolis for the Year 1932, Cmnd 4294, p. 16.

2 From April 1998 police in England and Wales were ordered to add many new offences, greatly inflating the number of recorded violent incidents, but also frauds and drug offences. This had the immediate effect of reversing what had been a steady decline in reported crime. In 2002 a National Crime Recording Standard was introduced which required, for the first time, that all reports of incidents, whether from victims, witnesses or third parties and whether crime-related or not, must be recorded as a crime if, on the balance of probability, the police think the circumstances as reported amount to a crime defined by law and there is no credible evidence to the contrary. If in doubt the police should take a ‘victim-focused approach to crime recording where the public’s call for service is met’.

 


 

From that point on, half of all police-recorded violence against the person involved no injury.

‘At least 48 per cent of all police-recorded violence against the person involved no injury in 2004/05.’ – Sian Nicholas, David Povey, Alison Walker and Chris Kershaw, Crime in England and Wales, 2004/2005, Home Office Statistical Bulletin, 11/05, July 2005, ISSN 1358-510X, p. 5.

 


 

In 2008 violence was reclassified again and this time many less serious offences were bumped up into more serious categories. For example, grievous bodily harm now included cases where no one was badly hurt. Inevitably the Daily Mail reported ‘violent crime up 22 per cent’.

Daily Mail, 23 October 2008. The headline (‘Violent crime up 22 per cent as Home Office admits police have been under-recording serious offences for ten years’) was accompanied by a photo of a crazed and masked man pointing a gun. [http://www.dailymail.co.uk/news/article-1079927/Violent-crime-22-Home-Office-admits-police-recording-offences-years.html]

 


 

On the face of it, Australia has seventeen kidnaps per 100,000 people while Colombia has only 0.6. Swedes suffer sixty-three sex crimes per 100,000 against only two in India.

Some people actually believe this stuff.

[http://www.bbc.co.uk/news/magazine-19592372] quoting UN comparisons of national reporting rates. Australian abduction figures include family disputes, while Sweden has a high-profile policy of identifying sex crimes.
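The footnote above points at the underlying problem: a recorded crime rate is the product of how much crime actually occurs, how much of it victims report, and how much of that the police write down. The sketch below, using entirely invented figures for illustration, shows how two countries with identical levels of offending can end up with recorded rates an order of magnitude apart.

```python
# Illustrative only: hypothetical figures showing how reporting and recording
# practices, not underlying offending, can drive recorded crime rates.

def recorded_rate(true_rate_per_100k: float,
                  reporting_rate: float,
                  recording_rate: float) -> float:
    """Recorded offences per 100,000 = true incidence x share reported x share recorded."""
    return true_rate_per_100k * reporting_rate * recording_rate

# Two imaginary countries with the SAME underlying level of sexual violence.
country_a = recorded_rate(true_rate_per_100k=500, reporting_rate=0.40, recording_rate=0.90)
country_b = recorded_rate(true_rate_per_100k=500, reporting_rate=0.05, recording_rate=0.60)

print(f"Country A records {country_a:.0f} per 100,000")   # 180
print(f"Country B records {country_b:.0f} per 100,000")   # 15
```

On paper Country A looks twelve times more dangerous, yet the only difference lies in how willing victims are to come forward and how the police count what they are told.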


… for years the FBI was so cautious it sounded almost tongue-in-cheek: police data ‘may throw some light on problems of crime’.

The US began to publish national crime data in 1930 and almost immediately adopted the warning that: ‘In publishing the data sent in by chiefs of police in different cities, the FBI does not vouch for their accuracy. They are given out as current information which may throw some light on problems of crime and criminal-law enforcement.’ The caution survived in that form for over three decades. See, for example, archives from 1949 at: http://archive.org/stream/uniformcrimerepo1949unit/uniformcrimerepo1949unit_djvu.txt.

However, by the 1970s at least one cynical observer pointed out that, ‘these and other disclaimers have never been more than perfunctory. The Bureau has never expressed any criticism of newspapers that assume the data to be valid.’ [www.academia.edu/1106101/Crime_Statistics_A_Historical_Perspective]

In recent years the FBI has become more specific in its warnings about unreliability and dangers of ‘simplistic’ interpretation. See for example: http://www.fbi.gov/about-us/cjis/ucr/crime-in-the-u.s/1995/toc95.pdf.

 


 

Yet however shallow, however defective, however inconsistent the figures, they have almost always been treated as far more meaningful than they are.

As a result virtually every criminological theory, and a good deal of police work, is based on what may be false premises. So are most crime trend stories in the media, as we shall see. We continue to see the police as experts in crime trends even though they are largely in the dark, and politicians continue to rely on weak statistics whenever it suits their agenda.

 


 

Back in 1973, when crime was racing up the political agenda, the US Census Bureau started asking people for their actual experience of crime.

The US National Crime Victimization Survey (NCVS) started out in 1973 as the National Crime Survey, sponsored by the President’s Crime Commission, which itself was a response to rising crime. There was widespread concern that FBI data recorded only crime known to the police and so did not accurately portray the volume and type of crimes Americans were experiencing. It is run by the US Census Bureau and interviews about 50,000 households twice every year. See The National Crime Survey: Working Papers, US Dept of Justice, 1981 [https://www.ncjrs.gov/pdffiles1/nij/75374.pdf]

 


 

Other countries soon followed suit and over eighty now use a common methodology.

In 1989 thirteen industrialised countries pooled their data in the first International Crime Victims Survey, and since then the ICVS has become global, covering some eighty countries including Argentina, Cambodia, Peru and South Africa.

 


 

The BCS only covers England and Wales – Scotland started its own SCS.

Scotland has its own legal system which means that official crime statistics mean different things even within the UK. The Scottish approach is based on Roman Law, not Common Law as in England and Wales, so when the countries agreed the Act of Union in 1706 their courts were specifically excluded. Northern Ireland is Common Law-based, like England and Wales, but there are still differences, again reflected in how and what statistical information is recorded. Northern Ireland had occasional victimisation surveys, called the NICS, from 1984 and they became annual events from 2005.

 


 

Even so it leaves a lot of gaps … Past surveys also neglected homeless people and those in communal dwellings like student halls of residence, old people’s homes or hostels.

Crime figures had become so mistrusted that a distinguished statistician, Professor Adrian Smith, was asked by the government to review crime statistics, and one of the principal recommendations of the 2006 Smith report was that the BCS should be extended to include people not previously reached. [http://webarchive.nationalarchives.gov.uk/20110220105210/rds.homeoffice.gov.uk/rds/pdfs06/crime-statistics-independent-review-06.pdf]

 


 

And it almost certainly undercounts the most vulnerable in society who are victimised repeatedly and whose complaints are arbitrarily capped at five.

 

Graham Farrell and Ken Pease, ‘The sting in the British Crime Survey tail: multiple victimizations’, paper presented at Cumberland Lodge, Windsor, at a conference to mark the twenty-fifth anniversary of the BCS, October 2006, and subsequently published in Civitas Review, Vol. 4, Issue 2, June 2007. From its inception in 1981 the British Crime Survey of England and Wales arbitrarily capped repeat crimes against the same victim at five so that the survey sample was not distorted by exaggerations or rare events. But repeat victimisation is not a rare event and, as Farrell and Pease point out, ‘If one asks people what happened to them and then disregard what they tell you, on the basis that you don’t believe people suffer that much, or you can’t afford the number of full interviews which would result from accepting their veracity, the result is schizoid at best and hypocritical at worst.’ They described the US crime survey as ‘a travesty of justice, seriously flawed’, and accused it of ‘systematic suppression’ of high-volume crimes.
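A crude way to see what the cap does is to apply it to a handful of imaginary respondents. The figures below are invented purely for illustration, but they show how a cap of five hides precisely the chronic, repeatedly victimised cases Farrell and Pease are concerned about.

```python
# Illustrative only: invented survey responses showing how capping repeat
# victimisation at five suppresses the estimated total number of crimes.

reports = [1, 0, 2, 0, 1, 15, 0, 3, 1, 40]  # crimes suffered by ten hypothetical respondents
CAP = 5                                     # BCS-style cap on incidents counted per victim

uncapped_total = sum(reports)
capped_total = sum(min(n, CAP) for n in reports)

print(f"Uncapped total: {uncapped_total} crimes")  # 63
print(f"Capped total:   {capped_total} crimes")    # 18
# Most of the shortfall comes from the two chronic victims - exactly the
# high-volume cases the survey ends up discounting.
```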

 

 


 

… in ten years, BCS crime fell 44 per cent, representing 8.5 million fewer crimes each year. Critics believed that this was just not credible and preferred police statistics, which were far less encouraging and sometimes – on vandalism for example – continued in the opposite direction.

On vandalism the police and BCS trends sometimes went in opposite directions – between 1998 and 2004 the police recorded 29 per cent more of it, while the BCS reported 16 per cent less, and 20 per cent less by 2007.

 
