There is no single, reliable source of information on how
many CCTV cameras there are in the UK, but it is estimated that there are
between four and six million. It is easy to think that, like Big Brother in
George Orwell's 1984, the majority of these cameras are government owned and
controlled, and that we are under constant surveillance by the police and the
state; however, the number of cameras controlled by local government is
thought to be as low as one in seventy.[1]
So the majority of the cameras that we see - or don't see for that matter, as many
are not in public view - are privately owned and used to protect private
premises.
Watch any TV crime drama in which CCTV is used to catch the bad guys, and the
images are crystal clear, the details sharp as a pin. Zoom in and the
detectives can read car number plates, the logos on clothing and
distinguishing features on the people captured on camera. Then watch any news
report, and the CCTV footage released to the public to try to identify
victims or villains is such a blurry, pixelated mess that it makes one wonder
how it can ever be useful in solving crimes.
CCTV imagery as imagined in TV fiction... and the reality.
Well, apparently more useful than one might imagine: according to a study by
Nottingham Trent University, CCTV cameras provide evidence that is useful to
the police in two-thirds of the investigations in which footage is available.
Useful CCTV images increased the probability of a crime being detected
from 23% to 48%, the study found, although their usefulness very much depended
on the type of offence, as this table shows.
Type of Offence | Usefulness of CCTV Cameras (in cases where images were available)
Robbery | 62%
Serious Assault | 61%
Theft from Shops | 53%
Public Order Offences | 44%
Theft from Motor Vehicles | 16%
The CCTV images - no matter how grainy - used in police investigations and in
appeals for information from the public often rely on individuals, vehicles,
places and the like being identified by other people, but increasingly
technology is doing the identifying, which is where a lot of concerns have
begun to surface.
Civil rights group Liberty, which describes itself as "an independent
membership organisation that challenges injustice, defends freedom and
campaigns to make sure everyone in the UK is treated fairly," has begun a
campaign against the facial recognition technology and practices which some
UK police forces have started to introduce.
In Wales, where South Wales Police
have been trialling facial recognition technology for a few years, Ed Bridges
has started legal proceedings against the force, arguing that the use of the
tool "breached his human right to privacy as well as data protection and
equality laws."
Ed Bridges. Picture: BBC
And in my part of the world, the Metropolitan Police have
also been trialling facial recognition, and in a recent deployment in Romford, a
man who covered his face to avoid it being captured by the camera ended up with
a £90 fine for disorderly behaviour after being challenged by police.
Interestingly, the Met's website says, "Anyone can refuse to be scanned; it's
not an offence or considered ‘obstruction’ to actively avoid being scanned,"
so the chap fined in Romford was not charged for avoiding the cameras but for
what happened after police challenged him - although whether the police were
entitled to challenge him at all is a moot point.
Mr Bridges is no doubt aware of the plethora of other cameras that record his
image, whether when he enters a bank, a shopping centre or a railway station,
and presumably is content that they breach neither his right to privacy nor
data protection laws. I suppose his argument is that neither shops nor
transport hubs can use his image to see if he is wanted for some criminal
offence, but then again, that isn't the argument he is using
against the police. But if the average Londoner is caught on CCTV 300 times a
day, then in the majority of those cases, someone sensitive to their image
being captured - for whatever reason - would not even be aware that it had
happened.
The Met's trial of facial recognition technology has concluded with the
London Policing Ethics Panel publishing a report supporting its further use
so long as certain conditions are met, including that "it can be evidenced
that using the technology will not generate gender or racial bias in policing
operations."
One of Liberty's objections to the use of facial recognition
technology is that while its usefulness in spotting terrorist suspects and
preventing atrocities is clear, the technology is being used for "more
mundane policing, such as catching pickpockets." I very much doubt that the
victims of theft would describe the crime against them as mundane, nor do I
think it appropriate that the police be barred from using all of the
technology at their disposal when investigating certain offences. After all,
we would not say that fingerprints could only be used in investigating
robberies, or that DNA evidence could only be used in cases of sexual
assault. That would be like asking the police to do their job with one hand
tied behind their backs.
Liberty - and other opponents of the use of facial recognition technology -
do have a point, however, when they say that women and racial minorities are
not identified accurately by the technology. But that in itself is not
sufficient reason for the technology not to be used; rather, it is a reason
why it ought to be refined so that it more accurately identifies the groups
of people it currently finds hard to distinguish. Of course, that's easy for
me to say: I'm neither a woman nor a member of an ethnic minority, and
therefore statistically much less likely to be misidentified. But this sort
of technology only improves through use - that is to say, it 'learns' - and
inevitably there will be misidentifications. When it was used at football's
Champions League Final in Cardiff in 2017, it scanned about 170,000 people
and wrongly identified 2,297 people out of 2,470 potential matches with
custody pictures (i.e. those wanted for various offences) - a startling error
rate of roughly 93%. None of these people were arrested, by the way, and I am
sure that the errors in identification were invaluable in improving the
software.
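For the curious, that error rate is just the proportion of flagged matches
that turned out to be wrong - my own back-of-the-envelope arithmetic from the
two figures above, measured against the matches the system flagged rather
than the 170,000 faces it scanned:

\[
\text{error rate} = \frac{\text{incorrect matches}}{\text{total matches flagged}} = \frac{2297}{2470} \approx 0.93
\]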
I fully understand the concerns about misidentification, and about the
distress and inconvenience that people who are misidentified will experience,
but arguments against the use of the technology, such as those advanced by Ed
Bridges - who claims that as well as his right to privacy being breached, he
was distressed by having his face scanned - strike me as specious. Anyone who
has ever owned a smartphone, conducted an internet search, opened a bank
account, entered a shopping centre, taken an overseas trip, bought anything
online, or driven a car has already abdicated any rights to privacy they
might have thought they had. If you are worried about privacy, or about what
data organisations might hold about you, don't do any of the things I've just
mentioned. Facebook and Google already know more about you and me than any
police force is ever likely to, and the chances of being erroneously
identified as someone wanted by the police are, in the grand scheme of
things, infinitesimal.
Many opponents of facial recognition technology are also
likely to be concerned about the levels of crime - especially knife crime - in
London, and will be keen that the police do something about it. Whether the
police are able to use all of the tools potentially available to them might
ultimately depend on whether one man's right to privacy trumps another man's
right not to be stabbed.