Part of the deal of living in a city like London is that you are going to be recorded in some way every time you leave the house…
According to one recent survey, we only rank behind the Chinese cities of Taiyuan and Wuxi in the competition for who has the most CCTV cameras per person (there are an estimated 73.3 surveillance cameras per 1,000 Londoners, in case you were wondering).
Last week it was reported that Greater London has more speed cameras than anywhere else in the UK (995 cameras in total, or 0.6 cameras per square kilometre).
Last month, Aldi began trialling ‘age estimation technology’ for customers buying alcohol in its checkout-free shop in Greenwich (by ‘age estimation technology’ read ‘facial analysis cameras’). Asda, Co-op and Morrisons are all installing similar systems.
And if you’ve used any of TfL’s Wi-Fi points then your location has been tracked “to understand how people navigate the network”.
There are plenty of examples like this, but for most people this is the trade-off they make for living in a vast, modern metropolis. And, if you don’t want your data to be harvested, then most of the time there’s a choice you can make to limit your exposure.
But what about when a person is not given a choice? When someone’s privacy is handed over to a force that doesn’t require their consent and which might rely on rudimentary and possibly discriminatory algorithms to try to take away their liberty?
In this issue we’re highlighting four technologies currently in use by London’s police force that have been in the news in recent weeks or months (not necessarily for the right reasons) but which might have gone under your radar amongst all the other recent news surrounding the Met.
The Gangs Violence Matrix
In August of last year we published an issue on the tenth anniversary of the London riots, asking if it could happen again. While trying to answer that question we looked at a recent study that showed “Black people are still far more likely to be targeted by police than white people,” and quoted one expert who said that “there will be times in the coming months where the police will be confronted with difficult situations, then everything rides on how well they handle those conditions.”
Ironic then that one of the things set up by the Met in the wake of the riots was the Gangs Violence Matrix. This ‘predictive policing’ tool was supposed to identify those at risk of committing gang-related violence by taking variables such as previous offences, social media activity and friendship networks and calculating a “risk score”. At one point, sharing videos of grime or drill music was considered a key indicator of gang affiliation.
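The Met has never published the Matrix’s actual scoring formula, so to be clear about what a points-based “risk score” of this kind even means, here is a purely hypothetical sketch. Every variable name and weight below is invented for illustration; the only thing taken from the reporting is the type of inputs (previous offences, social media activity, friendship networks):

```python
# Hypothetical sketch only: the Met has not published the Gangs Matrix
# formula, so these variables and weights are invented for illustration.

def risk_score(previous_offences, flagged_posts, flagged_associates):
    """Toy points-based score combining the kinds of variables reported."""
    score = (previous_offences * 10
             + flagged_posts * 5        # e.g. sharing drill/grime videos
             + flagged_associates * 3)  # 'friendship network' links
    return min(score, 100)  # cap the score at 100

# One prior offence, two flagged posts, four flagged associates:
print(risk_score(1, 2, 4))  # 10 + 10 + 12 = 32
```

Even this toy version shows the problem critics raised: whoever chooses which posts and which friendships get “flagged” is effectively choosing who scores highly, which is how sharing a music video can end up counted as evidence of gang affiliation.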
How do you think that went?
At the start of last year a thousand names of young Black men were removed from the database after a review found “they posed no or little risk of committing violence”. This month the human rights organisation Liberty threatened legal action against the Met over the Gangs Matrix, because they say it “discriminates against ethnic minority groups, especially Black people who are disproportionately represented on the database at 79%.”
In their response the Met admitted that the Gangs Violence Matrix “targets Black people because it believes they are more likely to commit crimes.”
The latest figures on stop and search were published just this month. They show that almost half of all stop and searches take place in London and that Black people are nine times more likely than white people to be stopped and searched.
Live Facial Recognition
On 27 January the government relaxed the requirements for mask wearing. The very next day, the Met deployed their live facial recognition technology in Oxford Circus and arrested four people as a result.
What does ‘deploying live facial recognition technology’ mean? Well, essentially it means that the Met send out a ‘facial recognition van’ (above) to drive around and record people (along with “around 25 uniformed officers and 25 plainclothes officers” in this case).
Those images are then “streamed directly to the Live Facial Recognition system. This system contains a watchlist: a list of offenders wanted by the police and/or the courts, or those who pose a risk of harm to themselves or others” (that’s the definition from the Met’s own website).
On this occasion the system processed the biometric information of 12,120 people, checking it against a watchlist containing 9,756 images (which doesn’t sound like a very targeted list to us), and seven people were stopped.
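The core of any system like this is comparing a numerical “embedding” of each passing face against every watchlist entry and flagging anything above a similarity threshold. This toy sketch is ours, not the Met’s NEC-supplied software, and the vectors, names and threshold are all invented; it just shows why a sufficiently similar-looking stranger can trip a match against a 9,756-image list:

```python
# Toy sketch of watchlist matching via cosine similarity. The embeddings,
# threshold and names are invented -- this is not the Met's actual system.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def matches(face, watchlist, threshold=0.9):
    """Return the watchlist entries similar enough to `face` to flag a stop."""
    return [name for name, emb in watchlist
            if cosine_similarity(face, emb) >= threshold]

watchlist = [("wanted_A", [0.9, 0.1, 0.4]),
             ("wanted_B", [0.1, 0.8, 0.2])]
passerby = [0.88, 0.15, 0.42]  # close to A's embedding, but a different person

print(matches(passerby, watchlist))  # -> ['wanted_A']: a misidentification
```

The threshold is the whole game: set it high and you miss genuine matches, set it low and innocent passers-by get stopped, which is one way you end up with the “straightforward misidentification” described below.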
According to Computer Weekly’s report, “one arrest was of a man wanted on an extradition warrant related to alleged drugs offences and serious assault, while the other three were for unspecified drugs offences, an unspecified traffic offence, and a man wanted in connection with alleged death threats.”
“Unspecified traffic offence” doesn’t really sound like the type of “serious and violent crime” that the Met say they are using this technology to stop. And we’d really like to know exactly what those “unspecified drugs offences” were.
Silkie Carlo, the director of Big Brother Watch, personally witnessed four of the stops the Met made and said that “two shouldn’t have happened – one was in relation to outdated data and another was a straightforward misidentification.”
Retrospective Facial Recognition
On 19 August last year, the Mayor of London’s Office for Policing and Crime (MOPAC) agreed to award a company called Northgate Public Services a £3,084,000 contract for the “implementation and annual running costs” of Retrospective Facial Recognition software.
When it comes to RFR, the Met’s website is a little less forthcoming than it is about LFR. Instead of explaining what RFR is, the Met just say that they “are currently working to integrate the updated capability and develop a suite of documents to ensure we have the right controls and safeguards in place to use the technology.”
If that makes you think that there is a risk that RFR could be abused or mishandled, then you’d be right. Because, while LFR checks against a (supposedly) specific watchlist, RFR checks against a far wider list, one that could include “images that have been captured by cameras at burglaries, assaults, shootings and other crime scenes. They could also be images shared by or submitted by members of the public.”
That explanation is lifted directly from MOPAC’s decision to award the contract. Here’s the next section:
“The RFR use case is very different to Live Facial Recognition and seeks to help officers identify persons from media of events that have already happened and does not involve members of the public walking past the system ‘live time’. As such it would be a tool that helps aid the investigative process, by analysing still images or images that have been specifically extracted from a media source. The result of this analysis will present investigators with additional leads to consider.”
For his excellent Wired article about the deployment of RFR, Sam Woodhams spoke to Ella Jakubowska, policy advisor at European Digital Rights, who says that “Those deploying [RFR] can in effect turn back the clock to see who you are, where you've been, what you have done and with whom, over many months or even years. [It can] suppress people's free expression, assembly and ability to live without fear”.
Woodhams also mentions that “The London Policing Ethics Panel, an independent scrutiny group set up by the Mayor’s office, has been tasked with reviewing and advising the Met on its use of the RFR.”
In the minutes for their meeting on 8 November of last year the LPEP note that they had “a detailed discussion of the MPS’s plans for trialling the technology in the operational live environment.” That’s it.
It’s worth mentioning that, in October last year, the European Parliament called for an outright ban on the use of facial recognition on mass CCTV footage by police, citing the risk of misidentification or prejudice.
Body Cams
Body-worn cameras are a positive innovation, surely. They improve accountability in the police force and help provide irrefutable evidence of misconduct. Right?
Well, you’d definitely hope so, given that the Met has 26,500 body-worn cameras in operation right now. That’s the equivalent of having a camera on four out of every five serving police officers in London.
That stat comes from a Freedom of Information request obtained by Huck at the end of last year. A couple of weeks ago Huck wrote up the results of their findings in an article headlined Body-Worn Cameras Are Quietly Taking Over London. In that article the Met defend body cams as useful tools that can “enrich the qualitative detail of incidents to support the criminal justice process and provide an independent account,” but a Liberty spokesperson tells Huck that “the evidence does not show that police-controlled technology is effective for police accountability”.
Similarly, while the Sewell report released in the wake of the Black Lives Matter protests recommended that body cams would “counter ethnic disparity” during stop and searches, Huck points out that “filming incidents alone will not correct the imbalance in the number of people actually stopped by police.”
Huck also notes that, while the Met’s cameras don’t use live facial recognition, the footage from them can, of course, be analysed using retrospective facial recognition.
News bits
As the fallout from Cressida Dick’s resignation continues, the first name to come up as a potential successor is that of Dame Lynne Owens. Owens (who is apparently Priti Patel’s first choice for the role) stepped down as head of the National Crime Agency last year after being diagnosed with breast cancer, but is now fully recovered and “seeking a new challenge”.
While the Home Office is “said to be looking at all options to freeze Khan out of the selection process,” the mayor has written an editorial for the Guardian saying that he doesn’t want to go back to the “bad old days of the Met from my childhood [when] it was commonplace to hear stories of racist, misogynistic and abusive conduct by police officers.”
If you have a Times subscription you can read the editorial by Nick Bowes of the Centre for London (and Sadiq Khan’s former director of mayoral policy) arguing that Cressida Dick’s resignation is proof the Met must be scrapped.
The London LGBTQIA+ Film Festival opens on March 16 at BFI Southbank. It will open with ‘Girl Picture’, which won the Dramatic Audience Award at Sundance; and it will close with ‘Tramps!’, Kevin Hegge’s documentary about the rise of the New Romantics in 1980s London.
The Michelin Guide has awarded 16 new Bib Gourmand awards for those restaurants it deems to be offering “good food at competitive prices”. Five of the sixteen are in London: Brutto in Clerkenwell; Pahli Hill in Regent’s Park and Marylebone; manteca in Shoreditch; and Humble Chicken and Imad’s Syrian Kitchen, both in Soho.
A new exhibition at the Horniman Museum promises to “explore the scientific, social and cultural role that cats and dogs play in our world”. It’s called, simply, Cats and Dogs and it’s on until October.
Cara Delevingne is going to be starring in a new ‘eco-thriller’ called The Climb, which will “tell the true story of female activists who protested Shell Oil's plan to drill in the Arctic by scaling the London Shard”.
LiB favourite Café Royal Books has just released David Hoffman — Whitechapel Markets 1972–1977, which includes fantastic images like this one:
Oh dear: even the Comparitech article notes that the London CCTV number may well be completely bogus. “However, in some instances, it may not be clear what cameras are included, meaning some private camera figures may also be included in the totals. We believe this may be the case for London, Indore, and Sydney.”
CCTV counts for London are notoriously dodgy. And it all goes back to people (like security companies) extrapolating from a survey based on walking around two streets in Putney.