
The Surveillance State

By Sara Chitseko

The Metropolitan Police have announced that they will be rolling out live facial recognition technology (FRT) in another flawed attempt to address crime in London. The technology works by matching faces captured on CCTV, in real time, against image databases and watch-lists. There is substantial evidence that FRT is ineffective and inaccurate, particularly when it comes to identifying Black and Brown people (gal-dem, 2020). Its deployment is likely to fundamentally worsen the already tenuous relationship between racialised, over-policed communities and criminal justice agencies, and is a further assault on our civil liberties.
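To make concrete what 'matching in real time' involves, the sketch below is a purely illustrative toy, not the Met's actual system. It assumes each face has already been reduced to a numerical embedding vector (as such systems typically do) and simply compares that vector against a hypothetical watch-list using a similarity threshold; all names, values and the threshold are invented for illustration.

```python
# Illustrative sketch only: a toy model of watch-list matching, assuming faces
# have already been converted to fixed-length embedding vectors. All names,
# values and the threshold below are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe, watchlist, threshold=0.6):
    """Return watch-list identities whose similarity to the probe face exceeds the threshold.

    The threshold is the contested setting: lowering it catches more true matches
    but also flags more innocent passers-by.
    """
    return [name for name, ref in watchlist.items()
            if cosine_similarity(probe, ref) >= threshold]

# Hypothetical example: one face captured from CCTV, two watch-list entries.
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
captured_face = watchlist["person_a"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(match_against_watchlist(captured_face, watchlist))  # likely ['person_a']
```

Everything in a deployment then hinges on how reliable those matches are, and for whom.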


A key concern with FRT is the risk of misidentification. A study by the US National Institute of Standards and Technology found that people from racialised groups were up to 100 times more likely to be misidentified than white people (Washington Post, 2020). Furthermore, because the technology is designed and trained by humans, on data humans have chosen, it can absorb the racism and prejudice of the society that produced it. This is particularly worrying if it is deployed in combination with existing discriminatory policing processes and methodologies. It runs the very real risk of implicating people in crimes they did not commit, leaving them to prove they are not who the FRT claims they are.
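A short, purely hypothetical calculation shows why this matters at the scale of street deployment. The scan volume and base error rate below are invented for illustration; only the roughly 100-fold disparity comes from the NIST finding cited above.

```python
# Purely hypothetical arithmetic: scan volume and base false-match rate are invented.
# Only the 100x disparity reflects the NIST finding cited above.
scans_per_day = 50_000               # assumed number of faces scanned in a day
base_false_match_rate = 0.0001       # assumed 0.01% false-match rate for white faces
disparity = 100                      # racialised groups up to 100x more likely to be misidentified

print(scans_per_day * base_false_match_rate)              # -> 5 innocent people wrongly flagged
print(scans_per_day * base_false_match_rate * disparity)  # -> 500 if the error rate is 100x higher
```

Even with generous assumptions, the harm of false matches falls overwhelmingly on the groups the system misreads most often.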


The decision to implement FRT cannot be isolated from the wider commitment to position policing as the primary solution to social issues. London is already a 'city of surveillance', with around 420,000 CCTV cameras making it the second-most monitored city in the world after Beijing (Financial Times, 2019). In the wake of recent terror attacks and rising serious violence across the UK, community safety must of course be a priority. But approaches like this perpetuate the notion that ever more surveillance is needed, rather than addressing the root causes of crime and offering tangible support and resources to reduce it.


Surveillance as a method of social control is not new. The family of Stephen Lawrence had their privacy invaded by police spying, and activist groups such as Extinction Rebellion have been placed on an extremism watchlist (The Guardian, 2020). These are just some of the cases that have been brought into the public domain. Many more individuals experience the pervasiveness of unjust police surveillance through 'gang' databases and electronic tags, among other methods of social control. It is paramount that attempts to reduce crime are genuinely effective, rather than paying lip service to 'public safety' while doing more harm than good.


A coalition of human rights organisations, race equality organisations, technology experts, parliamentarians and academics has released a statement calling for an urgent stop to facial recognition surveillance (Big Brother Watch, 2020). The European Commission has announced that, in order to prevent FRT being abused, it is considering a five-year ban on the technology in public areas in EU countries (The Guardian, 2020). At present, the UK has no statutory law with a clear and binding code of practice governing the use of FRT (ibid.). Storming ahead with new technologies that could fundamentally alter the relationship between police and the public is not only an invasion of privacy; it sets a worrying precedent for further measures that paint a picture of a society not far removed from a 'big brother' state, one in which our privacy rights are deemed worthless.


We must take action to resist these encroachments on our civil liberties. We must educate others on how measures like FRT feed into a wider government operation of increasing securitisation of Black and Brown communities. And we must build community resilience to subvert the injustices forced upon us.


References:


Big Brother Watch. (2020). Face Off – Big Brother Watch. [online] Available at: https://bigbrotherwatch.org.uk/all-campaigns/face-off-campaign/ [Accessed 13 Feb. 2020].


Financial Times. (2019). London sets standard for surveillance societies. [online] Available at: https://www.ft.com/content/70b35f8a-b47f-11e9-bec9-fdcab53d6959 [Accessed 13 Feb. 2020].

gal-dem. (2020). Facial recognition can't tell black and brown people apart - but the police are using it anyway | gal-dem. [online] Available at: https://gal-dem.com/facial-recognition-racism-uk-inaccurate-met-police/ [Accessed 13 Feb. 2020].


The Guardian. (2020). Facial recognition cameras will put us all in an identity parade | Frederike Kaltheuner. [online] Available at: https://www.theguardian.com/commentisfree/2020/jan/27/facial-recognition-cameras-technology-police [Accessed 13 Feb. 2020].


Washington Post. (2020). Study confirms racial bias of many facial-recognition systems, casts doubt on their expanding use. [online] Available at: https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/ [Accessed 13 Feb. 2020].
