End Pre-Crime
Data and content are being weaponised to criminalise people without cause.
The police and criminal justice authorities are increasingly using tech, data and AI to identify people who they believe are at ‘risk’ of committing crimes.
These flawed tactics, which include the Met’s gang matrix, the data mining of social media and the Prevent programme, can result in people having action taken against them even though they haven’t committed a crime. These programmes undermine our presumption of innocence and embed and exacerbate the discrimination that is inherent in our criminal justice system.
That’s why we urge transparency, public consultation and a moratorium on the use of tech in policing that will only amplify systemic oppression.
Social Media Weaponisation
Online content is being mined and used as digital evidence to create gang narratives. In conjunction with extended criminal liability, the weaponisation of content and data is increasingly being used to imprison young Black people and people of colour for offences they have not committed.
Open Letter to Andy Burnham
15 Civil Society Groups call for an overhaul of discriminatory police practices in the wake of the case of the Manchester 10.
07 Oct 2022 By Sophia Akram
Young people are being criminalised for content
20 Mar 2023 By Sophia Akram
What’s wrong with ‘gang’ surveillance in the UK?
25 May 2023 By Sophia Akram
George Floyd’s Murder, Three Years On: Institutional Racism Hardwired in Police Tech
The Prevent Duty
Prevent operates in the pre-crime space, in which no offence has taken place; instead, people are surveilled and viewed as suspicious. It operates by extracting data and policing information in ways that further securitise the spaces of marginalised and vulnerable communities.
14 Feb 2023 By Sophia Akram
Prevent: Shawcross Review fails to address data harms and rights
28 Jun 2022 By Sophia Akram
‘Prevent’ and the attack on free speech
Facial Recognition
Live facial recognition technology generates sensitive biometric data. Its record of disproportionately misidentifying younger Black men in particular highlights the discriminatory nature of the technology.
18 May 2023 By Sophia Akram
Don’t use Beyonce to normalise live facial recognition
15 Feb 2023 By Sophia Akram
UK Facial Recognition – No Consent, No Oversight
KEEP UP TO DATE WITH OUR CAMPAIGNS
Subscribe to our newsletter to receive updates on the latest developments affecting your digital rights.
Sign up now