Privacy Campaigners Slam Labour for Developing ‘Predictive Crime’ Algorithm Surveillance System
The Ministry of Justice has been slammed for developing a new surveillance programme which uses algorithms to trawl through the personal data of thousands of people in order to predict the likelihood that an individual will commit a serious crime in the future. According to documents obtained via a Freedom of Information request by civil liberties group Statewatch, the project – originally called the “Homicide Prediction Project” under Sunak’s government, since renamed “Sharing Data to Improve Risk Assessment” – gathers vast troves of personal information on people without a criminal conviction, including victims of crime. Officials insist only data from those with at least one criminal conviction has been used in the ‘research’. The surveillance state strikes again…
Rebecca Vincent, Interim Director of Big Brother Watch, tells Guido:
“We are alarmed by reports that the government is developing a programme enabling machines to predict who might become a murderer. We know that algorithms can get it wrong, that AI can get it wrong, and that police themselves can get it wrong, even when crimes have already taken place – they must not be allowed to use pervasive technology to target innocent people who have not committed any crime. The privacy implications are enormous, representing a human rights nightmare reminiscent of science fiction that has no place in the real world, and certainly not in a democracy. This dangerous programme should be immediately scrapped and should never see the light of day.”
Data handed over for the project will include names, dates of birth, gender, ethnicity, and a unique identifier from the Police National Computer. I’m arresting you for the future murder of Sarah Marks…