Data-based profiling tools are being used by the UK Ministry of Justice (MoJ) to algorithmically “predict” people’s risk of committing criminal offences, but pressure group Statewatch says the use of historically biased data will further entrench structural discrimination.
Documents obtained by Statewatch via a Freedom of Information (FoI) campaign reveal the MoJ is already using one flawed algorithm to “predict” people’s risk of reoffending, and is actively developing another system to “predict” who will commit murder.
While authorities deploying predictive policing tools say they can be used to more efficiently direct resources, critics argue that, in practice, they are used to repeatedly target poor and racialised communities, as these groups have historically been “over-policed” and are therefore over-represented in police datasets.
This then creates a negative feedback loop, where these “so-called predictions” lead to further over-policing of certain groups and areas, thereby reinforcing and exacerbating the pre-existing discrimination as increasing amounts of data are collected.
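The mechanics of that loop can be sketched in a few lines of code. The simulation below uses entirely hypothetical numbers and is not based on any MoJ or police system: two areas have identical underlying crime, but one starts with more recorded arrests, and patrols are then allocated each round in proportion to the arrest record.

```python
# Illustrative only: hypothetical numbers, not MoJ or police data.
def simulate(recorded_arrests, patrols_per_round=100, rounds=10):
    """Each round, patrols are split in proportion to cumulative recorded
    arrests, and new recorded arrests scale with patrol presence."""
    arrests = list(recorded_arrests)
    for _ in range(rounds):
        total = sum(arrests)
        for i, count in enumerate(list(arrests)):
            # Recorded crime reflects where police look, not only where crime happens.
            arrests[i] += patrols_per_round * count / total
    return arrests

# Two areas with identical underlying crime; area A starts with more recorded
# arrests because it has historically been more heavily policed.
print(simulate([60, 40]))  # the 60:40 imbalance persists and the absolute gap widens
```

Because the data used to direct patrols is itself a product of earlier patrol decisions, the recorded picture never corrects itself, which is the feedback loop described above.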
Tracing the historical proliferation of predictive policing systems in their 2018 book Police: A Field Guide, authors David Correia and Tyler Wall argue that such tools provide “seemingly objective data” for law enforcement authorities to continue engaging in discriminatory policing practices, “but in a manner that appears free from racial profiling”.
They added that it therefore “shouldn’t be a surprise that predictive policing locates the violence of the future in the poor of the present”.
Computer Weekly contacted the MoJ about how it is dealing with the propensity of predictive policing systems to further entrench structural discrimination, but received no response on this point.
MoJ systems
The first of the two tools, known as the Offender Assessment System (OASys), was initially developed by the Home Office over three pilot studies before being rolled out across the prison and probation system of England and Wales between 2001 and 2005.
According to His Majesty’s Prison and Probation Service (HMPPS), OASys “identifies and classifies offending-related needs” and assesses “the risk of harm offenders pose to themselves and others”, using machine learning techniques so the system “learns” from the data inputs to adapt the way it functions.
The risk scores generated by the algorithms are then used to make a wide range of decisions that can severely affect people’s lives. This includes decisions about their bail and sentencing, the type of prison they’ll be sent to, and whether they’ll be able to access education or rehabilitation programmes while incarcerated.
The documents obtained by Statewatch show the OASys tool is being used to profile thousands of prisoners in England and Wales every week. In just one week, between 6 and 12 January 2025, for example, the tool was used to complete a total of 9,420 reoffending risk assessments – a rate of more than 1,300 per day.
As of January this year, the system’s database holds more than seven million risk scores setting out people’s alleged risk of reoffending, a figure that includes both completed assessments and those still in progress.
Commenting on OASys, Sobanan Narenthiran – a former prisoner and now co-CEO of Breakthrough Social Enterprise, an organisation that “supports people at risk or with experience of the criminal justice system to enter the world of technology” – told Statewatch that “structural racism and other forms of systemic bias may be coded into OASys risk scores – both directly and indirectly”.
He further argued that information entered in OASys is likely to be “heavily influenced by systemic issues like biased policing and over-surveillance of certain communities”, noting, for example, that: “Black and other racialised individuals may be more frequently stopped, searched, arrested and charged due to structural inequalities in law enforcement.
“As a result, they may appear ‘higher risk’ in the system, not because of any greater actual risk, but because the data reflects these inequalities. This is a classic case of ‘garbage in, garbage out’.”
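The “garbage in, garbage out” point can be illustrated with a small synthetic example. The data and figures below are invented for illustration and have nothing to do with OASys itself: two groups reoffend at exactly the same underlying rate, but offences by one group are recorded more often because it is more heavily policed, and a model trained on those records duly scores that group as higher risk.

```python
# Hypothetical, synthetic data: not MoJ data and not the OASys model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000
group = rng.integers(0, 2, n)                 # 0 = group A, 1 = group B
true_reoffend = rng.random(n) < 0.30          # identical 30% rate for both groups
detection = np.where(group == 1, 0.9, 0.5)    # group B is more heavily policed
recorded = true_reoffend & (rng.random(n) < detection)

model = LogisticRegression().fit(group.reshape(-1, 1), recorded)
scores = model.predict_proba([[0], [1]])[:, 1]
print(f"predicted risk, group A: {scores[0]:.2f}, group B: {scores[1]:.2f}")
# Roughly 0.15 vs 0.27: the gap reflects policing intensity,
# not any difference in underlying behaviour.
```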
Computer Weekly contacted the MoJ about how the department is ensuring accuracy in its decision-making, given the sheer volume of algorithmic assessments it is making every day, but received no direct response on this point.
A spokesperson said that practitioners verify information and follow detailed scoring guidance for consistency.
The second crime prediction tool is still in development, but the intention is to algorithmically identify those most at risk of committing murder by pulling together a wide variety of data about them from different sources, such as the probation service and the specific police forces involved in the project.
Statewatch says the types of information processed could include names, dates of birth, gender and ethnicity, and a number that identifies people on the Police National Computer (PNC).
Originally called the “homicide prediction project”, the initiative has since been renamed to “sharing data to improve risk assessment”, and could be used to profile convicted and non-convicted people alike.
According to a data sharing agreement between the MoJ and Greater Manchester Police (GMP) obtained by Statewatch, for example, the types of data being shared can include the age at which a person first had contact with the police, and the age at which they were first a victim of crime, including domestic violence.
Listed under “special categories of personal data”, the agreement also envisages the sharing of “health markers which are expected to have significant predictive power”.
This can include data related to mental health, addiction, suicide, vulnerability, self-harm and disability. Statewatch highlighted how data from people not convicted of any criminal offence will be used as part of the project.
In both cases, Statewatch says using data from “institutionally racist” organisations like police forces and the MoJ will only work to “reinforce and magnify” the structural discrimination that underpins the UK’s criminal justice system.
“The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems,” said Statewatch researcher Sofia Lyall.
“Like other systems of its kind, it will code in bias towards racialised and low-income communities. Building an automated tool to profile people as violent criminals is deeply wrong, and using such sensitive data on mental health, addiction and disability is highly intrusive and alarming.”
Lyall added: “Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed.”
Statewatch also noted that Black people in particular are significantly over-represented in the data held by the MoJ, as are people of all ethnicities from more deprived areas.
Challenging inaccuracies
According to a 2015 official evaluation of the risk scores produced by OASys, the system’s accuracy varies by gender, age and ethnicity, with the scores it generates being disproportionately less accurate for racialised people than for white people, and especially so for Black and mixed-race people.
“Relative predictive validity was greater for female than male offenders, for White offenders than offenders of Asian, Black and Mixed ethnicity, and for older than younger offenders,” it said. “After controlling for differences in risk profiles, lower validity for all Black, Asian and Minority Ethnic (BME) groups (non-violent reoffending) and Black and Mixed ethnicity offenders (violent reoffending) was the greatest concern.”
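One common way evaluators compare “relative predictive validity” across groups is to compute a discrimination metric, such as the area under the ROC curve (AUC), separately for each group. The sketch below uses synthetic data purely to show the mechanics of such a check; it does not reproduce the 2015 evaluation’s methodology or figures.

```python
# Synthetic illustration of per-group predictive validity, not the 2015 evaluation.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

def group_auc(n, signal):
    """Scores that track the true outcome more weakly yield a lower AUC."""
    outcome = rng.integers(0, 2, n)
    score = signal * outcome + rng.normal(size=n)  # noisier signal -> lower validity
    return roc_auc_score(outcome, score)

print("group with stronger signal:", round(group_auc(5_000, 1.5), 3))
print("group with weaker signal:  ", round(group_auc(5_000, 0.5), 3))
# A tool can look accurate overall while discriminating less well
# (lower AUC) for particular groups.
```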
A number of prisoners affected by the OASys algorithm have also told Statewatch about the impacts of biased or inaccurate data. Several minoritised ethnic prisoners, for example, said their assessors entered a discriminatory and false “gangs” label in their OASys reports without evidence, a decision they say was based on racist assumptions.
Speaking with a researcher from the University of Birmingham about the impact of inaccurate data in OASys, another man serving a life sentence likened it to “a small snowball running downhill”.
The prisoner said: “Each turn it picks up more and more snow (inaccurate entries) until eventually you are left with this massive snowball which bears no semblance to the original small ball of snow. In other words, I no longer exist. I have become a construct of their imagination. It is the ultimate act of dehumanisation.”
Narenthiran also described how, despite known issues with the system’s accuracy, it is difficult to challenge any incorrect data contained in OASys reports: “To do this, I needed to modify information recorded in an OASys assessment, and it’s a frustrating and often opaque process.
“In many cases, individuals are either unaware of what’s been written about them or are not given meaningful opportunities to review and respond to the assessment before it’s finalised. Even when concerns are raised, they’re frequently dismissed or ignored unless there is strong legal advocacy involved.”
MoJ responds
While the murder prediction tool is still in development, Computer Weekly contacted the MoJ for further information about both systems – including what means of redress the department envisages people being able to use to challenge decisions made about them when, for example, information is inaccurate.
A spokesperson for the department said that continuous improvement, research and validation ensure the integrity and quality of these tools, and that ethical implications such as fairness and potential data bias are considered whenever new tools or research projects are developed.
They added that neither the murder prediction tool nor OASys use ethnicity as a direct predictor, and that if individuals are not satisfied with the outcome of a formal complaint to HMPPS, they can write to the Prison and Probation Ombudsman.
Regarding OASys, they added that the system is made up of five risk predictor tools, which are revalidated to ensure they effectively predict reoffending risk.
Commenting on the murder prediction tool specifically, the MoJ said: “This project is being conducted for research purposes only. It has been designed using existing data held by HM Prison and Probation Service and police forces on convicted offenders to help us better understand the risk of people on probation going on to commit serious violence. A report will be published in due course.”
It added the project aims to improve risk assessment of serious crime and keep the public safe through better analysis of existing crime and risk assessment data, and that while a specific predictive tool will not be developed for operational use, the findings of the project may inform future work on other tools.
The MoJ also insisted that only data about people with at least one criminal conviction has been used so far.
New digital tools
Despite serious concerns around the system, the MoJ continues to use OASys assessments across the prison and probation services. In response to Statewatch’s FoI campaign, the MoJ confirmed that “the HMPPS Assess Risks, Needs and Strengths (ARNS) project is developing a new digital tool to replace the OASys tool”.
An early prototype of the new system has been in the pilot phase since December 2024, “with a view to a national roll-out in 2026”. ARNS is “being built in-house by a team from [Ministry of] Justice Digital who are liaising with Capita, who currently provide technical support for OASys”.
The government has also launched an “independent sentencing review” looking at how to “harness new technology to manage offenders outside prison”, including the use of “predictive” and profiling risk assessment tools, as well as electronic tagging.
Statewatch has also called for a halt to the development of the crime prediction tool.
“Instead of throwing money towards developing dodgy and racist AI and algorithms, the government must invest in genuinely supportive welfare services. Making welfare cuts while investing in techno-solutionist ‘quick fixes’ will only further undermine people’s safety and well-being,” said Lyall.