Using Artificial Intelligence to Identify Perpetrators of Technology Facilitated Coercive Control Report January 2023

Report Available on:

A collaboration between researchers from London South Bank University, Edge Hill University, Brighton University and De Montfort University was awarded £117,000 by the Home Office Perpetrators Research Fund 2021/22 for this study.

In 2019, the government publicly acknowledged that the Criminal Justice System is failing victim/survivors of rape and sexual assault resulting in an erosion of public trust and confidence. The abduction, rape, and murder of Sarah Everard by a serving police officer started a series of controversies that served to decrease public trust further. Serious sexual offences are also taking the longest time on record to go through Crown Courts in England and Wales, with the time from the first Crown Court hearing to the end of a case averaging nine months.

The government recognises that the volume of digital data and the length of time it takes to analyse it are significant factors in these delays and that they undermine police investigations and the prosecution process. Police forces report being overwhelmed by the exponential growth in the volume of digital evidence, with over 20,000 digital devices waiting to be processed. Victim/survivors of domestic abuse are also waiting up to four and a half years for the police to return their phones following an investigation. This, coupled with victim/survivors’ feelings of being ‘digitally strip searched’ (i.e., police interest in seemingly irrelevant information such as search histories relating to shopping or holidays), is contributing to high numbers of survivor/victims withdrawing from cases.

This study is one of the 21 projects funded by the Home Office for research on perpetrators of domestic abuse. It is interested in a specific form of domestic abuse known as Technology Facilitated Coercive Control (TFCC) and is focussed on the digital communication between (alleged) perpetrators and victim/survivors held on mobile phones. The purpose of this feasibility study was twofold:

  1. to test the viability of an Artificial Intelligence (AI) programme to identify perpetrators (including alleged perpetrators) of domestic abuse using digital communications held on mobile phones
  2. to examine police and victim/survivor attitudes towards using AI in police investigations.

Using digital conversations extracted from court transcriptions where TFCC was identified as a factor in the offending, the research team tested data sets using different AI methods and techniques. Natural Language Processing (NLP) tools, a subfield of AI, were also tested for their speed and accuracy in recognising abusive communication and in identifying and risk assessing perpetrators of TFCC.

Conscious of national concern about policing practices relating to Violence Against Women and Girls and that any AI programme would be futile without the co-operation of both the police and the public, two online surveys were devised to measure opinion. The first sought insight into the attitudes of victim/survivors, viewed as experts in domestic abuse, about using AI in police investigations. The second involved the police and questioned their views of using AI in this way.

Organisations that support victim/survivors of domestic abuse and that are known to the research team were approached for their help recruiting victim/survivor participants. To the team’s knowledge, this is the first time the views of survivor/victims about the role of AI in police investigations have been sought. Individual police officers, and those with connections to the police service, were approached and invited to complete the questionnaire. These organisations and individual participants were also asked to distribute the link to the survey amongst their wider networks. The link was also posted and promoted at regular intervals on Twitter and LinkedIn. As an incentive, victim/survivors of domestic abuse were offered the opportunity to enter a draw for the chance to win a £100 voucher.

A total of 81 victim/survivors from diverse demographics took part in the survey. Results showed that 70% of victim/survivors of domestic abuse were willing to share their digital data with the police if AI technology was used. Victim/survivors’ feelings of being ‘digitally strip searched’ were less clear-cut, as the responses were more evenly distributed. Comments in the text boxes suggest that victim/survivors of domestic abuse are curious about how AI can be used to help police with their enquiries but have concerns about the bias of such a programme, which is, at least in part, linked to a mistrust of the police. Victim/survivors were also aware of the importance of understanding TFCC within a wider context and were unclear as to the programme’s ability to do this. More qualitative research is required to gain an in-depth understanding of survivor/victims’ concerns and hopes for using AI in police investigations in the future.

Concern about AI’s ability to understand digital data within the wider context was echoed by some of the 28 police staff who participated in this survey. Bias in this technology was also an issue. Research shows that concern relating to AI bias is often misplaced, as it is the data that is subject to bias, not the programme itself. To mitigate this potential bias, further exploration is required utilising larger data sets. The results of both surveys also suggest that educating the public to dispel some of the myths around AI technology would be beneficial.

Domestic abuse cases involving TFCC were identified from newspapers and public databases. Six court transcripts were obtained; the digital communication between (alleged) perpetrators and victim/survivors was extracted, anonymised, and entered into a database. This provided a usable dataset of 219 messages. Because this research focussed on understanding the behaviour of (alleged) perpetrators of TFCC, only the communication threads of (alleged) perpetrators were used. This provided a total of 250 relevant messages, which were enriched with an additional 242 perpetrator messages obtained from online repositories. Data instances representing the absence of coercive and controlling behaviour were retrieved from Twitter, bringing the total number of messages used in this research to 1,012.
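The pooling and anonymisation step described above can be sketched in a few lines. This is an illustrative sketch only: the message texts, source names, and the `anonymise` helper are invented placeholders, not the project’s actual pipeline.

```python
# Illustrative sketch: anonymised messages from several sources are pooled
# into one labelled dataset (1 = coercive/controlling, 0 = neutral).
# All rows, names and field choices below are invented for illustration.

def anonymise(text, names):
    """Replace known personal names with a neutral placeholder."""
    for name in names:
        text = text.replace(name, "[NAME]")
    return text

# Invented example rows: (message, source, label).
raw = [
    ("Alex, you will regret ignoring me", "court_transcript", 1),
    ("Don't talk to Alex again", "online_repository", 1),
    ("Lovely weather today", "twitter", 0),
]

dataset = [
    {"text": anonymise(msg, ["Alex"]), "label": label, "source": src}
    for msg, src, label in raw
]

print(len(dataset), dataset[0]["text"])
```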

Three classifiers (algorithms that assign categories to text data at scale) were used in this study, namely Random Forest and Support Vector Machines (SVMs) with linear and RBF kernels. All were trained with embeddings from BERT, GPT-2, GloVe and Word2Vec. Results showed the technologies are both fast and accurate in identifying perpetrators of domestic abuse. Based on these encouraging findings, further research with larger data sets is necessary to train models to an in-depth understanding of TFCC and to test their application to diverse real-world scenarios.
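The classification setup above can be illustrated with a minimal sketch. Assumptions: scikit-learn stands in for whatever tooling the project used, a TF-IDF vectoriser stands in for the BERT/GPT-2/GloVe/Word2Vec embeddings so the example is self-contained, and every message and label is invented for illustration.

```python
# Minimal sketch of a linear SVM (one of the three classifiers named above)
# trained to separate coercive from neutral messages. TF-IDF is a stand-in
# for the report's embeddings; all messages and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

messages = [
    "If you leave I will make sure everyone knows",  # coercive (invented)
    "You are not allowed to see them again",         # coercive (invented)
    "Answer me now or there will be consequences",   # coercive (invented)
    "Shall we have pasta for dinner tonight",        # neutral (invented)
    "Running ten minutes late, see you soon",        # neutral (invented)
    "Thanks for picking up the shopping",            # neutral (invented)
]
labels = [1, 1, 1, 0, 0, 0]  # 1 = coercive/controlling, 0 = neutral

# Vectorise the raw text, then fit a linear-kernel SVM on the vectors.
model = make_pipeline(TfidfVectorizer(), SVC(kernel="linear"))
model.fit(messages, labels)

print(model.predict(["You are not allowed to see them again"])[0])
```

In the study itself, the same classifier interface would simply be fitted on embedding vectors rather than TF-IDF counts, and evaluated on held-out messages rather than the training set.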

This research has tested the feasibility of AI technology to address government concerns about the rate of convictions in cases of sexual and domestic abuse. Findings indicate cautious public support, from both police and victim/survivors, for using AI in police investigations. Early results suggest that this technology would speed up the police’s processing of digital data, cut down the length of time they hold victim/survivors’ phones, limit delays in court processing and reduce the number of victim/survivors of TFCC who withdraw from cases. Further research is required to test the generalisability of this project and to determine how it could best be used to increase the efficiency and effectiveness of the Criminal Justice System when dealing with domestic abuse cases.

Vanessa Bettinson

Professor of Criminal Law and Criminal Justice. Research interests include criminalisation of domestic abuse/coercive control through a doctrinal, socio-legal, interdisciplinary and practice lens.