Recently, the UK College of Policing conducted a consultation on new guidance on the use of data-driven technologies (DDTs) and data ethics in policing. DDTs process a wide variety of digitised data, including biometric data, to inform policing decisions. Examples of DDTs include artificial intelligence (AI), machine learning and automated decision-making.
This was an important consultation, because the breadth of personal, sensitive, biased and inaccurate information that data-driven technologies can process poses significant risks. For example, a woman in Spain was recently murdered by her violent husband after Spanish police, relying on a decision made by an algorithm that scored how likely a domestic violence victim was to be abused again, sent her home without granting any further protection. Data-driven technologies used for recruitment by private companies have also shown bias against women.
When deployed by the police, DDTs can be used in ways that negatively affect individuals’ lives, employment, state benefits and immigration status. There are significant concerns about the inaccuracy of these technologies which, coupled with the reputation of some police forces as racist, misogynistic and homophobic, raises serious concerns about police use of such invasive data-driven technologies.
The rapid advances in artificial intelligence and machine learning, and the police deployment of new technologies that seek to analyse, identify, profile and predict, have had, and will continue to have, a seismic impact on the way society is policed. The implications arise not solely from privacy and data protection perspectives, but from the ethical question, for a democratic society, of permitting the roll-out of such intrusive technology. The technology risks violating our dignity and contradicting the essence of our rights.
Dr Ksenia Bakina and Angel Pavon-Perez from The Centre for Protecting Women Online (CPWO) responded to the College of Policing consultation. The response highlighted numerous issues and concerns regarding the proposed authorised professional practices (APPs). In particular, the APPs lacked clarity, consistency and foreseeability across police forces, as well as a gender perspective in relation to data-driven technologies and data ethics.
Both APPs had crucial information missing, which made it difficult to understand how data-driven technologies will be implemented and used across forces. Limited explanation was provided of the risks that could arise from the deployment of these technologies, and it was unclear how those risks would be prevented. The use of invasive data-driven technologies poses serious threats to privacy and other fundamental rights, and must therefore be subject to strong oversight, safeguards and transparency measures; those measures must themselves be clear and implementable. The APPs do not explain in sufficient detail how police forces will safeguard people or promote accountability and understanding. Further, both APPs fail to specify who would be accountable for any unfair and biased decisions stemming from the use of DDTs.
Both APPs left a great deal to each individual police force to decide, which inhibited understanding, predictability, foreseeability and consistency. Consistent implementation would be difficult to achieve: the APPs are vague in many places, which could lead to numerous inconsistencies in how procedures, risk management and governance, among other things, are applied and how any purported safeguards are implemented. There are no provisions for audits to take place across forces, so many of these parameters will be left to individual project leads’ understanding. This is unlikely to lead to consistency, especially in the absence of any regular auditing requirements. The APPs also state that project leads should ensure that testing provides clear evidence, as far as possible in a controlled environment, that a DDT is robust, accurate and effective for its intended purpose, and that it works better than existing approaches and/or delivers a new policing capability. However, it is not clear how this will be measured, or who will evaluate and oversee these decisions. Such decisions can be subjective and can lead to different outcomes, further hindering consistency.
The data-driven technologies and data ethics APPs also failed to address the risks faced by women. The rapid development of technology and AI has replicated pre-existing gendered bias stemming from historical data that reflects societal inequalities. Despite this, both APPs are gender-blind and silent on the particular harms that data-driven technologies can cause to women when deployed. A recent study on online violence against women highlighted the mistrust that victims of online violence have towards the police following cases involving police officers such as Wayne Couzens and David Carrick. As a result, more needs to be done to safeguard women when invasive technologies are deployed by police forces.
Considering the above, the response made the following recommendations to the College of Policing:
Select this link for more information about the full response to the College of Policing consultation.
This blogpost was authored by Dr Ksenia Bakina.
Figure acknowledgement: Photo by ThisisEngineering on Unsplash