The UN Human Rights Council’s Advisory Committee (HRCAC) sought input and recommendations about technology-facilitated gender-based violence (TFGBV) and how to tackle this growing issue.
The Centre for Protecting Women Online (CPWO) submitted an interdisciplinary response which included perspectives from law and policy, psychology, technology and AI, and policing. The response highlighted current issues and gaps in tackling TFGBV.
The Four Nations Study, conducted by Professor Olga Jurasz (CPWO), examined the scale and impact of, and societal attitudes towards, TFGBV across the UK. It found that 15% of women across the four nations have experienced online violence, while 3 in 10 (30%) have witnessed it. It also found that most incidents (82%) of online violence experienced by women in the 12 months preceding the survey were text-based, while approximately a third (33%) were image-based.
Despite the prevalence of TFGBV, there are still significant gaps in understanding the full scope of online violence. Current approaches tend to fall short of addressing the variety of online harms, with a predominant focus on image-based abuses such as ‘revenge porn’ or ‘deepfakes’. This narrow lens is prone to overlooking other forms of harm such as doxxing, online harassment and cyberstalking, all of which are equally harmful and pervasive.
TFGBV infringes on numerous human rights, including, but not limited to, freedom of expression, the right to privacy and freedom from discrimination. Many forms of TFGBV, such as online harassment, can have a ‘chilling effect’, whereby women and girls share their opinions online less often or withdraw from online spaces altogether. This negatively impacts women’s and girls’ ability to access vital information, express their opinions or engage in activism.
The right to be free from discrimination is equally violated by TFGBV. Misogynistic or sexist content, and content inciting violence against women and girls, reinforces gender inequality because it discriminates on the basis of sex and/or gender. This is interlinked with freedom of expression: limited participation online often creates barriers to accessing information and opportunities, which further reinforces discrimination.
The Four Nations Study found that adolescent girls and young women are disproportionately targeted by TFGBV. LGBTQ+ and gender-diverse people, who are subjected to intersecting forms of oppression such as gender inequality, are also more likely to experience TFGBV. Younger women and girls are less likely to report instances of TFGBV, which may be due to the societal normalisation of online violence. Further, although Black and Afro-Caribbean women are no more likely than average to have experienced online violence, they are more likely to have witnessed it.
The role of AI in facilitating TFGBV is often overlooked. AI decision-making systems are increasingly relied upon in sectors such as law enforcement, healthcare and finance. These systems can reinforce gender stereotypes and prejudices, which is detrimental to the protection of women and girls. The Berkeley Haas Center for Equity, Gender and Leadership found that 44% of AI systems showed gender bias, with 25% showing both gender and racial bias. This exacerbates the inequalities women and girls face online.
Due to a lack of large-scale studies, it is difficult to understand the full scope of TFGBV and its impact on women and girls. Many studies focus on selected types of violence (e.g. image-based abuse) or particular contexts (e.g. violence against women in politics). Whilst these studies are valuable, they offer only partial insight into the problem.
Fragmented responses to the severity and impact of TFGBV lead to a lack of overall understanding. This affects how government and other organisations develop and adapt preventative measures and interventions. It also means the full extent of the harms arising from specific and distinct acts of TFGBV is not captured, which in turn affects women and, in the legal context, the avenues of redress available to them.
This is especially acute when considered through an intersectional lens, where factors such as race, sexual orientation and disability intersect with gender to create distinctive vulnerabilities. This not only makes it more difficult to develop effective prevention strategies, but also further marginalises the most vulnerable women and girls.
Research into the behaviours and motivations of perpetrators of TFGBV would support the development of preventative interventions that deter individuals from contributing to violence online. Additional funding would allow public awareness to be improved, alongside stronger accountability measures for digital platforms and greater victim support. Further research on the effectiveness of interventions is also crucial for identifying the strategies that best prevent and deter TFGBV. Evaluating existing legislation and policies aimed at addressing TFGBV will establish how legal frameworks can be adapted and strengthened while further deterring perpetrators from participating in online violence.
Whilst prevention of TFGBV is a current focus for the UK government, many of the strategies outlined, such as the Tackling Violence Against Women and Girls Strategy 2021, concentrate mainly on offline violence or on specific forms of TFGBV, leaving harms such as doxxing, online harassment and cyberstalking under-addressed. The National Audit Office has recently reported that these efforts have not improved outcomes for the victims of these crimes or the safety of women and girls more widely.
The Online Safety Act 2023 (OSA 2023) was designed to place a duty of care on online platforms to risk-assess and remove harmful content in order to protect everyone online. The legislation addresses online harms such as harassment, image-based sexual abuse and other communications offences, and contains provisions requiring online platforms to prevent and remove harmful material relating to TFGBV. However, the OSA 2023 has been criticised for failing to provide adequate enforcement mechanisms to ensure that platforms comply with these requirements, and significant gaps remain in its coverage of other forms of online violence.
Here you can find more information on the full consultation.
This blog post was authored by Iona Black.