With the COVID-19 pandemic reshaping our daily routines, young people, like everyone else, are spending even more time online, thereby increasing the risk of suffering online abuse.
A survey reported by the National Society for the Prevention of Cruelty to Children (NSPCC) before the pandemic revealed that, by 2019, as many as 200,000 young people in the UK may have been groomed on social networks, and that 21% of surveyed girls aged 11 to 17 had already received a request for a sexual image or message.
However, it is hard to know how many children and young people are affected by online abuse, mainly because many refrain from telling anyone due to feeling ashamed or guilty, not knowing whom to talk to, or simply not realising that they are being abused.
NSPCC Childline and the Child Exploitation and Online Protection Command (CEOP) are the main national stakeholders assisting children who feel they are victims of online abuse. A chatbot, a computer program that simulates human conversation, could increase their capacity to support children in self-assessing their situation.
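To make the idea concrete, the sketch below illustrates, in Python, one way a minimal rule-based self-assessment dialogue of this kind might be structured. It is only an illustration: the questions, severity weights, threshold, and signposting messages are hypothetical examples, not material from the study or from NSPCC or CEOP guidance.

# Hypothetical sketch of a rule-based self-assessment dialogue.
# All question wording, weights, and messages are illustrative only.

QUESTIONS = [
    ("Has someone online asked you to keep your chats a secret?", 2),
    ("Have you been asked to share a photo you did not want to share?", 3),
    ("Do the messages make you feel scared, ashamed or confused?", 2),
]

SIGNPOSTS = {
    "low": "It may help to talk this over with someone you trust.",
    "high": "Please consider contacting Childline or reporting to CEOP.",
}

def run_assessment() -> None:
    """Ask each question, tally a simple severity score, then signpost."""
    score = 0
    for question, weight in QUESTIONS:
        answer = input(f"{question} (yes/no) ").strip().lower()
        if answer.startswith("y"):
            score += weight
    # The threshold here is arbitrary; a real system would need
    # expert-designed scoring and escalation rules.
    level = "high" if score >= 3 else "low"
    print(SIGNPOSTS[level])

if __name__ == "__main__":
    run_assessment()

In practice, such a dialogue would need to be grounded in the vocabulary, tone, and safeguarding requirements identified in the research, rather than a fixed score as sketched here.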
This investigation, led by Dr Lara Piccolo, a human-computer interaction researcher at the Knowledge Media Institute, explored whether young people would rely on a chatbot for this kind of support, under which circumstances it could work, and how such a chatbot should be designed.
To answer these questions from a user-centred perspective, more than 100 school children aged 11 to 17 participated in one of eight workshops she organised in collaboration with the Safety Centre Alley MK in schools in Milton Keynes and London between October and December 2019. The workshop participants shared their views about using such technology and, as a co-creation activity, performed stories with Lego figures to simulate situations in which the chatbot could help a young person facing a stressful situation online.
The results reveal that most children welcome the possibility of interacting with a bot, as they would not feel embarrassed or judged. The younger participants in particular expressed that the chatbot could be the help they need when parents let them down or are unable to help. They expected the chatbot mainly to help them assess the severity of the situation, raise their confidence to take further action, support them emotionally, and provide advice on the next steps.
The dialogues generated by the participants during their 'performances' were analysed, suggesting appropriate vocabulary, the level of formality, ethical and privacy concerns, as well as the level of psychological support that potential users expect to receive from the chatbot.
This research report can be accessed here.
Lara can be contacted here.