In 2020 a team led by Professor Emma Barrett and Professor Steve Pettifer was commissioned by GCHQ to examine how child exploitation and abuse may evolve in light of the latest emerging technology trend: the widespread adoption of immersive ‘eXtended Reality’ (XR) technologies, including Augmented Reality and Virtual Reality. Their report synthesises research on online child sexual exploitation and abuse in the context of XR. In this blog, Professors Barrett and Pettifer explain how offenders use technology and how they might use XR to create new routes, and extend existing ones, to access, exploit and abuse children. They make several recommendations for policymakers and industry leaders to prevent XR technologies from harming children, before it is too late.
- XR technologies are increasingly affordable and accessible to a wide variety of users – including children.
- Existing regulation and safety measures do not go far enough to protect children using XR technologies.
- The Online Safety Bill should address XR safety explicitly, give details of how XR platforms can assess and mitigate XR risks to children, and mandate the development and implementation of safety standards.
What are XR technologies?
Although XR technologies and content have been around for several decades, the late 2010s saw a rapid increase in interest and use. This accelerated in the early 2020s when Meta (formerly Facebook) declared that they were investing heavily in the ‘metaverse’, a term that encompasses technologies and applications (including XR) that allow users to live, work and socialise in a digital universe. Meta’s announcement prompted a wave of media and consumer interest, and efforts by other companies to follow suit.
Examples of XR technologies include:
- Augmented Reality (AR) – user can view digital content overlaid on the physical world, perhaps via a smartphone app or through a specially designed headset
- Mixed Reality (MR) – a believable synthetically generated 3D environment is blended with the physical world around the user
- Virtual Reality (VR) – an artificially mediated, immersive experience – usually simulating a 3D world that replaces sensory input from the physical world.
Experiences available through XR are becoming more accessible and less expensive, making equipment, games, and applications affordable for more consumers. VR applications include interactive experiences, such as exploring virtual social worlds, engaging in single- and multi-player gaming, and fitness activities. Other applications let users explore an environment, watch film and video in a virtual cinema, and attend live and recorded performances. Many applications depict other characters in the environment who interact with the user. In multi-player games or social VR applications these characters include ‘avatars’ embodying fellow human participants, but may also include synthetic, computer-controlled characters.
The creation of XR digital environments no longer requires particularly sophisticated equipment or expertise; ‘amateur’ immersive video is common, including in the rapidly growing adult VR sector.
What are the risks to children?
We can think of the potential risks to children as:
- content – being exposed to sexually explicit or violent material
- contact – being approached and engaged by adults
- conduct – engaging in risk-taking behaviours such as seeking illegal content or sharing intimate personal images
Recently, public attention has been drawn to some of these risks to children in VR. For instance, journalists ‘undercover’ in VR social spaces reported that children under 10 were approached by adults, exposed to pornographic and violent content, and to abusive and explicit comments.
Reducing risks is challenging as very few applications have been designed with child safety in mind. Platform providers like Meta VR and Sony VR state that young children should not use their headsets – as if this resolves the need to ensure that applications are safe for children. There is currently nothing to stop children using headsets registered to other family members or friends, and limited ability for parents and guardians to monitor or restrict their children’s VR experiences. Although Meta announced in March 2022 that it would introduce some parental controls, including the ability to restrict access to some apps on headsets registered to over-13s, these controls are relatively modest and children could still use someone else’s headset to access inappropriate apps. At present, it is not difficult for curious children to seek out adult experiences, including sexually-oriented content and social spaces, and for adults to use VR social spaces as a new venue for child grooming. However, age assurance in VR spaces may be possible: for instance, sensors in a VR headset can be used to detect if it is being used by someone much shorter than the registered user, which could be a useful way to protect children.
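To make the height-based age assurance idea concrete, here is a purely illustrative sketch of how a platform might flag a session where the tracked headset height is much lower than the registered user’s calibrated height. The function name, threshold, and data are our own assumptions for illustration, not any vendor’s actual system or API:

```python
# Illustrative sketch only: flag sessions where the median tracked headset
# height is markedly lower than the registered user's calibrated standing
# height. Names, threshold, and values are hypothetical assumptions.

def height_mismatch(registered_height_cm: float,
                    session_heights_cm: list[float],
                    tolerance_cm: float = 15.0) -> bool:
    """Return True if this session's median headset height is far below
    the registered user's calibrated height, suggesting a different
    (possibly much younger) person may be wearing the headset."""
    if not session_heights_cm:
        return False  # no tracking data: nothing to compare
    ordered = sorted(session_heights_cm)
    median = ordered[len(ordered) // 2]
    return (registered_height_cm - median) > tolerance_cm

# Example: adult registered at 178 cm, but the session tracks around 120 cm
print(height_mismatch(178.0, [118.0, 121.0, 119.5, 122.0]))  # True
print(height_mismatch(178.0, [175.0, 177.0, 176.0]))         # False
```

A real deployment would need to handle seated play, calibration drift, and privacy constraints; the point of the sketch is only that headset telemetry could, in principle, feed such a signal.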
Attempts at moderation also cause issues. Users can block and report other users, but that can be time-consuming and difficult, and dangerous behaviour is not always immediately obvious (grooming, for example). Once a user has reported someone, they rarely get feedback on what action has been taken. Plus, if a child is using someone else’s headset they may not want to draw attention to themselves by contacting the platform. Proving that abusive or dangerous behaviour has occurred can also be problematic if users do not record all of their interactions. Live VR interactions may leave very little in the way of a meaningful digital footprint. Another approach is for social spaces to be ‘policed’ by moderators: for example, moderators are common in some AltSpaceVR spaces, where they help prevent trolls from disrupting meetings. But anecdotal accounts suggest that even when moderators are present in spaces where children are at risk, little action is taken.
How might XR change the nature of abuse?
Recent years have seen the development of immersive video, erotic games, sexually-oriented social spaces, and a burgeoning sex-tech industry, largely focused around the use of haptic devices which replicate real world sensations and create tactile user experiences. The content available on these platforms and services is currently under-regulated, and has the potential to be used by those with an interest in abusing and exploiting children.
Another area of concern is computer-generated sexualised images of children in XR. Simulated abuse against child avatars has been present in online virtual worlds such as Second Life for many years, so it is unsurprising to see adult users adopt or create child-like avatars to be used in simulated sexual activity in AR and VR. Although real children may not be harmed directly, these virtual depictions can normalise the idea of sexual abuse of children. It is unclear whether current legislation banning possession of indecent images of children would cover computer-generated images of child avatars, even if the images are not of real children (Coroners and Justice Act 2009). If images or avatars like these are found to be illegal, this raises questions about how simulated sexual activity between XR avatars, which takes place in real time and may not result in a recording or other meaningful digital footprint, will be evidenced for a prosecution. Both these points should be addressed in the upcoming Online Safety Bill.
What can be done?
Consumer adoption of XR technologies will continue to grow, driven by improvements in mobile augmented reality and internet capability, development of better hardware, reduced costs, and the availability of immersive content. Industry commentators predict that use of XR tools will be commonplace in a few years – but issues of child safety need addressing now, before the harms we’ve identified become entrenched.
Recommendations that policymakers and industry should act on include:
- The Online Safety Bill, currently passing through Parliament, includes a range of measures to protect children online. Although the government has stated that the ‘metaverse’ will be in scope of the Bill, little consideration has yet been given to how the specific challenges of safety in VR and AR might be addressed. As it passes through the committee stage, we welcome efforts to scrutinise the drafting of the Bill to ensure it will apply to emerging XR contexts.
- The Bill places an obligation on platforms which host potentially harmful material to carry out risk assessments. Our analysis shows that platforms need to consider several types of risk to children, not just the widely publicised risk of grooming.
- The Bill requires platforms to have effective ways of mitigating these risks. But comprehensive moderation of content and activity will be all but impossible in virtual reality: human moderators cannot be everywhere and automated detection of harmful behaviour in real-time online XR interactions is not currently possible. This is another area where standards and guidance will be welcome, as will transparency in reporting on how instances of abusive and harmful behaviour are identified and dealt with.
- Technology firms are only just beginning to tackle grooming and other abuse-related activities in XR environments. In an ideal world, tech companies and XR app developers should anticipate and mitigate safety issues before their products are rolled out. We should be supporting and accelerating work to develop standards and guidance, to support XR ‘safety by design’.
- We are pleased to see companies like Meta begin to introduce VR parental safety tools for teenagers. But these must be rolled out more quickly, and new measures need to be developed to protect pre-teens as well.
- A key concern is determining how abusive and dangerous behaviour in XR can be identified and tracked and, if necessary, offenders prosecuted, while still protecting freedom of expression and user privacy. This will require innovative thinking and new tools for digital investigation and digital forensics, and consideration of how evidence of CSEA activity in XR could be laid before a jury.
- Meanwhile, children and their caregivers need much more education about the risks of XR technologies so they can make informed decisions about when and how to let children explore immersive worlds.
You can read the full report here. Jo, Counter Child Sexual Abuse (CCSA) Mission Lead at GCHQ, said the following about the report:
“GCHQ works with law enforcement, government and the third sector to significantly reduce harm to children and create a safer environment online by countering the volume and scope of offending. It’s important we’re prepared for future threats and this paper starts to map out how offenders might seek to abuse children in eXtended Reality (XR). This research helps to better understand the threat and take action early to ensure child safety is considered throughout the design and development of these emerging technologies.”
Policy@Manchester aims to impact lives globally, nationally and locally through influencing and challenging policymakers with robust research-informed evidence and ideas. Visit our website to find out more, and sign up to our newsletter to keep up to date with our latest news.