'Big Buddy' could help prevent virtual bullying
New forms of artificially intelligent moderators which oversee 3D virtual spaces and intervene against bad behaviour could help protect children from online bullying and harassment, a new study suggests.

Researchers from the University of Glasgow collaborated with parents and their children to gauge their reactions to 'Big Buddy', a prototype virtual moderator for online social spaces in virtual reality developed by the team.

During the study, Big Buddy helped parents stay informed about their children's experiences in virtual spaces, and helped children feel safer and more secure by reacting to misbehaviour with punishments similar to those meted out by teachers in real-life classrooms.

The team's research, which will be presented as a paper at the ACM Interaction Design and Children conference on 20 June, could help inform the design of future AI-controlled moderators for use in virtual spaces like the Metaverse.

A group of 43 children aged between eight and 16, recruited with assistance from the Scottish anti-bullying service RespectMe and Giggleswick School, took part in the study along with 17 of their parents.

The children put on virtual reality headsets which placed them at a desk in a classroom environment the researchers created using the game development tool Unity 3D. Using handheld VR controllers, they played rounds of a game which tasked them with building towers of blocks in a timed competition against another classmate, whose actions were pre-programmed.