I thought he was playing an innocent game

Image caption: Roblox is a multiplayer platform where players can create their own games and join in with others

Online multiplayer game Roblox, which has 90 million users worldwide, is marketed at children – but there are fears it is also being used to groom them. One mother explains how this happened to her young son.

“They were talking about rape. They were talking about sexual activities that were pornographic,” Sarah – not her real name – says, recalling some of the graphic messages sent to her child.

He had been playing Roblox online – where users build their own games and create characters with coloured blocks.

For Sarah, it initially seemed like an “innocent game”.

She had turned on parental controls, so her son – not yet a teenager – could not send messages.

But, over time, she noticed a change in his behaviour.

He would no longer want to join in with family activities he usually enjoyed.

Concerned, she decided to check the game – and discovered he had been communicating with others on a third-party app.

It was at that point she realised her son had been groomed into sending sexually explicit images of himself.

“We came across some pictures,” she tells the BBC’s Victoria Derbyshire programme. “It was horrifying. I was physically sick.”

Roblox told the programme it was unable to comment on individual cases but was committed to protecting the online safety of children.

It said its in-game chat had very stringent filters and that any photo exchange would have taken place on a third-party app that is not “affiliated or integrated with Roblox”.

It added: “It’s extremely important to be aware of these chat apps, particularly [those with an] ‘overlay’ feature making it appear to be part of whatever game is being played.”

Image caption: John Staines and John Woodley, from Est-E Safety Training, visit schools to warn children about the worst-case scenarios in online gaming

It is a situation former police officer John Woodley knows other parents have experienced too.

He visits schools across the country with colleague John Staines, warning children about the worst-case scenarios in online gaming, and says parents do not realise people still find ways to communicate with children despite parental controls.

On third-party apps, he says: “They can get them to send pictures and hold verbal conversations with them.”

For Amanda Naylor, Barnardo’s lead on child sexual abuse, the industry must do more to safeguard children.

She says that while Roblox can take action if issues are reported to it, children often do not understand the abuse that is happening to them, so do not report it in the first place.

In April, it was announced that internet sites could be fined or blocked if they failed to tackle “online harms”, such as terrorist propaganda and child abuse, under government plans.

The Department for Digital, Culture, Media and Sport (DCMS) has proposed an independent watchdog that will write a “code of practice” for technology companies.

Senior managers could be held liable for breaches, with a possible levy on the industry to fund the regulator.

‘Skilled up’ parents

Ms Naylor also believes parents should be “skilled up” in how to protect their children online, without being judged.

It is also important that when instances of grooming do occur, she adds, children are given adequate support afterwards – as it can have an impact on their future relationships.

Sarah says that, in her case, she contacted Roblox to ask how it had “allowed” her child to be groomed.

“They didn’t respond at all,” she says.

And when she took the case to the police and officers wanted access to the IP addresses of the suspected groomers, Roblox “refused”.

“They wouldn’t let our police have anything to do with it because we were in the UK and they are an American company,” Sarah says.

The police force Sarah was in contact with told the Victoria Derbyshire programme it only had the authority to investigate criminal offences that had occurred in the UK – and in this case the people contacting Sarah’s son were in another country.

Roblox told the programme players could report inappropriate behaviour using the “report abuse system” and users could then be suspended or have their accounts deleted.

‘Sexualised manner’

Sarah’s story is an extreme case, but other issues with Roblox’s gameplay have also been highlighted.

Last year, a US mother wrote a Facebook post describing her shock at seeing her child’s avatar being “gang raped” by others in the online game.

She posted screenshots that showed two male avatars attacking her daughter’s female character.

Roblox said it had banned the player who had carried out the action.

Image caption: Iain said the sexualised actions of other people’s characters in the game were “disgusting”

One father, Iain, tells the Victoria Derbyshire programme he had similar concerns, after he took control of his son’s character to understand more.

He says one player told his character to lie down, then lay on top of him and began moving in a “disgusting” sexualised manner.

As he stood up, the other person threatened to kill themselves if he left.

Iain says he contacted Roblox – but never had a response.

Roblox told the programme it was relentless in shutting down inappropriate material and had 24-hour moderators.

‘Life-destroying’

But according to both Sarah and Iain, more needs to be done to protect children.

Sarah says her son is still “in a very bad way”.

“He’s broken, and so are we. It’s life-destroying,” she says.

“I’ll never be able to take those pictures and words out of my mind.”

If you have been affected by any of the issues raised, support and advice is available via BBC Action Line.

Follow the BBC’s Victoria Derbyshire programme on Facebook and Twitter – and see more of our stories here.