
Forget AI, online safety is the biggest risk facing gaming companies

While the media frenzy around the transformative – and potentially perilous – power of AI continues, gaming companies risk missing the most dangerous trip hazard in their industry today: online safety.

John Brunning

February 9, 2024



Keeping gamers, many of whom are children, safe online is the biggest challenge facing gaming companies, which operate in an increasingly connected but hitherto largely un-policed digital ecosystem.

As gaming continues heading towards more user interaction, personalisation, community, and communications – both within games and on external platforms (for example, via video sharing on Twitch and chat in Discord channels) – gaming businesses will need to prioritise online safety.

Governments around the world are moving to set rules and parameters for social media platforms and other online networks where users are exposed to possible harm, including abuse and fraud.

The EU Digital Services Act becomes fully applicable from February 2024, and the UK Online Safety Act's duties are likely to start applying towards the end of this year or early in 2025.

The comparative maturity of online safety regulation stands in stark contrast to the still fairly nebulous rules surrounding AI – for example, the EU AI Act and parallel UK regulations, which are still at the draft bill stage.

When new online safety laws come into force, affected gaming companies can expect steep fines (up to 10% of annual global turnover in the UK, 6% in the EU) and even criminal liability for some senior individual employees for non-compliance.

These regulatory penalties are in addition to the potentially devastating harm that may be suffered by gamers who are not sufficiently protected, and the reputational damage to gaming platforms that would flow from such incidents.

What should online gaming companies do next?

First, identify whether – and how – you fall within the scope of these regulations.

For example, online gaming firms should consider whether their game or platform enables users to generate and share content; whether it offers chat functionality (voice or text) or other communications between players; and where their players are based geographically, as this will determine which regulations they need to comply with.

Even if you are not located in the UK or the EU, you will still be caught if Europe is one of the target markets for your games or you have a significant number of users there. US games businesses, in particular, should not assume these new requirements don't apply to them.
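To make that scoping exercise concrete, here is a minimal TypeScript sketch of how a studio might record its answers internally. The field names and the pass/fail logic are simplifying assumptions for illustration, not the statutory tests, so treat the output as a prompt for legal review rather than a determination.

```typescript
// Illustrative scoping self-assessment. The criteria and the pass/fail
// logic below are simplified assumptions, not the statutory tests.

interface ScopingProfile {
  enablesUserGeneratedContent: boolean; // e.g. level sharing, custom skins
  hasPlayerCommunication: boolean;      // voice chat, text chat, DMs
  targetsUkOrEuMarkets: boolean;        // marketing, localisation, storefronts
  hasSignificantUkOrEuUserBase: boolean;
}

// Flags which regimes a title may need a closer legal look at.
function likelyRegimes(profile: ScopingProfile): string[] {
  const hasRegulatedFunctionality =
    profile.enablesUserGeneratedContent || profile.hasPlayerCommunication;
  const hasTerritorialLink =
    profile.targetsUkOrEuMarkets || profile.hasSignificantUkOrEuUserBase;

  return hasRegulatedFunctionality && hasTerritorialLink
    ? ["UK Online Safety Act", "EU Digital Services Act"]
    : [];
}

// A US studio with in-game chat and a significant EU player base is caught:
console.log(likelyRegimes({
  enablesUserGeneratedContent: false,
  hasPlayerCommunication: true,
  targetsUkOrEuMarkets: false,
  hasSignificantUkOrEuUserBase: true,
})); // -> ["UK Online Safety Act", "EU Digital Services Act"]
```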

Second, conduct an online safety risk assessment to understand your user base, the types of content generated, and the risk of harm to those users.

Part of this assessment will include considering whether the game is likely to be accessed by children, with the bar for 'likely access' set low.

If children are considered probable users of the game, you will be required to protect under-18s in the UK from "harmful" content, even if the game is not aimed at children and even if the content in question is not criminal.

Categories of "harmful" content will be set out in secondary legislation in due course – but until then, it is sensible to take a common-sense view of what those categories may include.
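One way to prepare for that uncertainty is to keep harm categories configurable rather than hard-coded, so the list can be swapped out once the secondary legislation lands. The category names in this sketch are hypothetical placeholders, not the statutory list:

```typescript
// Hypothetical harm categories. The real list will be set out in secondary
// legislation, so keep it configurable rather than hard-coded.
type HarmCategory = "bullying" | "self-harm" | "violent" | "pornographic";

interface ModerationPolicy {
  restrictedForUnder18s: Set<HarmCategory>;
}

interface ContentItem {
  id: string;
  tags: HarmCategory[]; // applied by moderators or automated classifiers
}

// A content item is shown to an under-18 only if none of its tags are restricted.
function visibleToMinor(item: ContentItem, policy: ModerationPolicy): boolean {
  return !item.tags.some((tag) => policy.restrictedForUnder18s.has(tag));
}

const policy: ModerationPolicy = {
  restrictedForUnder18s: new Set<HarmCategory>(["self-harm", "pornographic", "violent"]),
};

console.log(visibleToMinor({ id: "c1", tags: ["violent"] }, policy)); // false
```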

Third, where any issues or potential risks are identified, don't hesitate to act: implement changes – such as new complaints procedures, user reporting tools, internal training, and updates to your terms – as soon as possible to ensure compliance.
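As one illustration of the user reporting point, here is a minimal sketch of a report-intake record, assuming an in-house moderation queue. The type and field names are hypothetical, not a compliance specification:

```typescript
// Minimal report-intake sketch, assuming an in-house moderation queue.
// Type and field names here are illustrative, not a compliance specification.
interface UserReport {
  reportId: string;
  reporterId: string;
  targetContentId: string;
  reason: string; // in practice, a category picker plus free text
  receivedAt: Date; // timestamps help evidence timely handling
  status: "open" | "under-review" | "actioned" | "dismissed";
}

const moderationQueue: UserReport[] = [];
let nextReportId = 1;

function fileReport(
  reporterId: string,
  targetContentId: string,
  reason: string,
): UserReport {
  const report: UserReport = {
    reportId: `r-${nextReportId++}`,
    reporterId,
    targetContentId,
    reason,
    receivedAt: new Date(),
    status: "open",
  };
  moderationQueue.push(report);
  return report;
}

// Example: a player reports abusive chat; the report enters the queue as "open".
fileReport("player-42", "chat-message-9001", "abusive language");
```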
