TIGA on protecting players in video games

Dr Richard Wilson OBE is CEO of TIGA, the award-winning trade association representing the UK video games industry. Here, he looks into issues around safeguarding players – particularly children – in video games and how the industry can tackle the problem.

We must ensure that players, vulnerable people and children are not exposed to harm, online or offline. We all have a role to play – whether we are parents, teachers, businesses or organisations. This undoubtedly includes the video games industry. There are many ways that we can tackle safeguarding issues as an industry, and we should take responsibility for finding solutions irrespective of what others are doing.

Issues around safeguarding players – particularly children – in video games and the online world have become a regular fixture in the news in recent years. This is understandable given that 77 per cent of 12-15 year-olds play games, while 91 per cent go online, spending nearly 21 hours a week online on average. It is the industry’s success and popularity that make its collective decisions so important.

Games businesses will want to protect their players for both commercial and moral reasons. TIGA research shows that studios including Jagex and Lockwood Publishing are taking concrete steps to safeguard players. TIGA has published a new report, Safeguarding Players: Responsibility and Best Practice, to promote and share best practice amongst games businesses.

Our report points developers towards TIGA’s six-point checklist: pre-empt the ways in which content can be abused; make the standard of acceptable behaviour explicit; make reporting technologies and systems of protection clear; provide the player community with the tools they need; understand the parasitic website challenge; and consider the distinct impact of VR.

To take one example, we encourage developers to ensure that community management protocols are robust enough to deal with all forms of problem behaviour. This includes looking at how content may be abused, creating easy methods of user reporting and setting clear, easy-to-follow community standards. We have made great strides since the early days of online gaming.
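To make the idea of easy user reporting concrete, here is a minimal, hypothetical sketch in Python. The names (PlayerReport, ReportQueue) and the escalation threshold are illustrative assumptions for this article, not any studio’s actual system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ReportReason(Enum):
    HARASSMENT = "harassment"
    GROOMING = "grooming"
    HATE_SPEECH = "hate_speech"
    SPAM = "spam"


@dataclass
class PlayerReport:
    reporter_id: str
    reported_id: str
    reason: ReportReason
    chat_excerpt: str  # evidence captured automatically when the report is filed
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class ReportQueue:
    """Collects player reports and escalates repeat offenders to human moderators."""

    ESCALATION_THRESHOLD = 3  # hypothetical: distinct reporters before escalation

    def __init__(self) -> None:
        self._reports: list[PlayerReport] = []

    def submit(self, report: PlayerReport) -> None:
        self._reports.append(report)
        distinct_reporters = {
            r.reporter_id
            for r in self._reports
            if r.reported_id == report.reported_id
        }
        if len(distinct_reporters) >= self.ESCALATION_THRESHOLD:
            self.escalate(report.reported_id)

    def escalate(self, reported_id: str) -> None:
        # Placeholder: in a real game this would route the case, with full
        # context, to a human moderation team.
        print(f"Escalating {reported_id} for human review")
```

The design choice the sketch illustrates: capture evidence automatically at report time so players are not asked to do the moderators’ work, and escalate on distinct reporters rather than raw report counts to blunt report-spamming.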

There is also an opportunity to use new technology for safeguarding. No studio can be expected to employ vast numbers of staff to monitor its online worlds; artificial intelligence offers a scalable way to help police online communities, and as an industry we should discuss the options this technology presents. Spirit AI has an interesting offering in this regard.
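Purely as an illustration (and not a description of Spirit AI’s product): one common pattern is to triage chat with cheap rules first, reserving a machine-learning model for the remainder and human moderators for ambiguous cases. The sketch below assumes a stubbed toxicity model and hypothetical thresholds.

```python
import re
from typing import Literal

Verdict = Literal["allow", "review", "block"]

# Hypothetical banned phrases; a real deployment would maintain curated,
# localised lists and update them as players invent evasions.
BLOCKLIST = re.compile(r"\b(bannedphrase1|bannedphrase2)\b", re.IGNORECASE)


def toxicity_score(message: str) -> float:
    """Stub for an ML toxicity classifier; returns a score in [0, 1]."""
    return 0.0  # replace with a real model call


def moderate(message: str) -> Verdict:
    if BLOCKLIST.search(message):
        return "block"  # cheap rules catch the obvious cases at scale
    score = toxicity_score(message)
    if score > 0.9:
        return "block"  # act automatically only on high-confidence decisions
    if score > 0.5:
        return "review"  # borderline cases go to human moderators
    return "allow"
```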

The checklist also reminds us that virtual reality gaming raises new and unique issues that demand special attention. Social VR and other connected experiences are increasingly presented as the future of the medium, meaning game studios need to apply the same rigorous community management processes used in any contemporary online world. Developers should also look past traditional safeguarding methods. Patrick O’Luanaigh of nDreams, for instance, has suggested that VR games could include a ‘safe button’ and personal space bubbles around players using social VR.
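As a purely illustrative sketch of the personal space bubble idea (assuming a simple 3D avatar representation rather than any particular engine’s API), the core mechanic is just a distance check applied before another player’s avatar is shown:

```python
import math
from dataclasses import dataclass


@dataclass
class Avatar:
    player_id: str
    x: float
    y: float
    z: float


def distance(a: Avatar, b: Avatar) -> float:
    """Straight-line distance between two avatars in metres."""
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))


def visible_avatars(me: Avatar, others: list[Avatar],
                    bubble_radius: float = 1.0) -> list[Avatar]:
    """Hide any avatar that intrudes inside the player's personal space bubble.

    A real engine would fade the intruder out and mute their voice rather
    than pop them in and out, and the radius should be player-configurable.
    """
    return [o for o in others if distance(me, o) >= bubble_radius]
```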

The UK games industry already has some excellent examples of best practice. By continuing to lead the way, we have the opportunity to make the online world as safe as possible.

MCV gives the industry a platform for its own views in its own words. Do you have a burning hot take for the world of games? Get in touch!
