Gen Alpha and Gen Z video gamers face a new threat that their parents didn’t: Other players



Today, 45% of Americans over age 50 play video games. These players have witnessed the evolution from classic games such as Tetris and Pong to immersive titles like Grand Theft Auto. While the early games enraptured players, they were generally regarded as pretty tame. Atari and GameCube were usually played alone or with friends in one’s living room and only occasionally dipped into some kind of cartoon violence that was more comical than macabre.

Fast-forward to today, and kids are playing in a radically different gaming landscape. Setting aside the shocking, gory, and inappropriate content children can choose to explore, today's video games are rife with abuse and harassment inflicted on gamers by other players, who are often strangers. User-generated content (UGC) is a new danger kid gamers face that their parents didn't, and it's the most alarming aspect of gaming today.

In the context of video games, UGC refers to any content created by the players themselves: message boards, voice chat, usernames, and even metaverse interactions. And it can turn ugly fast. Users of the highly anticipated game Silent Hill: Ascension recently reported that the game's built-in chat feed is an "absolute cesspit" of unchecked profanity and slurs, and reports abound of video games and online chats serving as "hunting grounds" for sexual predators.

Why the gaming industry embraces UGC

Despite its risks to users, UGC has become a key part of the gaming experience. The gaming industry continues to embrace multiplayer, immersive experiences that put UGC at the center, in the name of profitability and popularity with its user base.

To gain the attention of the more than 212 million Americans who regularly play video games and grow their share of a $56.6 billion market, gaming companies ruthlessly compete for user engagement, and research shows the best way to do so is through interactive gaming experiences.

At the 2023 Code Conference, Roblox CEO David Baszucki spoke of his desire to turn the company's wildly popular online game community into a communications platform for all forms of virtual interaction and connection. There are benefits to this vision of interconnectedness, of course, but we must acknowledge the obvious downsides, which include safety risks to vulnerable populations, particularly children.

Parents see UGC’s harmful impact on kids firsthand

Parents are troubled by the rise of UGC in video games. Our survey of more than 1,000 American parents who are gamers themselves shows over half of them believe that today’s video games carry more of a risk for kids than those of their youth, with 44% citing in-game UGC as one of the most harmful aspects of video games for kids.

Over half of the parents we polled have encountered concerning and even illegal examples of UGC when they play themselves, including unchecked bullying, hate speech, and predatory behavior. Gamer parents believe this content can cause real harm, with 42% noticing adverse effects on their child’s mental health from playing video games.

The message is clear: Parents are less concerned about the impact of hyper-realistic in-game experiences, which research suggests do not cause offline harm, than they are about the content coming from other users.

What parents are asking for and how UGC could change

Despite the risks, parents also cite real benefits of their children playing video games, including enhanced spatial reasoning and social skills and a community their kids can feel good about.

However, 66% of gamer parents also want to balance these benefits with enhanced features on their gaming consoles that facilitate more effective control over inappropriate content. As developers work to safeguard all players, we should see the introduction of tools to restrict access to video games based on maturity ratings, disable chat functionality, activate privacy settings to regulate the child’s interactions, and impose limitations on overall screen time.

While there certainly should be a zero-tolerance policy for crimes like sexual grooming, we should also see gaming companies leave more room for corrective remediation, since child players may still be developing impulse control. Gaming giant Ubisoft recently released a guide aimed at helping players address their own toxic behaviors online, while the gamer chat tool Discord is shifting to a more nuanced approach to inappropriate behavior that considers violation severity and focuses on rehabilitation.

Protecting future players

The future of online gaming will incorporate more in-game user interaction, not less. Despite eye-watering revenue losses, the metaverse is still happening, and users are finding new ways to interact across all virtual spaces.

As the gaming industry increasingly adopts player-to-player engagement, it must also remain vigilant about the potential harms of user-generated content, particularly to children. The parents who decide which video games their kids play are certainly paying attention to this threat, so we must address the growing harms of in-game UGC now, before it's game over.

Alex Popken is the VP of trust and safety for WebPurify, a leading content moderation service.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.