GGWP is an artificial intelligence system that detects and combats toxicity in video games
When it comes to online games, we all know the "report" button doesn't do anything. Regardless of genre, publisher or budget, games are released every day without effective systems for reporting abusive players, and some of the world's biggest titles are notorious for hosting toxic environments. Franchises including League of Legends, Call of Duty, Counter-Strike, Dota 2, Overwatch, Ark and Valorant have communities so hostile that the reputation is part of their brand; recommending these games to new players usually comes with a warning about the abuse they'll encounter in chat.
The report button often feels like it sends complaints straight to a dumpster, one the moderation department sets alight every three months. According to legendary Doom and Quake esports pro Dennis Fong (better known as Thresh), that impression isn't far from the truth at many AAA studios.
"I won't name names, but some of the biggest games in the world, those reports just weren't going anywhere," Fong said. "It goes into an inbox that no one looks at. You feel that as a player, right? You're frustrated because you've reported the same person 15 times and nothing happens."
Game developers and publishers have spent decades fighting player toxicity, and they still haven't cracked it. Fong thinks he can help.
This week, he announced GGWP, an AI-powered system that collects and organizes player-behavior data in any game, allowing developers to address every incoming report with a mix of automated responses and real human moderation. Once it's integrated into a game ("it's literally like a line of code," Fong said), the GGWP API compiles player data to generate a community health score and break down the types of toxicity common to that title. After all, every game is a special snowflake when it comes to in-game chat abuse.
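GGWP's actual API is not public, but the integration described above (a single reporting call from the game, aggregated into a community health score) can be sketched as a toy in-memory model. The class name, event names and scoring formula below are all invented for illustration, not GGWP's real interface.

```python
from collections import defaultdict

class CommunityHealth:
    """Toy aggregator modeling a GGWP-style community health score.
    All field names and the scoring formula are hypothetical."""

    def __init__(self):
        # player_id -> list of event types seen for that player
        self.events = defaultdict(list)

    def ingest(self, player_id: str, event: str) -> None:
        # In a real integration, this would be the single API call
        # ("a line of code") made from the game server.
        self.events[player_id].append(event)

    def health_score(self) -> float:
        # Hypothetical formula: share of players with no abuse
        # reports against them, scaled to 0-100.
        if not self.events:
            return 100.0
        clean = sum(1 for evts in self.events.values()
                    if "abuse_report" not in evts)
        return 100.0 * clean / len(self.events)

game = CommunityHealth()
game.ingest("player_1", "match_complete")
game.ingest("player_2", "abuse_report")
print(game.health_score())  # 50.0: one clean player out of two
```

A real system would weight events by severity and recency rather than counting them equally; the point here is only the shape of the data flow.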
The system can also assign reputation scores to individual players, based on AI-led analysis of reported matches and a nuanced understanding of each game's culture. Developers can then attach automated responses to certain reputation scores, or even to specific behaviors, warning players that their ratings are dropping or handing out bans as needed. The system is fully customizable, so a game like Call of Duty: Warzone can operate under different rules than, say, Roblox.
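The per-game customization described above could look something like the sketch below: each title supplies its own table mapping behaviors to reputation penalties and score thresholds to automated actions. The rule names, point values and thresholds are assumptions for illustration, not GGWP's actual configuration.

```python
# Hypothetical per-game rule sets: behavior -> reputation penalty.
# A shooter and a kids' platform would tune these very differently.
WARZONE_RULES = {"slur": -10, "afk": -2}
ROBLOX_RULES = {"slur": -15, "scam_attempt": -8}

class ReputationTracker:
    def __init__(self, rules: dict, start: int = 100):
        self.rules = rules
        self.start = start
        self.scores: dict = {}

    def record(self, player: str, behavior: str) -> int:
        # Apply the game's penalty for this behavior, if any.
        score = self.scores.get(player, self.start)
        score += self.rules.get(behavior, 0)
        self.scores[player] = score
        return score

    def action(self, player: str) -> str:
        # Hypothetical thresholds a studio might attach to scores.
        score = self.scores.get(player, self.start)
        if score < 50:
            return "chat_ban"
        if score < 80:
            return "warning"
        return "none"

tracker = ReputationTracker(WARZONE_RULES)
tracker.record("p1", "slur")   # 90
tracker.record("p1", "slur")   # 80
tracker.record("p1", "afk")    # 78
print(tracker.action("p1"))    # warning
```

Keeping the rules as plain data, separate from the tracking logic, is what makes "fully customizable" cheap: a studio swaps a table, not the pipeline.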
"We quickly realized that, first of all, a lot of these reports are the same," Fong said. "Because of that, you can leverage big data and AI to help solve these problems. Most of this stuff is almost perfectly primed for AI to handle. And people just haven't really gotten around to it yet."
GGWP is the brainchild of Fong, Crunchyroll co-founder Kun Gao, and data and AI expert George Ng. It has raised $12 million in seed funding so far, backed by investors including the Sony Innovation Fund, Riot Games, YouTube founder Steve Chen, streamer Pokimane, and Twitch founders Emmett Shear and Kevin Lin.
Fong and his colleagues started building GGWP more than a year ago, and given their industry connections, they were able to sit down with executives at AAA studios and ask why moderation was so neglected. They found the problem was twofold: first, these studios didn't see toxicity as a problem they created, and therefore didn't take responsibility for it (call it the Zuckerberg Special). Second, there was simply too much abuse to manage.
Fong said that just one big game received reports concerning more than 200 million players in a single year. Other studio heads cited similar nine-figure numbers, with players generating millions of reports per title every year. And the problem is even bigger than that.
"If you're getting 200 million reports for one game of players reporting each other, the scale of the problem is massive," Fong said. "Especially because, like we just talked about, people give up, because it doesn't go anywhere. They just stop reporting people."
Executives told Fong they simply couldn't hire enough people to keep up. Plus, they generally weren't interested in building automated solutions in-house: if they have AI talent on staff, they want them making games, not moderating them.
In the end, Fong found that most AAA studios address only about 0.1 percent of the reports they receive each year, and their moderation teams tend to be laughably small.
"Some of the world's largest publishers have fewer than 10 people on their anti-toxicity teams," Fong said. "Our team is 35 people, and they're all product, engineering and data scientists. So we as a team are bigger than almost any publisher's team globally, which is kind of sad. We're very dedicated and committed to trying to help with this."
Fong hopes GGWP will introduce a new way of thinking about moderation in games, one that emphasizes teachable moments rather than outright punishment. The system can recognize beneficial behavior, such as sharing weapons or reviving teammates in adverse situations, and in response can apply boosts to that player's reputation score. It also allows developers to implement real-time in-game notifications, such as a warning that reads "you lost 3 reputation points" when a player uses an unacceptable term. Fong said a nudge like that could stop players from repeating the word, reducing the overall number of reports for that game. Studios would need to do a bit of extra work to implement such a notification system, but according to Fong, GGWP can handle the mechanics.
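The teachable-moment flow described here, rewards for prosocial play and an immediate warning for slurs, might be sketched as follows. The behavior names, point values and message wording are assumptions for illustration; only the "-3 reputation points" style of warning comes from the article.

```python
# Hypothetical reputation deltas: positive play earns points,
# abusive chat costs them, and every change can carry an
# in-game notification the studio chooses to display.
DELTAS = {
    "shared_weapon": 1,
    "revived_teammate": 2,
    "used_slur": -3,
}

def apply_behavior(score: int, behavior: str):
    """Return the new score and the notification text, if any."""
    delta = DELTAS.get(behavior, 0)
    new_score = score + delta
    if delta < 0:
        note = f"You lost {-delta} reputation points."
    elif delta > 0:
        note = f"You gained {delta} reputation points."
    else:
        note = ""
    return new_score, note

score = 50
score, note = apply_behavior(score, "revived_teammate")
print(score, note)  # 52 You gained 2 reputation points.
score, note = apply_behavior(score, "used_slur")
print(note)         # You lost 3 reputation points.
```

The design point is the feedback loop: the penalty and the explanation arrive together, in the moment, rather than as a silent ban weeks later.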
"We've completely modernized the moderation system," he said. "They just have to be willing to try it."
Source: Jessica Conditt, Engadget, Direct News 99