"Troll Pooling" Game Players with Negative Social Values
by Dmitri Williams on 01/02/14 02:39:00 pm

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

 

The social impact from customer to customer is usually a beautiful thing, but not all human beings behave beautifully. In any social network there are going to be those who--how do I put this delicately--are assholes, as immortalized in this classic Penny Arcade gem. There is little doubt that these people are out there and that they cause harm. If you've ever made the mistake of connecting to Xbox Live voice chat with random teammates in a shooter title, I'm sure you know what I'm talking about: some 12-year-old kid (or someone behaving like one) screaming obscenities, telling you you suck, and so on.

 

The Internet parlance for this sort of person is of course “troll.” Some of these players are awful and don’t realize it, while others take direct glee from the way they harm and annoy others. What does science have to say about this? Like just about anything, it can be measured, and in this case the process also suggests a solution.

 

When measuring the Social Value of players, it's important to know that this number isn't always positive. By definition, a real troll is someone whose social negatives outweigh their social positives. Some indirect measures of this are correlations between their sessions and others' churn, high complaint rates, and so on. I think far too often developers assume that trolls are the price one must pay to have a public game, but there are some science-based ways to deal with them.

 

Most of our approach is built on the logic of contagion (we’re far more prepared for the zombie apocalypse than you are, truly), and in that tradition we know that things spread through networks. It may be a virus, or a behavior, a meme, an idea, or even a mood. Well, a troll is a bad mood incarnate, and letting a troll interact with happy players is an infection in the network, spreading a bad mood and converting happy players into unhappy ones. Unhappy players play less, spend less and often quit.

 

In dollar terms, we see many players who have negative Social Values. This means that interacting with them directly causes others to play and spend less. Not everyone who comes into contact with them experiences this, but to earn a negative Social Value, a player has to be a net negative on others. The share of players who fall into this category is roughly 5-10%, but it varies wildly by genre and by the culture of the game. You would probably not find the same rates of trolling in Club Penguin as you would in, say, Halo 3. We hope.
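To make this concrete, here's a minimal sketch of how a net Social Value might be tallied from interaction data. The data shapes and the simple attribution model (a per-pair dollar delta already estimated elsewhere) are illustrative assumptions, not the author's actual methodology:

```python
# Sketch: a player's net Social Value as the sum of estimated
# changes in peers' spending attributed to playing with them.
# All names and the attribution model are illustrative.

def social_value(player_id, sessions, spend_delta):
    """sessions: list of (player_a, player_b) pairs who played together.
    spend_delta: dict mapping (player, peer) -> estimated change in
    the peer's spend (in dollars) attributed to playing with `player`."""
    total = 0.0
    for a, b in sessions:
        if a == player_id:
            total += spend_delta.get((a, b), 0.0)
        elif b == player_id:
            total += spend_delta.get((b, a), 0.0)
    return total

sessions = [("troll", "p1"), ("troll", "p2"), ("nice", "p1")]
spend_delta = {("troll", "p1"): -8.0, ("troll", "p2"): -5.0,
               ("nice", "p1"): 4.0}
print(social_value("troll", sessions, spend_delta))  # -13.0
print(social_value("nice", sessions, spend_delta))   # 4.0
```

The hard part in practice is the attribution itself (isolating a peer's spend change caused by one player); the tally above assumes that estimation has already been done.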

 

Are these 5-10% "bad" players? No, and they may even be good customers. What matters is putting their negative Social Value up against their LTV. If their LTV outweighs that negative, then they aren't costing you money in the long run. And if their LTV is huge, they may actually still be very valuable customers. Let's call them "Neutral Trolls."

 

But of course some apples are worse than others. In rare cases--less than 2% in most games--these players literally cost the developer money because their negatives outweigh their LTVs. They lose the developer more money than they spend themselves. These are the very, very few players who you might want to flat out ban, or, you know, send to the cornfield. Let's call them "Evil Trolls."
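The three labels reduce to a simple comparison of LTV against Social Value. A minimal sketch (the labels follow the post; the function and thresholds are illustrative):

```python
def classify(ltv, social_value):
    """Label a player by comparing their lifetime value against the
    dollar impact they have on others (social_value may be negative)."""
    if social_value >= 0:
        return "positive"
    if ltv + social_value >= 0:
        return "neutral troll"   # annoying, but still net revenue
    return "evil troll"          # costs more than they spend

print(classify(20.0, 20.0))   # positive
print(classify(20.0, -10.0))  # neutral troll
print(classify(20.0, -40.0))  # evil troll
```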

 

Why not just ban the Evil Trolls outright? Remember, they still spend money. The logic again goes back to the network, and to the "ripple on the pond" metaphor. You don't want players who are socially negative to pollute the pond, so what do you do? You put them in a separate pond. Depending on your game's culture and mechanics, you might try experimenting with putting them together into a "Troll Pool."

 

Let them pollute only each other and see what happens to their LTV. In some cases they’ll drop out altogether because they’ll be deprived of their main source of enjoyment (the suffering of others). You should be careful about whether you put both Neutral and Evil Trolls into these pools. Neutral Trolls after all are still valuable customers you don’t want to lose. They may be happy with each other, or they may dislike being with the pure haters, or they may actually miss the regular population. Remember that the Neutral Trolls are still a net positive to your bottom line.
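One cautious version of the pooling rule described above routes only the Evil Trolls into the separate pool and keeps Neutral Trolls with the general population. A sketch, with illustrative data shapes:

```python
def split_pools(players):
    """players: dict of player_id -> (ltv, social_value).
    Only players whose negative social impact exceeds their own LTV
    (the 'Evil Trolls') go to the separate pool; Neutral Trolls stay
    in the main population since they are still net positive."""
    main_pool, troll_pool = [], []
    for pid, (ltv, social_value) in players.items():
        if social_value < 0 and ltv + social_value < 0:
            troll_pool.append(pid)   # net cost to the developer
        else:
            main_pool.append(pid)    # net revenue, keep them happy
    return main_pool, troll_pool

players = {"alice": (20.0, 20.0),   # positive
           "bob":   (20.0, -10.0),  # neutral troll
           "carol": (20.0, -40.0)}  # evil troll
main_pool, troll_pool = split_pools(players)
print(main_pool, troll_pool)  # ['alice', 'bob'] ['carol']
```

Whether Neutral Trolls belong in the main pool or the troll pool is exactly the judgment call the post describes; the threshold here is just one option.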

 

You know your game best, so there's no one-size-fits-all solution here. Analytics without context are an epic fail. Consider the variations: you may be concerned that having any trolling will create a culture that keeps new players away. Some players are likely to be less tolerant of trolls and could easily be lost with one bad experience, or even from hearing about it from their friends: "Dude, don't bother playing Knights of Evil because the culture is so toxic it's not worth it." Or you may have a particular culture that celebrates trolling or is so laissez-faire that player behavior tracking is beside the point (*cough* EVE Online *cough*).

 

Or, you might use this negative Social Value thinking to guide player behavior through some kind of feedback mechanism. Riot Games has done some fantastic work enabling its players to play a role in reporting positive and negative behaviors and then even passing judgment. Sometimes a meaningful penalty can reform a troll. Other times they really need to go into their own special pool...

 


Comments


Joseph Hannes
I really like the concept of the troll pool fueling and feeding itself in a vicious cycle. What are the ways to identify players as trolls without having a reporting system? Does anyone have any good examples of doing this to share?

Luis Blondet
So, be happy, shut up or banhammer?

What's keeping the developer from labeling customers that complain or protest their decisions as trolls?

I wonder how long it will take for devs of online games to start banning players that criticize them in OTHER forums or venues because they are "costing them money"?

If a dev uses exploitative design and it is criticized by some of their own players for it, who is the one hurting the game community, the devs or the critics?

Brian Peterson
"What's keeping the developer from labeling customers that complain or protest their decisions as trolls?"

The thing that keeps (smart) developers from doing that is the need to build trust in their players and communities. Censoring players unfairly could generate negative PR, which could hurt a game's revenue much more than simply allowing a few voices of discontent. This is especially true of F2P games in which devs depend on dedicated players for a constant revenue stream.

If you read any DOTA, LoL, Wow, or Starcraft forum, you'll find the occasional player griping about their ban or mute. They always claim that they didn't do anything wrong, and sometimes that the developers banned them because they were critical of the game in public forums. Invariably, a community manager or other forum member will view that player's replays to find and publicly post the incredibly toxic behavior that they conveniently "forgot" to include in their original post.

Eventually, the community no longer trusts players who post about unfair bans, because the devs have proven that they can be trusted. Consumer trust is MUCH more valuable for monetization than silencing complaints or protests from individual players.

Dmitri Williams
I don't recommend classifying players as trolls for forum posting, and definitely not for criticism. I think it's relatively rare for forum behavior to cause other players to play (or spend) less, if for no other reason than that only a small % of players are on the forums. And, as you say, banning players for criticism of the developer would be a colossally bad idea for any number of reasons.

The idea here is about in-game behaviors directed at other players. This impacts more players, and impacts them more heavily.

How do you detect and objectively measure these players? You can put in player reporting systems à la Riot, or you can use some objective and unseen system such as the one we developed. Doing this right requires using the logic of social network analysis--you're looking at the interactions between players rather than their interactions with you, the developer.

Maria Jayne
It all depends on accountability: if it's easy to make a new account and a new persona, as in a free-to-play game, then having a naughty corner won't really help.

I've often felt some sort of cross-game accountability may be beneficial though. Especially for game testing: have a pool of volunteer testers for multiple titles, and if anybody gets banned from one for violating an NDA or being abusive, they are banned from all future game tests on all future games advertised to the pool. By participating in a test you get some sort of point accrual, which in turn makes you a more valuable/successful applicant for future game test acceptance.

Getting every developer to collectively agree on such a pool would be tricky though...somebody would probably control that information and charge for accessing it.

Carter Gabriel
Interesting perspective. It's a great solution to a problem, but IMO the problem isn't with the consumers people like to villainize. It is with the developers. At least in many games. However, it doesn't take the insight of a true designer to understand that a community's toxicity is the flaw of the developer.

It is no coincidence that some types of games are more toxic than others. It is the design of games like MOBAs, and the failures of their designers, that creates toxic communities.

If I were wrong, then the MOBA genre would not naturally create some of the most vile, toxic communities in gaming. If you want to be scientific, measure the mechanics that increase toxicity. Or just ignore "science" in favor of common sense. Anyone who looks at a MOBA design will see why it breeds toxicity. I mean, really? Competitive play, unbalanced skill levels among ten players, a design where a single player can ruin the game for the other nine on purpose or by accident, punishment for players who value their time, negative attitudes, unprofessional tournament players, free to play, huge time investment to be competitive, extremely low player age, and forcing players to waste hours of their lives or face a ban... all culminating in high emotions, easily angered players, easy targets for fun... oh dear God... MOBA games are a troll's wet dream, and transform even a good person into a foul, toxic monster.

FFA PvP games rely on trolling through griefing, and MMOs through one-sided combat. Of course people will get angry when unfairly beaten in a match they didn't want to participate in. Of course those who get joy from trolling people will love such a game.

A developer's design decisions can create trolls from normally good people at worst, and draw higher-than-average numbers of negative personality types at best. Or they can encourage cooperation and reduce trolling by limiting the fun that results from trolling.

Imagine the effects of muting people and making them unable to even read chat from others. What fun is there in trolling people when you can't read their responses because you were kicked from chat but not banned from the game? How willing will they be to troll when their chat ban lifts? Yet they still play, because they weren't banned from a game they paid good money for. Even better, what if they were unaware their targets couldn't read their words? What about a system where an AI begins speaking to the troll using the target's name? Responses to destroy the fun of trolling. Or responses to give the troll giggles while the real players are unaware anyone is even speaking?

Carter Gabriel
Sorry for the bad grammar. Using my tablet away from home, it seems to want to autocorrect, or maybe my fat thumbs get in the way.

Nathan Mates
I wouldn't say it's necessarily 100% the fault of the designer. In a PvP game, anything that gives you an advantage over other humans can and will be used. And, sad to say, but one player archetype is the type that likes to hurt others. That player archetype will be most effective in a PvP, but sometimes those players manage to hurt games that ought to be PvE. Example: stories of players in early Ultima Online and/or Everquest luring monsters from a graveyard (where they're expected) into towns (where they're not expected) to make life a pain for other players.

I would submit that PvP that has persistence and rewards -- unlike, say, early FPSs like Doom/Quake/Unreal that reset state every round -- will be where toxic players have the most effect. And that's where the toxic players will get their fun. Is it the designer's fault for adding persistence and rewards? No. Should the designer have foreseen all the ways jerks could try and grief their games and tried mitigations, even if only matchmaking jerks in their own pool? I'll give you that.

Steven Christian
I don't think you can blame players for wanting to affect the world in a meaningful way in an RPG.
MMOs are traditionally quite bland: everyone is the hero; everyone saves the day; every boss dies countless times, only to be reborn again for the next group.

Taking a monster or lethal debuff into a densely populated area where it is not expected can allow the player to have a real effect on the world.
It is an example of emergent gameplay in a genre where there is generally none
(theme-park MMOs: you get on the different rides; you get off again; you have no real agency).

If players want more agency, then perhaps we can give it to them in a positive way, rather than punishing them.

Andrew Wallace
So an evil person can become 'neutral' by spending more money?

Might want to think about your word choice.

Dmitri Williams
Definitely use other terms if they fit your context better. This was an attempt to provide a general set of labels.

Consider a player who has an LTV of $20, and whose presence in the community is a positive. She generates another $20 among others because players like playing with her. Clearly she's a positive.

Consider a second player who has an LTV of $20, but whose presence drives away $40. That person is a net negative, and actually costs the developer $20. Clearly he is a negative.

Now consider a third player who has an LTV of $20, but whose presence drives away $10. How do we think about this player? They are impacting the community negatively, but their own spending outweighs this. From a purely objective point of view, with any morality put aside, this player has a net contribution of $10. But they're not exactly a good guy, either. So, I go with neutral.

Whatever labels you're comfortable with, the larger point is that separating their personal LTV from their impact on others lets a developer come up with more nuanced, appropriate solutions that fit the context of their title, mechanics, CRM possibilities and community.

Kasan Wright
I believe LoL does something like this, but in reverse. It tends to group players who have been honored with special ribbons together with other players who have been honored with that same ribbon. I have a green 'Teamwork' ribbon in LoL and I'm almost always grouped with players who also have a green ribbon. In contrast, the enemy team usually has either no ribbons or red ribbons.

I'm guessing the idea is that players who are consistently identified as good teammates are more pleasant to play with and, in a way, that accomplishes the same idea you have here–everyone who is NOT honored as a team player is more likely to be grouped with others like them.

-Kaz

Ben Sly
This is a good way to stop players of a free game from getting out of the "time out corner" by making a new account: give good players rewards instead of just punishing bad players. The issue with this is that new players also get hurt by not playing with the good players. It doesn't mean that it's a bad idea overall, but that it should be implemented while being aware of the tradeoffs.

Doug Binks
Interesting, particularly as ever more games are adding multiplayer elements to traditionally single-player gaming ("mingleplayer", for example).

Creating a separate matchmaking pool for players who troll seems a decent alternative to banning. This does have the problem that it creates a potentially toxic pool, and 'troll' players may dislike playing with other 'trolls', exacerbating the situation.

We could instead let players vote on how much they enjoyed/disliked playing with someone and use connectivity metrics based on these to matchmake - making it more likely to match you with players who are within a close 'like' ratio to you.

For example if player A liked playing with player B, and player B liked playing with player C, then player C is 2 like distances away and so more likely to be matched with A than someone further away or who you didn't like playing with. Disliking someone both disconnects you from their connectivity graph and potentially adds a negative modifier, though it's likely you want the algorithm to allow people to draw closer as their playing style evolves.

The beauty of this system is that it is non-judgmental from the developer's perspective. It's likely that when an angry player verbally abuses another over their play, both players will end up not wanting to be matched with each other again. It also permits players who change, for example an angry solo player becoming a supportive team worker, to rebuild connections much like in real life. New players can be matched with people other new players enjoyed playing with, and as their skill develops they can evolve towards playing with others who suit their play style. Highly competitive players can co-exist with those who play for fun without any need to label groups.
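The hop-count part of this idea can be sketched with a breadth-first search over the "like" graph. The data shape and names are illustrative; a real matchmaker would also weight dislikes and recency:

```python
from collections import deque

def like_distance(likes, a, b):
    """likes: dict player -> set of players they enjoyed playing with.
    Returns the hop count along 'like' edges from a to b, or None if
    b is unreachable (e.g. cut off by dislikes)."""
    if a == b:
        return 0
    seen, frontier = {a}, deque([(a, 0)])
    while frontier:
        node, dist = frontier.popleft()
        for nxt in likes.get(node, set()):
            if nxt == b:
                return dist + 1
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, dist + 1))
    return None

# A liked playing with B, B liked playing with C: C is 2 hops from A.
likes = {"A": {"B"}, "B": {"C"}}
print(like_distance(likes, "A", "C"))  # 2
```

A matchmaker could then prefer candidates with small like-distance, falling back to skill-based matching for unreachable players.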

However, I also agree in part with Gabriel that developers share some responsibility to engineer the gameplay to support the style of play they desire. Awarding points and in-game resources for event completion to the player who finishes the action creates toxic potential, because it creates an exploit which damages player relationships: a player can 'steal' from another by letting the other player do most of the work, only jumping in at the last moment to secure the rewards. A better system would be to share the points and rewards based on work done towards the completed goal, with a bias system in place to reward those who complete.

For example, if player A is actively working on a goal and is about to complete but player B steps in and finishes the task, then player A should be rewarded more than B. But if player A starts the task, almost completes but then backs off and player B pursues the task then the rewards should be more equal or even potentially in B's favour. A simple system to solve this is a time based exponential decay of the reward share - once a player stops a task their potential share of the goal decays over time.

This requires a non-integer reward system: 'Player A Completed X Tasks' becomes 'Player A Contributed to Completing X.X Tasks', or better wording depending on your in-game context!
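A minimal sketch of that time-decayed reward share. The half-life, the completer bonus, and the data shapes are illustrative choices, not a prescription:

```python
def reward_shares(contributions, half_life=30.0,
                  completer=None, completer_bonus=0.1):
    """contributions: dict player -> (work_done, seconds_idle), where
    seconds_idle is the time since that player last worked on the task.
    A player's share decays by half for every `half_life` seconds of
    idleness; the finishing player gets a small bias. The result is
    normalised so the shares sum to 1."""
    raw = {p: work * 0.5 ** (idle / half_life)
           for p, (work, idle) in contributions.items()}
    if completer in raw:
        raw[completer] *= 1.0 + completer_bonus
    total = sum(raw.values())
    return {p: v / total for p, v in raw.items()}

# Player A did 90% of the work but walked away a minute ago;
# player B jumped in at the end and finished the task.
shares = reward_shares({"A": (0.9, 60.0), "B": (0.1, 0.0)},
                       completer="B")
```

In this example A still earns the larger share, but B's last-moment finish is worth far more than the raw 10% of work done, which is the bias toward completion the comment describes.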

Dmitri Williams
+1 Doug.

Lots of good ideas here and in the thread about how to correctly incentivize good behavior and disincentivize bad. And I like the ideas here about how to form and structure the pool. Ultimately, whatever the solution, it'll need to be measured for effectiveness so the dev knows "what works." This is essentially social AB testing, and it's very early days in us collectively knowing what works. I don't see developers using these kinds of transitive network hop ideas yet, but it's very fertile ground. The social science is very clear that some form of them will work.

