How to reduce antisocial behavior in your game.
by Travis Ross on 11/02/12 12:45:00 pm   Featured Blogs

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.


Recently Bungie (Halo 4) and NCsoft (Guild Wars 2) have both taken a very aggressive stance on antisocial behavior in chat channels. Both companies have begun to ban players who use hate speech. As far as I can tell, banning such behavior can be difficult, and I haven’t heard reports of how well it is going. Trolls and griefers can be crafty and may find ways outside chat to reduce the enjoyment of the community. Since these companies have taken some excellent steps toward getting rid of bad behavior, I wanted to help. To that end, I’ve compiled some suggestions for building more prosocial online communities, based on my own research and that of others in the field of social policy. For more writing at the intersection of social science and games, check out my blog Motivate. Play. (shameless plug).

1: The overall goal should be to build community norms.

My dissertation focuses specifically on how game developers can use norms to reduce antisocial behavior in the online communities of games. Norms are powerful in that they emerge from community behavior. When a norm exists, the community can reduce the cost of surveillance: a developer can rely on community members to report or even sanction (punish) players who are behaving badly. This reduces costs for developers and empowers community members, possibly increasing their feeling of self-determination.

2: There are two types of norms. Both are important.

That’s right, there are two types of norms. The first are known as descriptive norms – these communicate what other players are doing. In other words, descriptive norms can be observed in the behavior of others (in chat channels they are broadcast). This information is extremely important for developers because humans have a propensity to copy others, and to use relatively small amounts of social information to inform decisions and form scripts about their environments. This means it is possible for antisocial behavior to spread through a community – especially if it is intrinsically motivating – and that a small amount of bad behavior can set off a chain reaction of bad behavior. Think flame wars, white-knighting, etc. When people get mad at trolls and griefers, revenge or good intentions can actually lead to more antisocial behavior.

The second type of norm is called a social or injunctive norm. These are important because they put social pressure on individuals: they communicate what others expect. Yet games often lack clear communication of these norms, which draw their power from shame and social expectation. In addition, players in online environments seem less responsive to shame, as they probably recognize there are no lasting reputational implications. In the real world, social norms are generally also accompanied by sanctions for transgression.

I want to make an important note about these two types of norms. They should be treated as separate motivational forces. In addition, research indicates that descriptive norms outrank social norms: if descriptive norms communicate that antisocial behavior is common, then social norms will not hold up. If enough people are behaving antisocially, no one will believe that others expect them to be prosocial. Interestingly, existing social norms can unexpectedly collapse if the public perception of descriptive norms changes – Scott Page talks about how this can happen.

3: Sanctions

Given that the worst trolls and griefers actually like it when others respond to their behavior (see flame wars), social norms (the expectations of others) are not enough to stop them and may sometimes encourage them. To stop the worst trolls and griefers you must punish them by taking away what they enjoy. One option is for developers to pay moderators to police and sanction deviants. This can work, but when there are a lot of players or games, policing can be expensive – telemetry and machine learning can certainly help.

Another option is for the players to do the sanctioning. After all, policing griefers and trolls isn’t dangerous, and if norms of prosocial behavior are in place players should feel an expectation to sanction others’ bad behavior. Sanctioning systems can actually be complex and difficult to implement. Why? If they are too powerful and easy to use, they become a tool for griefing (ironic). If they are too weak or costly, they won’t be effective or be employed by community members. I won’t go into all of the details of how to make a great sanctioning system – in fact there are still many questions for community designers and researchers to address. However, I will give an example – like any game design, this probably needs some iterative testing:

Points for antisocial behavior

  1. Players can earn points for bad behavior.
  2. Other players can assign points.
  3. Points expire after an amount of time.
  4. Points are multiplied when multiple players in one session report a transgression.
  5. Players who cross certain point thresholds are hit with a graduated sanction (sanctions increase with the multiplier). First take away voice communication, then take away the game.
  6. Players can file an appeal within x days.
  7. Other players are given tools to research an appeal (telemetry data) and are paid in virtual currency for answering appeals. Three random players must review the appeal and come to a consensus; if they do not, it is passed to an actual customer service agent. Players earn trust ratings for arbitration.
  8. Players who lose arbitration hearings earn additional points.
  9. Players who sanction a player who wins an arbitration earn points that reduce their own ability to sanction – for a long period of time.
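As a concrete (and deliberately simplified) illustration, the point rules above could be modeled roughly like this in Python. The thresholds, point lifetime, and names (`SanctionTracker`, `report`, `uphold_appeal`) are all assumptions for this sketch, not values from the post; the multiplier is just the count of distinct reporters in a session, and the whole appeal flow is collapsed into a single point-clearing call.

```python
import time
from collections import defaultdict

# Illustrative assumptions -- a real system would tune these iteratively.
MUTE_THRESHOLD = 5               # points before voice/chat is disabled
BAN_THRESHOLD = 10               # points before the game is taken away
POINT_LIFETIME = 30 * 24 * 3600  # rule 3: points expire after ~30 days


class SanctionTracker:
    """Toy model of the player-driven point system sketched above."""

    def __init__(self):
        # player id -> list of (timestamp, points) report entries
        self._points = defaultdict(list)

    def report(self, offender, reporters, now=None):
        """Rules 1, 2, 4: other players assign points, multiplied by the
        number of distinct reporters in one session."""
        now = time.time() if now is None else now
        points = 1 * len(set(reporters))
        self._points[offender].append((now, points))
        return self.sanction(offender, now)

    def total(self, player, now=None):
        """Rule 3: only points younger than POINT_LIFETIME count."""
        now = time.time() if now is None else now
        return sum(p for t, p in self._points[player]
                   if now - t < POINT_LIFETIME)

    def sanction(self, player, now=None):
        """Rule 5: graduated sanctions as thresholds are crossed."""
        pts = self.total(player, now)
        if pts >= BAN_THRESHOLD:
            return "ban"
        if pts >= MUTE_THRESHOLD:
            return "mute"
        return "none"

    def uphold_appeal(self, player):
        """Rules 6-7 (simplified): a successful appeal clears the points."""
        self._points[player].clear()
```

For example, two reporters in one session add 2 points ("none"), three more reporters push the total to 5 ("mute"), and a successful appeal drops the player back to no sanction. Note what this sketch leaves out: the three-reviewer consensus, trust ratings, and the penalty points of rules 8–9.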

4. Should sanctions be graduated?

One thing that is still uncertain in community management and research is whether players should simply be banned or whether sanctions should be graduated. There are arguments for both.

Ban Hammer

First, it seems that descriptive norms for antisocial behavior already exist in the chat channels of many online games. “That’s just gamers being gamers.” Or “Antisocial behavior is normal in these games.” Creating norms for players to sanction bad behavior may be difficult. And why would players sanction unless there are expectations that they should? There is something called the second-order free-rider problem, where group members don’t sanction because there is a cost. To get rid of the perception that antisocial behavior is normative or OK, the ban hammer may be required. In addition, sexism, racism, and foul language deserve severe punishment.

Graduated Sanctions

However, what if people can be rehabilitated? What if individuals are simply following the status quo for FPSes? In competitive environments it can be difficult to control one’s emotions, and sometimes people get frustrated. Could this be a teaching moment? Do people deserve a warning? Could this actually help people be more prosocial in real life? One finding of research in social policy is that very severe sanctions can actually be detrimental to a community. They don’t allow for second chances and can frustrate people or create enemies. This is especially the case when descriptive and social norms for a behavior don’t exist or are not clearly communicated – in other words, when players feel like they didn’t get a warning or understand what was expected of them, but still got the ban hammer. After all, antisocial behavior has been normative in these environments for some time.


In conclusion, a significant amount of research exists that could help community managers build and sustain communities where prosocial behavior is normative. What I’ve talked about is really only the tip of the iceberg and there are still a lot of questions about behavior in online communities that need to be answered. It is up to game developers and researchers to keep trying to figure out how to construct societies that promote prosocial behavior.

If anyone thinks this is interesting and would like to apply it to their communities, I’d be happy to talk about it in more detail – just leave a comment, tweet me, or shoot me an email.



Robert Marney
Descriptive norms are very powerful. Dark Souls for instance has mechanics that actively encourage anti-social PVP behavior, but the community has well-publicized, community-generated "codes of conduct" that completely reverse the trend. Given the game's abstruse nature, players are very likely to be visiting the fan wiki pages, forums, etc. where a dedicated core of pro-social fans can have a broad community impact. I've received many a respectful "bow" emote from people who could have stabbed me in the back with no in-game consequences.

Travis Ross
Rob, thanks for the comment. It’s always awesome to get actual examples from real games that I can tie to my theoretical work. I’ll have to check out the Dark Souls community. I played Demon’s Souls, but it was literally right before the big migration.

I agree with you that descriptive norms are powerful in games. I think that is partly because expectations are more difficult to communicate in games, where behavior is so obvious, can be aggregated as statistics, and is tied directly to success or failure. Establishing community-wide prosocial behavior is very difficult. And that’s why I’m doing research.

One of the things I don’t talk about in this piece is how context is also an essential element in the effectiveness of norms. Game theory can actually provide a method for identifying situations where norms will be powerful (coordination games and mixed-motive games). Of course those aren’t the only situations where norms appear, and the perceptions that individuals have of a game often differ (meaning the payouts of the games are structured differently). It seems like griefers, who have an intrinsic preference for dominating others or for disruption, have different preference functions than individuals who prefer prosocial behavior and cooperation. In addition, player types are certainly dynamic rather than static, meaning I might feel a motivation to push boundaries and grief others one day and just work religiously toward an achievement the next.

Eric Schwarz
One thing you have to consider with certain multiplayer games is the social contexts and states they exist in within the outside world. MMOs and casual console shooters are typically played by mass audiences and are meant to be enjoyed over long periods - many months, or years - on a weekly or daily basis. Many players use them to relieve stress and to relax after a long day, by traveling to a separate world which has different social boundaries.

The problem is that when you have a game which is designed to be played over such a long term, repeatedly, and with relatively repetitive gameplay whose novelty wears off in favor of repetition (grinding, repeat matches, etc.) fairly quickly, that gameplay becomes routine, and the game transitions from being an immersive, interactive experience into an alternate reality players inhabit, where not all attention is directed towards the game itself and instead it becomes more of a "hangout" or similar social environment.

Given such a game, I don't think it's possible to truly build game mechanics, systems or even communities which fully eliminate the problems you describe. In fact, in my experience, the games with the best communities aren't ones that have certain mechanics and aesthetics directed towards cooperation, but rather ones which are simply more mechanically complicated and require a higher level of attention, skill, dedication and even intelligence to enjoy. That might sound a little elitist, but it tends to be true - the players who are attracted to those games, as well as the expected social conduct within them, simply make harassment and similar issues far less prevalent.

Aleksander Adamkiewicz
"In fact, in my experience, the games with the best communities aren't ones that have certain mechanics and aesthetics directed towards cooperation, but rather ones which are simply more mechanically complicated and require a higher level of attention, skill, dedication and even intelligence to enjoy."

Games like EVE Online show you can have a mechanically complicated game with shifting emergent gameplay and still have repetitive gameplay at the same time.

I think it’s more than just the complexity and dedication that influence it, but they are factors.

Travis Ross
I wonder if players who get bored with games are more likely to grief. It’s an interesting question. One of the points that T.L. Taylor makes in Play Between Worlds is that she thinks griefers are often exploring the boundaries of play. I’m not sure that I completely agree with this, but I do think some players become bored with play in the game and start pushing on the social boundaries of the “hangout”.

Also, to follow up on your complexity statement: Aleksander mentions EVE. EVE is such an interesting case because it seems like there is a lot of in-group/out-group antisocial behavior. You get spaces where everyone is highly connected and spaces where there is lots of antisocial behavior. My own experience with EVE is limited, so correct me if I am wrong. I know EVE University is a cool example of prosocial behavior, but I’ve also heard horror stories of griefing, piracy, and never feeling very safe.

Thanks for the discussion and thoughts guys.

Laura Stewart
Although I don’t play multiplayer FPS regularly, I have been a member of several fan communities. Those that I’ve seen last the longest have self-adopted codes of behavior, such as a Rules thread thrown up for the first few days, then “signed into law” by the Admin. And Admins with Ban Hammers. And Thread Locks that shut down sections of the site if a flame war is ongoing, thereby providing an incentive for people not to egg on combatants.

I’ve seen smaller-scale versions of the Sanction process outlined above, but I’ve never seen them work successfully. Either popular instigators can avoid punishment, eroding confidence in the system, or it becomes another means to flame someone, by creating ghost accounts and over-reporting them. Although perhaps additional failsafes might be added.

Travis Ross
Laura, I think you are right that getting communities to actively enforce norms is a real challenge. You make the point perfectly that there is often no real sanction that hurts for griefers or trolls – often they can just make another account and keep doing what they were doing. One approach that might work is a tiered system where every new player is only a few sanctions away from getting chat disabled, temp banned, or sent to an “outland” where there are no filters. The thing I worry most about with a system like this is that an error would be very costly to the company, because a new player could get the boot before they had time to rank up for good behavior. Also, as you point out, sanctioning becomes a tool for griefers.

One of the games I play, iRacing, has a leveled license system (which is where I have seen ranking) that promotes good racing. They also have a very clear and strict “Sporting Code”; part of the EULA says you agree to the code. From what I can tell it works pretty well. Bans are employed for big things like intimidation and griefing. However, in the heat of a race it’s always easy for the second-place driver to wreck the first-place driver, and so on down the line, which creates a big incentive to wreck other players if you are behind them. To combat this they’ve locked the best content until you “level up,” so players have an incentive to drive nicely in order to reach better tracks and cars. Of course this creates an interesting problem where new players are stuck in the land of the terrible drivers. I think they still have some work to do teaching/incentivizing players to sanction bad behavior, and it certainly creates a situation where some people never get out of the land of bad drivers – but then, those people would ruin the fun for those who are trying to find an accurate simulation of real racing.

Wow, so maybe we are on to something here... instead of just leveling up for being good at killing stuff, you also have to learn to follow a prosocial code of conduct. It’s like a good kindergarten – you learn to get good grades and share. Thanks for the idea!

Georgios Christou
These are all great suggestions Travis! I've written some on the subject in a blog post (shameless plug:
anisms_For_MMORPGs.php#.UJtMxW_MhJA) where I outlined four suggestions on how to use sanctions as game rules to implement a policy of "social" behavior (using social as the opposite of anti-social here) in MMORPGs.
The way that you've framed the argument, in my opinion, applies more to the extraneous game artifacts, such as forums and wikis, and less to in-game behavior. This is not criticism, by the way. There is a need for regulating both in- and out-of-game behavior if a game wants to be "social-friendly". The reason I say this is because your suggestions target the communicative aspects of the game, more than the actual game rules / gameplay, at least when we refer to MMORPGs. I've spent countless hours observing the behavior of players in both instances, raids, and in major hangouts in several games. The behaviors that you describe will be evident during the "hangout time" in all of these cases. Therefore, we also need something to regulate, or to reward, good instance or good raiding behavior as well.

Thus, I definitely agree with you that social behavior needs to be rewarded somehow. Maybe this is what the game designers of SWTOR were trying to do when they implemented social points. However, their attempt, based on grouping rather than on actual behavior when in-group, does not get the desired results.

As for a pro-social code, it is very hard to implement, especially if you are interested in creating a large following for your game. People want to role-play, and sometimes they will take on the role of the "bad guy" (in trying not to use a worse characterization...) to examine those aspects of socializing, because being that is also socializing, albeit in a destructive form. So maybe you will be losing players if you decide to promote and apply a pro-social code, when in fact you want to have good retention and good uptake of your game by the largest possible population.

Keep up the good work!

Travis Ross
Thanks George, thanks for the encouragement :) I’ll check out your post. I totally get what you are saying (I think). One of the things I struggled with when writing my dissertation was: how important are norms for game developers? The fact of the matter is that the incentive structures and physical properties of game worlds can be changed – so why bother trying to get players to adopt norms when we can tweak the structures in the game to change behavior? For developers, I think being aware of how norms emerge and shape behavior is important, but it is definitely not the only – or in many cases the full – solution. One of my colleagues was saying to me the other day: you keep referring to norms and prosocial behavior as this “good” thing, but remember, if you are aiming for entertainment, diversity and antisocial behavior can be good. And of course he is right, but it sure would be nice if we could teach people: OK, role-playing the “bad guy” is fine, but actually being a bad person is not. <- maybe a bit of a paradox there. Must do research.

My interest in norms comes from a broad interest in how designed environments interact with motivation to shape individual behavior and thus collective behavior. I’m just getting my work off the ground, so I’ll be continuing to do research in this area and thinking about all the dynamic processes in games that lead to particular social outcomes.

Always open to collaboration and further discussion. Cheers.