Community management in the culture wars
Raph Koster, Richard Vogel, and Gordon Walton say harassment and conflict are only getting worse, but there are ways to address it
Gaming has always been a popular form of escapism, but that might be changing. Games are increasingly social, allowing players to express themselves and their beliefs to one another. Combine that with a larger player base than ever before and plenty of fractious topics on which people disagree, and it's entirely predictable that many of the debates in the real world would spill over into virtual ones.
That said, disputes in game communities are nothing new. A trio of industry veterans--Richard Vogel, Raph Koster, and Gordon Walton, all of whom worked on the MMORPGs Ultima Online and Star Wars Galaxies--shared their own experiences and advice today in a Game Developers Conference session titled "Managing Game Communities Within the Culture Wars." It's not the first time the group has done this at GDC; it's just been 14 years since they last discussed the topic.
"That's part of the insidious problem of filter bubbles. It's that we actively collaborate in building them."
Raph Koster
"If this talk doesn't piss you off at some point, then maybe we're doing it wrong," Koster said.
Koster began by recapping what the industry knows about community trends today: greater polarization of views, an increase in apparent harassment campaigns, and more contentious relationships between developers and players. Koster said there are a few reasons for it, starting with one explained in the book "The Filter Bubble." The internet has been designed to filter people's search results based on what big companies like Google infer about them, Koster said. So someone searching for "abortion" in North Carolina may get a link to adoption agencies, while someone who does the same search in San Francisco may get a link to Planned Parenthood. When it comes to politics, these companies aim to never show users content that conflicts with their worldview, and Vogel said it's only going to get worse as wearable computing takes off and these companies learn more about where you go and what you do.
"That's part of the insidious problem of filter bubbles," Koster said. "It's that we actively collaborate in building them."
Koster then brought up The Parable of the Polygons, a free web browser game that attempted to explain why innocuous choices add up to harmful trends. The game particularly talks about humans self-segregating into homogenous groups, where everyone around them is the same. Vogel said if people were confined in this GDC room for a week, they would very quickly start forming cliques and groups and those would eventually give rise to friction and violence.
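The dynamic The Parable of the Polygons illustrates is the Schelling segregation model: agents with only a mild preference for similar neighbors still end up in segregated clusters. The following is a minimal one-dimensional sketch of that idea; the grid size, threshold, and names are illustrative assumptions, not details from the talk or the game.

```python
import random

def unhappy(grid, i, threshold=0.34):
    """True if fewer than `threshold` of agent i's occupied neighbors share its type."""
    neighbors = [grid[j] for j in (i - 1, i + 1)
                 if 0 <= j < len(grid) and grid[j] is not None]
    if not neighbors:
        return False
    same = sum(1 for n in neighbors if n == grid[i])
    return same / len(neighbors) < threshold

def step(grid):
    """Move one randomly chosen unhappy agent to a random empty cell."""
    movers = [i for i, a in enumerate(grid) if a is not None and unhappy(grid, i)]
    empties = [i for i, a in enumerate(grid) if a is None]
    if movers and empties:
        src, dst = random.choice(movers), random.choice(empties)
        grid[dst], grid[src] = grid[src], None

random.seed(1)
grid = [random.choice(["triangle", "square", None]) for _ in range(40)]
n_agents = sum(a is not None for a in grid)
for _ in range(500):
    step(grid)
```

Even with agents tolerating a two-thirds-different neighborhood, repeated individual moves tend to sort the grid into same-type runs, which is the "innocuous choices add up to harmful trends" point.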
"When you have groups that strongly identify with that group and then refuse to communicate across boundaries, what happens is almost exactly like inflammation in the body," Koster said.
He pointed to Switzerland, a diverse country, but one where the different groups are divided into homogenous areas. He then showed a heat map of crime in Switzerland, and noted that crime was most prevalent in the areas where these different groups bordered one another. Contact between the groups is what causes inflammation. Koster called the finding disturbing because it sounds like segregation, but Vogel and Walton pointed out that it's a very basic, very human reaction. Koster suggested that with the internet increasingly herding people into homogenous groups, and those groups causing problems when they interact, the problems going on right now are only going to get worse.
However, there are factors that could work against that. Koster said that when people know they're likely to interact with one another in the future, they tend to treat each other better. So community managers who interact with their players on a daily basis can build up trust and create faith in the developers. Vogel said that talking about a game early can actually be harmful, because people's perceptions and expectations of the finished product will be formed, and any changes to that can only hurt that trust and break that faith.
"When I interviewed community managers, I put them through a pretty serious interview just to see if they could hold up."
Richard Vogel
Koster said a lot of problems are being aggravated by the current gaming environment. The free-to-play trend in particular is based on bringing in as many people as possible and builds in a certain amount of churn, something Koster said hurts community building. Walton added that the low barrier to entry and lack of persistent reputation in free-to-play games tends to hurt player behavior as well, since problem players can keep coming back again and again.
Developers rely on peer pressure to create good communities, Koster said, but free-to-play gamers may not have the same investment in the community. Since they didn't pay to get in, they might not care about the game being fun for everyone, or worry about being booted because of bad behavior.
The scale of communities also matters, Walton said. If you want to be a criminal, it's better to be one in a big city, he said. If you're the only criminal in a town of 60 people, it's much more likely that you'll be caught.
Language is another complicating factor, Koster said. Formerly clear boundaries like those between developers, funders, and the audience are blurred by trends like Kickstarter and Early Access, and the industry is still figuring out what separates a journalist from a critic from an academic.
Koster criticized Twitter, Reddit, Facebook, and the Chan sites for policies that encourage the sort of controversy and bad behavior that creates huge problems for developers, whether by fostering filter bubbles, providing light or no moderation, or enabling anonymity. Koster pointed to Yik Yak as a particularly awful example, "basically designed to make teenagers kill themselves," as it only lets users see posts from people in the immediate area, requires anonymity, and lets people down-vote posts.
Koster also brought up mobbing, the practice of having massive numbers of users attack people on social networks, often inducing depression, panic attacks, and even suicide attempts. It's difficult to prosecute under the law, because many of the individuals involved might only send one tweet each, which doesn't meet the threshold for criminal harassment. Vogel said it's how cliques work, putting pressure on people who aren't like them and pushing them out. Koster said it's always bad when this sort of behavior happens within communities, because it essentially creates a civil war. And community managers will need to be prepared to meet that sort of abuse head on.
"When I interviewed community managers, I put them through a pretty serious interview just to see if they could hold up," Vogel said, noting that the abuse they take in the line of duty would be far worse than what he subjected them to.
"It can be very hard to sit in a different culture's shoes and realize that their means of communication are very different than your own."
Raph Koster
With the audience sufficiently depressed, Walton said it was time to discuss possible solutions. Koster said it was important to subdivide the community into manageable sizes and common cultures or interests. Ideally, no community would get above 150 members before being split up. Once those communities are divided, Koster said it's time to try and minimize the reasons they might clash. He suggested a United Nations-style model in which each community sends delegates to interact with the developers and with other communities. Vogel said it was an approach taken in Star Wars Galaxies, and it worked because the community wanted feedback loops. They wanted to know they were being heard and to see progressive change happening.
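A cap like the 150-member rule Koster described (Dunbar's number) could be sketched as follows; the function name and the even-split strategy are hypothetical illustrations, not details from the talk.

```python
# Assumed cap on community size, per the talk's reference to
# roughly 150 members (Dunbar's number).
DUNBAR_LIMIT = 150

def split_community(members, limit=DUNBAR_LIMIT):
    """Partition members into the fewest groups that each stay at or
    under the limit, keeping group sizes roughly equal."""
    if len(members) <= limit:
        return [members]
    n_groups = -(-len(members) // limit)   # ceiling division: fewest groups
    size = -(-len(members) // n_groups)    # balanced size per group
    return [members[i:i + size] for i in range(0, len(members), size)]
```

Splitting into balanced groups, rather than peeling off overflow members, keeps each resulting sub-community comfortably below the cap as it grows.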
Anonymity is also a problem, Koster said. Using anonymous community venues is a risk factor, especially when they aren't designed with a strong up-vote system. Reputation is crucial, Walton said. In the real world, the law often acts as a substitute for reputation, laying down harsh punishments for people who don't follow through on their obligations. Koster said it's helpful to remove any and all down-vote systems from communities. On top of that, it helps to create a meta-identity for users, a common identity for them to rally around. Part of the reason for the low crime rate in Switzerland, he said, is that the overriding cultural identity of everyone being Swiss helps to gloss over differences between the sub-groups.
It's also helpful to put oneself in the shoes of other people, Koster said. In Chan culture, for example, there's a strong value on the complete freedom of speech, so much so that the idea of being offended at what someone else says or moderating one's behavior in response to that is anathema.
"It can be very hard to sit in a different culture's shoes and realize that their means of communication are very different than your own," Koster said.
Vogel said that apologizing as a community manager is a very hard thing to do, and can open you up to legal liability and lawsuits. He prefers a call to action, telling people what you'll do to make things better rather than expressing remorse for previous failings. Walton disagreed, saying an apology is often disarming, and may give upset players pause instead of sending them flying off the handle.
"Language is really important, especially in what community you're in and how you talk to them."
Richard Vogel
"It's all in how you write it and what tone you do in your writing," Vogel said.
Koster also suggested using "we words" instead of "you and I words," because it builds that group identity. He acknowledged that it can feel like pandering, but if your goal is to reduce inflammation, then it's a helpful thing to do. Similarly, avoid words that mean different things to different people, as they're likely to just argue over those words; fights over "What is a game?" are a prime example.
"Language is really important, especially in what community you're in and how you talk to them," Vogel said.
Personal commitments can also be helpful, Koster said. Instead of giving people a long terms-of-service agreement to use the forums, it can be more effective to simply make them click to agree to a short, declarative statement like, "I will behave on the forums."
The last topic the panel touched on was doxing and personal information. They all implored the audience to treat themselves as potential targets of doxing and harassment, and to do what they can to limit the amount of personal information available online. Every person involved with a game should treat all of their social media presences as public.
"You may as well all consider yourselves public figures now," Koster said.
Koster claimed that he has been "off the grid" since 1998, but showed a wealth of personal information he was able to dig up on himself with quick internet searches. He advised unlisting phone numbers and disabling phones' geolocation features (especially for photos). Vogel said it was important to have an escalation plan in case you become the target of a mob, but Koster said that if the worst comes to pass, no amount of preparation will seem sufficient.
"If that happens, the best advice I have is to disconnect completely and take care of yourself first," Koster said. "The battle will be there another day."