Twitch sees "glimmer of hope" in battle against toxic behaviour
Co-founder Kevin Lin tells GamesIndustry.biz he believes positivity will spread out from smaller, more controlled channels
Leading livestreaming platform Twitch remains hopeful about the industry's ongoing efforts to crack down on abusive behaviour among online games fans.
Speaking to GamesIndustry.biz, co-founder and COO Kevin Lin says that while toxicity may seem to be particularly common within gaming circles, there are pockets of respectful and positive users that will hopefully serve as the foundation for the wider community going forward.
"It's little by little," he tells us. "If you go to smaller channels [on Twitch], with hundreds of concurrents rather than tens of thousands, you'll see a lot less [toxic behaviour].
"You'll also see broadcasters who have very carefully managed their community carefully and communicated what they will allow in their channel. You'll find they're quite conversational, they're pleasant but it does take a lot of work as a broadcaster. As that perpetuates through more and more broadcasters as they grow, the communities will self-police - we've seen that many times.
"So it's possible. There's a glimmer of hope there. Does it spread to the rest of the internet? Doubtful."
Lin adds that similar occurrences can be seen on other platforms, such as hugely popular forum Reddit. He observes that the voting mechanic that allows users to rate individual posts goes some way to identifying and managing trolls, but again this is most commonly seen in individual subreddits.
"Can we beat it [everywhere]? I don't know," he adds. "It will take a while, but I see glimmers of hope."
The Twitch co-founder's optimism is refreshing following months of reports about the increasing number of toxic and abusive players gathering around best-selling games. In October, Overwatch director Jeff Kaplan spoke out about the trouble his team has keeping up with the abuse seen in the shooter's community, adding that it "often feels like there's no winning" against such behaviour.
Blizzard has dedicated more and more resources to dealing with this, but the result is that progress on Overwatch's ongoing development has slowed.
Lin says the Twitch team is also constantly working hard on finding new ways to help streamers tackle any abuse in their channel.
"In games and in digital media, the unfortunate reality of the internet is anonymity [means] sometimes you see the best in people, sometimes you see the worst, but we're very proactive in allowing creators to build the community that they want," he says.
"Initially we just had tools like choosing moderators in your chat, who would then basically get admin rights to purge lines of chat or drop the banhammer and kick people out of chat entirely. That worked to some extent for the smaller channels, but for the larger channels it's very difficult.
"Then we built AutoMod and released last year, which you can dial up and down. It's a machine learning tool that helps you remove unsavoury stuff from chat. You can dial that all the way up if you want to manage your community more tightly and we're constantly iterating on that tool. We also work really closely with game companies - they face the same problems we do.
"We prefer to keep things positive. Our goal at Twitch is not only to help creators but also to connect people around the world and help them understand difficult cultures, backgrounds and upbringings. So we always keep an eye towards that.
"It's not great that's a common experience we all have on the internet, but we're doing what we can to reduce that."
We also spoke to Lin about the rise of audience interaction in games built with Twitch streaming in mind, a concept he expects to go "full-blown Hunger Games" - read the full interview here.