As an ecosystem, Twitch has a lot to answer for. Users taking advantage of the IRL tag to livestream their sexual harassment and bullying of others, banned accounts often coming back after just 24 hours regardless of the severity of the acts committed, and barely a finger lifted by moderation staff to find the bad apples.
The best way to get an idea of what's going on is to speak to those in the firing line, so today we’re focusing on those most affected by Twitch’s lack of moderation oversight and action. We spoke to both successful streamers with large audiences and those just starting out about the state of moderation on Twitch, where the service is failing to protect users, and where it could improve.
Following last week's articles, we put out a call for Twitch users with fewer than 100 subscribers who had experienced harassment, and who had quit the service due to a lack of support. We were absolutely inundated with responses, to the extent this post could've been 10,000+ words long, and what follows is only a fraction of the shocking stories people had to share. Of particular note was the number of people in minority groups or with physical disabilities who had been targeted.
I first joined Twitch over a year ago. Since then I have had many people giving me abuse simply because I'm physically mute. I have a Twitch command that tells people what it is I have and why they can’t hear me talk, but that doesn't help. I have been called all the names under the sun.
I’ve also had an online stalker who to this day I don't know the identity of, and Twitch has done nothing to help me stop them. I reported them multiple times for coming onto my streams and just making me aware they were watching me, but Twitch didn't do anything.
I recognised them by their username; they had been following me around for a while. They had an issue with me just because I'm trans and I was a moderator for a site they were banned from.
"Uhm excuse me sir, can I please talk to the man in charge of this stream. He listed his gender wrong."
I banned him and several of his friends who had swarmed me with transphobic harassment. Others tried to get me to stop banning them, telling me I was a bad streamer or that if I kept it up no one would be left to watch. I banned them all eventually.
Twitch did nothing, but I don't feel like there was anything they could do.
I'm worried those harassing users might come back, but I also don't want to give in and let them win. I'm currently no longer streaming, as much as I wish I could come back to the platform.
My story starts around three years ago on Twitch. I was fortunate enough to get my hands on an alpha key for Hearthstone. I started my first stream and immediately got around 100 people in my chat. After three or four days I had over 300 people watching me, while I was struggling to learn an unreleased game.
With the struggling came the hate, backseat gaming, and even hate speech. People were claiming I was a Nazi and should kill myself. People were telling me I should get cancer so that another streamer better at the game could play instead. I didn't have any human mods. After 3 days, I quit Twitch.
Across these accounts of harassment on Twitch, the same story kept popping up. Smaller creators with fewer than 100 subscribers would get hit early on with a wave of harassment, and Twitch’s reporting and moderation tools were simply not up to the job of protecting those users in real time. Twitch is a livestreaming service, but its moderation practices are slow.
Kotaku UK has first-hand experience of Twitch’s response times against harassment. On Friday we reported 25 harassment-focused Twitch accounts, both through the on-site reporting tools and a direct email to Twitch staff including the evidence, and five days later all 25 accounts are still active. We're a media outlet and we can't get a response, which emphasises how hopeless this whole process must feel to individuals.
So what can be done? We asked long-term Twitch creators about their experiences on the platform, where they see the problems as coming from, and how they would go about improving things for new creators without moderation teams.
Twitch, like YouTube, lets you block certain words from chat. There are automatic lists, but you can opt out and create your own. I have an extensive list that I update regularly, but all this list does is remove comments featuring those words. That's great, but I'd love a system where, if someone uses X number of your banned words within a certain time, you can suspend or even block them from your channel automatically while you’re mid-stream.
At the moment no automated systems are good at taking down comments that don't use words on these block lists: harassers just string together common words and phrases set to offend, without anything blockable in them. We get around this by adding those strings to block lists after the fact. Maybe Twitch could run a new, regularly updated blocklist that stops more recent harassment phrases, memes and so on?
I will always be in support of muting systems! It makes Twitter bearable these days. Mute comments that trip that banned words list, and leave the commenter thinking they were heard so they don’t just make a new account to get around your moderation.
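The threshold system this streamer describes is straightforward to sketch. The following is a hypothetical illustration of the idea only, not Twitch's actual API or AutoMod behaviour; the word list, threshold and time window are all placeholder assumptions:

```python
import time
from collections import defaultdict, deque

# Placeholder values -- a real channel would tune all three.
BANNED_WORDS = {"badword1", "badword2"}
THRESHOLD = 3      # strikes before an automatic channel ban
WINDOW = 600       # seconds over which strikes are counted

strikes = defaultdict(deque)   # username -> timestamps of tripped words
banned = set()                 # users auto-banned from the channel

def check_message(user, message, now=None):
    """Return 'banned', 'muted', or 'ok' for an incoming chat message."""
    now = time.time() if now is None else now
    if user in banned:
        return "banned"
    if not any(word in message.lower() for word in BANNED_WORDS):
        return "ok"
    q = strikes[user]
    q.append(now)
    while q and now - q[0] > WINDOW:   # drop strikes outside the window
        q.popleft()
    if len(q) >= THRESHOLD:
        banned.add(user)               # auto-ban mid-stream, no manual step
        return "banned"
    return "muted"                     # hide the comment, keep counting
```

Combined with the muting idea below, a "muted" result would simply hide the comment from everyone but its author, so the harasser doesn't realise they've tripped the filter and make a fresh account.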
When I first joined Twitch, it was pretty rabid; mainly because the viewers didn't really know the rules or what was acceptable in my community. However, it only took a few months of running Moobot and having a really solid team of moderators to get it to the state it's at now, which is a warm, friendly atmosphere with a no-nonsense attitude to trolls or abuse. I’m lucky I had those resources available to me.
I was very lucky that I had volunteers, but I do agree there should be more support for smaller channels, even if that's just more active responses regarding reports of abuse. Moobot and other moderation bots are great up to a point but often can't step in if there's an active issue happening in real time. This means the streamer has to pause to deal with the situation, which can be distracting or upsetting and can lead to the individual causing drama ultimately getting what they want.
I appreciate it's a challenging area to try and tackle; perhaps a setup similar to the recommended artists for emotes might work, where users with a clean record can sign up as volunteers to help moderate for smaller streamers? Once again, though, that puts the emphasis back on the community to clean up Twitch, instead of Twitch putting in the effort to clean up itself. It also doesn’t account for differences in what different streamers deem acceptable in their chat.
I think issues such as repeat harassment and/or stalking need to be handled a little more delicately than your average abuse report due to the nature of how laws are changing regarding online abuse. People need to know their reports are all being acknowledged and stored away safely, and that action is indeed happening. Right now people are not informed of the results of these kinds of reports.
I don't think automated mod tools can achieve this in the manner that's needed. They're a great starting base if you set them up correctly for you and your audience, but they don't yet have the finesse to see the bigger picture or to help the streamer emotionally cope with deliberate abuse that doesn’t trip set blocked words.
I've become way more cautious since I first started on Twitch, not as open. Anyone who starts asking a bunch of questions in my chat immediately raises a red flag, no matter if they are a new viewer curious about the streamer or an actual harasser. I've got way more safeguards and modding tools in use than when I started. Over time I’ve kept ramping that up.
Twitch has the AutoMod feature, which can be used to filter out certain words, but you still have to be able to respond to things yourself depending on the level you choose (from 1, which is very lax, to 4, which is super strict).
If Twitch could give streamers a single package to manage moderation, and make it responsive, that would be fantastic: a way to ban, report and get a response in one package deal. One thing they don't do is respond to a report of harassment, whether it was successful or not.
Reporting abuse or harassment leaves an open-ended problem for the caster. We don't know whether action was taken, so reporting feels useless; they just tell you the report was received. It's even less useful than Twitter reporting, which at least tells you if they aren't doing anything with your report. I've had better luck reporting tech issues than harassment on Twitch.
Often, with language barriers or people who don't know how folks use Twitch and interact, you can get someone who comes off in a bad way but really is just curious about the streamer or what they are playing.
For instance, someone once came into one of my streams and started asking a bunch of questions about the state of feminism in gaming. At first it wasn't clear what they were up to; maybe it was someone who actually had questions. But as they continued to write, it was clear it was thinly veiled "well, actually" posting, an attempt to engage dishonestly and start something in the chat.
A bot couldn't make that call. Auto-mod tools are great, but they just keep things from going directly to chat; the streamer and mods still have to deal with it.
I would like to see Twitch expand the phrases that trip AutoMod. I’d like to see them talk to streamers who have been harassed and get information from them. Follow up on reports of harassment and actually ban people, and keep them from creating new accounts through IP bans and similar measures. Don’t make it easy for banned users to return to the platform.
If someone is banned for something like what happened to Charleyy, then leave them banned. If not, it tells me clearly that Twitch is siding with harassers when they ban a harasser and then days later reverse it.
From streamers starting out to full-time creators, the people who actually use Twitch seem to have a firm grasp of the problems, and good ideas for solutions. But there's a sense, among many we spoke to, that Twitch is detached and basically trying not to engage. If it loses ten users to harassment, there are fifty more signing up, so why worry?
It's easy to forget that Twitch is no longer some plucky underdog, but a billion-dollar tech company owned by Amazon. Its business is livestreaming, and the strain of this culture that has led to brigading and targeted harassment needs to be confronted sooner rather than later. It should not only be acting on reports of harassment, which it's currently doing a bad job of, but giving those who report the decency of a response. Twitch needs to improve the harassment-muting tools that creators can use to self-police, educate new streamers in what they can do, and take a lead role in improving the culture it enables and is built around. And yes, it needs that thing that tech companies hate: more humans who can make moderation judgement calls when Auto-Mod doesn't cut it.
More than anything Twitch needs to listen to victims of harassment or, to put it another way, their customers. Kotaku UK has been flooded with people wanting to tell us about awful experiences, and many feel that no-one at Twitch is listening. There is no magic bullet for the problem of online harassment, of course, and it will be an ongoing battle for tech companies. But just because something's hard to solve doesn't mean they should get away with not trying.