
Burnout, splinter factions and deleted posts: Unpaid online moderators struggle to manage divided communities

From the pandemic to systemic racism and volatile politics, the real world is seeping into online communities and making them harder to moderate.

August 25, 2020 at 6:00 a.m. EDT

For four to six hours a day, every day, Caitlin Welch busies herself as a volunteer moderator for a number of Facebook groups while her baby sleeps. Much of that time is spent on a collection of evidence-based parenting support groups including Motherhood Without the Woo, of which she is an administrator. She engages with members, reviews flagged posts, deletes problematic articles and boots troublemakers.

The job has been harder in recent months. Welch, a stay-at-home mother in Amarillo, Tex., says she has been dealing with more users, increased tension and heated debates since the pandemic began.

“Everybody’s a little more combative and gets a little more emotional,” Welch said. “In the beginning, it was really bad because everyone was freaking out about [the coronavirus] and nobody knew anything.”

From Facebook, Reddit and Nextdoor to homes for more niche topics like fan fiction, many online communities and groups are kept afloat by volunteer armies of moderators. The people who moderate these groups often start as founders or enthusiastic members, interested in helping shape and police the communities they’re already a part of.

They are both cleaning crew and den parent. Moderators take down spam and misinformation. They mediate petty disagreements and volatile civil wars. They carefully decide between reminding people of the rules, freezing conversations, removing members or letting drama subside on its own.


Over the past five months, many moderators have found their jobs mirroring the outside world: messier, harder and more unpredictable. The pandemic pushed people into isolation, and many turned to online communities for socialization and companionship. The real world has been hard to keep at bay.

The changes also reflect the increasingly partisan nature of social media, which allows users to create social bubbles that exclude views that contradict their own. The splintering of groups and other online forums into factions has continued that trend.

“Conversations are becoming increasingly charged in spaces where people are not necessarily used to seeing politically charged conversations,” says Kat Lo, content moderation lead at the technology nonprofit Meedan, who studies moderation practices.

On Facebook, groups are spaces created by users for discussing specific topics. They can be about anything, from cult TV shows to shared illnesses. Anyone can start a group, and groups can be public, meaning anyone on Facebook can read the posts and comments, or private, in which only invited and approved members are privy to the conversations. Administrators have more control, and moderators can make content decisions. There are now tens of millions of active groups on Facebook, and more than 1.4 billion people use groups every month.

Reddit has similar options for its forums, though most are viewable by the public. On Nextdoor, users must show they live in a neighborhood to join that neighborhood’s group.

After the police killing of George Floyd, overdue conversations about race started happening across these spaces. Moms groups debated privilege and issues like “nanny shaming,” when parents secretly photograph caregivers and share concerns with other members. Computer- and photography-related groups discussed doing away with problematic technical terms like “master” and “slave” that have long been used to refer to components that control others.

Groups discussed bringing on more moderators from marginalized groups, or not asking for so much emotional work from the underrepresented mods they did have. The novel coronavirus, especially as the start of the school year approached, led to political arguments about the effectiveness of masks and vaccines in previously lighthearted parent communities.


The Ani DiFranco Fan Forum group on Facebook has a very specific focus: the music and writings of folk singer-songwriter Ani DiFranco. Its 3,000-plus members discuss their favorite lyrics, share set lists and memories of live shows, and post information about her latest work. The group has a simple, strict Ani-only rule for posts.

As protests over Floyd’s death spread across the nation in June, conversations veered into anger toward President Trump. After Lee Houck, one of the group’s admins, reminded the posters to stay on topic, they accused him and other moderators of censorship, leading to tense arguments in a usually friendly group.

Houck, a writer and quilter in Brooklyn, N.Y., who usually spends a half-hour a day on his moderation duties, started losing sleep and felt nauseated over the rift. Eventually a truce was reached when a number of members formed a splinter group for more political conversations, called “Ani DiFranco’s Righteous Army!”

“I think what’s happening is there’s a lot of people who are sort of waking up to this cultural moment in this really interested and activated way, but because they’re new to the moment they don’t know about stamina and they don’t know where to put the focus of their energy,” Houck said.

The outcome was not uncommon, according to Charles Kiene, a PhD student at the University of Washington studying conflict and change in the governance and institutions of online communities.

“There’s this phenomenon of online communities forking or splitting because some members are not happy how things are going, they sort of just leave and make their own,” Kiene said. He’s seen it firsthand in Seattle, where he lives, which now has two main subreddits. (Subreddit is the name for public communities on the online forum Reddit.) The second group was founded after disagreements over rule enforcement, but the two have since split along more ideological lines.


Facebook says there has been an increase in participation in groups during the pandemic, and in conversations about race amid the Black Lives Matter protests. The groups feature is a decade old but became a central part of the social network’s strategy in 2017, when Facebook started to push users toward the communities. In the United States, which has the highest number of reported covid-19 deaths in the world, 4.5 million people are in a pandemic-related support group on Facebook, according to the company.

Reddit has also had an increase in usage since the pandemic began, with traffic spikes of 20 percent to 50 percent across communities. Founded a year after Facebook, the site has more than 430 million active users and 130,000 subreddits. It has thousands of volunteer moderators.

Across sites, the online groups and communities are typically monitored by a combination of paid human content moderators and automated tools, which remove many of the most problematic people and links, or content such as blatant hate speech, violence and child pornography. The big sites hire teams of content moderators, often through third parties overseas, to deal with much of the most offensive content. Facebook, for example, has 35,000 people working on safety and security, most of whom are moderators.

Even with automation and company intervention, volunteer moderators are left to manage more nuanced, difficult and decidedly human issues.


“You’re going to need both social and tech solutions. Yes, you need better tools. Yes, you need better support for moderators,” says Amy Bruckman, a professor in the School of Interactive Computing at the Georgia Institute of Technology. “If you don’t put in ridiculous hours, the group spins out of control.”

Bruckman has studied online communities and moderation for years, and moderates a large number of groups herself. (“I believe in getting your hands dirty,” she says.) She is a moderator on several Facebook groups and a handful of subreddits, including r/science and a group for Georgia Tech students. It takes up a few hours of her week.

In the motherhood “without the woo” groups, Welch and her fellow moderators have mediated difficult but constructive conversations about race and white privilege, and fielded complaints over a new group ban on digital blackface — when white people use GIFs and memes featuring black people in online conversations. The moderators are constantly rushing to keep up with the evolving science around the coronavirus, and to stop conversations about it from becoming politicized. (Welch is also an admin of the Facebook group in which everyone pretends to be ants, which, despite having nearly 2 million members yelling in all caps, has not seen similar issues.)


Earlier this month, there was a post in one of their groups from an anti-mask member, asking why Home Depot could be open but high school football wasn’t allowed. After determining the poster wasn’t interested in a scientific debate, Welch turned the conversation into an “anarchy” thread, meaning there would be no attempts at moderation. Other members piled on.

“We know it’s going to be a dumpster fire, so we just let it burn,” Welch said, “which probably wasn’t the nicest thing to do, but she wasn’t getting it.”

Tech platforms have been repeatedly criticized for not doing enough to address misinformation and racism, which has continued to fester in private groups that embrace it. Hundreds of advertisers are currently boycotting Facebook over its hate-speech policies.

Under pressure from public outrage over Floyd’s death and concerns ahead of Election Day in November, the companies have made some changes over the summer. Reddit in June announced it was shutting down a popular pro-Trump group known for racist content, along with 2,000 other subreddits, while also adding a new policy banning hate speech. In August, Facebook took down a nearly 200,000-member QAnon conspiracy theory group, and last week it removed an additional 790 QAnon groups, though many more remain active. Nextdoor in July asked users to sign a “good neighbor pledge” and promise not to discriminate.

In August, Reddit started beta testing a new program to train and certify moderators. It has councils of moderators it consults with, and the company also offers resources to guide the volunteers. For example, if a community is in crisis, it can request help from a team of experienced moderators.

Facebook has tools with accurate coronavirus information sprinkled across all its products, to counter some of the misinformation that regularly flows around the site. It has added a number of resources for moderators, including a guide on conflict resolution and handling conversations about race. It recommends moderators educate themselves and their teams about sensitive issues, acknowledge what’s happening and spell out a plan for members to address it, including diversifying moderation teams. It also suggests revisiting group rules to make it clear what topics are allowed or not.

Strict, clear rules can help keep a group focused on its topic, says Casey Fiesler, an assistant professor at the University of Colorado at Boulder who studies online communities. But a simple “no politics” rule is complicated by, well, everything about 2020.


Moderating the popular r/coronavirus subreddit, a place for scientific discussion of the virus, has been a constant battle against conspiracy theorists and political bickering. So moderators tried adding a rule forbidding politics.

“We have a rule against discussing politics — just being stridently partisan, for example. This has always been tough for us because this is a political thing,” said Rick Barber, a computer science PhD student at the University of Illinois and moderator of r/coronavirus. “Politics is a pretty relevant dimension here.”

In the beginning of the pandemic, Barber was spending 10 hours a day moderating the forum, keeping conversations on track by reviewing flagged posts or banning bad-faith posters. The group has grown significantly and has about 60 moderators, up to 20 of whom are active at any given time. It’s a science-heavy team, with experts including virologists and epidemiologists, as well as some experienced “power” moderators who have helped guide decisions.

Still, Barber has had doubts about trying to sidestep some of the political aspects of the pandemic, including removing posts critical of political leaders.

“Every day I would feel like, is this the right thing to do?” Barber said. “I’m not positive any of our rules are the right rules or they should be there. It’s kind of an open question we revisit from time to time.”

For Welch, the online communities are worth the effort and her time. She doesn’t feel like she’s burning out yet, but the experience is changing her.

“I’ve definitely been a little more outspoken than I normally am in regards to wearing a mask as well as the Black Lives Matter movement,” said Welch. “I used to be very nonconfrontational and didn’t rock the boat. My patience is pretty much gone now.”