Amid an outpouring of calls to end racial inequality, prompted by the death of George Floyd at the hands of a Minneapolis police officer, Reddit CEO and co-founder Steve Huffman sent an open letter to his staff and published it on Twitter.
"We do not tolerate hate, racism, and violence, and while we have work to do to fight these on our platform, our values are clear," wrote Huffman, known as u/spez on the social media site.
Those words, well-meaning as they are, didn't sit well with former CEO Ellen Pao.
"I am obligated to call you out: You should have shut down the_donald instead of amplifying it and its hate, racism, and violence," said Pao. "So much of what is happening now lies at your feet. You don't get to say BLM when reddit nurtures and monetizes white supremacy and hate all day long."
As people and companies around the world reflect on the ways in which they contribute to upholding the racism that still runs rife in society, this moment serves as a reckoning of sorts for Reddit. The site, often referred to as the last bastion of free speech, has a long history of allowing racist content to circulate.
And that history is coming back to haunt Reddit.
Within a few days of Pao's statement, Reddit co-founder Alexis Ohanian resigned from the company's board to make room for a black person to take his place. In a series of tweets, Ohanian said it was "long overdue to do the right thing," and he "urged them to fill my seat with a black candidate." He followed up with an interview on CBS This Morning, saying he hoped more tech businesses would take responsibility for the content of their platforms and diversity of their workforces.
Huffman posted another note on Reddit detailing more changes to the site's policy toward racist content. (When asked to comment for this story, a spokeswoman for the company pointed us back to this note.)
This time around, Huffman admitted there was "an unacceptable gap between our beliefs as people and a company, and what you see in our content policy."
On Wednesday, Reddit tapped Michael Seibel, CEO of seed capital firm Y Combinator, to take Ohanian's spot on the board. Seibel was the first black partner at the firm, where he's advised and funded almost 2,000 startups. He is known for pushing diversity and inclusion in the startup world.
But bringing Seibel on to Reddit's board won't be the end of the conversation about how Reddit handles racism. Looking at the company's history, it's a conversation that has stretched back to its inception.
Free speech and the early years
From the get-go, Reddit's overlords championed a hands-off policy when it came to governing content on the site. The rules were simple and few: "no spamming, no cheating, no personal info, nothing illegal, and no interfering with the site's functions."
Responsibility for making further rules and enforcing them was handed over to volunteer moderators of Reddit's micro-communities, or subreddits. The majority of moderators take a hardline stance, deleting racist content from their subreddits and banning those who post it. But throughout Reddit's history there have been whole subreddits dedicated to posting racist content, and the volunteer moderators of those forums take no such stance.
An October 2013 article in the Atlantic traced the rise and fall of a subreddit that was named after a well-known racial slur, which was closed down that same year. The author of the article described how visiting the subreddit, where around 6,000 users bonded over their disdain of black people, was "a mental chore."
"Emblazoned with icons like watermelons and fried chicken legs, the site maintained a rotating roster of photographs of whites who have presumably been the victims of violence by blacks, as if no white person has ever committed a violent crime," she said.
"Most of the community's content was about what you'd expect: news stories about crimes committed by blacks, pseudoscience about black inferiority, and personal anecdotes about troublesome interactions with black people."
When the subreddit was eventually banned, it was not, as you might have presumed or hoped, because of the racist content posted on the forum, or because its users harassed other Reddit communities, including those for black people. Instead, it was because it committed the cardinal sin of vote manipulation.
In the conclusion of the Atlantic article, the author described how, by banning the community, Reddit's leaders were taking a more hands-on approach to tackling racism, and expressed hope for the platform's future. But two years later, a blog post published by the Southern Poverty Law Center documented the spawning of a long list of new racist subreddits (which have now also been banned).
The subreddit that started this new wave was called "GreatApes" and first sprang up in November 2013, just a month after the Atlantic published its article. It blossomed into a network of racist subreddits known as the "Chimpire," the largest of which, also named after a racial slur, boasted 15,000 members, making it the second-largest racist forum on the internet after the white supremacist forum Stormfront.
Inside the Chimpire, racism ran riot, with the worst content including videos featuring graphic and extreme acts of violence against black people.
But while showing, posting, discussing and even glorifying violence was tolerated under Reddit's rules, any sign of encouraging violence in the real world was not. As such, moderators of subreddits within the Chimpire often urged members to be cautious about making threats in case the communities were banned. But regardless of how careful they were to toe the line, the Chimpire was shut down for good in 2015.
Then-CEO Pao kicked things off by banning many of the worst offenders while closing a long list of problematic subreddits, including FatPeopleHate, which had over 150,000 subscribers. Her actions resulted in targeted abuse, which eventually contributed to her leaving the company. Even after Pao's departure the changes kept coming: under Huffman's tenure as CEO, further policy changes outlawed the remaining Chimpire subreddits.
The demise of the Chimpire came amid the rise of the_Donald, a subreddit founded to support the presidential candidacy of Donald Trump, and the forum mentioned by Pao in her call-out of Reddit.
While the_Donald fulfilled its purpose as a place for supporters of Donald Trump to gather, it also quickly earned a reputation for hosting conspiracy theories and content that was racist, misogynistic, Islamophobic and antisemitic. Slate called it "a hate speech forum." The Washington Post traced the moderators' links to other racist and hate-based forums promoting eugenics and Islamophobia.
In 2017, Nate Silver's statistical analysis news site FiveThirtyEight used a machine learning technique called latent semantic analysis to discover which subreddits were, in essence, most similar to each other, based on commenter overlap. The biggest overlaps it discovered with the_Donald were with a number of right-wing political subreddits.
But when it took politics out of the equation and included banned subreddits in the comparison, it found significant overlap between the_Donald and one of the Chimpire forums, leading the author to conclude "that at least a subset of Trump's supporters are motivated by racism."
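The general shape of that analysis can be sketched in miniature. The snippet below is only an illustration, not FiveThirtyEight's actual pipeline: the subreddit names, comment counts and the choice of two latent dimensions are all invented. It builds a toy user-by-subreddit comment matrix, dampens heavy commenters with a log transform, applies a truncated SVD (the "latent semantic analysis" step) and compares subreddits by cosine similarity of their latent vectors.

```python
import numpy as np

# Toy user-by-subreddit comment-count matrix (rows: users, cols: subreddits).
# All names and counts here are invented for illustration.
subreddits = ["politics", "The_Donald", "conservative", "aww"]
counts = np.array([
    [5, 9, 7, 0],
    [0, 8, 6, 1],
    [7, 1, 2, 0],
    [0, 0, 1, 9],
    [1, 7, 5, 0],
], dtype=float)

# Log transform so prolific commenters don't dominate the comparison.
X = np.log1p(counts)

# Latent semantic analysis: keep only the top-k singular vectors.
k = 2
U, s, Vt = np.linalg.svd(X, full_matrices=False)
# Each subreddit becomes a k-dimensional vector of latent commenter "themes".
sub_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # shape: (n_subreddits, k)

def cosine(a, b):
    """Cosine similarity between two latent vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Similarity of The_Donald to every other subreddit by commenter overlap.
i = subreddits.index("The_Donald")
sims = {name: cosine(sub_vecs[i], sub_vecs[j])
        for j, name in enumerate(subreddits) if j != i}
print(sims)
```

In this toy data, subreddits sharing many of the same commenters (here, the invented "conservative") score far closer to the_Donald than ones with little commenter overlap (the invented "aww"), which is the kind of signal the FiveThirtyEight piece relied on.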
Huffman has taken action against the_Donald — banning threads from appearing on Reddit's homepage in 2016 and then finally quarantining the subreddit in July 2019. The quarantine means that the subreddit won't appear in search results and that only users with a verified email can click to opt in and view it. In February 2020, Reddit banned a group of the subreddit's moderators and the community was placed in restricted mode, effectively making it read only.
Most of the_Donald's members moved off Reddit to an independently hosted platform, but its quarantined and restricted remains still linger in the platform's underbelly.
"I fear we let being technically correct get in the way of doing the right thing," said Huffman last week, speaking specifically about the_Donald. "Clearly, we should have quarantined it sooner."
Mixed messages, clear values?
Despite Huffman stating last week that he believes the company's "values are clear," anyone who hasn't kept an extremely close eye on the company over the last few years could be forgiven for challenging him on the clarity of those values.
While Reddit's policies have evolved since the early days, Huffman stated during an AMA in 2018 that posting racist content wasn't against Reddit's rules. He later clarified his comments, saying: "To be perfectly clear, while racism itself isn't against the rules, it's not welcome here. I try to stay neutral on most political topics, but this isn't one of them.
"I believe the best defense against racism and other repugnant views, both on Reddit and in the world, is instead of trying to control what people can and cannot say through rules, is to repudiate these views in a free conversation, and empower our communities to do so on Reddit," he said.
In his note published last week, Huffman admitted that this statement was confusing for users. "This gap between our content policy and our values has eroded our effectiveness in combating hate and racism on Reddit," he said. "I accept full responsibility for this."
Racist subreddits also make up only a subsection of the controversial and often disturbing communities the platform has played host to over the years. Some of these still linger on Reddit today (even if they are under quarantine).
There are also new fires springing up that Reddit might need to consider fighting.
On Wednesday, the subreddit for Ireland said it would shut down between midnight and 8 a.m. local time: while its moderators were sleeping, the forum was being overwhelmed with racist posts from US-based users faster than they could police them.
"The vast majority of hateful comments submitted over the past while have been solely directed towards someone's skin colour," said the moderators, explaining their decision to limit the subreddit's hours in the post.
It's clear that for Huffman, Seibel and the company's other leaders, the battle to keep racism in check on Reddit is far from over. The company's values might be clear to them, but for many outside this inner circle, Reddit's long history of mixed messages, complacency and tolerance of racist content might not be so easily forgotten.
Black Lives Matter. Visit blacklivesmatters.carrd.co to learn how to donate, sign petitions and protest safely.