CEJI Deep Dives: Reddit

21/02/2022

Amy Leete, Communications Officer

Reddit is a social news site where a wide variety of content is shared for users to comment on and discuss. At its core, Reddit is a massive collection of forums where individuals can share new content or news, or comment on others’ posts about a variety of topics. These areas are called ‘subreddits’ – operating as miniature websites in themselves. Some of the most popular subreddits include r/funny (jokes, comedy, funny videos), r/gaming (updates on new gaming systems and video games) and r/explainlikeimfive (where users ask for complex topics to be explained to them in simple terms). Over 35 million users are part of r/funny – and with Reddit being the 19th most popular site worldwide, the idea of collective spaces to discuss niche interests clearly appeals to a mass audience. In terms of demographics, Reddit’s user base is concentrated in the 18-30 age group.

These subreddits are self-governed, usually by a team of admins or moderators (people who review content to make sure it follows the rules of the group) selected by the creators of the subreddit. While Reddit has its own moderators and a Content Policy under which it can act when deemed necessary, the moderators and admins of subreddits can ban, block, and delete as they please.

Aside from posting content yourself, the most popular way to interact on Reddit is by ‘upvoting’ or ‘downvoting’ something. When you upvote a post or comment, you are saying that you like it, think it is a good response, or find it informative. To downvote is to do the opposite. Everyone can upvote or downvote, and all votes are worth the same. These votes follow your account in a system called ‘Karma’: post popular content that gets upvotes and you gain Karma; post something unpopular and you lose it. Some people are so good at Reddit that their Karma runs into the millions, making them minor celebrities on the site.
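To make the mechanics concrete, here is a minimal sketch in Python of the voting system described above. It is a toy illustration only: Reddit’s actual karma calculation is not public, so the equal-weight tally below is an assumption drawn from the description in this article.

# Toy model of Reddit-style voting and karma.
# Assumption: every vote counts equally (+1 or -1), as described above;
# Reddit's real formula is not public.

class Post:
    def __init__(self, author: str):
        self.author = author
        self.score = 0  # running tally: upvotes minus downvotes

    def upvote(self) -> None:
        self.score += 1

    def downvote(self) -> None:
        self.score -= 1

def karma(posts: list[Post], user: str) -> int:
    """A user's karma is the combined score of everything they posted."""
    return sum(p.score for p in posts if p.author == user)

# A well-received post raises its author's karma; an unpopular one lowers it.
posts = [Post("alice"), Post("alice")]
posts[0].upvote()
posts[0].upvote()      # popular post: +2
posts[1].downvote()    # unpopular post: -1
print(karma(posts, "alice"))  # prints 1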

The order of posts, and which posts you see at all, is determined by an algorithm: a formula that tries to predict what content you will engage with and enjoy, balancing what is popular with what you as a user would like to see. If you join r/Judaism, for example, you will be prompted to join r/Hebrew, r/Yiddish, or r/HaShoah, because many users are part of all these subreddits at once. Similarly, if you upvote a post about cats, it will show you more cat content; if you downvote a post about spiders, it will show you less spider content.
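The behaviour described here resembles a simple form of collaborative filtering. The sketch below is a minimal illustration of that idea – not Reddit’s actual, proprietary algorithm – using only the two signals mentioned above: shared subreddit membership and votes.

# Minimal collaborative-filtering sketch of the behaviour described above.
# Assumption: Reddit's real ranking system is proprietary; this illustrates
# only the two signals the article mentions (co-membership and votes).

def suggest_subreddits(my_subs: set[str], all_users: list[set[str]]) -> list[str]:
    """Rank subreddits by how many users share them with the ones I follow."""
    counts: dict[str, int] = {}
    for subs in all_users:
        if subs & my_subs:              # this user overlaps with me
            for s in subs - my_subs:    # count their other subreddits
                counts[s] = counts.get(s, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)

def update_preferences(prefs: dict[str, float], topic: str, vote: int) -> None:
    """An upvote (+1) means 'show me more of this'; a downvote (-1), less."""
    prefs[topic] = prefs.get(topic, 0.0) + vote

users = [{"r/Judaism", "r/Hebrew"}, {"r/Judaism", "r/Yiddish"}, {"r/gaming"}]
print(suggest_subreddits({"r/Judaism"}, users))  # e.g. ['r/Hebrew', 'r/Yiddish']

prefs: dict[str, float] = {}
update_preferences(prefs, "cats", +1)     # more cat content
update_preferences(prefs, "spiders", -1)  # less spider content
print(prefs)                              # {'cats': 1.0, 'spiders': -1.0}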

However, this combination of the voting system and the algorithm can sour quickly. The voting system is subjective, and the response to your post depends entirely on where you post it. An antisemitic post would be downvoted – and swiftly removed – on r/Judaism, whereas the same post on the (now-banned) Holocaust-denying r/frenworld would be upvoted thousands of times. Effectively, the political leaning of the community determines your experience on the site. And because the algorithm treats the content you engage with as content you want more of, people who upvote hateful content are simply shown more of it.

While the algorithm is meant to match people who share niche interests in mundane activities, it also offers an easy way for people with radicalised political views to find each other, leading to the creation of incredibly popular radical subreddits. For example, r/The_Donald, a right-wing political discussion community whose 790,000 users called themselves ‘Patriots’, ran from 2015 to 2020 and had a lengthy documented history of hosting racist, misogynistic and antisemitic conspiracy theory content. The 2021 storming of the US Capitol was heavily linked to discussions on that subreddit. It took five years for Reddit to shut the subreddit down – plenty of time for its moderators and admins to create their own website, patriots.win, labelled “a magnet for extreme discourse” by the Financial Times. In 2017, Reddit closed down several extremist forums after updating its Content Policy – but is it too little, too late?

A post on r/Portugueses, a Reddit community that often hosts anti-Black, anti-Roma and anti-immigrant sentiment, reads: “How is it possible for someone to want to see a place like this full of Africans, Brazilians, Indians, and I don’t know what else?”

Reddit accounts can also be almost entirely anonymous – no email is required to sign up. This has led to the coining of the term ‘throwaway accounts’: single-use accounts created specifically to post controversial content. Moderators have reported that Reddit’s content moderation team only moderates posts written in English, leaving anti-Roma sentiment in r/Portugal and calls for the genocide of Muslims in r/India entirely ignored. As critics have repeatedly pointed out, Reddit wants to benefit from the popularity of free, unlimited speech while detaching itself from the consequences of such policies.

To tackle the current situation, Reddit must move more quickly. Government measures such as Germany’s NetzDG law fine social media companies up to 50 million euros for failing to remove hate content within 24 hours. Yet how enforceable – and effective – are these measures? Governments can only enforce ethical behaviour to a certain extent: the responsibility for tackling hate content lies with the platforms that host it. Reddit must work with experts in the field, such as CEJI – utilising our knowledge of hate speech and hate crime through the Facing Facts Network and INACH.

All moderators, both Reddit’s own and those appointed within subreddits, should be trained to identify content that violates Reddit’s policies – with moderation privileges withheld until the training is successfully completed. Training in how specific forms of contemporary hatred manifest and are coded on social media is essential: radicalised individuals constantly develop their language in order to avoid detection. Subreddits with high levels of reports – or those flagged for extremist content – should be assigned a dedicated team of Reddit moderators to ensure Reddit’s policies are upheld. Civil society organisations, including CEJI, are here to offer guidance and support.

In order to tackle increasing radicalisation in subreddits – particularly those that are political in nature – Reddit should adjust the algorithm to surface anti-hate resources and content within these communities. Much like Facebook’s fact-checking system, content that focuses on the Holocaust should be presented alongside reputable sources such as the UNESCO/WJC Facts about the Holocaust website, the US Holocaust Memorial Museum, and/or Yad Vashem. This could take the form of a message directly above or below the content itself, of Reddit’s ‘promoted’ posts (advertisements), or of a pop-up shown when users access material or communities flagged for Content Policy violations.
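As a rough sketch of how such an intervention might work, the hypothetical Python snippet below attaches the resources named above to posts flagged as Holocaust-related. The keyword trigger is a placeholder of our own; a real system would need human review and far more robust detection.

# Hypothetical sketch of the proposed intervention: pair flagged content
# with reputable educational resources. The trigger logic is a placeholder;
# real detection would require human review, not a keyword list.

RESOURCES = [
    "UNESCO/WJC - Facts about the Holocaust: https://aboutholocaust.org",
    "US Holocaust Memorial Museum: https://www.ushmm.org",
    "Yad Vashem: https://www.yadvashem.org",
]

HOLOCAUST_KEYWORDS = {"holocaust", "shoah"}  # placeholder trigger list

def banner_for(post_text: str) -> list[str]:
    """Return the resource links to display under a post, if any."""
    words = {w.strip(".,!?").lower() for w in post_text.split()}
    if words & HOLOCAUST_KEYWORDS:
        return RESOURCES
    return []

for link in banner_for("A post discussing the Holocaust"):
    print("Learn more:", link)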

It would be wrong to say that Reddit is entirely bad. Users have praised r/Judaism for showing a broad, welcoming spectrum of Jewish life. The hit Netflix show Don’t F*** with Cats revealed how murderer Luka Magnotta’s home location was partially identified through a subreddit of vacuum cleaner aficionados, who spotted that the vacuum cleaner in the background of one of his videos was only sold in Canada. In the words of Twitter user @Lefty_Jew, niche subreddits ‘are some of the most helpful and nicest places online’.

The answer to Reddit’s issues is not simple. The first rule of Reddit’s Content Policy is ‘remember the human’. And yet Reddit’s biggest problem is that, in its policies and actions, it forgets that the hatred it is trying to remove comes from humans. Blanket bans of communities and accounts are futile when the appeal of the site lies in the ease of creating accounts and the anonymity offered to users. Instead, Reddit needs to remember the human – and focus on education, empowerment, and deradicalisation.