After Twitter, Facebook, Google and other internet bigwigs, it’s the turn of the ‘Front Page of the Internet’ to do something about the fake news and misinformation spread by Russian propagandists in the lead-up to, and following, the 2016 Presidential election in the US. A massive nationwide investigation is currently deciding the fate of the players involved, but online platforms are doing some in-house cleaning as well.
On Monday, Reddit admitted that Russian trolls used its platform to spread propaganda as well as misinformation. The platform also announced that it had “found and removed a few hundred accounts” that it believed to be related to the Russian campaign.
In order to make sure that it doesn’t receive revenue from such propaganda sources, the company said that it is making changes to its ad policies too. CEO Steve Huffman (u/spez) said, “ads from Russia are blocked entirely, and all ads on Reddit are reviewed by humans”. He also asserted that the company’s ad policies “prohibit content that depicts intolerant or overly contentious political or cultural views”.
The company, however, is seemingly standing by its policy of tolerating controversial content on its platform, with Huffman noting that banning all propaganda, falsehood and fake news is easier said than done. Instead, he puts the onus on society at large to do the investigating and separate fact from fiction. According to him, “Between truth and fiction are a thousand shades of grey. It’s up to all of us—Redditors, citizens, journalists—to work through these issues”.
It is certainly heartening to see Reddit taking steps to weed out at least some of the nastiness on its platform, but it will take a lot more effort on its part to get rid of the hatred and bigotry that have become even more aggressive over the past few years.