Videos engage more internet users and leave a stronger impression on them than text does. This is why YouTube faces a grave problem when it comes to fake news, misinformation, and seductive conspiracy theories. Just search for videos denying Al Qaeda’s involvement in 9/11, or about chemtrails or the first moon landing, and you’ll be astonished at how many self-proclaimed conspiracy theorists you come across.
Now, YouTube is taking on these videos, not by removing them but by supplementing them with links to Wikipedia. CEO Susan Wojcicki, in an interview with WIRED, said that displaying these information “cues” will help debunk incorrect assumptions. For now, the feature will only be applied to conspiracy theories that are hot subjects of debate on YouTube.
YouTube’s decision to add text-based links instead of demoting the videos in question is also seen as a wise step, as it allows users to absorb information consciously and form a more solid perception. It also spares the company the effort of censoring videos or barring users from accessing them, even those intended as humor.
YouTube’s fake news problem is far more serious in India, where Jio has fueled booming content consumption. Fake and misleading news is often “Trending” on the service’s homepage, so merely adding information links might not be as useful on a global scale as Wojcicki expects. Zeynep Tufekci, a noted technology commentator and contributor to The New York Times, believes the solution lies in updating the recommendation algorithm, since YouTube’s recommendations are what often lead users to extreme and mobilizing content such as conspiracy theory videos.
The step might also backfire, as it has for Google News and Facebook, where trending topics have often included hoaxes. Another challenge YouTube – with help from Wikipedia – will have to contend with is the speed at which fake news spreads, often like wildfire.
At the same time, YouTube is easing working conditions for its human moderators to prevent them from falling prey to depression or trauma. Wojcicki told The Verge that despite the enormous volume of user-posted content, the company will limit the time moderators spend viewing “disturbing videos” to four hours per day. Contracted workers at YouTube will also receive the same “wellness benefits,” including psychotherapy sessions, as full-time Google employees. Earlier, YouTube had also pledged to hire 10,000 more moderators to handle content violations.