Following the recent YouTube hack, which affected a number of VEVO channels, the platform was accused of serving porn ads on trending videos. Add to this the Logan Paul fiasco, and it’s no wonder that YouTube is making big changes to how content is moderated and how videos are monetised. It has also faced lawsuits from child rights groups in the US. The latest problem again involves young users: misleading playlists that are full of objectionable content.
According to The Sun, YouTube hosts playlists that mix 18+ and explicit videos with videos directed at children, luring minors to pornographic content.
The playlists in question, which also surface when searching for ‘unsimulated movies’, mix objectionable videos with content clearly directed at kids, such as coloring lessons. Children searching for the latter might stumble onto the explicit playlists.
The report further highlights that a spokesperson from the National Society for the Prevention of Cruelty to Children (NSPCC) has slammed the platform over the controversy, claiming that it risks giving children “a distorted view of sex, body image and healthy relationships”.
We were able to independently verify that the playlists still exist, though YouTube appears to have removed some of the explicit videos within them. We were also able to find playlists of entire movies on the platform that contain sexual content, which is against YouTube’s content policy.
It’s worth noting that even though the playlists include explicit videos alongside videos directed at kids, the offending videos are age-restricted and won’t be available to viewers under the age of 18. However, some children are known to bypass this by opening new accounts with fake ages.
When contacted about the issue, a YouTube spokesperson only told The Sun that they’ll “look into it”. The company was also contacted about the explicit porn playlists discovered on the platform earlier this month. At the time, a YouTube spokesperson told the publication:
“We do not allow pornography on YouTube…We do allow videos with artistic or educational value but we apply an age restriction where appropriate.”
While the platform may have removed the offending videos from the playlists, it’s believed that YouTube still hosts a ton of pornographic or explicit content that its own rules forbid, often mixed with innocuous-looking content to trap children.
The platform relies on user reports, along with AI-based moderation, to remove objectionable videos, but more often than not explicit videos slip through unnoticed. YouTube is now adding human moderation as well, though at a considerably smaller scale for the moment.