Following Instagram’s lead in discouraging offensive comments, YouTube has decided to warn users when their comments may be offensive to others, in an attempt to make the platform a better place for healthy interactions.
Warn Users Posting Hate Comments
Users who post hateful comments will see a pop-up banner asking whether they really want to do so. Commenters can then choose to edit the original comment or post it as-is. These warnings will first roll out on YouTube for Android.
Since this is an automated feature, there’s a chance that YouTube’s systems might accidentally flag your comment as offensive. If you think the system has made a mistake, you can tap the ‘Let us know’ link in the pop-up to leave feedback.
Conversely, even if YouTube’s AI doesn’t flag your comment as offensive, the comment might still be taken down if it violates the company’s community guidelines. The channel owner also has controls to remove potentially offensive comments.
In a FAQ on its support page, Google clarifies how it deems a comment as inappropriate. “Our system learns from content that has been repeatedly reported by users. We hope to learn more about what comments may be considered offensive as we continue to develop it,” says Google.
Other New YouTube Features
Apart from comment warnings, YouTube is also testing a new YouTube Studio feature that filters inappropriate and hurtful comments that are pending review. This way, YouTube says, creators will not have to read through offensive comments if they don’t want to.
YouTube is also working to identify any unintentional bias in its systems around monetization. To achieve this, the company will ask creators to voluntarily disclose their gender, sexual orientation, race, and ethnicity starting in 2021.