Taking steps to address criticism around the use of Messenger to spread incendiary and hateful content – specifically in Myanmar last year – Facebook has introduced new tools that allow users to report conversations that violate the company's community standards from within the app.

Previously, users could only report activity using Facebook's reporting tools or Messenger on the web. Now, according to a recent blog post on the matter, they can use the reporting tool directly from any conversation within Messenger on iOS or Android.

In the blog post, Hadi Michel, Product Manager for Messenger, writes:

“Most people use Messenger to connect with family and friends, make plans, and share photos and videos with loved ones. To help minimize bad experiences that can get in the way of these connections, we’re introducing new tools on mobile for people to report conversations that violate our Community Standards.”

To access the new reporting tools, users can now:

  • Tap on the name of the person or group who they want to report.
  • Scroll to the Something’s Wrong option.
  • Select from categories, including harassment, hate speech, or pretending to be someone else.

Users will also be able to ignore or block the person they’re reporting. After completing a report, they’ll receive a notification confirming it was submitted for review. Reports are reviewed by the company’s Community Operations team, which works in over 50 languages, and moving the process into the app should allow issues to be addressed faster.

The introduction of reporting tools to Messenger is just a start, and Facebook still has a long way to go if it wants to live up to its responsibility in nations where the platform was used to spread fake news and hatred. It’s worth noting that in March of this year, a UN Fact-Finding Mission concluded that Facebook played a major role in accelerating ethnic violence in Myanmar.