Microsoft’s New Deepfake Tool Can Detect AI-Manipulated Media


With the growing use of artificial intelligence (AI) on pictures and videos, “deepfake” content has become very common on the internet. Although much of this content is made for fun and entertainment, it can still become a source of misinformation for less-informed viewers. So, to battle the spread of misinformation and fake news via “deepfake” images and videos, Microsoft has built a tool that can detect AI-manipulated media.

The “Video Authenticator” tool, developed by the R&D division of the Redmond-based software giant, can analyze images and videos and determine whether they are artificial or real. The tool studies the media and assigns it “a percentage chance, or confidence score.”

“Today, we’re announcing Microsoft Video Authenticator. Video Authenticator can analyze a still photo or video to provide a percentage chance, or confidence score, that the media is artificially manipulated,” wrote Microsoft’s executives in an official blog post.

According to Microsoft, when analyzing videos, the tool provides this “percentage chance” for every frame of the video in real time.

Although detecting AI-manipulated media can be tough, the company says the tool “works by detecting the blending boundary of the deepfake and subtle fading or greyscale elements that might not be detectable by the human eye.”
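Microsoft has not published Video Authenticator’s API or model, so the sketch below is purely illustrative: it shows only the *shape* of per-frame confidence scoring described above. The `score_frame` heuristic (counting abrupt brightness jumps as a crude stand-in for “blending boundary” artifacts) and all names here are hypothetical, not Microsoft’s implementation, which uses a trained model.

```python
def score_frame(frame):
    """Return a stand-in 'percentage chance the frame is manipulated' (0-100).

    Hypothetical heuristic: counts large brightness jumps between adjacent
    pixels as a crude proxy for the blending-boundary artifacts the article
    mentions. A real detector would use a trained model instead.
    """
    jumps = 0
    comparisons = 0
    for row in frame:
        for a, b in zip(row, row[1:]):
            comparisons += 1
            if abs(a - b) > 64:  # arbitrary threshold for an "abrupt" edge
                jumps += 1
    return 100.0 * jumps / comparisons if comparisons else 0.0


def score_video(frames):
    """Score every frame, mirroring the per-frame output Microsoft describes."""
    return [score_frame(f) for f in frames]


# Toy "video": two 2x4 grayscale frames (pixel values 0-255).
smooth = [[10, 12, 11, 13], [10, 11, 12, 12]]    # gradual shading
abrupt = [[10, 200, 10, 200], [10, 200, 10, 200]]  # harsh pixel boundaries
scores = score_video([smooth, abrupt])
print(scores)  # → [0.0, 100.0]
```

The point of the per-frame output is that a reviewer can see *where* in a video the score spikes, rather than getting a single verdict for the whole clip.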

Now, coming to some technical details: the “Video Authenticator” tool was developed using a public dataset from FaceForensics++ and was then tested on the DeepFake Detection Challenge Dataset. As per Microsoft, these two are the “leading models for training and testing deepfake detection technologies.”

The Redmond-based giant is collaborating with the San Francisco-based AI Foundation to distribute the tool. With the introduction of this tool, the company aims to prevent the spread of misinformation, especially ahead of the US elections.

So, the tool “will initially be available only through RD2020 [Reality Defender 2020], which will guide organizations through the limitations and ethical considerations inherent in any deepfake detection technology.” Hence, it will be distributed to election campaign organizations and to news and media outlets covering political news.

Via: TechCrunch