Microsoft Releases Deepfake Detection Tool Ahead of Election
Published on September 01, 2020 at 11:30PM
An anonymous reader quotes a report from Bloomberg: Microsoft is releasing new technology to fight “deepfakes” that can be used to spread false information ahead of the U.S. election. “Microsoft Video Authenticator” analyzes videos and photos and provides a score indicating the chance that they’re manipulated, the company said. Deepfakes use artificial intelligence to alter videos or audio to make someone appear to do or say something they didn’t. Microsoft’s tool aims to identify videos that have been altered using AI, according to a Tuesday blog post by the company.
The digital tool works by detecting features that are unique to deepfakes but that are not necessarily evident to people looking at them. These features — “which might not be detectable to the human eye” — include subtle fading and the way boundaries between the fake and real materials blend together in altered footage. The tool will initially be available to political and media organizations “involved in the democratic process,” according to the company. A second new Microsoft tool, also announced Tuesday, will allow video creators to certify that their content is authentic and then communicate to online viewers that deepfake technology hasn’t been used, based on a Microsoft certification that has “a high degree of accuracy,” the post said. Viewers can access this feature through a browser extension.
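The certification tool described above follows the general hash-and-sign provenance pattern: the creator attaches a cryptographic tag derived from the content, and a viewer's extension recomputes it to detect tampering. The sketch below illustrates that pattern only; it is not Microsoft's implementation, and the HMAC with a shared key stands in for the certificate-based signature a real system would use.

```python
import hashlib
import hmac

# Hypothetical signing key. A real provenance system signs with a
# certificate's private key; HMAC keeps this sketch stdlib-only.
SIGNING_KEY = b"demo-publisher-key"

def certify(content: bytes) -> str:
    """Produce a tamper-evident tag over a hash of the content."""
    digest = hashlib.sha256(content).digest()
    return hmac.new(SIGNING_KEY, digest, hashlib.sha256).hexdigest()

def verify(content: bytes, tag: str) -> bool:
    """Recompute the tag; a mismatch means the content was altered
    after it was certified."""
    return hmac.compare_digest(certify(content), tag)

original = b"frame data of an authentic video"
tag = certify(original)

print(verify(original, tag))         # True: intact content verifies
print(verify(original + b"x", tag))  # False: altered content fails
```

In practice the tag would travel with the video as signed metadata, and the browser extension would fetch the publisher's public certificate to check it, so no shared secret is ever distributed to viewers.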
Read more of this story at Slashdot.