White House seeks to cryptographically verify Biden videos to mitigate deepfake risks

In context: With deepfake and generative AI scams on the rise, the White House says that ways to cryptographically verify its official releases are “in the works.” No details have been shared yet about what this process would ultimately look like, but it seems probable that it would involve ‘signing’ official releases in a manner that proves the White House was the true source.
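To illustrate what signing a release could look like, here is a minimal sketch of digital-signature verification. It uses textbook RSA with deliberately tiny primes purely as a teaching toy; a real deployment would use a vetted scheme such as Ed25519 or ECDSA from an audited library, and nothing here reflects any actual White House process. All filenames and key values are hypothetical.

```python
import hashlib

# Toy digital-signature demo: textbook RSA with tiny primes.
# NOT secure -- for illustration of the sign/verify concept only.

# Hypothetical key generation (demo-sized numbers)
p, q = 61, 53
n = p * q                   # public modulus
phi = (p - 1) * (q - 1)
e = 17                      # public exponent (shared with verifiers)
d = pow(e, -1, phi)         # private exponent (kept secret by the signer)

def sign(message: bytes) -> int:
    """Hash the message, then apply the private key to the digest."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Recompute the digest and check it against the signature and public key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

# Hypothetical release: the signer publishes the file plus its signature.
release = b"official-release-2024-02-09.mp4"
sig = sign(release)

print(verify(release, sig))        # True: signature matches this release
print(verify(b"tampered.mp4", sig))  # a forged or altered file fails the check
```

The key idea is asymmetry: only the holder of the private key can produce a valid signature, while anyone with the public key can check it, so a verified signature proves the release came from the keyholder and was not altered in transit.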

The White House has confirmed that it is currently exploring ways to cryptographically verify the statements and videos that it puts out, in an effort to combat the rise of politically motivated deepfakes.

In January, we reported on an AI-generated robocall that faked President Biden's voice and told New Hampshire residents not to vote in the upcoming primary election. This was followed by the news this week that FCC Chairwoman Jessica Rosenworcel has put forth a proposal.
