A Russia-affiliated disinformation network known as Doppelganger spread an anti-Biden video made with the assistance of generative AI on X, according to a report by WIRED.
The video, called “Bye Bye Biden,” parodied US President Joe Biden and Republican presidential candidate Donald Trump. An actor played the two 2024 presidential candidates, with AI used to make his features resemble theirs.
Biden was depicted as a senile old man who rides a wheelchair, wears a diaper, and can’t steer a bicycle. The video also portrayed him as a corrupt politician and parroted false allegations from the 2020 election that votes were stolen from Trump.
The video reached 6.5 million users and was viewed by 5 million, according to an analysis by Antibot4Navalny, a group of Russian researchers. Tracking the campaign, the group found 4,000 posts promoting the video, beginning on May 21, spread through a network of 25,000 accounts on X.
The group also identified a new technique for circumventing social media monitoring: the disinformation network trimmed and edited the video so that the resulting copies “are technically different in milliseconds” and are “likely considered as distinct unique videos by abuse-protection systems.”
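The evasion works because even a millisecond-level edit changes a video file’s bytes. As a rough illustration only, and assuming a naive duplicate detector that fingerprints files by exact hash (not a description of X’s actual abuse-protection systems), the sketch below shows how a trimmed copy stops matching the original; the file names are hypothetical placeholders.

```python
# Minimal sketch: why exact-hash duplicate detection misses a clip that
# has been re-cut by a few milliseconds. Assumes a naive detector that
# compares SHA-256 digests of raw file bytes; file names are placeholders.
import hashlib

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical files: the original upload and a copy with a few
# milliseconds trimmed from the start before re-encoding.
original = file_hash("clip_original.mp4")
trimmed = file_hash("clip_trimmed.mp4")

# The tiny edit alters the byte stream, so the exact-match fingerprint
# no longer flags the copy as a duplicate of the original.
print(original == trimmed)  # expected: False
```

Catching such near-duplicates generally requires perceptual or content-based fingerprinting rather than exact byte hashes, which is why the researchers flagged the trimming tactic as an evasion technique.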
Another organization, True Media, a US nonprofit working to detect political deepfakes, determined with 100% confidence that the video used AI-generated audio and with 78% confidence that AI face manipulation was used, WIRED said.
“As the Kremlin ramps up its efforts to undermine the US election in November, it is increasingly clear that Russia is willing to utilize emerging AI technologies,” WIRED wrote. The publication also cited its earlier report on the “CopyCop” campaign, which used AI tools to collect information and data from real news sites, which were then re-angled to promote a right-wing bias and published through a network of fake sites.
This string of reports further illustrates how generative AI tools make it easier to create persuasive, far-reaching deepfake propaganda and dangerous fake news.
In April, the Philippine government had to issue a warning over a deepfake audio clip that appeared to order a military attack. Earlier in the year, the campaign of eventual Indonesian President Prabowo Subianto used AI tools to create a cute, animated version of the candidate, which experts said was critical in softening the military strongman’s image.
The US will hold its presidential elections in November 2024. – Rav Ayag/Rappler.com
Rav Ayag is a Tech and Features intern at Rappler. He is an incoming senior at the Ateneo de Manila University in the Bachelor of Fine Arts Creative Writing program.
This story was vetted by a reporter and an editor.