Zelensky, Putin videos provide glimpse of evolving deepfake threat, experts say

More people know Volodymyr Zelensky’s face than ever before as the president leads Ukraine in a fight against a Russian invasion that’s now in its fourth week.

In turn, a headline-grabbing deepfake video recently exploited Zelensky’s high visibility by putting words in his mouth that he never said.

The fake Zelensky video purported to show the president telling Ukrainians to lay down their arms — a false claim that the real Zelensky deemed a “childish provocation” amid a life-and-death war.

Experts say this deepfake fail may not have been particularly convincing, but what’s troubling is how more advanced versions of such deception could wreak havoc in the future.

“We could see more that are much more impressive, much more sophisticated and much more difficult to figure out if [they’re] real or not,” said Abby MacDonald, a fellow at the Canadian Global Affairs Institute who specializes in security and defense policy.

A range of deep fakery

Deepfakes have been in existence for years, well before the war in Ukraine started last month, drawing much media attention and concern about their use and abuse, as well as their reach on social media.

Zelensky’s image is appearing widely in the media — and in the case of the photo above, on pillowcases made in the Czech Republic. (Eva Korinkova/Reuters)

MacDonald said they exist on a gradient, from low-tech “cheap fakes” that are produced with more basic software, to sophisticated deepfakes that make use of artificial intelligence and more advanced computing to produce more realistic-looking end products.

“I think deepfakes have in the past few years been coming more to the forefront,” said MacDonald, who recently authored a paper about the policy implications of deepfakes.

The wartime appearance of a Zelensky-focused deepfake didn’t come as a surprise to those watching the conflict in Ukraine closely, even if its specific provenance isn’t completely clear.

“I definitely think it’s something I would have expected to see emerge,” Alyssa Demus, a senior policy analyst at the Rand Corporation think-tank, said in an interview from Santa Monica, Calif.

“I don’t know if it’s created by a state actor, or an affiliated proxy or something or by just [someone] on the internet trying to fool people.”

Benjamin Jensen, a senior fellow at the Center for Strategic and International Studies in Washington, DC, somewhat expected to see deepfakes deployed before this point in the war.

“I’m surprised it took this long and we didn’t see more of them during the mobilization phase,” said Jensen, who believes they are unlikely to sway opinion this far into the Russian invasion.

Not just Zelensky

Russian President Vladimir Putin has also been the subject of a manipulated video that has circulated during the invasion.

The video, shared on social media, claimed to show Putin declaring peace had been achieved with Ukraine.

No such declaration has happened and the war continues to grind on.

A customer at a Moscow souvenir shop is seen holding a nesting doll featuring the image of Russian President Vladimir Putin in December. (Pavel Golovkin/The Associated Press)

Eliot Borenstein, a professor of Russian and Slavic Studies at New York University (NYU), questioned how either of the publicly debunked Zelensky or Putin videos could be productive for any of the actors in the conflict.


“What seems to be the intent is to get people confused about whether the opposing side or their own side is continuing the war,” said Borenstein.

“And I’m just not sure how effective that really could be in terms of, say, combat.”

Marta Dyczok, an associate professor of history and political science at Western University in London, Ont., said the dispelling of the validity of these videos may help Ukraine demonstrate that Russia’s efforts along these lines aren’t working.

“You’re trying this deepfake thing and you can’t do it.”

A more complex world

The presence of deepfakes is one thing. Defending against them is another. Both are concerns that extend well beyond Ukraine’s borders.

“I think the real big question is: Are we going to see more of it, in general, throughout the world? And that’s really horrifying,” said Borenstein.

“The fact is that we’ve been seeing deepfakes already and so far, it’s fairly easy to debunk them, fairly easy to show where they’ve come from. But I imagine in a little while it won’t be.”

Over the long term, MacDonald said, it will be key to improve our ability to identify and disprove deepfakes.

“Like all cybersecurity issues, this is the kind of thing that’s constantly evolving, and it’s really hard to keep up and it’s really hard to co-ordinate. So, I think that is going to be a challenge,” the security expert said.

She said it will also be important to improve people’s digital literacy and ensure they are more critical about the media they consume.
