Point of view

Acquire fact-checking and info-literacy skills to recognize Russian-made deepfakes (video)

IA “FACT” has already reported how migrant poets in St. Petersburg spent a year publishing Nazi poems, passing them off as Z-poetry, to show war supporters the true face of Russian aggression.

It was clear from the first months of the war that Russian propaganda would push narratives designed to discredit Ukrainian military commanders and soldiers. Professor Sandra Wachter, an expert on the law and ethics of AI, big data and robotics at the University of Oxford, notes: “The spread of false or misleading information about political leaders is as old as humanity itself.”

To complete the picture, let us consider how the Russians use deepfakes in which Ukraine’s leadership and high command are portrayed as incompetent and corrupt, while soldiers of the Armed Forces are shown behaving immorally and criminally: torturing, killing, looting.

These videos are distributed on social networks and other media to undermine confidence in the Ukrainian military and authorities and to stir disgust and anger in the public. Here are some examples that may soon become textbook cases.

The fish rots from the head

This is exactly what the authors of the deepfake bet on, and at first it could shock many: what was said contrasted too sharply with the familiar image of the people’s favorite Commander-in-Chief, Valerii Zaluzhnyi.

The pretext for the video was the death of Zaluzhnyi’s aide-de-camp the day before, which the video’s authors attributed to the Ukrainian President. In the video, the pseudo-commander-in-chief called on Ukrainians to rise up against Zelensky’s tyranny, claiming that the President had allegedly eliminated Zaluzhnyi’s adjutant and was now preparing an assassination attempt on Zaluzhnyi himself.

In the voice of Yanukovych (has the ex-president decided to pursue a career as a dubbing actor?), the pseudo-commander-in-chief made a seditious statement: “He is going to eliminate me as well, and then surrender the country. Zelensky is the enemy of our state. Because of him, the counteroffensive completely failed, losing half a million of the best sons of the nation. If he is not stopped, we will lose the whole country. Zelensky lies constantly and will keep lying to the last Ukrainian…” The rhetoric of this performance speaks for itself.

The Center for Combating Disinformation quickly identified the country of origin of this “masterpiece.” Creating such an “artifact” does not require mastery of AI technology or video editing. You need to find a service that handles Ukrainian or Russian well, then select a video of the speaker you want to discredit; the program will clone his voice. It is important that the fake script match the duration of the original speech, otherwise the delivery will sound unnaturally slow or fast.
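The duration constraint just described can be sketched as a crude plausibility check. The following Python snippet is a hypothetical illustration, not part of any real tool; the characters-per-second bounds are assumptions about natural speaking tempo.

```python
# Sketch of the duration-matching constraint described above:
# a cloned-voice script must roughly match the original clip's
# duration, or the synthesized speech sounds unnaturally fast or slow.
# The rate bounds are illustrative assumptions, not measured values.

def speech_rate(text: str, duration_s: float) -> float:
    """Characters per second: a crude proxy for speaking tempo."""
    return len(text) / duration_s

def tempo_plausible(fake_text: str, original_duration_s: float,
                    min_rate: float = 10.0, max_rate: float = 20.0) -> bool:
    """Return True if the fake script could be spoken at a natural
    tempo within the original clip's duration."""
    rate = speech_rate(fake_text, original_duration_s)
    return min_rate <= rate <= max_rate

# A 30-second clip can plausibly carry roughly 300-600 characters.
print(tempo_plausible("x" * 450, 30.0))   # a natural fit
print(tempo_plausible("x" * 1500, 30.0))  # too much text: speech would race
```

A script that fails this check forces the synthesis engine to compress or stretch the speech, which is one of the audible giveaways the article mentions.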


In fact, distinguishing a deepfake from an original video is easy. In the Zaluzhnyi video, the lip movement is visibly out of sync with the voice, and the facial expressions are unnatural: for example, during a pause between sentences, the speaker’s lips continue to move. When this repeats three times in one video, there is no doubt that it is fake.
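The lip-sync cue can be made concrete: if mouth movement and audio loudness barely track each other over time (lips moving during silent pauses, for instance), the clip deserves suspicion. This is a minimal sketch, assuming the mouth-openness and audio-energy series have already been extracted per frame; the 0.5 threshold and the sample values are illustrative assumptions.

```python
# Minimal sketch of the lip-sync mismatch cue described above.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def lip_sync_suspicious(mouth_openness, audio_energy, threshold=0.5):
    """Flag a clip if per-frame mouth motion barely tracks the audio."""
    return pearson(mouth_openness, audio_energy) < threshold

# Genuine speech: the mouth opens when the audio is loud.
genuine = lip_sync_suspicious([0.1, 0.8, 0.9, 0.2, 0.7],
                              [0.2, 0.9, 0.8, 0.1, 0.6])
# Dubbed mask: lips keep moving through a silent pause between sentences.
dubbed = lip_sync_suspicious([0.7, 0.8, 0.9, 0.8, 0.7],
                             [0.9, 0.1, 0.0, 0.1, 0.8])
print(genuine, dubbed)
```

Real detectors model this far more carefully, but even this toy correlation captures why lips moving through a pause three times in one clip is so damning.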


The death of Zaluzhnyi’s assistant was used as a news hook, in the hope that an audience lacking fact-checking and information-literacy skills would swallow it and then spread this unimaginative bait among their own contacts. Russian propaganda needs quantity, not quality.

A fake of fakes

A deepfake in which the Ukrainian President calls on people to surrender was circulated on the Internet. In the video, Volodymyr Zelensky, standing behind a podium, allegedly says: “It turned out that being the president is not so easy,” that he has “decided to return Donbas” to Russia, and that the Ukrainian army’s efforts in the war have “failed.” He concludes: “My advice to you is to lay down your arms and return to your families. It is not worth dying in this war. My advice to you is to live. I am going to do the same.”

But this video is a fake among fakes, and obviously so. The pseudo-president’s head is unnaturally large, attached at the wrong angle, and strangely lit, with heavier pixelization around it than around the body. The timbre and tempo of the voice are unnatural.
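The pixelization mismatch is also measurable: a pasted face often has different local texture statistics than the body it sits on. The sketch below compares pixel variance in two regions of a tiny synthetic grayscale frame; the frame data and the 4x ratio threshold are illustrative assumptions, not values from any real detector.

```python
# Sketch of the "increased pixelization around the head" cue described above.
from statistics import pvariance

def region_variance(frame, rows, cols):
    """Variance of pixel values in the given rows/cols of a 2D list."""
    pixels = [frame[r][c] for r in rows for c in cols]
    return pvariance(pixels)

def regions_mismatch(frame, head_rows, head_cols, body_rows, body_cols,
                     ratio=4.0):
    """Flag the frame if the head region's texture variance differs
    from the body's by more than `ratio` in either direction."""
    head_v = region_variance(frame, head_rows, head_cols)
    body_v = region_variance(frame, body_rows, body_cols)
    hi, lo = max(head_v, body_v), min(head_v, body_v)
    return lo == 0 or hi / lo > ratio

# Toy 4x4 frame: noisy, blocky "head" rows on top, smooth "body" below.
frame = [
    [10, 200, 15, 190],
    [220, 5, 210, 20],
    [100, 102, 101, 99],
    [101, 100, 99, 100],
]
print(regions_mismatch(frame, range(0, 2), range(4), range(2, 4), range(4)))
```

In a genuine recording, both regions pass through the same camera and compression pipeline, so their noise statistics tend to match; a crude paste-over breaks that consistency.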

Was there ever a boy?

For deepfake masks, the foot soldiers of Russian information-psychological operations (IPSO) use the faces of random people from stock-photo sites. For example, the face of the “friendly” boy in question can be found on various VKontakte bot pages.

A video in which a Ukrainian serviceman allegedly started a brawl with a grenade on public transport also caused a stir. Under the video were comments such as: “the monkey beat the driver, smashed up the vehicle’s interior, and decided to blow everyone up with a grenade he had brought back from his position…”

First, the video was clumsily processed; second, no such brawler was ever arrested, which is further evidence that this information product is fake. Experts recognized the face of director Kyryl Khachaturov under the deepfake mask.

Deepfakes to discredit Ukrainian military personnel

In 2022, Russia published deepfake videos in which a Ukrainian commander appears to give an order and a Ukrainian soldier appears to shoot civilians. Z-patriots circulate deepfakes in which a Ukrainian soldier allegedly confesses to war crimes or voices support for pro-Russian separatists. These videos were widely shown in Russian media and used to justify the Russian invasion of Ukraine.

If you look closely, you can see how the mask comes off the face

The more emotional a video, the more likely it is to catch on and go viral. For example, a series of videos about Vokha, a Ukrainian serviceman with Down syndrome, went viral on TikTok and other social networks.


The dog was thrown a bone, the appreciative audience picked it up, and comments appeared on Reddit like: “Damn it. If his own brothers are doing this to him, it’s so sad. He doesn’t deserve it… What kind of cruel sadists send a man with Down syndrome to the front?!”

The deepfake was first broadcast on the pseudo-Ukrainian channel Razvedchik and on the pro-Russian channels Ryadovy na Peredovii and Stepnoi Veter. All three Telegram groups are interconnected and are probably part of a single grid of TG channels. Working under this scheme, several affiliated channels create a shared info bubble: a real deepfake production factory that then pushes this rubbish out to a large number of TG channels with large audiences.

Even the experienced slip up sometimes. Thus, the veteran American radio host Jones (obsessed with conspiracy theories, an admirer of the Russian ideologue Dugin, and a welcome guest on Russian propaganda channels) also helped the deepfake gain hundreds of thousands of views on Russian channels. Had he not been too lazy to do his research, he would have seen how clumsily the deepfake was made: Vokha’s face slides out of the frame because the neural network cannot apply the mask fast enough.

The main weapon is to trust only news from verified sources

Churning out fakes is the only talent of Kremlin journalists. The IPSO factory will likely keep producing video dumps, recordings of telephone conversations, and fake videos. It is therefore vital for all our compatriots in Ukraine and abroad, as well as for the progressive foreign public, to train their info-hygiene skills, awareness, and critical thinking.

Hit the enemy with his own weapon

Riding the patriotic impulse, each of us can become a fighter on the information front, sowing bugs in the minds of the careless chthonic crowd. Why not try your directing talent with the opportunities deepfake programs provide? Here are some of them.

DeepFaceLab: Open source software for creating realistic deepfakes by training a model on images and videos of a target face.

FakeApp: A desktop application with a graphical interface that simplifies the process of creating face-swap deepfakes.

Reface: A mobile app that lets users swap their selfies into video clips.

Faceswap: Image and video editing software for creating deepfakes.

Zao: A Chinese mobile app that allows users to create deepfakes using celebrity video clips.

Adobe After Effects: Video editing and compositing software that can be used to create face-replacement effects with rotoscoping, compositing, and other techniques.

Synthesia: A cloud-based platform that enables users to create deepfake videos using AI.

D-ID: A company that provides deep learning technology to create and detect deepfakes.

It is important to note that these programs are becoming ever more sophisticated, and distinguishing deepfakes from real videos is getting harder. Be critical of the information you see online and check the sources before believing anything.

Tatyana Morarash

 
