The Bugatti car, the First Lady and fake stories targeting Americans: BBC

A network of Russian websites masquerading as local American newspapers is spreading fake stories as part of an artificial intelligence-driven operation increasingly targeting the US election, according to an investigation by BBC Verify and BBC News.
The publication says that one of the key figures in the operation is a former Florida police officer who moved to Moscow. One of its stories, had it been true, would have been a devastating scandal.
Olena Zelenska, Ukraine’s first lady, allegedly bought a rare Bugatti Tourbillon sports car for €4.5m ($4.8m; £3.8m) during a visit to Paris for D-Day commemorations in June. The purchase was allegedly funded with money from American military aid.
The story appeared on an obscure French website just a few days ago – and was quickly debunked.
Experts pointed out strange anomalies in the invoice published online. The whistleblower mentioned in the article appeared only in a strangely edited video that may have been artificially created. Bugatti strongly denied the claim, calling it “fake news”, and its Paris dealership threatened legal action against the people behind the false story.
But before the truth could come out, the lie went viral. Influencers quickly picked up the false story and spread it widely. One X user, Jackson Hinkle, a pro-Russian activist who supports Donald Trump, posted a link that has been viewed more than 6.5 million times. Several other accounts shared the story with millions more users, reaching at least 12 million in total, according to the platform’s own figures.
“The story originated on a fake news website designed for widespread online distribution, part of a Russian disinformation operation that BBC Verify first exposed last year. At the time, the operation appeared to be trying to undermine the government of Ukraine.
Several errors were spotted in the fake invoice, including mistakes in spelling, punctuation and English usage, but it still spread widely online. Our latest investigation, which spanned more than six months and covered hundreds of articles on dozens of websites, found that the operation has a new target: American voters,” the BBC notes.
Dozens of fake stories tracked by the BBC are aimed at influencing US voters and sowing mistrust ahead of the November election. Some were completely ignored, but others were shared by influential people and members of the US Congress.
The fakes that go viral directly target American politics. One, published on a website called The Houston Post, one of dozens of sites with American-sounding names that are actually run from Moscow, claimed that the FBI had illegally wiretapped Donald Trump’s Florida resort. It played neatly into Trump’s claims that the legal system is unfairly stacked against him, that there is a conspiracy to derail his campaign and that his opponents are using dirty tricks to undermine him. Trump himself has accused the FBI of monitoring his conversations.
Experts say the operation is just part of a much larger effort from Moscow to spread disinformation during the US election campaign.
The BBC contacted the Russian Foreign Ministry and the US and UK embassies about this, but did not receive a response.
How the fakes spread
The operation, investigated by BBC Verify, uses artificial intelligence to generate thousands of news articles published on dozens of sites with names that are meant to look American: Houston Post, Chicago Crier, Boston Times, DC Weekly and others. Some use the names of real newspapers that went out of business years or decades ago.
Most of the stories on these sites are not outright fakes. Instead, they are based on real news from other sites, apparently rewritten by artificial intelligence software. In some cases, instructions for the AI tools were left visible in the finished stories, for example: “Please rewrite this article with a conservative stance.” One such prompt was mistakenly left in a story on one of the fake news sites.
The stories are attributed to hundreds of fake journalists with fictitious names and, in some cases, profile pictures taken from elsewhere on the internet. For example, a photo of bestselling author Judy Batalion was used in several stories on a website called DC Weekly, “written” by an online persona called “Jessica Devlin”.
“I was completely confused. I still don’t really understand what my photo was doing on this website. I had no contact with this website. It made me more aware of the fact that any photo of yours on the internet can be used by someone else,” Batalion told the BBC.
The sheer number of stories – thousands every week – along with their repetition on various websites, indicates that the process of publishing AI-generated content is automated. Casual browsers can easily get the impression that these sites are thriving sources of legitimate news about politics and pressing social issues. However, this tsunami of content is peppered with fake stories increasingly aimed at American audiences.
The stories often mix American and Ukrainian political issues. One, for example, claimed that an employee of a Ukrainian propaganda unit was frightened after learning that she had been assigned the task of undermining Donald Trump’s campaign and supporting President Biden.
Another report fabricated a shopping trip to New York by the first lady of Ukraine and claimed that she was racist towards jewelry store staff. The BBC found that forged documents and fake YouTube videos were used to support both false stories.
Some of the fakes do break through and get high levels of engagement on social media, said Clement Briens, senior threat intelligence analyst at cybersecurity firm Recorded Future. His company says it flagged 120 websites as part of an operation it calls CopyCop over just three days in May. And this network is just one of a number of Russian disinformation operations.
Other experts, including researchers at Microsoft, Clemson University and NewsGuard, a company that monitors disinformation sites, say they have counted at least 170 sites connected to the operation.
“At first the operation seems small, but every week it grows significantly in size and reach. These narratives are regularly quoted and spread by Russian state television, Kremlin officials and pro-Kremlin influencers. Almost every week or two, the network pushes out a new narrative,” said McKenzie Sadeghi, AI and foreign influence editor at NewsGuard.
Making fakes look real
To further increase the credibility of the fake stories, YouTube videos are quickly created, often featuring people calling themselves “whistleblowers” or “independent journalists”. In some cases the videos are narrated by actors; in others, the voices appear to be AI-generated. Several of the videos are shot against similar backgrounds, further evidence of a coordinated effort to spread fake news.
The videos themselves are not meant to go viral and get very few views on YouTube. Instead, they are cited as “sources” in stories on the fake newspaper websites. For example, a story about a Ukrainian intelligence operation allegedly targeting the Trump campaign cited a YouTube video containing footage of a supposed office in Kyiv, with fake campaign posters visible on the walls. Links to such stories are then posted on Telegram channels and other social media accounts.
Microsoft researchers also say similar operations are trying to spread stories about UK politics, the Paris Olympics and more. One fake story, which appeared on a website called the London Crier, claimed that Zelensky had bought a mansion owned by King Charles III at a bargain price. It was seen by hundreds of thousands of social media users before it was deleted.
When the BBC asked those behind the fakes whether the flow of such false stories would slow down, the answer it received was:
“Don’t worry! The game is being improved!”