
Experts warn of ‘deepfakes’ of celebrity TV doctors promoting fraud

PARIS, Sept. 15 — Experts are warning that social media is being flooded with digitally created “deepfake” videos that use the trusted identities of famous doctors to promote dangerous miracle cures for serious health problems.

Facebook and Instagram videos exploit the credibility of TV celebrity doctors to advertise untested “natural” diabetes syrups, even claiming that the proven first-line drug, metformin, “can kill” patients.

Experts say such scams pose a risk to human life, especially as they use the likenesses of popular health experts such as British TV presenter Michael Mosley, who died earlier this year.

“People seem to trust these films,” British doctor John Cormack told AFP.

“Many of these media doctors have spent a lot of time building up an image of trustworthiness so that people believe them even when they make incredible claims,” said Cormack, who has worked with the British Medical Journal (BMJ) on the topic.

Artificial intelligence (AI) expert Henry Ajder said doctor deepfakes have “really taken off this year.”

AI videos tend to target older audiences, Ajder said, by spoofing the identities of doctors who regularly appear on daytime television.

French doctor Michel Cymes, who frequently appears on French television, told AFP in May that he intended to take legal action against Facebook owner Meta over “fraud” involving his image.

British doctor Hilary Jones even hired a detective to track deepfakes bearing his likeness.

In one video, Jones was seen selling fake high blood pressure medication — as well as marijuana gummies — on a UK TV show on which he regularly appears.

“Even if they are removed, they will simply reappear the next day under a different name,” Jones lamented in the BMJ.

“A game of cat and mouse”

As French scientist and AI expert Frederic Jurie explains, recent advances in AI have made deepfake images, audio and video recordings far more convincing.

“Today we have access to tens of billions of images and we can build algorithms that can model anything that appears in the images and regenerate it. This is what we call generative AI,” he said.

This isn’t just about the inappropriate use of likenesses of well-respected physicians.

The likeness of controversial French researcher Didier Raoult, who has been accused of spreading misleading information about COVID cures, has also been used in several deepfake videos.

Australian naturopath Barbara O’Neill, who has been heavily criticised for claiming baking soda can cure cancer, has been falsely portrayed in TikTok videos as selling “vessel-cleansing” pills.

Her husband, Michael O’Neill, contacted by AFP, lamented that “many unethical people” were using his wife’s name “to sell products that she does not recommend and in some cases these are just plain scams.”

Some fake videos push the disinformation even further, falsely claiming that O’Neill died from taking a miracle oil sold on Amazon.

AI expert Ajder was not surprised that such controversial health figures had also fallen victim to deepfakes.

“People from circles that could be described as unorthodox or conspiratorial enjoy enormous trust,” he said.

Experts were not optimistic about the ability of new AI-powered content detection tools to counter the onslaught of deepfakes.

“It’s a game of cat and mouse,” Jurie said.

Rather than trying to track down every fake video in circulation, he pointed to technology that can “ensure that the content has not been altered, for example in messages, through software that creates digital signatures like a certificate.” — AFP
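
For illustration only, the kind of content signing Jurie describes can be sketched in a few lines of Python. The sketch below assumes the third-party "cryptography" package and a hypothetical video file held in memory; real provenance schemes layer certificates and metadata on top of this basic signing step.

# Minimal sketch: sign content at the source, verify it has not been altered.
# Assumes the third-party "cryptography" package; the video bytes are hypothetical.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The publisher generates a key pair and signs the video bytes when publishing.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

video_bytes = b"...original video bytes..."
signature = private_key.sign(video_bytes)

# Anyone holding the public key can later check that the content is unchanged.
def is_unaltered(content: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, content)
        return True   # signature matches: content is exactly as published
    except InvalidSignature:
        return False  # content was modified after signing

print(is_unaltered(video_bytes, signature))                # True
print(is_unaltered(video_bytes + b"tampered", signature))  # False

In this model the burden shifts from detecting fakes to proving authenticity: any edit to the signed bytes, however small, makes the verification fail.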