David Beckham is given the power to speak nine languages via deepfake technology in a new malaria awareness campaign


Deepfake technology has been used to depict David Beckham speaking nine languages for a Malaria No More charity advert (Credit: Malaria No More)

Sporting superstar David Beckham may not know Arabic, Hindi or Kinyarwanda – the official language of Rwanda – but he appears to speak them perfectly in a new malaria awareness campaign, thanks to deepfake technology.

The one-minute advert for the charity Malaria No More prompts people to join the world’s first voice petition to end the mosquito-borne disease, and features Beckham reeling off facts about malaria in nine languages.

The controversial deepfake technology uses AI algorithms to lip-sync footage of Beckham to the voices of different speakers.

Despite the positive use of the technology in the charity appeal, there are fears that deepfakes could be used to commit fraud, undermine political opponents or damage someone’s reputation – leading it to be described as “fake news on steroids”.


How was the David Beckham deepfake made?

In order to produce a deepfake video, a deep learning neural network – which is a form of AI – has to be trained with visual data of a person’s face.

Once trained, the AI program can take that person’s face and superimpose it on to another source video – making it appear as if the person in question is saying words they haven’t.
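Synthesia has not published the details of its system, but the widely known open-source deepfake method pairs one shared encoder with a separate decoder for each identity: the encoder learns expression and pose, while each decoder learns one person’s appearance. The Python sketch below (using PyTorch) is a minimal illustration of that idea only – the layer sizes, 64×64 face crops and variable names are assumptions for demonstration, not a description of Synthesia’s actual pipeline.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 RGB face crop to a latent code
    capturing expression and pose (sizes are illustrative assumptions)."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Per-identity decoder: renders a face from the shared latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder, one decoder per identity. Training minimises the
# reconstruction error of each person's faces through their own decoder.
encoder = Encoder()
decoder_a = Decoder()  # trained on face crops of the target (e.g. Beckham)
decoder_b = Decoder()  # trained on face crops of the source performer

loss_fn = nn.MSELoss()
faces_a = torch.rand(8, 3, 64, 64)  # stand-in for real training crops
recon_a = decoder_a(encoder(faces_a))
loss = loss_fn(recon_a, faces_a)    # repeat for identity B with decoder_b

The asymmetry is the whole trick: because both decoders read from the same latent space, a code extracted from the performer’s face can be decoded into the target’s likeness – so the target appears to mouth the performer’s words.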

The Beckham ad was created in conjunction with UK AI technology company Synthesia.

In a behind-the-scenes video, Synthesia CEO and co-founder Victor Riparbelli said: “We used the video to build a 3D model and we can then re-animate that, so you can change the language or re-write the script of a film.

“What we are doing helps to spread the message in a much more effective way than if it had been filmed without using technology like ours.”

Beckham, who supports 19 charities and non-profit organisations, added: “It’s great to be involved in something where the tech side of our lives and our world get involved, to be one voice of many different people.”


Ethical issues with deepfakes

Fears over the malicious use of deepfake technology have led some leading politicians to call for the videos to be banned.

US and UK politicians have raised concerns that the tech represents a threat to democracy, while Facebook is looking into ways to fact-check videos uploaded to the social media site.

Siwei Lyu, an associate professor in the Department of Computer Science at the University at Albany, State University of New York, specialises in deep learning and media forensics, and fears the impact deepfake videos could have.

He previously told Compelo: “The deepfakes create a fake media that can create a kind of synthetic reality – although these videos aren’t real, they look real.

“At the highest level, they could actually shake our belief in online visual media.”

Synthesia’s deepfake technology is used on a BBC newsreader (Credit: BBC)

Synthesia, which has also used the technique to make a newsreader change language and to change the script spoken by an actor, has published its own ethical guidelines online.

It says: “As early pioneers of video synthesis technology we are excited about the possibilities it will bring to visual content creation.

“Removing the language barrier from video will allow great stories to travel the world, regardless of their origin.

“Our hope is that this new medium will foster cultural exchange, joy and deeper understanding in the same manner that the written word has done for centuries.

“However, as with all tools, they can be used for good or bad.

“Soon these and other sophisticated technologies will be widespread and it is important that the risk of misuse is both acknowledged and alleviated as much as possible.

“In our view, the best approach is to create public awareness and develop technological security mechanisms to ensure all content created is consensual.”

It also promises to never create a deepfake of someone without their explicit consent.