Deepfakes: Not Everything You See is Real


If you think differentiating a real video from a fake one is easy, recent developments in AI technology will make you change your mind.

Imagine waking up one day to find a perfectly made fake video of you performing magic tricks. How would you feel about it? Now imagine that this video goes viral. This is no longer hypothetical; fake videos are everywhere and have taken digital impersonation to a new level. Until recently, we thought nothing was easier than telling a real video from a computer-generated one, but recent developments in Artificial Intelligence (AI) technology have been alarming millions of internet users.

During the quarantine, many celebrities who had never been on social media created accounts and began sharing their daily routines with their fans. So it came as no surprise when Hollywood star Tom Cruise finally seemed to join TikTok in February 2021, as @deeptomcruise, sharing videos of himself doing magic tricks. In fact, the videos' content and how quickly they went viral were not as surprising as learning that they were fake, as announced by the account's creator, Christopher Ume, a visual effects artist who made them using "deepfake" technology, with help from a Cruise impersonator.

Video: Visual effects breakdown of the viral @deeptomcruise TikTok videos by Christopher Ume.

The term "deepfake", as coined by a Reddit user in 2017, is derived from its underlying AI technology "deep learning". It relies on deep neural networks involving auto-encoders and employing a face-swap technique in videos. Deep learning algorithms are programmed to solve problems when given large sets of data, so the editor runs thousands of face shots of two persons and easily pastes someone's face over another to make realistic-looking fake videos, or "deepfakes". In other words, it is computers, not humans, who do the hard work.

The benign applications of this technology have been in cinema and gaming; however, it has also become an obviously dangerous technology with troubling uses. Video manipulation used to be imperfect, costly, and time-consuming, but that is no longer the case: average users can now generate deepfakes with a variety of applications and software. As the technology grows more sophisticated, it becomes more threatening, with uses that go well beyond humor or satire.

This technology is growing at a breathtaking pace; according to a report from the startup Deeptrace, the number of deepfakes online nearly doubled from 7,964 to 14,678 over just nine months in 2019. Researchers are trying to identify telltale signs of deepfakes: a study conducted in 2018 found that deepfake faces do not blink normally, and others highlight obvious problems in skin tone, poor lighting matches, or lip-syncing that does not match the target audio. However, as soon as any of these flaws is revealed, the technology improves and detecting deepfakes gets harder.
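
The 2018 blinking cue can be illustrated with a short sketch. The snippet below is a hypothetical example, not the researchers' actual method: it assumes you already have the six landmark points around each eye that common face-landmark libraries provide, computes the widely used eye-aspect ratio, and counts blinks across a clip; an implausibly low blink count over many seconds of video would flag the clip as suspicious.

```python
# Hypothetical illustration of blink-based screening. Obtaining the per-frame
# eye landmarks (e.g. from a face-landmark library) is outside this sketch.
import numpy as np

def eye_aspect_ratio(eye):
    """eye: array of shape (6, 2) -- landmark (x, y) points around one eye.
    The ratio drops sharply when the eyelid closes."""
    eye = np.asarray(eye, dtype=float)
    vertical_1 = np.linalg.norm(eye[1] - eye[5])
    vertical_2 = np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def count_blinks(ear_per_frame, closed_threshold=0.2, min_closed_frames=2):
    """Counts blinks as runs of consecutive frames with a low eye-aspect ratio."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < closed_threshold:
            run += 1
        else:
            if run >= min_closed_frames:
                blinks += 1
            run = 0
    if run >= min_closed_frames:
        blinks += 1
    return blinks

# Toy example: a 10-frame clip where the eye closes for two frames.
ears = [0.31, 0.30, 0.32, 0.30, 0.15, 0.14, 0.29, 0.31, 0.30, 0.32]
print(count_blinks(ears))  # 1 -- a long clip with zero blinks would be suspicious
```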

“It is like Photoshop 20 years ago; people did not know what photo-editing was, and now they know about these fakes,” Ume told The Verge about his viral clips.

Even if an event seems unlikely, always remember that there is a possibility it is real. If deepfakes make people believe they cannot trust anything they see or hear, the war of misinformation and conspiracy theories will only get worse. We need to raise public awareness of the existence of these technologies and give people more room to analyze and criticize the information they consume.

References

businessinsider.com
deeptracelabs.com
edition.cnn.com
forbes.com
fpf.org
theguardian.com
