Deepfake | (illusion, computation technology, statecraft)
---|---
Caption | Thispersondoesnotexist.com made these pictures. Imagine if videos would also look completely normal.
Interest of | • Munira Mustaffa • Operation Blackout • Matthijs Veenendaal
Description | A futuristic and dystopian technology to impersonate... anyone in a video, audio, picture or a combination of these. Use in statecraft is probably already here.
A Deepfake (a portmanteau of "deep learning" and "fake") is a generated static image or video which appears to be a simple record of reality but which has actually been artificially created to deceive. The dystopian and eerie idea that nothing a person sees can be trusted any more[1][2] was pushed as a dreadful new possibility by the traditional peddlers of lies (corporate media and intelligence services) after a surge of pornographic deepfakes starting in South Korea in 2018.[3]
History
Photo manipulation was developed in the 19th century and soon applied to motion pictures. Technology steadily improved during the 20th century, and more quickly with digital video.
Deepfake technology has been developed by researchers at academic institutions beginning in the 1990s, and later by amateurs in online communities.[4][5] More recently the methods have been adopted by industry.[6]
Potential
While the act of faking content is not new, deepfakes leverage powerful techniques from machine learning and artificial intelligence to manipulate or generate visual and audio content with a high potential to deceive. The main machine learning methods used to create deepfakes are based on deep learning and involve training generative neural network architectures such as autoencoders or generative adversarial networks (GANs).
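The classic face-swap approach is an autoencoder with a shared encoder and one decoder per identity: the encoder learns identity-agnostic facial structure, while each decoder learns to render one specific face. The sketch below is a minimal, illustrative PyTorch version of that idea; the layer sizes, loss and training step are assumptions for illustration, not the implementation of any particular tool mentioned on this page.

```python
# Minimal sketch of the shared-encoder / two-decoder autoencoder behind classic
# face-swap deepfakes. All sizes, names and training details are illustrative
# assumptions, not any specific tool's implementation.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop (values in [0, 1]) into a latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent vector; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a = Decoder()   # trained only on face crops of person A
decoder_b = Decoder()   # trained only on face crops of person B
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

def train_step(faces_a, faces_b):
    """One step: both identities share the encoder but keep separate decoders."""
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

# After training, the "swap" is simply routing A's latent code through B's decoder:
# fake_b = decoder_b(encoder(face_of_a))
```

After training, person A's frames pushed through person B's decoder yield the swapped face, which is then blended back into the source video frame by frame.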
Deepfakes have garnered widespread attention for their uses in celebrity pornographic videos, revenge porn, fake news, hoaxes, and financial fraud. This has elicited responses from both industry and government to detect and limit their use.
Amateur development
The term deepfakes originated around the end of 2017 from a Reddit user named "deepfakes".[7] He, as well as others in the Reddit community r/deepfakes, shared deepfakes they created; many videos involved celebrities’ faces swapped onto the bodies of actresses in pornographic videos, while non-pornographic content included many videos with actor Nicolas Cage’s face swapped into various movies.[8]
Commercial development
In January 2018, a proprietary desktop application called FakeApp was launched.[9] The app allowed users to easily create and share videos with faces swapped.[10] As of 2019, FakeApp has been superseded by open-source alternatives such as Faceswap and the command-line-based DeepFaceLab.[11][12]
Larger companies are also starting to use deepfakes. The mobile app giant Momo created the application Zao, which allows users to superimpose their face on TV and movie clips from a single picture. The Japanese AI company DataGrid developed full-body deepfake technology that can create a person from scratch,[13] which it intends to use for fashion and apparel.
Audio deepfakes also exist, as does AI software capable of detecting deepfakes and of cloning a human voice after just five seconds of listening time.[14][15][16][17][18]
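On the detection side, a common baseline (a hedged sketch only, not the method of any product cited above) is a per-frame binary classifier over face crops, with frame scores aggregated into a video-level verdict:

```python
# Minimal sketch of frame-level deepfake detection: a binary classifier over
# face crops labelling each frame real or fake. The architecture and training
# data here are illustrative assumptions.
import torch
import torch.nn as nn

detector = nn.Sequential(
    nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
    nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
    nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(128, 1),   # logit: > 0 suggests "fake"
)

def classify_frames(face_crops: torch.Tensor) -> torch.Tensor:
    """face_crops: (N, 3, 64, 64) batch of face crops; returns a fake-probability per frame."""
    with torch.no_grad():
        return torch.sigmoid(detector(face_crops)).squeeze(1)

# Such a model would typically be trained with binary cross-entropy on a dataset
# of real and manipulated faces (e.g. FaceForensics++, ref. 16); a video-level
# verdict is then an aggregate (mean or max) of the per-frame probabilities.
```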
A mobile deepfake app, Impressions, was launched in March of 2020. It was the first app for the creation of celebrity deepfake videos from mobile phones.[19][20]
Applications
Gabon
Gabon was one of the first countries where a suspected deepfake helped instigate an attempted coup, in 2019.[21]
Pornography
Many deepfakes on the internet feature pornography of people, often female celebrities whose likeness is typically used without their consent.[22] Deepfake pornography prominently surfaced on the Internet in 2017, particularly on Reddit.[23] The first one to capture attention was the Daisy Ridley deepfake, which was featured in several articles. Other prominent pornographic deepfakes targeted various other celebrities.[24][25][26] As of October 2019, most of the deepfake subjects on the internet were British and American actresses. However, around a quarter of the subjects were South Korean, the majority of them K-pop stars.[27]
Politics
Deepfakes have been used to misrepresent well-known politicians in videos.
- In separate videos, the face of the Argentine President Mauricio Macri has been replaced by the face of Adolf Hitler, and Angela Merkel's face has been replaced with Donald Trump's.[28][29]
- In April 2018, Jordan Peele collaborated with Buzzfeed to create a deepfake of Barack Obama with Peele's voice; it was a public service announcement to increase awareness of deepfakes.[30]
- In January 2019, Fox affiliate KCPQ aired a deepfake of Trump during his Oval Office address, mocking his appearance and skin color (and subsequently fired an employee found responsible for the video).[31]
- During the 2020 Delhi Legislative Assembly election campaign, the Delhi Bharatiya Janata Party used similar technology to distribute a version of an English-language campaign advertisement by its leader, Manoj Tiwari, translated into Haryanvi to target Haryana voters. A voiceover was provided by an actor, and AI trained using video of Tiwari speeches was used to lip-sync the video to the new voiceover. A party staff member described it as a "positive" use of deepfake technology, which allowed them to "convincingly approach the target audience even if the candidate didn't speak the language of the voter."[32]
- In April 2020, the Belgian branch of Extinction Rebellion published a deepfake video of Belgian Prime Minister Sophie Wilmès on Facebook.[33] The video promoted a possible link between deforestation and COVID-19. It had more than 100,000 views within 24 hours and received many comments. On the Facebook page where the video appeared, many users interpreted the deepfake video as genuine.[34]
In June 2019, the United States House Intelligence Committee held hearings on the potential malicious use of deepfakes to sway elections.[35]
Concerns
Fraud
Audio deepfakes have been used as part of social engineering scams, fooling people into thinking they are receiving instructions from a trusted individual.[36] In 2019, a U.K.-based energy firm's CEO was scammed over the phone when he was ordered to transfer €220,000 into a Hungarian bank account by an individual who used audio deepfake technology to impersonate the voice of the firm's parent company's chief executive.[37]
Related Quotation
Page | Quote | Author
---|---|---
Matthijs Veenendaal | “Trust is a key foundation of a well-functioning society. Without reliable communication, organizations cannot operate efficiently, be they corporations or government institutions. Malicious actors are aiming to exploit vulnerabilities in communication flows. With the advent of new technology, it is possible for adversaries to impersonate leaders and create false impressions among population. The Tallinn based NATO Cooperative Cyber Defence Centre of Excellence will organize a session focusing on questions including: What is at stake? What can nations do to enhance and protect trust in democratic institutions? Or is it already too late?” | Matthijs Veenendaal
References
- ↑ https://www.cnbc.com/2019/10/15/deepfakes-could-be-problem-for-the-2020-election.html
- ↑ https://www.theguardian.com/technology/ng-interactive/2019/jun/22/the-rise-of-the-deepfake-and-the-threat-to-democracy
- ↑ https://regmedia.co.uk/2019/10/08/deepfake_report.pdf
- ↑ https://www.washingtonpost.com/technology/2019/06/12/top-ai-researchers-race-detect-deepfake-videos-we-are-outgunned/
- ↑ https://www.nbcnews.com/think/opinion/thanks-ai-future-fake-news-may-be-easily-faked-video-ncna845726
- ↑ https://www.theverge.com/2019/9/2/20844338/zao-deepfake-app-movie-tv-show-face-replace-privacy-policy-concerns
- ↑ https://www.vice.com/en_us/article/bjye8a/reddit-fake-porn-app-daisy-ridley
- ↑ https://mashable.com/2018/01/31/nicolas-cage-face-swapping-deepfakes/
- ↑ https://www.online-tech-tips.com/computer-tips/what-is-a-deepfake-and-how-are-they-made/
- ↑ https://www.theverge.com/2018/2/11/16992986/fakeapp-deepfakes-ai-face-swapping ("I'm using AI to face-swap Elon Musk and Jeff Bezos, and I'm really bad at it")
- ↑ https://faceswap.dev
- ↑ https://github.com/iperov/DeepFaceLab
- ↑ https://www.fastcompany.com/90407145/youve-been-warned-full-body-deepfakes-are-the-next-step-in-ai-based-human-mimicry
- ↑ https://www.theverge.com/2020/1/29/21080553/ftc-deepfakes-audio-cloning-joe-rogan-phone-scams
- ↑ https://google.github.io/tacotron/publications/speaker_adaptation/
- ↑ http://www.niessnerlab.org/projects/roessler2019faceforensicspp.html
- ↑ https://spectrum.ieee.org/tech-talk/artificial-intelligence/machine-learning/facebook-ai-launches-its-deepfake-detection-challenge
- ↑ http://ai.googleblog.com/2019/09/contributing-data-to-deepfake-detection.html
- ↑ https://www.dailydot.com/debug/impressions-deepfake-app/
- ↑ https://kool1079.com/fun-or-fear-deepfake-app-puts-celebrity-faces-in-your-selfies/
- ↑ https://www.motherjones.com/politics/2019/03/deepfake-gabon-ali-bongo/
- ↑ https://www.rollingstone.com/culture/culture-news/deepfakes-nonconsensual-porn-study-kpop-895605/
- ↑ https://variety.com/2018/digital/news/deepfakes-porn-adult-industry-1202705749/
- ↑ https://www.businessinsider.com/deepfakes-explained-the-rise-of-fake-realistic-videos-online-2019-6
- ↑ https://www.bbc.com/news/technology-42912529
- ↑ https://www.vice.com/en_us/article/ywe4qw/gfycat-spotting-deepfakes-fake-ai-porn
- ↑ https://medium.com/@frenizoe/deepfake-porn-efb80f39bae3
- ↑ https://www.aargauerzeitung.ch/leben/digital/wenn-merkel-ploetzlich-trumps-gesicht-traegt-die-gefaehrliche-manipulation-von-bildern-und-videos-132155720
- ↑ http://faktenfinder.tagesschau.de/hintergrund/deep-fakes-101.html
- ↑ https://www.vox.com/2018/4/18/17252410/jordan-peele-obama-deepfake-buzzfeed
- ↑ https://www.washingtonpost.com/nation/2019/01/11/seattle-tv-station-aired-doctored-footage-trumps-oval-office-speech-employee-has-been-fired/
- ↑ https://www.vice.com/en_in/article/jgedjb/the-first-use-of-deepfakes-in-indian-election-by-bjp
- ↑ https://www.extinctionrebellion.be/en/
- ↑ https://journalism.design/les-deepfakes/extinction-rebellion-sempare-des-deepfakes/ ("Extinction Rebellion s'empare des deepfakes")
- ↑ https://www.cnn.com/2019/06/04/politics/house-intelligence-committee-deepfakes-threats-hearing/index.html
- ↑ https://www.theverge.com/2019/9/5/20851248/deepfakes-ai-fake-audio-phone-calls-thieves-trick-companies-stealing-money
- ↑ https://www.forbes.com/sites/jessedamiani/2019/09/03/a-voice-deepfake-was-used-to-scam-a-ceo-out-of-243000/