Abstract
For nearly as long as there have been photographs and videos, people have been editing and manipulating them to make them appear to be something they are not. Edited or manipulated photographs have usually been relatively easy to detect, but those days are numbered. Technology has no morality; as it advances, so do the ways it can be misused. Nowhere is this lack of morality clearer than with deepfake technology.
People create deepfakes by feeding data sets, most often pictures or videos, into a computer. A series of neural networks then attempts to mimic the original data set until its output is nearly indistinguishable from it. The result is the ability to create pictures and videos entirely from data points.
There are many positive uses for deepfakes, such as in education, entertainment, and business, but the potential for misuse is high. People can create pornographic images of others, making it appear as if they performed sexual acts on video that they never did. Deepfakes such as these often target women and celebrities. People also use deepfakes to target politicians, which has deeper implications for democracy and the electoral process.
Unfortunately, the legal system is currently unequipped to handle the problems that deepfakes are causing. In response, many lawmakers and policy experts are calling for legislation to protect people from these dangers. Their proposals range from technological preventative measures to legal remedies. Many people are calling for criminal liability for those engaging in malicious deepfake activities, but there has been reluctance to enact a civil remedy. Malicious deepfakes are overwhelmingly nonconsensual pornography that targets women, and the law in most jurisdictions currently offers those targeted little to no legal recourse. Therefore, any proposed federal deepfake legislation must include a private right of action.
Recommended Citation
Eric Kocsis, Deepfakes, Shallowfakes, and the Need for a Private Right of Action, 126 Dick. L. Rev. 621 (2022).
Available at: https://ideas.dickinsonlaw.psu.edu/dlr/vol126/iss2/10