By Zhen Liu, Staff Writer
Deepfakes challenge our assumptions about what is real and what is not. A portmanteau of "deep learning" and "fake," deepfakes are hyper-realistic, digitally altered media that use machine-learning algorithms to swap one person's face or voice for another's. Deepfakes can be photos, videos, or audio recordings of people saying and doing things that never actually happened.
It is now possible for anyone with basic computer skills to create a pornographic deepfake portraying an individual engaging in a sex act that never occurred. Deepfake pornography is a type of synthetic pornography created by applying deepfake technology to an existing pornographic image or video, replacing the performer's face with someone else's. While pornographic deepfakes were first created to depict celebrities, they are now being generated to feature other nonconsenting individuals, like a friend or a classmate.
Some websites have taken modest steps to ensure that deepfakes are not created using the photos of nonconsenting individuals. Reddit banned the deepfakes subreddit, which had a hundred thousand members. Discord shut down two servers whose chats centered on deepfakes and banned several users. Pornhub and Twitter have also banned deepfake videos.
However, Section 230 of the Communications Decency Act (often shortened to "CDA 230") shields websites from liability for third-party content. So, if someone creates a fake video and posts it on a third-party site, that site is not legally responsible for the video and cannot be forced to remove it. Any injunction a victim obtained would apply only to the person who shared the content, not to the platform. Platforms therefore have little incentive to act swiftly against these uploads, and many videos remain online.
Deepfake pornography has the potential to be used to extort, humiliate, harass, and blackmail victims. As deepfakes become more refined and easier to create, they also highlight the inadequacy of the law to protect would-be victims of this new technology. What can you do if you’re inserted into pornographic images or videos against your will? Is it against the law to create, share, and spread falsified pornography with someone else’s face?
The answer is complicated. The best way to get a pornographic face-swapped photo or video taken down is for the victim to claim either defamation or copyright infringement, but neither provides a guaranteed path to success. Although many laws could apply, no single law covers the creation of fake pornographic videos, and no legal remedy fully repairs the damage that deepfakes can cause.
At the federal level, one bill addressing the problem of deepfake pornography, the Malicious Deep Fake Prohibition (MDFP) Act of 2018, was introduced in the 115th Congress, but it did not advance. Unfortunately, as with many new technologies, the law is ill-equipped to handle these impending issues.
A defamation claim could potentially be effective because the person depicted in the video is not actually in it. The video amounts to a false statement of fact about the victim's presence, so a court could potentially enter a judgment against the perpetrator ordering removal of the video or images. However, a defamation claim can be hard to win against overseas or anonymous content publishers. And because the body depicted is not actually the victim's, a privacy claim is also difficult to pursue: you may not be able to sue someone for exposing the intimate details of your life when it is not your life they are exposing.
Courts are also careful to avoid chilling otherwise protected or worthy speech, and there is a heavy presumption against the constitutional validity of any system of prior restraint of expression. In New York Times v. Sullivan, the U.S. Supreme Court held that a public official seeking damages for defamation must show that the false statement was made with "actual malice," defined as "knowledge that it was false or [made] with reckless disregard of whether it was false or not."
The Supreme Court adopted a more flexible rule for defamation of private individuals. In Gertz v. Robert Welch, Inc., the Court held that states may define their own standard of liability for a publisher of "defamatory falsehood[s] injurious to a private individual," so long as it is not strict (essentially "automatic") liability.
Under New Jersey defamation law, a plaintiff must prove the following four (4) elements to succeed in a defamation claim:
- The assertion of a false and defamatory statement concerning another;
- The unprivileged publication of that statement to a third party;
- Fault amounting at least to negligence by the publisher;
- Damages to the plaintiff resulting from the statement.
It is also worth noting that New Jersey has a one-year statute of limitations for defamation claims.
Revenge Porn and Invasion of Privacy
"Revenge porn" is the distribution of sexually explicit images or videos of individuals without their consent. The material may be made by a partner during an intimate relationship, with the subject's knowledge and consent at the time, or it may be made without their knowledge. Perpetrators may use the material to blackmail subjects into performing other sex acts, to coerce them into continuing a relationship or to punish them for ending one, to silence them, to destroy their reputation, and/or for financial gain.
Prosecuting the maker of a deepfake porn photo or video under a revenge porn statute could succeed, but it could also be problematic. Unlike classic revenge porn, which depicts a real person, deepfake pornography does not depict the victim's actual body; the harm arises only if a viewer recognizes the victim's face and assumes the act depicted actually occurred. Without proof of viewer recognition, the prosecutor lacks an essential element of a revenge porn case: harm to the victim, the owner of the face.
Not all states have laws specifically targeting nonconsensual pornography, but New Jersey is one that does, prohibiting what is also known as revenge porn or cyber exploitation. In New Jersey, it is a crime to make, without consent, a recording that reveals another person's "intimate parts" or shows the person engaged in a sexual act, and to disclose such a recording, however made. Even if the person shown consented to the recording, it is still a crime to post the recording without that person's further consent to the disclosure.
Criminal Invasion of Privacy
Nonconsensual pornography is considered an invasion of privacy under New Jersey law, and it is a crime in New Jersey to invade another person's privacy without consent. A person commits the crime by intentionally observing, recording, and/or disclosing a recording of another individual's intimate parts or sexual conduct, without the individual's consent, when the individual observed or recorded has a reasonable expectation of privacy. A person faces up to five years in prison for third-degree invasion of privacy and up to 18 months in prison for fourth-degree invasion of privacy. The standard fines for third- and fourth-degree crimes are enhanced to $30,000 for a defendant who pleads or is found guilty of either grade of this charge.
Civil Lawsuit for Invasion of Privacy
A person who invades another's privacy in New Jersey as described above may face liability for civil damages, in addition to a criminal penalty. (N.J. Stat. Ann. § 2A:58D-1.) New Jersey law gives a victim of a nonconsensual recording of their intimate parts or sexual conduct the right to file a civil action, in which they may seek actual damages (that is, any losses or expenses associated with the violation) of not less than $1,000 per violation, attorney's fees, and punitive damages.
As with criminal invasion of privacy, consent or the lack of a reasonable expectation of privacy may be a defense in such an action. Consent means the alleged victim gave the defendant permission to carry out the act; for example, he or she might consent to being recorded or photographed. Consent is not a valid defense where the defendant's conduct exceeded the scope of the consent, or where the defendant was mistaken as to whether consent was given.
In New Jersey, an individual may not be prosecuted for an invasion-of-privacy offense if: (1) he or she posted or otherwise provided notice that filming, photographing, or observation would occur; (2) the conduct occurred at an access way, foyer, or entrance to a fitting room or dressing room operated by a retail establishment; or (3) the accused is a law enforcement officer, corrections officer, or guard in a correctional facility or jail who is engaged in the official performance of his or her duties.
Intentional Infliction of Emotional Distress
The victim may also pursue legal action by bringing a state tort claim for Intentional Infliction of Emotional Distress ("IIED"). In New Jersey, to succeed in an IIED claim, a plaintiff must prove the following elements:
- the defendant acted intentionally or recklessly in doing the act alleged to be wrongful which produced the emotional distress;
- the conduct was so extreme and outrageous as to go beyond all bounds of human decency;
- the defendant’s actions caused the emotional distress; and
- the distress was so severe that no reasonable person could be expected to endure it.
IIED is one of the most common tort claims brought in civil court; what many people do not realize, however, is how difficult it is to win. IIED cases require a deep factual analysis, and a victim can succeed only in the narrow circumstances where the producer of the deepfake video had the required intent and the victim suffered extreme distress.
Staff Writer Zhen Liu is a recent graduate from Seton Hall Law School, where she was a member of the Asian Pacific American Lawyers Association. She specializes in Family Law and serves as research assistant to associate reporters of The American Law Institute.