Legal Guide

Illinois Deepfake Laws:
What You Need to Know in 2025

Illinois has the most aggressive deepfake and biometric privacy protections in the country. This guide covers every statute you can use to fight back, from BIPA to the new AI-specific criminal laws.

Call (630) 839-9195
Updated April 2025 · By Justin Abdilla, Managing Attorney · Super Lawyers Rising Stars 2021 to 2026
Justin Abdilla, Illinois deepfake and identity protection attorney

Illinois: The National Leader in Deepfake Regulation

I get calls every week from people who found their face in a deepfake. A fake dating profile. An AI-generated video. A cloned voice saying things they never said. The first thing I tell them: you live in the right state.

Illinois has six enacted statutes that directly address deepfakes and AI-generated content, plus three pre-existing criminal statutes that cover deepfake conduct. That is more legal firepower than any other state. While Congress was still debating what a deepfake even is, Illinois was already writing the laws to stop them.

I wrote this guide because I got tired of seeing bad legal advice on the internet about deepfakes in Illinois. Half the articles out there are written by marketing firms that do not practice law. This guide is written by someone who actually files these cases.

In a crisis right now? Skip to what to do immediately. Preserve your evidence, do not contact the perpetrator, and call us at (630) 839-9195. The legal analysis can wait. Your evidence cannot.

Why Illinois? Illinois passed BIPA in 2008 when most states did not know what biometric data was. That law has generated over $1 billion in settlements — $650 million from Meta alone. The same legislature that passed BIPA now writes the deepfake laws. They are not gentle about it.

If You Are a Deepfake Victim, Do This First

Before you read the rest of this guide, handle the emergency. I have seen too many cases where people spent weeks researching the law while the evidence disappeared.

1. Preserve Evidence Immediately

Screenshot everything. Record URLs, usernames, timestamps, and platform details. Save the images or videos — I know that is the last thing you want to do, but they are your evidence. If you can, use the Wayback Machine or a screen capture service that timestamps everything. I have seen cases fall apart because the fake account got deleted before we could document it.

2. Do Not Contact the Perpetrator

Do not threaten them. Do not confront them. Do not send them a message saying you know what they did. Every time someone tips off the person behind a fake account, that person deletes everything and starts over with a new account. Let your attorney handle first contact.

3. File Platform Reports

Report the content through the platform's non-consensual intimate image channel — not just the general report button. Under the federal TAKE IT DOWN Act, platforms now have a 48-hour deadline to remove it. Document every report you file and every response you get.

4. Call a Lawyer

I am biased here, obviously. But the reality is that a legal demand letter gets content removed faster than a platform report. An attorney can issue preservation notices to prevent evidence destruction, subpoena platforms to unmask anonymous accounts, and file for injunctions that carry the force of a court order. We do this regularly.

5. Consider a Police Report

If the deepfake is sexual, that is a Class 4 felony under 720 ILCS 5/11-23.7. If it involves identity theft, that is up to a Class 1 felony. Filing a police report creates an official record and puts enormous pressure on the perpetrator. You can pursue criminal charges and civil claims simultaneously — they are not mutually exclusive.

Evidence Disappears Every Day You Wait

Free case review. I will tell you exactly what claims you have and what they are worth. No charge for the conversation.

Call (630) 839-9195 · Text (312) 489-8710

Civil Liability for Deepfake Sexual Images

740 ILCS 190/ (amended by HB 2123, effective January 1, 2024)

Before 2024, there was a gap in Illinois law. The revenge porn statute covered real images shared without consent, but it was not clear whether AI-generated sexual images — images of things that never happened — were covered. Defense attorneys were arguing that if the depicted conduct never occurred, the statute did not apply.

The legislature closed that gap. The law now covers any "digitized depiction" of a real, identifiable person engaged in sexual conduct or in a state of nudity, regardless of whether the depicted conduct ever actually occurred. If someone can tell it is you, and you did not consent, it is actionable.

What It Covers

  • AI-generated sexual images depicting a real person
  • Face-swapped pornographic content
  • Digitally altered images that make someone appear nude or engaged in sexual acts
  • Any synthetic media where a real person is identifiable in sexual content they did not consent to

Remedies

Statutory Damages

Minimum $10,000 per defendant. No need to prove actual financial loss.

Actual Damages

Compensation for emotional distress, therapy costs, lost income, and reputational harm.

Punitive Damages

Available to punish particularly egregious conduct.

Injunctive Relief

Court orders forcing removal of content and prohibiting future creation or distribution.

Attorney Fees

The court may award reasonable attorney fees and costs to prevailing plaintiffs.

Real Results: Doe v. Fritch (2024 IL App (4th) 230585)

This statute is not theoretical. In Doe v. Fritch, an Illinois appellate court upheld a total award of $316,785 against an ex-boyfriend who posted explicit video of his former partner online with her full name, city, and personal details. The breakdown:

  • $150,000 in emotional distress damages (based on plaintiff's testimony alone, no expert required)
  • $150,000 in punitive damages (1:1 ratio held constitutional)
  • $4,300 in economic damages (restraining order and internet removal costs)
  • $12,485 in attorney fees
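The components of the Fritch award can be cross-checked with a few lines of Python. This is only a sanity check on the figures as summarized in this guide, not on the opinion itself:

```python
# Components of the Doe v. Fritch award as summarized above.
award = {
    "emotional distress": 150_000,
    "punitive damages": 150_000,
    "economic damages": 4_300,
    "attorney fees": 12_485,
}

total = sum(award.values())
print(f"Total award: ${total:,}")  # → Total award: $316,785

# The 1:1 punitive-to-compensatory ratio the court upheld:
ratio = award["punitive damages"] / award["emotional distress"]
print(f"Punitive ratio: {ratio:.0f}:1")  # → Punitive ratio: 1:1
```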

The court found that the defendant's invocation of the Fifth Amendment could be used as an adverse inference at summary judgment, and that emotional distress damages did not require expert testimony. This case demonstrates exactly what these statutes are designed to do.

Statute of limitations: Two years from the date of dissemination, or one year from discovery if the victim did not know the images were being distributed. Do not wait.

Criminal Penalties for Sexually Explicit Deepfakes

720 ILCS 5/11-23.7 (effective January 1, 2024)

Creating or distributing a sexually explicit deepfake is not just a civil matter in Illinois. It is a Class 4 felony, carrying one to three years in prison and fines up to $25,000.

Elements of the Crime

A person commits the offense when they intentionally disseminate a sexually explicit digitized depiction of another identifiable person when they know or should have known the person depicted did not consent to the creation or dissemination.

Key Points

  • Class 4 felony: 1 to 3 years imprisonment
  • Enhanced penalties when the victim is a minor or when the distribution was for profit
  • The person depicted does not need to have been aware of the deepfake at the time of creation
  • Both the creator and the distributor can be charged

Criminal and civil are not mutually exclusive. You can report deepfake sexual images to law enforcement for criminal prosecution while simultaneously pursuing a civil lawsuit for damages. The criminal case puts pressure on the perpetrator. The civil case compensates you.

Someone Made a Deepfake of You?

We investigate using facial recognition technology, preserve the evidence, and take legal action under every applicable Illinois statute. Free case review.

Call (630) 839-9195 · Text (312) 489-8710

Right of Publicity Act: AI Digital Replicas

765 ILCS 1075/ (amended by P.A. 104-282, effective January 1, 2026)

The Illinois Right of Publicity Act has prohibited unauthorized commercial use of a person's identity since 1999. The legislature expanded it in P.A. 104-282 to explicitly cover AI-generated "digital replicas" — defined as electronic representations of a person's voice, likeness, or other identifying characteristics created using AI, so realistic that a reasonable observer would believe it is a genuine performance.

This is the broadest deepfake protection in Illinois because it is not limited to sexual content. It covers any unauthorized use of your digital replica, including:

  • AI-generated videos using your face
  • Cloned versions of your voice
  • Synthetic media impersonating you for any purpose
  • Commercial exploitation of your likeness through AI tools

Secondary Liability

The P.A. 104-282 amendments also create secondary liability for anyone who facilitates the unauthorized creation or distribution of a digital replica. This could extend to AI tool providers, platforms that host the content, and anyone in the distribution chain.

Remedies

Illinois courts have enforced the Right of Publicity Act with real teeth. In Blair v. Nevada Landing Partnership, the court applied the Act's $1,000 statutory minimum per violation even where actual damages are difficult to quantify. In Doe v. Flava Works, Inc., the court confirmed this floor applies regardless of whether the plaintiff can prove the defendant's profits. Available remedies include:

  • $1,000 statutory minimum per violation
  • Actual damages or the profits derived from the unauthorized use (whichever is greater)
  • Injunctive relief
  • Attorney fees for willful violations
  • Right survives death and is descendible (protects estates)

Key Case Law

Illinois courts have developed a robust body of Right of Publicity Act case law that will shape how the AI digital replica amendments are applied:

  • Blair v. Nevada Landing Partnership — Established that the Act supplanted common law publicity claims, created the $1,000 statutory minimum, and applied a one-year statute of limitations with a single publication rule.
  • Trannel v. Prairie Ridge Media, Inc. — Distinguished between "holding out" (using someone's identity as if they endorsed you) and "public use" (simply making someone's identity publicly visible), and held that unauthorized use of identity for commercial purposes does not require celebrity status.
  • Brown v. ACMI Pop Division — Rejected federal copyright preemption of Right of Publicity claims, meaning both copyright and publicity claims can proceed simultaneously when someone's likeness is used without authorization.
  • Schivarelli v. CBS, Inc. — Confirmed the news and public affairs exemption: uses of identity in connection with news reporting and commentary are protected.

This is your broadest tool. Unlike the deepfake sexual images statute, the Right of Publicity Act covers non-sexual deepfakes. If someone is using an AI version of your face or voice for any unauthorized purpose, whether commercial, political, or personal, this statute applies. And unlike BIPA, where the 2024 amendment capped damages at one recovery per person, the Right of Publicity Act allows recovery of the defendant's entire profits from the unauthorized use.

BIPA: The Biometric Privacy Angle

740 ILCS 14/ (Biometric Information Privacy Act, enacted 2008)

BIPA is not a deepfake statute. It is something better: a biometric data consent law with statutory damages that has generated over $1 billion in settlements against major technology companies. I have spent years building a research database on BIPA litigation — 200+ case summaries, 30 entity profiles, every major settlement tracked. This is the statute I know best.

Here is why it matters for deepfakes: to create a convincing deepfake of your face, AI tools must analyze your facial geometry. That is a biometric identifier under BIPA. If the tool collects, stores, or processes your facial geometry without your prior written consent, that is a BIPA violation. And unlike most privacy laws, BIPA lets you sue. Not the attorney general. You.

How BIPA Applies to Deepfakes

Section 15(a)

Companies must publish a retention schedule and destruction guidelines for biometric data. Most AI deepfake tools have no such policy.

Section 15(b)

Written notice and consent required before collecting facial geometry. Deepfake creators almost never obtain this consent.

Section 15(c)

No profiting from biometric data. Companies selling deepfake tools that use your face geometry for profit are exposed.

Section 15(d)

Cannot disclose biometric data to third parties without consent. Sharing facial geometry data between AI models or services violates this provision.

BIPA Damages

$1,000 per negligent violation or $5,000 per intentional/reckless violation. After the 2024 amendment, damages are capped at one recovery per person per violation type. But the statutory minimum still applies, and attorney fees are recoverable.
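The post-amendment math can be sketched in a few lines of Python. The violation mix below is hypothetical, invented for illustration, and assumes the one-recovery-per-person, per-violation-type reading described above:

```python
# Illustrative BIPA statutory-damages estimate under the 2024 amendment:
# one recovery per person, per violation type. Hypothetical facts only.
NEGLIGENT = 1_000
INTENTIONAL = 5_000

def bipa_statutory_damages(violations: dict[str, str]) -> int:
    """Map each BIPA section violated (e.g. '15(b)') to 'negligent' or
    'intentional'; each violation type counts once."""
    rates = {"negligent": NEGLIGENT, "intentional": INTENTIONAL}
    return sum(rates[level] for level in violations.values())

# Hypothetical deepfake-tool scenario: no retention policy (15(a)),
# no consent before collecting facial geometry (15(b)), and
# disclosure to a third-party model (15(d)).
claim = {"15(a)": "negligent", "15(b)": "intentional", "15(d)": "intentional"}
print(bipa_statutory_damages(claim))  # → 11000
```

Actual exposure turns on how a court counts violation types; treat this as arithmetic, not a valuation.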

BIPA is the statute that has generated over $1 billion in settlements against the largest technology companies in the world:

Company | Settlement | Biometric Data at Issue
Meta (Facebook) | $650,000,000 | Facial recognition photo tagging
Google | $100,000,000 | Google Photos face grouping
TikTok | $92,000,000 | Faceprint and voiceprint collection
Clearview AI | $51,750,000 | Facial recognition web scraping
Motorola / Vigilant | $47,500,000 | FaceSearch booking photo processing
Snap (Snapchat) | $35,000,000 | Lenses and filters facial geometry
ADP | $25,000,000 | Employee fingerprint timeclocks
UKG / Kronos | $18,600,000 | Employee fingerprint timeclocks

If someone is running your photos through an AI tool to create deepfakes, BIPA gives you a private right of action that no other state provides. Illinois is the only state where an individual can sue for biometric privacy violations and recover statutory damages without proving actual harm.

Digital Voice and Likeness Protection Act

815 ILCS 550/ (P.A. 103-830, effective August 9, 2024; amended by P.A. 104-282 and P.A. 104-417)

This law protects performers, content creators, and public figures from having their voices and likenesses exploited by AI without fair compensation or consent. It emerged directly from the 2023 SAG-AFTRA and WGA strikes, where AI-generated performances were a central bargaining issue.

The Act defines a "digital replica" as "a newly created, electronic representation of the identity of an actual individual created using a computer, algorithm, software, tool, artificial intelligence, or other technology" that is "so realistic that a reasonable observer would believe it is a performance by the individual being portrayed." This same definition is shared by the Right of Publicity Act amendments, creating a unified legal framework.

Voids Exploitative AI Contracts

A contract provision allowing creation and use of a digital replica is unenforceable for new performances fixed on or after January 1, 2026 if it meets all three of the following conditions:

  • It allows a digital replica to replace work the individual would otherwise have performed in person
  • It does not include a reasonably specific description of the intended uses of the digital replica
  • The individual was not represented by either legal counsel who negotiated the digital replica licensing terms, or a labor union whose CBA expressly covers digital replica uses

There is a safe harbor: a general description suffices when the intended uses are "consistent with the terms of the contract" and with the "fundamental character" of the original work as recorded or performed.

What This Means in Practice

If a production company or content platform wants to use AI to generate a performance using your face or voice, the contract must specifically describe what the AI replica will be used for, and you must have had the opportunity to be represented by a lawyer or union during the negotiation. Broad, all-encompassing IP assignment clauses that do not mention AI replicas are unenforceable for new AI-generated work.

Who this protects most: Actors, voiceover artists, musicians, influencers, and anyone entering contracts for creative work. The Act works alongside the Right of Publicity Act: the DVLPA prevents you from being locked into exploitative AI contracts going forward, while the Right of Publicity Act provides tort remedies if your identity is used without authorization at all. Together, they create a two-layer defense.

Your Face. Your Voice. Your Rights.

Illinois gives you more legal tools to fight deepfakes than any other state. We know how to use every one of them.

Call (630) 839-9195 · Text (312) 489-8710

Criminal Impersonation and Identity Theft

Beyond the deepfake-specific statutes, Illinois has strong pre-existing criminal laws that apply when deepfakes are used for fraud or impersonation.

Identity Theft (720 ILCS 5/16-30)

Using personal identifying information, including biometric data, to fraudulently obtain credit, goods, services, or other benefits is a Class 3 felony (2 to 5 years). When the victim suffers financial loss exceeding $100,000, it escalates to a Class 1 felony (4 to 15 years).

Deepfakes used to impersonate someone for financial gain, such as in a business email compromise, investment scam, or fraudulent identity verification, fall squarely within this statute.

False Personation (720 ILCS 5/17-2)

Fraudulently representing yourself as another person to obtain a benefit or cause harm ranges from a Class A misdemeanor to a Class 3 felony, depending on the context and amount of harm caused.

AI-Generated Child Sexual Abuse Material (P.A. 103-0775)

Illinois expanded its child pornography statute to explicitly cover AI-generated CSAM. Creating AI-generated sexual images of minors carries the same severe felony penalties as producing real CSAM, regardless of whether the depicted minor is a real person.

Federal Deepfake Laws Applicable in Illinois

TAKE IT DOWN Act (Signed May 19, 2025)

The first federal law specifically targeting deepfake intimate images. Key provisions:

  • Criminal offense to publish non-consensual intimate images, including AI-generated deepfakes
  • Penalties: Up to 2 years imprisonment; 3 years if the victim is a minor
  • Platform mandate: Social media platforms must remove reported NCII within 48 hours of receiving a valid request
  • Platforms must provide an accessible, user-friendly reporting mechanism

Practical impact: Before this law, platforms could ignore your takedown requests indefinitely. Now they have a 48-hour federal deadline. This does not replace your Illinois state law claims, which provide stronger damages, but it gives you a faster mechanism to get content removed while you pursue legal action.

DEFIANCE Act (Pending)

The DEFIANCE Act has passed the Senate and remains pending in the House. If enacted, it would create a federal civil cause of action for deepfake intimate images with damages of $150,000 to $250,000. This is not yet law, but it signals the direction of federal policy.

AI in Employment Decisions

775 ILCS 5/ (P.A. 103-0804 / HB 3773, effective January 1, 2026)

Illinois amended its Human Rights Act to make it a civil rights violation for employers to use AI systems that have the effect of discriminating against employees or applicants based on protected characteristics. The law also requires employers to notify employees when AI is used in employment decisions.

Two New Violations

  • Discriminatory AI use: Using AI for recruitment, hiring, promotion, discharge, or other employment decisions where the AI has the effect of discriminating based on race, sex, age, disability, or any other protected class. This operates on a disparate impact theory — no intent required.
  • Failure to provide AI notice: Failing to disclose to employees that AI is being used in employment decisions.

Enforcement

Violations carry penalties of up to $5,000 per violation through the Illinois Department of Human Rights. Individuals can obtain right-to-sue letters and pursue private actions for compensatory damages, punitive damages, and attorney fees. The law specifically prohibits using zip codes as a proxy for protected characteristics in AI decision-making.

Why This Matters for Deepfakes and Biometric Privacy

  • AI-powered hiring tools that use facial analysis or voice analysis to screen candidates are covered — creating dual liability under both HB 3773 and BIPA
  • The law uses identical AI definitions to the Digital Voice and Likeness Protection Act, reflecting a coordinated legislative approach (both were signed on August 9, 2024)
  • Employers using biometric-based AI tools face a triple compliance burden: BIPA consent, HB 3773 non-discrimination testing, and the older Illinois AI Video Interview Act (820 ILCS 42)

Damages Breakdown: What Can You Recover?

Different statutes provide different remedies. Here is a summary of what each law makes available to deepfake victims in Illinois.

Statute | Type | Minimum Damages | Additional Remedies
Non-Consensual Images (740 ILCS 190/) | Civil | $10,000 per defendant | Actual, punitive, injunction, attorney fees
Right of Publicity (765 ILCS 1075/) | Civil | $1,000 statutory minimum per violation | Actual damages or profits, injunction, attorney fees
BIPA (740 ILCS 14/) | Civil | $1,000 negligent / $5,000 intentional | Injunction, attorney fees
AI Employment (775 ILCS 5/, P.A. 103-0804) | Civil / Admin | $5,000 per violation (IDHR) | Compensatory, punitive, attorney fees via private action
Sexually Explicit Depiction (720 ILCS 5/11-23.7) | Criminal | N/A (fines up to $25,000) | 1 to 3 years imprisonment (Class 4 felony)
Identity Theft (720 ILCS 5/16-30) | Criminal | N/A | 2 to 15 years (Class 3 to Class 1 felony)
TAKE IT DOWN Act (federal) | Criminal | N/A | Up to 2 years imprisonment (3 if victim is a minor); 48-hour takedown

These stack. A single deepfake incident can give rise to claims under multiple statutes simultaneously. A deepfake sexual image of an Illinois resident could trigger civil claims under 740 ILCS 190/, the Right of Publicity Act, and BIPA, while also supporting criminal charges under 720 ILCS 5/11-23.7 and the federal TAKE IT DOWN Act. Your attorney should evaluate every available cause of action.
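The stacking of statutory floors can be illustrated with simple arithmetic. The scenario below is hypothetical (one image, one defendant, one negligent BIPA violation) and shows only the minimums, not what a case is actually worth:

```python
# Hypothetical floor for a single deepfake sexual image triggering
# three Illinois civil statutes at once, per the table above.
floors = {
    "740 ILCS 190/ (non-consensual images, per defendant)": 10_000,
    "765 ILCS 1075/ (right of publicity, per violation)": 1_000,
    "740 ILCS 14/ (BIPA, one negligent violation)": 1_000,
}
print(f"Combined statutory floor: ${sum(floors.values()):,}")  # → $12,000
```

Actual, punitive, and fee awards sit on top of these floors, which is why claim selection matters.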

You Have the Laws. Now Use Them.

Free case review. Flat-fee pricing. I use facial recognition technology to find what you cannot find on your own.

Call (630) 839-9195 · Text (312) 489-8710

Frequently Asked Questions About Deepfake Laws in Illinois

Are deepfakes illegal in Illinois?

Yes. Illinois has some of the strongest deepfake protections in the country. Deepfake sexual images carry both criminal penalties (Class 4 felony, 1 to 3 years) and civil liability ($10,000 minimum per defendant plus actual damages). Non-sexual deepfakes that use your face or voice without permission violate the Right of Publicity Act and potentially BIPA. The specific claims depend on what the deepfake is being used for and how it was created.

Can I sue someone for making a deepfake of me?

In most cases, yes. Your strongest claims are under the Right of Publicity Act (765 ILCS 1075/), which now explicitly covers AI-generated "digital replicas" of your likeness or voice. If the deepfake is sexual, 740 ILCS 190/ provides $10,000 in statutory damages per defendant plus actual and punitive damages. If the person creating the deepfake scanned your face using facial recognition technology, BIPA provides $1,000 to $5,000 per violation. You may also have claims for defamation, false light invasion of privacy, and intentional infliction of emotional distress.

How does BIPA protect me from deepfakes?

The Biometric Information Privacy Act (740 ILCS 14/) requires consent before anyone can collect, store, or share your biometric identifiers, including your facial geometry and voiceprint. Deepfake tools that scan your photos to map your facial geometry are collecting biometric data. If they do this without your written consent, that is a BIPA violation carrying $1,000 per negligent violation or $5,000 per intentional violation. Illinois is the only state with a private right of action for biometric privacy violations.

How much can I recover in a deepfake lawsuit?

It depends on the claims. Under the non-consensual intimate images statute (740 ILCS 190/), you get a minimum of $10,000 per defendant in statutory damages plus actual damages, punitive damages, and attorney fees. Under the Right of Publicity Act, you can recover actual damages or the profits the person made from using your identity. Under BIPA, statutory damages are $1,000 to $5,000 per violation. Defamation claims can yield compensatory damages and, in some cases, presumed damages without proving financial loss.

What should I do if someone made a deepfake of me?

First, preserve evidence immediately. Screenshot everything with timestamps. Do not contact the person who made it. Second, call a lawyer. You have both criminal and civil options in Illinois. The federal TAKE IT DOWN Act requires platforms to remove non-consensual intimate images within 48 hours of a report. Under Illinois law, you can pursue criminal charges (Class 4 felony), civil damages ($10,000 minimum), and injunctive relief. Time matters because evidence disappears and statutes of limitations apply.

Can I hold the platform liable for hosting a deepfake?

Section 230 of the Communications Decency Act generally shields platforms from liability for user-generated content. However, the federal TAKE IT DOWN Act now requires platforms to remove non-consensual intimate images (including deepfakes) within 48 hours of receiving a report. The Illinois Right of Publicity Act amendments also create secondary liability for anyone who "facilitates" unauthorized use of a digital replica, which could extend to platforms in some circumstances. This area of law is evolving rapidly.

How long do I have to file a deepfake claim in Illinois?

Timelines vary by claim. Defamation has a one-year statute of limitations from the date of first publication. The Right of Publicity Act also has a one-year statute of limitations with a single publication rule, meaning the clock starts at the first publication and does not restart with each subsequent view or repost (established in Blair v. Nevada Landing Partnership). BIPA has a five-year statute of limitations (established by the Illinois Supreme Court in Tims v. Black Horse Carriers). Non-consensual intimate image claims under 740 ILCS 190/ have a two-year statute of limitations, or one year from discovery if the images were distributed without the victim's knowledge. Do not wait. Evidence disappears and your window shrinks every day.
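The limitation periods above reduce to simple date arithmetic. This sketch is illustrative only; the trigger date is hypothetical, and accrual, discovery, and tolling rules can move every one of these deadlines:

```python
# Illustrative limitation-deadline arithmetic for the periods cited above.
# Not legal advice: accrual and tolling rules can change these dates.
from datetime import date

def add_years(d: date, years: int) -> date:
    try:
        return d.replace(year=d.year + years)
    except ValueError:  # Feb 29 landing on a non-leap target year
        return d.replace(year=d.year + years, month=3, day=1)

posted = date(2025, 4, 1)  # hypothetical date of first publication
print(add_years(posted, 1))  # defamation / publicity (1 year)  → 2026-04-01
print(add_years(posted, 2))  # 740 ILCS 190/ (2 years)          → 2027-04-01
print(add_years(posted, 5))  # BIPA per Tims (5 years)          → 2030-04-01
```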

Does BIPA apply if the deepfake was made outside Illinois?

BIPA protects Illinois residents. If your biometric data (facial geometry) was collected from photos of you as an Illinois resident, BIPA may apply regardless of where the person creating the deepfake is located. Courts have found personal jurisdiction over out-of-state companies that process biometric data of Illinois residents. However, this is a developing area of law and the analysis depends on the specific facts.

(630) 839-9195