Someone Is Using Your Photos.
You're the Identity Theft Victim.
Someone is using your photos on Tinder, Bumble, or Hinge to catfish people. Maybe your content is being turned into deepfakes you never consented to. You're not the catfish victim — you're the identity theft victim. Illinois law gives you real remedies: copyright claims, right of publicity, criminal identity theft charges, and new federal deepfake penalties.
Is Catfishing Illegal? What Illinois Law Actually Says
Yes, catfishing is illegal in Illinois. Using someone else's photos to create a fake dating profile can violate Illinois identity theft statutes (720 ILCS 5/16-30), the Right of Publicity Act (765 ILCS 1075), federal copyright law, and the TAKE IT DOWN Act if deepfakes are involved.
But here's what every other article about catfishing gets wrong: they're written for the person who got tricked — the person who fell for a fake dating profile. Nobody is writing for you: the person whose photos were stolen to build that fake profile in the first place.
That distinction matters legally. You're not a romance scam victim. You're an identity theft victim — and you have significantly more legal standing, more available remedies, and higher potential damages than the person who was catfished.
"How to spot a catfish on dating apps"
"Signs your online date isn't real"
"What to do if you've been catfished"
Written for the person who got tricked. The person who swiped right on a fake profile.
"Someone stole my photos and put them on Tinder"
"My face is being used in deepfake content I never consented to"
"Fake profiles using my images keep appearing on multiple platforms"
Written for the person whose identity was stolen. You have copyright claims, right of publicity claims, and identity theft claims the catfished person doesn't have.
Why Content Creators Are the Biggest Targets
If you create content for a living — or even just have a public-facing profile — your photos are being harvested. An estimated 50–70% of paid OnlyFans content is stolen, and an estimated 88–92% of romance scam profiles use real people's photos. Here's who we see this happen to most:
Instagram Influencers & Models
Public photos downloaded from IG and TikTok, then used to create fake personas on Tinder, Bumble, Hinge, and Facebook Dating. Brand damage and professional reputation at stake.
OnlyFans & Adult Content Creators
Content scraped, leaked, or ripped from the platform. Photos used on fake dating profiles or sold through scam accounts on Telegram and Reddit.
Fitness & Lifestyle Creators
High-quality physique and lifestyle photos scraped for crypto romance scams. Your fitness photos are being used to catfish people into sending money.
Real Estate Agents & Professionals
Professional headshots widely available online. Fake dating profiles built on your name and face can damage your professional reputation with clients who find them.
Deepfake Victims
AI tools scrape your public photos to generate fake intimate content. This is now a federal crime under the TAKE IT DOWN Act, and platforms are required to remove such content within 48 hours of notice.
The "Whack-a-Mole" problem: You report a fake profile on Tinder. It comes down. A new one appears on Bumble the next week. Another surfaces on Hinge. Platform reporting was designed for one-off problems, not serial impersonation. When the same person — or multiple people — keep creating fake profiles with your photos, you need a legal solution that targets the source, not the symptom.
Tired of Playing Whack-a-Mole With Fake Profiles?
Platform reporting treats symptoms. Legal action treats the source. We identify who is behind the fake profiles and make them stop. Free consultation.
What to Do Right Now If Someone Is Using Your Photos
Document everything
Screenshot every fake profile — the photos used, the profile bio, the username, the URL, and the platform. Include timestamps. Do this before you report, because reported profiles disappear and you lose your evidence.
Report to the platform
Every platform has an impersonation report option. On Tinder, select "Someone is impersonating me." On Bumble, select "Stolen Photo." On Instagram, use the IP (intellectual property) report form. This is a necessary step, but it won't stop the problem from recurring.
File a DMCA takedown notice
If you took the photos (selfies, directed shoots, content you created), you own the copyright. A DMCA takedown notice legally compels the platform to remove the content. This is more powerful than a standard impersonation report and creates a legal paper trail.
Consider filing a police report
If your photos are being used to commit fraud (romance scams, money solicitation), this is criminal identity theft under Illinois law (720 ILCS 5/16-30). A police report strengthens your civil case and may trigger a criminal investigation.
Consult an attorney
When self-help isn't working — when new fake profiles keep appearing, when the impersonator is unknown, when your content is being sold or used for fraud — a lawyer can file John Doe subpoenas to unmask anonymous impersonators, send cease and desist letters with legal consequences, and pursue civil damages.
Can You Sue Someone for Catfishing With Your Photos?
Yes. Using someone else's photos to create a fake dating profile violates multiple Illinois and federal laws. Here are the legal theories available to you — and the damages you can recover:
| Law | What it covers | Potential damages |
|---|---|---|
| IL Right of Publicity Act 765 ILCS 1075 | Unauthorized use of your name, photo, image, or likeness. Now includes AI-generated "digital replicas" since HB 4875 (2024). | Actual damages + profits + attorney fees + injunction |
| Copyright Infringement 17 U.S.C. §§ 101–810 | If you took the photos (selfies, directed shoots), you own the copyright. Unauthorized use is infringement. | Up to $150,000 per photo (if registered); actual damages + profits |
| IL Digital Forgeries Act HB 2123 | Sexually altered AI-generated images (deepfake porn) created or distributed without consent. | Up to $10,000 per defendant + injunctive relief |
| TAKE IT DOWN Act Federal, signed May 2025 | First federal law criminalizing nonconsensual intimate images including AI deepfakes. Platforms must remove within 48 hours. | Up to 2 years imprisonment (criminal) |
| DEFIANCE Act Federal, passed Senate Jan 2026 | Federal civil cause of action for nonconsensual intimate digital forgeries. 10-year statute of limitations. | $150,000–$250,000 liquidated damages + attorney fees |
| IL Identity Theft 720 ILCS 5/16-30 | Using your identifying information to fraudulently obtain money, credit, goods, or services. | Class 4 felony: 1–3 years (criminal) |
| DMCA Takedown 17 U.S.C. § 512 | Compels platforms to remove infringing content. Subpoena power to identify anonymous infringers. | Content removal + identification of infringers |
You likely have more legal options than you think. Most content creators qualify for multiple overlapping claims — copyright, right of publicity, identity theft, and potentially the new federal deepfake laws. More claims means more leverage in settlement negotiations and higher potential damages.
Find Out Which Laws Protect You
Every case is different. A free 30-minute consultation tells you exactly which legal claims apply to your situation, what realistic outcomes look like, and what it would cost. Confidential. No obligation.
How We Help Content Creators Fight Back
We identify the impersonator
We file John Doe lawsuits and subpoena dating platforms, social media companies, and ISPs for account holder information — IP addresses, email addresses, phone numbers. We work with licensed private investigators for cases that require deeper cyber investigation.
We force the content down
DMCA takedown notices through counsel carry more weight than self-filed notices. Cease and desist letters with litigation behind them get results. For deepfakes, the TAKE IT DOWN Act now requires platforms to remove the content within 48 hours of a valid notice.
We hold them accountable
Civil litigation for damages under the Right of Publicity Act, copyright law, and federal statutes. Criminal referrals to law enforcement where appropriate. The goal is not just removal — it's making sure this stops permanently.
Justin Abdilla
I built an identity protection practice because I saw how badly the legal system was failing people whose photos and identities were being stolen. Platform reporting is slow, unreliable, and doesn't address the root cause. I use facial recognition technology to find unauthorized uses of your likeness, DMCA and legal takedowns to force content removal, and litigation to hold impersonators accountable. If someone is using your face to deceive people, I'll tell you exactly what we can do about it and what it costs before we start.
Common Questions
Is it illegal for someone to use my photos on a fake dating profile?
Yes. Using someone else's photos to create a fake dating profile can violate Illinois identity theft law (720 ILCS 5/16-30), the Right of Publicity Act (765 ILCS 1075), and federal copyright law if you took the photos. If deepfake intimate images are involved, Illinois's Digital Forgeries Act and the federal TAKE IT DOWN Act both apply. The specific claims available depend on what the impersonator is doing with your photos and how they obtained them.
Can I sue the person using my photos?
In most cases, yes. If you created the photos (selfies, directed shoots, content you produced), you own the copyright and unauthorized use is infringement — up to $150,000 per photo if registered. Your likeness is also protected under Illinois's Right of Publicity Act. If the fake profile is being used to commit fraud (romance scams, money solicitation), that's criminal identity theft under 720 ILCS 5/16-30. An attorney can pursue all three simultaneously.
Why isn't reporting the fake profile to the platform enough?
Platform reporting treats symptoms, not the source. It removes one profile but cannot identify who is behind it, cannot stop them from creating new profiles, and cannot compensate you for the damage already done. A lawyer can subpoena the platform for account holder information (IP address, email, phone), identify the person behind the fake accounts, and get a court injunction that carries the force of law. Violating a court order is contempt — a much stronger deterrent than a platform ban.
What is the TAKE IT DOWN Act?
The TAKE IT DOWN Act is a federal law signed in May 2025 that specifically addresses nonconsensual intimate images, including AI-generated deepfakes. It requires online platforms to remove such content within 48 hours of receiving a valid notice. Violations carry criminal penalties. If someone has created deepfake intimate images of you, this law gives you a fast-track removal right that exists independently of whether you can identify the creator.
Can I sue over deepfakes made from my photos?
Yes, under multiple laws. Illinois's Digital Forgeries Act (HB 2123) provides a civil cause of action with $10,000 in statutory damages per defendant for sexually altered AI-generated images created without consent. The federal TAKE IT DOWN Act criminalizes publishing nonconsensual intimate deepfakes. The federal DEFIANCE Act (passed Senate January 2026) adds a civil claim with $150,000–$250,000 in liquidated damages. You may also have Right of Publicity claims under Illinois HB 4875 (2024), which now explicitly covers AI-generated "digital replicas."
How do you find out who is behind a fake profile?
We start with what's publicly available: the profile username, linked accounts, reverse image searches, and any contact information. Then, if that's not enough, we file a John Doe lawsuit and subpoena the platform for account registration data — IP address, email, phone number, payment information. We work with a licensed private investigator for cases that require deeper cyber investigation, including social media scraping and facial recognition searches across public platforms.
How much does legal action cost?
It depends on what's needed. A cease and desist letter with legal backing typically costs $750–$1,500 and stops most impersonators cold. For cases that require identifying an anonymous impersonator through court subpoenas, the cost scales with the complexity of the investigation. We give you a clear cost estimate before you commit to anything. Nationally, identity theft litigation typically costs $14,000–$16,000; we structure our fees so that most cases resolve well below that threshold.
Your Photos. Your Identity. Your Rights.
Illinois law gives you real remedies when someone steals your photos for fake profiles or deepfakes. Free case evaluation. Straight answers. An attorney who actually understands content creator identity theft.