Checklist: What to Do If Your Photos or Videos Are Used to Train an AI Without Consent
Practical, step-by-step checklist to identify misuse, demand dataset opt-outs, issue takedowns, and file DPA complaints when an AI has used your images without consent.
Someone used your photos or videos to train an AI without asking? Here’s exactly what to do — step by step
If your image or video appears in an AI model’s output or dataset and you never consented, you’re not alone — this problem exploded in late 2025 and into 2026. High-profile incidents (including lawsuits arising from AI systems generating sexualised or deceptive images) show platforms and model owners are under scrutiny. The good news: there are concrete legal, technical and practical routes to force removal, demand opt-outs and lodge regulator complaints. This checklist walks you through identifying misuse, gathering evidence, issuing takedown and opt-out requests, and escalating to Data Protection Authorities (DPAs) — with ready-to-use templates.
Why act now? (The 2026 context)
Regulators and courts in 2025–2026 increasingly treat unauthorised use of photos and videos in AI training as a serious privacy and copyright concern. Following multiple incidents where generative AIs produced sexualised deepfakes or recognisable likenesses without consent, regulators in the UK and EU have sharpened guidance and enforcement priorities. Platforms and several model providers now face reputational and legal risk — which strengthens your leverage when demanding dataset removal or opt-outs.
Quick checklist — immediate actions (first 48 hours)
- Preserve evidence: take time-stamped screenshots, save URLs and copy the model outputs that show your image or likeness.
- Record originals: secure original files (photos, videos) and note metadata, upload times, location and any witnesses.
- Run reverse image searches: use Google Images, TinEye, and specialised AI-output search tools to find copies and derivatives.
- Note hosting details: identify where the AI provider hosts models, the platform used (social site, chatbot, gallery) and any dataset names shown in disclosures.
- Check platform reporting tools: most major platforms have “report” or takedown channels for privacy, intellectual property, or deepfakes — use them immediately.
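The “preserve evidence” steps above can be partly scripted so every capture is self-documenting. This is a minimal sketch (Python, standard library only; the example URL and `evidence` folder name are placeholders) that saves a copy of a page under a filename embedding the UTC capture time:

```python
import re
import urllib.request
from datetime import datetime, timezone
from pathlib import Path

def snapshot_name(url: str) -> str:
    """Build a filename that embeds the URL and the UTC capture time."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    slug = re.sub(r"[^A-Za-z0-9]+", "_", url).strip("_")[:80]
    return f"{stamp}_{slug}.html"

def save_snapshot(url: str, out_dir: str = "evidence") -> Path:
    """Fetch a page and save the raw HTML under a timestamped name."""
    html = urllib.request.urlopen(url, timeout=30).read()
    path = Path(out_dir) / snapshot_name(url)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_bytes(html)
    return path

# Offline demo of the naming scheme (no network call):
print(snapshot_name("https://example.com/gallery/123"))
```

A raw HTML save like this complements, rather than replaces, screenshots and full-page PDF prints: it preserves the underlying markup and any dataset names the page discloses.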
Evidence checklist — what to collect
- Original file(s) with EXIF/metadata where available (download originals from your device)
- Screenshots with visible timestamps and URLs
- Full-page HTML saves or PDF prints of pages showing the AI output
- Video clips with timestamps or logs showing when your likeness appears
- Reverse-search results (links and screenshots)
- Any correspondence with platforms or model owners
- Witness statements or social posts referencing the content
- Hashes (SHA256) of original files to prove identity if needed
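The SHA-256 hashes mentioned above are straightforward to produce yourself. A minimal sketch (Python standard library; `original_photo.jpg` is a placeholder for your own files):

```python
import hashlib
from pathlib import Path

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Replace with the paths of your original photos/videos.
sample = Path("original_photo.jpg")
if sample.exists():
    print(sample.name, sha256_of(str(sample)))
```

Record each digest alongside the file name, capture date and URL in your evidence folder; a matching hash later proves the file you hold is byte-for-byte identical to the one you reference in a notice.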
Step 1 — Identify where the misuse happened
Before sending notices, be precise. Was your image used by:
- a platform (social network, image host);
- a model provider (company selling an API, an open-source model repo);
- a dataset curator (research dataset or third-party data broker); or
- a downstream service (an app embedding a model).
Finding the right controller/respondent determines the law you rely on: copyright takedowns for hosts, data protection (GDPR/UK-GDPR) rights for personal data processing, or consumer protection when a company uses your images in commercial training without consent.
Step 2 — Immediate takedown and preserve requests
Use platform reporting tools first — they’re fast. For platforms or hosts that ignore reports, send a formal notice with these dual tracks:
- Copyright or DMCA-style takedown (if you own IP): demand removal of copies and derivatives and provide proof of ownership or original file metadata.
- Privacy/GDPR erasure request: ask the controller to delete your personal data, including any processed embeddings or model outputs that use your likeness.
Sample takedown to a platform (quick template)
Use this when posting on a social platform or image host:
To: [Platform abuse/DMCA/Privacy team]
Subject: Urgent – Unauthorised use of my image/video and request for removal
I am the rights holder/subject in the image(s) and video(s) located at: [URL(s)]. These files show my likeness and were used to train or are being distributed by your service without my consent. Please remove all copies and derivatives immediately and confirm deletion in writing. I also request preservation of all logs and copies for the period necessary for enforcement. Contact me at: [email/phone].
Step 3 — Formal data-protection notices (DSARs, erasure, objection)
Under the UK GDPR and EU GDPR you have rights that are powerful in AI-training cases: access to data, erasure (the “right to be forgotten”), objection to processing, and sometimes portability. Use them to force disclosure of whether your data was used, and to demand deletion.
How to structure a GDPR/UK-GDPR request
- Identify yourself clearly and attach proof (ID where necessary, but send minimal sensitive proofs if possible).
- State which right you’re exercising (access, erasure, objection, portability).
- Give URLs, timestamps, and hashes of the images/videos you claim were used.
- Ask for specific actions and evidence of deletion (e.g., deletion logs, confirmation of removal from training datasets and model weights where feasible).
- Set a clear deadline (the law requires a response within one month; the controller may extend by up to two further months only for complex or numerous requests, and must tell you within the first month).
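To track that one-month clock, a small date calculation helps. This sketch assumes the common reading of “one calendar month” that clamps to the last day of the target month (e.g. a request received on 31 January falls due at the end of February); check your DPA’s guidance for the authoritative calculation:

```python
import calendar
from datetime import date

def response_deadline(received: date, months: int = 1) -> date:
    """Add calendar months, clamping to the last day of the target month."""
    month_index = received.month - 1 + months
    year = received.year + month_index // 12
    month = month_index % 12 + 1
    day = min(received.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(response_deadline(date(2026, 1, 31)))            # 2026-02-28
print(response_deadline(date(2026, 1, 31), months=3))  # with extension
```

Diarise the base deadline when you send the request; if the controller invokes the extension, recompute with `months=3` and note the date they notified you.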
Template: Erasure (right to be forgotten) request
To: Data Protection / Privacy Officer, [Company]
Subject: Request for erasure under [UK GDPR / GDPR] – Unauthorised use of my image
I am writing under Article 17 of the [UK GDPR / GDPR] to request the erasure of my personal data consisting of images/videos and any derived data used to train or generated by your AI systems. Details: [list URLs, timestamps, file hashes]. Please:
- Confirm in writing whether my data was processed and provide copies where applicable.
- Delete all copies and derivatives, including deletion from datasets and model training sets.
- Confirm deletion with supporting evidence (deletion logs or certificate) within one month.
If you are not the controller, please forward this to the controller and notify me who that is. I reserve the right to complain to the ICO or another DPA and to pursue legal remedies.
Step 4 — Demand an opt-out from training datasets
Many model owners and dataset curators now offer or are being pressured into providing opt-outs. Request a specific confirmation that your image/video and any derived embeddings were removed from training sets and will not be used in future re-training.
What to request specifically
- Confirmation whether your file was included in any named dataset or snapshot.
- Immediate removal and prevention of re-ingestion in future training.
- Proof of removal: dataset manifests, logs or attestations.
- Commitment that models trained on data including your image will not be deployed, or that mitigation steps (retraining, model surgery) will be taken.
Sample opt-out request to a model provider or dataset curator
To: [Model Provider / Dataset Curator]
Subject: Demand for opt-out and removal from training datasets
I request immediate removal of all instances of my image/video ([list URLs and file hashes]) from any datasets you manage and from any models trained using those datasets. Please confirm:
- Whether my data appears in any dataset or training snapshot (identify names and versions).
- That you will remove my data and any embeddings derived from it, and will not use it in future training.
- That you will provide dated evidence of removal and of the measures taken to prevent re-ingestion.
If you decline, please provide the legal basis for processing my data and the contact details of your data protection officer. I will notify the relevant DPA if my request is not satisfied promptly.
Step 5 — When to file a DPA complaint (and which one)
If the controller fails to act or replies inadequately, file a complaint with a Data Protection Authority. In the UK, that is the ICO. In the EU, file with the DPA in the country where the controller is based or your country’s DPA if that controller targets you directly.
What to include in a DPA complaint
- Summary of events and timeline
- Copies of your original request(s) and any responses
- Evidence you collected (screenshots, URLs, hashes)
- Precise legal grounds (e.g., unlawful processing, failure to comply with erasure request)
- What remedy you seek (deletion, opt-out confirmation, compensation, injunction)
Practical tip on jurisdiction
Controllers offering services in the UK/EU, or processing the data of UK/EU residents, fall under the UK GDPR or EU GDPR. If the company is based outside Europe, you can still approach your local DPA; cross-border coordination is common. Expect investigations to take weeks to months, but DPAs increasingly prioritise AI cases in 2025–2026.
Step 6 — Use copyright and platform-specific rules in parallel
Copyright can be faster—especially under DMCA-style processes for US-hosted platforms. If you created the image/video, a copyright takedown can force rapid removal. Always run both tracks: intellectual property and data protection.
When copyright isn’t available
If you don’t own the copyright (e.g., a candid photo taken by someone else), lean on privacy, personality rights, defamation and data-protection claims. Many platforms treat non-consensual sexual images or deepfakes as policy violations and will remove them on that basis.
Step 7 — Advanced actions and escalation
- Preserve litigation options: ask platforms to preserve logs and pull their retention schedules in writing.
- Get legal advice: specialist privacy or IP lawyers can obtain injunctive relief if the content is especially harmful (sexualised deepfakes, minors, repeat misuse).
- Collect forensic help: forensic analysts can extract watermarks or identify model fingerprints indicating which dataset or model generated an output.
- Contact journalists or consumer groups: media attention can prompt faster corporate action, but weigh privacy trade-offs before going public.
- Consider civil actions: privacy, misuse of likeness and breach of data protection laws can support damages claims in some cases.
How to ask for proof of deletion and what to accept
Not all providers can delete “model weights” that encode your likeness without retraining. But they can and should:
- Delete raw copies from datasets and backups
- Remove identifiers and embeddings derived from your files
- Provide attestation of deletion and an explanation of technical limits
Ask for signed or logged confirmations, exact timestamps and the name of the person handling the request. If the provider claims deletion is technically impossible, request proof that it will not be used in future training or inference and ask for mitigation like targeted model surgery or rollout freezes.
Recent trends & future predictions (2026)
Late 2025 and early 2026 saw a surge in regulatory scrutiny, high-profile lawsuits and a market shift. Expect:
- More platforms publishing dataset manifests and provenance details.
- Wider adoption of opt-out portals for individuals (some providers already piloted them in 2024–25).
- Technical advances in watermarking and model-attribute frameworks that make attribution easier — useful evidence in complaints.
- Stronger DPA enforcement priorities for AI misuse — resulting in faster action on complaints.
These trends improve your leverage: companies are more risk-aware and increasingly need to show due diligence to avoid sanctions and reputational harm.
Common pushbacks you’ll get — and how to respond
- “We used public web data” — reply: Public availability does not equal consent under data protection rules; ask for dataset identifiers and legal basis for processing.
- “Technical deletion impossible” — ask for mitigation, attestations and non-deployment commitments; threaten DPA complaint and injunctive relief if harm is severe.
- “No evidence you were used in training” — provide hashes, reverse-search evidence and explain model outputs using your likeness; request a full disclosure via a DSAR.
Practical examples (mini case studies)
Case 1 — Fast platform removal
A UK user found a chatbot producing sexualised images of them. They took screenshots, reported via the platform’s “non-consensual imagery” policy and sent a GDPR erasure request. The platform removed the images within 72 hours and suspended the specific model rollout pending review.
Case 2 — Dataset opt-out enforced after DPA complaint
Someone discovered their public Instagram images used in a research dataset. After an erasure request was ignored, they filed a DPA complaint. The DPA opened an inquiry; within months the dataset curator issued an opt-out process and removed the complainant’s images from the dataset snapshot.
When to get a lawyer (and what to expect)
Contact a specialist if:
- The content involves sexualised images, minors, or serious reputational harm;
- Your erasure/opt-out requests are ignored;
- You need preservation letters or injunctive relief to prevent further distribution.
Lawyers can issue preservation letters, draft stronger legal notices, and pursue emergency court orders. They’ll also prepare you for a DPA investigation or civil claim if necessary.
Final practical tips — keep control, stay organised
- Use a single folder for all evidence, correspondence and copies.
- Log dates of every action and the names/IDs of people you spoke to.
- Use secure channels (encrypted email) for sensitive exchanges.
- Don’t share more sensitive personal data than necessary with platforms when proving identity.
- Retain copies of everything even after deletion — you may need them for a DPA or court.
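The “log every action” tip is easier to keep honest with a small append-only log. A sketch (Python standard library; the `evidence/action_log.csv` path and the example entry are placeholders):

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG = Path("evidence/action_log.csv")

def log_action(action: str, contact: str = "", reference: str = "") -> None:
    """Append a UTC-timestamped row; never rewrites earlier entries."""
    is_new = not LOG.exists()
    LOG.parent.mkdir(parents=True, exist_ok=True)
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["utc_time", "action", "contact", "reference"])
        writer.writerow(
            [datetime.now(timezone.utc).isoformat(), action, contact, reference]
        )

log_action("Sent GDPR erasure request", "privacy@example.com", "ticket #123")
```

An append-only CSV in your evidence folder gives a DPA or a court a clean, dated timeline of who you contacted and when, without relying on memory.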
Key takeaways
- Act fast: preserve evidence and use platform reports immediately.
- Parallel tracks win: use copyright takedowns, platform policy flags and GDPR/UK-GDPR notices together.
- Demand opt-outs and proof: ask model owners and dataset curators for explicit removal and attestations.
- Escalate to DPAs: if a controller ignores you, file a complaint with the ICO or the relevant EU DPA.
- Get expert help: forensic analysts and specialist lawyers can be decisive for serious harms.
Ready-to-use templates
Use the templates above as a starting point. For bespoke letters tailored to your jurisdiction and the exact facts, contact a privacy lawyer or use a consumer-advocacy service that specialises in AI harms.
Closing — what we recommend you do next
If you suspect your images or videos were used to train an AI without consent, follow this checklist now: preserve evidence, report to the platform, send a GDPR erasure and opt-out request to the model provider, and be prepared to file a DPA complaint. Regulators and courts in 2025–2026 are taking these complaints more seriously — your action matters.
Need help? If you want our free checklist in editable form or a pre-written complaint tailored to the ICO or a European DPA, visit Complains.uk or contact a specialist privacy lawyer. Take control: secure deletion, opt-outs and accountability are within reach.