

How to Opt Out of Public Facial Recognition Databases: A Complete Guide


Facial recognition technology is increasingly used in public spaces, law enforcement, and even online platforms. 

While it can enhance security, it also raises significant privacy concerns. If you value your privacy, you may want to opt out of facial recognition databases where your biometric data could be stored or used.

This guide will walk you through what public facial recognition databases are, why opting out matters, and actionable steps you can take in the UK (and beyond).


1. Understanding Public Facial Recognition Databases


Public facial recognition databases are systems that store facial images and biometric data for identification purposes. 


They include:

• Law enforcement databases: Images captured from CCTV, bodycams, or public cameras.

• Government systems: Passport, ID, or border control biometric systems.

• Commercial platforms: Social media sites or photo-sharing apps that automatically tag faces.


These databases may be used for:

• Verifying identities in public spaces

• Preventing fraud or crime

• Targeted advertising or analytics


Important: Some systems are regulated, while others—especially commercial databases—may have minimal oversight.


2. Why You Might Want to Opt Out


Opting out helps protect you from:

• Privacy invasion: Unconsented tracking in public or online spaces.

• Misidentification: Facial recognition systems are imperfect and misidentify some people at higher rates, particularly members of demographic groups underrepresented in training data.

• Data misuse: Biometric data could be shared with third parties or used for marketing.


3. Legal Considerations in the UK


In the UK, your rights are primarily governed by:

• Data Protection Act 2018 & UK GDPR: Give you rights to access, correct, and request deletion of your personal data.

• Human Rights Act 1998: Protects privacy and freedom from arbitrary surveillance.

• ICO Guidance: The Information Commissioner’s Office provides advice on facial recognition and biometric data use.


⚠ Note: Opting out is not always guaranteed. Some law enforcement or government systems require data collection for legal reasons.


4. Steps to Opt Out of Government or Law Enforcement Databases


Step 1: Identify Relevant Authorities

• Local police forces

• National government biometric systems (passport, ID cards)

• Public CCTV operators


Step 2: Submit a Data Subject Access Request (DSAR)

Request access to your personal data stored in these databases.

You can ask for:

• Copies of your facial images

• The purpose of collection

• Who has access


Step 3: Request Deletion (Where Possible)

Under the UK GDPR right to erasure, you can request deletion of personal data where there is no legal basis for continued retention.

For law enforcement databases, deletion may not always be granted. Always check the relevant policy or guidance.


Step 4: Contact the ICO for Assistance

If authorities refuse to delete your data, you can escalate complaints to the Information Commissioner’s Office.

They oversee lawful processing of personal data and can investigate misuse.


5. Steps to Opt Out of Commercial Facial Recognition Databases


Step 1: Review Privacy Policies

Check if platforms like Facebook, Google Photos, LinkedIn, or other apps use facial recognition.

Look for opt-out instructions, usually under privacy or security settings.


Step 2: Disable Automatic Tagging

Most social media platforms allow you to turn off facial recognition features:

• Prevent automatic tagging

• Limit who can identify you in photos

• Stop the platform from using your facial data in recommendations


Step 3: Remove Uploaded Images

Delete photos containing your face that might be used to train recognition models.

Consider contacting the platform to remove copies that may exist in backup or analytical systems.


Step 4: Use Privacy Tools

• Use privacy masks or obfuscation tools on photos before uploading.

• Consider apps or browser extensions that warn when websites try to collect facial data.
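One low-effort privacy step you can take yourself is stripping metadata from photos before uploading them: EXIF data can embed GPS coordinates and device details alongside an image of your face. The minimal sketch below, in plain Python with only the standard library, removes EXIF-carrying APP1 segments from a JPEG. The function name `strip_exif` is just illustrative, and the parser is deliberately simplified.

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove APP1 segments (EXIF/XMP metadata) from a JPEG byte stream.

    Simplified sketch: walks the marker segments before the image data
    and drops any APP1 (0xFFE1) segment, copying everything else.
    """
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG file")
    out = bytearray(b"\xff\xd8")  # keep the Start of Image marker
    i = 2
    while i < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            # Unexpected data: copy the rest verbatim and stop parsing.
            out += jpeg_bytes[i:]
            break
        marker = jpeg_bytes[i + 1]
        if marker in (0xD9, 0xDA):
            # End of Image or Start of Scan: image data follows,
            # so copy everything remaining unchanged.
            out += jpeg_bytes[i:]
            break
        # Every other segment here carries a 2-byte big-endian length
        # that includes the length field itself.
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        segment = jpeg_bytes[i:i + 2 + length]
        if marker != 0xE1:  # keep everything except APP1 (EXIF/XMP)
            out += segment
        i += 2 + length
    return bytes(out)
```

Real photos may also carry identifying data elsewhere (IPTC in APP13, embedded thumbnails), so treat this as a starting point; a well-tested tool such as ExifTool is the safer choice in practice.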


6. Practical Tips for Reducing Your Presence in Facial Recognition Systems


Physical privacy measures:

• Wear hats, sunglasses, or masks in public (where permitted).

• Consider infrared-blocking or anti-surveillance glasses (legality varies by jurisdiction).


Digital hygiene:

• Avoid uploading facial images to untrusted platforms.

• Adjust social media privacy settings.

• Use photo editors, or background-removal services like Remove.bg, to obscure faces or identifying context in public uploads.


Stay informed:

• Monitor local council or police announcements about new surveillance initiatives.

• Sign up for privacy newsletters or advocacy groups.


7. Challenges and Limitations


• Government and law enforcement systems: Often cannot be fully opted out of due to legal requirements.

• Commercial systems: Data may still exist even after deletion requests.

• Surveillance creep: New systems may appear without clear opt-out options.

The goal is risk reduction, not complete invisibility.


Final Thoughts

Facial recognition is becoming an unavoidable part of modern life. While full opt-out is not always possible, being proactive—through data requests, privacy settings, and practical measures—can significantly reduce your exposure.

Empowering yourself with knowledge and tools ensures you maintain control over your personal identity, both online and in public spaces.
