When AI Finds You Without Your Face: The Next Frontier of Law Enforcement Surveillance

Beyond Faces: How New AI Tools Let Police Track People Without Facial Recognition — And Why It’s Raising Alarm Bells

By Chandrakumar Pillai


AI is transforming law enforcement, but not always in the ways we expect, and certainly not without raising serious questions about privacy, civil liberties, and ethics.

A new type of AI technology, called Track, is offering police and government agencies in the US the ability to track people across video footage — even in areas where facial recognition is banned or heavily restricted.

Developed by Veritone, a video analytics company, Track doesn't use faces to follow people. Instead, it uses other attributes like body size, hair color, clothing, accessories, and even gender.

This allows law enforcement to build timelines of a person’s movements across multiple cameras, public videos, and police footage — without ever using biometric facial recognition data.

For civil liberties experts, this raises urgent questions about how these technologies could be abused and how existing laws may be inadequate to protect people’s rights.


How Does Track Work?

Unlike facial recognition systems, Track uses non-biometric characteristics to monitor individuals (a simplified sketch of how such attribute matching might work follows the list below).

These include:

  • Body size and shape
  • Gender presentation
  • Hair color and style
  • Clothing, shoes, and accessories (backpacks, hats, scarves, and so on)
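
Veritone has not published Track's internals, so the snippet below is only a rough, hypothetical sketch of the general idea behind attribute-based matching: reduce each sighting of a person to a set of descriptive attributes, then link sightings that share enough of them into a timeline. All attribute names, values, and thresholds here are illustrative assumptions, not details of the real product.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: NOT Veritone's actual implementation.
# A "detection" is one sighting of a person in one camera frame,
# described only by non-biometric attributes.
@dataclass
class Detection:
    camera_id: str
    timestamp: str           # e.g. "2025-05-12T14:03:00"
    attributes: dict = field(default_factory=dict)

def similarity(a: Detection, b: Detection) -> float:
    """Fraction of shared attribute keys that have the same value."""
    shared = set(a.attributes) & set(b.attributes)
    if not shared:
        return 0.0
    matches = sum(a.attributes[k] == b.attributes[k] for k in shared)
    return matches / len(shared)

def build_timeline(query: Detection, footage: list[Detection],
                   threshold: float = 0.8) -> list[Detection]:
    """Link every detection that resembles the query person,
    ordered by time, to approximate a movement timeline."""
    hits = [d for d in footage if similarity(query, d) >= threshold]
    return sorted(hits, key=lambda d: d.timestamp)

# Illustrative usage with made-up data.
query = Detection("cam_01", "2025-05-12T14:00:00",
                  {"build": "tall", "hair": "dark", "jacket": "red",
                   "bag": "black backpack"})
footage = [
    Detection("cam_02", "2025-05-12T14:07:00",
              {"build": "tall", "hair": "dark", "jacket": "red",
               "bag": "black backpack"}),
    Detection("cam_03", "2025-05-12T14:20:00",
              {"build": "short", "hair": "blond", "jacket": "blue",
               "bag": "none"}),
]
for d in build_timeline(query, footage):
    print(d.camera_id, d.timestamp)
```

The point of the sketch is the mechanism, not the accuracy: no face and no fingerprint is ever used, yet a chain of sightings across cameras still emerges.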

Veritone says that Track is already being used by 400 customers across the US, including state and local police departments, universities, and federal agencies like the Department of Justice.

Veritone claims the system:

✅ Works where facial recognition is banned or restricted

✅ Helps identify people whose faces are obscured

✅ Can analyze recorded video footage from diverse sources: body cameras, drones, Ring cameras, YouTube videos, and citizen-submitted footage

And, within a year, Veritone says Track will also be able to analyze live video feeds in real time.


The Controversy: Skirting the Law?

Civil liberties advocates, including the American Civil Liberties Union (ACLU), are sounding the alarm.

They argue that Track:

  • Raises the same privacy and civil liberties concerns as facial recognition
  • Enables large-scale tracking of individuals over time — without consent or due process
  • Exploits gaps in current laws that focus only on "biometric data" like faces and fingerprints

Jay Stanley from the ACLU warns that this is the first known instance of a non-biometric tracking system used at scale by law enforcement in the US. And it comes at a time when the government is increasing surveillance of activists, protesters, immigrants, and students.

Critics fear that tools like Track give law enforcement unprecedented powers to monitor individuals even when no crime has been committed.

Nathan Wessler, an ACLU attorney, says that Track represents a “categorically new scale and nature of privacy invasion” that was never before possible in human history.


Track vs. Facial Recognition: Different Technology, Same Concerns?

One reason Track is seen as a loophole is that laws banning facial recognition often focus on the use of "biometric data." But the definition of biometric data is blurry.

It typically refers to:

  • Immutable characteristics like faces, gait, and fingerprints

But what about attributes that change, like clothing?

Critics argue that even these mutable attributes can become pseudo-biometric when combined at scale (a quick back-of-the-envelope illustration follows the example below).

For example:

  • A person who always wears the same boots, backpack, and coat could still be reliably tracked over days, weeks, or months across video footage.
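
How quickly do ordinary attributes add up to something close to an identifier? The numbers below are pure assumptions chosen for illustration (how common each attribute is, and a city of one million people), but they show the general effect: each attribute alone is unremarkable, while the combination narrows the field to almost a single person.

```python
# Back-of-the-envelope illustration with assumed, independent shares.
# Each attribute alone is common; their combination is rare.
city_population = 1_000_000

# Rough share of people matching each single attribute (assumptions).
attribute_shares = {
    "same boots":      1 / 20,
    "same backpack":   1 / 30,
    "same coat":       1 / 25,
    "similar build":   1 / 5,
    "same hair style": 1 / 10,
}

combined_share = 1.0
for name, share in attribute_shares.items():
    combined_share *= share
    expected_matches = city_population * combined_share
    print(f"+ {name:<16} -> about {expected_matches:,.0f} people match")
```

Under these assumed numbers, five everyday attributes shrink a city of a million down to roughly one matching person, which is exactly why critics describe such combinations as pseudo-biometric.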

This raises a critical question: Is Track offering a backdoor for law enforcement to track people in ways that facial recognition laws tried to prevent?

Veritone acknowledges that Track uses skin tone as one of its many identifiers, but says the system doesn’t allow users to explicitly search by skin color. Still, this detail adds further concerns about bias and misuse.


The Bigger Picture: Technology Moves Faster Than the Law

Track’s rise comes at a time when many cities and states in the US are implementing laws limiting facial recognition.

  • San Francisco and Oakland, California, have near-total bans.
  • Montana and Maine limit its use, especially in real-time video feeds.

But these laws may not apply to Track, since it technically does not use biometric facial data.

This legal gray area puts pressure on lawmakers to revisit how they define privacy, surveillance, and the scope of law enforcement technologies.


Why This Matters for Business, Tech Leaders, and Society

New AI tools like Track highlight how technology can evolve faster than laws and ethics frameworks.

It’s a wake-up call for all businesses developing or using AI for surveillance, security, or analytics.

It raises critical debates about the balance between public safety and personal privacy.

It challenges policymakers to act swiftly to close legal loopholes and protect civil liberties.


Critical Questions for Reflection and Debate

Do existing privacy laws sufficiently address non-biometric AI tracking tools? Or do we need new laws?

Who oversees the responsible use of AI-powered tracking tools like Track? Should independent audits be mandatory?

How do we prevent mission creep — where tools built for investigating crimes become used for mass surveillance of peaceful protesters or students?

Is the use of such powerful surveillance tools in democratic societies crossing into authoritarian practices?

What obligations do tech companies have to ensure their products are not misused by law enforcement or governments?


Final Thoughts: A New Era of Surveillance?

Track might not use facial recognition, but it raises equally serious — if not greater — concerns about surveillance, accountability, and civil liberties.

It opens the door to:

  • **Mass surveillance without faces.**
  • **Tracking without direct identification.**
  • **Policing without needing visible evidence of wrongdoing.**

This is not science fiction. It is already happening.

As AI evolves, we must ask tough questions about its role in society, who controls it, how it is used, and whether our current laws and institutions are strong enough to protect our freedoms.

If we wait until these tools are widely adopted, it might be too late to roll them back.


Let’s Discuss 👇

  • How do you feel about AI tools that track people without using facial recognition?
  • Is this a necessary evolution for public safety or a dangerous overreach of technology?
  • How can we ensure such tools are used responsibly, transparently, and fairly?

I invite you to share your views, stories, and solutions in the comments.

Join me and my incredible LinkedIn friends as we embark on a journey of innovation, AI, and EA, always keeping climate action at the forefront of our minds. 🌐 Follow me for more exciting updates https://lnkd.in/epE3SCni


#AI #AISurveillance #CivilLiberties #FacialRecognition #PrivacyRights #EthicalAI #ResponsibleTech #ACLU #LawEnforcementTech #PublicSafety #VideoAnalytics #AIethics #TechLeadership #AIgovernance #AIregulation #FutureOfPolicing #HumanRights #SurveillanceState #AIimpact #LinkedInNewsletter

Reference: MIT Technology Review
