
Surveillance Company Flock Now Using AI to Report Us to Police if it Thinks Our Movement Patterns Are “Suspicious”


The police surveillance company Flock has built an enormous nationwide license plate tracking system, which streams records of Americans' comings and goings into a private national database that it makes available to police officers around the country. The system allows police to search the nationwide movement records of any vehicle that comes to their attention. That's bad enough on its own, but the company is also now apparently analyzing our driving patterns to determine if we're "suspicious." That means that if your local police department starts using Flock, it could target you just because some algorithm has decided your movement patterns suggest criminality.

There has been a lot of reporting lately about Flock but I haven’t seen anyone focus on this feature. It’s a significant expansion in the use of the company’s surveillance infrastructure — from allowing police to find out more about specific vehicles of interest, to using the system to generate suspicion in the first place. The company’s cameras are no longer just recording our comings and goings — now, using AI in ways we have long warned against, the system is actively evaluating each of us to make a decision about whether we should be reported to law enforcement as potential participants in organized crime.

In a February 13 press release touting an “Expansive AI and Data Analysis Toolset for Law Enforcement,” the company announced several new capabilities, including something called “Multi-State Insights”:

Many large-scale criminal activities—such as human and narcotics trafficking and Organized Retail Crime (ORC)—involve movement across state lines. With our new Multi-State Insights feature, law enforcement is alerted when suspect vehicles have been detected in multiple states, helping investigators uncover networks and trends linked to major crime organizations.

Flock appears to offer this capability through a larger "Investigations Manager," which urges police departments to "Maximize your LPR data to detect patterns of suspicious activity across cities and states." The company also offers a "Linked Vehicles" or "Convoy Search" feature allowing police to "uncover vehicles frequently seen together," putting it squarely in the business of tracking people's associations, and a "Multiple locations search," which promises to "Uncover vehicles seen in multiple locations." All of these are variants on the same theme: using the camera network not just to investigate based on existing suspicion, but to generate suspicion itself.
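Flock has not published how these features work, but the basic mechanics are not mysterious. A minimal sketch of what "Multi-State Insights" and "Convoy Search" could amount to, using entirely hypothetical plate-reader records (the field names, thresholds, and data are illustrative assumptions, not Flock's actual schema or logic):

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical LPR reads: (plate, state, camera_id, timestamp_seconds).
# Illustrative data only -- not Flock's actual schema or logic.
READS = [
    ("ABC123", "GA", "cam1", 0),
    ("XYZ789", "GA", "cam1", 20),
    ("LMN456", "GA", "cam2", 500),
    ("ABC123", "TN", "cam7", 90000),
    ("XYZ789", "TN", "cam7", 90030),
]

def multi_state_plates(reads):
    """Plates observed in more than one state (a 'Multi-State' alert)."""
    states = defaultdict(set)
    for plate, state, _, _ in reads:
        states[plate].add(state)
    return {p for p, s in states.items() if len(s) > 1}

def convoy_pairs(reads, window=60, min_cooccurrences=2):
    """Plate pairs seen at the same camera within `window` seconds,
    at least `min_cooccurrences` times (a 'Convoy Search')."""
    counts = defaultdict(int)
    for (p1, _, c1, t1), (p2, _, c2, t2) in combinations(reads, 2):
        if p1 != p2 and c1 == c2 and abs(t1 - t2) <= window:
            counts[tuple(sorted((p1, p2)))] += 1
    return {pair for pair, n in counts.items() if n >= min_cooccurrences}

print(sorted(multi_state_plates(READS)))  # ['ABC123', 'XYZ789']
print(sorted(convoy_pairs(READS)))        # [('ABC123', 'XYZ789')]
```

The point of the sketch is how little it takes: a few dozen lines over a plate-read table turns a passive archive into a machine that nominates people. Every vehicle in the database is continuously evaluated, and the thresholds that separate "ordinary travel" from "suspect convoy" are arbitrary parameters someone chose.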

In a democracy, the government shouldn’t be watching its citizens all the time just in case we do something wrong. It’s one thing if a police officer out on a street sees something suspicious in public and reacts. But this is an entirely different matter.

First, the police should not be collecting and storing data on people’s movements and travel across space and time in the first place, or contracting to use a private company’s technology to accomplish the same thing. Second, they shouldn’t be taking that data and running it through AI algorithms to potentially swing the government’s eye of suspicion toward random, innocent civilians whose travel patterns just happen to fit what that algorithm thinks is worth bringing to the attention of the police.

And of course, because Flock is a private company not subject to checks and balances such as open records laws and oversight by elected officials, we know nothing about the nature of the algorithm or algorithms it uses: what logic they may be based upon, the data on which they were trained, or the nature and frequency of their errors. Does anyone actually know whether there are movement patterns characteristic of criminal behavior that won't also sweep in vastly larger numbers of innocent people?

We also don’t know what kind of biases the company’s algorithms might exhibit; it’s very easy to imagine an algorithm trained on past criminal histories in which low-income neighborhoods and communities of color are highly over-represented because of the well-established, top-to-bottom biases in our criminal justice system. That could mean that just living in such a neighborhood could make you inherently suspicious in the eyes of this system in a way that someone living in a wealthier place would never be. Among other problems, that’s just plain unfair.
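The mechanism behind that unfairness is simple enough to show in a few lines. In this deliberately toy illustration (the neighborhoods, numbers, and "risk score" are invented assumptions, not anything known about Flock's models), two neighborhoods have identical true rates of the targeted behavior, but the historical enforcement data is skewed toward one of them:

```python
# Hypothetical illustration: two neighborhoods with IDENTICAL true rates
# of the targeted behavior, but historical enforcement records in which
# neighborhood "A" was policed far more heavily than "B".
historical_records = (["A"] * 800) + (["B"] * 200)  # skew from past policing

# A naive "risk score" learned from record counts alone inherits that
# skew: it rates A as four times riskier despite identical behavior.
prior = {n: historical_records.count(n) / len(historical_records)
         for n in ("A", "B")}
print(prior)  # {'A': 0.8, 'B': 0.2}
```

Any model trained on such records reproduces the enforcement pattern, not the underlying behavior, and then feeds that pattern back to police as fresh "suspicion."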

The bottom line is that Flock, having built its giant surveillance infrastructure, is now expanding its uses — validating all our warnings about how such systems inevitably undergo mission creep, and providing all the more reason why communities should refuse to allow the police departments that serve them to participate in this mass surveillance system.
