
Surveillance Company Flock Now Using AI to Report Us to Police if it Thinks Our Movement Patterns Are “Suspicious”


The police surveillance company Flock has built an enormous nationwide license plate tracking system, which streams records of Americans' comings and goings into a private national database that it makes available to police officers around the country. The system allows police to search the nationwide movement records of any vehicle that comes to their attention. That's bad enough on its own, but the company now also appears to be analyzing our driving patterns to determine whether we're "suspicious." That means that if your local police department starts using Flock, it could target you just because some algorithm has decided your movement patterns suggest criminality.

There has been a lot of reporting lately about Flock, but I haven't seen anyone focus on this feature. It's a significant expansion in the use of the company's surveillance infrastructure — from allowing police to find out more about specific vehicles of interest, to using the system to generate suspicion in the first place. The company's cameras are no longer just recording our comings and goings — now, using AI in ways we have long warned against, the system is actively evaluating each of us and deciding whether we should be reported to law enforcement as potential participants in organized crime.

In a February 13 press release touting an “Expansive AI and Data Analysis Toolset for Law Enforcement,” the company announced several new capabilities, including something called “Multi-State Insights”:

Many large-scale criminal activities—such as human and narcotics trafficking and Organized Retail Crime (ORC)—involve movement across state lines. With our new Multi-State Insights feature, law enforcement is alerted when suspect vehicles have been detected in multiple states, helping investigators uncover networks and trends linked to major crime organizations.

Flock appears to offer this capability through a larger "Investigations Manager," which urges police departments to "Maximize your LPR data to detect patterns of suspicious activity across cities and states." The company also offers a "Linked Vehicles" or "Convoy Search" feature allowing police to "uncover vehicles frequently seen together," putting it squarely in the business of tracking people's associations, and a "Multiple locations search," which promises to "Uncover vehicles seen in multiple locations." All of these are variants on the same theme: using the camera network not just to investigate based on suspicion, but to generate suspicion itself.
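Flock has not published how any of these features actually work, but the general idea behind a "linked vehicles" analysis can be sketched in principle: count how often two plates are read by the same camera within a short time window, and flag pairs that co-occur repeatedly. Everything below is a hypothetical illustration — the function name, the time window, and the threshold are my assumptions, not Flock's.

```python
from collections import Counter

def find_linked_vehicles(reads, window_s=60, min_cooccurrences=3):
    """Toy 'convoy search': flag plate pairs seen at the same camera
    within `window_s` seconds at least `min_cooccurrences` times.
    Purely illustrative; not based on any published Flock algorithm."""
    # Group plate reads by camera.
    by_camera = {}
    for plate, camera, ts in reads:
        by_camera.setdefault(camera, []).append((ts, plate))

    # Count near-simultaneous sightings of each plate pair.
    pair_counts = Counter()
    for events in by_camera.values():
        events.sort()
        for i, (ts_i, plate_i) in enumerate(events):
            for ts_j, plate_j in events[i + 1:]:
                if ts_j - ts_i > window_s:
                    break  # events are sorted, so no later match either
                if plate_i != plate_j:
                    pair_counts[frozenset((plate_i, plate_j))] += 1

    return {pair for pair, n in pair_counts.items() if n >= min_cooccurrences}

# Hypothetical reads: (plate, camera_id, unix_timestamp)
reads = [
    ("AAA111", "cam1", 0),   ("BBB222", "cam1", 10),
    ("AAA111", "cam2", 300), ("BBB222", "cam2", 330),
    ("AAA111", "cam3", 600), ("BBB222", "cam3", 650),
    ("CCC333", "cam1", 5000),
]
linked = find_linked_vehicles(reads)
# → {frozenset({"AAA111", "BBB222"})}
```

Note that even this toy version would "link" carpoolers, neighbors who commute on the same roads, or a couple driving separate cars — which is exactly the false-positive problem with treating co-travel as evidence of association with a crime network.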

In a democracy, the government shouldn’t be watching its citizens all the time just in case we do something wrong. It’s one thing if a police officer out on a street sees something suspicious in public and reacts. But this is an entirely different matter.

First, the police should not be collecting and storing data on people’s movements and travel across space and time in the first place, or contracting to use a private company’s technology to accomplish the same thing. Second, they shouldn’t be taking that data and running it through AI algorithms to potentially swing the government’s eye of suspicion toward random, innocent civilians whose travel patterns just happen to fit what that algorithm thinks is worth bringing to the attention of the police.

And of course, because Flock is a private company not subject to checks and balances such as open records laws and oversight by elected officials, we know nothing about the nature of the algorithm or algorithms that it uses — what logic they may be based upon, the data upon which they were trained, or the frequency and nature of their errors. Does anyone actually know whether there are movement patterns characteristic of criminal behavior that won't sweep in vastly larger numbers of innocent people?

We also don’t know what kind of biases the company’s algorithms might exhibit; it’s very easy to imagine an algorithm trained on past criminal histories in which low-income neighborhoods and communities of color are highly over-represented because of the well-established, top-to-bottom biases in our criminal justice system. That could mean that just living in such a neighborhood could make you inherently suspicious in the eyes of this system in a way that someone living in a wealthier place would never be. Among other problems, that’s just plain unfair.

The bottom line is that Flock, having built its giant surveillance infrastructure, is now expanding its uses — validating all our warnings about how such systems inevitably undergo mission creep, and providing all the more reason why communities should refuse to allow the police departments that serve them to participate in this mass surveillance system.
