In January, Flock CEO Garrett Langley did an interview with Inspired Capital, a venture capital firm. In it, he discusses his takes on crime and the judicial system, and offers some revealing praise for the approach taken by the “precrime” police program in the 2002 movie, Minority Report.
Precrime as a Business Model
In the broader context of AI doing investigative police work—something Flock is pushing hard with Nova and its “Night Shift” feature—Langley had this to say when asked about Minority Report:
[When] you think about it, it was decades of arrests with only one wrongful arrest. How nice would that be if our current judicial system and policing system only had one wrongful arrest and multiple deaths? That sounds great.
The only problem with this system, according to Langley, is that the “terminal decision” lies with the “precog”[1] rather than a human—the old “local decision” mantra Flock repeats in public, but puts aside when it unilaterally removes “permanent” information from its product.
Langley either completely misses the point of the movie, or he aims to bring about its dystopia in his stated, quixotic quest to eliminate crime.
Minority Report’s problem was never that the precogs were, like AI, “inhuman,” as Langley puts it. The movie is a warning that putting blind faith into a system—any system—is a terrible mistake.
The term “Minority Report” in the movie’s universe refers to an outlying data point: a piece of evidence that contradicts the other evidence and, at minimum, calls the system’s supposed infallibility into question. The government’s solution in the movie? Purge minority reports from the record and hide their existence from the public.
The “one wrongful arrest”—which was actually a conviction—serves to highlight that the system has always been fallible. There are likely thousands of innocents who could not have been convicted but for the purged minority reports, removed from society “to eliminate crime in America.”
It’s a fitting reference for a company whose approach to inconvenient data is to make it disappear.
Blurring the Line Between Arrest and Conviction
In the same interview, Langley speaks on real-world problems in proving crime. The first, he claims, is that people will no longer come forward as witnesses.[2] He continues:
The second is: our expectations of truth have gone through the roof. And and this is, like, largely a good thing, but you know, people like you and me watch NCIS on TV and we assume there’s cameras everywhere. … and you watch [shows like NCIS] and you’re like, “Oh, like this is how [it works]”, but the real world doesn’t work this way.
And so you you get to a judge, you get to a jury, and absent incredibly hard evidence, an arrest will not occur. And that’s actually, I think, good. We’re holding ourselves to a higher standard of eliminating wrongful arrest, but that kind of moves the difficulty level up. And then those two things are compounded by [the third issue of] a staffing crisis, right?
There’s a tell buried in this quote: Langley keeps saying “arrest” when he means “conviction.” He did it with Minority Report, and he does it again here. Judges and juries don’t decide arrests—they decide convictions. The standard for arrest is probable cause, which is far lower than the courtroom standard of beyond a reasonable doubt.
This conflation is not accidental. It’s strategic. Flock’s product is strong enough to generate arrests—point a camera at a road, flag a plate, send a cop. But generating a conviction requires evidence that can survive cross-examination, expert challenge, and judicial scrutiny. As we’ll see below, Flock’s evidence authentication doesn’t clear that bar. Langley blurs the terms because admitting the distinction would expose the gap between what Flock can trigger and what Flock can prove.
The CSI Effect and the “Burden of Truth”
Langley is gesturing at something real: the so-called “CSI Effect,” where jurors exposed to forensic-heavy TV dramas expect more scientific evidence than prosecutors can realistically provide. It’s a documented phenomenon, and it has made some prosecutions harder.
But Langley doesn’t frame it that way. Instead, he frames rising evidentiary expectations as a problem to be solved—a “difficulty level” that Flock can help overcome. The implication is that courts should lower the bar, or that Flock’s evidence should clear it. Neither follows. The standard in criminal proceedings exists to protect defendants from wrongful conviction. That standard hasn’t “gone through the roof.” It’s exactly where it’s supposed to be.
What has changed is that Flock wants to be the one supplying the evidence—and the evidence it produces, as we’ll see, doesn’t hold up.
Why Flock’s Evidence Doesn’t Hold Up
First, a caveat: I don’t claim authority on the Rules of Evidence. It’s a complex topic. If any lawyers want to correct me on anything, please reach out.
Here’s the gist: you can’t just make shit up and throw it at the judge. Courts require any evidence to have a basis and to be introduced by someone. This is, in part, why the prosecutor can’t show up with bodycam footage of you rolling up to the Louvre with your ladder—a police officer who was wearing the bodycam has to show up and say “I saw this guy carrying a ladder through the streets of Paris.”
When it comes to Flock footage, that means either (1) a witness comes in and says “I saw this,” (2) an expert comes in and says “this is authentic and has not been tampered with,” or (3) the court relies on more circumstantial evidence like metadata and affidavits.
Option 1 is impossible—there is no witness. Option 2 is expensive and exposes technical details in open court. Which leaves option 3: the weakest possible basis and, apparently, the focus of Langley’s complaint about standards going “through the roof.”
How Flock Authenticates Evidence
The details are sketchy, because of Flock’s continued lack of transparency, but I believe that sometime last year, Flock changed how it authenticates evidence. Where it used to sign an affidavit on request, it now appears to use an automated process. Based on what I can determine, this is roughly how it has worked since July 1, 2025 (a sketch in code follows the list):
- A Flock camera takes a picture
- It creates a hash (a fixed-length digital fingerprint) of the image
- Flock stores the image and the hash
- An investigator goes into the Flock portal and downloads an image
- (Optional) Flock deletes the image due to retention periods, but keeps the hash
- Months later, preparing for court, the investigator uploads the stored image back to Flock
- The server generates a hash of the uploaded image
- The server compares it to the hash stored for the original capture
- If the hashes match, the server returns a PDF with the image and the date, time, and location of capture, effectively saying “we checked: we took this picture and these items all belong together.”
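To make the flow concrete, here is a minimal sketch of what that kind of check amounts to, assuming a plain SHA-256 file hash; the function names, field names, and values are mine for illustration, not Flock’s actual implementation:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex SHA-256 digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

# At capture time: store the image hash alongside the metadata recorded
# for that capture. (Field names and values are illustrative only.)
original_bytes = b"...jpeg bytes from the camera..."

stored_record = {
    "hash": sha256_of(original_bytes),
    "captured_at": "2025-07-01T02:37:00Z",
    "camera_id": "example-cam-001",
    "location": "Paris, IL",
}

# Months later: the investigator re-uploads their saved copy; the server
# re-hashes it and compares against the stored value.
def verify_upload(uploaded_bytes: bytes, record: dict) -> bool:
    return sha256_of(uploaded_bytes) == record["hash"]

if verify_upload(original_bytes, stored_record):
    # The resulting "certificate" says only: these bytes match a hash we
    # stored, and this is the metadata we currently have on file for it.
    print("Match:", stored_record["captured_at"], stored_record["location"])
```

Note what the comparison actually covers: only the image bytes. The metadata rides along in the record but is never part of the check.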
Sounds reasonable. It isn’t.
The Chain of Custody Problem
The Chain of Custody is a key part of the rules of evidence: you have to be able to show that evidence has not been tampered with. For physical evidence, there are rigid protocols—sealing, unsealing, signing in and out of secure storage.
For Flock’s images: nobody, most likely including Flock, knows who has had access to the image, the metadata, or the hash at any point in the process. This is the reason CJIS requires permanent, immutable audit logs.
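For contrast, the kind of log CJIS has in mind is append-only and tamper-evident. A minimal sketch of the idea, entirely my own illustration and not anything Flock has published:

```python
import hashlib, json, time

def chained_entry(prev_hash: str, actor: str, action: str) -> dict:
    """One tamper-evident log entry: its hash covers the previous entry's hash,
    so rewriting history breaks every later link in the chain."""
    entry = {"ts": time.time(), "actor": actor, "action": action, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

log = [chained_entry("0" * 64, "camera-001", "image captured")]
log.append(chained_entry(log[-1]["hash"], "analyst@example-pd", "image downloaded"))
log.append(chained_entry(log[-1]["hash"], "vendor-admin", "metadata updated"))
# Editing any earlier entry changes its hash and invalidates every entry after it.
```

Because each entry’s hash covers the one before it, quietly rewriting who touched what would break every subsequent link, which is exactly the property a chain of custody needs.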
Images captured by Flock cameras are stored unencrypted on the device before transmission. Flock has previously said images are stored “for up to 7 days” on the camera, which means the metadata—capture times, location data—is also stored for up to 7 days.
This asynchronous processing is a technical necessity when operating over spotty LTE networks, but it also means there is a multi-day window in which images and metadata sit on an unattended, unsecured device.
It’s the digital equivalent of finding a dead body in an alley and saying “we’ll come back in a few days to collect the evidence.”
The Metadata Integrity Problem
There is no mechanism to validate that the metadata belongs to the image. A properly secured device would have a TPM (Trusted Platform Module) that cryptographically binds the image and its metadata together, for example by signing them with a key that never leaves the device, so they cannot be separated or altered without detection. Flock’s cameras are not such devices and, from the teardowns I’ve seen, contain no TPM.
By all appearances, the file hash and the metadata are simply stored in AWS alongside everything else. Anyone with access to AWS—a Flock employee, a compromised account, a contractor—could update the data. With a few keystrokes, a photo of your car taken in Langley, VA, could be associated with a camera in Paris, IL.
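To make the gap concrete, here is an illustrative sketch, my own and not Flock’s code, using an HMAC with a software key as a stand-in for a TPM-held signing key: a bare image hash still verifies after the metadata row is edited, while a tag computed over the image and its metadata together does not.

```python
import hashlib, hmac, json

# Stand-in for a key sealed inside the camera's secure hardware
# (an assumption for illustration; no such key exists in the current design).
DEVICE_KEY = b"example-device-key"

image = b"...jpeg bytes..."
metadata = {"captured_at": "2025-07-01T02:37:00Z", "location": "Langley, VA"}

# What the current scheme appears to check: a hash of the image alone.
image_hash = hashlib.sha256(image).hexdigest()

# What a bound design would check: image AND metadata authenticated together
# under a key the server never holds (HMAC here as a stdlib stand-in for a
# TPM-backed signature).
def bind(image: bytes, metadata: dict) -> str:
    payload = image + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()

tag_at_capture = bind(image, metadata)

# Someone with database access quietly edits the metadata row.
metadata["location"] = "Paris, IL"

# The bare image hash still verifies; the edit is invisible to it.
assert hashlib.sha256(image).hexdigest() == image_hash

# The bound tag no longer matches; the edit is exposed.
assert bind(image, metadata) != tag_at_capture
```

The specific primitive doesn’t matter. The point is that unless the camera authenticates the image and its metadata as one unit, with a key the server side cannot use, nothing downstream can detect the swap.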
Flock’s automated system will attest to this fact in court.
What Flock’s System Actually Proves
Flock’s authentication is a convoluted version of “trust me, bro.” Instead of verifying where and when a photo was actually taken, it attests only that an image downloaded from Flock matches a hash stored in Flock’s system.
That’s not authentication. That’s “both our watches say it’s 2:37pm, so it must be 2:37pm”—while ignoring that you left them unattended in your hotel room for three days before driving from Chicago to L.A. All it proves is that both watches show the same time.
Langley wants us—and the courts—to accept 2:37pm as the absolute, indisputable truth. The times match, and he’s wearing one of the watches—how could it not be the truth?
The law demands more, and so should the courts when they are deciding someone’s life and liberty.
Staff shortage be damned.
A “precognitive” individual in the movie’s universe who can see crimes before they happen. ↩︎
I have not verified whether his claim is true, but if people are no longer stepping up, my first theory would be that it has to do with decreasing societal trust in police—perhaps for the very reasons discussed in this post. ↩︎