About a week ago, I had the pleasure of speaking with a number of students from Sequoia Union High School District in California who were working on an article about surveillance and Flock. They asked great questions about their local fusion center, the Northern California Regional Intelligence Center (NCRIC), as well as about California’s SB 34—a 2015 law that, among other things, prohibits public agencies from sharing ALPR data except with other public agencies, and requires operators to maintain security procedures, access logs, and retention limits.[1]
Questions I could not answer in detail, but that are important and deserve answers.
I’ve written about fusion centers before, in the post about the federal Regional Information Sharing Systems® (RISS) program—a network of federally funded, quasi-privately-operated “fusion centers before it was cool.” That post was mostly abstract. Let’s examine what’s happening at NCRIC.
Fusion centers and data sharing
NCRIC is a practical example of what can go wrong when we take promises about data retention and security at face value, and what happens when we write poorly drafted bills—like HF 2161 chugging along here in Iowa, with the ACLU of Iowa’s support.[2]
Data dissemination centers like RISS “permit federated searching across many systems without requiring the RISSNET user to have a separate user account for each partner system.” But that website copy is about as far as we get—while federally funded,[3] these centers are operated as private corporations and are therefore not subject to the Freedom of Information Act.[4][5]
But we don’t have to speculate for too long. The state of Colorado lays it all out cleanly for its Auto Theft Intelligence Coordination Center (ATICC):
The goal of this project is to share license plate recognition data among all contributing agencies that have established this Memorandum of Understanding with the Colorado ELSAG EOC, managed by the Colorado State Patrol (CSP) ATICC.
Participating agencies will share license plate reader (LPR) information for replication to the data warehouse or as part of a central querying system hosted by the Colorado ELSAG EOC and will have the capability to query all LPR based information from around the State of Colorado which is stored within the warehouse.
Simple as that. Drop everything in CSP’s bucket, and take what you need. Cop-communism.
In case you’re wondering, ELSAG cameras are a Leonardo product.[6] They offer stationary surveillance cameras (with cool-sounding names like “The Street Sentry™” and “The Fixed Plate Hunter™”), as well as mobile cameras disguised as roof-mounted skiboxes or construction barrels.
In the MoU, the “Denver Police Department agrees to share ALPR data with other law enforcement agencies utilizing the Colorado ELSAG EOC”, where it can be stored for up to three years.[7]
Although Colorado State Patrol was short-sighted enough to name its own entity after a vendor product,[8] ATICC explicitly commits to “obtaining the cooperation of any third-party contractor or vendor” that provides license plate reader systems in Colorado. Presumably this includes Flock.
The “data warehouse” used by CSP, while only one component of a fusion center, is a much more descriptive term for what’s really happening at the backdoor of these systems.
The Northern California Regional Intelligence Center (NCRIC)
Colorado’s ATICC is not just similar to NCRIC—it’s the template for what NCRIC is almost certainly doing but refusing to document. NCRIC gives itself permission to store ALPR data for up to 12 months and to disseminate it broadly.
The [ALPR] information is also retained for a fixed retention period, though it is only reaccessible by law enforcement given a legitimate law enforcement purpose.
The FAQ specifies that only users with a need-to-know have access, but, from context, NCRIC’s version of “need-to-know” is clearly not particularized and apparently extends to all ALPR data, forever.
Although the previous version of NCRIC’s FAQ was more explicit that “most ALPR data will be stored for 12 months,” the current FAQ is silent on retention. The FAQ drones on for a bit, carefully evading its own questions, but at the end of it all, the agency essentially gives itself carte blanche to do what Colorado spelled out more clearly.
The policy reveals more. Especially in light of SB 34.
In October 2023, the California Office of the Attorney General issued bulletins gently reminding police that laws exist, and that they are not supposed to send ALPR data from California to out-of-state agencies.[9] Exactly six months later, NCRIC disappears from non-California log files.
NCRIC updated its ALPR policy accordingly, but in a way that amounted to performative compliance and less oversight.
- It removed the specific security requirements for data storage—SECRET-level clearances, 24/7 security personnel, multiple secured doors—replacing them with a passing reference to “secure systems.”
- It removed the requirements for multi-factor authentication and encryption.
- It removed the requirement for audit logs to contain a “justification for access.”
- It weakened retention limits from a hard cap (“shall not be retained longer than 12 months” with explicit purge requirements) to an aspirational ceiling (“supports a maximum retention period of 365 days”), and outsourced the actual operative limit to whichever vendor NCRIC happens to be using.
- It authorized sourcing ALPR information from private sources, including “parking, tolling, private security, or other sources”—where the 2021 policy explicitly prohibited sharing data with commercial entities.
- It introduced contradictory language on visual confirmation of plate reads: one section retains the 2021 standard (“to the fullest extent possible”), while another weakens it to “should visually confirm.”
- It dropped the annual training recertification requirement entirely.
The FAQ changed in parallel. The 2015 FAQ described a multi-factor authentication process requiring a randomly generated PIN sent to a government email account. The current FAQ reduces this to “a unique username and login.” That downgrade is worth keeping in mind when we get to the part about user “a.”
Where the policy did not change much was in its audit requirements. Those are still essentially non-existent, requiring only that a report based on a “sampling” (it does not say the sampling must be random) be sent to the NCRIC director.
The Logs: Counting Searches
To get the cleanest possible data, these charts are based on only two sets of log files: Louisville, KY from March 2022 through April 29, 2024, and Capitola, CA from that date onward.
The charts show a highly suspicious trend. Here it is, close up, based on only Capitola data:
Between January 1, 2024 and May 1, 2024, the enforcement date, the number of searches NCRIC runs is low, peaking at around 170. Activity stays around that level until the beginning of June, when the number of users and, especially, the number of searches see explosive growth.
NCRIC more than doubles the number of active users, going from having 5–20 weekly active users to a consistent ~40. What’s more, individual users go from doing ~5 searches/week to ~60 searches/week.
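The weekly figures above can be tallied directly from the audit logs. A minimal sketch, assuming each log row carries a `timestamp` (ISO 8601) and a `user` field—the actual column names in a Flock export may differ:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def weekly_stats(rows):
    """Tally weekly active users and total searches from audit-log rows.

    Each row is a dict with (hypothetical) keys 'timestamp' and 'user';
    real export headers may differ.
    """
    per_week = defaultdict(lambda: defaultdict(int))  # week start -> user -> searches
    for row in rows:
        ts = datetime.fromisoformat(row["timestamp"])
        week = (ts - timedelta(days=ts.weekday())).date()  # Monday of that week
        per_week[week][row["user"]] += 1
    return {
        week: {"active_users": len(users), "searches": sum(users.values())}
        for week, users in per_week.items()
    }
```

Bucketing by Monday-anchored weeks keeps partial weeks at the edges of the dataset visible rather than silently averaged away.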
NCRIC’s insights page immediately reveals why: NCRIC’s users are nearly all identified by single lowercase letters like “a.” or “c.”. These users show remarkably consistent around-the-clock activity.
NCRIC’s users are either bots, or shared accounts.
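One way to make “around-the-clock” concrete: count how many distinct hours of the day each account is active in. A single analyst’s searches cluster around a shift; an account active in nearly all 24 hours looks like a bot or shared credentials. A sketch, again assuming hypothetical `timestamp`/`user` row keys:

```python
from collections import defaultdict
from datetime import datetime

def round_the_clock(rows, min_hours=18):
    """Flag accounts whose searches span nearly every hour of the day.

    min_hours=18 is an arbitrary threshold: a human shift rarely covers
    18+ of 24 hours, so accounts above it suggest bots or shared logins.
    """
    hours = defaultdict(set)  # user -> set of hours (0-23) with activity
    for row in rows:
        hours[row["user"]].add(datetime.fromisoformat(row["timestamp"]).hour)
    return sorted(u for u, hs in hours.items() if len(hs) >= min_hours)
```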
The Plausible Backdoor: Who is “a.”?
Of course, NCRIC’s deliberate avoidance of oversight and accountability is not direct evidence that it is sharing data in violation of California law—cops will be cops. But its behavior and context do lead directly to that question.
It’s possible that NCRIC was suddenly motivated to start doing some police work, and that it has absolutely terrible internal security practices. Maybe a terminal is logged in as “a,” and when the next person reports for their shift, they don’t log in with their own credentials and simply continue working.
It would violate every basic tenet of information security, not to mention, most likely, several federal and state laws, but it’s a possibility.
The other, in my opinion more plausible, explanation is that NCRIC shares its user accounts with external, out-of-state agencies—just like Loveland, CO was caught doing last year.
Another possible explanation is that these accounts are automated and serve to fill NCRIC’s data warehouse. Of course, that leads to a follow-up question: who can access the warehouse?
The Missing RISS
It is also worth noting that the other relevant fusion center, the Western States Information Network® (WSIN), is conspicuously the only RISS center absent from Flock’s audit logs. The other five are accounted for.
Unlike the FBI, which simply stopped showing up in log files after July 2023,[10] WSIN does not show up in our data at all. This is the same center that covers Washington: the state most covered by the logs we have. Either WSIN is the only RISS without Flock access, or it is not being logged as “WSIN” or some other cognizable variant.
What the Logs Can’t Show
The logs can tell us that NCRIC stripped its own security and audit requirements immediately after California started enforcing its privacy laws. They can tell us that anonymous, bot-like accounts began running searches around the clock within weeks. They can tell us that the one RISS center covering the most-logged state in our dataset is conspicuously absent from every log file we have.
What the logs can’t tell us is why — and that’s exactly the point. NCRIC designed its policies to ensure that no one, including its own director, has the information needed to answer that question.
The “sampling”-based audits don’t require randomness. The access logs don’t require justification. The retention policy doesn’t require limits.
This is not a gap in oversight. It is the deliberate architecture of unaccountability. When a fusion center rewrites its policies to remove the very mechanisms that would detect abuse, the question is no longer whether the data is being shared in violation of California law.
The question is whether anyone with authority to act will bother to find out.
The students from Sequoia Union asked the right questions. The fact that a group of high schoolers can identify the problems that California’s oversight apparatus declines to investigate is not a compliment to the students — though they’ve earned one.
It’s an indictment of everyone else.
Even though it may be too late for their deadline, maybe the information can help someone else. ↩︎
The bill permits copying or warehousing of the data within 24 hours of capture, and then fails to restrict the copied data. The Iowa State Police Association, Axon, and Motorola all oppose the bill. Flock is undecided. ↩︎
Through the Omnibus Crime Control and Safe Streets Act of 1968, whose Section 524(b) (amended by the Crime Control Act of 1973) gave rise to 28 CFR Parts 20 & 23 and, ultimately, the FBI’s CJIS Security Policy. ↩︎
A RISS center was also behind the FBI’s directive to make searches as “vague as permissible.” ↩︎
Yet that distinction is only made when it suits—laws that prohibit sharing intelligence data with private corporations go unenforced, as does Flock’s stated policy on giving private businesses access to its “law enforcement network.” ↩︎
Leonardo’s Data Privacy statement contains much of the same vague “local control” and “ethics” language as Flock’s. ↩︎
Denver’s own retention policy caps at one year—but the warehouse is governed by ATICC’s policy, which defers to the three years set in § 24-72-113 C.R.S. ↩︎
A vendor product with a “®” after its name, no less. ↩︎
It should be noted that police across the state only violated the privacy of millions of Californians for nearly a decade; it’s not like they shoplifted from Walmart. ↩︎
And claims not to understand what a “contract” is, in response to a FOIA request. ↩︎