Flock Dodges Dunwoody Question with Demo Defense

Flock responds to allegations that its executives accessed cameras inside a community center gymnastics room — three months late, via blog post, and with a novel theory of what 'crime-solving' means.

by H.C. van Pelt · 15 min read

Following allegations that Flock employees had accessed cameras inside a private Jewish community center, including its gymnastics room, Flock and local government officials responded predictably: they conferred behind closed doors, handwaved away the allegation in public, and proceeded to give each other whatever benefits they negotiated.

Three days after the deal closed, Flock, apparently alive to the optics of its employees viewing a community center pool through police cameras, released a blog post titled “Understanding Flock’s Testing and Development Program.” Personally, I would not have chosen to link “employees viewing a gymnastics room” to “testing and development.” But this is Flock.

The issue Flock’s blog post addresses was raised by Dunwoody resident Jason Hunyar and amplified by YouTuber Benn Jordan: Dunwoody PD’s event logs (similar to, but not the same as, the “ALPR audit logs” this site publishes) showed Flock executives had opened camera streams inside the JCC on numerous occasions, for durations the logs don’t record. For the details, see Jason’s write-up and the posts about the April 13 Dunwoody meeting and its outcome.

The post was published under the name of Josh Thomas, the company’s Chief Communications Officer, who has been speaking for Flock for the past eight years. It’s not a slapdash production by an engineering manager. The post’s summary line reads:

This article explains how Flock tests its technology in real-world environments, strengthens search safeguards, and addresses recent privacy questions about its development practices.

Let’s discuss these topics. And the buried lede.

The Lede Thomas Buried

Tucked into the middle of the post, presented as evidence of a safeguard working, is this:

In Dunwoody, a Flock employee performed a demo of this content moderation policy by searching for both “Star of David”, which our search moderation tool blocked, and “Cowboy hat,” which the search moderation tool allowed.

Flock describes the underlying feature, FreeForm, as a search tool that allows officers to query cameras for descriptive phrases like “man wearing a cowboy hat.” Read that together with the Dunwoody example: a Flock sales employee ran an identifying search against live Dunwoody camera data. The cowboy hat search, per Flock’s own description, returned results: real people, in Dunwoody, identified by what they were wearing, surfaced to a salesperson running a demo. The Star of David search was entered into the same system.

The only thing that stopped it from returning a list of Jewish residents of Dunwoody was a content filter Flock built, maintains, and can modify at any time without telling anyone.

Flock presents this as reassuring. It is the opposite.

The architecture underneath the filter is the actual story. Flock’s patent, US 11,416,545, titled “System and method for object based query of video content captured by a dynamic surveillance network,” describes parsing video “for content” and storing it “in a database in connection with data that identifies the content (object class, aspects of the object, confidence scores, time and location data, etc.).”

The patent family extends to neural networks trained to identify clothing, estimate height and weight, and classify other physical characteristics of individuals—stored, by design, in searchable databases. That is an index. It is being built continuously, by design, and is queryable by any user Flock decides gets a search box.

The filters, themselves AI-based pattern matchers rather than deterministic blocks, stop certain query strings from being run against that index. They do not prevent the indexing. They can be modified or turned off. If they even work at all.
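Since Flock won’t draw this picture, here is a purely illustrative sketch in Python. Nothing in it is Flock’s code: every name and record is invented, and a trivial blocklist stands in for Flock’s AI moderation model. What it shows is structural, and it follows directly from the patent: the index fills at ingest time, before any search ever runs, and the filter is a query-time switch.

```python
from dataclasses import dataclass

# Illustrative only: a toy version of the indexed-attribute store the
# patent describes. None of these names, fields, or records are Flock's.

@dataclass
class Detection:
    camera_id: str
    timestamp: str
    attributes: list[str]  # clothing, accessories, physical traits

# The index is populated continuously at ingest time, before any
# search ever runs. Moderation plays no role here.
INDEX = [
    Detection("cam-01", "2025-09-30T12:04", ["cowboy hat", "red jacket"]),
    Detection("cam-01", "2025-09-30T12:06", ["star of david necklace"]),
]

# A trivial blocklist standing in for Flock's AI-based moderation model.
BLOCKED_TERMS = {"star of david"}
FILTER_ENABLED = True  # the vendor controls this switch


def search(query: str) -> list[Detection]:
    """Query-time moderation: the filter rejects the query string,
    but the underlying records remain indexed either way."""
    if FILTER_ENABLED and any(t in query.lower() for t in BLOCKED_TERMS):
        return []  # blocked -- yet the data is still sitting in INDEX
    return [d for d in INDEX
            if any(query.lower() in a for a in d.attributes)]


print(search("cowboy hat"))     # returns a real detection
print(search("Star of David"))  # blocked by the filter...
FILTER_ENABLED = False
print(search("Star of David"))  # ...until the vendor flips the switch
```

The only difference between the blocked query and the successful one is a constant the vendor controls.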

Flock is asking for credit because its AI blocks certain searches. The thing worth noticing is what those searches are being run against, and who is running them.

Recent Privacy Questions About Development Practices

Now to the post’s stated topic. In his own write-up, Jason makes a number of factual allegations, all sourced directly from Flock event logs, before concluding:

On September 30th, 2025 - Bob [Carter, VP Business Development, Flock Safety] looked at just one camera. This camera is in the gymnastics room of the JCC. I personally am curious about why a sales employee from Flock would be viewing the gymnastics room. I think this also deserves an explanation.

The public deserves to know why Flock employees are using Dunwoody’s Flock system to look at live videos of people and children in the pool, gymnastic facilities, and fitness studios.

Note what Jason actually asks for: an explanation. Not a prosecution, not a verdict, not a character judgment. An explanation of why sales employees at a surveillance vendor are logged into a police department’s system looking at cameras inside a community center. That question has been outstanding since January, when Jason first brought it to the city council.

In March, long after Jason first contacted the city, Dunwoody IT presented the results of its security audit to the council. Dunwoody looked at the same logs and found no issues.[1] They didn’t answer Jason’s question. A month later, the mayor didn’t mention that city staff had already gone over these logs. She didn’t answer Jason’s question.

Now, three months after the question was asked, the answer is delivered via blog post: the employees named online are well-intentioned people who accessed a camera network with the city’s explicit permission, as part of their job, and are now being called predators for it.

Josh Thomas asks us to accept that it is part of his company’s sales executives’ jobs to give sales demos when kids are piled into the pool on a Wednesday afternoon, or when the gymnastics room is in active use on a Tuesday at lunch.

Here is the core of what is verifiable: a Flock executive, who does not work for the police, logged into a police account and opened a camera stream inside the gymnastics room at a community center.

The event logs published by Jason—which Flock does not dispute—show multiple accesses by at least two Flock employees, Bob Carter and Randy Gluck, to cameras inside the JCC across multiple dates in 2025, including cameras pointed at the gymnastics room, pools, and children’s facilities.

But the event logs show when a user starts viewing a stream. They don’t show when a user stops, or any other detail that would provide critical context. Maybe Flock’s employees now better understand how inadequate logging can facilitate abuse.

We can’t tell if looking up a license plate over and over in the middle of the night with only the stated reason of “investigation” is stalking. We also can’t tell if the “pool” camera was viewed for 30 seconds from a terminal inside a police station, or if it was left running for hours or days on a bedroom TV in another state.
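To make the gap concrete, here is a sketch with hypothetical field names on both sides; the published logs don’t document a schema, and this is not Flock’s data model. The first record is roughly what the event logs appear to capture. The second is what an AU-3-style audit record would add.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical field names throughout; the published event logs do not
# document a schema, and this is not Flock's data model.

@dataclass
class EventLogEntry:
    """Roughly what the published event logs appear to capture."""
    user: str        # e.g. a Flock employee on a police account
    camera_id: str
    action: str      # "liveview started"
    started_at: str

@dataclass
class AdequateAuditRecord:
    """Fields an AU-3-style audit record would add."""
    user: str
    camera_id: str
    action: str
    started_at: str
    ended_at: Optional[str]     # when viewing stopped
    duration_seconds: Optional[int]
    source: str                 # a station terminal, or a TV in another state?
    stated_purpose: str         # more than a free-text "investigation"
    reviewed_by: Optional[str]  # supervisory sign-off, if any
```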

Flock’s employees are seeing the end-result of multiple layers of failed policy, inadequate transparency, insufficient auditing, and no accountability. Employees at a private company should not have unescorted access to police surveillance data. If they had not had access, we would not be having this conversation. It’s that simple.

The principle of least privilege is not optional. Under CJIS Security Policy v6.0 it is control AC-6: access should be limited to what’s “necessary to accomplish assigned organizational tasks.”[2] Vendor and contractor access falls under PS-7 (External Personnel Security). Account management is AC-2. The audit controls that would normally catch any of this are AU-2, AU-3, and AU-9. Nearly identical controls exist under SOC 2 and ISO 27001, both certifications Flock touts.

Months after the issue was first raised, Flock now claims the activity was approved under “the city’s demo partner agreement.” Flock did not provide its terms. Dunwoody never produced it in response to Jason’s open records requests. Flock employees at the March and April meetings didn’t mention it. The police chief and IT director stayed silent on it during the audit presentation at the March meeting. The mayor didn’t mention it when she addressed the issue at the April council meeting.

Dunwoody has now signed the deal. The incentive to stay on-message is gone, and Flock has moved directly to publicly accusing its “partner” of hiding an agreement, as post-hoc justification for its own violation of public trust.

On Being Accused of Accusing People

Flock’s post includes this line, which is the most carefully lawyered sentence in it:

Accusing someone of spying on children is not a policy disagreement; it is a life-altering allegation.

Correct. Fortunately, no one in this story has made that accusation. Jason asked for an explanation—in writing, to the Dunwoody city council, on January 12, and every month since. What Flock has now done, three months later, is respond to an accusation Jason did not make.

Flock employees had the technical capability to watch children at a community center and accessed cameras pointed at those children. Whether any individual Flock employee used that capability maliciously is unknown and largely beside the point. The capability is the problem. The access is the problem. The absence of any meaningful oversight is the problem.

Josh Thomas would like the story to be about what is in a sales executive’s heart, because that is a story he can win. The story he can’t win is the one about Flock’s architecture.

What the Logs Actually Show

Flock’s post frames the Dunwoody events as a single routine demo at an unusually sensitive location. The event logs Jason obtained by open records request show 185 JCC-camera accesses by Flock VP Bob Carter alone since January 2025.

The network sharing is even worse. The JCC’s private camera network, labeled in Flock’s system “Dunwoody GA PD - Atlanta JCC Avigilon (Do Not Share),” was at one point actively shared by Dunwoody PD with three outside agencies, including Lawrenceville GA PD, which received permissions to view, record, and download live video streams.

That sharing was removed only after Jason disclosed it to Dunwoody’s chief, and the removal was performed by a user (“John Watson”) not in the user export—which should include historical users. A ghost administrator corrected a misconfiguration that was not supposed to exist in the first place.

At the March council meeting, Dunwoody’s own lieutenant told the public that only two neighboring agencies view live streams and that liveview access is “strictly reviewed and on a case by case basis.” The logs show 1,271 agencies with access. And they record no liveview access by any agency at all, including the two confirmed active users.

This is the environment in which Flock employees, in Josh Thomas’s description, are “well-intentioned” and “accessed a camera network with the city’s explicit permission.”

They may be. There is no way to know.

Strengthened Search Safeguards

This takes up the most space in Flock’s post; we can keep it short here. Flock describes its existing broken AI-based “FreeForm” moderation system, which did exactly nothing to prevent anything that happened here.

Testing Technology in Real World Environments

Mentioned in the same breath as “development practices.” Flock does not distinguish between “development,” “testing,” and “production”—in its post or in practice. It’s not an uncommon problem for venture-backed software companies, but it’s not a small one for Flock. I have written about this many times before, and Flock continues to signal it will do nothing to address it.

Flock’s approach is to let its developers and sales execs loose on a real police department’s account, connected to real cameras, pointed at real people—and, yes, real children.
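None of this would be hard to fix. A minimal sketch, with entirely hypothetical names, of the ordinary guard that keeps demo and test sessions away from production data:

```python
from dataclasses import dataclass
from enum import Enum

# Entirely hypothetical names; a sketch of ordinary environment
# separation, not a description of Flock's systems.

class Env(Enum):
    PRODUCTION = "production"  # real agency accounts, real footage
    TEST = "test"              # engineering, synthetic footage only
    DEMO = "demo"              # sales, seeded sample footage only

@dataclass
class Session:
    env: Env
    data_source: str  # "live" or "synthetic"


def open_stream(session: Session, camera_id: str) -> None:
    # The guard: non-production sessions may never touch live streams.
    if session.env is not Env.PRODUCTION and session.data_source == "live":
        raise PermissionError(
            f"{session.env.value} sessions cannot open live streams"
        )
    print(f"opening {session.data_source} stream for {camera_id}")


open_stream(Session(Env.DEMO, "synthetic"), "cam-demo-01")  # fine
open_stream(Session(Env.DEMO, "live"), "cam-jcc-07")        # raises
```

Under that model, a sales demo runs against seeded sample footage, and no live camera in any city is reachable from a demo session.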

The Cybertruck example Flock offers is this:

Here’s a concrete example: when the Tesla Cybertruck came out, we had to build a whole new ML algorithm to identify it. Nothing had been seen like that before. This requires testing and training the models in real-world conditions.

“A whole new ML algorithm” is an overstatement. Flock was failing to detect the Cybertruck as a car (or truck, or whatever it is). That’s a training task, not a new algorithm, and an entire industry exists to support exactly that kind of image-recognition training.

Even if Flock does all its ML work in-house, whether overseas or not, and uses only data collected under its government contracts, all it requires is an image and someone to answer: “Cybertruck or not Cybertruck?”
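For the record, here is roughly what that looks like: a minimal PyTorch sketch of the labeling-plus-fine-tuning job, assuming a hypothetical folder of still images exported offline. It is illustrative, not Flock’s pipeline.

```python
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Assumes "images/" holds two labeled subfolders, e.g. "cybertruck/"
# and "other/", of still frames exported offline. Hypothetical paths;
# nothing here touches a live account or a live stream.
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
loader = DataLoader(datasets.ImageFolder("images/", transform=tfm),
                    batch_size=32, shuffle=True)

# Reuse a pretrained backbone; freeze it and train only a new 2-class head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)  # cybertruck / not cybertruck

opt = optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
```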

Nobody at Flock needs access to a police account. Not for software development. Not for sales demos.

The Remediation

Flock describes its fix this way:

Although the camera was only viewed once during a routine demo, we understand that this is a sensitive location for many. We have therefore determined that employees will be trained to only conduct demos in more public locations, like retail parking lots.

So the reform is: Flock sales employees will continue to log into police surveillance systems, run demos against live resident data, and view live camera feeds. They will just point the cameras at people and children in more public places.

There is no commitment to stop using production police accounts for sales demos. No commitment to separate development, test, and production environments. No commitment to publish the demo partner agreements. No commitment to audit, retroactively, every access a Flock employee has made to Dunwoody’s cameras. No changes to the logs themselves. Nothing structural.

Jason’s records work also documented Flock employees using Dunwoody’s system to create API connections to third parties with whom Dunwoody has no contract; data funneled through those integrations falls outside any contractual framework. This will not be addressed.

Flock’s repetition that “local agencies—not Flock—control who can access their data” falls especially flat when it’s delivered in the same post where Flock argues that it needs access to that data because it “must be tested and demoed, both to ensure we get everything right on the technical side and so other agencies and businesses understand how the sharing works.”

If Dunwoody PD authorized Flock to share these video streams with “other agencies and businesses,” then that is perhaps even more problematic than broken vendor policies and architectures. It’s a police agency acting entirely outside the scope of its lawful duties, to the detriment of the local community.

If true—if the Dunwoody chief of police allowed video from within the community center to be shared with “other agencies and businesses” without being authorized to do so by the council—he deserves to be held accountable.

The signature on the demo agreement will tell.

Addendum to My Previous Post

In my previous post I wrote:

The city’s new MSA does not prohibit Flock from accessing Dunwoody’s account, and continues to grant Flock a royalty-free license to “support and improve Flock’s products and services,” which arguably describes what happened here. The license has no specified term and cannot be revoked.

That remains true, but it understated Flock’s asserted basis for access. I had assumed Flock would rely on its license for business purposes. Instead, per the blog post:

Similarly, one of the benefits communities most value about Flock technology is the ability for law enforcement to directly access privately owned cameras, if and only if the organization allows them to, for crime-solving and security purposes. This is also a feature that must be tested and demoed, both to ensure we get everything right on the technical side and so other agencies and businesses understand how the sharing works.

In a deeply Nixonian “when I do it it’s not illegal” move, Flock treats “demos” for “other agencies and businesses” as part of the government agency’s “crime-solving and security purposes.”

That’s Flock’s real-world interpretation of “the customer owns 100% of the data” and “Flock does not access the data.”

What You Can Do

Flock has now publicly asserted that side agreements authorizing vendor access to police surveillance systems are standard practice. If that is true, such agreements may exist in your city.

They are almost certainly not posted on any public agenda. They were not, in Dunwoody, produced in response to ordinary records requests until Flock itself acknowledged them.

Consider filing a public-records request with your city or police department for any agreement or other record showing whether your agency has entered into a demo or testing arrangement with Flock.

If you obtain any such agreements, or if your agency confirms none exist, I’d love it if you let me know.

Parents across the country have a right to know whether Flock employees are watching cameras in their local daycares, community centers, and schools—whether the reason is software development, testing, sales demos, or something else.


  1. Well, they did—but they handwaved them away. Discussed in that post. ↩︎

  2. CJIS v6.0 adopts the NIST SP 800-53 Rev. 5 control designations; AC-6, AC-2, PS-7, and the AU-family audit controls are the control identifiers used throughout the policy. The full policy, released December 27, 2024, is a 600-page document organized into 20 policy areas with over 1,300 subcontrols. P1 controls (including AC-2, AC-6, and the core AU controls) are immediately auditable; full compliance with all priority levels is required by October 1, 2027. ↩︎