Flock Hardcoded the Password for America's Surveillance Infrastructure 53 Times

6 days ago (nexanet.ai)

I don't care that Flock was involved; I care that there are no consequences when any corporation does this. How can this not result in fines or jail time?

Although I don’t like Flock, I’m a bit skeptical of the claims in the article. Most screenshots appear to be client-side JavaScript snippets, not API responses from this key.

In the bug bounty community, Google Maps API key leaks are a common false positive, because they are only used for billing purposes and don’t actually control access to any data. The article doesn’t really prove ArcGIS is any different.
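
For what it's worth, the usual way to triage a leak like this is to ask the key itself what it grants. Here is a rough sketch of that check (the placeholder key, and the assumption that Flock's is an ordinary ArcGIS Online key accepted as a token parameter, are mine, not the article's):

    # Rough triage of a leaked ArcGIS-style key: ask the portal what the key
    # can see. The endpoint is the standard ArcGIS Online sharing API; the key
    # value below is a hypothetical placeholder, not anything from the article.
    import requests

    LEAKED_KEY = "AAPK-example-placeholder"

    resp = requests.get(
        "https://www.arcgis.com/sharing/rest/portals/self",
        params={"f": "json", "token": LEAKED_KEY},
        timeout=10,
    )
    info = resp.json()

    if "error" in info:
        # Invalid, expired, or scoped so it can't read portal data:
        # comparable to a billing-only Google Maps key.
        print("Key grants no portal access:", info["error"].get("message"))
    else:
        # A privileged key returns its org and a privileges list, e.g. the
        # "portal:app:access:item" entries the article cites.
        print("Org:", info.get("name"))
        print("Privileges:", info.get("user", {}).get("privileges"))

If the first branch fires, the leak is closer to a Google Maps false positive; if the second one does, the article's claims would at least be checkable.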

  • Security for maps is basically impossible. Maps tend to have to be widely shared within government and engineering, and if you know what you're looking for, it's remarkably straightforward to find ways to access layers you would normally have to pay for. It's a consequence of the need to share data widely for a variety of purposes -- everything from zoning debates within a local county to maps for broadband funding across an entire country creates a public need to share mapping information. Keys don't get revoked once projects end, as that would make all the previously published links go stale, which makes life harder for everyone doing research and planning new projects.

    Moreover, university students in programs like architecture are given access to many map layers as part of the school's agreements with the organizations publishing the data. Without that access, students wouldn't be able to pick up the skills needed to do the work they will eventually be hired for. And if students can get data, then it's pretty much public.

    Privacy is becoming (or already is) nearly impossible in the 21st century.

    • privacy isn't impossible

      privacy while engaging with the digital world is

      it isn't hard to be private. you just can't live in or go near cities/towns as much.

I think the issue with Flock isn't that they're a joke security-wise; the issue is that they exist. If you want to police somebody, you don't have to police everyone. I'd argue watching my location at all times is an unreasonable search.

  • If someone followed me around 24x7 with a notebook, transcribing all my movements and carefully affixing photos of me to every page, it would be called Stalking, and I'm pretty sure I could win at least a restraining order against them in court.

    I don't get why we treat this any differently. The only difference is they're not as obvious.

    • you just described a private investigator.

      stalking requires some kind of menacing or whatnot. i seriously doubt a judge would grant a restraining order just because you think someone is following you without any interaction.

      >Stalking is a crime of power and control. It is a course of action directed at an individual that causes the victim to fear for their safety, and generally involves repeated visual or physical proximity, nonconsensual communication, and verbal, written, or implied threats.

  • I'm starting to think there should be a constitutional amendment specifying a right to privacy because the last few decades have shown they'll just keep pushing the boundaries otherwise.

    • The chances of a constitutional amendment, let alone one dedicated specifically to limiting the powers of law enforcement, are, and I'll go out on a limb and say I'm correct in this absolute statement, 0.

      There is zero chance of any amount of government in these United States cooperating in any fashion large enough to change the actual Constitution. Zero.

    • It's pretty useless. A (US) constitutional amendment would only protect Americans from US institutions.

      Us foreigners still have to deal with Americans spying on us. (And other countries spying on us.) And Americans still have to deal with non-American organisations spying on them.

Public camera feeds should be public

  • I agree with this, especially in the case of camera feeds that are run by organizations that are supposedly servicing the public.

    That being said, I do wonder if there is a point where we're just crowdsourcing the police state.

  • To most effectively enable stalking applications

    • I have proposed elsewhere that for companies like Flock doing surveillance of the public, it should be legally required for every company executive and board member to have their cameras, ALPR systems, audio surveillance, drone systems, etc. installed outside their homes, along their routes to work, and along the routes to their children's schools and their spouses' workplaces, with all of that data publicly accessible. And I'd suggest the same goes for the senior management and decision makers at every town, police department, and private company that signs a contract with them.

      "For their own safety", as they'd have us believe.

      Quis custodiet ipsos custodes?

    • If I was being stalked I'd rather have public surveillance data that I could compile (or pay somebody else to compile) versus relying on law enforcement, who has no duty to protect me.

      Making surveillance public levels the playing field for everybody.

    • ...people can just follow you in public. there's nothing illegal about that.

      there is no reasonable expectation of privacy in a public setting, nor should there be. anyone arguing there should be is giving up basic rights because they're scared.

      the issue is when public feeds get recorded and are allowed to be viewed at a later date. the data retention is the issue, not the privacy.

Has anyone had success getting their city to take down the Flock cameras? Ours just added them maybe a year and a half ago. They popped up in multiple nearby municipalities around the same time, I'm not sure if it was coordinated action or somehow pulled off at the county level.

Sheer incompetence. I hope (probably in vain) that police departments and local governments become more savvy technical evaluators of fancy tech solutions.

There was a huge fracas re: ShotSpotter in my town, where both the municipality's CIO and auditor (+ their internal research capacity) were sidelined. It took a sad amount of handholding to walk elected officials through ShotSpotter's technical claims before they shelved a planned deployment.

  • It’s not incompetence. This is simply not caring. If they had any interest in fixing this they would have. It just wasn’t at all important to them.

I am apprehensive of the surveillance state and its potential for misuse. However, the content of this disclosure is less than ideal:

- It mixes two separate issues: 1) an embedded default API key and 2) unauthenticated token minting

- The bulk of the disclosure focuses on enumerating sensitive data that, it is implied, could have been exposed via the default API key, but what was actually exposed is unclear: "The 50 "portal:app:access:item" privileges reference private item IDs that cannot be inventoried without actively querying each one which I did not do"

- The default API key was for "development" and there is no assertion that live data existed in that environment (though it wouldn't surprise me)

- The default API key was fixed in June 2025; it is only the token minting that has not been.

- The token minting issue is only asserted to "grant access to the geographic mapping of Flock's camera network locations", which would certainly be useful as a source for unethical updates to https://deflock.me/ but is obviously not nearly as sensitive.

(And I've always used bullets/lists in my communications, long before AI did this)

With respect to a different public organization with a reach of millions of people, I reported a similar vulnerability: an exposed key that serves sensitive data. Usually I don't bother, but this time it was bad. I now understand how these things are left exposed for months to years despite notification. The level of burnout or ignorance that leads to these vulnerabilities also elicits a harsh backlash, in which admitting there was ever a problem is treated as worse than exposing a vast amount of people's private data.

Flock is fond of saying this:

> "I'm writing to you directly because I want there to be zero confusion about what's happening. Flock has never been hacked. Ever."

They are just lying at this point. If you get involved in advocacy related to Flock, you will likely hear their reps parrot this. Be ready to combat it with concrete examples like this one!

  • Flock CEO: my home has never been broken into before. Ever.

    House guest: but sir, where are all of your belongings?

    Flock CEO: oh that, well I leave my front door open at all times. My home has never been broken into

  • But is it really hacking if they just give you the key?

    Am i breaking into your home when you leave the door wide open? /s

    • If you have a camera and you're only taking photos, then you don't have any photos of the car keys and the car going missing, do you? /s

      It's how urban exploration folk get away with exploring abandoned buildings here in the UK. If you can prove you didn't cause damage to gain access, it's a grey area.

      > Trespass (Civil Matter): In England and Wales, simple trespass is typically a civil matter between you and the landowner. You cannot be arrested for civil trespass alone, but the landowner can sue you for damages or an injunction, and police may get involved if you refuse to leave when asked.

In a sensible world, this would both destroy the company and get the owners jailed.

I have a controversial question: in the UK, they have "blade runners" who take down CCTV. I would have expected a more aggressive response in the USA, considering the culture. Is this not happening?

  • Our anti-police-state faction is toothless, while the "aggressive" faction is the one trying to install the police state.

  • They are not "taking down CCTV", they're destroying the infrastructure that reduces pollution from car fumes. These cameras are not used for anything else.

    You know, that thing killing school children: https://www.lbc.co.uk/article/air-pollution-ella-kissi-debra...

    The evidence for ULEZ is solid, so seriously bringing it up as an example of white-knight activity, while they're at best malignant, brainwashed goons, doesn't help anyone: https://www.bbc.co.uk/news/uk-england-london-67653609

    • I don't understand? I did some basic research, and it doesn't seem like these cameras have air quality sensors. How exactly would some ANPR cameras reduce pollution?

  • Somewhat, but if you don't have a few thousand to drop on a lawyer, the legal consequences of getting caught and brought to court will screw up your life. So it happens less.

    Not to mention the risk of dealing with trigger happy and corrupt cops.

  • I mean we're also increasingly being terrorized by our new gestapo, so far with limited resistance. We aren't really the "radical freedom defenders" we like to claim to be...

  • Americans are largely cowards. You can see this as we're still mostly afraid of accurately defining and educating about genocide and how we all contribute to it by going to work every day, as well as afraid of feelings that arise around it.

I love it when the entire HN comment section devolves into a mere public shaming square with absolutely no substance.

  • I mean, there is a certain level of incompetence at which that becomes the only reasonable response?

I wouldn't be surprised if the code is just Chinese stuff with a customisation on top.

Maybe it was on purpose. They might have been forced by the FBI to implement those keys, so they left everything open to be able to track the enforcers also. 53 = 52 states plus gov

Do the MBAs now running tech just have a hard-on for becoming the sci-fi dystopians they read about as children?

  • Not always; sometimes they like to role-play as fallen angels from fantasy books (see Palantir). (Edit: upon review, the metaphor is strained because Sauron didn't create the palantíri… he did control them later, and there is a deeper metaphor in that they are unreliable.)

  • CEO/founder of Flock has a BS in Electrical Engineering with highest honors from Georgia Tech, and does not appear to have an MBA.

Does anyone else feel like the LLM-tone of this article makes it difficult to understand what's actually important in it? It's not clear to me if the issue is ongoing (like it says) or that it's been resolved by rotating the API key (like it also says). And that's like, the most basic piece of information the article could have in it.

  • Obviously it's more than just tone. Based on the lack of structure and the wording, it's clearly substantially AI-written.

  • The article mentions two vulnerabilities. One was remediated June 2025. The other has not been remediated.

  • I hate that every article nowadays has to be judged on whether it's AI or not.

    So annoying.

    • For me it's not about "is this AI", it's "this writing is obnoxious and disrespectful of the reader, and here's why I think AI is likely at the root of it."

    • I'd like to read stuff written by a human. I know other people like reading LLM output. I don't see what's wrong with telling people whether it's AI-written or not.

Who could have guessed that the greedy, opportunistic, evil corporation whose sole intent is to invade our privacy in the name of "security" would be run by incompetents in the security realm?

  • Their CEO comes off as a real self-righteous character.

    One has to wonder whether these passwords were that way purposefully, to avoid accountability for privileged partners. Most of these systems are deployed with grant money that comes from the Department of Justice.

  • I'm surprised they didn't name it after some Tolkien reference that they completely misinterpreted...

  • FYI: Flock was/is a YC-backed company

    https://www.ycombinator.com/companies/flock-safety

    • > We are committed to protecting human privacy and mitigating bias in policing with the development of best-in-class technology rooted in ethical design, which unites civilians and public servants in pursuit of a safer, more equitable society.

      …and of course they do the exact opposite. All a bunch of bullshit from inception.

    • VC firms are behind the police state and the breakdown of world order in general.

      YC is not the good guy in this world.

  • Here's an elucidation, taking that question seriously, supplying a bunch of "Why's" --

    * https://medium.com/@ajay.monga73/why-developers-still-hardco...

    • A root-cause analysis here that's about intrinsic difficulty is misguided IMHO. Secrets and secrets-delivery are an environment service that individual developers shouldn't ever have to think about. If you cut platform/devops/secops teams to the bone because they aren't adding application features, or if you understaff or overwork seniors that are supposed to be reviewing work and mentoring, then you will leak eventually. Simple as. Cutting engineering budgets for marketing budgets and executive bonuses practically guarantees these kinds of problems. Engineering leadership should understand this and deep down, it usually does. So the most direct way to talk about this is usually acknowledging willful negligence and/or greed
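
      To make the "environment service" point concrete, here is a minimal sketch of the alternative to a baked-in default (the variable name is illustrative, not anything from Flock's code): the application never contains a literal key and refuses to start if delivery didn't happen.

        # Minimal sketch: the key comes from the environment (populated by a
        # secrets manager or the deploy pipeline), never from source code.
        # ARCGIS_API_KEY is an illustrative name, not one from the article.
        import os
        import sys

        def get_map_api_key() -> str:
            key = os.environ.get("ARCGIS_API_KEY")
            if not key:
                # Fail fast instead of silently falling back to a hardcoded
                # "development" default that ends up in the client bundle.
                sys.exit("ARCGIS_API_KEY is not set; refusing to start")
            return key

      The failure mode in the article largely disappears when there is no baked-in default to fall back to; that's an organizational choice, not a hard engineering problem.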

Then time for responsible disclosure or CFAA charges.

  • You could just read the article before knee-jerking to state repression.

    > November 13, 2025 — Initial disclosure sent to Flock Safety security team

    > November 14, 2025 — First follow-up requesting confirmation of receipt

    > November 19, 2025 — Second follow-up; Flock Safety finally acknowledges receipt

    > January 7, 2026 — Vulnerability remains unpatched (55+ days)

    > I am withholding specific technical details to prevent exploitation while the vulnerability remains unpatched. However, its existence more than 55 days after responsible disclosure with no remediation, demonstrates a systemic pattern of credential mismanagement.

In fairness to Flock, they just hired a CISO and are actively recruiting for a head of product security and privacy as well. So I'm not surprised they're dealing with some of this.

Edit: I'm standing by it. The person they hired for it has a good track record elsewhere. And much as I don't like what Flock is building as a company, at least they're building security in now, even if it wasn't front of mind for them in the past.

He's got his work cut out for him though.

  • That's fairness to a new employee. Does a multibillion-dollar company with a widely deployed, sensitive product deserve a pass for previously having poor or nonexistent people doing security? Not really, IMO.

  • > And much as I don't like what Flock is building as a company, at least they're building security in now,

    This phrasing implies that the "building security in now" part improves (or decreases the awfulness of) what you don't like.

    If what you don't like = bulk, systemic surveillance (of people not suspected of a crime), how does fixing broken security make that less awful?

  • There should be no "fairness to Flock"; they're building the panopticon. Freethinking Americans should do what they can to dismantle this overreach and lobby their city leaders about Flock's poor track record on security, and thereby safety.

  • I'm fine giving the new employees a pass on this, but not the company as a whole. Not building security into a product like this from day one should be a criminal offense.