Correct. Flock sells cameras and platform access, but gives data from their shared, nationwide surveillance utility to ICE and law enforcement.
https://www.aclu.org/news/privacy-technology/flock-massachus...
https://www.404media.co/ice-taps-into-nationwide-ai-enabled-...
https://www.aclu.org/news/national-security/surveillance-com...
I don't like that this is the case, but you understand that a pretty huge fraction of the country doesn't share your political premise that providing data for immigration enforcement is unethical, right? (I share it, but that shouldn't matter for the analysis.)
It seems weird to me to hyperfocus on Flock's role here rather than on the role your own local municipalities play in deciding how to configure these things. Not sharing with ICE is apparently quite doable, at least to the point of requiring a court order to get access to the data, which is a vulnerability all online cameras share.
Later: s/company/country, thanks for the correction!
Did you mean country?
>but gives data from their shared, nationwide surveillance utility to ICE and law enforcement
I don't think anyone with a network like that can refuse to "give" the contents to the feds for very long without drawing ire.
Ah yeah the poor businessmen, god forbid they draw ire. Much better to just act unethically.
1 reply →
As the CEO of Flock, don't you feel you have more information to offer this community than the "we do not sell data" statement you've made over and over? The fact that you do not engage here with the ethical aspects of your product doesn't look good for you, and it only deepens suspicion that something darker is going on behind your doors.
[flagged]
The comment adjacent to mine links to several findings, including from the EFF, casting doubt on your assertions here, specifically the case of Texas using Flock data outside its jurisdiction (at a national level, even) against abortion seekers. You have no substantial comment to make on those, or on any of the other active discussions that have spawned on this platform over the past year? You're obviously reading them, yet you only remain "consistent" on a technicality.
What steps is Flock taking to address the privacy overreach? Do you have data sharing agreements with Palantir? If so, do they respect the same geofencing properties that your clients supposedly have full control over?
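(To make the "geofencing properties" part of that question concrete, here is a rough Python sketch of the kind of jurisdiction filter I mean. Every name in it is invented for illustration; it is not Flock's actual API or data model.)

    from dataclasses import dataclass

    @dataclass
    class Geofence:
        # A client's jurisdiction, reduced to a bounding box for the sketch;
        # a real system would use polygons, but the principle is the same.
        min_lat: float
        max_lat: float
        min_lon: float
        max_lon: float

        def contains(self, lat: float, lon: float) -> bool:
            return (self.min_lat <= lat <= self.max_lat
                    and self.min_lon <= lon <= self.max_lon)

    def visible_records(records: list[dict], fence: Geofence) -> list[dict]:
        # A geofence only means anything if every access path, including any
        # third party with a side agreement, goes through this filter.
        return [r for r in records if fence.contains(r["lat"], r["lon"])]

The point being: a geofence enforced at one query endpoint is moot if a data-sharing agreement hands out the raw feed around it.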
3 replies →
From the ACLU article above:
Every community in the nation that is home to Flock cameras should look at the user agreement between their police department (or other Flock customers) and the company, to see whether it contains a clause stating that the customer “hereby grants Flock” a “worldwide, perpetual, royalty-free right and license” to “disclose the Agency Data… for investigative purposes.” This is the language that will govern in a community unless a department demands changes to the standard user agreement that Flock offers. That is something we absolutely urge any agencies doing business with Flock to do — and, the ACLU of Massachusetts found, is exactly what the Boston police department did.
---
What assurance does any member of the public have that your company does not, and will not ever, share data over which you claim a "worldwide, perpetual, royalty-free right and license"? Are you saying that the "customer" has the ability to set a "do not share" flag or something? What happens when they flip that flag at some point in the future? What redress does a victim have if you share data you did not, at that point in time, have permission to share?
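(A hypothetical sketch of why the timing of that flag matters; none of these names reflect Flock's real system, they just make the question concrete.)

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Agency:
        name: str
        sharing_enabled: bool  # the hypothetical "do not share" flag, inverted

    @dataclass
    class PlateRead:
        agency: Agency
        captured_at: datetime

    def may_disclose(read: PlateRead) -> bool:
        # If permission is evaluated only at disclosure time, everything
        # captured while the flag was on can be shared right up until the
        # moment it is flipped, and flipping it afterwards does nothing to
        # recall copies that have already left the system.
        return read.agency.sharing_enabled

Which is exactly the redress problem: once a copy has been disclosed under one flag setting, no later flag flip un-shares it.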
>Welcome feedback or new ideas to make our communities safe.
Nurture, not control.
Living wage.
Access to day care.
1 reply →
This is totally disingenuous.
You are selling tools that have zero upside, plenty of downsides, and are used for the structural violation of citizens' privacy. Don't hide behind the claim that you're trying to help people stay safe; that is not what you are doing. And if you believe you can take credit for the upsides, then you really should take responsibility for the downsides.
10 replies →
[dead]
Royal we.
This is part of the problem with Flock, IMO. Lack of adherence to or support of norms. Psychopathy actualized as a corporation.
The societal impact of the disruption of trust and of personal privacy is under-appreciated by the corporation; it's concerned with winning profit.
(Meta) It's an unspecific argument I'm lazily laying out, yes; however, the problem is ridiculously obvious.
We should not have to ask to be respected, and here we are.
Democracy (both its systems and participation in them), truth, self-respect and an understanding of one's own rights ... those qualities are dying under the relentless, toxic, ethically under-examined capitalization of our laws and resources. (Especially in the USA, I suspect, compared to countries with stronger corporate-social-responsibility norms.)
Tech disruption is amazing to watch, and participate in, like a fire consuming the forest. "But what about the children?"
That's an oddly specific answer.
Seems like a broad dismissal of the claim made upthread ("Flock sells its data to ICE and law enforcement"). Why do you think it is excessively specific?
Because the specific parts here are that the data is sold, and that Flock is the one doing it.
Flock does not sell data; they willingly give it away for free. And, technically, they don't do it themselves: their customers do, and Flock knows and lets them.
Personally, I think this is worse. But they don't, specifically, sell data.
4 replies →
[dead]