Comment by JohnMakin
9 months ago
> Anyone have experience with these sort of services?
Quite a bit. Often if you request removal or opt-out, you'll reappear in a matter of a few months in their system, regardless of whether you use a professional service as a proxy or do it yourself. The data brokers usually go out of their way to be annoying about it and will claim they can't do anything about you showing up in their aggregated sources later on. They'll never tell you what these sources are. A lot of them will share data with each other, stuff that's not public. It's entirely hostile and should be illegal. I am trying to craft a lawsuit angle at the moment but they feel totally unassailable.
I'm extremely skeptical of any service that claims it can guarantee 100% removal after any length of time longer than six months. From my technical viewpoint and experience, it is very much an unsolved problem.
My understanding is that there's a bit of a catch-22 with data removal: if you request that a data broker remove ALL of your information, it's impossible for them to keep you from reappearing in their sources later, because blocking you would require them to retain your information (so they can filter you out if you appear again).
I’ve heard this claim, but they could use some sort of Bloom filter or cryptographic hashing to block profiles that contain previously-removed records.
There could also be a shared, trusted opt-out service that accepted information and returned a boolean saying “opt-out” or “opt-in”.
Ideally, it’d return “opt-out” in the no-information case.
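To make the Bloom-filter idea concrete, here's a minimal stdlib-only sketch (the class name, sizing parameters, and email address are all invented for illustration): records added to the filter can be checked later without the broker retaining the raw data, at the cost of occasional false positives.

```python
import hashlib

class OptOutFilter:
    """Tiny Bloom filter: remembers opt-out keys without storing the raw data."""

    def __init__(self, size_bits=1 << 20, num_hashes=4):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, key: str):
        # Derive num_hashes independent bit positions from one SHA-256 family.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, key: str):
        for pos in self._positions(key):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def probably_contains(self, key: str) -> bool:
        # No false negatives; false positives possible (rare at this sizing).
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(key))

f = OptOutFilter()
f.add("jane.doe@example.com")   # hypothetical opted-out identifier
assert f.probably_contains("jane.doe@example.com")
```

The catch, as the replies below get into, is that the key you add has to match the key you later check byte for byte.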
Hash-based solutions aren't as easy as we might hope.
You store a hashed version of my SSN, or my phone number, to represent my opt-out? Someone can just hash every number from 000-00-0000 to 999-99-9999 and figure out mine from that.
You hash the entire contents of the profile - name+address+phone+e-mail+DOB+SSN - and the moment a data source provides them with a profile only containing name+address+email - the missing fields mean the hashes won't match.
A trusted third party will work a lot better IMHO.
And of course none of the data brokers have much reason to make opt-outs work well, in the absence of legislation and strict enforcement - it's in their commercial interests to say they "can't stop your data reappearing"
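Both failure modes described above are easy to demonstrate. A quick sketch with invented values (the SSN and profile fields are made up; the brute-force demo runs over only the last four digits to stay fast, but the full 10^9 space is the same idea, hours instead of milliseconds):

```python
import hashlib

def h(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

# 1. Hashing a small identifier space gives no privacy: the SSN space is
#    only 10^9 values, so any unsalted hash can be inverted by brute force.
target = h("123-45-6789")  # hypothetical SSN, hashed to represent an "opt-out"
found = next(s for s in (f"123-45-{i:04d}" for i in range(10000)) if h(s) == target)
assert found == "123-45-6789"

# 2. Hashing the whole profile breaks the moment a field is missing:
full    = h("Jane Doe|123 Main St|555-0100|jane@example.com|1980-01-01|123-45-6789")
partial = h("Jane Doe|123 Main St|jane@example.com")
assert full != partial  # new source omits phone/DOB/SSN, so no match
```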
Yup, it would be trivial to create a one-way hash of various attributes to perma ‘opt out’ someone.
But how would they keep making money that way?
So for a perfect match they'd need to have some sort of unique identifier that's present in the first set of data you ask them to remove, as well as being present in any subsequent "acquisitions" or "scrapes" of your data.
If the devs who scrape/dump/collate all this info are anything like the ones I've seen, and they're operating in countries like the US and UK where there's no truly unique individual identifier, then I'd say the chance of them obtaining such a "unique" key on you to remove you perpetually is next to impossible. And if it's even close to being "hard", they won't bother. Doubly so if this service/people/data is anything like the credit-score companies, which are notoriously bad at data deduplication and sanitization.
Likewise, if you want them to do some sort of removal using things other than a unique identifier, then you have to have some sort of function that determines closeness between the two records. From what I've heard, places like Interpol, countries' border-control and police agencies usually use name, surname and dob as a combination to match. Amazingly unique and unchanging combination, that one! /s
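The point above cuts both ways, and a quick illustration shows it, using Python's stdlib `difflib` as a stand-in for whatever similarity metric a broker might use (all the records here are invented):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude string closeness in [0, 1]; a stand-in for real record linkage."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Two records for the same person, entered by different data sources:
rec_a = "Jon Smith 1984-03-12"
rec_b = "John Smyth 12/03/1984"
# A genuinely different person who happens to share the name+DOB combo:
rec_c = "Jon Smith 1984-03-12"

assert similarity(rec_a, rec_c) == 1.0  # different people, perfect "match"
assert similarity(rec_a, rec_b) < 1.0   # same person, imperfect match
```

Any closeness threshold you pick either merges distinct people or fails to link the same person across sources; usually both.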
Sorry, I value my legal rights over the viability of the data broker industry. If they can’t figure out a way to lawfully avoid collecting my data, they should not collect data, period.
I mean, if we’re not allowed to know that we’re not allowed to surveil the shit out of you, it seems like something we can’t worry about
1. They could be required to store a private copy of the removal requests, data that they can't sell (not ideal)
2. Sounds like "data brokers" that sell private information just shouldn't exist...
> They could be required to store a private copy of the removal requests
They would leak that in the next data breach.
They could store a hash.
Which would never work, because real-life data is messy, so the hashes would not match. Even something as simple as SSN + DOB runs into loads of potential formatting and data-entry issues you'd have to solve perfectly before such a system could work, and even that makes assumptions about what data will be available from each dataset. Some may have only name and address. Some may include DOB, but the person might have lied about their DOB when filling out the form. The people entering it might have misspelled their name. It might be a person who put in a fake SSN because they're an illegal immigrant without a real one. Data correlation in the real world is a nightmare.
When you tell a data broker to delete all of the data about you, how can you be sure they get ALL of the data about you, including the records where your name is misspelled, or the DoB is wrong, or it lists an old address? Even worse if someone comes around later, discovers the orphan data while adding new data about you, and fixes the glitch, effectively undoing the delete.
It's a catch-22 that if you want them to not collect data about you they need a full profile on you in order to be able to reject new data. A profile that they will need to keep up-to-date, which is what they were doing already.
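To illustrate the formatting problem concretely: two sources recording the same person produce different hashes unless every field is normalized first, and the normalizer has to anticipate every format in advance. A stdlib-only sketch with invented values:

```python
import hashlib
import re
from datetime import datetime

def naive_key(ssn: str, dob: str) -> str:
    return hashlib.sha256(f"{ssn}|{dob}".encode()).hexdigest()

# The same person, as two data sources might record them:
a = naive_key("123-45-6789", "1980-01-02")
b = naive_key("123456789",   "01/02/1980")
assert a != b  # identical person, different hashes: the opt-out silently fails

def normalized_key(ssn: str, dob: str) -> str:
    digits = re.sub(r"\D", "", ssn)  # "123-45-6789" -> "123456789"
    for fmt in ("%Y-%m-%d", "%m/%d/%Y"):  # every expected format, listed by hand
        try:
            iso = datetime.strptime(dob, fmt).date().isoformat()
            break
        except ValueError:
            continue
    else:
        raise ValueError(f"unrecognized DOB format: {dob}")
    return hashlib.sha256(f"{digits}|{iso}".encode()).hexdigest()

assert normalized_key("123-45-6789", "1980-01-02") == \
       normalized_key("123456789", "01/02/1980")
```

And normalization says nothing about misspelled names, fake SSNs, ambiguous DD/MM vs MM/DD dates, or fields that are simply absent from a given source.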
I've had a very bad experience with Liberty Mutual following a data opt-out from another service. They sent me on a runaround, ending with an email saying to follow "this link" to verify myself. (There was no link, just sketchiness.) I eventually got a human on the phone through special means, and they sent me a fixed email with a working link.
I should be hearing back from them in the next 32 days, as this was 13 days ago.
I got a quote from them and immediately initiated a data removal request. It seems like it went through, got a link in the email. Thanks for the reminder that I might need to follow up to make sure they followed through.
It's hard to make collection, aggregation, and sharing of facts illegal.
Not to minimize the harm that can be done by such collections, but the law is justifiably looking for a scalpel treatment here to address the specific problem without putting the quest to understand reality on the wrong side of the line.
> It's hard to make collection, aggregation, and sharing of facts illegal.
Sure, but the US has a precedent in HIPAA. Not saying it's copy-paste, but... maybe it should be.
I would prefer the law be more restrictive than less, because I don't believe this is true:
> law is justifiably looking for a scalpel treatment here to address the specific problem without putting the quest to understand reality on the wrong side of the line.
I believe the law may use that noble goal as cover for the actual goal: restricting capital holders' ability to accumulate capital as little as possible. Data sharing isn't a public good in any way. It's mostly not even useful for the targeting purposes it claims. It's extremely reckless rent-seeking that knowingly allows innocent people to have their lives wrecked by identity theft.
As someone who helps care for elderly relatives with widely-dispersed out-of-state families, I can point to HIPAA as an excellent example of why crafting this kind of law is difficult.
I think we are going to discover, once people do the research, that HIPAA has done net harm by delaying flow of information for critical-care patients resulting in lack of patient compliance, confusion, and treatment error.
Yes, there is harm potential in insurance companies denying coverage or claims because they are privy to too much information about clients (a scenario that, I'd note, we could address directly by law via a national healthcare system or banning denial of coverage for various reasons) or by employers or hostile actors (including family) discovering medical facts about a patient. I have to weigh that harm potential against my day-to-day of having to fight uphill to get quality care because every specialist, every facility, and every department needs a properly-updated HIPAA directive for a patient (and the divisions between these categories aren't clear to the average non-medical observer).
Europe figured it out.
Sure, I should probably have clarified "In the United States," where there's a First Amendment that most attempts to make fact-sharing illegal immediately fall afoul of.
There are definitely exceptions, but it puts strict scrutiny on any novel prior restraint of speech.
Instead of making it illegal, we could simply make the people who aggregate the data liable for making people whole if the data is misused.
This is true and nothing new. Mass "gray market" personal-information services have leapt into markets since Visa and Mastercard fifty years ago, and somewhat before that with driving records, in the USA. The "pure land" of democracy in North America was never pure, and the Bad Old Ways have crept into the corners since the beginning.
The difference now though is an attempt to legislate personal data collection, such as the CCPA. I strongly believe they are violating the law, and that if I opt-out or request removal, an answer of "oh well nuthin we can do" is not acceptable when my data re-appears either on their platform or on another platform they provided data aggregation services to.
>The "pure land" of democracy in North America was never pure
Don't mix your pet grievances together. Having full public knowledge of every person in your country is, frankly, democratizing: an aid to democracy, not a hindrance. Not saying I want to live in that world, but it's not an impure democracy.
Norway (and others?) already publishes everybody's income statements. Not healthy, IMO, but I guess it would aid more accurate snitching (and envious resentment).