Health care data breach affects over 600k patients, Illinois agency says

1 day ago (nprillinois.org)

Unfortunately there's no money in privacy, and a lot of money in either outright selling data or cutting costs to the bare minimum required to avoid legal liability.

Wife and I are expecting our third child, and despite my not doing much googling or research into it (we already know a lot from the first two) the algorithms across the board found out somehow. Even my instagram "Explore" tab that I accidentally select every now and then started getting weirdly filled with pictures of pregnant women.

It is what it is at this point. Also I finally got my last settlement check from Equifax, which paid for Chipotle. Yay!

  • Interestingly, in healthcare there is a correlation between companies that license or sell healthcare data to others (usually in a revocable way under very stringent legal terms, though sometimes they just sell it outright if enough money is involved) and their privacy stance... and it's not what you would think. Often it's these very companies that push for more stringent privacy laws and practices. For example, they may claim they cannot share anonymized data with academic researchers because of xyz virtuous privacy rules, when they are actually the ones making money off selling patient data. It's an interesting phenomenon I have observed while working in the industry, and it seems to refute your claim that "there's no money in privacy". Another way to think about it: they want to reduce the overall supply of the commodity they are selling, and they do this by championing privacy rules.

  • As new moms tend to change their consumer purchasing habits, they are coveted by advertisers. http://www.nytimes.com/2012/02/19/magazine/shopping-habits.h... Certain cohorts and keywords are very valuable, so even searching a medical condition once, or clicking on a hiring ad for an in-demand job, can shift your ads in that direction for a long time.

    • It seems more important than ever to have self-hosted apps or browser extensions that intermittently search for these valuable keywords. AdNauseam is much better than bare uBlock Origin for the same reason.

    • Yeah I'm less shocked that it got picked up and more how quickly it spread to literally every platform we use, even those that wouldn't have much if any hint that it was happening.

      There's clearly quite the active market for this information

  • > the algorithms across the board found out somehow.

    It's worth keeping in mind that this is basically untrue.

    In most of these algorithms, there's no "is_expecting: True" field. There are just some strange vectors of mysterious numbers, which can be more or less similar to other vectors of mysterious numbers.

    The algorithms have figured out that certain ad vectors are more likely to be clicked if your user vector exhibits some pattern, and that some actions (keywords, purchases, slowing down your scroll speed when you see a particular image) should make your vector go in that direction.
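
A minimal sketch of what that mechanism looks like (the vectors, dimensions, and numbers here are entirely hypothetical; production systems use learned embeddings with hundreds of opaque dimensions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def nudge(user_vec, signal_vec, rate=0.1):
    """Move the user vector a small step toward an observed signal."""
    return [u + rate * (s - u) for u, s in zip(user_vec, signal_vec)]

# Hypothetical 4-dimensional vectors.
user = [0.1, 0.0, 0.3, 0.2]
maternity_signal = [0.9, 0.1, 0.8, 0.4]  # e.g. lingering on a certain image
maternity_ad = [0.8, 0.2, 0.7, 0.5]

before = cosine(user, maternity_ad)
for _ in range(5):          # a handful of weak behavioral signals...
    user = nudge(user, maternity_signal)
after = cosine(user, maternity_ad)
print(after > before)       # the maternity ad now ranks higher for this user
```

There is never an explicit "is_expecting" flag anywhere: the user vector simply drifts toward a region of the space where maternity-related ad vectors score well.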

  • > Unfortunately there's no money in privacy

    But there should be, and there should be punishments for data breaches, or at least compensation for those affected. Then corporations would have an incentive to take their users' privacy more seriously.

    Your personal data is basically the currency of the digital world. This is why data about you is collected left, right, and center. It's valuable.

    When I trust a bank to safely lock away my grandmother's jewelry, I have to pay for it, but in return, if the bank gets broken into and my possessions are stolen, at least I'll get (some) compensation.

    When I give my valuable data to a company, I have already paid them (with the data itself), but I have no recourse whatsoever if they get compromised.

  • Also on the front page of HN right now is a job posting for Optery (YC W22). Seems like they are growing really fast.

  • Could be as simple as buying a bunch of scent-free soap/lotion and some specific vitamin supplements. Walmart/Target were able to detect pregnancy reliably back in 2012 from just their own shopping data.

  • Also possible they have your location if you went to the hospital. Maybe from any Meta "partners" or third party brokers.

FYI, there's a .gov-maintained portal where healthcare companies in the U.S. are legally obliged to publish data breaches. It's an interesting dataset!

https://ocrportal.hhs.gov/ocr/breach/breach_report.jsf

  • This is a suboptimal characterization of this site.

    I think it would be less wrong to say this is where covered entities that discover reportable breaches of PHI (whether their own or that of a BA) that trigger the immediate reporting obligation report them.

    This is a narrower scope of coverage and shallower depth of epistemic obligation than you implied.

  • One of my favorite HIPAA stories is about a doctor who used his patient list to send out campaign material when he was running for local office. Over two decades of schooling, and he still didn't understand how stupid that was.

Almost every company issues a 'we care about your privacy' statement, but there are often very few 'money where your mouth is' resources to back it up.

This is why I am almost always very reluctant to give out any information that is not absolutely necessary to provide me the service that I need. If they don't know it, they can't leak it.

Every company wants you to fill out their standard form that tries to get you to volunteer way more info than they really need.

  • If it's a paper form, I leave it blank. If it's a digital form and required, I put in the business's own phone number, address, etc.

Several maps created to assist the agency with decisions — like where to open new offices and allocate certain resources — were made public through incorrect privacy settings between 2021 and 2025 ... the mapping website was unable to identify who viewed the maps ... implemented a secure map policy that prohibits uploading customer data to public mapping websites.

So a state employee/contractor (doesn't say) uploaded unaggregated customer records to a mapping website hosted on the public internet?

And everyone was fired, the top management has stepped down, and the fines were so massive that nobody ever took a chance with sloppy security ever again. Oh, it's actually the opposite of all that.

The last time this happened, did the AG prosecute the person who discovered the vulnerable data?

  • Ah, I think I recall the story you're referring to: reporter Josh Renaud of the St. Louis Post-Dispatch discovered that a public web site was exposing Social Security numbers of teachers in Missouri. He notified the site's administrators, and later published a story about the leak after it was fixed.

    The governor of Missouri at the time, Mike Parson, called him a hacker and advocated prosecuting him. Fortunately the prosecutor's office declined to file charges though.

I've built healthcare SaaS APIs that required custom integrations with EHR partners, and have consulted on similar apps for others.

On top of common OWASP vulnerabilities, the bigger concern is that EHR and provider service apps do not have the robust security practices needed to defend against attacks. They aren't doing active pen testing, red-teaming, supply chain auditing -- all of the recurring and costly practices necessary to ensure asset security.

There are many regulations, HIPAA being the most notable, but their requirements and the audit process are incredibly primitive. They still assume a 1990s threat model. And despite HIPAA audits being expensive, the findings are trivial and the audits are not recurring, so vulnerabilities can be introduced between the audit itself and the delivery of its summary, let alone between audits.

I'm sure they "take security very seriously".

  • I will admit that a level of fatigue has reached me as well. I am not even sure what an appropriate remedy would be at this point. My information has been all over the place given multiple breaches over the past few years (and, I might add, my kid's info too, as we visited a hospital for her once).

    Anyway, short of collapsing the current data broker system, I am not sure what the answer is. The Experian debacle showed us they are too politically entrenched to be touched by regular means.

    At this point, I am going through life assuming most of my data is up for grabs. That is not a healthy way to live though.

    • >I am not even sure what would be an appropriate remedy at this point.

      It will have to be political, and the fines/damages have to be business-impacting enough that companies pause and ask: A) is it worth collecting this data and storing it forever? and B) if I don't treat infosec as an important business function, could it cost me my business?

      It's also clear that certification systems do not work, and any law/policy in this area should not offer any upside for acquiring them.

      EDIT: I also realize that in the United States, this won't happen.


    • This has nothing to do with the "data broker system." Reading between the lines it was more of a "shadow IT" issue where employees were using some presumably third-party GIS service for a legitimate business purpose but without a proper authentication & authorization setup.


    • Did you actually suffer any negative consequences of these breaches?

      I see so many comments about how punishments for data breaches should be increased, but not a single story about quantifiable harm that any of those commenters has suffered from them.

    • If you want to get more stressed about it and consider the impending dystopian future, I invite you to think about the “harvest now, decrypt later” reality that quantum computing is going to enable.

      At some point, everything that we have ever assumed to be confidential and secure will be exposed and up for grabs.


  • Would you like 2 years of credit monitoring? Or perhaps you can get $5 from this class action settlement.

    • I don't even understand paid credit monitoring.

      Each of the big three credit bureaus offers a free account that emails me when something changes and lets me freeze and thaw my credit.

Restrict data collection? It would kill all startups and firmly entrench a terrible provider monopoly among the few who can comply.

Have the government own data collection? Yeah, I don't even know where to start with all the problems this would cause.

Ignore it and let companies keep abusing customers? Nope.

Stop letting class-action lawsuits slap the company's wrists and then give $0.16 payouts to everyone?

What exactly do we do without killing innovation, building moats around incumbents, giving all the power to politicians who will just do what the lobbyists ask (statistically), or accepting things as is?

  • We apply crippling fines on companies and executives that let these breaches happen.

    Yes, some breaches (actual hack attacks) are unavoidable, so you don't slap a fine on every breach. But the vast majority of "breaches" are pure negligence.

  • Honestly, I'd take the 16 cents. Usually it's a discount voucher for a product you'd never buy.

    Or if it's a freebie, it's hidden behind a plain-text link three levels deep on their website.

  • > Restrict data collection? It would kill all startups and firmly entrance a terrible provider monopoly who can comply.

    That's a terrible argument for allowing our data to be sprayed everywhere. How about regulations with teeth that prohibit "dragons" from hoarding data about us? I do not care what the impact is on the "economy". That ship sailed with the current government in the US.

    Or, both more and less likely, cut us in on the revenue. That would at least compensate for some of the time we have to waste doing a bunch of work every time some company "loses" our data.

    I'm tired of subsidizing the wealth and capital class. Pay us for holding our data or make our data toxic.

    Obviously my health provider and my bank need my data. But no one else does. And if my bank or health provider need to share my data with a third party it should be anonymized and tokenized.

    None of this is hard; we simply lack the will (and most consumers, like voters, are pretty ignorant).

  • The solution is to anonymize all data at the source, i.e. use a unique randomized ID as the key instead of someone's name/SSN. Then the medical provider would store the UID->name mapping in a separate, easily secured (and ideally air-gapped) system, for the few times it was necessary to use.
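
The scheme described above could be sketched like this (all names and structures are hypothetical; a real deployment would put the identity mapping in a separately hosted, locked-down system rather than an in-process dict):

```python
import secrets

# Clinical records keyed by a random pseudonym; the pseudonym->identity
# mapping lives in a separate store, consulted only for the rare
# re-identification (and ideally air-gapped).
identity_vault = {}   # pseudonym -> real identity (separate, hardened system)
clinical_store = {}   # pseudonym -> medical records (the widely used system)

def register_patient(name, ssn):
    pid = secrets.token_hex(16)            # unique randomized ID
    identity_vault[pid] = {"name": name, "ssn": ssn}
    clinical_store[pid] = []
    return pid

def add_record(pid, record):
    clinical_store[pid].append(record)

pid = register_patient("Jane Doe", "123-45-6789")
add_record(pid, {"code": "Z34.0", "note": "routine prenatal visit"})

# A breach of clinical_store alone leaks records, but no names or SSNs.
print("Jane" in str(clinical_store))   # False
```

As the reply below this comment points out, though, stripping direct identifiers is necessary but nowhere near sufficient.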

    • > use a unique randomized ID as the key

      33 bits is all that are required to individually identify any person on Earth.

      If you'd like to extend that to the 420 billion or so who've lived since 1800, that extends to 39 bits, still a trivially small amount.

      Every bit[1] of leaked data halves that set, and simply anonymising IDs does virtually nothing by itself to obscure identity. Such critical medical and billing data as date of birth and postal code are themselves sufficient to narrow things down remarkably, let alone a specific set of diagnoses, procedures, providers, and medications. Much as browser fingerprints are often unique or nearly so without any universal identifier, so are medical histories.
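
The arithmetic is easy to check (the ~8 billion current world population is the only real input; the quasi-identifier bit counts below are illustrative guesses):

```python
import math

def bits_to_distinguish(population):
    """Minimum number of bits needed to give everyone a unique ID."""
    return math.ceil(math.log2(population))

print(bits_to_distinguish(8_000_000_000))   # 33 bits for everyone alive today

# Each leaked binary attribute can halve the candidate set, so a handful of
# quasi-identifiers narrows things down fast. Rough bit-counts (assumed):
remaining = 8_000_000_000
for bits in (7, 1, 10):   # ~birth year (~100 values), sex, ~1000 postal areas
    remaining //= 2 ** bits
print(remaining)          # only tens of thousands of candidates remain
```

Add a few diagnosis or procedure codes on top of that and the candidate set routinely collapses to one person.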

      I'm personally aware of diagnostic and procedure codes being used to identify "anonymised" patients across multiple datasets dating to the early 1990s, and of research into de-anonymisation in Australia as of the mid-to-late 1990s. Australia publishes anonymisation and privacy guidelines, e.g.:

      "Data De‑identification in Australia: Essential Compliance Guide"

      <https://sprintlaw.com.au/articles/data-de-identification-in-...>

      "De-identification and the Privacy Act" (2018)

      <https://www.oaic.gov.au/privacy/privacy-guidance-for-organis...>

      It's not sufficient merely to substitute an alternative primary key; you must also fuzz the data, including birthdates, addresses, diagnostic and procedure codes, treatment dates, etc., all of which both reduces the clinical value of the data and is difficult to do sufficiently.
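
As a toy illustration of what "fuzzing" means in practice (hypothetical field names and coarsening choices; real de-identification guidance, such as the HIPAA Safe Harbor rules, is far more involved):

```python
# Generalize quasi-identifiers before releasing a record: keep only the
# coarse value, trading clinical precision for a larger anonymity set.
def generalize(record):
    out = dict(record)
    out["birth_date"] = record["birth_date"][:4]      # keep year only
    out["postcode"] = record["postcode"][:2] + "**"   # truncate postal code
    out["admit_date"] = record["admit_date"][:7]      # keep year-month only
    return out

raw = {"birth_date": "1986-03-14", "postcode": "62704",
       "admit_date": "2024-11-03", "diagnosis": "O80"}
print(generalize(raw))
# {'birth_date': '1986', 'postcode': '62**', 'admit_date': '2024-11',
#  'diagnosis': 'O80'}
```

Even this leaves the diagnosis code intact, which, combined across datasets, can still single people out; that is exactly the hard part.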

      ________________________________

      Notes:

      1. In the "binary digit" sense, not in the colloquial "small increment" sense.

I've been saying this forever. Computer security is, and always will be, little more than theater, with some minimal effort to cover the bases, like hiring an infosec team and then ignoring it. No one in charge cares about security, because the number of people in charge punished for these breaches is still ZERO.

One more reason to overhaul the system: if a health care provider has a security incident, they should be sued for the value of the data, and if that bankrupts them, then other providers will (hopefully) learn from the mistake. Sort of like OSHA.

Sounds like some patients are in for some lucrative free credit and identity monitoring /s

I just heard a chorus of AI agents rejoicing that there's more private data now made public available to train on.

  • It's not often anybody writes a sentence that combines cynicism/negativity about AI with anthropomorphising AI agents!