CISA’s acting head uploaded sensitive files into public version of ChatGPT

2 days ago (politico.com)

Yay, on-premise LLMs are what's recommended for serious use, at least the US government thinks so :) But the rest of us need to pay subscriptions to 3rd-party businesses passing back and forth our... everything?

In the old days people used to say: "I have no secrets", and now we've evolved into "I know how to not upload important docs" ;)

So often it's the guys at the top, the ones treated as exceptions to the rules, who are the problem.

I knew some folks who worked in military communications, and they broke rules regularly because senior officers just didn't want to walk across the street to do something the secure way...

  • Have worked in places where juniors had to lock devices when on prem; only authorized hardware in the rooms. Yet the danger was from sloppy O6+, not the O1/GS-6 who would (ready and able) carry the water.

    There is a serious problem with folks who have power and authority and somehow no responsibility.

    That's true across government, the services, and corporate.

It’s absolutely necessary to have ChatGPT.com blocked from ITAR/EAR regulated organizations, such as aerospace, defense, etc. I’m really shocked this wasn’t already the case.

  • ITAR, yes, but there's no such thing as a person or organization that's not EAR-regulated. Everything exported from the US that's not covered by ITAR (State Department) is covered by EAR (Department of Commerce), even if only EAR99.

I really enjoyed unchecking all those cookie controls. Of the 1668 partner companies who are so interested in me, a good third have a "legitimate interest". With each wanting to drop several cookies, it seems odd that Privacy Badger only thinks there are 19 cookies to block. Could some of them be fakes - flooding the zone?

Damn. I forgot to read the article.

  • The same cookie can be shared with several partners or collected data can be passed to the partners.

    It's not a cookie law — it's a privacy law about sharing personal data. When I know your SSN and email address, I might want to sell that pairing to 1668 companies and I have to get your "consent" for each.

I, for one, after doing a bit of research, was shocked to find out that the person in question is apparently completely unqualified for the job (if his pasting sensitive information into the public ChatGPT didn't already make that abundantly clear). But the highlight from his Wikipedia page is this one:

>In December 2025, Politico reported that Gottumukkala had requested to see access to a controlled access program—an act that would require taking a polygraph—in June. Gottumukkala failed the polygraph in the final weeks of July. The Department of Homeland Security began investigating the circumstances surrounding the polygraph test the following month and suspended six career staffers, telling them that the polygraph did not need to be administered.[12]

So the guy failed a polygraph to access a highly controlled system full of confidential information, and the solution to that problem was to suspend the people in charge of ensuring the system was secure.

We're speedrunning America into the ground, and half the country is willfully ignorant of it happening.

It's bizarre that someone would choose the public 4o bot over the ChatGPT Pro-level bot in the properly siloed, compliant, Azure-hosted ChatGPT that was already available to them at the time. The government can use segregated, secure systems set up specifically for government use and sensitive documents.

It looks like he requested and got permission to work with "For Official Use Only" documents on ChatGPT 4o - the bureaucracy allowed it - and nobody bothered to intervene. Both the incompetence and the ignorance are ridiculous.

Fortunately, nothing important was involved - it was the bureaucratic "classified because everything gets classified" type of material - but if you're CISA leadership, you've got to be on the ball; you can't do newbie bullshit like this.
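
For what it's worth, the difference shows up right in the API call: an Azure OpenAI deployment points at your organization's own resource endpoint inside your tenant rather than at the public service. A minimal sketch in Python (the resource name, deployment name, and environment variable here are made up for illustration):

    import os
    from openai import AzureOpenAI  # the Azure client shipped in the openai package

    # Requests go to *your* Azure resource endpoint, inside your tenant,
    # not to chatgpt.com or api.openai.com.
    client = AzureOpenAI(
        azure_endpoint="https://my-agency-resource.openai.azure.com",  # hypothetical resource
        api_key=os.environ["AZURE_OPENAI_API_KEY"],                    # hypothetical env var name
        api_version="2024-02-01",
    )

    response = client.chat.completions.create(
        model="gpt-4o-internal",  # the name of *your* deployment, not a public model alias
        messages=[{"role": "user", "content": "Summarize this contracting document."}],
    )
    print(response.choices[0].message.content)

As Microsoft documents it, prompts and completions sent to a deployment like this stay within the customer's Azure boundary and aren't used to train the underlying models, which is the whole point of the "siloed" setup.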

  • > It's bizarre that someone would choose to use the public, 4o bot over the ChatGPT Pro level bot available in the properly siloed

    You're assuming the planted lackey has any knowledge of these tools.

People were already careless with social media which was openly public. I imagine it’ll be worse with these LLMs for the average person.

  • This is the real risk I think. Currently there are no means to even pretend to get anything deleted from LLMs either.

    • Yeah, and ultimately those tools will be used as advertising machines. You'll get hyper-specific targeted ads.

      I'm pretty pessimistic about the future with LLMs; I can't see them being a net positive for humanity in the long run.

The current United States government is staffed mostly with unserious people, or people who are serious about doing crimes against humanity. There's very little in between.

  • The vast majority of government staff are career professionals who know what they are doing, not political appointees who showed up in the past year.

    • Right, if we change parent-poster's "staffed mostly with" to "controlled mostly by", I think that's an adequate fix.

    • And who've been subjected to a firing spree I wouldn't wish on my worst enemy. It's the political appointees who are, frankly, there because of their connections and their willingness to, say, "work towards the ultimate leader".

There have to be GovCloud only LLMs just for this case.

I swear this government is headed by appointed nephews of appointed nephews.

I keep thinking back to that Chernobyl miniseries; the head of the science department used to run a shoe factory. No one needs to be competent at their job anymore.

  • The article says

    > [ChatGPT] is blocked for other Department of Homeland Security staff. Gottumukkala “was granted permission to use ChatGPT with DHS controls in place,” adding that the use was “short-term and limited.”

    He had a special exemption to use it as head of Cyber and still got flagged by cybersecurity checks. So obviously they don't think it's safe to use broadly.

    They already have a deal with OpenAI to build a government-focused one: https://openai.com/global-affairs/introducing-chatgpt-gov/

    • > So obviously they don't think it's safe to use broadly.

      More likely, everything gets added to the list, and since you can't just assume a flag is a false positive, it's worth investigating to make sure there isn't an adjacent gap in the security systems.

    • Somehow I think that the weak link in our government security is at the top - the President, his cabinet, and various heads of agencies. Because nobody questions what they're allowed to do, and so they're exempt from various common-sense security protocols. We already saw some pretty egregious security breaches from Pete Hegseth.

  • > No one needs to be competent at their job anymore

    That's actually the whole point. Placing incompetents in positions of authority means they know absolutely to whom they owe their loyalty. Because they know they would never have that job on merit. And since they don't really know how to do the job, they have no moral qualms about doing a poor job, or strong opinions on what they should be doing -- other than whatever mission their patron has given them. It's a tool used by weak leaders and it's unfortunately very effective.

  • It's all part of the plan.

    Make the government look so incompetent that it's a no-brainer to let a private company (headed by your friends and family, of course) do the important jobs and siphon resources much more effectively.

  • > I swear this government is headed by appointed nephews of appointed nephews.

    No joke, the previous head of the State Department task force tasked with fighting corruption and nepotism in international contracting was named Rich Nephew. (He's a very talented career civil servant and I mean no shade I just find that hilarious.)

  • Guess what this administration would love to do with nuclear facilities...

    Any time you have to include "competent" in the description of a job or a related technology, that's a clue that it needs the requisite oversight and a (possibly exponential) proportionate cost.

  • DEI in action (funny that the people who voted for this were apparently anti-DEI, and now they get 100% DEI)

    • Of course this comment is mostly ironic, but noting for the whole class: when MAGA talked about DEI they only ever meant ethnic and sexual minorities, competence be damned!

      That is of course the thing about ideologies like that: loyalty before all else.

  • Hey, working at a shoe factory is serious business. You have to be a real bootlicker to get ahead in a place like that.

    • And when you get to the top, you actually experience how the shoe is on the other foot. One should get out early rather than waiting for the other shoe to drop.

  • Isn't using Azure OpenAI enough? I read their docs, and they have self-hosted instances for corporate data compliance.

  • They say that most fascist governments fall apart because they actively despise competence, which it turns out you need if you are trying to run a country.

    • They say it, but they're wrong. Historically speaking, there have been basically only about two fascist governments, and they fell because they lost wars. And Germany, for one, did run things with high competence, to the extent that it took years for many countries to do anything about it.

      If we loosen "fascist" to just mean any authoritarian government, there are many that ran for a very long time.

    • That’s because eventually reality catches up to you.

      If the reality of a thing is in opposition to the regime’s wishes, you can’t just wish that away.

      However, the regime will favor those who say “yes” over those who accept reality.

    • Competence gives way to ideology.

      I once read an interesting book on the economy of Nazi Germany. There were a lot of smart CEOs and high ranking civil servants who perfectly predicted US industrial might.

  • > There have to be GovCloud only LLMs just for this case.

    I hear Los Alamos labs has an LLM that makes ChatGPT look like a toy. And then there's Sentinel, which may be the same thing; I'm not sure.

The Dept. of Homeland Security has had its own internal gen-AI chatbot since before Trump took office [0]. That this guy couldn't make do with that, and didn't think through the repercussions of uploading non-public documents to a public chatbot, doesn't bode well for his ability to manage CISA.

[0] https://www.dhs.gov/archive/news/2024/12/17/dhss-responsible...

I wonder how far removed the interim director of CISA is from any real-world security. I bet they have not seen or solved any real security problems and are merely an executive looking over cybersec. This is probably another example of why you need rank-and-file security peeps in security leadership roles rather than some random exec.

I would like to be able to say that it is uncommon, but based on what I am seeing in my neck of the woods, all sorts of what one would think is private information is being ingested by various online LLMs. I would have been less annoyed with it had those been local deployments, but, uhhh, to say that is not the first choice is being over-the-top charitable to current corporates. And it is not even a question of money! Some of those corps throw crazy money at it.

edit: Just in case: in the company I currently work at, compliance apparently signed off on this, with only a rather slim category of data verboten from upload.

I adore that this guy had a security clearance and I doubt I'd clear that bar. The last time I looked at the interview, there was a question:

> have you ever misused drugs?

and I doubt I'd be able to resist the response:

> of course not, I only use drugs properly.

Also, I wouldn't lie, because that would undermine the purpose. Still sad I can't apply for SC jobs, because I'm extremely patriotic and improving my nation is something that appeals to me.

  • FWIW I have held a security clearance during my career, and telling them I smoked weed was not a dealbreaker. What they are ultimately looking for is reasons why you could be coerced into divulging classified information. If you owe money due to drugs/gambling, etc, that's where it becomes a dealbreaker.

    • The general rule is not to lie to them, because they will interview all your friends and someone somewhere will rat you out. It's pointless to try to hide anything during these interviews, and if you do, that's a dealbreaker.

    • You can see an archived list of industrial security clearance decisions here [0] which is interesting, and occasionally entertaining, reading. "Drug involvement security concerns" usually involve either actively using drugs or, worse, lying to cover up drug use, both of which are viewed as security concerns and grounds for rejection.

      [0] https://web.archive.org/web/20170218040331/http://www.dod.mi...

  • You would not get a security clearance, and the admin would make a note on your IQ. The correct answer is simply

    > no

    and keep the rest of it in your head.

      How is it low IQ to be honest? People have to make decisions, and if the decision is "no", I can handle that. Empowering the person making the decision to the fullest extent is something I'd still be interested in, even if it is to my detriment. It's like when middle management asks me to lie or withhold information from the COO or CEO: it's just a no. If they're shit, then it's on the organisation to sort that out. Second-guessing everything leads to even worse dysfunction.

      We're not talking about sneaking into a concert or something low-stakes; the security of our nation is the foundation of our very civilization. I have dual citizenship with a nation that borders Russia and was once part of the USSR, so I appreciate the stakes of worst-case scenarios, because one of my nations was under that boot rather recently.

It's happening all across the corporate world too.

  • And in all manner of regulated industries. People simply cannot resist throwing anything and everything at the magic text machine. A company can control its IT assets, but if the content is displayable on a screen, rest assured users will just take photos and upload to their personal LLM accounts to get the generative answers they endlessly desire.

    • I'm actually shocked that security teams aren't up in arms over this exfiltration of company secrets. I know some companies that are running their own models and agents, but the vast majority are Copilot/Claude/Codex-ing away, sending all that sweet, sweet IP to third parties.

This administration's op-sec has been consistently at "Barney Fife" levels of incompetence.

  • Maduro and his bodyguards would slightly disagree.

    • Unfortunately for Maduro, that operation was run by military professionals rather than directly by Trump's lackeys. But give Hegseth enough time and he'll bring them around to the new standard.

  • When I saw the mention that it was in the context of a "contracting"-type set of info/documents, I actually chuckled - I spent a decade in procurement and sales for high-stakes contracts. An incompetent person has no idea how to manage a procurement and goes online. Basically, this is the 2026 version of an inept executive bashing "what is an RFP" into a search engine in 2007.

  • And when the CCP compromised the law-enforcement portal for every American ISP, stealing info on 80% of Americans, including both the Kamala and Trump campaigns, under the previous admin, it was rock-solid op-sec, presumably.

    Or when the previous admin leaked classified Iran attack plans from the Pentagon, so bad that they didn't even know whether they were hacked or not.

    You can at least pretend to make a technical argument over a political one.

    • You're the one making a political argument, by doing a whataboutism that attempts to negate the failings of this administration. And you're not even doing it correctly, because, judging by the qualifications of the people who filled its posts, the previous administration was by every measure drastically more competent.

  • It's been the same with every administration, unfortunately. It's just a side effect of such an unnecessarily big government.

    • Inviting a reporter from the Atlantic to your Signal chat where you coordinate military plans has nothing to do with the government being too big.

    • You have to actively maintain a state of ignorance to say this isn’t different. Go look at all of the public reporting starting in January about the way appointees in the Pentagon, DOGE, etc. blew through the normal policies and procedures controlling access, clearing people, or restricting sharing.

      For example, this wasn’t just “oops, I used the wrong number” but Hegseth getting a custom line run into a secure facility so he could use a personal computer of unknown provenance and security:

      https://www.nytimes.com/2025/04/24/us/politics/hegseth-signa...

      That’s one of the reasons why one of the first moves they made was to fire CISOs and the inspectors general who would normally be investigating serious policy violations.

      This isn’t “big government”, it’s the attitude that the law is a tool used to hurt their opponents and help themselves but never the reverse.

    • You really think that every other administration has had this level of incompetence? The current bumbling and corruption is absolutely unparalleled.

Sounds about on par with what I would expect, competence-wise.

  • Hand-picked by Noem, so yeah.

    https://en.wikipedia.org/wiki/Madhu_Gottumukkala

    > In April 2025, secretary of homeland security Kristi Noem named Gottumukkala as the deputy director of the Cybersecurity and Infrastructure Security Agency; he began serving in the position on May 16. That month, Gottumukkala told personnel at the agency that much of its leadership was resigning and that he would serve as its acting director beginning on May 30.

    • > Gottumukkala had requested to see access to a controlled access program—an act that would require taking a polygraph

      Is the US OK? It's 2026, not 1926.

    • This is what you get when you prize personal loyalty over competence.

      This issue is the one thing that gives me some hope that they can be ousted -- they are collectively too stupid and motivated only by their own self-interest to hold their power indefinitely.

If I did this with banal internal documentation at work, I would be written up and maybe fired for breaking known policy. This administration is so ridiculously incompetent, and the interim head of cybersecurity... leaks. The Onion wouldn't write this.

I’m a little surprised by the takes in the comments. Obviously, heads of departments or agencies, CEOs, or similar personnel are generally not in the same league as normal employees when it comes to compliance.

Productivity and efficiency are key for their work. I am sure there are lots of sysadmins here who have had to disable security controls for a manager, or had to configure something in a way that kept security controls from actually working. I have been in many situations where I have been asked by IT colleagues whether doing something like that was fine, because an executive had to read a PowerPoint file NOW.

  • Sysadmins are afforded special leniency because of their demonstrated competence. Their leeway is earned. In this case, the "cyber security chief" has no proven skill other than absolute loyalty to his boss, which justified his skipping the usual vetting procedure.

  • Obviously those kinds of stories are common, but you can’t seriously be suggesting that it is a good or acceptable thing?

    Execs are just as stupid as your average person and bypassing security controls for them puts an organization at an even greater risk due to the kinds of information they have access to. They just get away with it because they’re in charge.

  • It touched a nerve because no one in the Trump admin is qualified to do their job. There's a lot of corruption and a lot of people getting access to things they shouldn't, due to their relationships and loyalty, not merit. There's a big difference between a sysadmin having superuser access and some random politically connected hack abusing their privilege.

    DOGE/Musk, Noem, Kash, Hegseth, etc.

This is a "Cybersecurity chief" causing an intern-level IT incident.

In many industries, this would trigger a rapid company-level incident response and be an immediately fireable offense, and in some governments it would be a massive scandal complete with a press conference broadcast across the country.

  • I think he is the real deal. I mean, in reality he learned or knows very little about technical matters. No fraud needed.

BTW, what's the current status on LLMs and confidential documents? Which licenses from which suppliers are fine, and which aren't?

Where does this "cybersecurity monitoring" take place? On OpenAI's side? Or via some kind of monitoring tools on the devices themselves?

  • In any enterprise, the norm would be to have monitoring on all ingress and egress points of the network as well as on the devices themselves. You can't only have monitoring on managed devices, because someone might BYOD and plug in an unmanaged device, connect it to internal wifi, etc.

    You bring in vendors and they need guest wifi to give you a demo; you need to be able to give them something to connect to, but you don't want that pipe to be unmonitored.

    • What I'm really asking/wondering is how (and who, or which party) figured out that this was leaked, and secondly how that propagated to the public. I don't really expect to find the answer. But if I had to guess, OpenAI found out first, because employees there are more likely to leak the fact that the leak happened.

      But also, how was it caught in the first place? Was it automatically flagged because content scanners identified it as a concern, or was his account specially flagged for extra monitoring because of who he is?

Once again: if you or I did this, it's a federal crime and federal time.

But when the chief does it, it's an oopsie poopsie "special exemption".

> Cybersecurity monitoring systems then reportedly flagged the uploads in early August. That triggered a DHS-led damage assessment to determine whether the information had been exposed.

So that means a DLP solution, with browsers trusting its CA and the proxy silently handling the HTTP in clear text, right?
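
If so, a rough way to tell from a managed laptop is to look at who actually issued the certificate your machine sees for chatgpt.com. A quick sketch using Python's standard library (just an illustration; it assumes the proxy's CA is already in the system trust store):

    import socket
    import ssl

    # On an uninspected connection the issuer is a public CA; behind a
    # TLS-intercepting DLP proxy it will be the corporate/proxy CA instead.
    host = "chatgpt.com"
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()

    issuer = dict(pair[0] for pair in cert["issuer"])
    print("issuer:", issuer.get("organizationName"), "/", issuer.get("commonName"))

If the issuer comes back as the employer's internal CA rather than a public one, the TLS is being terminated and inspected in the middle before anything ever reaches OpenAI.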

From Wikipedia:

He graduated from Andhra University with a bachelor of engineering in electronics and communication engineering, the University of Texas at Arlington with a master's degree in computer science engineering, the University of Dallas with a Master of Business Administration in engineering and technology management, and Dakota State University with a doctorate in information systems.

And he still manages to make a rookie mistake. Time to investigate Mr. Gottumukkala's credentials. I wouldn't be surprised if he's a fraud.

https://en.wikipedia.org/wiki/Madhu_Gottumukkala

He was the 'CTO' of South Dakota and later the CIO/Commissioner of the South Dakota Bureau of Information and Telecommunications under Governor Kristi Noem.

Edit: (From a European perspective) it seems like the southern states really took over the US establishment. I hadn't really grasped the level of it, before.

  • > Edit: (From a European perspective) it seems like the southern states really took over the US establishment. I hadn't really grasped the level of it, before.

    It's good to know the Americans aren't the only ones who never look at maps outside their own country.

  • South Dakota has a population of fewer than 1 million people, and the complexity of the CTO job for a state like South Dakota would be quite low. It is < 0.3% of the US population and likely has de minimis benefit programs.

  • South Dakota is in the northern part of the country. But to your point: historically speaking, the southern states kept trucking along in terms of power and influence after the Civil War.

    • The Dakotas weren't really north/south in the Civil War context; only about 4k people lived there in 1860. It was largely empty land, and not a state until 1889.

  • That is one of the best comments I've seen on HN to date!

    It seriously got me laughing. Thanks.

    • I am so happy that my embarrassing lack of geographical knowledge of the US states' internal geographies amused you. A good laugh is great for your health, I've heard.

      At least I know where your country is located.

      Now, shall I quiz you on the geographical locations of the French regions? Or perhaps the Finnish regions, if that's something you work more closely with day-to-day?

      ;)

> None of the files Gottumukkala plugged into ChatGPT were classified, according to the four officials, each of whom was granted anonymity for fear of retribution. But the material included CISA contracting documents marked “for official use only,” a government designation for information that is considered sensitive and not for public release.

Guys... we're talking about FOUO. Not even low-level classified. This is a nothingburger. The toilet paper you wipe with is FOUO; there is essentially no document in the government that isn't at least FOUO.

"Leaked" is not the correct word here. Generally, as it's used, it implies some intent to disclose the information for one's own purposes. You would call a disclosure to the War Thunder forums a leak, because the intent was to use that information to win an argument. You wouldn't call leaving boxes of classified information in a warehouse where you'd normally read them a leak (at least not as a verb). Likewise, you wouldn't call it a leak if you mistakenly abandoned them in a park.

That said, IIRC For Official Use Only is the lowest level of control marking (note: not classified); it's not even NOFORN. It's even multiple levels below Sensitive But Unclassified.

So, who cares?

Much more significant is that he failed the SCI/full poly... that means you lied about something. Yes, I know polys don't work, but the point of the poly is to try to ensure you've disclosed everything that could be used against you, which ideally means no one could flip you or manipulate you. The functional part is to determine whether you have anxiety about things you might try to hide, because that fear can be used against you. No fear/anxiety, or nothing you're trying to hide, means you're harder to manipulate.

That feels bad even ignoring the whole hostile-spies kind of thing.