Comment by jacquesm

3 days ago

Anyone using non-self-hosted AI to process sensitive information should be let go. It's pretty much intentional disclosure at this point.

The worst local (Australian) example of that:

  Following a public statement by Hansford about his use of Microsoft's AI chatbot Copilot, Crikey obtained 50 documents containing his prompts...

  FOI logs reveal Australia's national security chief, Hamish Hansford, used the AI chatbot Copilot to write speeches and messages to his team. 

(subscription required for full text): https://www.crikey.com.au/2025/11/12/australia-national-secu...

It matters because he's the most senior Australian national security bureaucrat, with access to Five Eyes documents (AU / EU / US), and he has been doing things that make the actual cyber security talent's eyes bleed.

  • Holy crap, that is such a bad look. That guy should immediately step down, and if he doesn't, he should be let go.

    • That wasn’t my first thought. My first thought was: every senior executive everywhere is probably doing the same thing.

Years ago, people routinely uploaded all kinds of sensitive corporate and government docs to VirusTotal to scan for malware. Paying customers then got access to those files for research. The opportunities for insider trading were, and maybe still are, immense. Data held by AI companies won't be as easy to get at, but I'm sure it's comparable in substance.