Comment by TheRoque
6 days ago
One of the biggest anti-LLM arguments for me at the moment is security. In case you don't know, if you open a file containing secrets with Copilot or Cursor active, it might be sent to a server and thus get leaked. The companies say that if that file is listed in a .cursorignore file it won't be indexed, but it's still a critical security issue IMO. We all know what happened with the "smart home assistants" like Alexa.
Sure, there might be a way to change your workflow and never ever open a secret file with those editors, but my point is that software that sends your data without your consent, and without giving you the tools to audit it, is a no-go for many companies, including mine.
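For context, the ignore file mentioned above uses gitignore-style patterns. A minimal sketch (the patterns here are just examples, not an official template):

```
# .cursorignore — gitignore-style patterns for files the editor should not index
.env
.env.*
secrets/
*.pem
```

Even with this in place, the complaint stands: you can't independently verify what actually leaves your machine.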
That's why I use Aider: it only operates on the files you explicitly give it. It works great with OpenAI, but if you're really worried, it interfaces perfectly with Ollama for local LLMs. A 12B model on my Mac does well enough at coding to be serviceable for me.
Which 12b model are you running?
Gemma 12b quantized (gemma3:12b-it-qat in ollama)
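In case it helps anyone, here's roughly what that local setup looks like. This is a sketch assuming a recent Aider release and a locally running Ollama daemon; `myfile.py` is a placeholder:

```shell
# Pull the quantized Gemma model locally (nothing leaves the machine)
ollama pull gemma3:12b-it-qat

# Point Aider at the local Ollama server (default port shown)
export OLLAMA_API_BASE=http://127.0.0.1:11434

# Aider only sends the files you explicitly add to the chat
aider --model ollama/gemma3:12b-it-qat myfile.py
```

The key property for the security argument above: the file set is opt-in, so secrets never get swept up by background indexing.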
At my day job, while someone was trying out Windsurf, it simply picked up an environment variable that contained sensitive data and used it in code. This is wild.
Sometimes I wonder if all the hype about being left behind, and needing to try these things just to see how great they can be, is deliberately put out there to increase their training data.
Too many vibe coders contribute trash code, if any at all. They need more code from so-called experts that isn't open source yet.
It's pretty unlikely someone at Cursor cares about accessing your Spring Boot project on GitHub through your personal access token – because they already have all your code.
I don't think that's the threat model here. The concern is regarding potentially sensitive information being sent to a third-party system without being able to audit which information is actually sent or what is done with it.
So, for example, if your local `.env` is inadvertently sent to Cursor and it's persisted on their end (which you can't verify one way or the other), an attacker targeting Cursor's infrastructure could potentially compromise it.
You write your secrets to disk?
Having a gitignored .env file is a pretty common pattern.
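For anyone unfamiliar with the pattern, here's a throwaway demo (temp paths are just an example):

```shell
# Set up a scratch repo to demonstrate the gitignored .env pattern
mkdir -p /tmp/envdemo && cd /tmp/envdemo && git init -q

# Secrets live in .env, which is listed in .gitignore
printf 'API_KEY=changeme\n' > .env
printf '.env\n' > .gitignore

# git confirms the file is ignored (prints ".env")
git check-ignore .env
```

The file never enters version control, but it still sits on disk in plain text, which is exactly why an editor plugin reading it is a concern.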
If you say so. I’d never do that.