Comment by walterbell
19 days ago
Attackers don't have a monopoly on LLM expertise, defenders can also use LLMs for obfuscation.
Technology arms races are well understood.
I hate LLM companies, so I guess I'm going to use the OpenAI API to "obfuscate" the content, or maybe I'll buy an NVIDIA GPU to run a Llama model, hmm, maybe on a GPU cloud.
With the tiny amounts of text involved in a forum post, obfuscation can be done locally with open models and local inference hardware (e.g. the NPU on an Arm SoC). Zero dollars sent to OpenAI, NVIDIA, AMD or GPU clouds.
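To make the claim concrete, here is a minimal sketch of what "local obfuscation" could look like: paraphrasing forum text through a locally hosted open model exposed via an OpenAI-compatible endpoint (llama.cpp's `llama-server` offers one). The endpoint URL, port, and model name below are assumptions for illustration, not a specific existing project.

```python
# Hedged sketch: paraphrase-based obfuscation of forum text via a LOCAL
# OpenAI-compatible endpoint (assumed: llama.cpp's llama-server on
# localhost:8080). No external API is contacted; model name is illustrative.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:8080/v1/chat/completions"  # assumed local server


def build_obfuscation_request(text: str,
                              model: str = "llama-3.2-3b-instruct") -> dict:
    """Build a chat-completion payload asking the local model to paraphrase."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Rewrite the user's text so it keeps the meaning "
                        "but differs in wording and sentence structure."},
            {"role": "user", "content": text},
        ],
        "temperature": 0.9,  # higher temperature -> more varied rewrites
    }


def obfuscate(text: str) -> str:
    """POST the paraphrase request to the local server and return the rewrite."""
    req = urllib.request.Request(
        LOCAL_ENDPOINT,
        data=json.dumps(build_obfuscation_request(text)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Whether an NPU-only setup is fast enough in practice depends on the model size and the SoC's runtime support, which the comment does not specify.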
What specifically are you suggesting? Is this a project that already exists or a theory of yours?
>local inference hardware (NPU on Arm SoC).
Okay, the battle is already lost from the beginning.