Comment by chenzhekl

13 hours ago

It feels like the current trend is a bit scary: the more AI advances, the more people with money and resources will gain disproportionately greater advantages. For example, they can make their own software more secure, while also finding it easier to discover ways to attack other software.

You can already do that today by hiring a security researcher. I can guarantee you that Apple has access to people of a higher caliber than my startup.

I could see a world where, 1 year from now, I can have glassing do a full sweep of my codebase for a given price (say $10k). Running that once a year is within my means and would make my software much more secure than it is today.

  • I spend well over that of my employer’s money on pentesting every year. I’m absolutely certain Claude could do as good a job or better using what’s available today.

    It had crossed my mind that an AI agent pentester would be an interesting product to build. Once again, though, the labs are just going to build it themselves because it’s such a thin wrapper.

    Beyond existing software with vulnerabilities, the really important aspect of this for Anthropic et al. is that the gigatons of code being generated every day need to be secured.

    • There are quite a few such startups out there already. Results are mixed so far, though I believe they’ll get much better over the coming months and years.

  • Yeah, but even Carlini, who is a good security researcher, said he has found more valid vulnerabilities in the last week than in his entire career before this. That sounds clearly better/faster/cheaper than a human security researcher who would cost $300,000 a year.

Sounds normal to me!

I.e., it may be a step change, and that could very well have distinct and noticeable real-world effects, as other technologies have in the past, but it’s nothing fundamentally new.

This has increasingly been my take. If we accept that AI is an amplifier of impact, then it follows that it will amplify disparities.