
Comment by jgerrish

3 years ago

> Nowadays this metadata should be extended with description of AI postprocessing operations.

Of course. But to ensure that's valid for multiple purposes we need a secure boot chain, and the infrastructure for it.

To get there we need an AI arms race. People trying to detect AI art with machine learning vs. increasing AI sophistication. Companies trying to discourage AI leaks of company secrets and reduce liability (and reduce the tragic cost of mistakes of course) vs. employees being human.

Or we could have built a responsible and reasonable government that can debate and implement that.

Maybe I'm naive. I'll take responsibility for that.

In the meantime, it's playtime for the AIs. Bring your fucking poo bags, they're shitting everywhere (1), pack it in, pack it out.

(1) what the world didn't know was that this was beautiful too.

> Of course. But to ensure that's valid for multiple purposes we need a secure boot chain, and the infrastructure for it.
>
> To get there we need an AI arms race. People trying to detect AI art with machine learning vs. increasing AI sophistication.

Or we can just recognize the lunacy of it and opt out of caring. You can't stop the flood, so you just learn to live with it. With the right view, the flood becomes unimportant.

Secure boot in practice always becomes slave boot. The user loses even the ability to control the operating system running on their own device. It is the final nail in the coffin for the already dying concept of general-purpose computing.

What measures can the government implement to combat this? AI image modification is realistically possible even on consumer hardware running locally. There is no going back.