Comment by djoldman
7 days ago
A great example that underscores the ordinariness of AI. It's a tool, and tools can be used for good/bad/neither and in between.
Fake pics have existed since pics existed pretty much.
Kids have been looking for ways to cheat on tests since tests began. If you're a teacher, you're gonna have to test in person.
Fake phone calls, fake other things... yeah, they're of a different/better quality as the technology has gotten better. Is it so fundamental a shift that nothing can be done? I'm not convinced.
The ease of cheating/creating fakes surely influences how much cheating/fakes are in circulation, and while we can tolerate a little, excessive amounts will be disruptive. So many technologies moved from obscure curiosities to mass adoption just because somebody made them easier/cheaper to use.
If at some point cheats/fakes become cheaper and easier than the real thing, you can bet that will be a fundamental shift in how we approach the world.
> If at some point cheats/fakes become cheaper and easier than the real thing
Is there any evidence this is ever going to happen? The evidence I see points in the opposite direction - everyone has so many sensors and so much data being recorded about the real world that it is actually harder to fake things.
For example, there used to be a widespread belief in aliens and animal cryptids in the 70s. Today, less so, because people capture everyday reality on a much bigger scale than they used to.
>there used to be a widespread belief in aliens and animal cryptids in the 70s. Today, less so
I looked it up, and belief in aliens has actually been slowly creeping up since the 70s, when it hovered around 29%. The belief spiked in the late 90s and is now around 40%. My guess is the internet played a role.
It's not only the excess, it's the ease of access. Kids can produce lewd pics of classmates and make their lives hell. This technology is fundamentally evil.
I know it's not the same - but I remember the "bubbling" phase a few years back. It was a bit messy and fortunately faded away pretty quickly.
This argument is repeated relatively often but I can't take it seriously. Just how???
With how easy and widespread it is, I wonder how long before the general assumption is that nude pics and videos are fakes, and they lose their power. It will be just another AI porn on the mountain of other shitty AI porn.
What good can it be used for? Because I haven't seen anything that makes faking pics with AI so good we can ignore the negatives.
The article also seems to take the relativist stance: nothing new to see here, move along now. Why? For the clicks? Just being contrarian?
Many manifestations of generative AI allow people to put concepts onto screens faster. It generally serves as a more efficient translator of "I want a contract like this one but more tailored to [new client]" or "I want to make a strategy for my [new business]."
In information economy jobs, translating thoughts and ideas into better formal communications more efficiently is valuable. Be it pictures or text.
A cynic is a man who knows the price of everything, and the value of nothing.
The same generation process is also used for... well... generating anything. These are compression functions: you're learning an intractable data distribution (you can't write down the equation) and then turning it into something you have a bit more control over. Images were/are a great test platform for this, since we humans can visually inspect the outputs and verify that we've accurately learned a good generating function. But this process can be applied to any data, and truthfully, variants of it have been used throughout science for decades (arguably at least a century, but statistics really benefited from computers).
For just the domain of image generation there are still a lot of useful things. Want to do any upscaling? These processes can help there, since you're learning a more complex transform than something like a bicubic interpolation (yeah, there are more advanced algorithms, this is just an example). The same is actually true for downsampling. We can even talk about rotating images, which is a classic problem in old videogames. There's also typical photo editing, which is done widely, most notably by Hollywood. Even if your AI only gets you 70% of the way there it can still be helpful (if the first 70% isn't trivial). It is also used directly in compression algorithms: it is much cheaper to share an encoder and decoder structure, which can be computed locally, and then transmit a smaller signal. The transmission is not only typically the more expensive part but usually also the bottleneck, and it has the largest chance of data corruption.
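To make the compression point a bit more concrete, here's a minimal toy sketch (PyTorch; the TinyAutoencoder name, the layer sizes, and the random input are all illustrative assumptions, not anything from the article or thread): once both sides already share the encoder/decoder weights, only the small latent vector has to cross the wire.

```python
# Toy sketch of the "share encoder/decoder once, transmit only a small latent" idea.
# Everything here (sizes, names, random data) is illustrative, and the model is
# untrained, so the reconstruction will be poor until it's fit to real images.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyAutoencoder(nn.Module):
    def __init__(self, dim=784, latent=32):
        super().__init__()
        # Encoder compresses the input down to a small latent vector...
        self.encoder = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, latent))
        # ...and the decoder, which the receiver already has, reconstructs it locally.
        self.decoder = nn.Sequential(nn.Linear(latent, 128), nn.ReLU(), nn.Linear(128, dim))

    def forward(self, x):
        z = self.encoder(x)            # z is all you'd actually transmit
        return self.decoder(z), z

model = TinyAutoencoder()
x = torch.rand(1, 784)                 # stand-in for a flattened 28x28 image
recon, z = model(x)
print(z.shape)                         # 32 numbers sent instead of 784
print(F.mse_loss(recon, x).item())     # reconstruction error (high until trained)
```

A real learned codec would train this on a large image corpus and quantize/entropy-code the latent, but the shape of the idea is the same.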
Yeah, I agree, most people are using the tech in weird ways, and there's a lot of weird hype around malformed images that are obviously malformed if you look at them with more than a passing glance (or not through rose-colored glasses). But there are a lot of useful applications to this stuff, ones that could far more benefit the world, and personally I'm left wondering "why isn't even a small fraction of the investment that's going into status quo image generators and LLMs going into these other domains?" I'm guessing because image generators and LLMs are easier to understand? But it is a shame.
So tired of this lazy argument. Projectile murder with bows existed before guns. Guns changed the world. A severe force multiplier for something bad can't automatically be handwaved away.
Guns have little use beyond injuring or killing or threatening the same. On the good side: one could argue it's sometimes good to kill for hunting. On the bad side... well there is a lot of suicide, murder, and potential for the same.
I'm not sure we understand yet how much positive and how much negative potential there is in AI.
On the good side you can stop other people with guns.
Variations of guns (high pressure tubes with plugs) also shoot nails, pilings, and can quickly split hard surfaces like rocks or pavement. They are also natural parents of internal combustion engines.
Not arguing, just saying.
> Is it so fundamental a shift that nothing can be done? I'm not convinced.
A fundamental shift away from our complete trust in technology is good. That trust encourages ignorance and obedience, and alienates people from each other.
And the fact that AI can be used to fake pictures of your neighbors having sex is nothing but good. No one will be able to say whether any picture is real, so the public won't be able to destroy another young girl's life over it. I also think that arguing about the distribution of pretend movies of your neighbors having sex will have to lead to clear legislation regarding the distribution and sale of personal data.
I wish I could be as optimistic as you with regards to human nature. While we may come out the other side with a world that solves the real problems AI will create, I fear millions of people will have their lives destroyed along the way. Half of America thinks “criminals” don't deserve due process. The guilt stems directly from the accusation. In short, people suck.
It also means less accountability, which is not good. Suddenly anyone can also claim that real videos of them are faked. Tesla tried to argue just that to skirt liability.
https://www.theguardian.com/technology/2023/apr/27/elon-musk...
You have to factor in the overall lower barrier to entry (little to no technical skills required, cheap tools easily accessible, etc.) paired with distribution capacity on a massive scale at little cost (like you don't need to be featured in a local newspaper to try to get picked up by national networks and go "viral").
You can literally produce fake information at an industrial scale, distribute it in real time, and see what sticks at virtually no cost.
How do you think we got to the point of breaking the world?
People have been killing each other for as long as people have existed, yet an M30A1 rocket filled with 180k tungsten beads exploding above your city is much more effective than a dude with a flint knife. Should we give people military-grade weapons? They're going to kill each other anyway, right? Would you argue they're just the same and not fundamentally different?
> Kids have been looking for ways to cheat on tests since tests began. If you're a teacher, you're gonna have to test in person.
Access is important. Yes, you could hire a scholar to write for you, but that's far more expensive, and far more detectable by your parents, than asking ChatGPT. Now every student has access to some of the best cheating software on the planet.