Comment by WarmWash

9 days ago

People offing themselves because their lover convinced them it's time is absolutely not worth the extra addiction potential. We even witnessed this happen with OAI.

It's a fast track to public disdain and heavy handed government regulation.

Regulation would be preferable for OpenAI to facing the tort lawyers. In general, the LLM companies should want regulation, because the alternative is tort law, product-liability claims, and contract law.

Without the protections that regulation could afford, there is no way to offer such wide-ranging uses of the product without also accepting significant liability. If the range of "foreseeable misuse" is very broad and deep, so is the potential liability. If your marketing says the bot is your lawyer, doctor, therapist, and spouse in one package, how can the company escape all the comprehensive duties that attach to those social roles? Courts will weigh the tiny, inconspicuous disclaimers against the very large and loud marketing claims.

The companies could protect themselves much as the banking industry does: by replacing generic duties with duties defined by statute and regulation. Unless that happens, lawyers will loot the shareholders.

  • It’s funny seeing you frame regulation as needed to protect trillion dollar monopolies from consumers and not the other way around.

    • Yes, because that is how regulation actually works and what its purpose is. In practice, all companies, tiny and massive alike, do everything they can to use the state to quash competition and to reduce the risks of litigation.

      Software in general has been subject to light-touch liability, in part because most of the damage software can cause is economic rather than personal injury. The lines blur when companies release products that cause mental injuries that courts interpret as physical injuries, or when the software foreseeably contributes to someone, e.g., going crazy and killing another person.

      No one would seriously think of holding Microsoft liable if a kidnapper uses Word to draft a ransom note. But if Copilot tells you to microwave a baby and you do it, many judges will want to take a close look at the operation of that software service, voluminous contract disclaimers notwithstanding. The only way the Microsofts of the world can escape that type of liability is through comprehensive regulation.

Or sama is just waiting to gate companions behind a premium subscription in some adult-content package, as he has hinted something along these lines may be forthcoming. Maybe tie it in with the hardware device Ive is working on. Some sort of hellscape Tamagotchi.

Recall: "As part of our 'treat adult users like adults' principle, we will allow even more, like erotica for verified adults," Altman wrote in October.

  • I'm struggling a bit to word this with social decorum, but how long do we reckon it takes until there are AI-powered adult toys? That's a market opportunity I do not want to see fulfilled, ever.

    • I'd assume they already exist.

      If there are porn-chat-controlled vibrators, nothing would stop somebody from hooking the same hardware up to an AI.

      Not sure what the real point would be, since it's not like it would have timing, but all the pieces already exist.