
Comment by hsbauauvhabzb

1 year ago

This seems like an odd knee-jerk reaction to a valid problem, one that may prevent legitimate* uses of the tech - my doctor's automated 'press 1 to speak to a doctor' line could be improved by an AI voice like Siri.

The problem is the misuse of AI to impersonate a real person, and the failure to disclose that the content you are about to see/hear/read has been auto-generated.

The mechanism used might solve one issue, but has turned the entire thing into a game of whack-a-mole.

*I use the term 'legitimate', but note that I absolutely despise online chatbots and imagine I'll hate voice ones as much, if not more.

I think 'robocall' means an unsolicited call placed to you. Answering services aren't affected here.

  • Either way, I can see some valid use cases. I don't like them, but I don't see how they're any different from a human reading a script or a recorded message. Bad actors won't be stopped by this law, so it seems like pissing in the wind.