Comment by jader201

9 days ago

Yeah, this shouldn’t even be on HN, or Washington Post for that matter.

There are going to be countless people who think AI is using their voice. Humans share remarkably similar voices, but obviously you can’t copy that (impersonations aside).

Unless there is evidence that a company intentionally went after a specific human voice to train their AI, there’s no reason to report on these people claiming AI is using their voice.

Maybe if it’s someone with a very distinctive voice. But this guy, as the OP said, just has a “generic podcast guy” voice.

You absolutely can copy that: it’s called voice cloning, and it can be done from as little as a few seconds of audio. Once a voice is cloned, you can generate audio of it saying whatever you want.

  • To be clear, I mean someone can’t file a lawsuit against someone else for sounding like them.

    Of course you can have an AI target someone else’s voice. My point is that unless there is evidence it was intentional, it’s silly to claim that mere similarity to a human voice means it must’ve been deliberate.

    • I mean someone can’t file a lawsuit against someone else for sounding like them.

      But they did. It's literally what the article and this thread are about.