
Comment by jmull

2 years ago

Well, here are some things that aren't really being disputed:

* OpenAI wanted an AI voice that sounds like SJ

* SJ declined

* OpenAI got an AI voice that sounds like SJ anyway

I guess they want us to believe this happened without shenanigans, but it's a bit hard to.

The headline of the article is a little funny, because records can't really show they weren't looking for an SJ sound-alike. They can only show that those records didn't mention it. The key decision-makers could simply have agreed to keep that fact close to the vest -- they may well have understood that knocking off a high-profile actress was legally perilous.

Also, I think we can readily assume OpenAI understood that one of their potential voices sounded a lot like SJ. Since they were pursuing her they must have had a pretty good idea of what they were going after, especially considering the likely price tag. So even if an SJ voice wasn't the original goal, it clearly became an important goal to them. They surely listened to demos for many voice actors, auditioned a number of them, and may even have recorded many of them, but somehow they selected one for release who seemed to sound a lot like SJ.

Clearly an SJ voice was the goal, given that Altman asked her to do it, asked her a second time just two days before the ChatGPT-4o release, and then tweeted "her" on the release day. The next day Karpathy, recently ex-OpenAI, then tweets "The killer app of LLMs is Scarlett Johansson".

Altman appears to be a habitual liar. Note his recent claim not to have been aware of the non-disparagement and claw-back terms he had departing employees agree to. Are we supposed to believe that the company lawyer or head of HR did this without consulting (or, more likely, being instructed by) the co-founder and CEO?!

  • They hired the actor that did the voice months before they contacted SJ. The reaction on this site to the news that this story was false is kind of mindbending.

    • My guess: Sam wanted to imitate the voice from Her, became aware of cases like Midler v. Ford, and so reached out to SJ. He probably didn't expect her to decline. Anyway, that prior case establishes that you cannot mimic another's voice without their permission, and the overall timeline indicates OpenAI's "intention" to imitate. It does not matter whether they used SJ's voice in the training set or not. Their intention matters.


    • Sure, no-one is disputing that, and despite this Altman then contacts SJ again two days before release asking her to reconsider, then tweets "her" to remind the public what he was shooting for. The goal could have just been ChatGPT with a voice interface, but instead Altman himself is saying the goal was specifically to copy "her".


    • Even if the voice actor was sourced before they originally contacted SJ, the intent was clearly to sound like her. There are so many other distinctive voices they could have chosen, but instead they decided to get as close to "her" as they could. Many people thought it was SJ until she stated it wasn't. I appreciate that the voice actor may sound like that naturally, but it's hardly coincidental that the voice that sounds most like the voice from "her" was the one chosen for their promotion. It is clearly an attempt to pass off.


    • It was not claimed that they cloned ScarJo's voice. They hired a soundalike when they couldn't get the person they wanted. Use or lack of use of AI is irrelevant. As I said before, both Bette Midler and Tom Waits won similar cases.

      Since they withdrew the voice this will end, but if OpenAI hadn't backed off and ScarJo sued, there would be discovery, and we'd find out what her instructions were. If those instructions were "try to sound like the AI in the film Her", that would be enough for ScarJo to win.

      I know that the Post article claims otherwise. I'm skeptical.


    • Your immediate acceptance that a timeline representing the best spin of a deep-pocketed company in full crisis PR mode proves the story "false", full stop, no caveats, is... I wouldn't say mind-bending, but quite credulous at a minimum. The timeline they present could be accurate and the full picture could still be quite damning. As Casey Newton wrote today [1]:

      > Of course, this explanation only goes so far. We don’t know whether anyone involved in choosing Sky’s voice noted the similarity to Johansson’s, for example. And given how close the two voices sound to most ears, it might have seemed strange for the company to offer both the Sky voice and the Johansson voice, should the latter actor have chosen to participate in the project. [...] And I still don’t understand why Altman reportedly reached out to Johansson just two days before the demonstration to ask her to reconsider.

      They absolutely have not earned the benefit of the doubt. Just look at their reaction to the NDA / equity clawback fiasco [2], and their focus on lifelong non-disparagement clauses. There's a lot of smoke there...

      [1] https://www.platformer.news/openai-scarlett-johansson-chatgp...

      [2] https://www.vox.com/future-perfect/351132/openai-vested-equi...


    • >They hired the actor that did the voice months before they contacted SJ. The reaction on this site to the news that this story was false is kind of mindbending.

      People lose their rational minds when it comes to people they hate (or love, I suppose). I don't care about Sam Altman or OpenAI one way or the other, so it was quite amusing to watch the absolute outrage the story generated, with people so certain of their views.

    • I don't understand the point you are trying to make. The essential question is whether they were trying to imitate (using a voice actor or otherwise) Scarlett Johansson's voice without her permission. Nothing in the article refutes that they were; whether they sought the permission before or after they started doing the imitation is irrelevant. Others have pointed to previous case law that shows that this form of imitation is illegal.

      Moreover I can't see any reasonable person concluding that they were not trying to imitate her voice given that:

      1. It sounds similar to her (it's unbelievable that anyone would argue that they aren't similar, more so given #2).

      2. Her voice is famous precisely for the context in which the synthetic voice is used

      3. They contacted her at some point to get her permission to use her voice

      4. The CEO referenced the movie for which Johansson's voice is famous (and which, again, depicts the same context in which the synthetic voice is being used) shortly before they released it.

    • Except the story isn't false? They wanted her voice, they got her voice*, they did marketing around her voice, but it's not her voice, she didn't want to give them her voice.

      Notice how the only asterisk there is "it's technically not her voice, it's just someone who they picked because she sounded just like her"

    • >> They hired the actor that did the voice months before they contacted SJ.

      Are you saying that story is false?

    • Yeah, but then again, I totally expected this when opening the comment threads. The same happened with the RMS debacle, the same with similar events earlier, the same with many a Musk story. It seems that a neat narrative with a clear person/object to hate, once established, is extremely resilient to facts that disprove it.


    • It’s human nature: people see others achieve what they cannot, and try to pull them down. You see this wrt Musk on this site a lot, too.


    • Tbf, Altman really screwed this up with that tweet and the very sudden second contact. There probably wouldn't be much of a case otherwise.

      If I had to guess the best-faith order of events (more than OpenAI deserves):

      - someone liked Her (clearly)

      - they got a voice that sounded like Her, subconsciously (this is fine)

      - someone high up hears it and thinks "wow this sounds like SJ!" (again, fine)

      - they think "hey, we have money. Why not get THE SJ?!"

      - they contact SJ, she refuses, and they realize money isn't enough (still fine, but there's definitely some schadenfreude here)

      - marketing starts semi-independently, and they make references to Her, because famous AI voice (here's where the cracks start to form; sadly, the marketer may not have even realized what talks went on).

      - someone at OpenAI makes one last Hail Mary before the release and contacts SJ again (this is where the trouble starts. MAYBE they didn't know about SJ refusing, but someone in the pipeline should have)

      - Altman, who definitely should have been aware of these contacts, makes that tweet. Maybe they forgot, maybe they didn't realize the implications. But the lawyers' room is now on fire

      So yeah, Hanlon's razor. This could be a good-faith mistake, but OpenAI had done a good job of ruining their goodwill even before this PR disaster. Again, sweet schadenfreude, even if we assume none of this was intentional.


    • The population of this site reacts to all stories like this. It’s only Gell-Mann Amnesia that causes your mind to bend.

    • Legally, the issue isn't what they were thinking when they hired the actor; it's what the intent and effect were when they went to market. (Even if there were documentary evidence that they actively sought out an actor for her resemblance to SJ's voice from day one, the only reason that would be relevant is that it would also support that this was their intent with the product when it was actually released, not because it is independently relevant on its own.)

      Whether or not they had any interest in SJ’s voice when they hired the other actor, they clearly developed such an interest before they went to market, and there is at least an evidence-based argument that could be made in court that they did, in fact, commercially leverage similarity.


    • It is a curious reaction, but it starts to make sense if some of these posters are running ops for intelligence agencies. Balaji Srinivasan noted that as the US started pulling out of foreign wars, the intelligence apparatus would be turned inward domestically.

      Some of it can also be attributed to ideological reasons, the d/acc crowd for example. Please note I am not attacking any individual poster, but speculating on the reasons why someone might refuse to acknowledge the truth, even when presented evidence to the contrary.

  • > Are we supposed to believe that the company lawyer or head of HR did this without consulting (or more likely being instructed by) the co-founder and CEO?!

    Yes this is pretty typical. The CEO doesn’t make all decisions. They hire people to make decisions. A company’s head of legal could definitely make decisions about what standard language to use in documents on their own.

  • It could have simply been the other way around: they auditioned some unknown voice actors, then someone noted that one of them sounded like Scarlett Johansson. They optimistically contacted SJ, assuming she would agree, but then had to back off.

Sky does not really sound like SJ, though, if you listen side by side. According to OAI's timeline, they intended to have Sky in addition to SJ. OAI's voice models, including Sky, predate the GPT-4o voice assistant. Also:

"In a statement from the Sky actress provided by her agent, she wrote that at times the backlash “feels personal being that it’s just my natural voice and I’ve never been compared to her by the people who do know me closely.”"

It did not seem like an issue before, and the Sky voice was public many months before GPT-4o. I don't believe SJ can claim to own all young, attractive women's voices, whether they are used for a voice assistant or not. It seems like the issue is being blown out of proportion. It does make a good story, though. The public perception of AI right now is generally negative, and people are looking for reasons to disparage AI companies. Maybe there are sometimes good reasons, but this one is not it.

  • > It seems like the issue is being blown out of proportion.

    It kind of feels like it's on purpose. Someone in a previous thread mentioned that this might have been a cynical marketing ploy, and I'm warming up to the theory. After they recorded the Sky VA, they figured out a whole marketing campaign with SJ to promote the voice feature. After she turned them down (twice), they released with just enough crumbs referencing the movie to goad SJ into committing a first-degree Streisand.

    With the slow roll out, everyone would have forgotten about the feature the day after the announcement but now it's been in the news for a week, constantly reminding everyone of what's coming up.

    • Calling it a Streisand on SJ's part is wrong, though. She wanted it to be a topic of discussion and succeeded.

    • In addition, they have prepared for a full court case beforehand, with all their ducks in a row, in theory. I am not a lawyer, so I am not sure the law works this way, but this might help them defend their case and set a precedent.

    • Are we really at a point where tech companies will bait lawsuits just to get more PR? Clearly they need to be struck down, to be reminded of why old-school companies go out of their way to avoid even the possibility of such lawsuits.

  • I'm also curious, legally speaking, is it an issue even if Sky's actress does sound like Scarlett? What if OpenAI admits they intentionally chose someone who sounded like Scarlett? Does it matter whether she was using her natural speaking voice or intentionally mimicking Scarlett's voice and mannerisms?

    This seems similar to the latest season of Rick and Morty. Whether justified or not in that particular case, it rubs me the wrong way a bit in principle to think that a production can fire someone only to hire someone else to do a near-perfect copy of their likeness. If (as in the OpenAI case) they'd gone further and trained an AI on the impressions of Justin's voice, would that have been considered an AI impersonation of Justin with extra steps?

    All of which is to say, this seems like a pretty interesting legal question to me, and potentially broader than just AI.

    • The fired actor would have already signed away any claim to the character's likeness. The likeness the company cares about is that of the character, not of the actor portraying the character. The actor never owned the character, so the actor shouldn't be miffed that someone else gets the part for future performances.


    • See Midler vs Ford, Glover vs Universal or Stefani vs Activision for prior cases in this area. Courts usually side with the person being imitated.

  • "SJ can't own all female AI voices" is attacking a straw man version of the complaint, which is much narrower. The question is whether OpenAI deliberately fostered the impression of an association between their product and her performance, which she had so far refused.

    To your point, there have been many female assistant voices on the market, including Sky -- but what might have tripped the line into impersonation was the context in which this particular one was presented and marketed. I don't know where exactly that line should be, but you can certainly reject this kind of marketing without stifling anybody's legitimate career.

Regardless of the moral implications, "sounds almost exactly the same" is not copyright infringement. Perhaps it could be trademark infringement if she had trademarked her voice like Harley-Davidson attempted (and failed) to trademark the sound of their motorcycles, but "sounds alike" is a pretty hard case to prove, and it's completely blown away if they can demonstrate that another human sounds indisputably similar.

People do celebrity impressions all the time, and that's not infringement either, because it's not actually copying that person's voice.

I'm sympathetic to SJ in this matter, especially after the Disney Black Widow debacle, but it sounds like she had the opportunity to write herself a nice check, and she turned it down.

On the basis of this article, it sounds like she doesn't have the cause of action that she had believed she had; I imagine that her legal team are now advising a fast settlement, but OpenAI's legal team might prefer to milk the free publicity for as long as they can, especially if they are fairly certain they would prevail at trial.

  • It isn't about copyright, it's about passing-off; what that means is described in detail elsewhere in these threads. It's about intention and what the customer believes. If customers might believe it's SJ -- due to sama's tweets, the general likeness in voice, the context (a voice assistant), and the public info about them trying to get SJ to do this -- that's passing-off, even if it wasn't trained on her voice per se. There are numerous law cases about this.

  • > it sounds like she had the opportunity to write herself a nice check, and she turned it down.

    If I were SJ, I'd turn it down too. She's in no need of money, and selling her voice to OpenAI would make most creators and every single voice actor hate her (not to mention the Twitter mob).

    In the majority of creative circles, the current social norm is to hate AI, so touching AI in any way is too risky for one's reputation.

It is probably worth paying attention to the water WaPo is carrying for OpenAI here, alongside their publisher's announcement about prioritizing the use of AI in their newsrooms.

It doesn't seem like you'd need "shenanigans" for this. Lots of voice actors are capable of doing voices that sound like other people, and some even have a natural voice that happens to sound very similar to a particular more noteworthy celebrity. AFAIU, the rights to your likeness only apply to your likeness, not to the likeness of someone else who happens to look or sound a lot like you.

For a case that doesn't involve AI at all, consider situations where a voice actor in a cartoon is replaced (sometimes while still alive) by someone who can perform a voice that sounds the same. Decisively not illegal. Most people don't even find it immoral, as long as the reason for getting rid of the original voice actor wasn't wrong on its own (e.g. Roiland).

  • > For a case that doesn't involve AI at all, consider situations where a voice actor in a cartoon is replaced (sometimes while still alive) by someone who can perform a voice that sounds the same. Decisively not illegal.

    Because there are contractual clauses. Do you think Dan Castellaneta owns the voice of 'Homer Simpson'? Or does Fox own it? It would be crazy to develop a show and then be held hostage by your voice actors for all future shows - what if one of them gets hit by a car?

  • The attempts to sound like Mel Blanc after his death just don't sound right to me. Or maybe it's just the bad scripts.

The article clearly disputes this. They hired and worked with the voice actor for Sky months before SJ was first contacted, and the movie Her and SJ's name were never once mentioned to the Sky voice actor.

  • The movie Her predates all of this by years, and Sam Altman even tweeted "her"! The OpenAI team were clearly well aware of Scarlett's voice (it's inconceivable that the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry). Of course they knew.

    When auditioning actors "months before", they could still have been looking for an actor who, guess what, sounds like SJ, even "before the first time SJ was contacted".

    As the actor, I'd likely also be looking to emulate SJ in Her -- it's clearly what the client was looking for.

    • > it's inconceivable that the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry

      Let's not exaggerate. It was a somewhat popular movie, yes, but not really a defining one, and far from the first example of a conversational AI speaking in a woman's voice. There are plenty of examples in movies and TV shows.

      If anything, the seminal work in this space is Star Trek casting Majel Barrett-Roddenberry as the voice of computer systems with conversational interfaces, as early as 1987 (or 1966, if you count her role in the Original Series; I don't remember those episodes too well), all the way to ~2008 (or to 2023, if you count post-mortem use of her voice). That is one distinctive voice I'd expect people at OpenAI to be familiar with :).

      Also, I can't imagine most people knowing, or caring, who voiced the computer in Her. It's not something that most people care about, especially when they're more interested in the plot itself.


    • Sure, it could have happened, but it seems we don’t have evidence either way.

      Tweeting “her” months later doesn’t prove anything. That Tweet might superficially look like evidence of intent, but if you think about it, it’s not.


  • Right. And that's extremely hard to believe. A discovery search of the internal emails should give us a definitive answer.

    • To find this "extremely hard to believe", you have to argue that this story, which has multiple sources unaffiliated with OpenAI, contemporaneous documentary evidence, and is written by a reporter with every incentive in the world to roast OpenAI, is directly wrong about facts plainly reported in the story.

      I think you have to want this story to be wrong to think it's wrong. It's a tough beat! Everyone was super sure OpenAI did this bad thing just a couple days ago, and now they're feeling sheepish about it.


  • That doesn't mean anything. They could have been, and likely were, developing the process and technology with Johansson in mind the whole time.

    > never had the movie Her or SJ's name mentioned to her a single time

    How do you know that?

    • The article says:

      >The agent, who spoke on the condition of anonymity to assure the safety of her client, said the actress confirmed that neither Johansson nor the movie “Her” were ever mentioned by OpenAI.


I would say that OpenAI wanted something that sounded like Her, which in turn sounded like Scarlett Johansson.

I also think the "sounded like" is less clear than you think. Is it similar? Yes. But how similar? I'm not sure where the line is, but I for sure didn't think it was Scarlett Johansson. By saying it is Scarlett Johansson and relating it to her, our brains will make the association, though. That is marketing.

Since they asked two days before it was launched back in September, my guess is that the voice had already been created by then.

But there's nothing wrong with this!

Let's say I'm making a movie. I have an old wizard character similar to Gandalf in The Lord of the Rings, so I contact the guy who played Gandalf in The Lord of the Rings. He says no, so I hire a different actor who also fits the "old wise wizard" archetype.

Is any of that illegal?

> I guess they want us to believe this happened without shenanigans, but it's bit hard to.

Right. And the question is: did they actually use SJ's voice as part of their training data? Because there's a lot of it available, given all her work.

There's a reason why they wanted 'her', specifically. What reason is that? If they could just work with a no-name voice actress (likely for far cheaper), why not do that from the get-go? It could be a marketing gimmick; maybe they wanted her name, more than just the voice, to add to the buzz. If it's not that, then the sequence of events doesn't make sense.

> In a statement from the Sky actress provided by her agent, she wrote that at times the backlash “feels personal being that it’s just my natural voice and I’ve never been compared to her by the people who do know me closely.”

This isn't the timeline, though. The actor for Sky was hired and cast before they even reached out to ScarJo. The idea that they wanted to literally reproduce "Her" feels like motivated reasoning to me.

I don't understand.

If you literally use SJ's image or voice, then you're in trouble.

If it's an SJ lookalike or soundalike (and you don't claim otherwise), there's no problem.

Right? What's the "shenanigans?"

  • > If it's an SJ lookalike or soundalike (and you don't claim otherwise), there's no problem.

    This isn't true. At least with respect to "soundalike" see, e.g., Waits v. Frito-Lay 978 F.2d 1093 and Midler v. Ford Motor Co. 849 F.2d 460.

  • Your second statement may not be true, legally, and at the very least many (including the actress in question) believe it is not true, ethically.

I think a better characterization would be:

* OpenAI wanted an AI voice that is SJ's voice

* SJ declined

* OpenAI got an AI voice from another person that sounds like SJ

  • That would require a step 3 where they get in a time machine:

    > But while many hear an eerie resemblance between “Sky” and Johansson’s “Her” character, an actress was hired to create the Sky voice months before Altman contacted Johansson, according to documents, recordings, casting directors and the actress’s agent.

So what? They’re free to hire whoever they want to be a voice actor. It’s not illegal for them to hire someone that sounds like Barack Obama.

  • If you say "yes we can" in your corporate announcement of that person who sounds like Obama, and one of your employees (or rather, ex-executives) says "the secret ingredient in AGI is Obama", it actually can be illegal. The main issue with NIL rights (as with trademarks) isn't similarity - it's brand confusion.