
Comment by tptacek

2 years ago

They hired the actor that did the voice months before they contacted SJ. The reaction on this site to the news that this story was false is kind of mindbending.

My guess: Sam wanted to imitate the voice from Her and, being aware of cases like Midler v. Ford, reached out to SJ. He probably didn't expect her to decline. Anyway, that precedent says you cannot mimic someone else's voice without their permission, and the overall timeline indicates OpenAI's "intention" of imitation. It does not matter whether they used SJ's voice in the training set or not. Their intention matters.

  • Please don't take this as me defending OpenAI's clearly sketchy process. I'm writing this to help myself think through it.

    If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

    - It's fine to hire a voice actor.

    - It's fine to train a system to sound like that voice actor.

    - It's fine to hire a voice actor who sounds like someone else.

    - It's probably fine to go out of your way to hire a voice actor who sounds like someone else.

    - It's probably not fine to hire a voice actor and tell them to imitate someone else.

    - It's very likely not fine to market your AI as "sounds like Jane Doe, who sounds like SJ".

    - It's definitely not fine to market your AI as "sounds like SJ".

    Say I wanted to make my AI voice sound like Patrick Stewart. Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising. If so, would it have been OK for OpenAI to do all this as long as they didn't mention SJ? Or is SJ so clearly identifiable with her role in "Her" that it's never OK to try to make a product like "Her" that sounds like SJ?

    • There’s a special branch of law called “right of publicity,” sometimes described as protecting one’s “name and likeness.”

      It protects celebrities who rely on endorsements and “who they are” for income.

      It very clearly prohibits copycats with near-likeness as a workaround to getting permission from a celebrity.

      OpenAI asked SJ to use her voice. That right there helps her case immensely.

      She said no. They went ahead anyway, presumably with one or more people with a similar voice.

      They publicized the product by referencing SJ.

      These facts are damning.

      They might be just a part of the story. Maybe 100 actresses, all sounding roughly the same, were given the offer over a two year period.

      Maybe they all were given the same praise. Maybe one other, who signed an agreement, was praised on social media much more.

      But this isn’t a slippery slope or a grey area. SJ was asked and said no.

      That prohibits using a similar-sounding copycat and publicizing it as SJ.

      12 replies →

    • There's no clear line for this. To reach a definitive conclusion, you would need to bring this to court with a lot of investigation. I know this kind of ambiguity is frustrating, but context and intention matter a lot here, and unfortunately we don't have a better way than a legal battle to figure it out.

      Thanks to Sam, this OpenAI case is clearer than most, since he produced a number of clear pieces of evidence against himself.

      2 replies →

    • When it comes to whether something is "wrong", in general intent matters a lot and what they did was communicate an obvious intent. There are certainly ways they could have avoided doing so, and I'm not sure I understand the value of trying to dissect it into a dozen tiny pieces and debate which particular detail pushes it over the line from ambiguous to hard-to-deny? Maybe I don't understand what kind of clarity you're trying to achieve here.

      This particular area of law or even just type of "fairness" is by necessity very muddy, there isn't a set of well-defined rules you can follow that will guarantee an outcome where everyone is happy no matter what, sometimes you have to step back and evaluate how people feel about things at various steps along the way.

      I'd speculate that OAI's attempts to reach out to SJ are probably the result of those evaluations - "this seems like it could make her people upset, so maybe we should pay her to not be mad?"

      2 replies →

    • >If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

      Setting the recent screwup aside... it's tough. A commercial product shouldn't try to associate itself with another brand. But if we're being realistic: "Her" is nearly uncopyrightable.

      Since "Her" isn't a tech brand, it would be hard for some future company to get in trouble based on that association alone. Kind of like how "Skynet" could in theory have been taken by a legitimate tech company, and the Terminator IP owner would struggle to seek damages (to satisfy curiosity: Skynet is a U.S. government program, so that name is already taken care of).

      As long as you don't leave a trail, you can probably get away with copying Stewart. But if you start making Star Trek references (even if you never contacted Stewart), you're stepping into hot water.

    • > It's probably not fine to hire a voice actor and tell them to imitate someone else.

      Pretty sure this is fine; otherwise cartoons like The Simpsons or South Park would've gotten in trouble years ago.

      8 replies →

    • > If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

      Yes

      > Say I wanted to make my AI voice sound like Patrick Stewart

      Don't tweet "engage" or "boldly go where no man has gone before" when you release the product and you should be OK.

    • > It's fine to hire a voice actor who sounds like someone else.

      Not necessarily, when you're hiring them because they sound like someone else, especially someone else who has said that they don't want to work with you. OpenAI took enough steps to show they wanted someone who sounded like SJ.

      > Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.

      See https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co. and also Tom Waits vs. Frito-Lay.

      > as long as they didn't mention SJ

      Or hadn't tried to hire SJ repeatedly, even as late as two days before the launch.

    • > If it weren't for their attempt to link the voice to SJ (i.e. with the "her" tweet), would that be OK?

      No.

      The fact that it sounds very much like her and it is for a virtual assistant that clearly draws a parallel to the virtual assistant voiced by SJ in the movie (and it was not a protected use like parody) makes it not OK and not legal.

      > Surely it's OK to hire an English actor who sounds a lot like him, so long as I don't use Sir Patrick's name in the advertising.

      Nope.

      If you made it sound identical to Patrick Stewart that would also likely not be OK or legal since his voice and mannerisms are very distinctive.

      If you made it sound kind of like Patrick Stewart that is where things get really grey and it is probably allowed (but if you're doing other things to draw parallels to Patrick Stewart / Star Trek / Picard then that'd make your case worse).

      And the law deals with grey areas all the time. You can drag experts in voice and language into the court to testify as to how similar or dissimilar the two voices and mannerisms are. It doesn't nullify the law that there's a grey area that needs to get litigated over.

      The things that make this case a slam dunk are that there's the relevant movie, plus there's the previous contact to SJ, plus there's the tweet with "Her" and supporting tweets clearly conflating the two together. You don't even really need the expert witnesses in this case because the behavior was so blatant.

      And remember that you're not asking a computer program to analyze two recordings and determine similarity or dissimilarity in isolation. You're asking a judge to determine if someone was ripping off someone else's likeness for commercial purposes, and that judge will absolutely use everything they've learned about human behavior in their lifetime to weigh what they think was actually going on, including all the surrounding human context to the two voices in question.

  • A random person's normal speaking voice is nobody's intellectual property. The burden would have been on SJ to prove that the voice actor they hired was "impersonating" SJ. She was not: the Washington Post got her to record a voice sample to illustrate that she wasn't doing an impersonation.

    Unless & until some third shoe drops, what we know now strongly --- overwhelmingly, really --- suggests that there was simply no story here. But we are all biased towards there being an interesting story behind everything, especially when it ratifies our casting of good guys and bad guys.

    • If "Her" weren't Sam's favorite movie, and if Sam hadn't tweeted "her" the day it launched, and if they hadn't asked SJ to do the voice, and if they hadn't tried to reach her again two days before the launch, and if half the people who first heard the voice said "Hey, isn't that SJ?" -

      Then I'd say you have a point. But given all the other info, I'd have to say you're in denial.

      3 replies →

    • > the Washington Post got her to record a voice sample

      Actually it only says they reviewed "brief recordings of her initial voice test", which I assume refers to the voice test she did for OpenAI.

      The "impersonating SJ" thing seems to be a straw man someone made up. The OpenAI talent call was for "warm, engaging, charismatic" voices sounding 25-45 years old (I assume SJ would have qualified, given that Altman specifically wanted her). They reviewed 400 applicants meeting those criteria, and it seems they threw away the 395 that didn't remind Altman of SJ. It's a bit like natural selection and survival of the fittest: take 400 giraffes, kill the 395 shortest ones, and the rest will all be tall. Go figure.

    • You’re right that a random person’s voice is not IP, but SJ is not a random person. She’s much more like Mr. Waits or Ms. Midler than you or I.

      I don’t believe the burden would be to prove that the voice actor was impersonating, but that she was misappropriating. Walking down the street sounding like Bette Midler isn’t a problem but covering her song with an approximation of her voice is.

      You are dead right that the order of operations recently uncovered precludes misappropriation. But it’s an interesting situation otherwise, hypothetically, to wonder if using SJ’s voice to “cover” her performance as the AI in the movie would be misappropriation.

      20 replies →

  • I cannot read the article because of its paywall. Is there actual proof that OpenAI reached out to Johansson, or is it just being alleged by her lawyers?

    It seems she has every reason to benefit from claiming Sky sounded like her even if it was a coincidence. "Go away" payments are very common, even for celebrities - and OpenAI has deep pockets...

    Even so, if they got a voice actor to impersonate or sound similar to Johansson, is that something that's not allowed?

    • Johansson is a super successful actress and no doubt rejects 95% of roles offered to her, just as she rejected Altman's request to be the voice of ChatGPT.

      She doesn't need "go away" payments, and in any case that is not what we're looking at here. OpenAI offered her money to take the part, and she said no.

      According to celebrity net worth website, SJ is worth $165M.

      3 replies →

    • If her lawyers are half competent, then they wouldn’t lie. They may not tell the whole truth, but we’re not discussing what wasn’t said here.

      As for your second question, yes. Otherwise you have a perfect workaround that would make a person’s likeness a free-for-all to use, but we already decided that is not acceptable.

      1 reply →

    • >Even so, if they got a voice actor to impersonate or sound similar to Johansson, is that something that's not allowed?

      Correct, that is not allowed in the US.

Sure, no one is disputing that, and despite this Altman then contacts SJ again two days before release asking her to reconsider, then tweets "her" to remind the public what he was shooting for. The goal could have just been ChatGPT with a voice interface, but instead Altman himself is saying that the goal was specifically to copy "her".

  • He's not necessarily saying that was the goal from the start. All he is admitting with that tweet is that it is indeed (he finds it to be) reminiscent of "Her".

    • Well, from the start, Altman wanted SJ to do the voice. Perhaps he'd never seen or heard of the movie "her", and the association is just coincidental?

      After "she" said no, then Altman auditions a bunch of voice talent and picks someone who sounds just like SJ. I guess that's just the kind of voice he likes.

      So, Altman has forgotten about SJ, has 5 voice talents in the bag, and is good to go, right? But then, 2 days(!) before the release he calls SJ again, asking her to reconsider (getting nervous about what he's about to release, perhaps?).

      But still, maybe we should give Altman the benefit of the doubt, and assume he wanted SJ so badly because he had a crush on her or something?

      Then on release day, Altman tweets "her", and reveals a demo not of a sober AI assistant with a voice interface, but of a cringe-inducing AI girlfriend trying to be flirty and emotional. He could have picked any of the five voices for the demo, but you know ...

      But as you say, he's not admitting anything. When he tweeted "her" maybe it was because he saw the movie for the first time the night before?

      5 replies →

    • The voice, or the plot/concept of the movie? "Her" was about an AI having enough realism that someone could become emotionally attached to it. It was not a movie about Scarlett Johansson's voice. Any flirty female voice would be appropriate for a "her" tweet.

      3 replies →

    • And if you commission an artist to draw a black-leather-clad red-headed superspy in an ad for your product, it need not look like Black Widow from the MCU.

      But if it does look very much like her, it doesn't really matter whether you never intended to.

      1 reply →

Even if the voice actor was sourced before they originally contacted SJ, it was clearly the intent to sound like her. There are so many other distinctive voices they could have chosen, but instead they decided to get as close to "her" as they could. Many people thought it was SJ until she stated it wasn't. I appreciate that the voice actor may sound like that naturally, but it's hardly coincidental that the voice sounding most like the one from "her" was the one chosen for their promotion. It is clearly an attempt at passing off.

  • >Even if the voice actor was sourced before they originally contacted SJ, it was clearly the intent to sound like her.

    "Her" being the voice SJ did for the movie, not SJ's conversational voice, which is somewhat different.

    If OpenAI were smart, they did it in a Chinese-wall manner and looked for someone whose voice sounded like the movie without involving SJ's voice in the discussion.

  • This is not a thing. They hired a voice actor, who spoke in her normal speaking voice. That voice is not SJ's intellectual property, no matter what it sounds like. Further, I don't know how you can say any intention here is "clear", especially given the track record on this particular story, which has been abysmal even after this story was published.

    They could have taken auditions from 50 voice actors, come across this one, thought to themselves "Hey, this sounds just like the actress in 'Her', great, let's use them" and that would be fine. Laurence Fishburne does not own his "welcome to the desert of the real" intonation; other people have it too, and they can be hired to read in it.

    Again: the Post has this voice actor reading in the normal voice. This wasn't an impersonator.

    • > I don't know how you can say any intention here is "clear"

      You are suggesting that it is coincidence that they contacted SJ to provide her voice, they hired a voice actor that sounds like her, they contacted SJ again prior to launch, and then they chose that specific voice from their library of voices and tweeted the name of the movie that SJs voice is in as a part of the promo?

      I haven't suggested what they have done is illegal, given that the fictional company that created the AI "her" is unlikely to be suing them, but it is CLEARLY what their intent was.

    • What part of "actor" in "voice actor" did you not understand? You don't hire an actor to play themselves generally. "SJ" was not playing herself in Her.

    • > They could have taken auditions from 50 voice actors, come across this one, thought to themselves "Hey, this sounds just like the actress in 'Her', great, let's use them" and that would be fine.

      Except that is simply not true. If their intent was to sound like Her, and then they chose someone who sounds like Her, then they're in trouble.

      6 replies →

It was not claimed that they cloned ScarJo's voice. They hired a soundalike when they couldn't get the person they wanted. Use or lack of use of AI is irrelevant. As I said before, both Bette Midler and Tom Waits won similar cases.

Since they withdrew the voice this will end, but if OpenAI hadn't backed off and ScarJo sued, there would be discovery, and we'd find out what her instructions were. If those instructions were "try to sound like the AI in the film Her", that would be enough for ScarJo to win.

I know that the Post article claims otherwise. I'm skeptical.

  • > It was not claimed that they cloned ScarJo’s voice.

    There were some claims by some people when the issue first arose that they had specifically done a deepfake clone of SJ’s voice; probably because of the combination of apparent trading on the similarity and the nature of OpenAI’s business. That’s not the case as far as the mechanism by which the voice was produced.

    • It's technically possible that the Sky voice/persona is half voice actress and half prosody/intonation ("performance") from SJ/"her". Clearly the ChatGPT TTS system is flexible enough to add emotion/drama to the underlying voice, and that aspect must also have been trained on something.

      Clearly a lot of people (including her "closest friends") found the ChatGPT demo to be very similar to SJ/"her", which isn't to deny that the reporter was fed some (performance-wise) flat snippets from the voice actor's audition tape that sounded like flat sections of the ChatGPT demo. It'd be interesting to hear an in-depth comparison from a vocal expert, but it seems we're unlikely to get that.

Your immediate acceptance that a timeline representing the best spin of a deep-pocketed company in full crisis-PR mode proves the story "false" (full stop, no caveats) is... I wouldn't say mind-bending, but quite credulous at a minimum. The timeline they present could be accurate and the full picture could still be quite damning. As Casey Newton wrote today [1]:

> Of course, this explanation only goes so far. We don’t know whether anyone involved in choosing Sky’s voice noted the similarity to Johansson’s, for example. And given how close the two voices sound to most ears, it might have seemed strange for the company to offer both the Sky voice and the Johansson voice, should the latter actor have chosen to participate in the project. [...] And I still don’t understand why Altman reportedly reached out to Johansson just two days before the demonstration to ask her to reconsider.

They absolutely have not earned the benefit of the doubt. Just look at their reaction to the NDA / equity clawback fiasco [2], and their focus on lifelong non-disparagement clauses. There's a lot of smoke there...

[1] https://www.platformer.news/openai-scarlett-johansson-chatgp...

[2] https://www.vox.com/future-perfect/351132/openai-vested-equi...

>They hired the actor that did the voice months before they contacted SJ. The reaction on this site to the news that this story was false is kind of mindbending.

People lose their rational minds when it comes to people they hate (or love, I suppose). I don't care for Sam Altman or OpenAI one way or another, so it was quite amusing to watch the absolute outrage the story generated, with people so certain about their views.

I don't understand the point you are trying to make. The essential question is whether they were trying to imitate (using a voice actor or otherwise) Scarlett Johansson's voice without her permission. Nothing in the article refutes that they were; whether they sought the permission before or after they started doing the imitation is irrelevant. Others have pointed to previous case law that shows that this form of imitation is illegal.

Moreover I can't see any reasonable person concluding that they were not trying to imitate her voice given that:

1. It sounds similar to her (it's unbelievable that anyone would argue that they aren't similar, even more so given #2).

2. Her voice is famous for the context in which the synthetic voice is used.

3. They contacted her at some point to get her permission to use her voice

4. The CEO referenced the movie Johansson's voice is famous for (which, again, depicts the same context in which the synthetic voice is being used) shortly before they released the synthetic voice.

Except the story isn't false? They wanted her voice, they got her voice*, they did marketing around her voice, but it's not her voice, she didn't want to give them her voice.

Notice how the only asterisk there is "it's technically not her voice, it's just someone who they picked because she sounded just like her"

>> They hired the actor that did the voice months before they contacted SJ.

Are you saying that story is false?

Yeah, but then again, I totally expected this when opening the comment threads. The same happened with the RMS debacle, the same with similar events earlier, the same on many a Musk story. It seems that a neat narrative with a clear person/object to hate, once established, is extremely resilient to facts that disprove it.

  • Right. Even if you think OpenAI isn’t a good place, this is an investigation by an established newspaper that refuted some of the more serious accusations (that OpenAI hired a Johansson impersonator: they didn’t; that they modified the voice to sound like Johansson: evidence suggests this didn’t happen). When the reaction is “I don’t care that an investigation refuted some of the accusations”, it demonstrates someone isn’t openly approaching things in good faith.

    Likewise, if someone’s attitude is - “OK, maybe there’s no paper trail, but I’m sure this is what the people were thinking”, then you’ve made an accusation that simply can’t be refuted, no matter how much evidence gets presented.

    • > refuted some of the more serious accusations (that OpenAI got a Johannson impersonator - they didn’t

      A lot of the argument here comes down to whether the article does refute that. I don't believe it does.

      What it refutes is the accusation that they hired someone who sounds like Johansson after she told them she would not do it herself. That was certainly a more damning accusation, but it's not an identical one.

      But in my view, it requires a pretty absurd level of benefit of the doubt to think that they didn't set out to make a voice that sounds like the one from the movie.

      Maybe good for them that they felt icky about it, and tried to get her for real instead, but she said no, and they didn't feel icky enough about it to change the plan.

      Do you believe the article "refutes" that? Does it truly not strike you as a likely scenario, given what is known, both before and after this reporting?

      2 replies →

    • > When the reaction is “I don’t care that an investigation refuted some of the accusations”, it demonstrates someone isn’t openly approaching things in good faith.

      When the reaction is "it doesn't matter, it's still not ok to copy someone's voice and then market it as being that person's voice or related to that person's voice" and your reaction is to cast that as being something else, it demonstrates you are not openly approaching things in good faith.

    • An "investigation"?

      Let's note that OpenAI didn't release the names of the voice talent since they said they wanted to protect their privacy...

      So, how do you think the reporter managed to get not only the identity, but also the audition tape from "Sky"? Detective work?

      An interesting twist here is that WashPo is owned by Bezos, who via Amazon is backing Anthropic. I wonder how pleased he is about this piece of "investigative reporting"?

      1 reply →

    • OpenAI allowed the reporter to hear some snippets from the audition tape. Not exactly my idea of an "investigation".

      There are multiple parts to the voice performance of ChatGPT - the voice (vocal traits including baseline pronunciation) plus the dynamic manipulation of synthesized intonation/prosody for emotion/etc, plus the flirty persona (outside of vocal performance) they gave the assistant.

      The fact that the baseline speaking voice of the audition tape matches baseline of ChatGPT-4o only shows that the underlying voice was (at least in part, maybe in whole) from the actress. However, the legal case is that OpenAI deliberately tried to copy SJ's "her" performance, and given her own close friends noting the similarity, they seem to have succeeded, regardless of how much of that is due to having chosen a baseline sound-alike (or not!) voice actress.

      1 reply →

  • What facts disprove OpenAI making a voice that sounds like SJ such that the movie Her is referenced by Altman, and why is that actress upset?

    • > What facts disprove OpenAI making a voice that sounds like SJ

      The objective parts of this are disproved in several ways by the very article we're commenting under. The subjective parts are... subjective, but arguably demonstrated to be false in this very thread, through side-by-side listening examples of SJ vs. Sky.

      > such that the movie Her is referenced by Altman

      You're asserting a causal connection without proof of one. We don't know why Altman referenced "Her", but I feel it's more likely because the product works in a way eerily similar to the movie's AI, not because it sounds like it.

      > and why is that actress upset?

      Who knows? Celebrities sue individuals and companies all the time. Sometimes for a reason, sometimes to just generate drama (and capitalize on it).

      3 replies →

  • I also see this dynamic on these same kinds of threads, but what I see is that one side is very sure that the facts disprove something, and the other side is very sure they don't. I've been on both sides of this, on different questions. I don't think there is anything weird about this, it's just a dispute over what a given fact pattern demonstrates. It's totally normal for people to disagree about that. It's why we put a fairly large number of people on a jury... People just see different things differently.

    • It's unhelpful because the massive comment chains don't bring anything to the "discussion" (this is literal celebrity gossip, so I'm having a hard time using the word "discussion"... but wait, this isn't Reddit, how could I forget, we're the enlightened HN). It just devolves into one's priors: do you hate or love OpenAI and sama for unrelated reasons? It's just a sports bar with the audience a few drinks in.

      4 replies →

  • I'm not sure what RMS has to do with Altman. I'm also not sure why you think people just want to hate on Musk when it took a decade of his blatant lies for most people to catch on to the fact that he's a conman (remember, everyone loved him and Tesla for the first 5 or 10 years of lies). But the comparison between Musk and Altman is pretty apt, good job there.

    • Well, I'm not sure what you mean by "conman". Wildly successful people aim high, a lot. They don't meet 80% of their goals, and that is perfectly OK. Even a 20% success rate on these moonshot things sets you ahead of the masses who aim very low and get there 100% of the time.

      This whole idea that someone has to comply with your notion of how one must set goals, and reach them, is not something other people have any obligation to measure up to. And what's the deal with his lies? He can say whatever he wants and not get there. He hasn't sworn an oath to you or anyone else, so he is not in error for falling short.

      Musk might not get to Mars; he might end up mining asteroids or something. That is OK. That doesn't make him a conman.

      tl;dr: Anyone can say, work at, and fail at anything they want. And they don't owe anybody an explanation for a darn thing.

      3 replies →

It’s human nature: people see others achieve what they cannot, and try to pull them down. You see this wrt Musk on this site a lot, too.

  • It has nothing to do with this. There are many successful people and businesses that I admire, and a number of notable examples of those I do not. The two your comment mentions are simply part of that latter group. I think for good reason. (But of course I would think that...)

    • I’m not talking about you specifically here. What you’re saying could be true for you, and not true for the community as a whole. With the benefit of experience, I can tell for certain there’s (on average) a strong undercurrent of jealousy against people perceived as overly ambitious, particularly if they are successful in their ambitions. This is not specific to this site, of course, or even to the tech community in general.

      12 replies →

  • That was what Elizabeth Holmes claimed as well, however, we know that some people who try to achieve greatness are grifters. A pithy saying doesn’t change that reality.

    • You can’t seriously claim there’s any equivalence between Altman/Musk and Holmes. The former two have something to show for their ambition, Holmes was basically a fraud with no substance behind her whatsoever

      2 replies →

Tbf, Altman really screwed this up with that tweet and the very sudden last-minute contact. There probably wouldn't be much of a case otherwise.

If I had to guess at the best-faith order of events (more than OpenAI deserves):

- someone liked Her (clearly)

- they got a voice that sounded like Her, subconsciously (this is fine)

- someone high up hears it and thinks "wow this sounds like SJ!" (again, fine)

- they think "hey, we have money. Why not get THE SJ?!"

- they contact SJ, she refuses, and they realize money isn't enough (still fine, but there's definitely some schadenfreude here)

- marketing starts semi-independently, and they make references to Her, because famous AI voice (here's where the cracks start to form; sadly the marketer may not even have realized what talks went on)

- someone at OpenAI makes one last Hail Mary before the release and contacts SJ again (this is where the trouble starts; MAYBE they didn't know about SJ refusing, but someone in the pipeline should have)

- Altman, who definitely should have been aware of these contacts, makes that tweet. Maybe they forgot, maybe they didn't realize the implications. But the lawyers' room is now on fire.

So yeah, Hanlon's razor. This could be a good-faith mistake, but OpenAI had already done a good job of ruining their goodwill before this PR disaster. Again, sweet schadenfreude, even if we assume none of this was intentional.

  • Just how many "good-faith mistakes" is a company/CEO permitted to make before a person stops believing the good-faith part?

    • I'm a pretty forgiving person. I don't really mind mistakes as long as 1) they're admitted to, 2) steps are taken to actively reverse course, and 3) guardrails are put in place to prevent the same mistakes from happening.

      But you more or less drain that good faith when you are caught with your pants down and decide to double down instead. So I was pretty much against OpenAI ever since the whole "paying for training data is expensive" response during the NYT lawsuit.

      ----

      In general, the populace can be pretty unforgiving (sometimes justifiably, sometimes not). It really only takes one PR blunder to tank that good faith, and much longer to restore it.

    • Mistakes should be made once and once only, irrespective of good or bad faith. It is no longer a mistake when you make the same misstep over and over again; it is a deliberate pattern of behaviour.

The population of this site reacts to all stories like this. It’s only Gell-Mann Amnesia that causes your mind to bend.

Legally, the issue isn’t what they were thinking when they hired the actor, it’s what the intent and effect were when they went to market. (Even if there were documentary evidence that they actively sought out an actor for resemblance to SJ’s voice from day one, the only reason that would be relevant is that it would also support that being their intent with the product when it was actually released, not because it is independently relevant on its own.)

Whether or not they had any interest in SJ’s voice when they hired the other actor, they clearly developed such an interest before they went to market, and there is at least an evidence-based argument that could be made in court that they did, in fact, commercially leverage similarity.

It is a curious reaction, but it starts to make sense if some of these posters are running ops for intelligence agencies. Balaji Srinivasan noted that as the US started pulling out of foreign wars, the intelligence apparatus would be turned inward domestically.

Some of it can also be attributed to ideological reasons, the d/acc crowd for example. Please note I am not attacking any individual poster, but speculating on the reasons why someone might refuse to acknowledge the truth, even when presented evidence to the contrary.