Comment by LewisVerstappen
2 years ago
What's shitty about this?
They approached Johansson and she said no. They found another voice actor who sounds slightly similar and paid her instead.
The movie industry does this all the time.
Johansson is probably suing them so they're forced to remove the Sky voice while the lawsuit is happening.
Nothing here is shitty.
Asking someone to license their voice, getting a refusal, asking again two days before launch, releasing the product without permission anyway, and then tweeting post-launch that the product should remind you of a character in a movie whose rights they never obtained from the actress or the film company is all sketchy. And if the voice is similar enough to the famous actress's, it is a violation of her personality rights under California law and in various other jurisdictions: https://en.m.wikipedia.org/wiki/Personality_rights
These rights should have their limits but also serve a very real purpose in that such people should have some protection from others pretending to be/sound like/etc them in porn, ads for objectionable products/organizations/etc, and all the above without compensation.
I will agree with you if
- they used Johansson's actual voice in training the text-to-speech model
or
- a court finds that they violated Johansson's likeness.
From hearing the demo videos, I don't think the voice sounded that similar to Johansson.
But hiring another actor to replicate someone who refused your offer is not illegal and is done all the time by Hollywood.
> But hiring another actor to replicate someone who refused your offer is not illegal and is done all the time by Hollywood.
This could indeed let them "win" (or rather, not lose) in a legal battle.
But doing so will easily make them lose in the PR/public sense, as it's a shitty thing to do to another person, and hopefully not everyone is completely emotionless.
If they didn't use her voice at all, doesn't seem like there would be a case or even concern.
Also, they proceeded to ask her for rights just two days before they demoed the Sky voice. It would be quite a coincidence if they hadn't used her voice for training at all while still trying to get a sign-off from her.
If they used her actual voice for training the model that shipped then I agree with you. It seems like they used the voice from another woman who sounds similar though.
It doesn't just "seem like" it in this instance; their claim is "no, that is not what we did, we commissioned someone else", without specifying who.
From a technical standpoint, a fine-tuned voice model can be built from just a few minutes of data and GPU time on top of an existing voice model, much like how artist LoRAs are built for images. So it is entirely possible that that is what happened.
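To make the LoRA analogy concrete, here is a minimal NumPy sketch of the core idea: instead of retraining a full weight matrix, you learn a small low-rank correction on top of frozen base weights. All dimensions and the rank below are arbitrary illustration values, not numbers from any real TTS system.

```python
import numpy as np

# LoRA-style adaptation: keep the base weights W frozen and learn a
# small low-rank update B @ A that nudges the model toward a new
# speaker. Only A and B would be trained on the few minutes of data.
d_out, d_in, rank = 1024, 1024, 8
rng = np.random.default_rng(0)

W = rng.standard_normal((d_out, d_in))        # frozen base weights
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable, small init
B = np.zeros((d_out, rank))                   # zero init: no change at start
alpha = 16.0                                  # conventional LoRA scaling

def adapted_forward(x):
    """Base layer output plus the scaled low-rank 'speaker' correction."""
    return W @ x + (alpha / rank) * (B @ (A @ x))

full_params = W.size
lora_params = A.size + B.size
print(f"full: {full_params}, lora: {lora_params}, "
      f"ratio: {lora_params / full_params:.3%}")
```

The point of the sketch is the parameter count: the adapter here is about 1.6% the size of the full matrix, which is why this kind of fine-tune is cheap enough to do in hours rather than a full training run.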
I guess it takes more than a couple of days to organize things with an A-list star, especially if there's a studio recording session involved rather than just using existing material.
This strongly suggests they weren't trying to get her voice until the last minute (it would have been too late for the launch) but, rather, had already used the other actress and realized they were exposing themselves to a lawsuit due to how similar the voices were.
It was a CYA move, it failed, and now their ass is uncovered.
Maybe despite not using her voice at all they wanted to give her some money as a gesture of good will and/or derisk the project.
Surely the company that has been gobbling up data and information without rights or any form of compensation has suddenly turned over a new leaf and decided to try to pay an actress who isn't even involved.
Like, let's be real here. This wouldn't be the first time they used material without the rights to it, and I don't expect that to change any time soon without a major overhaul of EVERYTHING IN THE COMPANY, and even then it will probably only happen after lawsuits and fines.
I would like to buy you a horse as a gesture of goodwill to derisk this flight attendant / passenger situation.
> The movie industry does this all the time.
Such as? Please give an example.
What I'm wondering is why are they doing that in the first place. Why is the best AI company in the world trying to stick a flirty voice into their product?
It pains me to say it, but I really think it pays dividends to consider the very obvious possibility that the people who are doing this are in general just not socially well-adjusted.
Everything about OpenAI speaks of people who do not put great value on shared human connections, no?
Hey, I like that artist. I am going to train a computer to produce nearly identical work as if by them so I can have as many as I like, to meet my own wishes.
Why is it surprising that it didn't really cross their mind that a virtual girlfriend is not a good look?
This is not an organisation that has the feelings of people central to its mission. It's almost definitionally the opposite.
Yes, it seEms a LOt of big Names in tech have this same problem. Curious that, isn't it?
I also think it is tipping their hand a bit. I know companies can do multiple things at once, but what might this flirty assistant focus suggest about how AGI is coming along?
...because human brains enjoy being talked to in a flirty voice, and companies benefit from doing things their customers like? Doesn't seem that mysterious.
Guess you are their target market.