Comment by giobox
2 years ago
The movie Her predates all of this by years, and Sam Altman even tweeted "her"! The OpenAI team are clearly well aware of Scarlett's voice (it's inconceivable the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry). Of course they knew.
When auditioning actors "months before", they can still look for an actor who, guess what, sounds like SJ, even "before the first time SJ was contacted".
As the actor, I'd likely also be looking to emulate SJ in Her; it's clearly what the client was looking for.
> it's inconceivable the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry
Let's not exaggerate. It was a somewhat popular movie, yes, but not really defining, and far from the first example of a conversational AI speaking in a woman's voice. There are plenty of examples in movies and TV shows.
If anything, the seminal work in this space is Star Trek casting Majel Barrett-Roddenberry as the voice of computer systems with conversational interfaces, as early as 1987 (or 1966, if she had that role in the Original Series; I don't remember those episodes too well), all the way to ~2008 (or to 2023, if you count post-mortem use of her voice). That is one distinctive voice I'd expect people at OpenAI to be familiar with :).
Also, I can't imagine most people knowing, or caring, who voiced the computer in Her. It's not something most viewers pay attention to; they're more interested in the plot itself.
> Let's not exaggerate. It was a somewhat popular movie, yes, but not really defining, and far from the first example of a conversational AI speaking in a woman's voice. There are plenty of examples in movies and TV shows.
I'm honestly surprised so many people are making this argument, seemingly with a straight face.
It would have been a pretty weak argument even without the tweet from Altman: it is not an exaggeration to say Her is the canonical "AI voice companion" cultural artifact of our times; on the contrary, it takes exaggeration to downplay it. And the CEO's own marketing of the connection weakens the argument past the point of plausibility.
Surely there are better defenses available! But with this line ... phrases like "don't piss on me and tell me it's raining" and "don't believe your lying eyes" keep popping into my mind for some reason ...
The quotation above is
> it's inconceivable the majority of the team at OpenAI haven't at least seen part of the film that almost defined their industry
rather than
> It is inconceivable that Sam personally wasn't aware
(He obviously was)
I'd agree that Majel Barrett-Roddenberry is the prime example of a computer voice interface for most nerds… but then I looked up when Her was released and now feel old, because "surely it's not 11 years old already!"
If "her" wasn't the canonical example of a near-future AI assistant in a film, why then does Sam bother to tweet the single word "her" following launch? I think that film is far more influential than you give it credit for here.
Everyone in tech who saw that tweet knew what it meant - a single word. The tweet doesn't even require additional context or explanation to almost anyone in this industry.
There is also a clear difference in the behaviour of the "computer" in Star Trek vs "her" - what OpenAI shipped is far more like the personality of "her" than the much more straight-laced examples in Star Trek, where the computer was virtually devoid of emotional-sounding responses at all.
Just anecdotally, I personally didn't know that movie existed before this whole drama began. Sam, who tweeted that, probably did know, however.
Who cares.
What matters is whether OpenAI leadership had the movie Her in mind, and the AI in Her is more similar to LLMs than the Star Trek: The Next Generation main computer is.
Computers have had conversational voices at least since Lost In Space.
I think we can all mostly agree 'her' presents a much more realistic portrayal of the near-term future than Lost in Space et al.
Even if that's true, why would that be illegal or unethical? She can't possibly have a copyright on all voices that sound like "her".
There have been cases where it was decided that a person had rights[0] to their distinctive voice, as an extension of the Right of Publicity[1]. For example, Midler v. Ford Motor Co.[2], and one or two other cases I've seen mentioned but can't remember.
[0]: Though not necessarily "copyrights"?
[1]: https://higgslaw.com/celebrities-sue-over-unauthorized-use-o...
[2]: https://en.wikipedia.org/wiki/Midler_v._Ford_Motor_Co.
Midler v. Ford is a poor comparison for this case specifically because of:
1. hiring an impersonator
2. copying a song
3. intending to confuse.
If what OpenAI is saying is true, then none of those conditions apply. I'd say (1) is unlikely, (2) is very unlikely, and (3) is a maybe, at least to some degree.
NIL (name, image, and likeness) rights are pretty broad, and more like trademark rights than patents or copyrights. The main test isn't similarity; it's whether there is brand confusion. Karpathy's and Altman's tweets indicate brand confusion.
Still, this isn't recognized in every state or country, and there aren't many cases yet (although there are laws).
Sure, it could have happened, but it seems we don’t have evidence either way.
Tweeting "her" months later doesn't prove anything. That tweet might superficially look like evidence of intent, but if you think about it, it's not.
Counterpoint: if you think about it, yes it is.
To spell it out, based on the date, it's very weak evidence for something that happened many months before.