
Comment by visarga

1 year ago

> “it can kill you based on its own desires and there’s nothing you can do about it”

haha, that's an original take, but it makes sense after Terminator and HAL

wondering if these movies, made just to sell a few tickets, have caused untold negative consequences for humanity's adoption of AI

to make a parallel, anti-vaxxers did their damage and cost many lives. Similarly, these stories, which are no better, can give people a bad start with AI and sabotage their futures, or stall the benefits of AI for everyone else

Genuinely I think that’s the case.

I have been in “AI” since 1998, when I was writing A* route planning for NPCs in this cool new engine called Unreal.
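For readers unfamiliar with it, A* is the classic graph-search algorithm that game NPC route planners of that era typically used. Below is a minimal sketch of grid-based A* with a Manhattan-distance heuristic; the grid layout and function names are illustrative, not taken from Unreal's actual implementation.

```python
import heapq

def a_star(grid, start, goal):
    """A* pathfinding on a 4-connected grid; 0 = open cell, 1 = wall.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible (never overestimates) on a grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        f, g, cur = heapq.heappop(open_heap)
        if cur == goal:
            # Walk the came_from chain backwards to rebuild the path
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if g > best_g.get(cur, float("inf")):
            continue  # stale heap entry; a cheaper route was found already
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cur
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None  # goal unreachable

# Tiny example: route around a wall in the middle row
grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
path = a_star(grid, (0, 0), (2, 0))
```

The heuristic is what separates A* from plain Dijkstra: it biases the search toward the goal while the admissibility guarantee keeps the result optimal.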

The only thing that has been consistent in all these years is that nobody thinks it’s AI unless it’s literally Arnold Schwarzenegger in The Terminator. I mean, I’m not even exaggerating; it’s so ridiculously predictable that the goalposts for AI move the second that particular technology becomes ubiquitous.

So for example, HOG, SIFT, SURF, etc., along with localization algorithms like SLAM-type systems, were so thoroughly in research when I started that they were considered a pillar of the field of AI. Now literally no one would consider those AI, because they do not use deep convolutional networks.

So, just like Marvin Minsky said, AI is a suitcase term that doesn’t fucking mean anything. As somebody who’s been doing it for so long, I’m used to it, but it’s still annoying.

So I’m just building the terminator and the counter terminator so we can move on.

  • > Tesler's Theorem (ca. 1970). My formulation of what others have since called the “AI Effect”. As commonly quoted: “Artificial Intelligence is whatever hasn't been done yet”. What I actually said was: “Intelligence is whatever machines haven't done yet”. Many people define humanity partly by our allegedly unique intelligence. Whatever a machine—or an animal—can do must (those people say) be something other than intelligence.

    https://www.nomodes.com/larry-tesler-consulting/adages-and-c...

>wondering if these movies have caused untold external consequences to humanity in its adoption of AI just to sell a few tickets

What are you saying? This isn't like when The Simpsons made fun of nuclear power and depicted it as doing impossible things. AGI is a hypothetical technology and we don't yet know what it could be capable of or even if it's feasible.

>to make a parallel, anti-vaxxers did their damage and caused many lives to be lost, similarly these stories, which are no better, can make people have a bad start with AI and sabotage their futures, or stall the benefits of AI from everyone else

Any idea can change a person's mind in one direction or another. Yours is an argument against the exchange of ideas in general. "Since hearing an idea could cause a person to $DO_BAD_THING, exchanging ideas (for example, by talking to people with $WRONG_OPINION, or by consuming fiction) is bad."

  • I thought "surely, when their lives are at stake, people will do the prudent thing and trust doctors", but no, we were not that smart. Some ideas inflict self-harm and keep getting support even in the face of grave consequences.

    • But you're not arguing against any particular idea. You're arguing against ideas based on how they change people's minds. But how a person's mind is changed by an idea depends on their personality and on the contents of their mind when they hear it, so in principle any idea can change a person's mind in any direction. Some people hear "vaccines cause autism" and conclude "I should not vaccinate my children", while others conclude "this country needs better education". Some people reach "I should not vaccinate my children" after hearing some other idea entirely. Some people see The Terminator and think "I should work to prevent the advancement of AI", while others think "I should go into AI to prevent this from happening", and yet others think "ha, what a silly movie". So, like I said, your argument is one against culture as a whole. If the fact that hearing an idea will convince some people of opinion X is a good reason to stop the spread of that idea, then it's also a good reason to stop the spread of all ideas.