Comment by evantbyrne
6 months ago
I'm pretty bearish on the idea that AGI is going to take off anytime soon, but I read a significant amount of theology growing up and I would not describe the popular essays from e.g., LessWrong as religious in nature. I also would not describe them as appearing poorly read. The whole "look they just have a new god!" is a common trope in religious apologetics that is usually just meant to distract from the author's own poorly constructed beliefs. Perhaps such a comparison is apt for some people in the inevitable AGI camp, but their worst arguments are not where we should be focusing.
Philosophy and religion are not the same thing, though one can certainly describe a religious belief as a philosophical belief.
Even a scientifically inclined atheist has philosophical ideas grounding their world view. The idea that the universe exists as an objective absolute with immutable laws of nature is a metaphysical idea. The idea that nature can be observed and that reason is a valid tool for acquiring knowledge about nature is an epistemological idea. Ethics is another field of philosophy and it would be a mistake to assume a universal system of ethics that has been constant throughout all cultures across all of human history.
So while I certainly agree that there is a very common hand-wave of "look, the atheists have just replaced God with a new 'god' by a different name," you don't have to focus on religion, theology, and faith-based belief systems to identify different categories of philosophical ideas and how they have shaped different cultures, their beliefs, and behaviours throughout history.
A student of philosophy would identify the concept of "my truth" as an idea put forward by Immanuel Kant, for example, even if the person saying it doesn't know that's the root of the idea that reality is subjective. Similarly, the empirically grounded scientist would be recognized as following in the footsteps of Aristotle, and the pious bible thumper as parroting ideas published by Plato.
The point is that philosophy is not the same thing as religion and philosophy directly shapes how people think, what they believe and therefore how they act and behave. And it's kind of uncanny how an understanding of philosophy can place historical events in context and what kinds of predictive capabilities it has when it comes to human behaviour in the aggregate.
This sounds very educated but I don't really see what it has to do with the comment you're responding to (or with AI).
While it's a fair criticism, just because someone doesn't believe in a god doesn't mean the religious hardware in their brain has been turned off. It's still there and operational - I don't think it's a surprise that this hardware's attention would then be automatically tuned to a different topic.
I think you can also see this in the intensification of political discussion, which now carries an intensity similar to the religious discussions of centuries past (e.g., the Protestant Reformation), indicating that this "religious hardware" has shifted domains to the realm of politics. I believe this shift can also be seen in the intense actions and rhetoric of the mid-20th century.
You can also look at all of these new age "religions" (spiritualism, horoscopes, etc.) as that religious hardware searching for something to operate on in the absence of traditional religion.
> While it's a fair criticism, just because someone doesn't believe in a god doesn't mean the religious hardware in their brain has been turned off.
Max Stirner said that after the Enlightenment and the growth of liberalism, which is still very much in vogue to this day, all we’ve done is replace the idea of God with the idea of Man.
The object might be different, but it is still the unshakable belief in an idealised and subjective truth, with its own rituals and ministers i.e a religion.
I guess the Silicon Valley hyper-technological optimism of the past years is yet another shift from Man to religious belief in the Machine.
I agree that modern hyper-online moralist progressivism and QAnonism are just fresh coats of paint on religion, but that isn't similar to AI.
AI isn't a worldview; it's an extremely powerful tool which some people happen to be stronger at using than others, like computers or fighter jets. For people who empirically observe that they've been successful at extracting massive amounts of value from the tool, it's easy to predict a future in which aggregate economic output in their field by those who are similarly successful will dwarf that of those who aren't. For others, it's understandable that their mismatched experience would lead to skepticism of the former group, if not outright comfort in the idea that such productivity claims are dishonest or delusional. And then of course there are certainly those who are actually lying or deluded about fitting in the former group.
Every major technology or other popular thing has some subset of its fandom which goes too far in promotion of the thing to a degree that borders on evangelical (operating systems, text editors, video game consoles, TV shows, diets, companies, etc.), but that really has nothing to do with the thing itself.
Speaking for myself, anecdotally, I've recently been able to deliver a product end-to-end on a timeline and level of quality/completeness/maturity that would have been totally impossible just a few years ago. The fact that something has been brought into existence in substantially less time and at orders of magnitude lower cost than would have been required a few years ago is an undeniable observation of the reality in front of me, not theological dogma.
It is, however, a much more cognitively intense way to build a product — with AI performing all the menial labor parts of development, you're boxed into focusing on the complex parts in a far more concentrated time period than would otherwise be required. In other words, you no longer get the "break" of manually coding out all the things you've decided need to be done and making every single granular decision involved. You're working at a higher level of abstraction and your written output for prompting is far more information-dense than code. The skills required are also a superset of those required for manual development; you could be the strongest pre-LLM programmer in the world, but if you're lacking in areas like human language/communication, project/product management, the ability to build an intuition for "AI psychology", or thinking outside the box in how you use your tools, adapting to AI is going to be a struggle.
It's like an industry full of mechanics building artisan vehicles by hand suddenly finding themselves foisted with budgets to design and implement assembly lines; they still need to know how to build cars, but the nature of the job has now fundamentally changed, so it's unsurprising that many or even most who'd signed up for the original job would fail to excel in the new job and rationalize that by deciding the old ways are the best. It's not fair, and it's not anyone's fault, but it's important for us all to be honest and clear-eyed about what's really happening here. Society as a whole will ultimately enjoy some degree of greater abundance of resources, but in the process a lot of people are going to lose income and find hard-won skills devalued. The next generation's version of coal miners being told to "learn to code" will be coders being told to "learn to pilot AI".
If the "greater abundance of resources" all ends up in the hands of those few at the top of the pyramid, I'm not sure most people are going to celebrate this change.
> It's not fair, and it's not anyone's fault, but it's important for us all to be honest and clear-eyed about what's really happening here.
Or we can just refuse this future and act as a society to prevent it from happening. We absolutely have that power, if we choose to organize and use it.
Which then leads you to the question "who installed the hardware"?
No, that leads you to that question.
It leads me to the question, "Is it really 'religious hardware' or the same ol' 'make meaning out of patterns' hardware we've had for millennia that has allowed us to make shared language, make social constructs, mutually believe legal fictions that hold together massive societies, etc.?"
I've read LessWrong very differently from you. The entire thrust of that society is that humanity is going to create the AI god.
They are literally publishing a book called "If Anyone Builds It, Everyone Dies" and trying to stop humanity from doing that. I feel like that's an important detail: they're not the ones trying to create the god, they're the ones worried about someone else doing it.
I don't think what you said negates anything I said.
Maybe not a god, but we're intentionally designing artificial minds greater than ours, and we intend to give them control of the entire planet. While also expecting them to somehow remain subservient to us (or is that part just lip service)?
What makes an artificial mind greater than ours?
Do you assume that someone will stumble into creating a person, but with unlimited memory and computational power?
Otherwise, if we are able to create this person using our knowledge, we will most certainly be able to augment humans with those capabilities.
I’m sorry, but are you arguing that an LLM is anywhere near a human mind? Or are you arguing about some other AI?
If you understand the cultural concepts of Adam Curtis’s All Watched Over by Machines of Loving Grace, then yes we do keep trying to make gods out of inanimate things.
And it’s the atheists who continuously do it, claiming they don’t believe in God while believing in markets or AI instead.
It’s an irony of ironies.
My comment was not referring to the present state of things, but to the ultimate goal.
However, the present state is also worth a look!
The three uniquely human capacities that people keep saying a machine can never replicate:
1. Empathy: they win by default. (My reference group is twenty friends and seven therapists.)
2. Critical Thinking: they win with the correct prompt. (You need to explicitly work against the sycophancy. i.e. the desire to appear empathetic limits the ability to convey true information to a human.)
3. Creativity: I want to say creativity lags behind, in LLMs at least, but Midjourney is blowing my damn mind, so I might have to give them that, too.
That's with the versions of AI we have today. My comment was referring to the ultimate goal, i.e. where this is all heading.
To put it explicitly, we intend to:
(1) make them in our image (trained after our mental output, and shaped after our body),
(2) while also making them vastly superior intellectually and physically (strength, endurance, etc.),
(3) while also expecting them to have no will of their own -- except as it aligns with ours. (We do actually need to give them a will to make them useful.)
I do not expect that to end very well.
I didn’t say that “it’s just a new god,” I said:
The notion that history is just on some inevitable pre-planned path is not a new idea, except now the actor has changed from God to technology.
This is a more nuanced sentence.
Before that quoted sentence you drew a line from the reformation to people believing that AI is inevitable, then went on to imply these people may even believe such a thing will happen without the involvement of people. These are generalizations which don't fit a lot of the literature and make their best ideas look a bit sillier than they are. It is situations like these that make me think that analogies are better suited as a debate tactic than a method of study.
Would you say LessWrong posts are dogmatic?
> I also would not describe them as appearing poorly read.
YOU come off as poorly read, so I wouldn't trust your judgement on this one, champ. "common trope" lmfao.
I just want to comment here that this is the classic arrogant, underread "I reject half of humanity's thought" foolishness that OP is referring to.
I mean, the lack of self-awareness you have here is amazing.
To the contrary. I sped through my compsci capstone coursework first year of college and spent most of the rest of my time in philosophy, psychology, and sociology classrooms. The "hey, if you squint at this thing it looks like religion for the non-religious" perspective is just one I've heard countless times. It is perfectly valid to have a fact-based discussion on whether there is a biological desire for religiosity, but drawing a long line from that to broadly critique someone's well-articulated ideas is pretty sloppy.
Quoting your college classes is the first sign of inexperience, but I’ll share some modern concepts.
In Adam Curtis’s All Watched Over by Machines of Loving Grace, he makes a pretty long and complete argument that humanity has a rich history of turning over its decision-making to inanimate objects, out of a desire to discover ideologies we can’t form ourselves amid the growing complexity of our interconnectivity.
He tells a history of these attempts constantly failing because the core ideology of “cybernetics” underlies them all and fails to be adaptive enough to match our combined DNA/body/mind cognitive system, especially when scaled to large groups.
He makes the second point that humanity and many thinkers constantly also resort to the false notion of “naturalism” as the ideal state of humanity, when in reality there is no natural state of anything, except maybe complexity and chaos.
Giving yourself up to something, especially something that doesn’t work, is very much “believing in a false god.”
> The "hey, if you squint at this thing it looks like religion for the non-religious" perspective is just one I've heard countless times
To be fair, we shouldn't bundle Augustine and Thomas Aquinas with John MacArthur and Joel Osteen. Meaning that some religious thought is more philosophically robust than other religious thought.