Comment by glenstein

6 hours ago

Agreed. At a bare minimum it's a hedge against terrestrial existential risks. And if Mars itself sucks, then, well, rotating space stations with simulated gravity, same principle.

One terrible thing wrought by billionaire Mars fantasies is a backlash that I think has become too sweeping. The billionaire version is wrongheaded for a million reasons, but it's nevertheless true that hedging against terrestrial existential risks is something we should have an interest in.

Sorry, but I'd love to hear exactly how a Mars habitat with a half dozen people, or a space station, is a "hedge against terrestrial existential risks." Both are hostile environments that lack the resources to sustain themselves for any appreciable amount of time, and neither holds anywhere near the number of people required to repopulate.

  • I'd love to see you make more of an effort to understand the idea you're engaging with, rather than offering an emotionally charged dismissal. I try to profess the principle of charity here from time to time, which means tackling the version of an idea that credits it with making the most sense.

    So if the version of the idea you're engaging with is one that's doomed to fail, one without the resources, technology, or population to succeed... maybe assume that's not the version I'm talking about?

    There are contexts where I love to get into these kinds of details (there was an amazing conversation on HN a few months ago [1] about what would be involved in sending a fleet of Voyager-style space probes to Alpha Centauri), but you have to want to try.

    1. https://news.ycombinator.com/item?id=46058528

  • If I’m to believe the experts, LLMs are a panacea for every problem that has ever existed, just like blockchain before them.

    Therefore it's a non-issue: given that LLMs have only gotten exponentially more impressive, in [current_year+n] you will be able to prompt Claude to materialize a fast terraforming machine and FTL it over to Mars.