Comment by voidhorse

3 days ago

Sure, but it's overblown. People have been reasoning about and building probabilistic systems formally since the birth of information theory back in the 1940s. Many systems we already rely on today are highly stochastic in their own ways.

Yes, LLMs are a bit of a new beast in their use of stochastic processes as producers, but we do know how to deal with these systems. Half the "novelty" is just people either forgetting past work or being ignorant of it in the first place.
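To make "we do know how to deal with these systems" concrete: one classical technique is statistical acceptance testing, where instead of asserting exact outputs you assert distributional properties with an explicit error bound. A minimal sketch, assuming a bounded (clipped) output range so a Hoeffding-style tolerance applies; `stochastic_component` and `check_mean` are hypothetical names standing in for any stochastic producer and its test harness:

```python
import math
import random

def stochastic_component(x):
    # Stand-in for any stochastic producer (an LLM call, a randomized
    # algorithm, a noisy sensor): returns x plus zero-mean Gaussian noise.
    return x + random.gauss(0, 1.0)

def check_mean(f, x, expected, n=10_000, delta=1e-6, spread=4.0):
    """Sample f(x) n times and accept iff the empirical mean lies within
    a Hoeffding-style tolerance eps of the expected value. Outputs are
    clipped to [expected - spread, expected + spread] so the bounded-range
    assumption behind Hoeffding's inequality holds."""
    lo, hi = expected - spread, expected + spread
    samples = [min(max(f(x), lo), hi) for _ in range(n)]
    mean = sum(samples) / n
    # Hoeffding: P(|mean - E| > eps) <= 2 * exp(-2 * n * eps^2 / (hi - lo)^2),
    # so choosing eps this way keeps the false-rejection rate below delta.
    eps = (hi - lo) * math.sqrt(math.log(2 / delta) / (2 * n))
    return abs(mean - expected) <= eps

print(check_mean(stochastic_component, 3.0, expected=3.0))
```

The point is that the test is probabilistic by design: it fails with probability at most `delta`, and you choose `n` and `delta` to trade run time against confidence, which is exactly the kind of reasoning the pre-LLM literature formalized.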

> Half the "novelty" is just people either forgetting past work or being ignorant of it in the first place.

We also see this in cryptocurrencies. The path to forgetting is greased by money and fame, and practitioners are eventually forced to "discover" the same ancient problems they insisted couldn't possibly apply.

Truly appreciate the perspective. Any pointers to prior work on dealing with stochastic systems? Part of my work is securing AI workloads, and losing determinism seems to throw out a lot of assumptions baked into previously accepted approaches.