This is absolutely a new type of nondeterministic tool, so you're spot on there.
One of the key things we realized when we started using it is that the approach lets you mix deterministic and nondeterministic tools together as part of a composable chain.
So you can, for example, use an LLM for its evaluation capabilities via a natural language script as part of a broader chain that wraps it in deterministic code, and that chain can also include and run deterministic code nested within the plain language script.
This lets us create pipelines that combine the best of both approaches, as appropriate for the sub-task at hand.
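To make that concrete, here is a minimal sketch of the shape I mean. The llm_evaluate helper is hypothetical, just a stand-in for whichever model client or CLI you actually use:

    import json

    def llm_evaluate(prompt: str) -> str:
        # Hypothetical stand-in for whatever model call you use (API client, local model, CLI).
        # This is the only nondeterministic step in the chain.
        raise NotImplementedError("wire up your model of choice here")

    def check_error_log(log_path: str) -> dict:
        # Deterministic step: parse and count with ordinary code.
        with open(log_path) as f:
            errors = [line for line in f if "ERROR" in line]
        stats = {"error_count": len(errors)}

        # Nondeterministic step: hand the deterministic output to the model for judgement.
        review = llm_evaluate(
            "Today's error stats: " + json.dumps(stats)
            + "\nIn one sentence, does this look like a regression?"
        )

        # Deterministic again: structure the combined result for downstream code.
        return {"stats": stats, "review": review}

Everything around the llm_evaluate call behaves like any other script; the model's answer is just one field in an otherwise deterministic result.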
If you mix deterministic and nondeterministic steps, then the result is nondeterministic.
Which means your entire pipeline is tainted.
If your process is fine with that, whatever, but don't pretend that the result can be controlled.
One thing to consider is that the steps in the pipeline can be deterministic (the code executed) while the outputs (summaries, reviews, evaluations, explanations) may be nondeterministic. An example would be summarizing data calculated by a traditional script and piping that into a report-format markdown script that generates the report and summarizes the results.
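Roughly, that split could look something like the following. All names here, including llm_summarize, are illustrative placeholders rather than any particular tool:

    # Deterministic: the numbers come from ordinary code and are reproducible.
    def compute_stats(rows):
        total = sum(r["value"] for r in rows)
        return {"rows": len(rows), "total": total}

    # The report layout is deterministic; only the prose summary is not.
    def write_report(stats, llm_summarize):
        summary = llm_summarize(
            "Summarize these pipeline results in two sentences: " + str(stats)
        )
        return "\n".join([
            "# Daily report",
            "",
            f"- Rows processed: {stats['rows']}",
            f"- Total value: {stats['total']}",
            "",
            "## Summary",
            summary,
        ])

Run it twice and the numbers in the report match exactly; only the summary paragraph may differ.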
I agree that this is a choice each person using tools like this has to make, and that it is up to each of us as developers to decide whether such a tool suits the use case at hand.
My own view is that the world is rapidly moving toward more human language programming tools, and that system automation and shell scripting will be part of this. There is a wide array of sensible potential use cases I can see between the two polarized views of "never use an LLM" and "let's vibe code system automation".
Exactly. The above is a terrible idea.
I guess these so-called “developers” these days never thought about why this is needed. Ever.
The “senior/staff” engineers of 2025 are now at the same knowledge level as juniors in 2015, or, with ideas like this, were never “senior” to begin with.