Comment by chr15m

1 day ago

Do you mean you want responses cached to e.g. a file based on the inputs?

Yeah, if it's a novel prompt, by all means send it to the model, but if it's the same prompt as 30s ago, just immediately give me the same response I got 30s ago.

That's typically how we expect bash pipelines to work, right?
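Something like this minimal sketch is what I have in mind (the script name `cached_llm.py` and the wrapped `llm-cli` command are made up for illustration, not a real tool): it keys a file cache on the prompt plus the wrapped command, and replays the cached response if it's less than 30 seconds old.

```python
#!/usr/bin/env python3
# Hypothetical sketch of a stdin-to-stdout caching wrapper for a CLI model call.
# Usage (illustrative only):
#   echo "summarise this" | ./cached_llm.py -- llm-cli --model some-model
# Same prompt within TTL seconds -> replay the cached response instead of
# hitting the model again.

import hashlib
import subprocess
import sys
import time
from pathlib import Path

CACHE_DIR = Path.home() / ".cache" / "llm-pipe-cache"  # assumed location
TTL_SECONDS = 30                                       # "same prompt as 30s ago"


def main() -> int:
    if len(sys.argv) < 3 or sys.argv[1] != "--":
        print("usage: cached_llm.py -- <command> [args...]", file=sys.stderr)
        return 2

    command = sys.argv[2:]
    prompt = sys.stdin.read()

    # Key the cache on both the prompt and the exact command being wrapped.
    key = hashlib.sha256(("\0".join(command) + "\0" + prompt).encode()).hexdigest()
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    cache_file = CACHE_DIR / key

    # Cache hit: replay the earlier response if it's still fresh.
    if cache_file.exists() and time.time() - cache_file.stat().st_mtime < TTL_SECONDS:
        sys.stdout.write(cache_file.read_text())
        return 0

    # Cache miss: run the real command, feed it the prompt, store the output.
    result = subprocess.run(command, input=prompt, capture_output=True, text=True)
    if result.returncode != 0:
        sys.stderr.write(result.stderr)
        return result.returncode

    cache_file.write_text(result.stdout)
    sys.stdout.write(result.stdout)
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

The point is just that the wrapper sits transparently in the pipeline: same input, same output, with the model only consulted on a cache miss.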