Comment by chr15m

2 days ago

Do you mean you want responses cached to e.g. a file based on the inputs?

Yeah, if it's a novel prompt, by all means send it to the model, but if it's the same prompt as 30s ago, just immediately give me the same response I got 30s ago.

That's typically how we expect bash pipelines to work, right?

  • Bash pipelines don't do any caching themselves and execute fresh each time, but I understand your idea and why a cache is useful when iterating on the command line. I'll implement it. Thanks!
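  • The behavior described above can be sketched as a simple file-based cache keyed on a hash of the prompt. This is a minimal sketch, not the actual implementation; `call_model`, the cache directory name, and the JSON layout are all hypothetical stand-ins:

    ```python
    import hashlib
    import json
    from pathlib import Path

    # Hypothetical cache location; the real tool would likely use a proper
    # per-user cache directory instead.
    CACHE_DIR = Path(".llm_cache")

    def cached_call(prompt, call_model):
        """Return a cached response for `prompt` if one exists, otherwise
        call the model, store the result, and return it."""
        CACHE_DIR.mkdir(exist_ok=True)
        # Key the cache on a hash of the input so identical prompts hit
        # the same file.
        key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
        cache_file = CACHE_DIR / f"{key}.json"
        if cache_file.exists():
            return json.loads(cache_file.read_text())["response"]
        # `call_model` is a placeholder for the real model invocation.
        response = call_model(prompt)
        cache_file.write_text(json.dumps({"prompt": prompt, "response": response}))
        return response
    ```

    A second invocation with the same prompt returns the stored response without touching the model, which gives the "same prompt as 30s ago" behavior. A real version would probably also want a TTL or a flag to bypass the cache.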