Comment by chr15m, 1 day ago: Do you mean you want responses cached to e.g. a file based on the inputs?
Reply by __MatrixMan__, 19 hours ago:
Yeah, if it's a novel prompt, by all means send it to the model, but if it's the same prompt as 30s ago, just immediately give me the same response I got 30s ago.
That's typically how we expect bash pipelines to work, right?