Comment by MPSimmons
1 year ago
>And how much data can you give it?
Well, theoretically you can give it up to the context size minus 4k tokens, because 4k is the maximum it can output. In practice, though, its ability to effectively recall information from the prompt drops off as the prompt grows. Some people have studied this a bit - here's one such writeup: https://gritdaily.com/impact-prompt-length-llm-performance/
You should be able to provide more input data than that if the output won't use the full 4k tokens, so the practical limit is context_size minus the expected length of the output.
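The budget arithmetic above can be sketched in a few lines. This is just illustrative; the 128k context window and 4k output cap are hypothetical numbers, and real token counts depend on the model's tokenizer:

```python
def max_input_tokens(context_size: int, expected_output: int,
                     max_output: int = 4096) -> int:
    """Tokens left for the prompt after reserving room for the reply."""
    # The model can never emit more than its output cap, so reserving
    # beyond that just wastes budget.
    reserved = min(expected_output, max_output)
    return context_size - reserved

# Worst case: reserve the full 4k output cap.
print(max_input_tokens(128_000, 4096))  # 123904
# Expecting a short reply frees up more of the window for input.
print(max_input_tokens(128_000, 500))   # 127500
```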