Comment by jeswin
1 year ago
A big limitation with GPT-4 Turbo (and Claude 3) for coding is the output token size. The only way to work around the 4k limit is to generate one file at a time (if it fits), feed it back into the context, generate the next file, and so on.
For this reason, GPT-4-32k is my preferred model for codegen. I wish there were cheaper options.
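The feed-it-back loop above can be sketched roughly like this. Note this is illustrative pseudocode, not a real client: `complete_fn` stands in for an actual API call (e.g. OpenAI's chat completions endpoint), and the message format and "continue" prompt are assumptions, not a documented protocol.

```python
def generate_file(spec, complete_fn, max_rounds=8):
    """Generate output larger than one response by feeding each
    truncated chunk back and asking the model to continue.

    `complete_fn(messages)` is a hypothetical wrapper around a model
    API; it returns (text, finished), where finished is False when
    the response was cut off at the output-token limit.
    """
    messages = [{"role": "user", "content": spec}]
    parts = []
    for _ in range(max_rounds):
        text, finished = complete_fn(messages)
        parts.append(text)
        if finished:
            break
        # Feed the partial output back and request the continuation.
        messages.append({"role": "assistant", "content": text})
        messages.append(
            {"role": "user", "content": "Continue exactly where you left off."}
        )
    return "".join(parts)
```

In practice you would also need to detect and trim overlap between chunks, since models often restate the last line when asked to continue.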
Can you use 32k with Chat?
Chat is a fairly terrible interface for real work, since you can't modify anything, meaning the context can easily get poisoned. I much prefer the API playgrounds, and third-party interfaces, that allow editing both my input and the responses.
What 3rd party tools do you recommend?