Comment by hahahahhaah, 8 hours ago:

But are you losing horsepower of the LLM available for problem solving on a given task by doing so?

Reply by simonw, 7 hours ago:

Maybe a little, but Claude has a 200,000 token context window these days and GPT-5.2 has 400,000 - there's a lot of space.