
Comment by hahahahhaah

12 hours ago

But are you losing horsepower the LLM could otherwise devote to problem-solving on a given task by doing so?

Maybe a little, but Claude has a 200,000-token context window these days and GPT-5.2 has 400,000 - there's a lot of space.
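
For a rough sense of how much of that budget a language spec actually eats, here's a back-of-envelope sketch. It assumes tiktoken's cl100k_base as a proxy tokenizer and a hypothetical dsl_spec.md file; neither model uses exactly this tokenizer, so treat the percentages as approximate:

```python
# Back-of-envelope: what fraction of the context window does a small
# language spec consume? cl100k_base is only a proxy tokenizer here,
# and dsl_spec.md is a stand-in for whatever spec you'd paste in.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

with open("dsl_spec.md") as f:
    spec_tokens = len(enc.encode(f.read()))

for model, window in [("Claude", 200_000), ("GPT-5.2", 400_000)]:
    print(f"{model}: spec uses {spec_tokens}/{window} tokens "
          f"({spec_tokens / window:.1%} of the window)")
```

A few thousand tokens of spec is low single-digit percentages of either window, which is the "a lot of space" point above.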

  • True. You would know this better, but are you also burning "attention" by giving it a new language? Rather than using its familiar Python pathways, it needs to attend more to generate the unseen language: it has to do KV lookups across from the language spec to the new language to the goal, rather than just speaking the Python or JS it is used to speaking.