Comment by mrbungie
7 months ago
Probably they ran a frequency analysis to find the most-used languages, and then focused on scoring high on those languages any way they could, including prompt engineering or context engineering (whatever they're calling it right now).
Or they just chose Python because that's what most AI bros and ChatGPT users use nowadays. (No judgment, I'm a heavy Python user myself.)
No, it's because that's what ChatGPT uses internally to calculate things, manipulate data, display graphs, etc. That's what its "python" tool is all about. The use cases usually have nothing to do with programming: the user is only interested in the end result, and doesn't know or care that it was generated using Python (although it is noted in the interface).
The LLM has to know how to use the tool in order to use it effectively. Hence the documentation in the prompt.
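To make the point concrete, here is a minimal sketch of that round trip. Everything here is illustrative: `TOOL_DOC`, `run_python_tool`, and the in-process `exec` stand in for whatever OpenAI actually does, which is not public.

```python
# Hypothetical sketch of a "python" tool round trip.
# TOOL_DOC and run_python_tool are made-up names, not OpenAI's real API.
import io
import contextlib

# Documentation like this would sit in the system prompt so the model
# knows the tool exists and how to emit code for it.
TOOL_DOC = """
## python
Execute Python in a sandbox. Use it for calculations, data
manipulation, and plotting. Only what the code prints is returned.
"""

def run_python_tool(code: str) -> str:
    """Execute model-emitted code and capture its stdout for the reply."""
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})  # a real sandbox would isolate this far more strictly
    return buf.getvalue()

# e.g. the user asks "what's 17% of 2,340?"; the model emits code
# rather than doing the arithmetic itself, and only the printed
# result is surfaced in the chat.
model_emitted_code = "print(round(2340 * 0.17, 2))"
result = run_python_tool(model_emitted_code)
print(result)  # the user sees the number, not the Python behind it
```

The point of the system-prompt documentation is exactly the `TOOL_DOC` part: without it, the model has no idea the sandbox exists or what its conventions are.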
Oops, I forgot about that. Still, having it in the system prompt seems fragile, but whatever, my bad.