catlifeonmars · 20 days ago
Seems like we should fix the LLMs instead of bending over backwards, no?

    redman25 · 20 days ago
    They're good at it because they've learned from the existing mountains of Python and JavaScript.

        catlifeonmars · 19 days ago
        I think the next big breakthrough will be cost-effective model specialization, maybe through modular models. The monolithic nature of today's models is a major weakness.

    rienbdj · 19 days ago
    Plenty of Java in the training data too.