
Comment by justanotherjoe

1 month ago

Yes, humans are different learners. That's not a problem. At least, we don't know that it's a problem.

Regarding brute force: of course not. We're two years into ChatGPT and people still think it's just an n-gram statistical model, smh. Don't listen to bad YouTubers... not even people like 3b1b, Sabine, Thor, or Primeagen. Why would you, when you can hear it from so many people actually working in AI instead?

Anyway, yes, ChatGPT is huge. But if you used an n-gram statistical model like it's 1980, you'd need the whole universe as a server and it still wouldn't be as good. Big != infinitely big. It's not "brute force".
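A back-of-envelope sketch of the "whole universe as a server" point (the vocabulary and context sizes below are my own ballpark assumptions, not figures from this thread): an n-gram table needs an entry per possible context, which grows as vocab_size ** (context_length - 1).

```python
import math

vocab_size = 50_000      # assumption: roughly a modern LLM's token vocabulary
context_length = 2048    # assumption: a modest LLM-sized context window

# An n-gram table needs one slot per distinct context:
# vocab_size ** (context_length - 1) entries.
# Work with the base-10 exponent to avoid a number with ~9600 digits.
exponent = (context_length - 1) * math.log10(vocab_size)

# The observable universe has on the order of 10^80 atoms.
print(f"n-gram contexts ~ 10^{exponent:.0f}, vs ~10^80 atoms in the universe")
```

So even if every atom in the universe stored one table entry, you'd be short by thousands of orders of magnitude, which is why "it's just a big lookup table" can't be the explanation.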

Maybe that's not what you meant, but I hear this a lot. Anyway, sorry for venting!