Comment by Krei-se
1 month ago
Its learned from synthetic training sets generated by iterating over sources variables or structures, modifying them until the code does not compile, then taking the compilers result as the output.
This does not make the learned rules better btw its like model collapse on steroids, see: https://spectrum.ieee.org/ai-coding-degrades
You get code that is technically correct but is more likely to contain shortcuts by omitting steps or adding irrelevant structures that hit performance (not that you'd care about that) or at worst provide a nonworking solution that still is correct to the checker. Because those sets are easily generated in billions AI can learn with a known 0% loss so no its no surprise it can give you a working WASM binary!!!! Natural language and business logic is WAY MORE complicated and has no specifications with defined behavior.
It`s similar to how you try to repeat a wrong opinion that makes sense to you but is not solving the problem domain and then think just by spamming it repeatedly it will magically become truth.
You assume working code = correct code = good code.
No comments yet
Contribute on Hacker News ↗