Comment by cap11235

6 days ago

https://gist.github.com/cpsquonk/e9a6134e78a2c832161ca973803...

I ran Qwen3-235B (a free model, but you'd probably need a host for something that large; I used Kagi) and Claude Code.

Curious how these look to you.

It actually wrote out the code for all the hard stuff.

I like the Python code, which outsourced the hard stuff to existing libraries; the odds of that working are higher.

Can you tell it to use the "glam" crate for the vectors, instead of writing out things like vector length the long way?

(We now need standardized low-level types more than ever, so the LLMs will use them.)
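For reference, here's a minimal sketch (not from the gist) of what the glam suggestion would look like: glam's `Vec3` replaces hand-written length/dot/cross math with library calls.

```rust
use glam::Vec3;

// Hand-rolled length, the kind of thing a model writes out "the long way":
fn length_manual(v: [f32; 3]) -> f32 {
    (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).sqrt()
}

fn main() {
    let a = Vec3::new(1.0, 2.0, 3.0);
    let b = Vec3::new(4.0, 5.0, 6.0);

    // With glam these are one-line method calls instead of hand-written math:
    let len = a.length(); // vector length
    let d = a.dot(b);     // dot product
    let c = a.cross(b);   // cross product

    // Sanity check: the library and the manual version agree.
    assert!((len - length_manual([1.0, 2.0, 3.0])).abs() < 1e-6);
    println!("len = {len}, dot = {d}, cross = {c}");
}
```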