Comment by TimByte
24 days ago
The scary part isn't "LLMs doing sums." It's that a deterministic model, given the same weights, the same prompt, and the same OS, produces different floating-point tensors on different devices.
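The mechanism behind this is easy to demonstrate: floating-point addition is not associative, so any hardware or kernel that reorders a reduction (as GPUs routinely do) can produce bit-different results from identical inputs. A minimal sketch in plain Python:

```python
# Floating-point addition is not associative: the grouping of the
# operands changes the low-order bits of the result. A GPU kernel
# that reduces a tensor in a different order than another device's
# kernel will therefore produce bit-different tensors from the
# same weights and the same prompt.
a = (0.1 + 0.2) + 0.3   # 0.6000000000000001
b = 0.1 + (0.2 + 0.3)   # 0.6
print(a == b)           # False: same numbers, different grouping
```

The differences start out in the last bit or two, but in a deep network they compound layer after layer, which is why whole output tokens can eventually diverge across devices.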