I think they mean serving inference workloads.
How does that work? Isn't most bitcoin mining done on custom ASICs? I didn't think that the ASIC could be repurposed for inference.
Training ASICs (like Google’s TPUs) can generally run inference too, since inference is a subset of training computations. TPUs are widely used for both.
Mining ASICs (Bitcoin, etc.) cannot be repurposed: they're hardwired for a single hash algorithm and lack the matrix-math units needed for neural networks.
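To make the contrast concrete, here's a minimal sketch (in plain Python, purely for illustration): Bitcoin's proof-of-work is a double SHA-256 over a block header, which is pure bit-shuffling with no multiply-accumulate, while neural-net inference is dominated by matrix multiplies. A mining ASIC bakes the first computation into silicon and simply has no circuitry for the second.

```python
import hashlib

# Bitcoin mining: fixed-function double SHA-256 over an 80-byte block header.
# A mining ASIC implements exactly this pipeline in hardware and nothing else.
def double_sha256(header: bytes) -> bytes:
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

# Neural-net inference: dominated by matrix multiplies (multiply-accumulate),
# the operation TPUs and GPUs have dedicated units for.
def matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

digest = double_sha256(b"\x00" * 80)   # dummy all-zero header
print(len(digest))                      # 32-byte hash
print(matmul([[1, 2]], [[3], [4]]))     # [[11]]
```

The two workloads share essentially nothing at the circuit level, which is why a SHA-256 ASIC can't be turned into an inference accelerator.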
The biggest cost is the power, which is often on multi-year contracts. The hardware is comparatively cheap.