Comment by dash2

16 hours ago

I think it’s interesting what this approach suggests about who will profit from AI. I’m sceptical that owning huge numbers of GPUs is a moat. After all, real humans – even geniuses – are trained on much, much less data than the whole Internet. But proprietary and specialised data could very well be a moat. It’s hard to train a scientist/lawyer/analyst without reading a lot of science/law/finance. Companies’ proprietary data might encode a great deal of irreplaceable knowledge. It seems as if Mistral is taking this bet.

> After all, real humans – even geniuses – are trained on much, much less data than the whole Internet.

It's certainly different data, but one could argue that real humans have also been trained on 3.5 billion years of evolutionary data.