Comment by vineyardmike

2 days ago

> You could have said same about Transformers, Google released it, but didn't move forward, turns out it was a great idea

Google released the Transformer as research because they invented it while improving Google Translate. They had been running it in production for customers for years.

Beyond that, they had publicly used Transformer-based LMs ("MUM") integrated into Search before GPT-3 (pre-chat mode) was even trained. They were shipping Transformer models that generated text for years before the ChatGPT moment. Being literally available on the Google SERP is probably the widest deployment any technology can have today.

Transformers are also used widely in ASR technologies like Google Assistant, which of course was available to hundreds of millions of users.

Finally, they had experimental LLMs available privately to employees, as well as various released research initiatives (Meena, LaMDA, PaLM, BERT, etc.) and other experiments; they just didn't productize everything (but see the earlier points). They even experimented with scaling (see the "Chinchilla scaling laws").