Comment by steveBK123
2 years ago
All of these "AI/ML" driven products seem to have a whizz-bang initial release where they look 80% ready, and then never manage to close the remaining 5/10/15/20%.
Hell, I used Dragon NaturallySpeaking in the late 90s, and the stuff now doesn't even feel 10x better despite billions invested and 10000x the compute.
Self-driving cars feel similar. Always five years away from mass market.
We can tune these systems to be pretty good most of the time, but making them good enough all the time, out to a bunch of 9s, ends up being a moonshot by comparison.
Really curious how much better, and along what dimensions, these generative models / LLMs will actually get in 5/10/15 years.
Dragon NaturallySpeaking was, ironically, more flexible than today's voice tech, despite not being internet-connected (I think?). It's not like you can attempt to write an essay with Alexa or control a browser window with it. What's also funny is how we have this narrative that cloud computing is a necessity for AI, and yet Dragon had NLP that fit on a CD-ROM. Ok, maybe it came on multiple discs... I'm forgetting, but my point still stands.
Most of our advances have been in marketing rather than substance.
The current generation of AI/ML may change that in some way. Dragon NaturallySpeaking may have been a thing in the 90s, but I'm pretty sure we didn't have anything close to GPT or Stable Diffusion.