Comment by rvnx
2 years ago
No, they are likely working on offline LLMs and custom chips, so they'll be fine.
If you can run a large model locally for most cases, you won't want to use Google Cloud services or OpenAI.