Comment by pancomplex
8 days ago
In theory it could -- but we currently still depend on modern hosted LLMs because of their much better latency compared to locally run LLMs.