Comment by pancomplex
7 days ago
In theory it could, but we currently still depend on modern LLMs because their latency is far better than that of locally run LLMs.
Contribute on Hacker News ↗