Comment by yujonglee
5 days ago
Local-first, controllability (the custom-endpoint part), and eventually extensibility (the VSCode part of the post).
We're putting a lot of effort into making it run smoothly on local machines. There are no signups, and the app works without any internet connection after downloading models.
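To make the "custom endpoint" idea concrete: a minimal sketch of what that usually means in practice, assuming the app speaks an OpenAI-compatible chat API (an assumption on my part, not a confirmed detail of this app). Swapping between a cloud provider and a model running on your own machine is then just a base-URL change; the function name `build_chat_request` and the URLs/model names are illustrative:

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions request against any base URL.

    Hypothetical helper for illustration; not this app's actual API.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Pointing at a local server means no signup and no internet needed:
req = build_chat_request("http://localhost:8080", "local-model", "Summarize this meeting.")
```

The point is that "controllability" here is cheap: the request shape stays identical, only the endpoint the app talks to changes.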
One of the things I would want to do: as the meeting is going on, I'd like to ask an LLM what questions I could ask at that point in time, especially if it's a subject I'm not an expert in.
Would I be able to create an extension that could do this?
you can definitely do that in the future. but we had that in mind as well from multiple requests - planning to add "eli5 - explain like i'm five" and "mmss - make me sound smart" ;) (edit: grammar fix)
Wow, does anything like this exist in current commercial tools?
ELI5 sounds useful.
MMSS sounds terrifying though, honestly.