Comment by eavan0
14 hours ago
I'm glad it's not another LLM CLI that uses React. Vibe-cli seems to be built with https://github.com/textualize/textual/
I'm not excited that it's done in Python. I've had experience with Aider struggling to display text as fast as the LLM was spitting it out, though that was probably 6 months ago now.
Python is more than capable of doing that. It’s not an issue of raw execution speed.
https://willmcgugan.github.io/streaming-markdown/
That's an issue with Aider. Using a proper framework in the alternate terminal buffer would have greatly benefited them.
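Something along these lines keeps up fine with a token stream. This is a minimal sketch using Rich's Live display rather than Textual itself, and the fake_token_stream generator is just a stand-in for a real LLM stream:

    # Minimal sketch: naive full re-render of streamed markdown in the
    # alternate terminal buffer. fake_token_stream is a hypothetical
    # stand-in for an actual LLM streaming API.
    import time

    from rich.console import Console
    from rich.live import Live
    from rich.markdown import Markdown


    def fake_token_stream():
        text = "# Streaming demo\n\nPython can repaint a *markdown* view many times per second."
        for i in range(1, len(text) + 1):
            time.sleep(0.01)  # simulate per-chunk network latency
            yield text[:i]


    console = Console()
    # screen=True switches to the alternate terminal buffer, so the output
    # repaints in place instead of scrolling the user's shell history.
    with Live(console=console, screen=True, refresh_per_second=30) as live:
        for partial in fake_token_stream():
            live.update(Markdown(partial))

Even this naive approach (re-parsing the whole document on every chunk) is limited by the terminal, not by Python; the streaming-markdown post linked above describes the smarter incremental rendering Textual does.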