Comment by martzoukos

10 days ago

I guess there is no streaming option for sending generated tokens to, say, an LLM service for real-time processing. A rough sketch of what I mean is below.
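
Something along these lines is what I had in mind. Everything here is hypothetical (the `generate_tokens()` generator and the endpoint URL are stand-ins), since I don't know of an actual streaming hook in the project:

```python
# Minimal sketch only: generate_tokens() and the endpoint URL are hypothetical
# stand-ins for whatever token-by-token output the library might expose.
import requests

def generate_tokens():
    # Placeholder for incremental token generation.
    for token in ["Hello", ",", " world", "!"]:
        yield token

def stream_to_service(url, batch_size=4):
    buffer = []
    for token in generate_tokens():
        buffer.append(token)
        # Forward small batches as soon as they accumulate,
        # instead of waiting for the full generation to finish.
        if len(buffer) >= batch_size:
            requests.post(url, json={"text": "".join(buffer)})
            buffer.clear()
    if buffer:
        requests.post(url, json={"text": "".join(buffer)})

stream_to_service("https://example.com/process")  # hypothetical endpoint
```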