Comment by tpae, 1 day ago:
I've been building with local AI on Apple Silicon. It's only 8 MB, but runs 30% faster than Ollama.
https://github.com/dinoki-ai/osaurus

Reply by mattfrommars, 5 hours ago:
Did you really solo develop this entire application, including dinoki-ai, which appears to be SaaS?