Comment by yunohn

5 days ago

Super cool, congrats on the launch - will be trying this soon! I noticed it’s using Tauri - what are your main takeaways from building a local inference desktop app with it?

Thanks! The main takeaway is that running a server on the Rust side and calling it from the TypeScript frontend works well. For example, we expose an OpenAI-compatible server through a Tauri plugin (https://github.com/fastrepl/hyprnote/tree/main/plugins/local...) and call it from the frontend using the Vercel AI SDK.
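
On the frontend side, the wiring amounts to pointing the AI SDK at the local server. A minimal sketch of that pattern, not the actual Hyprnote code; the port, provider name, and model id below are made-up placeholders:

    import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
    import { streamText } from "ai";

    // Point the SDK at the OpenAI-compatible server the Rust side runs.
    // URL and model id are hypothetical, not Hyprnote's real values.
    const local = createOpenAICompatible({
      name: "local",
      baseURL: "http://localhost:8080/v1",
    });

    const { textStream } = streamText({
      model: local("some-local-model"),
      prompt: "Summarize the last meeting.",
    });

    for await (const chunk of textStream) {
      console.log(chunk);
    }

The nice part of this split is that the frontend stays completely model-agnostic: anything that speaks the OpenAI wire format, local or remote, can sit behind that baseURL.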