Comment by politelemon

3 days ago

Very nice, good work. I think you should add the chat app links to the README, so that visitors get a good idea of what the framework is capable of.

The performance is quite good, even on CPU.

However, I'm now trying it on a Pixel, and it's not using the GPU even when I enable it.

I do like this idea, as I've been running models in Termux until now.

Is the plan to make this app something similar to LM Studio for phones?

Appreciate the feedback! Made the demo links more prominent on the README.

GPU support isn't available on every Android phone model yet; we'll be addressing that as we move to our own kernels.

The app itself is just a demonstration of Cactus's performance. The underlying framework gives you the tools to build any local mobile AI experience you'd like.
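
To give a feel for that, here's roughly the shape of a chat turn built on a local inference engine. This is a simplified sketch: the `LocalLLM` interface and the `load`/`complete` names are illustrative placeholders, not the exact Cactus API.

```typescript
// Hypothetical sketch of a chat loop on top of a local-inference
// framework. Names below are illustrative, not the real Cactus API.

interface Message {
  role: "user" | "assistant";
  content: string;
}

interface LocalLLM {
  // Load a model file from device storage, optionally requesting GPU.
  load(modelPath: string, opts?: { useGpu?: boolean }): Promise<void>;
  // Run a chat completion, streaming tokens through a callback.
  complete(
    messages: Message[],
    onToken: (token: string) => void
  ): Promise<string>;
}

// One turn of a chat: append the user's input, stream the reply,
// and return the updated history.
async function runChatTurn(
  llm: LocalLLM,
  history: Message[],
  userInput: string
): Promise<Message[]> {
  const messages = [...history, { role: "user" as const, content: userInput }];
  let reply = "";
  await llm.complete(messages, (token) => {
    reply += token; // tokens arrive incrementally, e.g. to update UI state
  });
  return [...messages, { role: "assistant", content: reply }];
}
```

The streaming callback is the important part for mobile: tokens can be rendered as they arrive, so the UI stays responsive even on CPU-only devices.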