Comment by lambda

2 days ago

[flagged]

I added it since many people who use Unsloth don't know how to compile llama.cpp, so from the Python side the only options are to either (1) install it via apt-get from within the Python process, or (2) error out, tell the user to install it first, and then continue.

I chose (1) mainly for ease of use for the user - but I agree it's not a good idea, sorry!

:( I also added a section on manually compiling llama.cpp here: https://docs.unsloth.ai/basics/troubleshooting-and-faqs#how-...

But I agree I should remove the apt-get calls - will do this ASAP! Thanks for the suggestions :)
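
A minimal sketch of what option (2) could look like - checking for llama.cpp and failing with instructions instead of shelling out to apt-get. The helper name and the binary it checks for are illustrative, not Unsloth's actual code:

    import shutil

    def require_llama_cpp(binary: str = "llama-quantize") -> str:
        """Return the path to a llama.cpp binary, or fail with instructions.

        The binary name is illustrative - llama.cpp ships several executables,
        and older releases used different names (e.g. `quantize`).
        """
        path = shutil.which(binary)
        if path is None:
            raise RuntimeError(
                f"`{binary}` was not found on PATH. Please compile llama.cpp "
                "(see https://docs.unsloth.ai/basics/troubleshooting-and-faqs) "
                "or install it with your distro's package manager, then re-run "
                "this export step."
            )
        return path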

  • Hey man, I was reading your comments, and you do seem to respond to everyone nicely regarding this sudo shenanigan.

    I think you have removed sudo now, which is nice. My suggestion is pretty similar to pxc's: detect which distro the user is on and use that distro's native package manager (a rough sketch follows this exchange).

    I wonder if we will ever get a working universal package manager on Linux. To me Flatpak genuinely makes the most sense, sometimes even for CLI tools, but Flatpak isn't built for CLI, unlike Snap, which supports both CLI and GUI - though Snap is proprietary.

    • Hey :) I love suggestions - keep them coming! :)

      I agree on handling different distros - sadly I'm not familiar with others, so any help would be appreciated! For now I'm most familiar with apt-get, but would 100% want to expand out!

      Interesting, will check Flatpak out!
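
    A rough sketch of the distro-detection idea, assuming Linux with /etc/os-release and Python 3.10+; the ID-to-command mapping is illustrative and incomplete, and it only suggests a command rather than running anything:

        import platform

        # Illustrative, deliberately incomplete mapping from /etc/os-release IDs
        # to install commands - the sketch only *suggests* a command, never runs one.
        PACKAGE_MANAGER_HINTS = {
            "ubuntu": "sudo apt-get install",
            "debian": "sudo apt-get install",
            "fedora": "sudo dnf install",
            "arch": "sudo pacman -S",
            "opensuse-leap": "sudo zypper install",
        }

        def suggest_install_command(package: str) -> str:
            """Build an install hint for the current distro."""
            info = platform.freedesktop_os_release()  # Python 3.10+, reads /etc/os-release
            candidates = [info.get("ID", "")] + info.get("ID_LIKE", "").split()
            for distro_id in candidates:
                if distro_id in PACKAGE_MANAGER_HINTS:
                    return f"{PACKAGE_MANAGER_HINTS[distro_id]} {package}"
            return f"install `{package}` with your distro's package manager"

        print(suggest_install_command("cmake"))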

  • Have you considered Cosmopolitan? E.g. llamafile, which works on everything up to and including toasters.

    • Oh, llamafile is very cool! I might actually add it as an option :) For generic exports (i.e. to vLLM, llamafile, etc.), finetunes normally end with model.save_pretrained_merged, which auto-merges the adapters into 16-bit safetensors and allows further processing downstream - but I'll investigate llamafile more! (Good timing, since llamafile is cross-platform!)
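
      A short sketch of that export path, based on Unsloth's documented API; the model name and exact keyword arguments here are assumptions and may differ between versions:

          from unsloth import FastLanguageModel

          # Load a 4-bit base model to finetune (the model name is just an example).
          model, tokenizer = FastLanguageModel.from_pretrained(
              "unsloth/llama-3-8b-bnb-4bit",
              max_seq_length=2048,
              load_in_4bit=True,
          )

          # ... finetuning happens here ...

          # Merge the LoRA adapters into the base weights and write 16-bit
          # safetensors, which downstream tools (vLLM, llama.cpp conversion,
          # llamafile packaging) can then pick up.
          model.save_pretrained_merged("merged_model", tokenizer,
                                       save_method="merged_16bit")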

> Was this whole thing hallucinated by an LLM?

Probably not, because LLMs are a little more competent than this.