Comment by exhilaration 8 months ago

Local LLMs are almost here, no Internet needed!
mystraline 8 months ago

Almost?

I've been running a programming LLM locally, with a 200k context length, using system RAM.

It's also an abliterated model, so I get none of the moralizing or forced ethics either. I ask, and it answers.

I even have it hooked up to my Home Assistant and can trigger complex actions from there.
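For readers curious what "hooked up to Home Assistant" can look like: Home Assistant can use a locally hosted model as its conversation agent (e.g. via the Ollama integration), and complex actions are typically wired up as automations the assistant can fire. The YAML below is a minimal hypothetical sketch using Home Assistant's sentence (conversation) trigger — the phrase, entity IDs, and actions are placeholders, not the commenter's actual configuration.

```yaml
# Hypothetical Home Assistant automation (sketch, not the commenter's setup).
# A spoken or typed phrase handled by the local conversation agent triggers
# a multi-step action; all entity IDs below are placeholders.
automation:
  - alias: "Movie night via local assistant"
    trigger:
      - platform: conversation
        command:
          - "movie night"
    action:
      - service: light.turn_on
        target:
          entity_id: light.living_room
        data:
          brightness_pct: 20
      - service: media_player.turn_on
        target:
          entity_id: media_player.tv
```

With an LLM-backed agent, free-form phrasing that doesn't match a fixed sentence can still be mapped onto exposed entities and services, which is what makes "complex actions" from natural requests practical.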
robotnikman 8 months ago

What model are you using, and what kind of hardware are you running it on?