Comment by 3abiton 9 days ago

Depends: what is your setup? You can always find more support on r/Localllama.
WillAdams 9 days ago
Using Copilot, and currently running jan.ai. /r/Localllama seems to tend towards the typical Reddit cesspool.
Let me rephrase:
What locally-hosted LLM would be suited to batch processing image files?
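For context on what "batch processing" might look like here, a minimal sketch: the usual pattern with a locally hosted vision model (e.g. one served through an Ollama-style HTTP endpoint) is to loop over a folder, base64-encode each image, and build one request payload per file. The model name "llava", the `*.png` glob, and the payload shape are assumptions to be adjusted for whatever server and model you actually run; this only constructs the payloads and does not send them.

```python
import base64
import json
from pathlib import Path


def build_request(image_path: Path, prompt: str, model: str = "llava") -> str:
    """Build a JSON payload for an Ollama-style /api/generate endpoint.

    The model name and payload shape are assumptions; adjust for the
    local server you actually run.
    """
    encoded = base64.b64encode(image_path.read_bytes()).decode("ascii")
    return json.dumps(
        {
            "model": model,
            "prompt": prompt,
            "images": [encoded],  # base64-encoded image bytes
            "stream": False,      # ask for a single complete response
        }
    )


def batch_payloads(folder: Path, prompt: str):
    """Yield (filename, JSON payload) pairs for every PNG in a folder."""
    for path in sorted(folder.glob("*.png")):
        yield path.name, build_request(path, prompt)
```

Any model with image input served locally (LLaVA-family models are a common choice) could then consume these payloads one at a time, which keeps the batch loop trivial to script.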