I have a prompt that works for a single file in Copilot, but it's slower than doing it manually: opening the file, finding the one specific piece of information, re-saving it, running a .bat file to rename it with more of that information, and then filling out the last two bits when entering things.
Do you mean you want to process multiple files with a single LLM call or process multiple files using the same prompt across multiple LLM calls?
(I would recommend the latter)
Multiple files with a single LLM call.
It depends. What is your setup? You can always find more support on r/Localllama.
Using Copilot, and currently running jan.ai. As for /r/Localllama, it seems to tend towards the typical Reddit cesspool.
Let me rephrase:
What locally-hosted LLM would be suited to batch processing image files?
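For what it's worth, here is a minimal sketch of the per-file approach recommended above (same prompt, one call per image) against a locally hosted, OpenAI-compatible server such as the one Jan can expose. The base URL, port, model name, prompt, and input folder below are all assumptions, and the server must accept OpenAI-style image messages for a vision-capable model; swap in whatever your setup actually provides.

```python
# Hedged sketch: batch-rename image files using a locally hosted,
# OpenAI-compatible vision model (e.g. one served by Jan or a similar
# local runner). Endpoint, model name, prompt, and folder are assumptions.
import base64
import mimetypes
from pathlib import Path

from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="http://localhost:1337/v1",  # assumed local endpoint
    api_key="not-needed-locally",         # placeholder; local servers often ignore it
)

PROMPT = "Read this image and reply with only the document number."  # example prompt

def extract_field(image_path: Path) -> str:
    """Send one image to the local model and return its text reply."""
    mime = mimetypes.guess_type(image_path.name)[0] or "image/png"
    data = base64.b64encode(image_path.read_bytes()).decode()
    response = client.chat.completions.create(
        model="llava-1.6",  # assumed vision-capable model name
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": PROMPT},
                {"type": "image_url",
                 "image_url": {"url": f"data:{mime};base64,{data}"}},
            ],
        }],
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    for path in sorted(Path("scans").glob("*.png")):  # assumed input folder
        field = extract_field(path)
        # Sanitise the model's answer before using it in a filename.
        safe = "".join(c for c in field if c.isalnum() or c in "-_")
        path.rename(path.with_name(f"{safe}_{path.name}"))
        print(f"{path.name} -> {safe}_{path.name}")
```

One call per file keeps failures isolated: if the model misreads one image you can retry just that file, which is harder to do when many files are stuffed into a single prompt.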