amelius 7 hours ago You can use a local LLM, and you can ask it to use tools, so it is faster.
sigseg1v 6 hours ago "So it is faster" than what? A cloud-hosted LLM? That's a pretty low bar. It's certainly not faster than jq.
kelvinjps10 7 hours ago There is hardware that is able to run jq but not a local AI model powerful enough to make the filtering reliable, e.g. a Raspberry Pi.
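For concreteness, the kind of filtering being compared might look like this in jq (the JSON and field names here are made up for illustration):

```shell
# Hypothetical data: extract the names of active users from a JSON array.
# This kind of filter runs in milliseconds, even on constrained hardware
# like a Raspberry Pi, where a capable local LLM won't fit.
echo '[{"name":"ada","active":true},{"name":"bob","active":false}]' \
  | jq -r '.[] | select(.active) | .name'
# → ada
```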
"so it is faster" than what? A cloud hosted LLM? That's a pretty low bar. It's certainly not faster than jq.
There is hardware that is able to run jq but no a local AI model that's powerful enough to make the filtering reliable. Ex a raspberry pi