Comment by Der_Einzige

5 days ago

Zero discussion of LLM sampling. How do you leave such a gaping hole in a piece like this? I know it's not AI, because AI wouldn't be that sloppy.

Local inference users are all about sampling, while users addicted to commercial inference services are wary of it, since they have to pay by the token.
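For anyone unfamiliar, "sampling" here means the decoding-time knobs local users tinker with (temperature, top-p, etc.). A minimal sketch of temperature plus nucleus (top-p) sampling over a logit vector, using made-up logits, might look like:

```python
import numpy as np

def sample_next_token(logits, temperature=0.8, top_p=0.9, rng=None):
    """Temperature + nucleus (top-p) sampling over one step's logits."""
    rng = rng or np.random.default_rng()
    # Temperature scaling: lower values sharpen the distribution.
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    probs = np.exp(scaled - scaled.max())  # softmax, numerically stable
    probs /= probs.sum()
    # Nucleus filter: keep the smallest set of tokens whose mass >= top_p.
    order = np.argsort(probs)[::-1]
    cumulative = np.cumsum(probs[order])
    cutoff = np.searchsorted(cumulative, top_p) + 1
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()  # renormalize the nucleus
    return int(rng.choice(keep, p=kept))

# Hypothetical logits for a 4-token vocabulary.
token = sample_next_token([10.0, 0.0, 0.0, 0.0], temperature=0.5, top_p=0.5)
```

These two knobs alone (plus top-k, min-p, repetition penalties, and friends) are exactly the surface local-inference tooling exposes and commenters like this one care about.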