andersa · 8 months ago
TensorRT-LLM being open source is a lie; all the important kernels are loaded from cubins.

nabakin · 8 months ago (reply)
Yeah, you're right (although they started to open source some of that recently, iirc). I meant SOTA for inference engines we can actually download and use ourselves.