dataversity 1 month ago: Also, is there a way I can invoke the models, or is there an API which your tool exposes?
sarthaksaxena 1 month ago:
Yes, indeed there is: run `llmpm serve <model_name>`, which will expose an API endpoint at http://localhost:8080/v1/chat/completions and also host a chat UI at https://localhost:8080/chat where you can interact with the locally running model.
Follow the docs here: https://www.llmpm.co/docs
Pro tip for your use case: check out the `llmpm serve` section.
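For anyone who wants a quick start, here's a minimal sketch of calling that endpoint from Python. It assumes the `/v1/chat/completions` route accepts the usual OpenAI-style chat-completions JSON (`model` + `messages`); the model name `"my-model"` is a placeholder, so check the docs linked above for the exact schema your `llmpm` version expects.

```python
import json
import urllib.request
import urllib.error

def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload (assumed schema)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# "my-model" is a placeholder — use the name you passed to `llmpm serve`.
payload = build_request("my-model", "Hello, what can you do?")

try:
    req = urllib.request.Request(
        "http://localhost:8080/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(json.load(resp))
except (urllib.error.URLError, OSError):
    # Reached when no server is listening on localhost:8080.
    print("Server not reachable — start it with `llmpm serve <model_name>` first.")
```

The try/except just makes the script fail gracefully if the server isn't running yet; once `llmpm serve` is up, the `print` should show the JSON response from the model.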