Comment by gabrielruttner

5 days ago

gabe, hatchet cofounder here. thanks for this feedback and i agree!

under the hood we're using the vercel ai sdk to make tool calls, so this is easily extended [1]. it's the only "opinionated" api for calling llm providers that's "bundled" within the sdk, and we were torn on how to expose it for this exact reason, but since it's such a common need we decided to include it.
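
for anyone unfamiliar, the ai sdk's tool-calling surface looks roughly like this (a minimal sketch of the sdk's v4-style api, not pickaxe's actual wiring; the model choice and the weather tool are just placeholders):

```ts
// minimal sketch of tool calling with the vercel ai sdk (v4-style api);
// the model and the weather tool are placeholders, not pickaxe internals
import { generateText, tool } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const result = await generateText({
  model: openai("gpt-4o"),
  tools: {
    getWeather: tool({
      description: "Get the current weather for a city",
      // the sdk validates the model's arguments against this zod schema
      // before calling execute
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, tempC: 21 }),
    }),
  },
  maxSteps: 2, // let the model call the tool, then produce a final answer
  prompt: "What's the weather in Berlin?",
});

console.log(result.text);
```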

some things we were considering are overloading `defaultLanguageModel` with a map for different use cases, or allowing users to "eject" the tool picker and customize it as needed. i've opened a discussion [2] to track this.
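
to make that concrete, the shape could look something like this (purely hypothetical, the key names are made up and nothing like this exists in pickaxe today):

```ts
// hypothetical config shape -- these keys don't exist in pickaxe today,
// they just illustrate the two options being discussed
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";
import type { LanguageModel } from "ai";

type ToolChoice = { name: string; args: unknown };

interface HypotheticalAgentConfig {
  // option 1: a map of models keyed by use case instead of a single default
  defaultLanguageModel:
    | LanguageModel
    | Record<"toolPicker" | "summarization" | "fallback", LanguageModel>;
  // option 2: "eject" the tool picker entirely and bring your own
  pickTool?: (input: { prompt: string; tools: string[] }) => Promise<ToolChoice>;
}

const config: HypotheticalAgentConfig = {
  defaultLanguageModel: {
    toolPicker: anthropic("claude-3-5-sonnet-latest"),
    summarization: openai("gpt-4o-mini"),
    fallback: openai("gpt-4o"),
  },
  pickTool: async ({ prompt, tools }) => {
    // call whatever provider/sdk you want here and return the chosen tool
    return { name: tools[0], args: { prompt } };
  },
};
```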

[1] https://github.com/hatchet-dev/pickaxe/blob/main/sdk/src/cli...

[2] https://github.com/hatchet-dev/pickaxe/discussions/3

I think providing examples and sample code is better than tying your API to the AI SDK.

Because AI providers iterate on their APIs so quickly, many features arrive in the AI SDK weeks or months later (support for OpenAI computer use has been pending forever, for example).

I like the current API where you can wait for an event. Similarly, it would be great to have an API for streaming and receiving messages where everything else is left to the user, so they could use the AI SDK themselves and stream the final response manually.
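
Something in this spirit (the agent-side hook here is hypothetical; only the AI SDK streaming call itself is real):

```ts
// Sketch of the kind of API I mean: the framework hands me the accumulated
// messages and lets me produce and stream the final response however I want.
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";

// Hypothetical hook name and shape -- not an existing Pickaxe API.
async function finalResponder(context: {
  messages: { role: "user" | "assistant"; content: string }[];
}) {
  const result = streamText({
    model: openai("gpt-4o"),
    messages: context.messages,
  });

  // I control the stream: forward chunks to my own transport (SSE, websocket, ...)
  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }

  return result.text; // resolves to the full text once the stream completes
}
```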