
Comment by martinbooth

7 hours ago

One example I have that made me excited for this feature is the free recipe manager website I run.

Many of the paid competitors now let users import unstructured recipe data from sites like Instagram, or at least from text-only websites.

I can't afford to offer this feature, since my website has no advertising and I pay for it out of pocket, but it's an incredibly easy feature to add if you have the money to pay for tokens.

If I could instead use a local LLM that runs in the user's own browser, though, I think it would definitely be valuable.

That said, I'm not sure the current state of local LLMs provides a good enough experience (the models are small and slow), but that doesn't mean it won't be useful in the future.

The proposed APIs do work for this purpose, albeit more slowly and at lower quality.
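For concreteness, here is a minimal sketch of what in-browser recipe extraction could look like. The `LanguageModel.create()` / `session.prompt()` shape is an assumption based on the in-progress built-in AI proposals and may well change; the helper functions and the JSON schema in the prompt are my own illustration, not part of any spec.

```javascript
// Pure helper: build an extraction prompt for unstructured recipe text.
// The JSON schema here is a hypothetical choice for illustration.
function buildRecipePrompt(rawText) {
  return [
    "Extract the recipe below as a JSON object with keys",
    '"title" (string), "ingredients" (array of strings),',
    'and "steps" (array of strings). Respond with JSON only.',
    "",
    rawText,
  ].join("\n");
}

// Pure helper: small models often wrap JSON in extra prose,
// so grab the first {...} span rather than parsing the raw output.
function parseRecipeJson(modelOutput) {
  const match = modelOutput.match(/\{[\s\S]*\}/);
  if (!match) throw new Error("no JSON object in model output");
  return JSON.parse(match[0]);
}

// Browser-only part. Assumes the proposed built-in AI API shape
// (LanguageModel.create / session.prompt) — names are speculative.
async function extractRecipe(rawText) {
  if (typeof LanguageModel === "undefined") {
    throw new Error("built-in AI not available in this browser");
  }
  const session = await LanguageModel.create();
  const output = await session.prompt(buildRecipePrompt(rawText));
  return parseRecipeJson(output);
}
```

The two pure helpers matter here because a small local model is less reliable at emitting clean JSON than a large hosted one, so tolerant parsing is doing real work, not just tidying.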