Comment by phillipcarter

2 years ago

Congrats on the release! I'm keenly interested in this space, as I believe that Observability is one of the top ways to steer LLMs to be more reliable in production.

I noticed your SDKs use tracing concepts! Are there plans to implement OpenTelemetry support?

Thank you so much. We fully share your sentiment on this and aligned our domain language with OpenTelemetry. Currently, users add lots of metadata and configuration details to the trace by manually instrumenting it with the SDKs (or via the Langchain integration). We are thinking about integrating OpenTelemetry, as this would be a step change in making integrations with apps easier. However, we haven't had the time yet to figure out how to capture all the metadata that's relevant as context for the trace.

  • Makes sense! If you're curious, I added an auto-instrumentation library for OpenAI's Python client here: https://github.com/cartermp/opentelemetry-instrument-openai-...

    The main challenge I see is that since there's no standard for LLM inputs/outputs (let alone retrieval APIs!), any kind of automatic instrumentation will need a bunch of adapters. I suppose LangChain helps here, but even then, with so many folks ripping it out for production, you're still in the same place.

    Happy to collaborate on any design thinking for how to incorporate OTel support!

    • Yes, we were thinking about the lack of standards as well. I would be super happy to have a design discussion around the topic; I will reach out to you.
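
The adapter approach discussed above can be sketched as a small normalization layer; the response shapes and attribute names here are hypothetical illustrations, not the actual instrumentation library's design:

```python
from typing import Any, Callable, Dict

# Each adapter maps one provider's response shape onto a common set
# of attribute names (names are illustrative, not a standard).
def openai_adapter(resp: Dict[str, Any]) -> Dict[str, Any]:
    return {
        "llm.model": resp["model"],
        "llm.completion": resp["choices"][0]["message"]["content"],
        "llm.tokens.total": resp["usage"]["total_tokens"],
    }

def other_vendor_adapter(resp: Dict[str, Any]) -> Dict[str, Any]:
    # A provider with a flat response shape, for contrast.
    return {
        "llm.model": resp["model"],
        "llm.completion": resp["completion"],
        "llm.tokens.total": resp.get("total_tokens", -1),
    }

ADAPTERS: Dict[str, Callable[[Dict[str, Any]], Dict[str, Any]]] = {
    "openai": openai_adapter,
    "other": other_vendor_adapter,
}

def normalize(vendor: str, resp: Dict[str, Any]) -> Dict[str, Any]:
    """Translate a vendor-specific response into common span attributes."""
    return ADAPTERS[vendor](resp)
```

Until a shared semantic convention exists, each new provider (or breaking API change) means another adapter to maintain, which is the crux of the standards problem raised in this thread.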

Shameless plug for https://proc.gg since you asked about OpenTelemetry support. The observability features are built on OTel, and I plan to open source it if there is considerable interest.