
Comment by pooper

5 hours ago

Speaking of OpenTelemetry: I try to use it in my personal projects, an ASP.NET app as well as a .NET console app. I don't have the required corporate background in OpenTelemetry, so I had to add my own file log exporter. I didn't write it myself -- I used Claude to write it for me in JSONL format, which seemed like a good way to keep each row as JSON. For the console app, I get a file something like this:

```
logs_2025-12-24_0003.jsonl
```

I asked Claude to keep it in an XDG folder and it chose

```
/home/{username}/.local/share/{applicationName}/telemetry/logs
```
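For what it's worth, a custom exporter like that is fairly small in the OpenTelemetry .NET SDK. This is only a sketch of the general shape (the class name, field selection, and file handling here are my own, not the commenter's actual code), using the SDK's `BaseExporter<LogRecord>` extension point:

```csharp
// Minimal sketch of a JSONL file log exporter, assuming the OpenTelemetry
// .NET SDK (BaseExporter<T>, Batch<T>, LogRecord). JsonlFileLogExporter and
// the chosen fields are illustrative, not the commenter's actual code.
using System.IO;
using System.Text.Json;
using OpenTelemetry;
using OpenTelemetry.Logs;

public sealed class JsonlFileLogExporter : BaseExporter<LogRecord>
{
    private readonly StreamWriter _writer;

    public JsonlFileLogExporter(string path)
    {
        Directory.CreateDirectory(Path.GetDirectoryName(path)!);
        _writer = new StreamWriter(path, append: true);
    }

    public override ExportResult Export(in Batch<LogRecord> batch)
    {
        foreach (var record in batch)
        {
            // One JSON object per line -- the "jsonl" format described above.
            _writer.WriteLine(JsonSerializer.Serialize(new
            {
                timestamp = record.Timestamp,
                severity = record.LogLevel.ToString(),
                category = record.CategoryName,
                body = record.FormattedMessage,
            }));
        }
        _writer.Flush();
        return ExportResult.Success;
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing) _writer.Dispose();
        base.Dispose(disposing);
    }
}
```

You'd register it with a `SimpleLogRecordExportProcessor` (or the batching variant) when configuring logging.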

I also have folders for metrics and traces but those are empty.

I have never had a need to look at the logs for the .NET console app, and the only reason I have looked at the logs on the ASP.NET app was to review errors when I ran into one -- which, frankly, I don't need OpenTelemetry for.

What am I missing here? Am I using it wrong?

If you use OpenTelemetry, where do your logs, metrics, and traces go? Do you write your own custom classes to write them to a file on disk? Do you pay for something like Datadog (congratulations on winning the lottery, I guess)?

I appreciate your reply. Thank you for helping me learn.

By default, OpenTelemetry will try to send traces to a remote HTTP endpoint (OTLP), unless a local exporter is configured in the code base to avoid that behaviour.
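Concretely, keeping traces local is just a matter of which exporter you add when building the tracer provider. A sketch, assuming the `OpenTelemetry` and `OpenTelemetry.Exporter.Console` NuGet packages (the `"MyConsoleApp"` source name is made up):

```csharp
// Sketch: route traces to a local exporter instead of the default remote
// OTLP endpoint. Assumes OpenTelemetry + OpenTelemetry.Exporter.Console.
using OpenTelemetry;
using OpenTelemetry.Trace;

using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource("MyConsoleApp")   // hypothetical ActivitySource name
    .AddConsoleExporter()        // local: spans are printed to stdout
    // .AddOtlpExporter()        // this is the remote-HTTP-endpoint path
    .Build();
```

Swap the console exporter for a file exporter (like the one described above) and nothing ever leaves the machine.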

OTel has value for things crossing microservices in production, much less so locally for a single app.

Languages running on .NET or the JVM have already enjoyed great tooling for quite a few years, so locally that tooling is often much better than OTel.

OpenTelemetry is a "narrow waist".

That is, it defines a relatively small, interoperable interface that a lot of distinct products from many different vendors can "sink" their telemetry into, and then on the other end of this narrow waist a bunch of different consumers can "source" the data from.

Think of it as a fancy alternative to ILogger and similar abstractions that is cross-platform and cross-vendor. Instead of Microsoft-specific or Java-specific (or whatever-specific) sources with their own protocols and conventions, there's a single standard data schema that everybody can speak. It's like TCP/IP or JSON.

So your question is in some sense nonsense. It's like asking "what do you use TCP/IP for?" or "where do you put your JSON?"

The answer is: wherever you want! That's the whole point.

In Azure, you would use Application Insights, as a random example. New Relic, DataDog, Prometheus, Zipkin, Elasticsearch, or... just your console output. Simple text log files. A SQL database. Wherever!

In more practical terms, for a solo developer working on personal projects, use .NET Aspire with Visual Studio 2026. You'll get a free "local" alternative to an APM like Application Insights or DataDog that collects the traces. Keep using the "standard" interfaces like ILogger; they forward everything to OTel if you're using it.
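The "keep using ILogger" wiring looks roughly like this in a minimal ASP.NET app -- a sketch assuming the `OpenTelemetry.Extensions.Hosting` and `OpenTelemetry.Exporter.Console` packages (the route and messages are invented for illustration):

```csharp
// Sketch: standard ILogger calls flow into the OpenTelemetry logging
// pipeline, which then decides the destination (console here, but it could
// be OTLP, a file exporter, the Aspire dashboard, etc.).
var builder = WebApplication.CreateBuilder(args);

builder.Logging.AddOpenTelemetry(options =>
{
    options.IncludeFormattedMessage = true;
    options.AddConsoleExporter();   // swap for whatever sink you like
});

var app = builder.Build();

app.MapGet("/", (ILogger<Program> log) =>
{
    log.LogInformation("Hello from ILogger"); // ends up in the OTel pipeline
    return "ok";
});

app.Run();
```

Application code never mentions OTel directly -- it just logs through `ILogger` -- which is exactly the "narrow waist" point above.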