
Comment by shortishly

24 days ago

I agree that it isn't straightforward! Tansu uses the JSON protocol descriptors from Apache Kafka, generating ~60k LoC of Rust to represent the structures. It then uses a custom Serde encoder/decoder to implement the protocol: the original, flexible, and tagged-buffer formats for every API version (e.g., the 18 versions of FETCH alone). It is based on spending the past ~10 years using Kafka and writing/maintaining an Erlang client (there are no "good" Kafka clients for Erlang!). It also uses a bunch of collected protocol examples to encode/decode during the tests. Tansu is also a Kafka proxy, which is used to feed some of those tests as well.

Some of the detail: https://blog.tansu.io/articles/serde-kafka-protocol
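To give a flavour of what "flexible versions" means on the wire (this is a minimal illustrative sketch, not Tansu's actual generated code; all names are hypothetical): flexible API versions use compact strings (an unsigned varint of length + 1) and append a tagged-field buffer, while older versions use INT16-prefixed strings and no tag section, so the encoder has to branch on the negotiated version.

```rust
// Sketch of a version-aware encoder for one "generated" struct.
// The real code is generated from Kafka's JSON descriptors; this
// hand-written example only shows the flexible vs. legacy split.

fn put_unsigned_varint(buf: &mut Vec<u8>, mut v: u32) {
    // Kafka's unsigned varint: 7 bits per byte, MSB = continuation.
    loop {
        let byte = (v & 0x7f) as u8;
        v >>= 7;
        if v == 0 {
            buf.push(byte);
            break;
        }
        buf.push(byte | 0x80);
    }
}

/// Hypothetical struct standing in for a generated request type.
struct ExampleRequest {
    correlation_id: i32,
    client_id: String,
}

impl ExampleRequest {
    /// `flexible` would normally come from the protocol descriptor
    /// for the API key and version being encoded.
    fn encode(&self, flexible: bool) -> Vec<u8> {
        let mut buf = Vec::new();
        buf.extend_from_slice(&self.correlation_id.to_be_bytes());
        if flexible {
            // COMPACT_STRING: unsigned varint of (len + 1), then bytes.
            put_unsigned_varint(&mut buf, self.client_id.len() as u32 + 1);
            buf.extend_from_slice(self.client_id.as_bytes());
            // Empty tagged-field buffer: a varint count of 0.
            put_unsigned_varint(&mut buf, 0);
        } else {
            // Legacy STRING: INT16 big-endian length, then bytes.
            buf.extend_from_slice(&(self.client_id.len() as i16).to_be_bytes());
            buf.extend_from_slice(self.client_id.as_bytes());
        }
        buf
    }
}

fn main() {
    let req = ExampleRequest { correlation_id: 7, client_id: "tansu".into() };
    println!("legacy:   {:02x?}", req.encode(false));
    println!("flexible: {:02x?}", req.encode(true));
}
```

Multiply that by every field, array, nullable type, and nested struct across every API version and it becomes clear why generating the structures and centralising the version logic in one Serde encoder/decoder pays off.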

However, I am sure there are cases where Tansu isn't compatible. For example, Kafka UI (kafbat) reports a strange error when doing a fetch (despite actually showing the fetched data), which I've yet to get to the bottom of.

If you find any compatibility issues, please raise an issue and I'll take a look.