Comment by huntergemmer
21 hours ago
Interesting idea - I haven't explored wavelet-based approaches, but the intuition makes sense: decompose into frequency bands, keep the low-frequency trend, and selectively preserve high-frequency peaks that exceed some threshold.
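Roughly what I'm picturing, as a sketch (using PyWavelets; the wavelet, decomposition level, and keep ratio are arbitrary placeholders, and this reconstructs a full-length signal rather than returning fewer points - for actual downsampling you'd keep the surviving coefficients instead):

```python
# Rough sketch: wavelet-decompose, keep the low-frequency approximation intact,
# and keep only the detail (high-frequency) coefficients that survive a threshold.
# Requires numpy and PyWavelets (pip install pywavelets).
import numpy as np
import pywt

def wavelet_reduce(y, wavelet="db4", level=4, keep_ratio=0.05):
    """Hard-threshold the detail coefficients, then reconstruct.

    keep_ratio is a placeholder knob: roughly the fraction of detail
    coefficients allowed to survive. The approximation band (the trend)
    is always kept as-is.
    """
    coeffs = pywt.wavedec(y, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]

    # Pick a global threshold so that about keep_ratio of the detail
    # coefficients end up above it.
    flat = np.abs(np.concatenate(details))
    thresh = np.quantile(flat, 1.0 - keep_ratio)

    kept = [approx] + [pywt.threshold(d, thresh, mode="hard") for d in details]
    return pywt.waverec(kept, wavelet)[: len(y)]

# Toy example: a smooth trend plus noise plus one sharp spike that naive
# averaging/decimation would smear out.
t = np.linspace(0, 1, 4096)
y = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.randn(t.size)
y[2000] += 5.0
reduced = wavelet_reduce(y)
```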
My concern would be computational cost for real-time/streaming use cases. LTTB is O(n) and pretty cache-friendly. Wavelet transforms are more expensive in practice (even the O(n) fast transform does heavier per-point work), though maybe a GPU compute shader could make it viable.
The other question is whether it's "visually correct" for charting specifically. LTTB optimizes for preserving the visual shape of the line at a given resolution. Wavelet decomposition optimizes for signal reconstruction - not quite the same goal.
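For reference, a bare-bones LTTB pass looks roughly like this (assumes numpy arrays; bucket-edge handling is simplified, so treat it as a sketch rather than a drop-in implementation):

```python
# Bare-bones LTTB: for each bucket, keep the point that forms the largest
# triangle with the previously kept point and the average of the next bucket.
import numpy as np

def lttb(x, y, n_out):
    n = len(x)
    if n_out >= n or n_out < 3:
        return x, y

    # Bucket edges over the interior points; first and last samples are always kept.
    edges = np.linspace(1, n - 1, n_out - 1).astype(int)

    idx = [0]
    prev = 0
    for i in range(n_out - 2):
        lo, hi = edges[i], edges[i + 1]        # current bucket
        if i + 2 < len(edges):                 # next bucket (or just the final point)
            nlo, nhi = edges[i + 1], edges[i + 2]
        else:
            nlo, nhi = n - 1, n
        avg_x, avg_y = x[nlo:nhi].mean(), y[nlo:nhi].mean()

        # Triangle area (times 2) of each candidate against the previously kept
        # point and the next bucket's average; keep the largest.
        ax, ay = x[prev], y[prev]
        cx, cy = x[lo:hi], y[lo:hi]
        area = np.abs((ax - avg_x) * (cy - ay) - (ax - cx) * (avg_y - ay))
        prev = lo + int(np.argmax(area))
        idx.append(prev)

    idx.append(n - 1)
    return x[idx], y[idx]
```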
That said, I'd be curious to experiment. Do you have any papers or implementations in mind? Would make for an interesting alternative sampling mode.
Doesn't an FFT depend on at least a "representative" sample of the entire dataset?
Sounds like what makes SQL joins NP-hard.
I don't. I just remember watching a presentation on it, and it always struck me that wavelets are an incredibly powerful, underutilized technique for data reduction that preserves quality in a quantifiable, mathematically justifiable way.
I don't have any papers in mind, but I do think the critique around visual shape vs. signal reconstruction may not be accurate, given that wavelets are seeing real adoption in the visual space (JPEG2000 being the leading edge in that field). It might also be interesting to try DCT. I suspect these will perform better than LTTB; the compute cost is higher, of course, but there is already hardware acceleration for some of these, or there will be over time.
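For what it's worth, the DCT version of the same "keep the largest coefficients" idea is only a few lines (SciPy assumed; keep_ratio is a made-up knob):

```python
# Same keep-the-biggest-coefficients idea in the DCT basis (what baseline JPEG
# does blockwise in 2D). Assumes SciPy; keep_ratio is an arbitrary knob.
import numpy as np
from scipy.fft import dct, idct

def dct_reduce(y, keep_ratio=0.05):
    c = dct(y, norm="ortho")
    k = max(1, int(len(c) * keep_ratio))
    # Zero out everything except the k largest-magnitude coefficients.
    cutoff = np.sort(np.abs(c))[-k]
    c[np.abs(c) < cutoff] = 0.0
    return idct(c, norm="ortho")
```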
This might be because baseline JPEG already uses a DCT (a close relative of the FFT).