Comment by mrgriscom
4 days ago
This is plainly false though. You're saying beats can't be localized to less than one second of precision (regardless of track length, which already smells suspect). Humans can localize a beat to within 50ms.
Yes, I got lost in the numbers and made a blunder by misinterpreting what frequency resolution expressed in BPM, rather than Hz, actually means.
It is correct to say "0.0046237213 Hz, which is 0.27742328 BPM". My mistake was to interpret 0.27742328 BPM as the precision limit of any tempo measurement. Rather, any BPM read off the spectrum must be an exact multiple of 0.27742328 BPM.
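As a quick sketch of what that quantization means (the 120 BPM target is just an illustrative number, not from the discussion):

```python
# Bin spacing from the discussion above.
df_hz = 0.0046237213
df_bpm = df_hz * 60.0            # ~0.27742328 BPM per FFT bin

# A spectral peak in bin k corresponds to a tempo of k * df_bpm,
# so a true tempo of 120 BPM is reported as the nearest multiple:
k = round(120.0 / df_bpm)
bpm_measured = k * df_bpm        # within df_bpm / 2 of the true tempo
```

The worst-case error is half a bin, about 0.14 BPM here, not the full 0.277 BPM.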
Thanks for pointing out my mistake!
> (regardless of track length, which already smells suspect)
Frequency resolution depending on the number of samples is a very well known property of basic sampling theory and signal analysis: the bin spacing of an N-point FFT at sample rate fs is fs/N, which is 1/T for a signal of duration T seconds.
In fact, one can interpolate the frequency spectrum by zero-padding the time samples. This increases the resolution only in an artificial way, because it is, after all, an interpolation. A longer song, however, has more natural frequency resolution than a shorter one.
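A minimal NumPy sketch of both points (the sample rate, tone frequency, and padding factor are illustrative, not from the original plot): a 1-second signal has a natural bin spacing of 1/T = 1 Hz, and zero-padding merely samples that same spectrum on a denser grid:

```python
import numpy as np

fs = 1000                            # sample rate in Hz (assumed)
t = np.arange(fs) / fs               # 1 second of signal -> natural bin spacing 1 Hz
x = np.sin(2 * np.pi * 10.3 * t)     # tone that falls between two bins

X = np.fft.rfft(x)                   # bins every 1 Hz
Xz = np.fft.rfft(x, n=8 * len(x))    # zero-pad 8x -> bins every 0.125 Hz

f = np.fft.rfftfreq(len(x), 1 / fs)
fz = np.fft.rfftfreq(8 * len(x), 1 / fs)

peak = f[np.argmax(np.abs(X))]       # quantized to the 1 Hz grid
peak_z = fz[np.argmax(np.abs(Xz))]   # lands much closer to 10.3 Hz
```

The padded peak lands near the true 10.3 Hz only because the same underlying spectrum is sampled more densely; no new information is added.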
Note, this frequency resolution is not related to fidelity, which is a messy, perception-related thing evaluated over a sliding window of much shorter duration, and which I don't pretend to understand.
BTW, the opposite is also possible: you can zero-pad the spectrum as a means of resampling (interpolating) the time domain. This is slower but more spectrally correct than, say, time-domain linear or cubic interpolation.
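A minimal sketch of that spectral resampling, assuming integer upsampling of a real signal bandlimited well below Nyquist (so the even-length Nyquist-bin subtlety can be ignored):

```python
import numpy as np

def fft_resample(x, factor):
    """Upsample a real signal by an integer factor via spectral zero-padding.

    Sketch only: assumes x is bandlimited well below Nyquist.
    """
    N = len(x)
    M = factor * N
    X = np.fft.rfft(x)
    Xz = np.zeros(M // 2 + 1, dtype=complex)
    Xz[: N // 2 + 1] = X                 # original spectrum; zeros above it
    # irfft normalizes by the new length, so rescale to preserve amplitude.
    return np.fft.irfft(Xz, n=M) * factor
```

For a bandlimited input the original samples reappear exactly at every `factor`-th output point, with interpolated values in between.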
These techniques require an FFT and so are somewhat expensive to apply to long signals like an entire song, as I did for the plot. Daft Punk's HBFS takes about 8 seconds on one CPU core with NumPy's FFT.