Comment by spot5010
2 months ago
Right, this makes sense. But why Fourier space in particular? Why not, for example, a wavelet transform?
> Why not, for example, a wavelet transform.
That is a great idea for a paper. Work on it, write it up, and please be sure to put my name down as a co-author ;-)
Or for that matter, a transform that's learned from the data :) A neural net for the transform itself!
That would be super cool if it works! I’ve also wondered the same thing about activation functions. Why not let the algorithm learn the activation function?
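For what it's worth, a simple version of the learnable-activation idea already exists: PReLU treats the slope on the negative side as a trainable parameter. A minimal numpy sketch (toy data and setup are my own, purely illustrative) that learns this slope by gradient descent:

```python
import numpy as np

def prelu(x, a):
    # PReLU: identity for x > 0, learnable slope `a` for x < 0
    return np.where(x > 0, x, a * x)

rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=256)
y = np.abs(x)   # toy target: |x|, i.e. the ideal negative-side slope is -1

a = 0.25        # PReLU's usual initial slope
lr = 0.1
for _ in range(200):
    pred = prelu(x, a)
    # gradient of mean squared error w.r.t. `a`
    # (d pred / d a is x where x < 0, else 0)
    grad = np.mean(2 * (pred - y) * np.minimum(x, 0))
    a -= lr * grad

print(round(a, 3))  # the learned slope ends up near -1.0
```

Richer variants replace the scalar slope with a small network per activation, but the principle is the same: the nonlinearity becomes one more thing gradient descent fits.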
Now you're talking efficiency. Certainly a wavelet transform may also work, but wavelets tend to be more localized than Fourier transforms.
This way you end up with time dilated convolutional networks [1].
[1] https://openreview.net/pdf?id=rk8wKk-R-
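A dilated convolution just spaces the filter taps apart, so stacking layers with doubling dilation covers an exponentially large time window with few parameters. A minimal numpy sketch of the building block (my own illustration, not the architecture from [1]):

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Valid 1-D convolution with filter taps spaced `dilation` apart."""
    k = len(w)
    span = (k - 1) * dilation + 1  # receptive field of one layer
    return np.array([
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(len(x) - span + 1)
    ])

x = np.arange(8.0)
w = np.array([1.0, 1.0])
print(dilated_conv1d(x, w, dilation=1))  # sums of adjacent elements
print(dilated_conv1d(x, w, dilation=2))  # sums of elements two apart
```

With dilations 1, 2, 4, 8, ..., a stack of such layers sees 2^L timesteps at depth L, which is the efficiency argument for using them on long sequences.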