Comment by colanderman

3 years ago

> The resulting complex array is than converted into an output image by taking the magnitude.

No, it's emphatically not. Perhaps you are thinking of displaying a spectrogram.

To produce an image from frequency-domain data, an inverse DFT must be applied. Since, as @nyanpasu64 points out, the DFT of a real-valued image or kernel is conjugate-symmetric (and vice versa), the result is again real-valued, with no loss of information. The phase information is not lost. If it were, the image would be a jumbled mess.
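For anyone who wants to check this themselves, here's a minimal NumPy sketch (the image is just random data, purely for illustration): the spectrum of a real image is conjugate-symmetric, and the inverse DFT gives back the original image exactly, phase included.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((8, 8))           # arbitrary real-valued "image"

spectrum = np.fft.fft2(img)        # complex, conjugate-symmetric array
restored = np.fft.ifft2(spectrum)  # complex, but imaginary part is ~zero

assert np.allclose(restored.imag, 0)    # result is (numerically) real
assert np.allclose(restored.real, img)  # exact round trip: phase intact

# Conjugate symmetry for real input: F[-u, -v] == conj(F[u, v]) (mod N).
idx = (-np.arange(8)) % 8
assert np.allclose(spectrum, np.conj(spectrum[np.ix_(idx, idx)]))
```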

(Not that a DFT plus inverse DFT is necessary for Gaussian blur anyway -- you can simply convolve with a truncated Gaussian kernel.)

> Another point of view is that there are infinitely many images that will produce the same result after blurring.

No, this is not true. I don't know why you think it is. It is only true of a brick-wall filter, which a Gaussian filter is not [1].
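You can see this numerically (my own sketch, with a small circular blur for simplicity): a Gaussian filter's frequency response is nowhere zero, so in exact arithmetic distinct images blur to distinct results, and the blur can even be undone by dividing it back out. A brick-wall filter zeroes a whole band, and only then do infinitely many images collide.

```python
import numpy as np

N, sigma = 32, 1.0
d = np.minimum(np.arange(N), N - np.arange(N))  # circular distance from 0
kernel = np.exp(-d**2 / (2 * sigma**2))
kernel /= kernel.sum()
K2 = np.outer(kernel, kernel)                   # separable 2-D Gaussian kernel

H = np.fft.fft2(K2)                             # transfer function
assert np.abs(H).min() > 0                      # no zeros: no band is erased

rng = np.random.default_rng(1)
img = rng.random((N, N))
blurred = np.fft.ifft2(np.fft.fft2(img) * H).real   # circular Gaussian blur
restored = np.fft.ifft2(np.fft.fft2(blurred) / H).real
assert np.allclose(restored, img)               # noiseless blur loses nothing
```

With noise present, dividing by the tiny high-frequency values of H amplifies it enormously -- which is exactly the SNR point below, and is a separate issue from phase.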

The SNR of high-spatial-frequency components is certainly reduced, which can lead to irrevocable information loss. But that has nothing to do with phase.

[1] https://en.wikipedia.org/wiki/Window_function#Gaussian_windo...