Comment by dagss
3 years ago
Wait ... a Gaussian blur? That doesn't remove ANY information from the image. It looks blurred to our eyes but the same information is all there.
The information is absolutely not gone.
Does it state that it hallucinates craters that were not there in the original, or is it possible the filters simply did an FFT and adjusted the power spectrum to what we expect of a non-blurry picture, in effect inverting the Gaussian blur?
EDIT: Note that a deblur of a smooth but "noisy" image can cause "simulation" or "hallucination" entirely without AI. Could be any number of things causing an output image like that (wavelet sharpening, power spectrum calibration, ...). Even if the information isn't recoverable as such, a photo of a Gaussian blur has an unnatural power spectrum that could easily "trick" conventional non-AI algorithms into doing such things.
Especially since the only thing I see in the output is "more detail" (i.e. simply a different power spectrum than the author expected).
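To make that concrete, a minimal numpy sketch (my own illustration, nothing from the article or Samsung's pipeline): in float precision a Gaussian blur is just a pointwise multiplication in the frequency domain, so dividing by the same transfer function undoes it almost exactly.

    import numpy as np

    rng = np.random.default_rng(0)
    img = rng.random((170, 170))  # stand-in for the 170x170 moon image

    # Gaussian transfer function H(f) = exp(-2*pi^2*sigma^2*|f|^2) on the FFT grid
    sigma = 1.0  # blur strength in pixels
    fy = np.fft.fftfreq(170)[:, None]
    fx = np.fft.fftfreq(170)[None, :]
    H = np.exp(-2 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))

    blurred = np.fft.ifft2(np.fft.fft2(img) * H).real       # blur in float64
    restored = np.fft.ifft2(np.fft.fft2(blurred) / H).real  # exact inverse filter

    print(np.abs(restored - img).max())  # tiny (~1e-10): essentially perfect recovery

The catch is that this only holds in float: quantize the blurred image to 8 bits, or use a heavier blur, and the division amplifies rounding error instead of recovering detail.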
Unless you’re using floating-point numbers and have padded the image to infinity, it’s literally untrue that applying a Gaussian kernel doesn’t throw away at least some (if not a lot of) information. I think what you intended to say is that the blur doesn’t remove “all” the information. Not to mention he downsized the image (likely using bilinear, not even bicubic), further throwing out a ton of information. And then he showed the image on a display and took a photo of it from across the room. For all practical purposes, even the corrected version of that pedantic statement doesn’t apply here.
Looking at the images, it would be fantastical, if not a violation of information theory, if the phone didn’t use prior information about how the moon looks to create the photo.
Source: spent way more months than I would have preferred calculating CRLB (Cramér–Rao lower bound) values on Gaussian-blurred point-source images.
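For anyone who wants to see the quantization point directly, a quick sketch (same illustrative FFT setup as above, not the author's actual pipeline): round the blurred image to 8 bits before inverting, and the "recovered" image is garbage, because the inverse filter amplifies the rounding error at exactly the frequencies the blur suppressed.

    import numpy as np

    rng = np.random.default_rng(0)
    img = rng.random((170, 170))

    sigma = 2.0
    fy = np.fft.fftfreq(170)[:, None]
    fx = np.fft.fftfreq(170)[None, :]
    H = np.exp(-2 * (np.pi * sigma) ** 2 * (fx ** 2 + fy ** 2))

    blurred = np.fft.ifft2(np.fft.fft2(img) * H).real
    quantized = np.round(blurred * 255) / 255  # 8-bit storage, like any PNG/JPEG

    restored = np.fft.ifft2(np.fft.fft2(quantized) / H).real
    print(np.abs(restored - img).max())  # enormous: the high frequencies are gone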
Hear Hear! This deserves to be reiterated given the statements of information recoverability made in this thread.
It also happened after downsampling of the image (aka removing information), and when parts of the image were clipped to white (aka no information at all). So your point doesn't absolve Samsung of guilt the way you seem to believe.
Yeah, the author glossed over this a bit in my opinion. In infinite-precision math you're correct, but at some point the signal at those higher frequencies is going to drop below the precision of the storage data type, never mind the dynamic range of the monitor and camera he's using.
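You can even put a rough number on where that happens. Assuming a Gaussian MTF of exp(-2*pi^2*sigma^2*f^2) and an 8-bit quantization step of 1/255 (a back-of-the-envelope sketch, with sigma chosen arbitrarily):

    import numpy as np

    # Solve exp(-2*pi^2*sigma^2*f^2) = 1/255 for f (in cycles/pixel)
    sigma = 2.0
    f_cut = np.sqrt(np.log(255) / 2) / (np.pi * sigma)
    print(f_cut)  # ~0.26 cycles/pixel

Nyquist is 0.5 cycles/pixel, so with sigma = 2 everything in roughly the upper half of the band falls below one 8-bit code value and is unrecoverable from the stored file.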
It seems clear the author does not have knowledge of the subject; this is more than glossing over -- the article even emphasises that information is removed by the blur (100% wrong). I agree it may not have destroyed the experiment entirely, but it does mean the experiment was conducted without knowledge of basic signal processing, and I would prefer a more thorough study or analysis before drawing conclusions.
You must have "glossed over" the first part of the statement (emphasis mine):
> I *downsized it to 170x170 pixels* and applied a gaussian blur, so that all the detail is GONE
> The information is absolutely not gone.
Continue to read through paragraphs 4 and 5 of the "Conclusion" section.
Well, it is a conclusion written without knowledge of signal analysis.
The input to the algorithm will be an image with a power spectrum that isn't natural. It would be very natural, even without "AI", to attempt to "deblur" when faced with such a power spectrum.
A deblur of noise can cause "hallucinations" for much simpler reasons than AI being involved.
Could it not e.g. be doing some wavelet transforms or FFTs and automatic power spectrum calibration?
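For instance, here is a minimal sketch with classic Richardson-Lucy deconvolution (a standard non-AI deblur via scikit-image; purely illustrative, no claim this is what the phone runs): feed it a blurred image with a touch of noise and the iterations turn that noise into confident-looking "detail".

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage.restoration import richardson_lucy

    rng = np.random.default_rng(0)
    img = rng.random((128, 128))

    blurred = gaussian_filter(img, sigma=2.0)
    noisy = np.clip(blurred + rng.normal(0, 0.01, blurred.shape), 0, 1)

    # Matching Gaussian PSF for the deconvolution
    x = np.arange(-8, 9)
    psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / (2 * 2.0 ** 2))
    psf /= psf.sum()

    # No neural network involved: the iterations sharpen the image and, in
    # doing so, amplify the noise into spurious structure
    restored = richardson_lucy(noisy, psf, num_iter=60)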
The parent is probably talking about the section where they clipped the highlights and details were added back in by the phone.