Comment by Majromax
3 years ago
> In all of the examples it seems that there is information in the original (dark spots) that are getting boosted.
That was the point of the Gaussian blur. The source image of the moon was blurred before the phone photographed it, so the fine detail was no longer there for the camera to capture. The Gaussian blur destroys fine detail, by design.
The image enhancement applied by the Samsung phone is adding detail where there was none originally. It is not recovering detail buried somewhere in the optics; the detail simply is not there. It involves a computational model of what the target (here, the moon) "should" look like, and it fills in what it decides are the blurry bits.
No, Gaussian blur does not "destroy" fine detail. It is still there; the "volume" of it is just turned down. You can put all the fine detail back by applying the inverse filter.
Gaussian blur is reversible with deconvolution; however, that is almost certainly not what is happening here, as it's fairly computationally expensive.
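For what it's worth, the reversibility claim is easy to demonstrate in the idealized case. A minimal numpy sketch, assuming a noise-free image and a circular (wrap-around) Gaussian blur so that blurring is exactly a pointwise multiplication in the frequency domain; all names and parameters here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
img = rng.random((n, n))  # stand-in for a source image

# Gaussian kernel on wrapped coordinates (0, 1, ..., -2, -1), so the
# convolution below is circular and diagonalized by the FFT.
x = np.fft.fftfreq(n) * n
X, Y = np.meshgrid(x, x)
sigma = 1.0
kern = np.exp(-(X**2 + Y**2) / (2 * sigma**2))
kern /= kern.sum()

# Blur: multiply by the kernel's transfer function in the frequency domain.
K = np.fft.fft2(kern)
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * K))

# Inverse filter: divide by the same transfer function.
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) / K))

# With no noise, the error is at floating-point noise level.
err = np.max(np.abs(restored - img))
print(err)
```

The catch is the division by K: a Gaussian's transfer function decays fast at high frequencies, so with any sensor noise or quantization the naive inverse filter amplifies noise enormously. That is why practical deconvolution uses regularized variants (e.g. Wiener filtering) rather than a straight inverse, and why "the detail is still there" stops being true once the blurred signal drops below the noise floor.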
It may not be expensive if it's being done approximately by a neural network that implicitly learned to do it while sharpening images.
This feels more like wild speculation than a plausible explanation.
But I don't see it adding any detail over the blur. It really just looks like it's boosting contrast a bit; I don't see anything like ridges or defined features appearing.