Comment by ibreakphotos
3 years ago
Hey all, it's the author of the reddit post here. First of all, let me say that I don't usually frequent HN, but the comments on here are of such high quality that I might need to change that. I got semi-depressed on reddit, with people misattributing statements and, in general, not being overly, uh, skeptical :)
That being said, there were a few comments on here about gaussian blur and deconvolution, which I would like to tackle. First, I need to mention that I do not have a maths/engineering background. I am familiar with some concepts, as I used deconvolution via FFT several years ago during my PhD, but while I am aware of the process, I don't know all the details. I certainly didn't know that a gaussian-blurred image could be sharpened perfectly - I will have to look into that. In fact, I have used gaussian blur to redact private information (like in screenshots), so it's very helpful to know that I may not have actually redacted anything and that the data could be recoverable. Wow.
I would love to learn more about the types of blur that cannot be deconvolved.
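For what it's worth, the perfect-recovery claim is easy to demonstrate in the idealized case. A minimal 1-D sketch (numpy only; all names are illustrative), assuming the exact blur kernel is known, the convolution is circular, and nothing is clipped, quantized, or noisy:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 256
x = rng.random(n)                      # a random 1-D "image" row

# Gaussian point-spread function, shifted so its center sits at index 0
# (the convention for circular convolution via the FFT).
t = np.arange(n)
sigma = 1.0
psf = np.exp(-0.5 * ((t - n // 2) / sigma) ** 2)
psf /= psf.sum()
psf = np.roll(psf, -(n // 2))

# Blurring = multiplying by the kernel's spectrum in the Fourier domain.
H = np.fft.fft(psf)
blurred = np.real(np.fft.ifft(np.fft.fft(x) * H))

# Naive inverse filter: divide the spectrum back out. This works here
# because a Gaussian's spectrum has no zeros; noise or clipping would
# amplify catastrophically under the same division.
recovered = np.real(np.fft.ifft(np.fft.fft(blurred) / H))

print(np.max(np.abs(recovered - x)))   # tiny: essentially exact recovery
```

With a sigma this small the kernel's spectrum stays comfortably far from zero, so the division is numerically stable; heavier blurs, added noise, or clipped highlights make the naive inverse filter blow up, which is where regularized methods like Wiener deconvolution come in.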
However, please have in mind that in my experiment:
1) I also downsampled the image to 170x170, which, as far as I know, is an information-destructive process
2) The camera doesn't have access to my original gaussian-blurred image, only to that image plus whatever blur and distortion was introduced when I took the photo from far away (so whatever algorithm they are using can't run a perfect deconvolution against the original blurred image)
3) Lastly, I clipped the highlights in the last example, which is likewise destructive (non-reversible), and the AI hallucinated details there as well
So I am comfortable saying that it's not deconvolution which "unblurs" the image and sharpens the details, but what I said - an AI model trained on moon images that uses image matching and a neural network to fill in the data.
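On point 1, the lossiness of downsampling is easy to verify with a toy example: once fine detail has been averaged away, no amount of upsampling brings it back. A sketch in numpy (illustrative values only):

```python
import numpy as np

# A signal with fine detail: a rapidly oscillating pattern.
x = np.sin(np.arange(256) * 2.4)

# Downsample 4x by averaging blocks of 4 samples (a simple box filter).
down = x.reshape(-1, 4).mean(axis=1)

# Best-effort upsample: repeat each sample 4 times.
up = np.repeat(down, 4)

print(np.max(np.abs(up - x)))  # large: the fine detail is gone for good
```

Fancier interpolation changes the numbers but not the conclusion: frequencies above the new Nyquist limit are not stored in the 170x170 image at all, so anything "recovered" at that scale has to come from somewhere other than the photo.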
Thank you again for your engagement and your thoughtful comments, I really appreciate them, and have learned a lot just by reading them!
> In fact, I used gaussian blur to redact some private information
Absolutely never do that. I honestly don't understand why people still do: given that it's obvious that low levels of blur can be reversed, why even risk guessing at what point someone might be able to recover something? Just censor it - draw over it with an opaque tool - and save it in a format that won't store layers or undo history (the riskiest format being PDF).
If you don't like how that looks, the alternative is to replace the information first and then blur it. Anyone who unblurs it will find an easter egg at best.
Personally, I censor instead of blurring a replacement, but I balance between low contrast and not hiding the fact that information was removed. A stark contrast distracts and looks ugly. E.g., for black text on a white background, I'd pick a light/medium gray (around the average black level of the original text, basically).
> save it in a format that won't store layers or undo history or something
To eliminate that risk, just screenshot your censored content and use that image.
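The difference between the two approaches can be shown in a few lines: a blurred region is still a function of the original pixels, so information survives, while an opaque fill leaves nothing to recover. A toy numpy sketch (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy grayscale "secret" region we want to hide.
secret = rng.integers(0, 256, size=(32, 32)).astype(float)

# Blur (3x3 box filter, wrapping at the edges): every output pixel is a
# mix of secret pixels, so the secret leaks into the result.
blurred = np.zeros_like(secret)
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        blurred += np.roll(np.roll(secret, dy, axis=0), dx, axis=1)
blurred /= 9.0

# Opaque fill: every output pixel is a constant, independent of the input.
filled = np.full_like(secret, 128.0)

# The blurred version still correlates with the secret; the fill cannot.
print(np.corrcoef(blurred.ravel(), secret.ravel())[0, 1])  # clearly positive
print(np.std(filled))                                      # 0.0
```

A correlation is the weakest form of leakage; with the blur kernel known, much more than a correlation can be extracted, which is why the advice above is to overwrite rather than blur.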
It should be rather easy to prove whether Samsung is really able to "unblur" an image in that way: use something other than an image of the moon as a starting point and apply the same steps, i.e. downsizing and blur, then take a photo and see if it's able to recover details.
I just wanted to say that the experiment at the end where you had half the moon and the whole moon was brilliant, and perfectly illustrated the problem in a single picture. If anyone hasn't seen it, they should.
It would be interesting to see how this mode handles foreground objects such as an aeroplane or clouds.
Does it just overdraw it (i.e. erase it), apply texture over the foreground element, or fail altogether?
This scenario is probably why other vendors don't go so far as to fake such images with texture overlays.