In the past few days, Samsung has come under criticism over allegations that it is faking shots of the moon on the Galaxy S23 Ultra. It all started when Redditor u/ibreakphotos displayed a blurry image of the moon on a screen and photographed it with a Galaxy S23 Ultra, which then produced a good-looking moon. Outlets started picking up the story, and Samsung felt it needed to explain its process to the world.
Samsung provided a detailed and technical explanation of how the moon shot works. The article had already been online for a while, but only in Korean; the recent controversy has given us an English version. Samsung uses Scene Optimizer, AI deep learning, and Super Resolution. The moon shot mode engages when you enable Scene Optimizer and zoom in beyond 25x: the AI Deep Learning Engine, trained on a variety of moon shapes and details, recognizes the moon, then applies Super Resolution multi-frame processing to enhance it.
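To make the "multi-frame processing" part concrete, here is a minimal sketch of the general idea, not Samsung's actual pipeline: capture several exposures of the same scene and merge them, so that random sensor noise averages out while real detail survives. The scene data and noise level below are made up for illustration, and real pipelines also align (register) the frames before merging.

```python
# Sketch of multi-frame merging (illustrative only, not Samsung's implementation).
# Averaging N aligned noisy frames reduces noise by roughly 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "ground truth" detail; a simple gradient stands in for moon texture.
truth = np.linspace(0.0, 1.0, 64 * 64).reshape(64, 64)

# Capture 16 noisy frames of the same scene (assumed already aligned).
frames = [truth + rng.normal(0.0, 0.1, truth.shape) for _ in range(16)]

# Merge with a plain per-pixel mean across frames.
merged = np.mean(frames, axis=0)

noise_single = np.std(frames[0] - truth)
noise_merged = np.std(merged - truth)
print(f"single-frame noise: {noise_single:.3f}")
print(f"merged-frame noise: {noise_merged:.3f}")
```

With 16 frames the residual noise drops to roughly a quarter of a single frame's, which is the cleaner base image the AI detail enhancement then works on.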
This is nothing new. Samsung has been doing the same since the Galaxy S20 Ultra first introduced "100x Space Zoom," and it is certainly not the only manufacturer using this kind of processing.
So Samsung's moon shots aren't technically fake; they're enhanced with artificial intelligence. In effect, the final image comes less from your phone's lens and sensor than from its processing engine. But really, what did you expect? You'd need a huge lens, a tripod, and an expensive dedicated camera to get a genuinely good picture of the moon.
Anyway, we'll probably be having this same conversation in another two or three years, once people have forgotten about it and someone brings it up again.