You may have seen headlines this week about the Samsung Galaxy S23 Ultra taking so-called “fake” pictures of the moon. Starting with the S20 Ultra, Samsung phones have offered a Space Zoom feature that combines 10x optical zoom with massive digital zoom to reach a combined 100x. In marketing shots, Samsung showed its phone taking near-crystal-clear pictures of the moon, and users have done the same on clear nights.
But a Redditor has proved that Samsung’s incredible Space Zoom uses a bit of trickery. It turns out that when you shoot the moon, the AI-powered Samsung Scene Optimizer does a lot of heavy lifting to make it look like the moon was photographed with a high-resolution telescope rather than a smartphone. So when someone takes a picture of the moon (whether in the sky or on a computer screen, as in the Reddit post), Samsung’s computational engine takes over and fills in the craters and contours the camera missed.
In a follow-up post, they proved beyond any doubt that Samsung does indeed add “moon” imagery to photos to make the shot look clearer. As they explain: “The computer vision/AI module recognizes the moon, you take a picture, at which point a neural network trained on countless images of the moon fills in the details that were not available optically.” It’s a bit more “fake” than Samsung lets on, but it’s also to be expected.
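To picture how that kind of pipeline could work, here’s a rough Python sketch of the “recognize, then enhance” idea. Everything in it is a stand-in invented for illustration: the thresholds, the function names, and the sharpening step playing the role of the trained neural network. None of it is Samsung’s actual Scene Optimizer code.

```python
# Illustrative sketch (not Samsung's code) of a detect-then-enhance flow:
# a crude classifier flags a moon-like frame, then a separately trained
# model would add detail the sensor never captured.
import numpy as np


def looks_like_the_moon(frame: np.ndarray) -> bool:
    """Crude stand-in for the scene classifier: a small bright disc on a dark sky."""
    bright = frame > 200                      # candidate moon pixels
    if not bright.any() or bright.all():
        return False
    return bright.mean() < 0.2 and frame[~bright].mean() < 20


def add_learned_detail(frame: np.ndarray) -> np.ndarray:
    """Placeholder for the neural network trained on countless moon photos.

    A real model would synthesize craters and contours; a simple sharpening
    kernel just marks where that step sits in the pipeline.
    """
    kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=float)
    padded = np.pad(frame.astype(float), 1, mode="edge")
    h, w = frame.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return np.clip(out, 0, 255).astype(frame.dtype)


def scene_optimizer(frame: np.ndarray) -> np.ndarray:
    """If the classifier fires, hand the grayscale frame to the learned enhancer."""
    return add_learned_detail(frame) if looks_like_the_moon(frame) else frame
```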
Even without the investigative work, it should be fairly obvious that the S23 can’t naturally take clear pictures of the moon. While Samsung claims Space Zoom shots using the S23 Ultra are “capable of capturing images from an incredible distance of 330 feet,” the moon is about 238,900 miles away, or roughly 1,261,392,000 feet. It’s also about a quarter of Earth’s diameter. Granted, smartphones have no problem taking clear photos of skyscrapers much farther than 100 meters away, but the moon is another matter entirely.
Of course, the distance to the moon doesn’t tell the whole story. The moon is essentially a light source against a dark background, so the camera needs a bit of help to get a sharp image. Here’s how Samsung explains it: “When you take a picture of the Moon, your Galaxy device’s camera system will use this deep learning-based artificial intelligence technology as well as multi-frame processing to further enhance detail. Read on to learn more about the many steps, processes, and technologies required to obtain high-quality images of the Moon.”
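The “multi-frame processing” part of that explanation is easier to see with a small sketch. The Python below is a deliberately naive take, assuming grayscale frames and whole-pixel alignment; real burst pipelines align at sub-pixel precision and merge far more cleverly than a plain average.

```python
# Minimal sketch of multi-frame processing: capture a burst, align the frames,
# and average them so random sensor noise cancels while real detail stays.
import numpy as np


def align(reference: np.ndarray, frame: np.ndarray, search: int = 4) -> np.ndarray:
    """Shift `frame` by whole pixels to best match `reference` (brute-force search)."""
    best_err, best = np.inf, frame
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(frame, (dy, dx), axis=(0, 1))
            err = np.mean((shifted.astype(float) - reference.astype(float)) ** 2)
            if err < best_err:
                best_err, best = err, shifted
    return best


def merge_burst(frames: list) -> np.ndarray:
    """Average an aligned burst; noise drops roughly with the square root of the frame count."""
    reference = frames[0]
    aligned = [reference] + [align(reference, f) for f in frames[1:]]
    return np.mean([f.astype(float) for f in aligned], axis=0).astype(reference.dtype)
```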
It’s not that different from features like Portrait Mode, Portrait Lighting, Night Mode, Magic Eraser, or Face Blur. All of these use computational awareness to add, tweak, and edit things that aren’t there. In the case of the moon, it’s easy for Samsung’s AI to make it look like the phone is taking incredible pictures, because Samsung’s AI knows what the moon looks like. It’s for the same reason that the sky sometimes looks too blue and the grass too green: the photo engine applies what it knows to what it sees, mimicking a higher-end camera and making up for the usual shortcomings of a smartphone.
The difference here is that, while photography algorithms typically segment an image into sections and apply different settings and exposure controls to each, Samsung is also using a limited form of AI image generation on the moon, adding details that were never in the camera data to begin with – but you wouldn’t know it, because the moon’s details always look the same when viewed from Earth.
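For the segmentation side of that comparison (the too-blue sky and too-green grass from the previous paragraph), the toy example below shows the general shape of the trick: classify pixels into rough regions and nudge each region’s colors separately. The masks and gain values are invented for illustration and are not any vendor’s real tuning.

```python
# Toy per-region color grading: split a photo into rough "sky" and "grass"
# masks, then apply a different tweak to each region.
import numpy as np


def segment_and_grade(rgb: np.ndarray) -> np.ndarray:
    """rgb: H x W x 3 uint8 photo. Returns a copy with per-region color tweaks."""
    img = rgb.astype(float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]

    sky = (b > r) & (b > g) & (b > 120)     # blue-dominant, reasonably bright pixels
    grass = (g > r) & (g > b)               # green-dominant pixels

    out = img.copy()
    out[..., 2][sky] *= 1.15                # make the "sky" region a touch bluer
    out[..., 1][grass] *= 1.10              # make the "grass" region a touch greener
    return np.clip(out, 0, 255).astype(np.uint8)
```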

Samsung says the S23 Ultra’s camera uses Scene Optimizer’s “deep learning-based AI detail enhancement engine” to effectively remove remaining noise and further enhance image details.
What will Apple do?
Apple is rumored to be adding a periscope zoom lens to the iPhone 15 Ultra for the first time this year, and this controversy will no doubt influence how the company trains its AI. But you can be sure the computational engine will still do a fair amount of heavy lifting behind the scenes, just as it does now.
That’s what makes smartphone cameras so great. Unlike point-and-shoot cameras, our smartphones have a powerful “brain” that can help us take better photos and improve the quality of bad ones. It can make night photos look like they were taken in good light and mimic the bokeh of an ultra-fast-aperture lens.
And this is what will allow Apple to get incredible results from 20x or 30x zoom out of a 6x optical camera. Since Apple has avoided astrophotography so far, I doubt it will go as far as sampling higher-resolution photos of the moon to help the iPhone 15 take sharper pictures, but you can be sure its Photonic Engine will be hard at work cleaning up edges, preserving detail, and expanding the capabilities of the telephoto lens. And judging by what we get with the iPhone 14 Pro, the results are sure to be impressive.
Whether it’s Samsung or Apple, computational photography has enabled some of the biggest breakthroughs of the past few years, and we’ve only scratched the surface of what it can do. None of it is entirely real. And if it were, we’d be a lot less impressed by the photos our smartphones take.