And unlike, say, the Eiffel Tower, its appearance doesn’t change drastically with the lighting. Moon shots generally happen at night, and Samsung’s processing falls apart if the moon is partially obscured by clouds.
One of the clearest ways Samsung’s processing plays with the moon is by manipulating midtone contrast, making its topography more pronounced. But it is clearly also capable of introducing the appearance of texture and detail that simply isn’t present in the raw photo.
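To make that concrete, here is a minimal sketch of a midtone-contrast boost in Python with NumPy. This is not Samsung’s pipeline; the function name and the smoothstep-plus-identity blend are illustrative assumptions. But the effect, steepening tones around the middle while leaving shadows and highlights mostly alone, is the kind of adjustment that makes lunar topography pop.

```python
import numpy as np

def boost_midtones(luma: np.ndarray, strength: float = 0.6) -> np.ndarray:
    """Steepen contrast around the midtones of a [0, 1] luminance channel.

    A smoothstep curve has slope > 1 at 0.5 and slope 0 at the ends,
    so blending it with the identity lifts midtone contrast while
    leaving deep shadows and bright highlights almost untouched.
    """
    s_curve = luma * luma * (3.0 - 2.0 * luma)  # classic smoothstep
    return np.clip((1.0 - strength) * luma + strength * s_curve, 0.0, 1.0)

# A flat gray ramp: endpoints stay put, the middle steepens.
ramp = np.linspace(0.0, 1.0, 5)
print(boost_midtones(ramp))  # [0.    0.194 0.5   0.806 1.   ]
```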
Samsung does this because the 100x zoom images from the Galaxy S21, S22 and S23 Ultra phones are bad. Of course they are. They involve massive cropping into a small 10 MP sensor. Periscope zooms in phones are great, but they’re not magic.
Credible theories
Huawei is the second major company accused of faking its moon photos, with the otherwise brilliant Huawei P30 Pro from 2019. It was the last Huawei flagship released before the company was blacklisted in the US, effectively destroying the appeal of its phones in the West.
Android Authority claimed that the phone pasted an image of the moon into your photos. Here’s how the company responded: “Moon Mode works on the same principle as other major AI modes, in that it recognizes and optimizes details within an image to help individuals take better photos. It doesn’t replace the image in any way – it would require an unrealistic amount of storage since the AI mode recognizes over 1300 scenarios. Based on machine learning principles, the camera recognizes the scenario and helps to optimize focus and exposure to enhance details such as shapes, colors and highlights/lowlights.”
Familiar, right?
You won’t see these techniques from many other brands, but not for any high-minded reason. If a phone doesn’t have at least a 5x long-range zoom, a moon mode is largely pointless.
Trying to capture the moon with an iPhone is difficult. Even the iPhone 14 Pro Max lacks the zoom range for it, and because the frame is almost entirely black sky, the phone’s auto exposure turns the moon into a glowing white blob. From a photographer’s point of view, the S23’s exposure control alone is excellent. But how “fake” are the S23’s pictures of the moon?
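That white-blob failure is easy to reproduce on paper. The toy sketch below (all names hypothetical, not Apple’s metering code) shows why average metering fails here: the mean luminance of a mostly black frame is tiny, so the computed gain is enormous and the small bright disc clips to pure white.

```python
import numpy as np

def average_metered_gain(luma: np.ndarray, target: float = 0.18) -> float:
    """Gain an average-metering pipeline would apply to hit 18% gray.

    On a night shot of the moon the frame is almost entirely black,
    so the mean is tiny and the computed gain is huge; multiplying the
    already-bright lunar disc by it clips it to a featureless blob.
    """
    return target / max(float(luma.mean()), 1e-6)

# Toy frame: black sky with a small bright moon.
frame = np.zeros((100, 100))
frame[45:55, 45:55] = 0.8                   # the lunar disc
gain = average_metered_gain(frame)          # ~22x for this frame
print(np.clip(frame * gain, 0.0, 1.0).max())  # 1.0 -> moon fully clipped
```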
The most generous interpretation is that Samsung starts with the actual camera image data and merely applies its machine-learning know-how to massage the processing. This could, for example, help it trace the contours of the Sea of Serenity and the Sea of Tranquility when trying to extract a greater sense of detail from a murky source.
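As a purely hypothetical illustration of what that generous interpretation would look like, the sketch below keeps the output anchored to the real capture and lets a stand-in “model” add only a bounded detail residual. `predict_residual` and `enhance_with_prior` are invented names, and the dummy unsharp-mask is a placeholder, not any real Samsung component.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def enhance_with_prior(capture: np.ndarray, predict_residual) -> np.ndarray:
    """Sharpen a real capture using a learned detail prior.

    `predict_residual` stands in for a trained model that guesses the
    high-frequency detail the lens lost. The result stays anchored to
    the sensor data; the model only nudges it, and the nudge is bounded.
    """
    residual = np.clip(predict_residual(capture), -0.15, 0.15)
    return np.clip(capture + residual, 0.0, 1.0)

# Dummy stand-in "model": an unsharp-mask residual from the local mean.
dummy_model = lambda y: y - uniform_filter(y, size=9)

moon = np.random.default_rng(0).random((64, 64))  # stand-in for a blurry capture
sharper = enhance_with_prior(moon, dummy_model)
```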
That interpretation is stretched thin, however, when the final image renders the positions of the craters Kepler, Aristarchus and Copernicus with seemingly impossible precision even though those small features are not discernible in the source. You can infer roughly where the moon’s features should sit from a blurry source, but this is next-level stuff.
However, it’s easy to overestimate how far the Samsung Galaxy S23 has come. Its photos of the moon may look OK at first glance, but they’re still bad. A recent Versus video pitting the S23 Ultra against the Nikon P1000 shows what a decent sub-DSLR consumer superzoom camera is capable of.
A question of trust
The anger over this moon issue is understandable. Samsung uses lunar imagery to promote its 100x camera mode, and the images are, to some extent, synthesized. But the company has really just poked its toe outside the ever-widening Overton window of AI, the same window that has driven innovation in phone photography for the past decade.
Each of these technical tricks, whether you call them AI or not, is designed to do what would be impossible with the raw output of a phone camera sensor. One of the first, and probably the most consequential, was HDR (High Dynamic Range), which merges several exposures of the same scene into one image that holds detail in both shadows and highlights. Apple built HDR into its camera app in iOS 4.1, released in 2010, the year of the iPhone 4.
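The core idea behind HDR is simple to sketch. The fragment below is a heavily simplified, Mertens-style exposure fusion, not Apple’s implementation: each bracketed frame is weighted per pixel by how well exposed it is, then the frames are blended.

```python
import numpy as np

def fuse_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Blend bracketed exposures of one scene, favoring well-exposed pixels.

    Each frame is a [0, 1] grayscale capture at a different exposure.
    Pixels near 0.5 get high weight; crushed shadows and clipped
    highlights get low weight, so the fused image keeps detail at
    both ends of the range.
    """
    stack = np.stack(frames)                        # (n, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / 0.08)  # Gaussian well-exposedness
    weights /= weights.sum(axis=0, keepdims=True)   # per-pixel normalization
    return (weights * stack).sum(axis=0)
```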