Samsung’s Moon Shots Force Us to Ask How Much AI Is Too Much
And unlike the Eiffel Tower, for example, its appearance doesn’t change much with lighting: moon shots are almost always taken at night, and Samsung’s processing breaks down if the moon is partially obscured by clouds.
The most obvious trick Samsung applies to the moon is adjusting the contrast between tones, making its terrain more visible. But the processing is also clearly capable of introducing the appearance of textures and details that are not present in the raw image.
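Samsung hasn’t published how its tone adjustment works, but the general idea of making faint terrain visible is a standard contrast stretch. Here is a minimal sketch under that assumption, with a made-up `stretch_contrast` helper (not Samsung’s actual pipeline):

```python
import numpy as np

def stretch_contrast(gray, low_pct=2, high_pct=98):
    """Stretch mid-tone contrast so faint detail becomes visible.

    gray: 2-D array of pixel values in [0, 255].
    Maps the [low_pct, high_pct] percentile range onto the full
    0-255 range, clipping the extremes.
    """
    lo, hi = np.percentile(gray, [low_pct, high_pct])
    if hi <= lo:  # flat image: nothing to stretch
        return gray.astype(np.uint8)
    out = (gray - lo) / (hi - lo)  # map [lo, hi] -> [0, 1]
    return (np.clip(out, 0, 1) * 255).astype(np.uint8)

# A fake low-contrast "moon": values crowded between 100 and 140,
# the kind of flat grey blob a long digital zoom produces.
rng = np.random.default_rng(0)
moon = rng.uniform(100, 140, size=(64, 64))
enhanced = stretch_contrast(moon)
```

Crucially, a stretch like this only redistributes tones that the sensor actually captured; it cannot add craters that were never in the data, which is where the controversy begins.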
Samsung does this because the 100x zoom images from the Galaxy S21, S22, and S23 Ultra are terrible. Of course they are: they involve cropping heavily into a small 10 MP sensor. The periscope zooms in these phones are great, but they’re not magic.
Huawei is another big company that has been accused of faking its moon photos, with the Huawei P30 Pro from 2019. It was the last flagship Huawei released before the company was blacklisted in the US, which destroyed the phone’s appeal in the West.
Android Authority claimed the phone pasted an existing image of the moon onto your photo. Here’s how the company responded: “Moon mode works on the same principle as the other major AI modes, in that it recognizes and optimizes details in photos to help individuals capture better photos. It doesn’t replace the image in any way—this would require unrealistic storage capacity, as the AI mode recognizes more than 1,300 scenarios. Based on machine learning principles, the camera recognizes a scenario and helps optimize focus and exposure to enhance details such as shapes, colors, and highlights/low light.”
You won’t see these techniques from many other brands, but not for any high-minded reason. If a phone doesn’t have at least a 5x long-range zoom, a moon mode is largely pointless.
Trying to capture the moon with an iPhone is difficult. Even the iPhone 14 Pro Max doesn’t have the zoom range for it, and the phone’s auto-exposure turns the moon into a blinding white blob. From a photographer’s perspective, the S23’s exposure control alone is outstanding. But how “fake” is the S23’s moon image?
The most generous interpretation is that Samsung uses the camera’s real image data and only deploys its machine learning to massage the processing. This could, for example, help it trace the outlines of the Sea of Serenity and the Sea of Tranquility when trying to tease more detail out of a blurry source.
However, that line is stretched when the final image renders the positions of the Kepler, Aristarchus, and Copernicus craters with seemingly uncanny accuracy, given how small these features are in the source. You can infer the positions of lunar features from a fuzzy original, but this is next-level stuff.
Still, it’s easy to overestimate how far the Galaxy S23 goes here. Its moon images look fine at a glance, but they’re still poor. A recent video comparison between the S23 Ultra and the Nikon P1000 shows what a dedicated sub-DSLR superzoom camera is capable of.
A question of faith
The outrage over this moon issue is understandable. Samsung uses images of the moon to promote its 100x camera mode, and to some extent those images are synthesized. But the company has really only poked a toe outside the ever-expanding AI Overton window that has driven innovation in phone photography over the past decade.
Each of these technical tricks, whether you call them AI or not, is designed to do things that wouldn’t be possible with the raw output of a phone camera. One of the first, and arguably the most important, is HDR (High Dynamic Range). Apple integrated HDR into its camera app in iOS 4.1, released in 2010, the year of the iPhone 4.
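The core of HDR is merging several bracketed exposures of the same scene, favoring whichever frame exposed each pixel best. Below is a simplified exposure-fusion sketch in the spirit of the Mertens method; the `fuse_exposures` function and the weighting constants are illustrative assumptions, not Apple’s or Samsung’s actual implementation:

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend bracketed exposures, weighting well-exposed pixels.

    frames: list of 2-D arrays with values in [0, 1]; the same scene
    shot at different exposures (the dark frame preserves highlights,
    the bright frame preserves shadows).
    """
    stack = np.stack(frames)  # shape (n_frames, H, W)
    # Gaussian weight peaking at mid-grey 0.5: pixels that are
    # near-black or blown out contribute little to the result.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0)  # normalise weights per pixel
    return (weights * stack).sum(axis=0)

# Toy scene: left half is deep shadow, right half is a bright sky.
scene = np.concatenate(
    [np.full((4, 4), 0.05), np.full((4, 4), 0.95)], axis=1
)
under = np.clip(scene * 0.5, 0, 1)  # underexposed: sky stays readable
over = np.clip(scene * 2.0, 0, 1)   # overexposed: shadows open up
hdr = fuse_exposures([under, over])
```

In the fused result, sky pixels land near the underexposed reading (where highlight detail survives) while shadow pixels are pulled toward the brighter frame, which is exactly the effect phone HDR modes aim for.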