The Infamous Moonshots and the Authenticity of 21st Century Photography


I once heard a line in a show: “If you truly love someone, you’ll remember what their face looks like.” As funny, and true, as that quote is, photos have still become an integral part of human life. But we aren’t here today to talk about the history of pictures; there’s more than enough on that topic sprawled across the internet. Today we’re going to look at what the future holds for photography and how artificial intelligence plays into all of it.

Late last year, Apple held their ‘Scary Fast’ event to unveil their new M3-powered Macs, and it was as visually appealing as these events tend to be. From cool drone shots to nice close-ups, it was as polished as always. At the end of the event, however, a graphic appeared on screen revealing that the whole thing had been shot on iPhone, specifically the 15 Pro Max. This came as quite the shock to most people watching, because through all of the segments, no one could tell. That is how freakishly good smartphone cameras have become. As a matter of fact, IMDb has an entire page listing movies that were shot on phones.

Google’s flagship phones (their leading models, in other words), the Pixel 8 and 8 Pro, were released toward the end of 2023, and they shipped with some of the coolest, but frankly concerning, features phones have ever had. The Pixel line has always been praised for its beautiful photos, but not all of that comes from the camera hardware itself; much of it is down to the software behind it. This is called computational photography.

You probably already have an idea of what this term means, but to put it into perspective, I’ll give it a proper definition. Marc Levoy, one of the pioneers of the concept and one of the people responsible for the Pixel’s great photos, defined computational photography as “computational imaging techniques that enhance or extend the capabilities of digital photography in which the output is an ordinary photograph, but one that could not have been taken by a traditional camera”. What does this mean, and why should you care?
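Before we get to the examples, it helps to see how simple the core idea can be. Below is a minimal sketch in Python with NumPy (everything here is simulated, so all names and numbers are my own) of one of the oldest computational photography tricks: merging a burst of noisy exposures into a single cleaner shot, the basic idea behind low-light modes like the Pixel’s Night Sight.

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of aligned exposures to cut sensor noise.

    Averaging N frames reduces random noise by roughly sqrt(N),
    which is how a phone can fire off ten quick shots and merge
    them into one photo cleaner than any single frame could be.
    """
    # Average in float to avoid overflow in 8-bit arithmetic.
    stack = np.stack([f.astype(np.float32) for f in frames])
    merged = stack.mean(axis=0)
    return np.clip(merged, 0, 255).astype(np.uint8)

# Simulated demo: ten noisy captures of the same flat gray scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4, 3), 120.0, dtype=np.float32)   # the "true" scene
frames = [np.clip(scene + rng.normal(0, 25, scene.shape), 0, 255)
          for _ in range(10)]

print("single-frame error:", float(np.abs(frames[0] - scene).mean()))
print("merged-frame error:", float(np.abs(merge_burst(frames) - scene).mean()))
```

A real pipeline also has to align the frames to compensate for hand shake before merging, but the merge step is where the “photo that could not have been taken by a traditional camera” comes from.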

To answer that, take a look at these images taken from the Google Pixel 8.

You should be able to spot the difference, but if not: their smiles were completely changed solely by a few lines of code. I’d recommend watching the full video (link: https://youtu.be/Z1N53ZOv-ak?si=SAvAOJhmESNFkKcA); it’s just thirty seconds long, but it really puts into perspective how photos are being redefined and how smartphones are getting, well… smarter. One more example is the Huawei P30 Pro. The story there was that any time someone tried to take a photo of the moon, the camera would detect it and offer to switch on ‘Moon mode’, and the result would be a remarkably crisp, clear picture of the moon. Over time, people noticed that all the moon shots looked suspiciously alike, and some claimed the phone was actually layering a stored photo of the moon over your shot to make it look sharper than the lens could ever capture. Whether this was true was never confirmed, but I think it’s a real possibility.
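Nobody outside Huawei knows what the P30 Pro actually did, so take this with a large grain of salt, but the alleged trick is easy to sketch. The toy version below (Python with OpenCV; the function, the parameters, and the idea of a ‘stored moon’ image are all my own illustration, not Huawei’s code) finds a bright circular blob and blends a canned high-detail moon texture over it:

```python
import cv2
import numpy as np

def fake_moon_mode(photo_bgr, stored_moon_bgr, blend=0.7):
    """Toy sketch of the *alleged* trick: find a bright circular
    blob in a dark frame, then blend a stored high-detail moon
    texture over it. Purely illustrative -- not Huawei's code.
    """
    gray = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2GRAY)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=200, param1=100, param2=30,
                               minRadius=20, maxRadius=400)
    if circles is None:
        return photo_bgr  # no moon-like blob, leave the shot alone

    x, y, r = np.round(circles[0, 0]).astype(int)
    # Resize the canned moon to the detected size and blend it in.
    patch = cv2.resize(stored_moon_bgr, (2 * r, 2 * r))
    roi = photo_bgr[y - r:y + r, x - r:x + r]
    if roi.shape[:2] != patch.shape[:2]:
        return photo_bgr  # detection too close to the edge; skip
    photo_bgr[y - r:y + r, x - r:x + r] = cv2.addWeighted(
        roi, 1 - blend, patch, blend, 0)
    return photo_bgr
```

The unsettling part is how little code it takes: once a phone can recognise what you’re pointing at, replacing detail with stored detail is the easy bit.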

As phones develop, we’re going to see more examples of this, the most recent being Galaxy AI in the S24. It raises the question of what is real and what is not. The same concern applies to deepfakes, where AI generates images and videos of people doing things they’ve never done. I believe these companies have to make a conscious effort to draw the line as clearly as possible between reality and fiction. One example of such an effort is the watermark automatically added to the bottom of pictures edited with AI on the S24. Efforts like these could mean the difference between a world where we have control over these fakes and one where we don’t.

Watermark on AI-edited images on the S24
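Stamping that kind of marker is simple in principle. Here’s a minimal sketch using Pillow that draws a visible label in a corner; note this is just my illustration of the general idea (filenames and label text included), not Samsung’s actual implementation, which reportedly also records the edit in the image’s metadata:

```python
from PIL import Image, ImageDraw

def stamp_ai_label(path_in, path_out, label="AI generated"):
    """Draw a small visible label near the bottom-left corner to
    flag an AI-edited image. Illustrative only -- a real scheme
    would also write provenance into the file's metadata, since
    a visible stamp can be cropped out in seconds.
    """
    img = Image.open(path_in).convert("RGB")
    draw = ImageDraw.Draw(img)
    w, h = img.size
    margin = max(8, w // 100)
    # Pillow's default bitmap font keeps this dependency-free.
    draw.text((margin, h - margin - 12), label, fill=(255, 255, 255))
    img.save(path_out)

stamp_ai_label("edited.jpg", "edited_marked.jpg")  # hypothetical filenames
```

And that crop-ability is exactly why the metadata side of these schemes matters more in the long run than the visible stamp itself.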

Thanks for reading and see you next week.