Why Every Photo You Take Is “Fake” – Review Geek


[Image: Someone taking a photo with a Samsung Galaxy S23 Ultra smartphone. Credit: Justin Duino / Review Geek]

Smartphones are under fire for “faking” or “cheating” their way to high-quality photos. But every photo in existence contains some fakery, and that’s not a bad thing.

Artificial intelligence has invaded your smartphone camera with a singular goal—to ruin your photos and fill your head with lies! At least, that’s the idea you might see in some headlines. Smartphone camera technology is advancing rapidly, leading to some confusion about what’s “real” and “fake.”

Well, I’ve got good news: every photo in existence is “fake.” It doesn’t matter if it was shot on a smartphone from 2023 or a film camera from 1923. There is always some trickery going on behind the scenes.

The Physical Constraints of Phone Cameras

If you stuck a full-sized camera lens on a phone, it would be a monstrosity. Smartphones need to be small, compact, and somewhat durable, so they tend to utilize incredibly small camera sensors and lenses.

This teensy-weensy hardware creates several physical constraints. While a smartphone may have a 50MP sensor, the sensor’s size is actually quite small, meaning that less light can reach each pixel. This leads to reduced low-light performance and can introduce noise to an image.
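
If you want rough numbers on that, here’s a quick back-of-the-envelope sketch in Python. The sensor dimensions are assumptions for illustration (roughly a 1/1.3-inch phone sensor versus a 24MP full-frame camera), not the specs of any particular device:

```python
# Rough, illustrative comparison of light-gathering area per pixel.
# Sensor dimensions are approximate assumptions, not real product specs.

def pixel_area_um2(sensor_w_mm, sensor_h_mm, megapixels):
    """Approximate area of a single pixel in square micrometers."""
    sensor_area_um2 = (sensor_w_mm * 1000) * (sensor_h_mm * 1000)
    return sensor_area_um2 / (megapixels * 1_000_000)

phone = pixel_area_um2(9.8, 7.3, 50)          # assumed ~1/1.3" phone sensor, 50MP
full_frame = pixel_area_um2(36.0, 24.0, 24)   # 24MP full-frame camera

print(f"Phone pixel:      {phone:.2f} square microns")
print(f"Full-frame pixel: {full_frame:.2f} square microns")
print(f"Each full-frame pixel collects roughly {full_frame / phone:.0f}x more light")
```

With those made-up but realistic numbers, each pixel on the big sensor has around 25 times the light-collecting area.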

Lens size is also important. Tiny camera lenses can’t gather much light, so you end up with reduced dynamic range and, once again, reduced low-light performance. A tiny lens also means a physically small aperture, which can’t produce a shallow depth of field for background blur or “bokeh” effects.
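
The depth-of-field problem can be sketched with the usual crop-factor rule of thumb: multiply the lens’s f-number by the crop factor to get the roughly equivalent aperture on full frame. The phone sensor size below is, again, an assumption for illustration:

```python
import math

# "Equivalent aperture" sketch: depth of field from a small sensor at f/1.8
# behaves more like a much slower aperture on a full-frame camera.
# The phone sensor size is an assumed, approximate value.

def diagonal_mm(width_mm, height_mm):
    return math.hypot(width_mm, height_mm)

FULL_FRAME_DIAGONAL = diagonal_mm(36.0, 24.0)   # ~43.3 mm
phone_diagonal = diagonal_mm(9.8, 7.3)          # assumed ~1/1.3" sensor

crop_factor = FULL_FRAME_DIAGONAL / phone_diagonal
phone_f_number = 1.8                            # a typical bright phone lens

print(f"Crop factor: {crop_factor:.1f}")
print(f"f/{phone_f_number} on the phone blurs backgrounds about like "
      f"f/{phone_f_number * crop_factor:.1f} on full frame")
```

That’s why a phone shooting at f/1.8 still keeps almost the entire scene in focus.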

At a physical level, smartphones cannot take high-quality photos. Advancements in sensor and lens technology have greatly improved the quality of smartphone cameras, but the best smartphone cameras come from brands that utilize “computational photography.”

Phone Cameras Use Software to “Cheat”


The best smartphone cameras come from Apple, Google, and Samsung—three leaders in software development. This is no coincidence. In order to push past the hardware constraints of smartphone cameras, these brands use “computational photography” to process and enhance photos.

Smartphones use multiple computational photography techniques to produce a high-quality image. Some of these techniques are predictable; a phone will automatically adjust the color and white balance of a photo, or it may “beautify” a subject by sharpening and brightening their face.
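
To give a feel for the “predictable” end of that spectrum, here’s a minimal white-balance sketch using the classic gray-world assumption. Real phone pipelines are far more sophisticated; this only shows the kind of per-channel adjustment that happens automatically:

```python
import numpy as np

def gray_world_white_balance(image):
    """image: float array of shape (H, W, 3) with values in [0, 1]."""
    channel_means = image.reshape(-1, 3).mean(axis=0)
    target = channel_means.mean()
    gains = target / channel_means        # boost channels that average too dark
    return np.clip(image * gains, 0.0, 1.0)

# Example: a warm, orange-tinted frame gets pulled back toward neutral gray.
frame = np.random.rand(4, 4, 3) * np.array([1.0, 0.8, 0.6])
balanced = gray_world_white_balance(frame)
```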

But the most advanced computational photography techniques go beyond simple image editing.

Take “stacking,” for instance. When you press the shutter button on your phone, it takes multiple images within the span of a few milliseconds. Each image is made with slightly different settings—some are blurry, some are overexposed, and some are zoomed in. All of these photos are combined to produce an image with a high dynamic range, strong colors, and minimal motion blur.
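
Here’s a toy version of the stacking idea: merge a burst of differently exposed frames by weighting each pixel toward well-exposed values. Real pipelines also align the frames and handle motion, which this sketch skips entirely:

```python
import numpy as np

def stack_frames(frames, sigma=0.2):
    """frames: list of float arrays (H, W, 3) in [0, 1] of the same scene."""
    stack = np.stack(frames)                               # (N, H, W, 3)
    # Weight each pixel by how close it is to mid-gray (i.e., well exposed).
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)

# Example: simulated underexposed and overexposed captures of one scene
# merge into a single image that keeps usable detail from both.
scene = np.random.rand(4, 4, 3)
dark = np.clip(scene * 0.3, 0.0, 1.0)
bright = np.clip(scene * 1.8, 0.0, 1.0)
merged = stack_frames([dark, bright])
```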

An example of night photography on the iPhone 11.
Apple

Stacking is the key concept behind HDR photography, and it’s the starting point for a large number of computational photography algorithms. Night mode, for example, uses stacking to produce a bright nighttime image without a long exposure time (which would introduce motion blur and other problems).
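
Night mode leans on a related statistical trick: random sensor noise shrinks by roughly the square root of the number of frames you average. A minimal sketch, with a made-up scene and noise level:

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.1)     # a dim, uniform "scene" (made up)
noise_sigma = 0.05                 # made-up sensor noise level

def noisy_exposure():
    """One short, noisy capture of the scene."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

single = noisy_exposure()
stacked = np.mean([noisy_exposure() for _ in range(16)], axis=0)

print(f"Noise in a single frame:      {single.std():.4f}")
print(f"Noise after a 16-frame stack: {stacked.std():.4f}  (roughly 4x lower)")
```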

And, as I mentioned earlier, smartphone cameras cannot produce a shallow depth of field. To get around this problem, most smartphones offer a portrait mode that uses software to estimate depth. The results are pretty hit or miss, especially if you have long or frizzy hair, but it’s better than nothing.
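
Conceptually, portrait mode boils down to “blur whatever the depth estimate says is background.” The sketch below fakes the depth map and uses a crude box blur; on a real phone the depth estimate comes from machine-learning models and/or dual-pixel data:

```python
import numpy as np

def box_blur(image, radius=3):
    """Crude box blur: average each pixel with its neighbors (wraps at edges)."""
    out = np.zeros_like(image)
    taps = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += np.roll(np.roll(image, dy, axis=0), dx, axis=1)
            taps += 1
    return out / taps

def synthetic_bokeh(image, depth, subject_depth=0.3):
    """image: (H, W, 3) floats; depth: (H, W) floats where larger = farther."""
    blurred = box_blur(image)
    background = (depth > subject_depth)[..., None]        # (H, W, 1) mask
    return np.where(background, blurred, image)

# Example with a fabricated image and a fabricated depth map that gets
# "farther away" toward the right edge of the frame.
height, width = 64, 64
image = np.random.rand(height, width, 3)
depth = np.tile(np.linspace(0.0, 1.0, width), (height, 1))
portrait = synthetic_bokeh(image, depth)
```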

Some people believe that computational photography is “cheating,” as it misrepresents your smartphone camera’s capabilities and produces an “unrealistic” image. I’m not sure why this would be a serious concern. Computational photography is imperfect, but it allows you to take high-quality photos using low-quality hardware. In many cases, this brings you closer to a “realistic” and “natural” image with a sense of depth and dynamic range.

The best example of this “cheating” is Samsung’s “moon controversy.” To advertise the Galaxy S22 Ultra’s zoom capabilities, Samsung decided to create a lunar photography algorithm. Basically, it’s an AI that makes crappy pictures of the moon look slightly less crappy by adding details that don’t exist in the original image. It’s a useless feature, but if you need to take a photo of the moon with a camera that’s smaller than a penny, I’d reckon that some “cheating” is necessary.

That said, I am concerned by the misleading ways that some companies market their computational photography tools. And my biggest gripe is the “shot on iPhone” or “shot on Pixel” nonsense that phone makers peddle each year. These advertisements are made with million-dollar budgets, big fat add-on lenses, and professional editing. The idea that you could reproduce one of these advertisements with nothing but a smartphone is a stretch, if not an outright lie.

This Is Nothing New

[Image: A very broken camera.]

Some people are unhappy with computational photography. They argue that it misrepresents reality, and therefore, it must be bad! Cameras should give you the exact image that enters the camera’s lens—anything else is a lie!

Here’s the thing: every photograph contains some level of “fakery.” It doesn’t matter if the photo was shot on a phone, a DSLR camera, or a film camera.

Let’s look at the film photography process. Camera film is coated with a photosensitive emulsion. When the camera shutter opens, this emulsion is exposed to light, leaving an invisible chemical trace of an image. The film is then run through a series of chemical baths to produce a permanent negative, which is projected onto emulsion-coated paper to create a printed image (the photo paper needs its own chemical wash, but that’s the gist of it).

Every step in this process affects how an image looks. One brand of film may oversaturate reds and greens, while another brand may have a dull appearance. Darkroom chemicals may alter an image’s color or white balance. And printing an image to photo paper introduces even more variables, which is why many film labs use a reference sheet (or a computer) to dial in color and exposure.

Most people who owned a film camera were not professional photographers. They had no control over the printing process, and they certainly didn’t choose the chemical composition of their film. Doesn’t that sound familiar? Film manufacturers and photo labs were the “computational photography” of their day.

But what about modern DSLR and mirrorless cameras? Well, I’m sorry to say, but all digital cameras perform some photo processing. They may correct an image for lens distortion or reduce noise. But the most common form of processing is the conversion to a compressed JPEG, which bakes in color and white balance decisions and throws away data (an 8-bit JPEG can only represent about 16.7 million colors, far less information than the sensor actually recorded). Some cameras allow you to save RAW image files, which are minimally processed but tend to look “flat” or “dull” without professional editing.
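
To see how much an 8-bit JPEG discards, here’s a small sketch that squeezes 12-bit sensor values (4,096 levels per channel) through a simple tone curve into the 256 levels a JPEG channel can hold. The numbers and the tone curve are illustrative, not any camera’s actual pipeline:

```python
import numpy as np

def raw_to_jpeg_levels(raw_12bit, gamma=2.2):
    """raw_12bit: integer array of linear sensor values in [0, 4095]."""
    linear = raw_12bit / 4095.0
    tone_mapped = linear ** (1.0 / gamma)        # a stand-in "tone curve"
    return np.round(tone_mapped * 255).astype(np.uint8)

raw = np.arange(0, 4096, dtype=np.uint16)        # every possible 12-bit value
jpeg = raw_to_jpeg_levels(raw)
print(f"Distinct 12-bit input levels: {len(np.unique(raw))}")    # 4096
print(f"Distinct 8-bit output levels: {len(np.unique(jpeg))}")   # at most 256
```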

All Photos Are “Fake,” and It’s Not a Big Deal

[Image: Person using the 100x zoom on the Samsung Galaxy S23 Ultra. Credit: Justin Duino / Review Geek]

Reality is an important part of photography. Sometimes we want a photograph that accurately represents a moment in time, flaws and all. But more often than not, we ask our cameras to capture a good image, even in unfavorable circumstances—we ask for fakery.

This fakery requires technological advancements beyond the camera lens. And computational photography, despite its imperfections and marketing spin, is the technology we need right now.

That said, companies like Google, Apple, and Samsung need to be more transparent with their customers. We’re constantly bombarded by advertisements that stretch the truth, leading many people to believe that smartphones are comparable to full-sized or professional-grade cameras. This simply isn’t true, and until customers understand what’s going on, they’re going to keep getting mad about computational photography.
