Review: The iPhone 8 is a look into the augmented future of photography

The best reason to buy a new iPhone

The camera is the best reason to buy a new iPhone this year, just as it has been for several years running. The iPhone is the world’s most popular camera, by far, and Apple continues to take seriously the business of improving it.

As I’ve mentioned before in these reviews, I have a lengthy history in photography. I’ve been a working pro photographer, and I’ve sold cameras, performed maintenance and run a print lab. This has helped me to recognize the difference between how seriously Apple takes photography and the way that other companies approach “making the camera better.”

There are other smartphones that take excellent pictures; Samsung’s Galaxy S8+, Apple’s most direct competitor in terms of hardware, is among them. However, once you move beyond the basics of increasing resolution, basic optimization and adding catch-up computational features like faux blur, you begin to realize that there’s not a smartphone company on earth that takes it as far as Apple does. It’s just not comparable once you get into the nitty gritty. Here are a few examples you’ll find in the iPhone 8.

Sensor improvements. Same resolution, 12MP, but a bigger sensor overall. This is a great recipe for improved image quality. I’m a big fan of concentrating on the size of the individual photosites, which translates into larger individual pixels, and of making the walls between them deeper, both of which Apple has done here. Those deeper pixel wells give better isolation between capture elements, so you don’t get the speckle that results in color confusion between two adjacent pixels.

There’s also a new color filter. Given that digital camera color filters haven’t changed much in years, I was curious about the details here, but couldn’t learn much more than that it should result in improved dynamic range and color.

High dynamic range (HDR) shooting has also been massively improved, to the point where there is no longer even a toggle for it. You just shoot, and if the camera thinks your picture will benefit from an expansion of tones into the darks and lights, it will use it. And there’s such a tiny lag between the images used to composite together an HDR shot that you’ll find very little of the ghosting that happened under the previous system. You’ll notice that you no longer get two shots — one HDR and one not. That’s how confident Apple is. It’s really well done — seamless even.
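If you’re curious what that compositing looks like in the most stripped-down form, here’s a toy sketch of exposure fusion in Swift. It’s my illustration, not Apple’s pipeline: blend bracketed frames, weighting each pixel by how well-exposed it is, so shadows get filled from the brighter frame and highlights from the darker one.

```swift
import Foundation

// Toy exposure fusion over per-pixel luminance values in 0...1.
// The weighting function and sigma are made up for illustration.
func fuseExposures(under: [Double], normal: [Double], over: [Double]) -> [Double] {
    // Weight peaks at mid-gray (0.5) and falls off toward pure black or white.
    func weight(_ v: Double) -> Double {
        let sigma = 0.2
        return exp(-pow(v - 0.5, 2) / (2 * sigma * sigma))
    }

    var fused: [Double] = []
    for i in under.indices {
        let (u, n, o) = (under[i], normal[i], over[i])
        let (wu, wn, wo) = (weight(u), weight(n), weight(o))
        let total = wu + wn + wo
        // Fall back to the normal frame if a pixel is badly exposed in every bracket.
        fused.append(total > 0 ? (u * wu + n * wn + o * wo) / total : n)
    }
    return fused
}
```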

The wide and telephoto sensors in the 8 Plus have both been updated, and the system has 83 percent more throughput, which allows for more data to be passed through in a more power-efficient manner. This will help with rapid fire images, but more importantly it allows for the enormous amount of information that needs to be pushed through the pipeline to support 4K video at 60 frames per second and super slo-mo 1080p at 240 FPS.
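To put those video modes in perspective, here’s a quick back-of-the-envelope calculation — my arithmetic, not Apple’s spec sheet. 4K at 60 FPS and 1080p at 240 FPS both push roughly half a billion raw pixels per second through the pipeline, about double what last year’s top 4K mode required.

```swift
// Raw pixel throughput, ignoring bit depth and compression.
let fourK60   = 3840 * 2160 * 60    // 497,664,000 pixels/second
let slowMo240 = 1920 * 1080 * 240   // 497,664,000 pixels/second
let fourK30   = 3840 * 2160 * 30    // 248,832,000 pixels/second (last year's ceiling)
print(fourK60, slowMo240, fourK30)
```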

That’s helped along by Apple’s adoption of the HEVC video format, which is enormously effective at reducing file sizes, but it’s still crazy impressive. Especially given that Apple is not playing tricks with video quality, and that this still leaves plenty of overhead for the computer vision smarts it uses on video to determine subject matter.
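For what it’s worth, the codec switch is something third-party capture apps have to ask for explicitly. Here’s a minimal sketch, assuming a file URL you supply yourself, of opting into HEVC with AVFoundation on iOS 11; the built-in camera handles this for you.

```swift
import AVFoundation

// Minimal sketch: write 4K video as HEVC (H.265) instead of H.264.
func makeHEVCWriter(to url: URL) throws -> AVAssetWriter {
    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevc,  // the new, smaller format
        AVVideoWidthKey: 3840,
        AVVideoHeightKey: 2160
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    if writer.canAdd(input) { writer.add(input) }
    return writer
}
```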

The results are better color with a wider range of tones across all kinds of shooting environments. Apple is particularly fond of its work capturing skies with the iPhone 8 and 8 Plus, because skies are chock full of light across the spectrum. That holds up in my testing — less banding, more gradations of tone that reproduce more accurately.

Textures when shooting cloth or other fine detail up close are also improved, with less chance of muddy or moiré-ridden images when the patterns are super regular. This is reflected in the 4K video modes as well, which have better color rendition across dark and light scenes and less artifacting.

Though you don’t get it at a full 240 FPS yet, it may be interesting to some slow-motion videographers that you get continuous autofocus at 1080p 120 FPS now. This should help track subjects during a slo-mo shot as they move toward or away from you.

Similarly, skin tones have been improved, with less heavy-handed smoothing as seen in the iPhone 7. This is a result of sensor improvements, but also of Apple applying deep learning and intelligence to the process of determining subjects and optimizing exposure. More on that later.

Hardware-accelerated noise reduction. I know this is going to be a bit in the weeds for some folks, but I’m really excited about this one. Noise reduction (NR) is the process that every digital camera system uses to remove the multi-colored speckle that’s a typical byproduct of a (relatively) tiny sensor, heat and the analog-to-digital conversion process. Most people just call this “grain.”

In previous iPhones this was done purely by software. Now it’s being done directly by the hardware. I’d always found Apple’s NR to be too “painterly” in its effect. The aggressive way that they chose to reduce noise created an overall “softening,” especially noticeable in photos with fine detail when cropped or zoomed.

Here’s the bit where I whined about the way Apple handled reducing noise in the iPhone 7:

I’m still not completely happy with how much noise reduction Apple’s image signal processor (ISP) applies to pictures, but I make this statement fully aware that this is not something most folks will notice.

It makes some sense that the NR would be more aggressive because most people want less ‘grain’ or pixel noise in their images. But it still results, I feel, in a little loss of sharpness in low-light situations. To be clear, this remains basically unchanged from the way that I feel about the way the ISP was tuned in the iPhone 6. Apple has made some insane improvements in the camera this time around, but I hope it does pay some attention to how they reduce noise and tweak that in the future.

Well, tweak it they have. Noise reduction is no longer a software-only feature. It’s hardware-accelerated, multi-band noise reduction done by the image signal processor (ISP) that Apple continues to improve. The result is reduced noise, but with a sharper, crisper feel that doesn’t feature the blotchy byproduct of the previous process. It’s a solid improvement everyone will benefit from, whether they realize it or not.
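To give a feel for what “multi-band” means without pretending to know what the ISP actually does, here’s a toy one-dimensional sketch: split the signal into a smooth base band and a detail band, throw away only the small detail values that look like noise, and keep the big ones that look like real edges.

```swift
// Toy multi-band noise reduction on a 1-D signal; radius and threshold are made up.
func denoise(_ samples: [Double], radius: Int = 2, threshold: Double = 0.02) -> [Double] {
    // Low band: a simple box blur.
    let base = samples.indices.map { i -> Double in
        let lo = max(0, i - radius), hi = min(samples.count - 1, i + radius)
        let window = samples[lo...hi]
        return window.reduce(0, +) / Double(window.count)
    }
    // High band: whatever is left after removing the base.
    return samples.indices.map { i -> Double in
        let detail = samples[i] - base[i]
        // Small detail is treated as noise and dropped; large detail (edges) survives.
        return base[i] + (abs(detail) < threshold ? 0 : detail)
    }
}
```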

‘Zero’ shutter lag

For a while now, iPhones have been keeping a few images in memory before you even press the shutter button. These are recorded but thrown away nearly instantly. In optimal conditions, the frame recorded just before you pressed the shutter is the one you actually get.

This helps compensate for your normal human lag in pressing the button when you see something you want to take a shot of, and the lag of the system itself. It makes the picture-taking process feel instantaneous.
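In code terms, you can think of it as a tiny ring buffer of recent frames. This is my sketch of the concept, with made-up names and sizes, not anything from Apple:

```swift
// Conceptual sketch: the camera pushes frames continuously, and a shutter
// press simply grabs the most recent one that's already in memory.
struct FrameBuffer<Frame> {
    private var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int = 4) { self.capacity = capacity }

    // Called for every frame while the viewfinder is live.
    mutating func push(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity { frames.removeFirst() }
    }

    // Called when the user taps the shutter: the moment was already captured,
    // so there's effectively zero perceived lag.
    func frameAtShutterPress() -> Frame? {
        return frames.last
    }
}
```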

That buffer has gotten a bump in size in the iPhone 8, and Apple is now applying deep learning to optimize the process for the right time to shoot, the subject matter and other bits of intelligence.

The results are difficult to determine without a huge set of examples, because there are a lot of variables on any given shot. But, anecdotally, it does feel like the pictures fire off faster. Verdict: I believe them, but I need more time to feel this one out.

Flash

One of the main reasons you hate flash pictures is that they tend to pop the subject with tons of light and reduce the background to a blackish-gray nothing. It kills ambiance and mood, and as phone cameras have gotten better at low-light photography, the flash has been reduced to documenting that weird growth you text your friends and, hopefully, a medical professional.

But pro photographers use flash all the time. Mainly because they have control over the shutter speed as well as the flash. This allows them to choose to leave the shutter open after the flash fires, filling in the background with more light and balancing the exposure.

Now, Apple does this automatically for you. If you take a flash picture of a person or thing and there’s enough light available to “fill” behind the subject, the iPhone will automatically drag the shutter, leaving it open longer. It does this using intelligence around the subject matter, distance, ambient exposure and more, all in the time it takes to pop off the shot.
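Roughly, and with made-up numbers rather than anything from Apple’s pipeline, the decision looks like this: if the dim background needs a longer exposure than the flash-lit subject, keep the shutter open for the longer of the two, capped at something you can hold steady.

```swift
// Crude slow-sync ("drag the shutter") sketch; all values are illustrative.
func shutterDuration(flashExposure: Double, ambientExposure: Double,
                     maxHandheld: Double = 1.0 / 8.0) -> Double {
    // Use the longer exposure so the background fills in, but never drag
    // past what can be handheld without visible blur.
    return min(max(flashExposure, ambientExposure), maxHandheld)
}

// Flash correctly exposes the subject at 1/60s, but the background needs 1/15s:
// the camera drags the shutter to 1/15s instead of cutting off at 1/60s.
let chosen = shutterDuration(flashExposure: 1.0 / 60.0, ambientExposure: 1.0 / 15.0)
```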

The beauty of all of the examples I mentioned above? Every single bit of it is as accessible to the harried parent who wants the best shot of their kid on the first day of kindergarten as it is to a pro photographer. Yes, Apple has a silicon team that’s beyond ridiculous. Yes, it’s at the cutting edge of mobile computational photography and the application of deep learning to photos. But you literally don’t have to give a flying animated poop emoji about it to get the benefit.

That is the key Apple innovation: It doesn’t matter whether you read, liked or understood what I just wrote above — you’re still going to come away with incredible pictures.
