Can Apple’s iPhone XS camera catch up to Google’s Pixel?
As ever, the camera system is one of the most important features in the new iPhones. Apple’s SVP of product marketing Phil Schiller played his usual role at the iPhone announcement, breaking down the camera technology in depth and explaining the improvements. This year he went as far as to pronounce that the iPhone XS will herald “a new era of photography.”
But the days when Apple held a major lead over every Android manufacturer are long gone. Google’s innovative approach to computational photography has made its Pixel line the phones to beat in terms of pure image quality, while competitors like Samsung’s Galaxy S9 Plus and Huawei’s P20 Pro have strong claims to supremacy in areas like low-light performance. The iPhone 8 and X have good cameras, to be sure, but it’s hard to make the case that they’re the best. Can the iPhone XS catch up?
The biggest hardware upgrade this year is the new, larger 12-megapixel sensor. Last year Apple said the 8 and X sensor was “larger” than the 7’s, but teardowns revealed that this wasn’t meaningfully true: the lens’s field of view and focal length didn’t change, which implies the sensor’s dimensions didn’t either. This time, though, Apple is citing a specific increase in pixel size, which should indeed make a difference.
The iPhone XS’s main camera has 1.4µm pixels, up from 1.22µm in the iPhone X and on par with the Google Pixel 2. The bigger the pixels, the more light they can collect, which means more information to work with when constructing a photo. This is Apple’s first pixel-size increase since the iPhone 5S; pixels shrank to 1.22µm when the company moved to 12 megapixels with the 6S, so it could represent a major upgrade.
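To put rough numbers on that: a pixel’s light-gathering area grows with the square of its pitch, so the jump from 1.22µm to 1.4µm is bigger than it sounds. A quick back-of-the-envelope sketch in Swift (the pixel sizes are Apple’s published figures; the calculation itself is purely illustrative):

```swift
import Foundation

// Rough arithmetic: light-gathering area scales with the square of
// the pixel pitch (treating pixels as squares).
let oldPitch = 1.22  // µm, iPhone X
let newPitch = 1.40  // µm, iPhone XS

let areaGain = (newPitch * newPitch) / (oldPitch * oldPitch)
print(String(format: "Per-pixel area gain: %.0f%%", (areaGain - 1) * 100))
// Prints "Per-pixel area gain: 32%" — roughly a third more light
// collected per pixel, all else being equal.
```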
Otherwise, the hardware is much the same as the X’s. There’s still a six-element f/1.8 lens and a secondary f/2.4 telephoto module on both the XS and XS Max, though the optics have likely been redesigned to suit the new sensor. (The cheaper iPhone XR has the same primary camera but no telephoto lens.) Apple also says the True Tone flash is improved, without providing details. And the selfie camera is unchanged beyond “all-new video stabilization.”
But as Schiller said on stage, hardware is only part of the story with camera technology today. Computational photography techniques and software design are at least as important when it comes to getting great photos out of tiny phone optics.
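To make “computational photography” concrete: one of the simplest techniques is merging several short exposures so that random sensor noise averages out while the scene stays put. The sketch below is purely illustrative — `mergeFrames` is a hypothetical helper, not Apple’s pipeline, which layers alignment, tone mapping, and machine learning on top of ideas like this:

```swift
// Illustrative multi-frame noise reduction: average N aligned
// exposures. Random noise shrinks by roughly √N while the scene
// signal is preserved. Frames are modeled as flat luminance arrays
// of equal length; a real pipeline works on aligned sensor data.
func mergeFrames(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    var merged = [Double](repeating: 0.0, count: first.count)
    for frame in frames {
        for (i, value) in frame.enumerated() {
            merged[i] += value
        }
    }
    return merged.map { $0 / Double(frames.count) }
}

// Example: four noisy readings of the same two-pixel scene.
let burst = [[0.50, 0.82], [0.46, 0.80], [0.53, 0.78], [0.51, 0.84]]
print(mergeFrames(burst))  // ≈ [0.5, 0.81]
```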
The A12 Bionic chip in the iPhone XS is designed with exactly this in mind. This year Apple has directly connected the image signal processor to the Neural Engine, the company’s term for the part of the chip dedicated to machine learning and AI, and that connection looks to be a big focus of the A12. It will be the world’s first 7-nanometer processor to ship in a smartphone, which should allow for more efficient performance, yet Apple is citing conventional speed increases of only up to 15 percent over the A11. The far more dramatic gains claimed for machine learning operations suggest that Apple has used the opportunity of a 7nm design to focus on ramping up the Neural Engine.