Wow! Here's an oddly compelling reason to pick a Pixel 10 over an iPhone 17

100x zoom image of a blue car
(Image credit: Google Pixel)

It's been a long time since I've been excited by the possibilities offered by a new camera phone. Because let's face it, whether you're using the cheapest or most expensive smartphone on the market, you're ultimately still going to be shooting the same scenes. Which is why, when I first heard about the Google Pixel 10's promise of 100x zoom, I was genuinely thrilled. Could this open up a whole new class of things to photograph?

Living on the south-west coast of England in Weston-super-Mare, I'm blessed with a daily coastal walk that not only surrounds me with natural beauty, but also offers many photographic opportunities. I'll be honest, though: after 20-odd years of making the same walk, I do occasionally feel a bit jaded.

Yes, the variable British weather keeps things interesting: the same scene shot in sweltering sunshine can look like another world under a blanket of snow and ice. But most of the time, I'll still end up focusing on the same things. The pier. The mudflats at low tide. The occasional donkey looking a little Eeyore-like as it tramps across the beach, wide-eyed five-year-old on its back. 

In my mind, then, the 100x zoom on the Pixel 10 Pro and 10 Pro XL sounded like a game-changer. Suddenly, that cargo ship on the horizon wouldn't be just a distant speck; it'd become detailed maritime architecture. Those hazy suggestions of Welsh hills across the Bristol Channel could become genuine subjects. Those kitesurfing kites, high in the sky, might suddenly become lens-accessible.

It all sounded, in short, a bit like a superpower: giving you the ability to collapse distance and bring the far-off world into close-up focus.

But it turns out there's a bit of a catch.

Opening up new worlds

On closer inspection, the Pixel 10's 100x zoom isn't quite what you think; it's substantially enhanced by AI. This isn't simple digital zoom where pixels are stretched until they scream. Google's algorithms are actively inventing details, using generative AI to fill gaps the lens cannot capture.

Which creates something of a philosophical quandary. Would I really be creating a photograph? Or would it be more like a digital painting: one where somebody else is holding the brush? Would I have captured a distant ship, or just asked for Google's best guess at what a ship should look like?

Admittedly, so far reviews suggest this AI enhancement works remarkably well. Text becomes legible, animal fur gains texture, architectural details emerge from the mush. It's all technically impressive, undeniably useful… and completely unsettling if you care about authenticity.

(Image credit: Google)

Yes, photography has always involved technical intervention. But that has traditionally meant enhancing existing visual information, not conjuring it up from scratch via algorithms. Ones that, lest we forget, have a tendency to hallucinate mad things, from made-up facts to hands with six fingers.

Yet despite these concerns, I'm still excited. Ultimately, even if these pictures end up being weird or terrible, at least that'll be interesting in its own way.

So I'm giving Google points for trying something genuinely new with the Pixel 10. And whether or not the company's algorithmic optimism matches the messy reality of shooting distant subjects from a windswept Somerset coast, it should at least be fun finding out.

Tom May

Tom May is a freelance writer and editor specializing in art, photography, design and travel. He has been editor of Professional Photography magazine, associate editor at Creative Bloq, and deputy editor at net magazine. He has also worked for a wide range of mainstream titles including The Sun, Radio Times, NME, T3, Heat, Company and Bella.
