The latest Google Pixel 4 leak shows that the camera is packed with new features, including a new motion blur mode
(Image credit: Future / Google)
As has become tradition, Google Pixel 4 leaks continue to come thick and fast ahead of the phone's release next month. And the latest details reveal that the camera phone will pack some exciting new features such as motion blur mode, audio zoom, dynamic depth data and Live HDR.
The new modes show Google's continued innovation when it comes to phone photography, particularly in the software-driven side of imaging where the battle is increasingly being fought.
Indeed, while the iPhone 11 finally boasts more than two cameras, it offers little else in the way of new imaging features. The Google Pixel 4, by contrast, looks like it will break new ground – both in transplanting traditional photography techniques and in software-driven effects.
The headline feature will, according to a code dissection of the leaked Google Camera 7.0 by XDA Developers (thanks, DP Review), be Motion Mode. "It’ll supposedly let you take shots of moving subjects in the foreground while blurring the background, perfect for photos of sporting events," says the site – an effect that apes the panning technique used with traditional cameras.
There's a little more information about the previously leaked Google Pixel 4 astrophotography mode, too. "Google will be using the GPU (the Adreno 640 in the Qualcomm Snapdragon 855) to accelerate segmentation of the sky and then optimize the image by 'finding' the stars and brightening them."
A Live HDR mode was also revealed, which "could be used to apply HDR in real-time to the camera viewfinder, and it may also be used to automatically retouch photos milliseconds after taking them."
It looks like the Pixel 4 will be introducing the audio zoom feature made popular by HTC (which stands to reason, as Google acquired some of HTC's technology and engineers), as well as support for a new file format called DDF (Dynamic Depth Format).
These files record and retain the depth data for photographs, enabling apps to access and manipulate the data without affecting the original file – so, for example, depth of field could be adjusted after the fact by other software.
It all gets us incredibly excited for the Google Pixel 4's arrival in October – and reminds us just how unambitious Apple was with the iPhone 11's photography and imaging features.
The editor of Digital Camera World, James has 21 years' experience as a journalist and started working in the photographic industry in 2014 (as an assistant to Damian McGillicuddy, who succeeded David Bailey as Principal Photographer for Olympus). In this time he has shot for clients like Aston Martin Racing, Elinchrom and L'Oréal, in addition to shooting campaigns and product testing for Olympus, and providing training for professionals. This has made him a go-to expert for camera and lens reviews, photo and lighting tutorials, as well as industry news, rumors and analysis for publications like Digital Camera Magazine, PhotoPlus: The Canon Magazine, N-Photo: The Nikon Magazine, Digital Photographer and Professional Imagemaker, as well as hosting workshops and talks at The Photography Show. He also serves as a judge for the Red Bull Illume Photo Contest. An Olympus and Canon shooter, he has a wealth of knowledge on cameras of all makes – and a fondness for vintage lenses and instant cameras.