The Pixel 10 and Pixel 11 camera capabilities have been leaked in detail, indicating major software and hardware upgrades are incoming.
The Google Pixel 10 series is still months away from launch and the Pixel 11 is even further out, but that hasn't stopped the rumour mill, which already has information about these devices. Specifically, the Pixel 10 and Pixel 11 camera capabilities have been leaked ahead of launch, and the details suggest there is a lot to look forward to.
According to a couple of reports from Android Authority, the Pixel 10 and Pixel 11 are set to bring major improvements in optics over their predecessors. Starting with the Pixel 11 camera capabilities, the report says the phone will draw power from the Google Tensor G6 SoC, whose image signal processor (ISP) will gain support for an under-display IR camera system. With a new "lite" front-end (one of the primary components of an ISP) designed specifically for this purpose, the chip will be able to handle such a system more efficiently and with lower power consumption.
This wouldn't be the first time Google has used an IR-based face unlocking system; it did so on the Pixel 4 but scrapped the hardware from the Pixel 5 onwards. If Google does employ this rumoured under-display IR camera system, it will be going head-to-head with Apple, which is also rumoured to launch phones with an under-display IR camera for Face ID in 2026.
Coming to the camera features of both the Pixel 10 and the Pixel 11, they'll be loaded with AI. The Pixel 10, which is likely to be powered by the Tensor G5, will be getting "Video Generative ML" features. The feature is described as follows: "Post-capture Generative AI-based Intuitive Video Editing for the Photos app."
While details are unclear, one could assume that Google will let users edit their videos after capture using AI that actually understands the footage. The feature might also be available in the YouTube app, specifically for YouTube Shorts.
Photo editing on the Pixel 10 and Pixel 11 could also get an upgrade, including features like “Speak-to-Tweak,” which seems to be an LLM-based editing tool, and “Sketch-to-Image,” which is self-explanatory and resembles a similar feature offered by Samsung’s Galaxy AI.
Google is also working on a "Magic Mirror" feature, details about which are still scarce. The Tensor G5 should also be capable of running Stable Diffusion-based models on-device, which could be used in the Pixel Studio app instead of the current server-based solution.
In addition, the Tensor G5 will finally add support for 4K 60fps HDR video, whereas previous models only allowed HDR video capture at up to 4K 30fps. Further, the Pixel 11 camera capabilities could also include 100x zoom through machine learning for both photos and videos. It will be enabled by a so-called "next-gen" telephoto camera, which suggests notable hardware upgrades are also possible.
Cinematic Blur is also getting an upgrade on the Pixel 11, now supporting 4K at 30fps along with a new “video relight” feature that adjusts lighting conditions in videos. These enhancements are powered by a “Cinematic Rendering Engine” within the chip’s image signal processor. Additionally, the new hardware block lowers the power consumption of recording videos with blur by nearly 40%.
The Pixel 11 will also introduce an "Ultra Low Light video" mode, also known as "Night Sight video." A similar feature exists today, but it relies on cloud processing; this version is entirely on-device. Google notes that the feature is optimized for lighting as low as 5-10 lux: very dim conditions akin to a softly lit room, a cloudy sky at dusk, or a candle about a meter away. This new capability will also rely on upgraded camera hardware to achieve optimal results.
Finally, the Pixel 11's Tensor G6 will get a "nanoTPU" that will enable a handful of always-on ML-based features on Pixel phones. The majority of these are health-related, such as detection of "agonal breathing, cough, snore, sneeze, and sleep apnea, fall detection, gait analysis, and sleep stages monitoring." There's also emergency sound event detection.
Some of the features are aimed at activity tracking, like "Running ML," a collection of tools for runners that includes "coachable pace" and "balance & oscillation" analysis. Lastly, the Pixel 11 might also bring support for more Quick phrases.