Pixel Camera is a camera phone application developed by Google for the Android operating system on Google Pixel devices. Development of the application began in 2011 at the
Google X research incubator led by
Marc Levoy, which was developing image fusion technology for
Google Glass. It was publicly released for
Android 4.4+ on the
Google Play
on April 16, 2014. The app was initially released as Google Camera and supported on all devices running Android 4.4 KitKat and higher. However, in October 2023, coinciding with the release of the
Pixel 8 series, it was renamed to Pixel Camera and became officially supported only on Google Pixel devices.
Features
Google Camera contains a number of features that can be activated either in the Settings page or on the row of icons at the top of the app.
Pixel Visual/Neural Core
Starting with the Pixel devices, the camera app has been aided by hardware accelerators, including an initially undocumented image processing chip, to perform its image processing. The first generation of Pixel phones used Qualcomm's Hexagon DSPs and Adreno GPUs to accelerate image processing. The
Pixel 2
and
Pixel 3 (but not the
Pixel 3a) include the
Pixel Visual Core to aid with image processing. The
Pixel 4 introduced the
Pixel Neural Core. The Visual Core's main purpose is to bring the HDR+ image processing that is characteristic of the Pixel camera to any other app that uses the relevant Google APIs. Pixel Visual Core is built to do heavy image processing while conserving energy, saving battery.
HDR+
Unlike earlier versions of
high dynamic range (HDR) imaging, HDR+, also known as HDR+ on, uses
computational photography
techniques to achieve higher dynamic range. HDR+ takes continuous burst shots with short exposures. When the shutter is pressed the last 5–15 frames are analysed to pick the sharpest shots (using
lucky imaging
), which are selectively aligned and combined with image averaging. HDR+ also uses semantic segmentation to detect faces, brightening them with a synthetic fill flash and darkening and denoising skies. HDR+ also reduces
shot noise
and improves colors, while avoiding
blowing out highlights and
motion blur
. HDR+ was introduced on the Nexus 6 and later backported to the Nexus 5.
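The select-and-average idea behind HDR+ can be sketched in a few lines of Python. This is an illustrative toy, not Google's implementation: the gradient-variance sharpness proxy, the function names, and the parameter values are assumptions, and the real pipeline also performs tile-based alignment before averaging.

```python
import numpy as np

def merge_burst(frames, keep=3):
    """Toy HDR+-style merge: score each short-exposure frame for
    sharpness (lucky imaging), keep the sharpest few, and average
    them to reduce shot noise. Alignment is omitted."""
    def sharpness(img):
        # Variance of a simple gradient magnitude as a sharpness proxy.
        gy, gx = np.gradient(img.astype(np.float64))
        return np.var(np.hypot(gx, gy))

    scored = sorted(frames, key=sharpness, reverse=True)
    # Averaging N frames reduces shot-noise std. dev. by ~1/sqrt(N).
    return np.mean(scored[:keep], axis=0)
```

Averaging five frames, for instance, cuts the noise standard deviation by roughly a factor of sqrt(5) while leaving the (aligned) scene content unchanged.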
HDR+ enhanced
Unlike HDR+/HDR+ On, 'HDR+ enhanced' mode does not use Zero Shutter Lag (ZSL). Like Night Sight, HDR+ enhanced features positive-shutter-lag (PSL): it captures images after the shutter is pressed. HDR+ enhanced is similar to HDR+ from the Nexus 5, Nexus 6,
Nexus 5X and
Nexus 6P. It is believed to use underexposed and overexposed frames like
Smart HDR from
Apple
. HDR+ enhanced captures increase the dynamic range compared to HDR+ on. HDR+ enhanced on the Pixel 3 uses the learning-based AWB algorithm from Night Sight.
Live HDR+
Starting with the Pixel 4, Live HDR+ replaced HDR+ on, featuring a WYSIWYG viewfinder with a real-time preview of HDR+.
Live HDR+ uses the learning-based AWB algorithm from Night Sight and averages up to nine underexposed frames.
Dual Exposure Controls
'Live HDR+' mode uses Dual Exposure Controls, with separate sliders for brightness (
capture exposure) and for shadows (
tone mapping
). This feature was made available on the Pixel 4, and has not been retrofitted to older Pixel devices due to hardware limitations.
With Bracketing
In April 2021, Google Camera v8.2 introduced HDR+ with Bracketing, Night Sight with Bracketing and Portrait Mode with Bracketing. Google updated their
exposure bracketing algorithm for HDR+ to include an additional long-exposure frame, and Night Sight to include three long-exposure frames. The spatial merge algorithm was also redesigned to decide per pixel whether to merge (as in Super Res Zoom) and updated to handle long exposures (clipped highlights, more motion blur and different noise characteristics). With Bracketing enables further reduced read noise, improved details and texture, and more natural colors. It is enabled automatically depending on the dynamic range and motion. With Bracketing is supported in all modes on the Pixel 4a (5G) and 5, and in Night Sight on the Pixel 4 and 4a.
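The per-pixel merge decision for bracketed exposures can be sketched as follows. A long exposure has cleaner shadows but blown highlights, so wherever the long frame is clipped the merge falls back to the (rescaled) short frame. The function, the clip threshold, and the single-frame fallback rule are illustrative assumptions, not the actual spatial merge algorithm.

```python
import numpy as np

def merge_bracketed(short_frame, long_frame, ratio=4.0, clip=0.98):
    """Toy bracketed merge on linear images in [0, 1]: `ratio` is how
    much longer the long frame was exposed. Per pixel, use the long
    frame (rescaled to the short frame's exposure) unless it clipped."""
    long_scaled = long_frame / ratio          # back to short-exposure scale
    clipped = long_frame >= clip              # highlights blown in long frame
    return np.where(clipped, short_frame, long_scaled)
```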
Motion Photos
Google Camera's Motion photo mode is similar to
HTC's Zoe and
iOS' Live Photo. When enabled, a short, silent video clip of relatively low resolution is paired with the original photo. If RAW is enabled, only a 0.8 MP DNG file is created, not the non-motion 12.2 MP DNG. Motion Photos was introduced on the Pixel 2. Motion Photo is disabled in HDR+ enhanced mode.
Video Stabilization
Fused Video Stabilization, a technique that combines
optical image stabilization and
electronic/digital image stabilization, can be enabled for significantly smoother video. This technique also corrects rolling shutter distortion and focus breathing, among various other problems. Fused Video Stabilization was introduced on the Pixel 2.
Super Res Zoom
Super Res Zoom is a
multi-frame super-resolution technique introduced with the Pixel 3 that shifts the image sensor to achieve higher resolution, which Google claims is equivalent to 2-3x optical zoom. It is similar to drizzle image processing. Super Res Zoom can also be used with a telephoto lens; for example, Google claims the Pixel 4 can capture 8x zoom at near-optical quality.
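The drizzle-like idea can be sketched with known half-pixel shifts: each low-resolution frame's samples land on a different phase of a 2x output grid. This is a deliberately simplified sketch; real Super Res Zoom estimates sub-pixel motion from natural hand shake and merges robustly, and the function name and shift encoding here are assumptions.

```python
import numpy as np

def super_res_2x(frames_with_shifts, lr_shape):
    """Toy drizzle: `frames_with_shifts` is a list of (frame, (dy, dx))
    where dy, dx in {0, 1} are half-pixel shifts. Each frame's samples
    are dropped onto the matching phase of a 2x grid and averaged."""
    h, w = lr_shape
    acc = np.zeros((2 * h, 2 * w))
    cnt = np.zeros((2 * h, 2 * w))
    for frame, (dy, dx) in frames_with_shifts:
        acc[dy::2, dx::2] += frame
        cnt[dy::2, dx::2] += 1
    cnt[cnt == 0] = 1          # avoid division by zero for unfilled phases
    return acc / cnt
```

Four frames covering all four shift phases fully populate the doubled grid, which is why a burst of slightly shifted captures can exceed the sensor's native resolution.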
Top Shot
When Motion Photos is enabled, Top Shot analyzes up to 90 additional frames from 1.5 seconds before and after the shutter is pressed. The Pixel Visual Core is used to accelerate the analysis using
computer vision
techniques, and ranks them based on object motion, motion blur, auto exposure, auto focus, and auto white balance. About ten additional photos are saved, including an additional HDR+ photo up to 3 MP. Top Shot was introduced on the Pixel 3.
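The ranking step can be sketched as scoring each buffered frame and keeping the best indices. Real Top Shot scores faces, motion blur, exposure, focus and white balance with computer vision models; the single gradient-based proxy below, and the function name, are illustrative assumptions.

```python
import numpy as np

def top_shots(frames, n=3):
    """Toy Top Shot ranking: score each frame from a motion-photo
    burst with a crude sharpness proxy and return the indices of the
    top-ranked frames."""
    def score(img):
        gy, gx = np.gradient(img.astype(np.float64))
        return float(np.mean(np.hypot(gx, gy)))   # blur/sharpness proxy

    ranked = sorted(range(len(frames)),
                    key=lambda i: score(frames[i]), reverse=True)
    return ranked[:n]   # indices of the best frames
```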
Other features
* Computational Raw – Google Camera supports capturing
JPEG
and
DNG files simultaneously. The DNG files are also processed with Google's HDR+ Computational Photography. Computational Raw was introduced on the Pixel 3.
* Motion Auto Focus – maintains focus on any subject/object in the frame. Motion Auto Focus was introduced on the Pixel 3.
* Frequent Faces – allows the camera to remember faces. The camera will try to ensure those faces are in focus, smiling and not blinking.
* Location – Location information obtained via
GPS and/or Google's location service can be added to pictures and videos when enabled.
Functions
Like most camera applications, Google Camera offers different usage modes allowing the user to take different types of photo or video.
Slow Motion
Slow motion
video can be captured in Google Camera at either 120 or, on supported devices, 240 frames per second.
Panorama
Panoramic photography
is also possible with Google Camera. Four types of panoramic photo are supported: Horizontal, Vertical,
Wide-angle and
Fisheye. Once the Panorama function is selected, one of these four modes can be selected from a row of icons at the top of the screen.
Photo Sphere
Google Camera allows the user to create a 'Photo Sphere', a
360-degree panorama photo, originally added in Android 4.2 in 2012. These photos can then be embedded in a web page with custom HTML code or uploaded to various Google services.
The Pixel 8 shipped without the feature, making it the first Pixel phone not to have it and leading many to believe that the feature has been discontinued.
Portrait
Portrait mode (called Lens Blur prior to the release of the Pixel line) offers an easy way for users to take 'selfies' or portraits with a bokeh effect, in which the subject of the photo is in focus and the background is slightly blurred. This effect is achieved via the
parallax
information from dual-pixel sensors when available (such as the Pixel 2 and Pixel 3), and the application of
machine learning
to identify what should be kept in focus and what should be blurred out. Portrait mode was introduced on the Pixel 2.
Additionally, a "face retouching" feature can be activated which cleans up blemishes and other imperfections from the subject's skin.
The Pixel 4 featured an improved Portrait mode: its machine learning algorithm uses parallax information from the telephoto camera and the dual pixels, as well as the difference between the telephoto and wide cameras, to create more accurate depth maps. For the front-facing camera, it uses the parallax information from the front-facing and IR cameras. The blur effect is applied at the raw stage, before the tone-mapping stage, for a more realistic SLR-like bokeh effect.
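The final compositing step, once a depth map exists, can be sketched as a depth-gated blend: pixels near the focus depth stay sharp, everything else is blurred. The depth estimation itself (dual-pixel parallax plus machine learning) and the raw-stage blur are not shown; the function name, the box blur, and the tolerance parameter are illustrative assumptions.

```python
import numpy as np

def portrait_blur(img, depth, focus_depth, tolerance=0.1, radius=1):
    """Toy portrait compositing: keep pixels whose depth is within
    `tolerance` of `focus_depth`, box-blur the rest."""
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    blurred = np.zeros_like(img, dtype=np.float64)
    k = 2 * radius + 1
    for dy in range(k):                 # naive box blur via shifted sums
        for dx in range(k):
            blurred += padded[dy:dy + h, dx:dx + w]
    blurred /= k * k
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, img, blurred)
```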
Playground
In late 2017, with the debut of the
Pixel 2
and
Pixel 2 XL, Google introduced AR Stickers, a feature that, using Google's new
ARCore
platform, allowed the user to superimpose
augmented reality
animated objects on their photos and videos. With the release of the Pixel 3, AR Stickers was rebranded to Playground.
Google Lens
The camera offers a functionality powered by
Google Lens, which allows the camera to copy text it sees, identify products, books and movies and search for similar ones, identify animals and plants, and scan barcodes and
QR codes, among other things.
Photobooth
The Photobooth mode allows the user to automate the capture of selfies. The
AI is able to detect the user's smile or funny faces and take the picture at the best time without any action from the user, similar to Google Clips. This mode also features two levels of AI processing of the subject's face that can be enabled or disabled in order to soften its skin.
Motion Photos functionality is also available in this mode. The
white balance is also adjustable to defined presets. In October 2019, Photobooth was removed as a standalone mode, becoming an "Auto" option in the shutter options, later being removed altogether.
Night Sight
Night Sight is based on a similar principle to exposure stacking, used in
astrophotography
. Night Sight uses modified HDR+ or Super Res Zoom algorithms. Once the user presses the trigger, multiple long-exposure shots are taken, up to 15 exposures of 1/15 second or 6 exposures of 1 second, to create up to a 6-second exposure. Motion metering and tile-based processing of the image make it possible to reduce, if not cancel, camera shake, resulting in a clear and properly exposed shot. Google claims it can handle up to approximately 8% displacement from frame to frame, with each frame broken into around 12,000 tiles. Night Sight also introduced a learning-based AWB algorithm for more accurate
white balance in low light.
Night Sight also works well in daylight, improving white balance, detail and sharpness. Like HDR+ enhanced, Night Sight features positive shutter lag (PSL). It also supports a delay timer as well as an assisted selector for focus with three options (far, close and auto-focus). Night Sight was introduced with the Pixel 3; all older Pixel phones were updated with support.
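The exposure-scheduling trade-off described above can be sketched as a tiny decision rule: with shake detected, the total exposure is split into more, shorter frames; on a steady grip, into fewer, longer ones. The two schedules mirror the figures cited in the text (15 x 1/15 s handheld, 6 x 1 s steady); the selection rule and function name are illustrative, not the actual motion-metering logic.

```python
def night_sight_schedule(handheld_shake):
    """Toy Night Sight exposure schedule: returns (frame count,
    per-frame exposure in seconds, total exposure in seconds)."""
    if handheld_shake:
        frames, per_frame = 15, 1 / 15    # ~1 s total, shake-tolerant
    else:
        frames, per_frame = 6, 1.0        # 6 s total, needs stability
    return frames, per_frame, frames * per_frame
```

The total light gathered is frames x per-frame exposure, which is why many short frames can approximate one long exposure once they are aligned and merged.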
Astrophotography
Astrophotography mode activates automatically when Night Sight mode is enabled and the phone detects it is on a stable support such as a tripod. In this mode, the camera averages up to fifteen 16-second exposures, to create a 4-minute exposure to significantly reduce
shot noise
. By dividing the shot into several shorter exposures, the camera manages to achieve the light capture of a long exposure without having to deal with
star trails, which would otherwise require moving the phone very precisely during the exposure to compensate for the Earth's rotation. Astrophotography mode also includes improved algorithms to remove
hot pixels and warm pixels caused by
dark current, and a
convolutional neural network
to detect skies for sky-specific
noise reduction
. Astrophotography mode was introduced with the Pixel 4, and backported to the Pixel 3 and Pixel 3a.
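The stacking-plus-cleanup idea can be sketched as follows: average several long exposures (noise falls roughly as 1/sqrt(N)), then suppress hot pixels, which appear as isolated single-pixel spikes, by comparing each pixel with the median of its 3x3 neighbourhood. The spike rule and thresholds are illustrative assumptions; the real pipeline models dark current and uses a CNN sky mask for sky-specific denoising.

```python
import numpy as np

def stack_astro(frames, spike_factor=3.0):
    """Toy astrophotography stack: mean-combine exposures, then
    replace isolated spikes (hot pixels) with the median of their
    eight neighbours."""
    mean = np.mean([f.astype(np.float64) for f in frames], axis=0)
    h, w = mean.shape
    padded = np.pad(mean, 1, mode="edge")
    # Gather the 8 neighbours of every pixel and take their median.
    shifts = [padded[dy:dy + h, dx:dx + w]
              for dy in range(3) for dx in range(3) if (dy, dx) != (1, 1)]
    neighbour_med = np.median(np.stack(shifts), axis=0)
    hot = mean > spike_factor * (neighbour_med + 1.0)  # +1 avoids zero floor
    return np.where(hot, neighbour_med, mean)
```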
Portrait Light
Portrait Light is a post-processing feature that allows adding a light source to portraits. It simulates the directionality and intensity of the added light to complement the original photograph's lighting using
machine learning
models. Portrait Light was introduced with the Pixel 5, and backported to the Pixel 4, Pixel 4a and Pixel 4a 5G. When using the default mode or Night Sight mode, it is applied automatically when people are present. Portrait Light was a collaboration between Google Research,
Google Daydream,
Google Pixel
, and
Google Photos
teams.
Ultra HDR
With the launch of the Pixel 8, Google announced that the Pixel Camera would receive support for Ultra HDR. Ultra HDR is a format that stores an additional set of data alongside the JPEG, with additional luminosity information to produce an HDR photo. Shortly after, with version 9.2 of the app, Ultra HDR was backported to the Pixel 7 and 6.
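The gain-map idea behind such formats can be sketched as follows: the SDR base image carries a per-pixel map of log2 luminance multipliers, and an HDR display applies the map scaled to however much brightness headroom it actually has. The function name, parameter names, and the simple scaling rule are assumptions simplified from the format's actual metadata.

```python
import numpy as np

def apply_gain_map(sdr, gain_map, display_boost=4.0, map_max_boost=4.0):
    """Toy gain-map recovery: `gain_map` holds values in [0, 1], where
    1 means the full log2(map_max_boost) of extra luminance. The boost
    is capped by the display's own headroom (`display_boost`)."""
    headroom = min(np.log2(display_boost), np.log2(map_max_boost))
    log_gain = gain_map * headroom          # per-pixel log2 multiplier
    return sdr * np.exp2(log_gain)          # recovered HDR luminance
```

On an SDR display (`display_boost=1`) the headroom collapses to zero and the base image is shown unchanged, which is what makes the format backward compatible.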
Unofficial ports
Many developers have released unofficial
ports
that allow it to be used on non-Google phones, or implement its premium features on older Google phones. These unofficial apps often work around the lack of certain hardware features present in Google's top-tier devices, and sometimes even go as far as enabling features not exposed by the official version of the app. There are numerous versions, targeted at different Android phones.
Although many of the features are available on the ported versions, it is not unusual for some features not to be available, or not work properly, on phones without proper API support or incompatible hardware.
Google Play Services or a replacement like
microG is also required for the app to run.
In 2016 a modified version brought HDR+ with Zero Shutter Lag (ZSL) back to the Nexus 5X and Nexus 6P. In mid-2017, a modified version of Google Camera was created for any smartphone equipped with a Snapdragon 820, 821 or 835 processor. In 2018, developers released modified versions enabling Night Sight on non-Pixel phones. In August 2020, a new way of accessing extra cameras was introduced, removing the need for root access on phones that don't expose all cameras to third-party apps.
References
Further reading
* Jimmy Westenberg (12 December 2017). "How to use AR Stickers on the Google Pixel or Pixel 2". ''Android Authority''.
External links
{{Google Pixel}}
Google software
Android (operating system) software
Camera software