Pixel Camera

From Wikipedia, the free encyclopedia
Camera application developed by Google for Pixel devices
Pixel Camera (infobox)
  • Screenshot caption: Google Camera 5.2.019 running on Android Oreo 8.1 on a Google Pixel 2
  • Original author: X Development
  • Developers: Google, Google Research
  • Initial release: April 16, 2014
  • Stable release (Android): 10.0.081.808198566.40 / September 23, 2025[1]
  • Stable release (Wear OS): 9.8.084.729276569.02 / March 5, 2025[2]
  • Operating system: Android, Wear OS
  • Type: Camera
  • License: Proprietary

Pixel Camera is a camera phone application developed by Google for the Android operating system on Google Pixel devices. Development of the application began in 2011 at the Google X research incubator, led by Marc Levoy, which was developing image fusion technology for Google Glass.[3] It was publicly released for Android 4.4+ on Google Play on April 16, 2014.[4] The app was initially released as Google Camera and supported on all devices running Android 4.4 KitKat and higher. However, in October 2023, coinciding with the release of the Pixel 8 series, it was renamed to Pixel Camera and became officially supported only on Google Pixel devices.[5]

Features


Pixel Camera contains a number of features that can be activated either in the Settings page or on the row of icons at the top of the app.

Pixel Visual/Neural Core


Starting with the first-generation Pixel devices, the Pixel Camera app has utilized hardware accelerators to assist with image processing. The first generation of Pixel phones used Qualcomm's Hexagon DSPs and Adreno GPUs to improve processing efficiency. The Pixel 2 and Pixel 3 (excluding the Pixel 3a) include the Pixel Visual Core to aid with image processing. The Pixel 4 introduced the Pixel Neural Core.[6] The Visual Core enables HDR+ image processing to be accessed by third-party applications through Google APIs and is designed to perform complex processing tasks while minimizing energy consumption.

HDR+


Unlike earlier versions of high dynamic range (HDR) imaging, HDR+, also known as HDR+ on, uses computational photography techniques to achieve higher dynamic range. HDR+ takes continuous burst shots with short exposures. When the shutter is pressed, the last 5–15 frames are analyzed to pick the sharpest shots (using lucky imaging), which are selectively aligned and combined with image averaging. HDR+ also uses semantic segmentation to detect faces, which are brightened with synthetic fill flash, and skies, which are darkened and denoised. HDR+ also reduces shot noise and improves colors, while avoiding blowing out highlights and motion blur. HDR+ was introduced on the Google Nexus 6 and brought back to the Nexus 5.[7][8][9]
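
The align-and-merge idea behind HDR+ can be illustrated with a brief sketch (Python/NumPy; not Google's implementation). A burst of noisy frames is reduced by picking the sharpest frame as a reference, discarding frames that deviate too much from it (a crude stand-in for HDR+'s tile-based alignment and rejection), and averaging the rest; the sharpness metric and rejection threshold here are illustrative assumptions.

```python
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    """Proxy for 'lucky imaging' sharpness: variance of a simple Laplacian."""
    lap = (-4 * frame
           + np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1))
    return float(lap.var())

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Toy HDR+-style merge: choose the sharpest frame as the reference, then
    average the frames that agree with it to suppress shot noise.
    Real HDR+ aligns and rejects per tile and merges in the raw domain."""
    ref = max(frames, key=sharpness)
    # Keep only frames close to the reference (a stand-in for alignment/rejection).
    kept = [f for f in frames if np.mean(np.abs(f - ref)) < 0.05]
    return np.mean(kept, axis=0)

# Usage with synthetic noisy frames in [0, 1]:
rng = np.random.default_rng(0)
scene = rng.random((120, 160))
burst = [np.clip(scene + rng.normal(0, 0.02, scene.shape), 0, 1) for _ in range(8)]
merged = merge_burst(burst)
```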

HDR+ enhanced


Unlike HDR+/HDR+ on, 'HDR+ enhanced' mode does not use zero shutter lag (ZSL). Like Night Sight, HDR+ enhanced features positive shutter lag (PSL): it captures images after the shutter is pressed. HDR+ enhanced is similar to HDR+ from the Nexus 5, Nexus 6, Nexus 5X and Nexus 6P. It is believed to use underexposed and overexposed frames like Smart HDR from Apple. HDR+ enhanced captures increase the dynamic range compared to HDR+ on. HDR+ enhanced on the Pixel 3 uses the learning-based AWB algorithm from Night Sight.[10][11]

Live HDR+


Starting with the Pixel 4, Live HDR+ replaced HDR+ on, featuring a WYSIWYG viewfinder with a real-time preview of HDR+.[12] Live HDR+ uses the learning-based AWB algorithm from Night Sight and averages up to nine underexposed pictures.[13]
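
A minimal sketch of the viewfinder side of this feature, assuming a cheap global tone curve as a stand-in for the local, learned tone mapping Google describes (Python/NumPy; the function name, gain parameter and curve shape are illustrative, not Google's code):

```python
import numpy as np

def preview_tonemap(frame: np.ndarray, gain: float = 4.0) -> np.ndarray:
    """Cheap per-frame approximation of the HDR+ look for a live viewfinder:
    lift shadows with a log-style curve while leaving highlights near 1.
    (Live HDR+ predicts local tone curves with a small neural network; this
    single global curve is only a stand-in.)"""
    return np.log1p(gain * frame) / np.log1p(gain)

# Each incoming viewfinder frame (values in [0, 1]) is tone-mapped in real time:
frame = np.linspace(0, 1, 11)
print(np.round(preview_tonemap(frame), 3))
```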

Dual Exposure Controls


'Live HDR+' mode uses Dual Exposure Controls, with separate sliders for brightness (capture exposure) and for shadows (tone mapping). This feature was made available on the Pixel 4 and has not been retrofitted to older Pixel devices due to hardware limitations.[13][12]
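
A minimal sketch of how two such sliders could act on different stages of the pipeline, with the brightness slider behaving like exposure compensation on the linear capture and the shadows slider only reshaping the tone curve afterwards; the parameter names and curve shapes are assumptions for illustration (Python/NumPy, not Google's implementation):

```python
import numpy as np

def apply_dual_exposure(linear: np.ndarray, brightness_ev: float, shadows: float) -> np.ndarray:
    """Illustrative split of the two sliders:
    - 'brightness_ev' acts like exposure compensation on the linear capture,
    - 'shadows' changes only the tone curve applied afterwards.
    Both the slider ranges and the gamma-style curve are assumptions."""
    exposed = np.clip(linear * 2.0 ** brightness_ev, 0.0, 1.0)   # capture exposure
    gamma = 1.0 / (1.0 + shadows)                                # shadow tone mapping
    return exposed ** gamma

scene = np.linspace(0.0, 1.0, 6)
print(apply_dual_exposure(scene, brightness_ev=-0.5, shadows=0.8))
```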

With Bracketing


In April 2021, Google Camera v8.2 introduced HDR+ with Bracketing, Night Sight with Bracketing and Portrait Mode with Bracketing. Google updated its exposure bracketing algorithm so that HDR+ includes an additional long-exposure frame and Night Sight includes three long-exposure frames. The spatial merge algorithm was also redesigned to decide per pixel whether to merge (as in Super Res Zoom) and updated to handle long exposures (clipped highlights, more motion blur and different noise characteristics). With Bracketing enables further reduced read noise, improved details and texture, and more natural colors. It is automatically enabled depending on the dynamic range and motion. With Bracketing is supported in all modes on the Pixel 4a (5G) and 5, and in Night Sight on the Pixel 4 and 4a.[14]
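
The per-pixel merge decision can be sketched as follows (Python/NumPy; a simplification, not Google's algorithm): the long exposure, rescaled to the short exposure's brightness, is preferred where it is not clipped because it carries less read noise, and clipped highlights fall back to the short frame. The clipping threshold and exposure ratio here are illustrative, and the real pipeline also models motion blur and per-frame noise.

```python
import numpy as np

def merge_bracketed(short: np.ndarray, long_: np.ndarray,
                    long_exposure_ratio: float, clip: float = 0.98) -> np.ndarray:
    """Toy per-pixel bracketing merge: use the long exposure (scaled back to
    the short exposure's brightness) wherever it is not clipped; fall back to
    the short frame in clipped highlights."""
    long_scaled = long_ / long_exposure_ratio
    use_long = long_ < clip                      # per-pixel decision
    return np.where(use_long, long_scaled, short)

rng = np.random.default_rng(1)
scene = np.clip(rng.random((4, 4)) * 1.2, 0, None)        # some highlights exceed range
short = np.clip(scene + rng.normal(0, 0.03, scene.shape), 0, 1)
long_ = np.clip(scene * 4 + rng.normal(0, 0.03, scene.shape), 0, 1)
print(merge_bracketed(short, long_, long_exposure_ratio=4))
```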

Motion Photos


Google Camera's Motion Photos mode is similar to HTC's Zoe and iOS' Live Photo. When enabled, a short, silent video clip of relatively low resolution is paired with the original photo. If RAW is enabled, only a 0.8 MP DNG file is created, not the non-motion 12.2 MP DNG. Motion Photos was introduced on the Pixel 2. Motion Photos is disabled in HDR+ enhanced mode.[15][16][17]

Video Stabilization


Fused Video Stabilization, a technique that combines optical image stabilization and electronic/digital image stabilization, can be enabled for significantly smoother video. This technique also corrects rolling shutter distortion and focus breathing, among various other problems. Fused Video Stabilization was introduced on the Pixel 2.[18][19]
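
The electronic half of this technique can be sketched as smoothing the measured camera trajectory and then warping each frame by the difference between the raw and smoothed paths. The sketch below (Python/NumPy; not Google's implementation) uses a simple moving average on a one-dimensional translation path and omits the OIS lens-shift data that fused stabilization also consumes; the window size is an illustrative assumption.

```python
import numpy as np

def smooth_path(raw_path: np.ndarray, window: int = 15) -> np.ndarray:
    """Electronic stabilization core idea: low-pass filter the measured camera
    trajectory (here one translation value per frame, e.g. integrated from the
    gyro), then warp each frame by the raw-minus-smoothed difference."""
    kernel = np.ones(window) / window
    padded = np.pad(raw_path, (window // 2, window // 2), mode="edge")
    return np.convolve(padded, kernel, mode="valid")[: len(raw_path)]

raw = np.cumsum(np.random.default_rng(2).normal(0, 1.5, 300))   # shaky path (pixels)
smoothed = smooth_path(raw)
per_frame_correction = smoothed - raw   # shift applied to each frame before cropping
```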

Super Res Zoom


Super Res Zoom is a multi-frame super-resolution technique introduced with the Pixel 3 that shifts the image sensor to achieve higher resolution, which Google claims is equivalent to 2–3× optical zoom. It is similar to drizzle image processing. Super Res Zoom can also be used with a telephoto lens; for example, Google claims the Pixel 4 can capture 8× zoom at near-optical quality.[20][21]
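
A drizzle-style accumulation can be sketched as follows (Python/NumPy; not Google's implementation): each low-resolution frame, captured with a known sub-pixel offset, is splatted onto a finer grid, and the grid is normalized by how many samples landed in each cell. Real Super Res Zoom estimates offsets per tile and weights samples by local motion; the nearest-neighbour splatting here is an illustrative simplification.

```python
import numpy as np

def drizzle_upscale(frames, offsets, scale: int = 2) -> np.ndarray:
    """Toy multi-frame super-resolution in the spirit of drizzle: accumulate
    sub-pixel-shifted low-res frames onto a finer grid and normalize."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, offsets):
        ty = np.clip(np.round(ys * scale + dy * scale).astype(int), 0, h * scale - 1)
        tx = np.clip(np.round(xs * scale + dx * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (ty, tx), frame)       # splat sample values
        np.add.at(weight, (ty, tx), 1.0)      # count samples per output pixel
    return acc / np.maximum(weight, 1e-6)

# Example: four frames of the same scene captured with quarter/half-pixel offsets.
# hi_res = drizzle_upscale(frames, offsets=[(0, 0), (0, 0.5), (0.5, 0), (0.5, 0.5)])
```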

Top Shot


When Motion Photos is enabled, Top Shot analyzes up to 90 additional frames from 1.5 seconds before and after the shutter is pressed. The Pixel Visual Core is used to accelerate the analysis using computer vision techniques, ranking frames based on object motion, motion blur, auto exposure, auto focus, and auto white balance. About ten additional photos are saved, including an additional HDR+ photo of up to 3 MP. Top Shot was introduced on the Pixel 3.[22]
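
A toy version of such frame ranking is sketched below (Python/NumPy); the proxies for motion blur and exposure, and their weights, are made-up stand-ins for the learned face, expression and composition signals the real feature uses.

```python
import numpy as np

def rank_top_shots(frames: list[np.ndarray]) -> list[int]:
    """Toy Top Shot-style ranking: score each buffered frame with cheap proxies
    (gradient energy for sharpness, distance from mid-grey for exposure) and
    return frame indices from best to worst."""
    def sharpness(f):
        gy, gx = np.gradient(f)
        return float(np.mean(gx ** 2 + gy ** 2))
    def exposure(f):
        return float(-abs(f.mean() - 0.5))        # closer to mid-grey scores higher
    scores = [0.7 * sharpness(f) + 0.3 * exposure(f) for f in frames]
    return sorted(range(len(frames)), key=lambda i: scores[i], reverse=True)
```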

Other features

  • Computational Raw – Pixel Camera supports capturing JPEG and DNG files simultaneously. The DNG files are also processed with Google's HDR+ computational photography. Computational Raw was introduced on the Pixel 3.[9]
  • Motion Auto Focus – maintains focus on any subject/object in the frame. Motion Auto Focus was introduced on the Pixel 3.[23]
  • Frequent Faces – allows the camera to remember faces. The camera will try to ensure those faces are in focus, smiling and not blinking.[21]
  • Location – location information obtained via GPS and/or Google's location service can be added to pictures and videos when enabled.

Functions


Like most camera applications, Pixel Camera offers different usage modes allowing the user to take different types of photo or video.[24]

Slow Motion


Slow motion video can be captured in Pixel Camera at either 120 or, on supported devices, 240 frames per second.[25]

Panorama


Panoramic photography is also possible with Pixel Camera. Four types of panoramic photo are supported: horizontal, vertical, wide-angle and fisheye. Once the Panorama function is selected, one of these four modes can be selected at a time from a row of icons at the top of the screen.[26]

Photo Sphere


Pixel Camera allows the user to create a 'Photo Sphere', a 360-degree panorama photo, originally added in Android 4.2 in 2012.[27] These photos can then be embedded in a web page with custom HTML code or uploaded to various Google services.[28]

The Pixel 8 was released without the feature, making it the first Pixel phone to lack it and leading many to believe that Photo Sphere has been discontinued.[citation needed]

Portrait


Portrait mode (called Lens Blur prior to the release of the Pixel line) offers an easy way for users to take 'selfies' or portraits with a bokeh effect, in which the subject of the photo is in focus and the background is slightly blurred. This effect is achieved via the parallax information from dual-pixel sensors when available (such as on the Pixel 2 and Pixel 3), and the application of machine learning to identify what should be kept in focus and what should be blurred out. Portrait mode was introduced on the Pixel 2.[29][30][31]

Additionally, a "face retouching" feature can be activated which cleans up blemishes and other imperfections from the subject's skin.[32]

The Pixel 4 featured an improved Portrait mode: its machine learning algorithm uses parallax information from the telephoto camera and the dual pixels, as well as the difference between the telephoto and wide cameras, to create more accurate depth maps.[33] For the front-facing camera, it uses the parallax information from the front-facing camera and IR cameras.[34] The blur effect is applied at the raw stage, before the tone-mapping stage, for a more realistic, SLR-like bokeh effect.[13][33]
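
The rendering step common to these Portrait mode variants, blurring the background as a function of estimated depth, can be sketched as follows (Python/NumPy; the separable box blur and the linear depth weighting are illustrative simplifications of the disc-shaped, raw-domain blur described above):

```python
import numpy as np

def synthetic_bokeh(image: np.ndarray, depth: np.ndarray,
                    subject_depth: float, radius: int = 6) -> np.ndarray:
    """Toy portrait-mode rendering: blur the image, then blend it back in with
    a per-pixel weight that grows with distance from the subject's depth plane.
    Pixel Camera instead estimates depth from dual-pixel/dual-camera parallax
    plus a learned model and applies the blur before tone mapping."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    # Separable box blur as a stand-in for a disc-shaped bokeh kernel.
    blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, image)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, blurred)
    weight = np.clip(np.abs(depth - subject_depth) / (np.ptp(depth) + 1e-6), 0.0, 1.0)
    return (1 - weight) * image + weight * blurred
```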

Playground


In late 2017, with the debut of the Pixel 2 and Pixel 2 XL, Google introduced AR Stickers, a feature that, using Google's new ARCore platform, allowed the user to superimpose augmented reality animated objects on their photos and videos. With the release of the Pixel 3, AR Stickers was rebranded to Playground.[35][36]

Google Lens


The camera offers functionality powered by Google Lens, which allows it to copy text it sees; identify products, books and movies and search for similar ones; identify animals and plants; and scan barcodes and QR codes, among other things.

Photobooth


The Photobooth mode allows the user to automate the capture of selfies. The AI is able to detect the user's smile or funny faces and shoot the picture at the best time without any action from the user, similar to Google Clips. This mode also features two-level AI processing of the subject's face, which can be enabled or disabled to soften the skin. Motion Photos functionality is also available in this mode. The white balance is also adjustable to defined presets.[37] In October 2019, Photobooth was removed as a standalone mode, becoming an "Auto" option in the shutter options,[38] before later being removed altogether.[clarification needed]

Night Sight


Night Sight is based on a principle similar to exposure stacking, used in astrophotography. Night Sight uses modified HDR+ or Super Res Zoom algorithms. Once the user presses the trigger, multiple long-exposure shots are taken, up to 15 exposures of 1/15 second or 6 exposures of 1 second, to create up to a 6-second total exposure. Motion metering and tile-based processing of the image reduce, if not cancel, camera shake, resulting in a clear and properly exposed shot. Google claims it can handle up to roughly 8% displacement from frame to frame, with each frame broken into around 12,000 tiles. Night Sight also introduced a learning-based AWB algorithm for more accurate white balance in low light.[39][40][13]

Night Sight also works well in daylight, improving white balance, detail and sharpness. Like HDR+ enhanced, Night Sight features positive shutter lag (PSL). Night Sight also supports a delay timer as well as an assisted selector for the focus featuring three options (far, close and auto-focus). Night Sight was introduced with the Pixel 3; all older Pixel phones were updated with support.[41][42][43][44]
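
The capture-planning side of Night Sight, choosing how many frames of what length to fit into the exposure budget based on measured motion, can be sketched as follows (Python/NumPy; the motion thresholds are illustrative assumptions, but the two endpoints match the figures above: 15 frames of 1/15 second versus 6 frames of 1 second).

```python
import numpy as np

def plan_night_sight_capture(motion_px_per_s: float, max_total_s: float = 6.0):
    """Toy version of Night Sight's motion metering: pick a per-frame exposure
    short enough to keep handshake/subject blur acceptable, then fill the
    total capture budget with that many frames (capped at 15)."""
    if motion_px_per_s > 30:          # lots of motion: many very short frames
        per_frame = 1 / 15
    elif motion_px_per_s > 5:
        per_frame = 1 / 3
    else:                             # steady (e.g. braced or resting on a surface)
        per_frame = 1.0
    n_frames = int(min(15, max_total_s / per_frame))
    return n_frames, per_frame

def merge_night_frames(frames: list[np.ndarray]) -> np.ndarray:
    """Averaging aligned frames reduces shot noise roughly by sqrt(N)."""
    return np.mean(frames, axis=0)

print(plan_night_sight_capture(motion_px_per_s=2.0))    # -> (6, 1.0): 6 x 1 s
print(plan_night_sight_capture(motion_px_per_s=50.0))   # -> (15, 1/15 s)
```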

Astrophotography


Astrophotography mode activates automatically when Night Sight mode is enabled and the phone detects that it is on a stable support such as a tripod. In this mode, the camera averages up to fifteen 16-second exposures to create a 4-minute exposure that significantly reduces shot noise. By dividing the shot into several shorter exposures, the camera achieves the light capture of a long exposure without having to deal with star trails, which would otherwise require moving the phone very precisely during the exposure to compensate for the Earth's rotation. Astrophotography mode also includes improved algorithms to remove hot pixels and warm pixels caused by dark current, and a convolutional neural network to detect skies for sky-specific noise reduction.[45] Astrophotography mode was introduced with the Pixel 4, and backported to the Pixel 3 and Pixel 3a.[18][46][13]
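
The averaging and hot-pixel rejection can be sketched as follows (Python/NumPy; frame alignment and the learned sky segmentation are omitted): frames are stacked, per-pixel outliers relative to the median are masked out, and the remaining samples are averaged. The outlier threshold is an illustrative assumption.

```python
import numpy as np

def merge_astro(frames: list[np.ndarray]) -> np.ndarray:
    """Toy astrophotography merge: average long exposures to cut shot noise,
    rejecting hot/warm pixels as outliers against the per-pixel median."""
    stack = np.stack(frames)                        # shape (N, H, W)
    median = np.median(stack, axis=0)
    mad = np.median(np.abs(stack - median), axis=0) + 1e-6
    ok = np.abs(stack - median) < 5 * mad           # mask out hot-pixel outliers
    return np.sum(stack * ok, axis=0) / np.maximum(ok.sum(axis=0), 1)

rng = np.random.default_rng(3)
sky = rng.random((64, 64)) * 0.05                   # faint scene
frames = [np.clip(sky + rng.normal(0, 0.02, sky.shape), 0, 1) for _ in range(15)]
frames[0][10, 10] = 1.0                             # a hot pixel in one frame
result = merge_astro(frames)
```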

Portrait Light


Portrait Light is a post-processing feature that allows adding a light source to portraits. It simulates the directionality and intensity of the added light to complement the original photograph's lighting using machine learning models. Portrait Light was introduced with the Pixel 5, and backported to the Pixel 4, Pixel 4a and Pixel 4a 5G. When using the default mode or Night Sight mode, it is automatically applied if there is a person or people in the photo. Portrait Light was a collaboration between the Google Research, Google Daydream, Google Pixel, and Google Photos teams.[47]
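
A rough sketch of the underlying relighting idea (Python/NumPy): given per-pixel surface normals for the subject, a directional light adds a Lambertian contribution. Pixel Camera predicts both the light placement and the relit result with machine-learning models rather than this analytic shading; the normals input, light direction and intensity parameter here are assumptions for illustration.

```python
import numpy as np

def add_portrait_light(image: np.ndarray, normals: np.ndarray,
                       light_dir: np.ndarray, intensity: float = 0.3) -> np.ndarray:
    """Toy relighting: add a Lambertian term from a chosen light direction,
    given per-pixel unit surface normals of shape (H, W, 3)."""
    light_dir = light_dir / np.linalg.norm(light_dir)
    ndotl = np.clip(np.einsum("hwc,c->hw", normals, light_dir), 0.0, 1.0)
    return np.clip(image * (1.0 + intensity * ndotl), 0.0, 1.0)
```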

Ultra HDR

Main article: Ultra HDR

With the launch of the Pixel 8, Google announced that the Pixel Camera would receive support for Ultra HDR. Ultra HDR is a format that stores an additional set of luminosity data alongside the JPEG, which is used to produce an HDR photo.[48] Shortly after, with version 9.2 of the app, Ultra HDR was backported to the Pixel 7 and 6.[49]
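
The gain-map idea can be sketched as follows (Python/NumPy; a simplification of the actual Ultra HDR metadata, which also carries minimum/maximum content boost, gamma and offset values): a small map states how much brighter each region of the HDR rendition should be than the SDR base image.

```python
import numpy as np

def apply_gain_map(sdr_linear: np.ndarray, gain_map: np.ndarray,
                   max_boost: float = 4.0) -> np.ndarray:
    """Toy gain-map application: boost the linear SDR base image per pixel,
    with the normalized gain map (values in [0, 1]) selecting a boost between
    1x and max_boost. The single max_boost parameter is an assumption."""
    boost = max_boost ** np.clip(gain_map, 0.0, 1.0)
    return sdr_linear * boost

sdr = np.linspace(0.0, 1.0, 5)
gain = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
print(apply_gain_map(sdr, gain))
```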

Unofficial ports to other phones


Many developers have released unofficial Google Camera ("GCam") ports for phones not made by Google, or to bring its premium features to older Google phones. These unofficial apps often work around the absence, on other phones, of some of the hardware features found in Google's more advanced devices, and sometimes enable features not exposed by the official version of the app. There are many different versions, targeted at different Android phones.

Although many Pixel features are available in the ported versions, some features may not be available, or may not work properly, on phones without proper API support[50] or with incompatible hardware.[51] Google Play Services or a replacement such as microG is also required for the app to run.[52]

In 2016 a modified version brought HDR+ with Zero Shutter Lag (ZSL) to the Google Nexus 5X and Nexus 6P.[53] In mid-2017, a modified version of Google Camera was created for any smartphone equipped with a Snapdragon 820, 821 or 835 processor.[54] In 2018, developers released modified versions enabling Night Sight on non-Pixel phones.[55] In August 2020, a new way of accessing extra cameras was introduced,[56] removing the need to use root on phones that don't expose all cameras for third-party apps.[57]

References

  1. ^"Pixel Camera APKs".APKMirror. RetrievedOctober 6, 2025.
  2. ^"Pixel Camera APKs".APKMirror. RetrievedOctober 6, 2025.
  3. ^"Meet Gcam: The X graduate that gave us a whole new point of view".Medium. 2017-06-06. Retrieved2019-10-15.
  4. ^Kellex (16 April 2014)."Google Camera Quick Look and Tour".Droid Life.
  5. ^Li, Abner (October 13, 2023)."'Google Camera' is now 'Pixel Camera' on the Play Store".9to5Google.Archived from the original on October 14, 2023. RetrievedOctober 15, 2023.
  6. ^"Introducing the HDR+ Burst Photography Dataset".Google AI Blog. Retrieved2019-08-02.
  7. ^Shankland, Stephen (October 21, 2016)."How Google's Pixel phone builds a better photo".CNET. Retrieved2019-10-14.
  8. ^"HDR+: Low Light and High Dynamic Range photography in the Google Camera App".Google AI Blog. Retrieved2019-10-14.
  9. ^ab"5 ways Google Pixel 3 camera pushes the boundaries of computational photography".DPReview. Retrieved2019-10-15.
  10. ^"HDR+ on vs HDR+ Enhanced? - Post #5".forum.xda-developers.com. Retrieved2018-04-05.
  11. ^Patel, Idrees (2017-11-20)."Google Explains Decisions Made on the Pixel 2 Camera".xda-developers. Archived fromthe original on 2017-12-16. Retrieved2019-10-14.
  12. ^ab"Live HDR+ and Dual Exposure Controls on Pixel 4 and 4a".Google AI Blog. Retrieved2020-08-04.
  13. ^abcde"These are the most important Google Pixel 4 camera updates".DPReview. Retrieved2019-10-18.
  14. ^"HDR+ with Bracketing on Pixel Phones".Google AI Blog. Retrieved2021-04-28.
  15. ^"Behind the Motion Photos Technology in Pixel 2".Google AI Blog. Retrieved2019-10-14.
  16. ^"Motion Stills – Create beautiful GIFs from Live Photos".Google AI Blog. Retrieved2019-10-14.
  17. ^Segan, Sascha (September 11, 2015)."How Apple's 'Live Photos' Can Win Where HTC's Zoe Lost".PCMag. Retrieved2019-10-14.
  18. ^abMade by Google '19, retrieved2019-10-16
  19. ^"Fused Video Stabilization on the Pixel 2 and Pixel 2 XL".Research Blog. Retrieved2018-04-05.
  20. ^"See Better and Further with Super Res Zoom on the Pixel 3".Google AI Blog. Retrieved2019-10-14.
  21. ^ab"Google Pixel 4 Promises 'Studio-Like Photos Without the Studio'".petapixel.com. Retrieved2019-10-16.
  22. ^"Top Shot on Pixel 3".Google AI Blog. Retrieved2019-10-15.
  23. ^Kundu, Kishalaya (2018-10-12)."10 Best Google Pixel 3 Camera Features".Beebom. Retrieved2019-10-15.
  24. ^"Google Camera HDR+ Manual setting of all parameters version".ZenTalk. Retrieved2018-04-05.
  25. ^"Google Camera - Apps on Google Play".Google Play. 2018-04-05. Retrieved2018-04-05.
  26. ^Biersdorfer, J. D. (2016-05-23)."Going Wide With Google Camera".The New York Times.ISSN 0362-4331. Retrieved2018-04-05.
  27. ^"Android 4.2 Jelly Bean Has Arrived: Photo Sphere Panoramic Camera, Gesture Typing, Wireless HDTV Streaming – TechCrunch".techcrunch.com. Retrieved2018-04-05.
  28. ^"Photo Sphere".Android Central. 2016-04-26. Archived fromthe original on 2020-12-01. Retrieved2018-04-05.
  29. ^"Portrait mode on the Pixel 2 and Pixel 2 XL smartphones".Google AI Blog. Retrieved2019-10-14.
  30. ^"Learning to Predict Depth on the Pixel 3 Phones".Google AI Blog. Retrieved2019-10-14.
  31. ^"How to Use Portrait Mode in Google Pixel 2: Cool Tips".Guiding Tech. 2017-12-26. Retrieved2018-04-05.
  32. ^"Download Google Camera App with Motion Photo + Face Retouching on the Google Pixel".xda-developers. 2017-10-13. Retrieved2018-04-05.
  33. ^ab"Improvements to Portrait Mode on the Google Pixel 4 and Pixel 4 XL".Google AI Blog. Retrieved2019-12-21.
  34. ^"uDepth: Real-time 3D Depth Sensing on the Pixel 4".Google AI Blog. Retrieved2020-04-10.
  35. ^"How to use AR stickers on the Google Pixel or Pixel 2".Android Authority. 2017-12-12. Retrieved2018-04-05.
  36. ^"See your world differently with Playground and Google Lens on Pixel 3".Google. 2018-10-09. Retrieved2019-10-15.
  37. ^"Take Your Best Selfie Automatically, with Photobooth on Pixel 3".Google AI Blog. Retrieved2019-10-15.
  38. ^"This week's top stories: Google Camera 7.1, Pixelbook Go hands-on, and more". 2019-10-12.
  39. ^"Night Sight: Seeing in the Dark on Pixel Phones".Google AI Blog. Retrieved2019-10-14.
  40. ^"See the light with Night Sight".Google. 2018-11-14. Retrieved2019-10-14.
  41. ^Savov, Vlad (2018-11-14)."Google gives the Pixel camera superhuman night vision".The Verge. Retrieved2019-10-14.
  42. ^"The Pixel's Night Sight camera mode performs imaging miracles".Engadget. Retrieved2019-10-14.
  43. ^"Pixel Night Sight also works in daylight, reducing noise and boosting resolution".Android Police. 2018-11-14. Retrieved2019-10-14.
  44. ^Savov, Vlad (2018-11-26)."Google's Night Sight is subtly awesome in the daytime, too".The Verge. Retrieved2019-10-14.
  45. ^"Astrophotography with Night Sight on Pixel Phones".Google AI Blog. Retrieved2019-12-04.
  46. ^"Behind the scenes: Google's Pixel cameras aren't trying to be cameras at all".Android Authority. 2019-10-15. Retrieved2019-10-16.
  47. ^"Portrait Light: Enhancing Portrait Lighting with Machine Learning".Google AI Blog. Retrieved2020-12-12.
  48. ^"Hands-on with Ultra HDR in Android 14: The future of photography". 2023-10-09.
  49. ^"Google's bringing Ultra HDR and other features to Pixel 7 and Pixel 6 — what you need to know". 2023-12-06.
  50. ^"Comment from one of the modders (cstark) regarding support for dual exposure controls".
  51. ^"How To Install and Use the Google Camera Port".www.celsoazevedo.com. Retrieved2020-09-27.
  52. ^"Google Camera Port: FAQ and Troubleshooting".www.celsoazevedo.com. Retrieved2020-09-27.
  53. ^Chow, Charles (2016-11-05)."Camera NX V4 Bring ZSL Photo Shooting with HDR+ on Nexus, Same as Pixel Phone's Way (Update for N6P)".ChromLoop. Archived fromthe original on 2021-09-17. Retrieved2019-10-15.
  54. ^Andrew Liptak (August 12, 2017)."Google's Pixel camera software has been made to work on other recent Android phones".
  55. ^"Get Google Camera port with Night Sight for Xiaomi Mi 5, Essential Phone".xda-developers. 2018-10-27. Retrieved2019-10-15.
  56. ^"New Google Camera mod enables auxiliary camera support on many devices without root".xda-developers. 2020-08-27. Retrieved2020-09-27.
  57. ^"AUX Cameras Enabler Module".Android Pages. 2020-04-07. Retrieved2020-09-27.
