Google will finally unlock the Pixel 2’s full HDR+ capabilities in December

The Google Pixel 2 comes with a number of innovative camera features, but the smartphone's camera will only reach its full potential when the Pixel Visual Core is activated with the launch of Android 8.1. In October, Google said HDR+ would soon be available to third-party photography app developers via a preview of Android Oreo 8.1. Now that the operating system has reached its final developer preview, Google says those capabilities will roll out to consumers in December. The update means that non-native apps can take advantage of the enhanced image quality of the Pixel 2's HDR+ mode.

Google’s HDR+ mode has been part of Google Camera for a few years, but the Pixel 2 refines the technique with extra processing power. HDR is a photography technique that blends multiple images of the same scene to preserve detail in both the bright and dark areas of the picture. Because HDR works from multiple frames, faster hardware is needed to keep the extra processing from slowing the smartphone down.
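To illustrate the basic idea of multi-frame blending, the sketch below uses OpenCV's Mertens exposure fusion. This is not Google's HDR+ pipeline (which merges a burst of frames in its own way); it is just a minimal example of how combining several exposures keeps detail in both shadows and highlights. The file names are hypothetical.

```python
import cv2
import numpy as np

# Hypothetical file names: the same scene captured at several exposures.
exposures = [cv2.imread(p) for p in ("dark.jpg", "mid.jpg", "bright.jpg")]

# Mertens fusion weights each pixel by contrast, saturation, and
# well-exposedness, then blends the frames so neither the clipped
# highlights nor the crushed shadows of any single frame dominate.
fused = cv2.createMergeMertens().process(exposures)

# The result is floating point in roughly [0, 1]; scale back to 8-bit to save.
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```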

HDR+ is made possible by the Pixel Visual Core, a new system-on-chip (SoC). It's the first custom-made SoC by Google for a consumer product, a processor designed specifically for handling imaging data. Google says that using the Pixel Visual Core, the smartphone can process HDR+ photos five times faster while drawing only a tenth of the energy compared to the application processor, which third-party apps currently use for imaging.

“A key ingredient to the IPU’s efficiency is the tight coupling of the hardware and software,” wrote Ofer Shacham, senior staff engineer, and Masumi Reynders, director of product management. “Our software controls many more details of the hardware than in a typical processor. Handing more control to the software makes the hardware simpler and more efficient, but it also makes the IPU challenging to program using traditional programming languages.”

To ease that burden on developers, Google is relying on existing domain-specific languages: Halide for image processing and TensorFlow for machine learning.
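As a rough illustration of why a domain-specific language like Halide helps, the toy example below (written with Halide's Python bindings, and a simple brighten operation rather than anything from HDR+ or the Pixel Visual Core toolchain) separates the per-pixel math from the schedule that maps it onto hardware. All names and numbers here are illustrative only.

```python
# Toy Halide example (pip install halide): the DSL splits "what to compute"
# (the algorithm) from "how to run it" (the schedule). Not Google's code.
import numpy as np
import halide as hl

x, y = hl.Var("x"), hl.Var("y")

# A stand-in 640x480 8-bit input image.
input_buf = hl.Buffer(np.random.randint(0, 256, (480, 640), dtype=np.uint8))

# The algorithm: brighten each pixel, clamping at 255.
brighter = hl.Func("brighter")
value = hl.cast(hl.UInt(16), input_buf[x, y]) + 50
brighter[x, y] = hl.cast(hl.UInt(8), hl.min(value, 255))

# The schedule, kept separate from the algorithm: vectorize along rows
# and process scanlines in parallel.
brighter.vectorize(x, 8)
brighter.parallel(y)

result = brighter.realize([640, 480])  # 640 wide, 480 tall
```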

Once the Pixel Visual Core is switched on with the software update, third-party apps will have access to that extra processing power, which means the smartphone can process HDR+ photos automatically, without slowing down and without relying on the native camera app. The change will let third-party photography apps enhance their own HDR capabilities by tapping the faster Pixel Visual Core rather than the application processor.

Third-party camera apps are popular for their extra features, often offering more controls, including manual exposure and manual focus. Once the Pixel 2's software is updated and third-party developers update their own apps, switching to the features inside a third-party app will no longer mean losing the automatic HDR+ processing. Google shared several sample comparison images, with the shots taken using the Pixel Visual Core showing both brighter shadows, as in a backlit selfie, and better-preserved highlights, such as a sky that's a bit bluer.

Google says that the expansion of the HDR+ mode is just the start: the Pixel Visual Core is a programmable chip, and the company is already working on more applications that will expand the Pixel 2's capabilities.

Update: Added details on the final developer preview and the December launch date.