Google Activates Pixel Visual Core To Enhance Instagram, WhatsApp Pics

Google has activated the Pixel Visual Core – its secret image-processing chip – for Pixel 2 users, promising better photography. With the update, Google says the chip uses computational photography and machine learning to improve image quality.

Google Pixel Visual Core
Image Source: Google Store (screenshot)

Share better photos on social networks

To put it simply, it will now be easier for users to shoot quality photos in third-party apps such as WhatsApp, Snapchat, Instagram and almost any other app that uses the Pixel 2 camera. In a blog post, the search giant said users simply need to take the picture and leave the rest to the Pixel 2, which will make the shots bright, clear and detailed.

Talking about its custom chip, Google said it is better suited to photography workloads and uses the battery more efficiently. The additional computing power is put toward improving images with the HDR+ algorithm.

Google notes that the Pixel Visual Core update will roll out to handsets over the next few days as part of the Pixel 2's February monthly update. The update also includes new Augmented Reality Stickers themed around winter sports.

Pixel Visual Core – how it improves the image

The Pixel Visual Core is Google's first attempt to join the growing trend among smartphone makers of designing their own chips. This not only gives makers tighter control over their products, but also lowers their dependence on traditional chipmakers.

“With eight Google-designed custom cores, each with 512 arithmetic logic units (ALUs), the IPU delivers raw performance of more than 3 trillion operations per second on a mobile power budget. Using Pixel Visual Core, HDR+ can run 5x faster and at less than one-tenth the energy than running on the application processor (AP),” the search giant said.
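The quoted figures can be sanity-checked with some quick arithmetic. The per-ALU rate below is our own derivation from Google's numbers, not a figure Google has published:

```python
# Back-of-the-envelope check of the quoted IPU figures.
cores = 8
alus_per_core = 512
total_ops_per_s = 3e12  # "more than 3 trillion operations per second"

total_alus = cores * alus_per_core          # 8 * 512 = 4096 ALUs
ops_per_alu = total_ops_per_s / total_alus  # ops each ALU must sustain

print(total_alus)   # 4096
print(ops_per_alu)  # ~7.3e8, i.e. consistent with a sub-GHz mobile clock
```

The takeaway: 3 trillion operations per second spread over 4,096 ALUs works out to roughly 730 million operations per ALU per second, which is plausible for a chip running on a mobile power budget.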

Isaac Reynolds, the project manager for the Pixel Camera, explained that HDR+ works quite differently from conventional HDR. Where HDR combines three or more exposures taken at different brightness levels, HDR+ instead takes around ten identical, underexposed shots.

“We take them all and chop them into little bits, and line them on top of one another, and average the image together,” Reynolds said, according to Wired.
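Reynolds' description of burst averaging can be sketched in a few lines. This is a deliberate simplification – real HDR+ also aligns tiles and merges them robustly – and all names here are our own:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a burst of ~10 identical, underexposed shots of the same scene:
# each frame is the true scene plus independent sensor noise.
scene = rng.uniform(0.0, 0.2, size=(64, 64))  # dim "ground truth" image
burst = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(10)]

# Averaging the aligned frames keeps the signal but cancels much of the noise.
merged = np.mean(burst, axis=0)

noise_single = np.abs(burst[0] - scene).mean()
noise_merged = np.abs(merged - scene).mean()
print(noise_single > noise_merged)  # True: the merged frame is cleaner
```

Averaging N independent noisy frames cuts the noise by roughly a factor of the square root of N, which is why stacking ten underexposed shots can rival one long, well-lit exposure.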

AI boost for digital zoom

Google’s Pixel Visual Core also runs RAISR (Rapid and Accurate Image Super Resolution) to give zoomed shots a sharper look, rendering more detail than before. RAISR enhances otherwise ordinary digital zoom with a machine learning system trained on real photos.

Almost all smartphone makers are focusing on high-quality images, and phones are largely preferred over traditional cameras for their portability and ease of shooting. Digital zoom has long been a weak point: the hardware cannot optically magnify distant subjects, leaving the final images blurry and pixelated.

RAISR, however, goes beyond simple image processing, adding a step of computational manipulation between the photons that reach the camera’s image sensor and the final product. The technique uses filters to smooth skin, reduce the noise that speckles a shot, and sharpen the image.
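The upscale-then-enhance idea can be illustrated with a toy version. Note that this is not Google's actual RAISR pipeline – RAISR selects among many learned filters per image patch, whereas this sketch applies one fixed sharpening kernel after a naive zoom:

```python
import numpy as np

def upscale_nearest(img, factor=2):
    """Naive digital zoom: repeat each pixel (what plain zoom does)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def sharpen(img):
    """Apply a fixed 3x3 sharpening kernel to restore edge contrast."""
    kernel = np.array([[ 0, -1,  0],
                       [-1,  5, -1],
                       [ 0, -1,  0]], dtype=float)
    h, w = img.shape
    out = img.copy()
    for y in range(1, h - 1):          # skip the 1-pixel border
        for x in range(1, w - 1):
            out[y, x] = np.sum(img[y - 1:y + 2, x - 1:x + 2] * kernel)
    return np.clip(out, 0.0, 1.0)

img = np.linspace(0, 1, 16).reshape(4, 4)  # tiny stand-in image
zoomed = sharpen(upscale_nearest(img))     # zoom first, then enhance
print(zoomed.shape)  # (8, 8)
```

The point of RAISR's learned, patch-selected filters is precisely to do better than a single fixed kernel like this one: edges, textures and flat regions each get a filter tuned to them.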