Much has been said about the Pixel 2 and the Pixel 2 XL since their official launch in October, and not all of it has been positive. Some problems and issues still need to be addressed, but one thing is certain: the camera is great. It was immediately named the new leader on DxOMark Mobile and remains at the top of the list with a near-perfect score of 98. Beyond the camera, the phones also offer AR and VR experiences, especially when paired with a Daydream View headset.
Part of what makes the Pixel 2 camera great is HDR+, the technology Google added so anyone can capture good images even in less-than-ideal lighting. The feature has served the Pixel 2 and Pixel 2 XL well, and now Google wants more people to benefit from it by opening up the Pixel Visual Core co-processor to some of today's most popular camera and photo-sharing apps, including Snapchat, WhatsApp, and Instagram. When you take photos inside these apps, the same technology behind the Pixel 2 camera goes to work, instantly improving photo quality so third-party shots look as if they came from the stock camera.
Pixel Visual Core makes the HDR+ technology more effective. It uses machine learning and computational photography to deliver "picture-perfect" results, automatically adjusting and processing photos to give you the best possible image quality every time. Simply put, it handles the image processing.
Instead of updating the camera app or the device software, Google is enabling Pixel Visual Core for the photo-sharing and social apps that most people already use. Those apps gain the benefit of the HDR+ algorithm, RAISR for clearer zoomed-in shots, and Zero Shutter Lag. The technology is also available through Google Open Source, which means developers can make use of it in their own apps as well.
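For developers, the practical upshot is that on a Pixel 2 the improved processing is applied when an app captures stills through the standard Android Camera2 API rather than through any special SDK. The Kotlin sketch below is only an illustration of that idea, not code from Google's announcement: the captureStill helper is hypothetical, it assumes a CameraDevice and capture session have already been opened elsewhere, and it simply builds an ordinary still-capture request with zero shutter lag enabled. Whether Pixel Visual Core then accelerates the shot is decided by the platform, not by anything in the app's code.

```kotlin
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraMetadata
import android.hardware.camera2.CaptureRequest
import android.view.Surface

// Hypothetical helper: captures one still frame through the regular Camera2 API.
// On a Pixel 2, HDR+ processing (via Pixel Visual Core) is applied by the
// platform to ordinary captures like this; the app never talks to the
// co-processor directly.
fun captureStill(
    camera: CameraDevice,          // assumed to be opened by the caller
    session: CameraCaptureSession, // assumed to be configured with `target`
    target: Surface                // e.g. an ImageReader surface for JPEG output
) {
    val request = camera.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE).apply {
        addTarget(target)
        // Standard Camera2 keys: let the pipeline handle exposure and white
        // balance, and request zero shutter lag where the device supports it.
        set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO)
        set(CaptureRequest.CONTROL_ENABLE_ZSL, true)
    }
    // Fire the capture; the result callback and handler are omitted in this sketch.
    session.capture(request.build(), null, null)
}
```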
In addition, Google is adding new winter sports-themed Augmented Reality (AR) Stickers. The new characters can interact with each other and the camera to make taking photos more exciting than ever. Check out the sample video below:
SOURCE: The Keyword (Google)