Google improves the AR experience on dual-camera smartphones, but so far only on the Pixel 4 and 4 XL

Today it became known that Google has updated its SDK for building augmented reality applications. As Android Police notes, the update improves the performance of AR applications on smartphones with two cameras. Interestingly, the improvement so far applies only to Google's 2019 Pixel smartphones.

In its official note on the ARCore update, the company says that dual-camera support for improved augmented reality will roll out in the coming week. For now, only the Pixel 4 and Pixel 4 XL, released in 2019, are listed among the supported devices.

This is rather odd, since Google's current dual-camera smartphones are the Pixel 5 and Pixel 4a 5G. Perhaps their support is not yet implemented because their additional sensor is an ultra-wide-angle module, whereas the Pixel 4 and Pixel 4 XL pair the main camera with a telephoto module with optical zoom. It is possible that telephoto modules help the algorithms build a better depth map of the environment.

As The Verge notes, this case illustrates Google's wavering stance on additional camera modules. In the days of the Pixel 2 and 3, the company argued against extra camera sensors, claiming its software algorithms could squeeze the maximum out of a single camera. For example, the Pixel 4a has a neural-network-based algorithm that produces photos with 4x digital zoom comparable in quality to optical zoom. A year after championing the single camera, the Pixel 4 added a telephoto camera with optical zoom, and the Pixel 5 replaced it with an ultra-wide-angle module.
