We have come to expect that with each new generation of flagship smartphones, features are only added, never removed. However, Samsung may have a different idea about the camera system of the Galaxy S21, which will be released next year.
According to a report by Korea's The Elec, via SamMobile, Samsung has decided not to include a time-of-flight sensor in the upcoming Galaxy S21 series (also rumored to be called the Galaxy S30), apparently for two reasons.
First, the company has struggled to find an obvious use case for time-of-flight technology. Second, the LiDAR system expected to be included in Apple's iPhone 12 Pro models is more powerful, and Samsung is not confident that its approach can compete.

According to The Elec, Samsung is working hard on a new and improved indirect time-of-flight system that is not like LiDAR but is based on hardware already present in its devices. Unfortunately, that solution is not expected to be ready in time for the Galaxy S21's launch in spring 2021. Time of flight has, in fact, already been left out of Samsung's most recent flagships: neither the Galaxy Note 20 Ultra nor the Galaxy Note 20 has such a sensor. (The Note 20 Ultra does, however, have a laser autofocus sensor.)
The problem is one of distance and accuracy: with LiDAR, the iPhone 12 Pro would be able to detect objects in physical space at twice the distance of conventional indirect time-of-flight sensors. The LiDAR method also generates a more detailed 3D depth map than the usual time-of-flight approach, making augmented reality applications smoother, more realistic, and more accurately anchored to the surrounding environment.
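For readers curious about where that distance gap comes from, here is a minimal sketch of the two measurement principles (our own illustration, not Samsung's or Apple's actual pipeline): direct time of flight, the LiDAR approach, times a light pulse's round trip, while indirect time of flight infers distance from the phase shift of continuously modulated light, and that phase wraps around, capping the usable range.

```python
import math

# Minimal sketch of the two time-of-flight principles (our own illustration,
# not Samsung's or Apple's actual implementation).

C = 299_792_458.0  # speed of light, m/s

def direct_tof_distance(round_trip_s: float) -> float:
    """Direct ToF (LiDAR-style): time a light pulse's round trip, halve it."""
    return C * round_trip_s / 2

def indirect_tof_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: infer distance from the phase shift of modulated light.

    The phase wraps every 2*pi, capping the unambiguous range at
    C / (2 * mod_freq_hz) -- one reason these sensors see less far than LiDAR.
    """
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

print(direct_tof_distance(20e-9))             # 20 ns round trip -> ~3.0 m
print(indirect_tof_distance(math.pi, 100e6))  # half-cycle shift at 100 MHz -> ~0.75 m
# At 100 MHz modulation, range is only unambiguous out to C / 2e8 = ~1.5 m.
```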
On the other hand, indirect time-of-flight sensors, such as those found in Samsung and LG devices, are cheaper to manufacture and are more common in high-end Android devices
With regard to photography, we have had the opportunity to test a number of devices with time-of-flight cameras over the years and have never found them to be particularly effective at improving image quality. Typically, phones with time-of-flight sensors use the added depth information to build a 3D map, which can more intelligently separate foreground from background in simulated shallow depth-of-field shots.
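A hypothetical sketch of that idea, purely our own illustration rather than any phone maker's pipeline: once a per-pixel depth map exists, simulated background blur can be as simple as masking everything beyond the focal plane and compositing in a blurred copy of the frame.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical illustration of depth-driven portrait blur, not any phone
# maker's actual pipeline: a per-pixel depth map separates foreground from
# background, and only the background is replaced with a blurred copy.

def simulated_bokeh(image: np.ndarray, depth_m: np.ndarray,
                    focus_max_m: float = 1.5) -> np.ndarray:
    """image: HxWx3 float array in [0, 1]; depth_m: HxW depth map in meters."""
    blurred = gaussian_filter(image, sigma=(8, 8, 0))  # blur spatial axes only
    background = (depth_m > focus_max_m)[..., None]    # mask from the depth map
    return np.where(background, blurred, image)        # keep the subject sharp
```

Notably, nothing in this composite cares where the depth map came from: a time-of-flight sensor, a stereo pair, or a learned model can all supply it, which is exactly why the sensor is easy to replace.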
In many cases, however, the same results can be achieved with the stereoscopic view from two different camera lenses, without the need to add time of flight to the mix. Furthermore, software based on machine-learning models alone has improved rapidly over the past few years, to the point where even single-lens devices like the iPhone SE and Google Pixel 4a can produce depth-of-field effects that are nearly equivalent to those of expensive multi-lens flagships.
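As a rough illustration of why two lenses suffice, classic stereo triangulation recovers depth from the disparity between the two views. This is a toy sketch with made-up, phone-like numbers, not any vendor's implementation:

```python
# Toy illustration of stereo depth from two lenses (hypothetical numbers,
# not any phone vendor's actual implementation).

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo triangulation: depth = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two camera lenses, in meters
    disparity_px -- horizontal shift of the same point between the two images
    """
    return focal_px * baseline_m / disparity_px

# A phone-like setup: ~2800 px focal length, lenses 1 cm apart. A point that
# shifts 20 px between the two images sits about 1.4 m away. Small disparities
# mean distant objects, which is where stereo precision falls off.
print(stereo_depth(focal_px=2800, baseline_m=0.01, disparity_px=20))  # ~1.4 m
```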
In short, time of flight has never been particularly useful on flagship devices (at least in its current iteration) and has been something of a gimmick that never quite lived up to phone manufacturers' promises. That Samsung apparently believed in time of flight for as long as it did before finally canning it, based on this report, is a bit of a head-scratcher.
Perhaps LiDAR could succeed where previous time-of-flight attempts have failed. Currently, Apple is the only smartphone brand trying to integrate LiDAR technology into its devices, and Cupertino already has experience with it, having introduced LiDAR in its latest iPad Pro. If LiDAR truly benefits the iPhone 12 experience, Apple's competitors will take notice and work tirelessly to catch up.