A 200-megapixel sensor: what is it for?
A 200-megapixel sensor, Super HDR video, Super Quad Pixel technology… we walk you through the photo and video technologies of the Galaxy S23 Ultra.
As expected, Samsung presented its new range of smartphones, the Galaxy S23, S23 Plus and S23 Ultra, on Wednesday, February 1 during its Unpacked event. But even more than in previous years, the Korean manufacturer put the emphasis on the photo capabilities of its "Ultra" model. Of the one-hour conference, a total of 32 minutes were devoted to the smartphones themselves, including 27 minutes specifically on the photo and video performance of the Galaxy S23s.
Suffice it to say that, for Samsung, photo and video are now at the center of the communication around its smartphones. But you still need to know how to decipher these novelties. Between the 200-megapixel sensor, the "dual-focus AF", the "OIS+VDIS" video stabilization and the 12-bit dynamic range, we will explain what to remember about the photo and video performance of the Galaxy S23 Ultra.
A 200-megapixel sensor for 50- or 12.5-megapixel photos
As expected, Samsung chose to equip the main camera of its Galaxy S23 Ultra with a 200-megapixel sensor. This is the Samsung Isocell HP2, a 1/1.3″ format photo sensor that the firm unveiled in mid-January.
To be more precise, we should really speak of a sensor with 200 million photosites. In a photo or video sensor, each pixel is actually captured by a photosite, a cell that converts the light it receives into an electrical signal. On the Samsung Galaxy S23 Ultra, however, while it is possible to take 200-megapixel shots, this is not the default option.
The Galaxy S23 Ultra will actually use this ultra-high-definition sensor to group pixels together depending on lighting conditions. In a dark scene, 16 adjacent photosites will be combined to create a single pixel, producing a 12.5-megapixel image. In more favorable lighting, photosites are combined in groups of 4, for 50-megapixel shots.
The idea behind this technology is twofold: sharper images when light is plentiful, and a wider dynamic range, with more brightness and less digital noise, when light is scarce.
For the first case, note that standard camera sensors rarely exceed 100 million pixels. This threshold is only crossed by medium-format sensors, such as the one in the Fujifilm GFX 100S. Among full-frame bodies, the most pixel-dense models reach 61 million pixels, as in the Sony A7R V, while APS-C cameras top out at 40 million pixels, as in the Fujifilm X-H2. As a rule of thumb, the larger the sensor, the more detail it can capture: too high a definition on too small a sensor reduces the size of the photosites by increasing their density. The Galaxy S23 Ultra will nevertheless let you take 50-megapixel shots by default, or 200-megapixel ones via a dedicated option, for those who want to crop their photos in post-production. This higher definition also makes it possible to use the main sensor as part of the hybrid zoom, alongside the telephoto modules.
For the second case, we get pixel binning, that is, the merging of adjacent photosites. By combining them in groups of 4 or 16 to create one pixel, we create larger virtual photosites. This brings two advantages. First, by combining pixels, we reduce the risk of digital noise, the colored grain associated with raising the ISO sensitivity. Second, we increase the dynamic range of these virtual photosites, which, being larger, capture both very bright and very dark elements more easily. So, if each photosite measures 0.6 μm on a side, we get virtual photosites of 1.2 μm on a side in 50-megapixel mode and 2.4 μm on a side in 12.5-megapixel mode.
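As a rough illustration of the principle (a toy model, not Samsung's actual readout pipeline), pixel binning can be sketched as averaging square blocks of neighboring photosite values into one pixel:

```python
def bin_pixels(raw, factor):
    """Average factor x factor blocks of photosite values into one pixel (toy model)."""
    h, w = len(raw), len(raw[0])
    out = []
    for i in range(0, h - h % factor, factor):
        row = []
        for j in range(0, w - w % factor, factor):
            block = [raw[i + di][j + dj] for di in range(factor) for dj in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Toy "sensor" readout: an 8 x 8 grid of photosite values 0..63
raw = [[r * 8 + c for c in range(8)] for r in range(8)]

binned_4 = bin_pixels(raw, 2)   # 4-to-1 binning  -> 4 x 4 grid (the "50 MP mode")
binned_16 = bin_pixels(raw, 4)  # 16-to-1 binning -> 2 x 2 grid (the "12.5 MP mode")
```

Each output pixel averages the signal of its block, which is why noise drops and effective photosite size grows as more photosites are merged.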
For comparison, the iPhone 14 Pro, which uses the same technology with a 48-million-photosite sensor for 12-megapixel shots, combines four photosites of 1.22 μm to create virtual photosites of 2.44 μm.
Autofocus based on groups of four pixels
The photo announcements around the Galaxy S23 Ultra are not, however, limited to the definition of the main sensor. Samsung also highlighted its autofocus technology, which it calls "Super Quad Pixel". In practice, this is a phase-detection autofocus technology that uses the different photosites forming each pixel, comparing the differences in focus between them.
During the Galaxy Unpacked conference, Jaclyn Wyatt, social media manager at Samsung, explained: "We've simplified autofocus with Super Quad Pixel, which uses each of 200 million photosites to easily focus on your subjects. By using four adjacent photosites to detect the differences between left and right, and top and bottom, it allows the camera to focus faster, because it has more reference points."
This technology is not new in the field of smartphone photography. For several years, Google has used an autofocus system called "Dual Pixel", in which each photosite is split into two parts to better distinguish the difference in perspective, a form of stereoscopy. This quasi-three-dimensional view is enough to determine depth, and therefore focus. The "Super Quad Pixel" system, which measures depth for autofocus, can also be used for portrait-mode photos, gauging the depth of the scene as finely as possible to produce the background blur.
More advanced video functions
Samsung has also focused on the video performance of its Galaxy S23 Ultra, and it has high ambitions here as well. Admittedly, the manufacturer is not betting on the RAW video format that Apple has been offering on its iPhone Pro for a year and a half, but Samsung has nevertheless announced the arrival of "Super HDR" video.
In fact, this function is clearly reminiscent of the HDR modes already offered for photos. As a reminder, the principle is for the smartphone to take several photos in quick succession, each with a different exposure. By merging them, the smartphone can preserve detail in the dark areas as well as in the brightest ones.
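The merging step can be sketched as a toy exposure fusion, a deliberately simplified stand-in for the real pipeline (which also aligns frames and applies tone mapping):

```python
def fuse_exposures(frames, weights=None):
    """Toy HDR merge: weighted per-pixel average of bracketed exposures.

    frames: list of equally sized 2D grids of normalized brightness (0.0-1.0),
    each captured at a different exposure. Real pipelines are far more involved.
    """
    weights = weights or [1.0] * len(frames)
    total = sum(weights)
    return [
        [sum(w * f[i][j] for w, f in zip(weights, frames)) / total
         for j in range(len(frames[0][0]))]
        for i in range(len(frames[0]))
    ]

under = [[0.1, 0.4], [0.0, 0.2]]  # underexposed frame: keeps highlight detail
over = [[0.7, 1.0], [0.5, 0.9]]   # overexposed frame: keeps shadow detail
merged = fuse_exposures([under, over])
```

The merged frame sits between the two brackets, retaining usable signal at both ends of the brightness scale.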
In video, it is the same technology that is implemented, capturing the scene at different exposures. Enough, according to Samsung, to produce footage with a 12-bit dynamic range. Whereas a classic video sequence is confined to 8 bits, or 256 brightness values, a 12-bit dynamic range goes up to 4096 values. In other words, videographers will enjoy a more detailed image.
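The jump from 8 to 12 bits is easy to quantify, since each extra bit doubles the number of recordable brightness values (this is general arithmetic, not a detail of Samsung's encoding):

```python
# Number of distinct brightness values a channel can record at a given bit depth.
def levels(bits):
    return 2 ** bits

classic = levels(8)     # 8-bit video: 256 brightness values
super_hdr = levels(12)  # 12-bit video: 4096 brightness values
print(classic, super_hdr, super_hdr // classic)  # 256 4096 16
```

Four extra bits therefore mean 16 times as many brightness gradations per channel, which is where the extra detail in highlights and shadows comes from.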
Also on the video side, Samsung announced a new stabilization technology, "OIS+VDIS". Concretely, this means the sensor of the Galaxy S23 Ultra is stabilized mechanically, by shifting to compensate for movements, but also digitally. On its website, Samsung says that VDIS (video digital image stabilization) helps "reduce the level of blur or distortion in videos that may result from movement or shake", and that this stabilization works by raising the sensitivity and shutter speed to obtain the smoothest possible footage.
Functions for photo and video professionals
Finally, following a trend of several years now, Samsung also wants to address professional photographers and videographers. Beyond the participation of filmmakers Ridley Scott and Na Hong-jin, the Korean firm also showcased the pro photo and pro video modes of its Galaxy S23 Ultra.
As on previous models, these modes allow you to shoot with manual settings: you can adjust by hand not only the shutter speed, but also the ISO sensitivity, the white balance and the manual focus distance. For this last point, Samsung has also included a "focus peaking" option, which will appeal to videographers by outlining the in-focus areas of the frame in green, a border visible only while shooting. This feature, present for many years on mirrorless cameras, makes it possible to ensure that the focus is as accurate as possible.
Another feature that will undoubtedly appeal to budding videographers: the "clean preview on HDMI displays" option within the Camera Assistant app. Essentially, this feature lets professional or semi-professional videographers display the image feed on an HDMI screen without any interface elements. While this is admittedly a niche use, it should also allow the Galaxy S23 to be connected to an HDMI capture card on a computer to turn it into a webcam.
Finally, still targeting image professionals, Samsung has partnered with Adobe to offer Lightroom out of the box as the default photo editing application on the Galaxy S23, in addition to Samsung's own Expert RAW application. This will allow photographers used to the Adobe suite to keep their habits for developing RAW files.
Obviously, among the batch of functions Samsung has shown off for its Galaxy S23 Ultra, many will only be used by a handful of people. Above all, we are still far from the capabilities, formats and optical quality offered by mirrorless or professional cameras. The fact remains that, like Apple before it, Samsung is betting ever more on imaging, and some functions, invisible to neophytes, should still make it possible to get the most out of the camera module of Samsung's high-end smartphone.