Long-Range Telephoto Zoom in China's Flagship Comparison: Cityscape and Portrait

Our series on the latest Chinese flagship smartphones continues. After our first episode focused on daytime portraits and backlit scenarios, and a second episode where we explored night and low-light capabilities, we now turn our attention to long-range zoom performance to see how these devices hold up when pushed to their limits.


Modern smartphones are increasingly capable of reaching exceptionally long focal lengths thanks to high-resolution sensors, hybrid optical systems, and advanced computational photography. However, as zoom ratios increase, software processing plays an even greater role, influencing how details, fine textures, and overall scene rendering are reproduced.

In this comparison, we examined long-range telephoto performance across five top-tier devices: Apple iPhone 17 Pro, Honor Magic 8 Pro, OPPO Find X9 Pro, vivo X300 Pro, and Xiaomi 17 Pro Max. Our intention was to observe how each phone behaves when pushed toward the upper limits of zoom, not only on static landscapes but also on zoomed-in portraits where facial rendering becomes more challenging.

Portrait Performance

Apple iPhone 17 Pro
Apple iPhone 17 Pro (8x zoom)
Honor Magic 8 Pro
Honor Magic 8 Pro (10x zoom)
OPPO Find X9 Pro
OPPO Find X9 Pro (6x zoom)
vivo X300 Pro
vivo X300 Pro (10x zoom)
Xiaomi 17 Pro Max
Xiaomi 17 Pro Max (10x zoom)

Portraits under long-range zoom further amplified stylistic differences between devices. The iPhone 17 Pro rendered subjects slightly darker and warmer, producing a natural yet more subdued interpretation. The OPPO Find X9 Pro leaned toward a softer result with moderate detail, while Honor Magic 8 Pro produced bright, crisp faces with subtle signs of motion blur.

The Xiaomi 17 Pro Max delivered clearly defined facial detail, and vivo X300 Pro produced one of the brightest and sharpest portrayals overall. When viewed closely, vivo, Honor, and Xiaomi all displayed slight sharpening behavior that became noticeable on fine details, but in full-frame viewing, each device presented a balanced and visually pleasing portrait representation. These variations reflect each brand’s priority between natural rendering and enhanced clarity when operating at extreme zoom levels.

Cityscape Performance

Apple iPhone 17 Pro
Apple iPhone 17 Pro (8x zoom)
Honor Magic 8 Pro
Honor Magic 8 Pro (10x zoom)
OPPO Find X9 Pro
OPPO Find X9 Pro (6x zoom)
vivo X300 Pro
vivo X300 Pro (10x zoom)
Xiaomi 17 Pro Max
Xiaomi 17 Pro Max (10x zoom)

At extremely long zoom distances, all five smartphones captured usable and visually appealing landscape results, yet their processing styles showed clear differentiation. The iPhone 17 Pro delivered a natural overall rendering with a restrained approach to sharpening, while the OPPO Find X9 Pro maintained stable detail even at a slightly lower zoom level, though minor flare reduced contrast in certain situations.

The Honor Magic 8 Pro produced clean images but relied on heavier detail processing, resulting in occasional unnatural nuances when examined closely. Similarly, vivo X300 Pro delivered strong detail and brightness, though its processing introduced more noticeable artifacts in fine textures.

The Xiaomi 17 Pro Max followed a similar path, producing generally pleasant full-frame results, yet slight motion blur reduced clarity and contrast under deep magnification. Despite these variations, all devices performed competently, with differences becoming most apparent during zoom-in evaluation.

Conclusion

All five flagships demonstrated convincing capability at long zoom ranges, confirming how far mobile telephoto systems have evolved. The iPhone prioritizes a restrained and natural style, while OPPO focuses on stability and balance even with a slightly shorter zoom factor. Honor offers sharp and clean results, though sometimes with visible processing at close inspection, whereas vivo pushes clarity and brightness further for a crisp, high-impact rendering. Xiaomi sits closely beside vivo and Honor with a similarly detailed output, though fine-detail softness may appear in movement-heavy scenes.

In the end, each manufacturer approaches high-zoom photography with its own vision, from natural rendering to clarity-driven enhancement, allowing different user preferences to align with different devices.

Low-Light & Night Camera Performance: A Closer Look at China's Latest Flagships

Our exploration of the newest Chinese flagships continues. Following the first episode, where we examined daytime portraits and challenging backlit shots, we’re now ready to dive into how these devices perform in low-light and nighttime conditions. If you’d like to discover how these flagship devices performed in brighter conditions, you can read the previous article here.


Building on that foundation, this new installment focuses on night photography, an area where flagship smartphones increasingly differentiate themselves. We compared five major devices: Apple iPhone 17 Pro, HONOR Magic8 Pro, OPPO Find X9 Pro, vivo X300 Pro, and Xiaomi 17 Pro Max.
Our image quality team analyzed each photo to better understand how these models handle both night & low-light main camera shots and telephoto night scenes, two situations that challenge even the most advanced devices.

Low-Light Performance

In this nighttime scenario, the five smartphones demonstrated noticeably different approaches to rendering the scene and presenting the subject.

Apple iPhone 17 Pro
Honor Magic8 Pro
Oppo Find X9 Pro
vivo X300 Pro
Xiaomi 17 Pro Max

The Xiaomi 17 Pro Max produced a darker overall image while still keeping the subject’s face visibly well-defined. On the opposite end of the spectrum, HONOR Magic8 Pro and the iPhone 17 Pro delivered significantly brighter interpretations, giving the scene a more illuminated appearance.

Among the devices, Apple’s rendering tended to appear slightly cooler compared to the others, while OPPO introduced a warmer interpretation of the subject’s skin. These stylistic choices created subtle but noticeable differences in the overall mood of the images.

In terms of detail, the HONOR Magic8 Pro displayed a somewhat more processed look, giving the scene a style that may appeal to users who prefer a highly refined, digitally enhanced aesthetic. The iPhone 17 Pro showed a bit more visible grain across the frame due to its imaging mode choice, though its overall luminance remained balanced.
It’s worth noting that for this scene, the iPhone automatically switched to a 12MP mode designed for improved night performance rather than the higher resolution it used in other tests.

Low-Light Telephoto Comparison

The telephoto evaluation highlighted the differences in each device's optical setup; a quick sketch after the list shows how these focal lengths relate to the quoted zoom factors:

    • Xiaomi: 115 mm (5×)
    • vivo: 85 mm (3.5×)
    • OPPO: 70 mm (3×)
    • HONOR: 85 mm (3.7×)
    • Apple: 100 mm (4×)
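
These zoom factors are simply the ratio between the telephoto lens's 35 mm-equivalent focal length and that of the device's main wide camera. Below is a minimal Python sketch of that arithmetic, assuming round main-camera equivalents of roughly 23–24 mm; these main-camera values are illustrative assumptions, not measured specifications.

```python
# Illustrative only: zoom factor ~= telephoto focal length / main-camera focal length,
# both expressed as 35 mm equivalents. Main-camera values are assumed for this sketch.
TELEPHOTO_MM = {"Xiaomi": 115, "vivo": 85, "OPPO": 70, "HONOR": 85, "Apple": 100}
ASSUMED_MAIN_MM = {"Xiaomi": 23, "vivo": 24, "OPPO": 23, "HONOR": 23, "Apple": 24}

for brand, tele in TELEPHOTO_MM.items():
    factor = tele / ASSUMED_MAIN_MM[brand]
    print(f"{brand}: {tele} mm / {ASSUMED_MAIN_MM[brand]} mm = {factor:.1f}x")
```

With those assumed main-camera values, the computed ratios land close to the zoom factors quoted above.
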
Honor Magic8 Pro
Apple iPhone 17 Pro
Oppo Find X9 Pro
vivo X300 Pro
Xiaomi 17 Pro Max

Each device produced a well-rendered telephoto image, though their visual interpretations varied. HONOR and Xiaomi offered the brightest overall results, while OPPO and vivo delivered a slightly more subdued rendering of the scene.

Color presentation remained pleasant across all models, with Apple leaning slightly warmer and Xiaomi closer to a neutral balance. These differences shaped the atmosphere of each image rather than significantly altering accuracy.

In terms of clarity, vivo and HONOR produced the sharpest and most defined telephoto results in this night scene. OPPO’s output appeared softer in comparison, while Xiaomi introduced a small trace of motion softness, likely due to longer capture processing. The iPhone 17 Pro showed more visible grain, which gave its telephoto image a more textured appearance than the others.

A JOD (Just Observable Difference) is the smallest change in quality that the human eye can detect. On this scale, a gap of 1 JOD means a noticeable improvement; therefore, higher JOD values correspond to better texture and noise performance.
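
For readers curious about how such a scale is built: JOD-type quality scales are commonly derived from pairwise comparisons using Thurstonian scaling, conventionally anchored so that a 1-JOD gap corresponds to roughly 75% of observers preferring one image over the other. The sketch below illustrates that textbook construction only; it is not DXOMARK's exact scoring pipeline.

```python
# Thurstone-style conversion from a pairwise preference rate to a quality gap in JODs,
# using the common convention that a 75% preference rate equals a 1 JOD difference.
from statistics import NormalDist

def jod_difference(p_prefer_a: float) -> float:
    """Quality gap (A minus B) in JOD units, given the fraction of observers preferring A."""
    z = NormalDist().inv_cdf(p_prefer_a)      # probit of the preference rate
    return z / NormalDist().inv_cdf(0.75)     # rescale so that 75% preference = 1 JOD

print(jod_difference(0.75))  # -> 1.0
print(jod_difference(0.90))  # -> ~1.9
```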

Conclusion

Each flagship smartphone delivered capable night photography results, yet the differences remain significant enough to influence user preference. Xiaomi and Honor favored brighter interpretations, vivo provided some of the clearest telephoto detail, OPPO leaned toward a softer aesthetic, and Apple maintained its characteristic rendering style with a balance between brightness and softness. These variations reflect the unique imaging philosophies of each manufacturer, allowing users to choose the device that aligns best with their visual preferences and typical nighttime shooting habits.

We will continue to dive into the performance of these newly released flagships by testing them in additional real-world scenarios. Stay tuned!

A Closer Look at China's Latest Flagships: Portraits and Backlit Camera Performance

Smartphone photography continues to advance rapidly, with manufacturers refining their imaging pipelines to deliver more consistent and visually appealing results. As users increasingly rely on their phones for high-quality photos in everyday situations, the ability to handle both well-lit and challenging lighting conditions has become a key differentiator. To better understand how today’s high-end devices perform, we tested five flagship smartphones: Xiaomi 17 Pro Max, vivo X300 Pro, HONOR Magic8 Pro, OPPO Find X9 Pro, and Apple iPhone 17 Pro.

In this first episode, we focused on two common scenarios that reveal a device’s photographic strengths and processing decisions: a daytime portrait and a backlit portrait. These scenes were selected because they represent situations encountered by most smartphone users, whether capturing a quick portrait outdoors or photographing a subject with strong light behind them.

Daytime Portrait Performance

Daylight portraits offer a relatively controlled setting, yet they also allow key differences in image processing to emerge. In this scenario, all five devices produced solid results, though each approached subject rendering and scene interpretation in its own way.

Xiaomi 17 Pro Max
Honor Magic8 Pro
Apple iPhone 17 Pro
Oppo Find X9 Pro
vivo X300 Pro

The Xiaomi 17 Pro Max, vivo X300 Pro, and OPPO Find X9 Pro delivered portraits with clear and natural-looking subject representation, maintaining consistent facial rendering and a generally realistic overall appearance. These three devices also remained relatively close in terms of color interpretation, showing only slight variations from one another.

The HONOR Magic8 Pro produced a more processed and stylized look, which may appeal to users who prefer a more enhanced aesthetic. Its rendering also appeared noticeably cooler than the other models, contributing to a less natural impression.

The Apple iPhone 17 Pro offered the warmest interpretation of the scene, with a softer and more subdued representation of the subject compared with the other devices.

Despite these differences, all smartphones handled the daylight portrait scenario effectively, with variations largely reflecting each manufacturer’s preferred interpretation of color, facial detail, and overall portrait style.

Backlit Scene Performance

The backlit test created a more demanding environment, pushing each device to balance the bright background with the subject positioned in front of it. This scenario accentuated the distinctions between the devices far more than the daylight portrait.

vivo X300 Pro
Oppo Find X9 Pro
Apple iPhone 17 Pro
Honor Magic8 Pro
Xiaomi 17 Pro Max

The Xiaomi 17 Pro Max delivered the brightest representation of the subject, making the face more prominent in the frame. In contrast, the iPhone 17 Pro rendered the subject noticeably darker, resulting in a more shadowed appearance.

The vivo X300 Pro and OPPO Find X9 Pro preserved more information in the sky and background areas, offering a better balance between the subject and the environment. Their handling allowed for greater visibility of bright elements behind the subject.

Flare significantly affected both the iPhone 17 Pro and the OPPO Find X9 Pro, reducing overall clarity in the image and softening some details. The Xiaomi 17 Pro Max, vivo X300 Pro, and HONOR Magic8 Pro maintained stronger subject definition, although the HONOR model continued to exhibit a more processed overall look.
Among the five devices, the OPPO Find X9 Pro showed a noticeable drop in crispness due to flare, while the iPhone 17 Pro delivered the softest backlit result overall.

Conclusion

Across both scenes, the five flagship smartphones demonstrated competent performance, but they also exhibited clearly different rendering styles that reflect each manufacturer’s design choices. Xiaomi, vivo and OPPO tended to deliver brighter and more detailed subject representation, making their results appealing to those who prefer a clear and defined look. HONOR leaned toward a more processed aesthetic that may suit users who enjoy a stylized portrait effect. Apple’s iPhone 17 Pro produced softer and darker results in challenging backlit situations, offering a more subdued interpretation of the scene. These distinctions highlight how each device prioritizes different aspects of image creation, helping users select the smartphone that best aligns with their visual preferences and typical shooting conditions.

We will continue to dive into the performance of these newly released flagships by testing them in additional real-world scenarios. Stay tuned!

The Glass-to-Glass Experience: Is what you see truly what you get on your smartphone?

We previously examined smartphone features that depend on ambient Correlated Color Temperature (CCT) and that are quickly becoming standard across flagship smartphones. In our initial article on CCT adjustment technology, we highlighted a key industry challenge: smartphones handle color rendering very differently depending on ambient light. This lack of consensus results in varied, and sometimes inconsistent, visual outputs across devices and environments.

Our observations led us to another question: “How do smartphones render the reality we see?”

In other words, to what extent do captured photos reflect the real-life scenes they were taken in? And how much of what users perceive comes not only from the camera, but also from the display?

To explore these questions, we designed a set of experiments examining the so-called "glass-to-glass" experience, which can be defined as the full imaging journey from capture through the camera lens and optics (glass) to rendering on the display panel (glass).

The aim of these experiments was to investigate how consistent the smartphone experience is from capture to display.

To do so, we put five flagship smartphones to the test: the iPhone 16 Pro Max, Samsung Galaxy S24 Ultra, Huawei Pura 70 Ultra, Vivo X200 Pro, and Honor Magic 7 Pro. Testing was conducted under both controlled laboratory conditions and real-world scenarios, allowing us to assess the entire imaging pipeline, from photo capture to on-device display rendering under real and ideal laboratory situations.

The experiment was divided into three parts:

    • Camera performance: examining HDR photo capture fidelity to the real scene
    • Display performance: analyzing how images are rendered on device screens
    • Camera to display user experience: studying how a small group of participants perceive fidelity and preference directly on smartphones

Camera capture: a variety of renderings

In this first phase of our experiment, we focused on the analysis of HDR photo capture performance, applying our standardized camera testing protocol (learn more here). This included assessments on an HDR monitor (an Apple Pro Display XDR) under calibrated reference lighting conditions to ensure perceptual accuracy.

Entire imaging pipeline captured at the scene, lowlight environment
Perceptual evaluation using standardized visualization conditions
Captures viewed on a standard HDR screen under standardized visualization conditions

Comparing the captures simultaneously on the same reference display, we detected meaningful differences in the accuracy of scene reproduction that were attributable solely to the capture process. For scenes shot in low-light conditions in particular, several challenges emerged that could noticeably affect perceived image quality.

Viewing the set of pictures on an HDR screen, we observed significant issues in some of the pictures taken:

    • Inconsistent white balance
    • Unnatural skin tones
    • In some cases, unbalanced exposure between subject and background
    • And even, for a few flagships, a lack of HDR support impacting the rendering

These shortcomings varied across the devices tested (Apple iPhone 16 Pro Max, Huawei Pura 70 Ultra, Honor Magic 7 Pro, Samsung Galaxy S24 Ultra, and Vivo X200 Pro), highlighting the diverse approaches OEMs take in HDR imaging.

Yet the story does not end with camera capture. The way an image is displayed can amplify or mitigate these capture-related limitations, ultimately shaping user perception.

Display playback: disparities in luminance and color strategies

To assess rendering quality of the tested displays, we loaded two test images on the flagship devices:

    1. An SDR photo captured with a DSLR camera
    2. An HDR photo captured with a smartphone (converted into an HDR format that was compatible across all devices)
SDR photo captured with a DSLR camera, displayed on each smartphone under low-light conditions that replicate those of the original scene
HDR photo captured with a smartphone, displayed on each device under low-light conditions that replicate those of the original scene

The goal was to evaluate the difference in rendering on each display for the same input content. Testing was conducted in a controlled lab environment under lighting conditions designed to replicate the original scenes. Detailed luminance and color measurements allowed us to analyze each device’s display behavior with precision.

These findings highlight clear differences in luminance tuning strategies. Brightness levels varied significantly, affecting both readability and overall clarity.

The Huawei Pura 70 Ultra, for example, boosted highlights more aggressively than its peers in both SDR and HDR renderings. Other devices, while appearing dimmer overall, relied on local tone mapping to preserve visibility of key details.
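
To make the distinction concrete, the sketch below contrasts a purely global brightness boost with a simple local operator that normalizes each pixel by a blurred estimate of its neighborhood luminance. This is a deliberately simplified illustration of the principle, not a reconstruction of any tested device's processing.

```python
# Simplified illustration: global vs. local tone mapping of a linear-light
# luminance image with values in [0, 1]. Not any manufacturer's actual pipeline.
import numpy as np
from scipy.ndimage import gaussian_filter

def global_tonemap(lum: np.ndarray, gain: float = 2.0) -> np.ndarray:
    """Boost and gamma-encode every pixel identically; bright regions clip first."""
    return np.clip(gain * lum, 0.0, 1.0) ** (1 / 2.2)

def local_tonemap(lum: np.ndarray, sigma: float = 25.0, strength: float = 0.7) -> np.ndarray:
    """Divide by a blurred local-average luminance so shadows are lifted while
    bright regions are compressed, preserving visibility of local detail."""
    local_avg = gaussian_filter(lum, sigma=sigma) + 1e-4
    mapped = lum / local_avg ** strength
    return np.clip(mapped / mapped.max(), 0.0, 1.0) ** (1 / 2.2)
```

In the local variant, the amount of lift depends on the surrounding region, which is why devices relying on such operators can look dimmer overall yet still keep key details readable.
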
Color accuracy was equally distinctive.

Noticeable shifts were observed across devices, and users are more sensitive to these changes than they might realize. Each OEM deployed its own ambient light adaptation method. Apple’s iPhones, for instance, dynamically adjusted white balance to match environmental conditions, often leaning towards warmer tones under warm low-light environments.

The key takeaway of this study is that display rendering extends beyond simple image reproduction. It is fundamentally about contextual adaptation. The result is not a direct mirror of reality, but a perceptual experience shaped by both device design and viewing conditions. As perception varies with the environment, the visual impression does as well, reinforcing the need for holistic testing approaches that consider the entire imaging pipeline.

Evaluating the Smartphone Glass-to-Glass User Experience: From Reality to Preference

To better understand how users perceive smartphone camera performance, we conducted a focused study with a small group of European participants. The goal was to evaluate the perceived fidelity and preference of image renderings directly on smartphones, moments after capture. The experiment took place in situ, meaning that participants evaluated photos directly in the same environment where they were taken.

Two complementary evaluation methods were used.

    • First, each participant provided an individual assessment by answering: “How close is the rendering to reality?” on a 1–5 scale.
    • Second, a side-by-side comparison was conducted, where participants selected the rendering that looked closest to reality and the one furthest from reality.
Figure 3: side-by-side evaluation
Figure 4: individual evaluation

The results showed a nuanced picture. In the individual evaluation, the Vivo X200 Pro, the iPhone 16 Pro Max, and the Huawei Pura 70 Ultra were rated as close to reality overall. In contrast, the Honor Magic 7 Pro was consistently considered the least faithful, both individually and in side-by-side comparisons.

To extend our understanding beyond fidelity, the experiment was repeated with a new focus: user preference. Participants were asked: “How do you like the rendering?” (1–5 scale) and in side-by-side mode, they were instructed to select their most preferred and least preferred rendering.

Interestingly, the findings diverged from the fidelity-focused evaluation. The iPhone 16 Pro Max emerged as the most preferred device in side-by-side comparisons, followed by the Vivo X200 Pro. While the Vivo had previously been considered the most accurate, the iPhone was slightly more appreciated by participants overall. Once again, Honor Magic 7 Pro ranked last, being both the least preferred and the least faithful to reality.

These two complementary rounds of evaluation revealed two key insights.

    • A perceived faithful reproduction of reality is not necessarily the most preferred rendering. Subtle image processing choices, such as contrast or color enhancement, may influence preference even if they deviate from real-life perception.
    • There is a clear correlation between the least faithful and the least preferred rendering, as seen with Honor Magic 7 Pro.

Taken together, these insights illustrate that user experience in smartphone photography sits at the intersection of accuracy and appeal, underlining the importance of balancing fidelity and enhancement in an imaging pipeline design comprising both camera and display. Further studies are currently being conducted to better understand these preferences.

What does it take to get a good Glass-to-Glass Experience?

Delivering a unique camera-to-display experience on a smartphone requires the interplay of several critical components working in harmony.

At the foundation is the ambient light sensor (ALS), a small but essential element that constantly measures the surrounding lighting conditions. Manufacturers such as ams OSRAM, a leader in optical sensing technologies, produce advanced ALS solutions that enable this precise environmental awareness. Spectral ALS variants, in particular, enable precise measurements of chromaticity and illuminance (lux), allowing devices to adapt image capture and display playback to the ambient environment. These sensors are typically integrated on both the camera and display sides of the device.
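
As an illustration of what such a spectral ALS enables, the sketch below converts a CIE 1931 xy chromaticity reading into an approximate correlated color temperature using McCamy's well-known approximation. In practice the xy values would come from the sensor's calibrated channel outputs; the sample inputs here are generic reference white points, not readings from any particular device.

```python
# McCamy's approximation: estimate CCT (kelvin) from CIE 1931 xy chromaticity.
# Reasonably accurate between roughly 2,000 K and 12,500 K.
def cct_mccamy(x: float, y: float) -> float:
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(cct_mccamy(0.3127, 0.3290)))  # D65 daylight white point -> ~6,500 K
print(round(cct_mccamy(0.4476, 0.4074)))  # CIE Illuminant A (incandescent) -> ~2,850 K
```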

The camera then takes on the task of capturing photos and videos, but the raw image is only the starting point. Through sophisticated image processing, each manufacturer applies its own stylistic choices, enhancing details, balancing exposure, and adjusting color reproduction to create a signature visual identity.

Once the content is captured, the display becomes the final stage of the pipeline, responsible for rendering the image or video back to the user. Here again, adaptation plays a crucial role, as the display fine-tunes brightness, contrast, and color balance to match the ambient light, striving to maintain both clarity and comfort across diverse conditions, from dim interiors to bright outdoor sunlight.

What ultimately defines the user experience, however, is not just the individual performance of these components but the way they are tuned to work together. The subtle choices made in calibration and tuning can elevate the experience, making visuals feel vivid and lifelike, or conversely undermine it, leaving them flat or unrealistic. This fine balance makes tuning a decisive factor in shaping user perception.

Conclusion

Ultimately, our findings highlight the importance of evaluating the entire imaging pipeline as a whole, from ambient light sensing to camera capture and display rendering, because this is how users actually experience their smartphones. Testing components in isolation cannot fully explain how fidelity and preference interact once everything comes together in the user’s hands.

Beyond the technical aspects, many studies also underline that preferences are not universal: they are deeply rooted in culture, habits, and regional aesthetics. A rendering style that appeals to European users may not resonate the same way in Asia or North America. This is why local evaluations are critical, ensuring that tuning strategies are adapted to specific user groups.

By embracing both technical accuracy and cultural sensitivity, manufacturers can better align their devices with real-world user perception and deliver experiences that feel both authentic and engaging.

In addition, as much as smartphones today are recognized for their distinctive “camera signatures”, reflecting each manufacturer’s unique approach to exposure, color balance, and tone, we may now be entering an era of the “display signature.” Beyond capture, the way a device presents content on screen can be just as defining. Some smartphones favor darker brightness adaptation in low light, giving a more cinematic, subdued experience, while others embrace highly vivid color rendering, creating instantly recognizable visuals across their product lines. Ultimately, crafting a meaningful display signature requires a deep understanding of the visual scene and context, enabling the device to apply the most relevant settings and parameters to deliver a consistent, intentional viewing experience, one that becomes as characteristic as its camera output.

Security cameras: 2025 benchmark

In 2022, DXOMARK conducted a benchmark study of doorbell security cameras. Today, we return with a broader and more comprehensive evaluation, extending the scope to include a wider range of products, covering not only doorbells, but also indoor and outdoor security cameras.

While all security cameras serve the same fundamental purpose, allowing users to see what’s happening in and around their homes, the requirements and expectations vary depending on the product category. In this benchmark we assess each type of camera separately, highlighting their specific use cases, strengths and limitations.

As with all DXOMARK protocols, our evaluation combines rigorous laboratory testing with perceptual analysis in real-world scenarios. The scores rely on objective tests generated by measurement software under controlled lab setups and on perceptual tests in which a sophisticated set of metrics allows a panel of image experts to compare aspects of image quality that require human judgment. Please note that our tests covered only image quality, not the overall performance of the cameras (such as the quality of the Wi-Fi connection, latency, etc.).

In this article, we present a benchmark comparing the performance of devices across different categories.

Doorbell Cameras

From left: Google Nest Doorbell (wired, 3rd gen), Kasa Smart Doorbell (KD110), Wyze Video Doorbell V2, Arlo Video Doorbell 2K, Tapo smart Video Doorbell, Ring Doorbell Pro 2

For doorbell cameras, the primary user expectation is clear: the ability to reliably identify who is at the door, whether it’s broad daylight, dusk, or nighttime. This requires accurate exposure in a variety of conditions: bright sunlight, backlit scenes, and low light. Effective recognition also depends on capturing fine details, from facial features to textures, such as brand logos, badges, or even text on packages.

As noted, doorbell cameras were the focus of a dedicated benchmark published in 2022. In this updated study, we revisit the category with six of the latest models. Each was tested using a dedicated version of our camera protocol, tailored to doorbell use cases. We evaluated their performance across a range of lighting conditions, from day to night, while accounting for dynamic scenes and varying subject distances.

Here are the respective specifications of the models tested:

To provide accuracy and perspective, we’ve added actual images of the field of view, illustrating what you will genuinely see through the camera:

Wyze Video Doorbell V2
Tapo Smart Video Doorbell Wired (Tapo D130)
Ring Doorbell Pro 2
Kasa Smart Doorbell (KD110)
Google Nest Doorbell (Wired, 3rd Gen)
Arlo Video Doorbell 2K

Taking into account all measured metrics across both daylight and nighttime scenarios, the benchmark results below present the performance ranking of all evaluated models.

Daylight use case results

Let’s dive into the results of the doorbells under the daylight use case. Daylight use cases range from well-lit conditions on a sunny day, through cloudy weather, to strong backlit situations. Doorbells must adapt to all of these to be usable anytime and in any situation. The chart below reflects the Daylight use case scores for all the tested models.

Looking in more detail at a specific scene, such as bright sunlight with no porch or cover, representing typical SDR conditions, we observe that most tested devices performed well in this relatively simple case, delivering properly exposed faces.

The main challenge lies in simultaneously managing background exposure to ensure surroundings remain visible. Here, the Google Nest Doorbell distinguished itself thanks to its HDR implementation (the only device in our benchmark with HDR capabilities), avoiding clipping in both highlights and shadow areas. Competing models often clipped bright areas, most often the brightest parts of the sky.

Google Nest Doorbell – Good brightness on the face, no clipping in bright or dark areas
Tapo Smart Video Doorbell Wired – Acceptable face brightness, clipping in the sky
Kasa Smart Doorbell (KD110) – Acceptable face brightness, clipping in both bright and dark areas

More challenging situations arise when the camera is placed under a porch (looking toward the street or backyard); this scenario proved more difficult for the devices in the benchmark. Faces often fall into shadow, making recognition harder, and cameras needed to brighten subjects sufficiently while maintaining usable background exposure.

In these scenarios, the Google Nest Doorbell again performed strongly, delivering the highest target exposure and balancing face readability with HDR-preserved background detail. Tapo offered the brightest face exposure but clipped highlights, while other models struggled to maintain a usable balance.

Google Nest Doorbell
Acceptable brightness on face and no clipping in bright or dark areas
Tapo Smart Video Doorbell
Good brightness on face and clipping in the sky
Wyze Video Doorbell V2
Low brightness on face and clipping in both bright and dark areas

In both daylight setups, detail rendering and low artifact levels were also crucial for identification. Ring and Tapo offered the sharpest details, while Wyze and the Google Nest Doorbell produced softer textures, often losing small details. Arlo was hampered by compression artifacts. The Google Nest Doorbell occasionally showed some HDR-related artifacts, but these did not hinder recognition.

Night use case results

At night, most devices switch to infrared (IR) mode, often with a built-in IR illuminator. To simulate realistic conditions, we tested cameras under porch-light scenarios, where bright subjects contrasted sharply with dark backgrounds. This often caused clipped facial details, making recognition difficult.

Here again, the Google Nest Doorbell’s HDR strategy provided a clear advantage, avoiding or minimizing clipping while maintaining background visibility. The Google Nest Doorbell also preserved a higher level of detail at night compared to its daytime performance. However, its HDR algorithm introduced noticeable fusion artifacts in IR mode.

Google Nest Doorbell – face is partly clipped but recognizable, background is clearly visible
Ring Wired Doorbell Pro 2 – face fully clipped, background is visible
Kasa Smart Doorbell (KD110) – face partly clipped but recognizable, and some background clipping in dark areas

Overall pros & cons of each model

Google Nest Doorbell (Wired, 3rd gen)
  Pros:
  • Wide dynamic range, and generally accurate face exposure for face identification
  • Acceptable level of details for face identification when the subject is at close distances
  • Few visible artifacts
  • Low noise in the central areas of the frame
  Cons:
  • Face exposure sometimes slightly too low
  • Noise visible in corners

Ring Wired Doorbell Pro 2
  Pros:
  • Good detail preservation
  • Generally accurate face exposure
  • Pleasant and natural color rendering
  Cons:
  • Exposure instabilities, especially in night conditions
  • Noticeable noise in some conditions

Kasa Smart Doorbell (KD110)
  Pros:
  • Acceptable level of details
  • Well controlled noise, especially in daylight
  Cons:
  • Frequently low target exposure
  • Color rendering often inaccurate or unpleasant

Arlo Video Doorbell 2K
  Pros:
  • Generally accurate face exposure
  • Mostly reliable white balance
  Cons:
  • Low detail on both faces and backgrounds
  • Strong artifacts often visible: frame freezing, compression issues

Tapo Smart Video Doorbell Wired (Tapo D130)
  Pros:
  • Accurate face exposure in daylight, acceptable at night
  • Good detail rendering in daylight
  Cons:
  • Pink hue often visible
  • Low contrast in many conditions

Wyze Video Doorbell V2
  Pros:
  • Generally accurate white balance
  Cons:
  • Poor night performance: faces lack contrast, often unrecognizable
  • Often low level of details

Conclusion

Thanks to its HDR strategy and consistent detail preservation across conditions, Google Nest Doorbell offers the most reliable user experience in our benchmark. Combined with vivid and natural color rendering, it stands out from the competition and earns our DXOMARK Gold Label as the best doorbell camera of 2025.

Outdoor Cameras

From left: Arlo Pro 5 2K, Eufy Security Outdoor Cam E220, Google Nest Cam Outdoor (Wired, 2nd Gen), Ring Outdoor Cam, Wyze Battery Cam Pro

As this category enters our benchmark for the first time, it’s important to note that while the core requirement remains the same, allowing users to see what’s happening around their property, the perspective and use cases differ. Unlike doorbell cameras, which capture visitors directly at the doorstep, outdoor cameras are typically installed higher up or to the side, offering a broader field of view. Their role is less about face-to-face interaction and more about general surveillance, such as monitoring cars entering the driveway or children playing in the yard.

When looking at outdoor security cameras, there are several aspects to consider. Field of view (FOV) is crucial: from a usability perspective, the wider the coverage, the fewer cameras you will need to monitor your property effectively. Exposure handling is equally important, as the camera must reveal details in both bright and dark areas of the scene, ensuring that nothing is missed. Finally, texture rendering plays a critical role in identification tasks, such as recognizing unusual activity or reading license plates, which often require maintaining clarity even at long distances.
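
For reference, the horizontal field of view follows directly from the sensor width and lens focal length via the standard pinhole-camera relation, as the short sketch below shows. The example numbers are generic values chosen for illustration, not the specifications of any camera in this benchmark.

```python
# Horizontal field of view from the pinhole-camera relation:
#   FOV = 2 * atan(sensor_width / (2 * focal_length))
# Example values are generic, not the specs of the tested cameras.
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

print(round(horizontal_fov_deg(5.6, 2.8)))  # short focal length -> 90 degrees
print(round(horizontal_fov_deg(5.6, 4.0)))  # longer lens, same sensor -> 70 degrees
```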

For the evaluation of outdoor cameras, we considered five different models:

To provide accuracy and perspective, we’ve added actual images of the field of view, illustrating what you will genuinely see through the camera:

Ring Outdoor Cam
Google Nest Cam Outdoor (Wired, 2nd Gen)
Eufy Security Outdoor Cam E220
Arlo Pro 5 2K
Wyze Battery Cam Pro

Taking into account all measured metrics across both daylight and nighttime scenarios, the benchmark results below present the performance ranking of all evaluated models.

Looking specifically at the field-of-view specifications, the Google Nest Cam Outdoor (wired, 2nd gen) delivers a wide lateral field of view, comparable to most models currently available on the market. Among the other models in our benchmark, Ring stands out with the broadest lateral coverage, while Wyze offers the narrowest.

Daylight use case results

Let’s dive into the results of the outdoor cameras under the daylight use case. As for doorbells, daylight use cases range from well-lit conditions on a sunny day, through cloudy weather, to strong backlit situations. In addition, specific use cases such as license plate readability and ball play in the background were also studied.

Our results indicate that in daylight conditions, all tested devices provide accurate face exposure under standard SDR scenarios. The Google Nest Cam Outdoor stands out in high-dynamic-range situations, delivering the widest dynamic range and avoiding clipping in both bright areas (such as the sky) and shadows, a challenge for most competing models. At closer distances, Google Nest Cam Outdoor consistently offers the highest level of detail, only occasionally surpassed by Eufy in specific scenes. As the subject moves farther from the camera, identifying a person becomes more challenging for all devices; however, Google Nest Outdoor Cam and Eufy maintain the clearest recognition at longer distances.

Google Nest Cam Outdoor
Accurate exposure, dynamic range and acceptable level of details for face recognition
Ring Outdoor Cam Pro
Low level of details on the face
Arlo Pro 5 2K
Low level of details on the face

Night use case results

Similarly to doorbells, and to simulate realistic conditions, we tested the cameras under porch-light scenarios, using an external light source rather than the cameras’ built-in lights to ensure a fair evaluation.

Looking at night use cases, the Google Nest Cam Outdoor provides the best overall balance, delivering accurate facial brightness while preserving visibility in the background. Other cameras often struggle, either overexposing faces or losing detail in dark areas, which can limit effective surveillance.

Achieving accurate target exposure in low-light conditions can be challenging, particularly when it comes to controlling noise. Google Nest manages this well, showing only slightly more visible noise than its competitors while still delivering a solid balance for surveillance use cases.

Regarding detail retention, all devices face challenges in low-light conditions; however, the Google Nest Cam Outdoor performs better than most, maintaining enough clarity for faces to remain identifiable at close range.

Google Nest Cam Outdoor
Face partly clipped but still recognizable, background remains visible.
Eufy Security Outdoor Cam E220
Face fully clipped though the background is visible
Ring Outdoor Cam
Face fully clipped with additional clipping in dark areas of the background

Overall pros & cons of each model

Google Nest Cam Outdoor
  Pros:
  • Wide field of view
  • Very extended dynamic range
  • Sufficient detail for face identification at close distances
  • Noise on faces generally well controlled
  Cons:
  • Fine details may be lost in daylight conditions
  • Visible artifacts such as blocking and ghosting/fusion artifacts

Ring Outdoor Cam
  Pros:
  • Wide field of view
  • Noise generally well controlled
  Cons:
  • Generally low level of detail
  • Limited dynamic range with visible clipping in bright areas

Arlo Pro 5 2K
  Pros:
  • Generally accurate target exposure
  • Pleasant color rendering
  Cons:
  • Generally low level of detail
  • Noise visible in dark areas

eufy Security Outdoor Cam E220
  Pros:
  • High level of details
  • Noise generally well controlled
  • Accurate target exposure on faces allowing recognition
  Cons:
  • Limited dynamic range with bright clipping
  • Slightly narrower field of view

Wyze Battery Cam Pro
  Pros:
  • Accurate target exposure on faces allowing recognition
  • Generally well controlled noise
  Cons:
  • Strong visible blocking and ringing artifacts
  • Limited dynamic range with bright clipping

Conclusion 

Google Nest Cam Outdoor – 1st Place
Ring Outdoor Cam

Thanks to its HDR processing and consistently strong level of detail across all conditions, the Google Nest Outdoor Cam delivers optimized surveillance performance. Combined with its natural and pleasant color rendering, it stands out as the best outdoor camera in our benchmark.

Following the Google Nest Outdoor Cam, the Ring Outdoor Cam also delivers strong overall performance, offering a wide field of view and well-controlled noise. Both devices qualify for the DXOMARK Gold Label for outdoor cameras.

Indoor Cameras

From left: Arlo Essential Indoor Camera (2nd Gen), Eufy Indoor Camera S350, Google Nest Cam Indoor (Wired, 3rd Gen), Ring Indoor Cam (2nd Gen), Wyze Cam V4

While doorbell and outdoor cameras are primarily focused on monitoring specific events such as intrusions or a vehicle entering the driveway, indoor cameras serve a broader purpose: capturing as much of the room as possible, including windows and general activity within the scene.

When evaluating indoor cameras, several key factors come into play. Field of view (FOV) is critical from a practical standpoint: the wider the coverage, the fewer cameras are needed to monitor the space effectively. Exposure must be balanced, ensuring visibility in both bright and dark areas of the room. Finally, texture rendering is important for identification purposes, allowing intruders or other subjects to be recognized clearly at short to medium distances.

For the benchmark of indoor cameras, we considered five different models, presented below with their respective specifications:

To provide accuracy and perspective, we’ve added actual images of the field of view, illustrating what you will genuinely see through the camera:

Arlo Essential Indoor Camera (2nd Gen)
Eufy Indoor Cam S350
Google Nest Cam Indoor (Wired, 3rd Gen)
Ring Indoor Cam (2nd Gen)
Wyze Cam V4

Taking into account all measured metrics across both daylight and nighttime scenarios, the benchmark results below present the performance ranking of all evaluated models.

Within our benchmark, the Google Nest Cam Indoor offers the widest lateral field of view, capturing the most area and providing a comprehensive view of the room. In contrast, the Wyze Cam V4 has the narrowest coverage, leaving some corners of the room outside its view.

Color mode / Daylight test results 

In this section, we evaluate the performance of indoor cameras when light levels are sufficient to use the visible-light sensor.

In daylight conditions, using natural light from windows, overall face brightness is generally accurate. While the Google Nest Cam Indoor has a slightly lower target exposure on faces compared to Eufy and Wyze, faces remain easily recognizable. Arlo, by contrast, has the lowest face exposure, making recognition more challenging. Google Nest Cam Indoor’s key advantage over competitors is a wide dynamic range, minimizing clipping in bright areas such as windows and allowing effective monitoring of both subjects and background.

Under “evening” conditions with only dim artificial lighting, Google Nest Cam Indoor maintains the lowest face brightness, yet faces are still fully identifiable.

In terms of detail, Google Nest Cam Indoor provides an acceptable level for recognition, though fine textures appear softer and less sharp compared to Eufy or Wyze. Color rendering and white balance are generally pleasant across most cameras; however, Wyze can appear slightly desaturated, while Arlo exhibits a pronounced orange cast under warm artificial lighting in evening conditions.

Google Nest Indoor
Accurate face exposure, wide dynamic range
Eufy Indoor Cam 350
Accurate face exposure, clipping in bright parts
Ring Indoor Cam 2nd gen
Accurate face exposure, clipping in bright parts

Infrared mode test results 

When light levels are too low for the standard color mode to deliver acceptable quality, surveillance devices can switch to infrared (IR) mode. All cameras in our benchmark offer a black-and-white IR mode.

Most deliver accurate overall brightness, though some lose detail in shadowed areas. The Google Nest Cam Indoor stands out with the widest dynamic range, but its tone compression can sometimes render faces unnaturally, making identification slightly more difficult. It is the only camera that avoids complete clipping of objects near the IR source, allowing at least partial recognition. In terms of detail, Google Nest Cam Indoor provides an acceptable level for face identification, although fine textures are softer and less sharp compared to Eufy or Wyze.

Google Nest Cam Indoor
Accurate face exposure, details appear soft
Arlo Essential Indoor Cam 2nd generation
Slightly low face exposure at longer distances, with noticeable loss of detail
Wyze Cam v4
Accurate face exposure with sharp, well-defined details

Overall pros & cons of each model

Google Nest Cam Indoor (wired, 3rd gen)
  Pros:
  • Wide field of view
  • Very extended dynamic range
  • Sufficient level of details for face identification at close distance
  • Well controlled noise on faces
  Cons:
  • Fine details are lost
  • Visible artifacts such as blocking/compression, ringing, and ghosting/fusion

Ring Indoor Cam (2nd gen)
  Pros:
  • Target exposure is accurate on face for recognition
  • Noise is generally well controlled
  Cons:
  • Dynamic range is limited with bright clipping visible
  • Chromatic noise slightly visible in low light

Arlo Essential Indoor Cam (2nd generation)
  Pros:
  • Artifacts relatively well controlled
  • Color rendering is rather pleasant
  Cons:
  • Target exposure often too low, especially in daylight
  • Limited dynamic range
  • Level of details often very low
  • Chromatic noise can be visible

eufy Security Indoor Cam S350
  Pros:
  • High level of details
  • Noise generally well controlled
  • Target exposure is accurate on face for recognition
  Cons:
  • Limited dynamic range with bright clipping visible
  • Slightly limited field of view

Wyze Cam V4
  Pros:
  • Target exposure is accurate on face for recognition
  • Details well preserved
  • Noise generally well controlled
  Cons:
  • Limited field of view
  • Blocking and ringing strongly visible
  • Limited dynamic range with bright clipping visible

Conclusion

Google Nest Cam Indoor (wired, 3rd gen) – 1st Place
eufy Security Indoor Cam S350

Thanks to its HDR strategy and wide field of view, the Google Nest Cam Indoor provides optimized indoor surveillance in all lighting conditions, along with pleasant color rendering and a sufficient level of detail. The Eufy Indoor Cam also offers a good experience, delivering high levels of detail and accurate target exposure. Both devices qualify for the Gold Label for indoor cameras.

What’s New in DXOMARK’s Camera Protocol?

At DXOMARK, the evolution of our protocols is a continuous process, aimed at keeping pace with the accelerating innovation in smartphone imaging. With each generation of devices introducing new technologies and user-centric features, our testing methodologies adapt accordingly, not only to stay up to date but also to ensure our scores remain relevant and meaningful for real-world users.

Today, we officially unveiled the 6th version of our Smartphone Camera Protocol, the most advanced and user-aligned protocol to date. This release is the product of our traditional multi-phase development strategy that reflects both technical rigor and human-centric evaluation.

A methodological framework grounded in real-world use

Each update to our protocol follows a structured methodology built on key pillars, starting with identifying user needs and preferences, continuing with the development of representative and repeatable test scenarios, and culminating in the thorough evaluation and scoring of products.

Understanding user needs and preferences

Our foundation lies in in-depth research, including multi-year investigations into user preferences through DXOMARK Insights. These studies, conducted with large panel groups, explore key user pain points, particularly in challenging domains such as HDR portrait photography. Since 2023, we have carried out extensive studies across China, India, and Europe, uncovering detailed insights about user expectations in portrait photography.

To enrich this broad understanding, DXOMARK regularly collaborates with independent experts in their fields (photographers, video makers…). Ahead of the launch of our Camera v6 protocol, we are deepening our engagement through the creation of the DXOMARK Expert Committee, a body of professionals and academic experts who provide valuable perspectives on emerging trends and real-world usage scenarios.

Designing representative and repeatable test scenarios

At our state-of-the-art labs in Boulogne-Billancourt, we design tests that mirror real-world usage with scientific precision. Our approach combines objective measurements and perceptual testing, covering a wide range of lighting conditions, motion scenarios, and diverse skin tones. Purpose-built setups and proprietary tools enable us to achieve a new level of granularity in testing. Beyond the lab, we also conduct tests and analyses in varied natural environments. Each device is tested extensively with over 4000 photos captured and 200 minutes of video recorded under a wide range of conditions.

Scoring: The tip of the iceberg

While our scores are publicly visible, they represent just the surface of a comprehensive evaluation process. These scores distill the end-user experience into a clear, comparable format. With the launch of our sixth-generation protocol, we introduce a revamped scoring architecture and weighting system, aligned with an updated testing matrix and newly refined quality metrics.

What’s new in this version of our protocol?

The sixth version of our protocol introduces updates across three key areas:

    • Enhanced HDR evaluation, featuring a new testing process and refined scoring methodology.
    • Updated portrait testing, informed by recent global studies to better reflect real-world user expectations.
    • Expanded focus on zoom performance, with particular attention to video zoom capabilities.

Now, let’s dive into the details of our protocol updates.

Portrait evaluation: adapting to user trends and expectations

Taking portrait photos is one of the most common and emotionally resonant use cases in smartphone photography. Guided by the extensive Insights studies we ran between 2023 and 2025 in different parts of the world, we identified three key elements users look for when defining a good portrait picture: a consistently well-exposed face and overall picture, natural skin tones, and an accurate, neutral white balance. We also observed that low-light and night photography still represent major challenges, with users consistently dissatisfied with the results in these conditions.

These insights gave us clear guidelines on users’ expectations, as well as general trends in preferences, which directly drive our methodology and our tests.

What have we changed in our evaluations?

To better reflect real-world usage, including both everyday and challenging situations, we’ve significantly upgraded our portrait testing protocols:

    • 50 new portrait scenes, covering a full range of lighting conditions, from moonlight to sunlight, captured both in natural environments and simulated lab settings. In total, we now evaluate 9 lighting conditions in photo and 13 in video.
    • A broader spectrum of skin tones, ensuring inclusive and comprehensive evaluation across diverse subjects.
    • Three motion profiles, simulating real user behavior, from static handheld shooting (two-handed grip) to walking scenarios (for video), to test performance in typical portrait capture situations.

 

Our tools and test methods have evolved to deliver deeper, more meaningful insights into image quality. At the core of this is our newly developed All-in-One Portrait Lab setup, powered by Analyzer, designed to simulate real-world challenges in a controlled environment. It includes:

    • Two high-fidelity mannequins representing deep and fair skin tones, used to assess facial detail preservation.
    • Dynamic lighting simulation, covering a wide range from 0.1 lux to 10,000 lux, to evaluate performance under various illumination levels.
    • Motion simulation tools, including moving objects, a hexapod (six-axis motion platform), and a time box to rigorously test autofocus accuracy and motion blur.
    • Reflective and transmissive gray scales, supporting in-depth analysis of noise and contrast behavior.

To complete the evaluation of portraits, we now run a systematic perceptual evaluation of flare, which can affect facial clarity and background rendition.

HDR: A consistent evaluation of the HDR formats

As outlined in our recent publications, HDR is reshaping the landscape of smartphone photography. As brands explore various approaches to HDR integration (see our China Insights), new creative opportunities are emerging, alongside fresh technical challenges. The evolution of HDR formats now includes standardized versions that are compatible across a wide range of smartphones, ensuring more consistent user experiences.

To reflect the growing importance of HDR, we have now integrated a dedicated and systematic evaluation of HDR performance into our testing protocol, applicable when the tested device supports a publicly documented format that is compatible with common HDR viewing tools. This enhancement ensures a more accurate and comprehensive understanding of how HDR impacts image quality across devices.

It includes:

    • Expanded HDR scene coverage: our testing now includes a broader range of natural scenes (across all lighting conditions, including night scenes) as well as scenes captured in controlled lab environments using our AF-HDR setup.
    • New lab-based metrics: Additional objective measurements offer finer granularity in assessing HDR performance under reproducible conditions.
    • Perceptual analysis with a professional reference HDR display: Evaluations are conducted using our AQuA tool (which brings an objective perspective to perceptual analysis) on an ISO 22028-5 reference HDR monitor.

When a supported HDR format is detected, images are processed with the appropriate gain maps and evaluated through our HDR visualization pipeline using dedicated scoring criteria. If the image is in a non-HDR format or an unsupported HDR format, it is analyzed as an SDR image using the same tools, ensuring consistency and fairness across all devices.
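For readers who want a concrete sense of what gain-map processing involves, here is a minimal, illustrative Python sketch of the widely used gain-map model (a linear SDR image boosted pixel by pixel by 2^(gain × weight), with the weight driven by the target display’s HDR headroom). The function and its parameters are simplifications for illustration only, not DXOMARK’s pipeline or any vendor’s actual API.

```python
import numpy as np

def apply_gain_map(sdr_linear, gain_map, gain_min_log2, gain_max_log2,
                   display_headroom, capacity_min_log2=0.0, capacity_max_log2=None):
    """Boost a linear SDR image toward its HDR rendition using a gain map.

    Simplified gain-map math: each pixel is multiplied by 2**(g * w), where g is
    the per-pixel log2 gain decoded from the normalized map, and w scales with
    the log2 headroom of the target display relative to the map's capacity."""
    if capacity_max_log2 is None:
        capacity_max_log2 = gain_max_log2
    # Decode the normalized (0..1) gain map into per-pixel log2 gains.
    g = gain_min_log2 + gain_map * (gain_max_log2 - gain_min_log2)
    # Weight the gain by how much HDR headroom the target display offers.
    h = np.log2(display_headroom)
    w = np.clip((h - capacity_min_log2) / (capacity_max_log2 - capacity_min_log2), 0.0, 1.0)
    return sdr_linear * np.exp2(g * w)

# Example: a display with 4x (2 EV) headroom applied to a toy 2x2 image.
sdr = np.full((2, 2, 3), 0.25)                           # linear SDR values
gmap = np.array([[0.0, 0.5], [1.0, 0.25]])[..., None]    # normalized gain map
hdr = apply_gain_map(sdr, gmap, gain_min_log2=0.0, gain_max_log2=3.0, display_headroom=4.0)
print(hdr[..., 0])
```

When no supported gain map is present, the weight is effectively zero and the image stays at its SDR rendition, which mirrors the fallback behavior described above.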

Zoom: An increased focus on a fast-growing use case

Zoom capabilities have emerged as a key differentiator among flagship smartphones. Increasingly valued by users, zoom is now widely used across a variety of scenarios, from close and mid-range portraits to long-range landscape and wildlife photography. In recent years, we’ve observed significant advancements across devices, enabling users to capture high-quality images even in the most demanding conditions.

In response to evolving user behavior, we have redefined our zoom testing protocol with a stronger emphasis on emerging use cases, such as video zoom, which is increasingly used during live events and concerts to capture subjects from a distance.

Key evolutions in our testing include:

    • A focus on the 85–300mm zoom range, which is especially relevant for medium to long-range portrait photography.
    • Simulation of user motion
    • Evaluation criteria covering a broad set of attributes: from static elements like face exposure, contrast, dynamic range, and texture, to temporal aspects such as stabilization and autofocus consistency, as well as usability metrics like zoom smoothness.

While we’ve refined our protocol for close- to medium-range zoom, representing most of everyday use cases, ultra zoom (200 mm and beyond) continues to be evaluated through a dedicated protocol. We’ll soon publish updated results from this specialized testing.
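To relate these focal lengths to the zoom factors users see in their camera app, the conversion is a simple ratio against the main camera’s equivalent focal length. The sketch below assumes a 24 mm-equivalent main camera, which is typical for recent flagships but varies by device.

```python
MAIN_CAMERA_EQ_MM = 24.0   # assumed 1x reference focal length (device-dependent)

def zoom_factor(focal_eq_mm, main_eq_mm=MAIN_CAMERA_EQ_MM):
    """Convert a 35mm-equivalent focal length to an approximate in-app zoom factor."""
    return focal_eq_mm / main_eq_mm

for f in (85, 135, 200, 300):
    print(f"{f} mm eq  ~  {zoom_factor(f):.1f}x")
```

Under that assumption, the 85–300 mm range corresponds to roughly 3.5x–12.5x zoom, while the 200 mm ultra-zoom threshold sits at around 8x.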

Video: Simulating Real-Life Movement and Light

Smartphone video performance has advanced significantly in the past few years, with results now approaching professional standards. Videos now feature significantly richer color, enhanced contrast, and greater detail. For the past eight years, devices like the Apple iPhone have consistently set the benchmark for mobile video quality, delivering reliable performance and excellent detail retention across a wide range of lighting conditions.

To stay aligned with the rapid advancements in smartphone videography, we have significantly updated our evaluation protocols. In the sixth version of our video testing protocol, we’ve introduced several key enhancements:

    • Simulated user motion in the lab: We are the first to incorporate a protocol that evaluates video quality using captures recorded under controlled, simulated user movement, bringing greater realism and reproducibility to our tests.
    • Broader range of use cases: We’ve expanded scene diversity to include a wider variety of skin tones, better reflecting real-world usage.
    • Extended lighting scenarios: Our automated lab setup now covers four distinct lighting levels (from 5 to 1000 lux), each paired with systematic HDR scene simulations. Additionally, we’ve implemented a dedicated night-shooting plan, designed to evaluate performance across a variety of low-light situations and user scenarios.
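For intuition, the lab illumination levels mentioned above can be mapped approximately onto exposure values using the common incident-meter relation E ≈ 2.5 × 2^EV at ISO 100. The calibration constant varies slightly between meters, so the sketch below is an approximation, not a DXOMARK measurement definition.

```python
import math

def lux_to_ev(lux, iso=100, c=250.0):
    """Approximate EV at the given ISO from scene illuminance, via EV = log2(E * S / C)."""
    return math.log2(lux * iso / c)

for lux in (0.1, 5, 100, 1000, 10000):
    print(f"{lux:>8} lux  ~  EV {lux_to_ev(lux):5.1f}")
```

This places the 5–1000 lux video lab range at roughly EV 1 to EV 8.6, and the portrait lab’s 0.1–10,000 lux span at roughly EV −4.6 to EV 12.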

Revised Architecture and Scoring System

In the latest version of our protocol, we have revised the scoring methodology to provide a more detailed and user-relevant evaluation of device performance. The updated framework now includes two main sub-scores: Photo and Video, each assessing the performance of the device’s primary focal lengths: main, tele, and ultra-wide.

This structure offers a clearer view of how each focal length performs in both still and motion capture. Additionally, we’ve introduced use-case scores to reflect real-world scenarios, providing insights into the device’s capabilities in specific contexts such as portrait photography, zoom performance (across both photo and video), and low-light shooting—a persistently challenging condition identified in our previous research.
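DXOMARK does not publish the exact weights behind these sub-scores, but the shape of the aggregation can be illustrated with a small sketch. All weights and scores below are hypothetical, chosen only to show how per-focal-length results roll up into Photo and Video sub-scores and then into an overall score.

```python
# Hypothetical weights for illustration only; the real protocol's weights are not public.
FOCAL_WEIGHTS = {"main": 0.6, "tele": 0.25, "ultra_wide": 0.15}
SUB_WEIGHTS = {"photo": 0.55, "video": 0.45}

def sub_score(per_focal):
    """Aggregate per-focal-length results into a single Photo or Video sub-score."""
    return sum(FOCAL_WEIGHTS[f] * s for f, s in per_focal.items())

def overall_score(photo, video):
    return SUB_WEIGHTS["photo"] * sub_score(photo) + SUB_WEIGHTS["video"] * sub_score(video)

photo = {"main": 145, "tele": 130, "ultra_wide": 120}   # example values, not real results
video = {"main": 150, "tele": 125, "ultra_wide": 118}
print(round(overall_score(photo, video)))
```

Use-case scores (portrait, zoom, low light) can be thought of the same way, reweighting the attribute results that matter most for each scenario.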

Initial Results from Camera v6 protocol

With the new protocol version comes an updated camera ranking, resulting in some shifts in smartphone positions compared to previous rankings.

To give you a clearer idea of what to expect, this section presents an overview of the evaluation of three popular devices.

Apple iPhone 16 Pro Max

In our Camera v6 protocol, the iPhone 16 Pro Max sees the biggest impact on its Photo score. While fine noise has a reduced impact compared to earlier versions, faces frequently appear underexposed. This lower brightness is generally less appreciated by users, resulting in a greater negative impact on perceived exposure quality.

Highlights on the performance of the product evaluated under the v6 Camera protocol:

    • Portrait: in our new protocol, the iPhone 16 Pro Max remains an excellent choice for portrait pictures, whether capturing a single person or a group on the same focal plane. Thanks to effective HDR management, portrait images appear immersive, vibrant and visually appealing.
    • Zoom Video: Video continues to be a strong area for the iPhone 16 Pro Max under our new testing protocol. When analyzing zoom performance during video recording, the device delivers smooth transitions and maintains high image quality throughout the zoom range.
    • Lowlight: With the inclusion of more low-light and challenging scenes in our protocol, the iPhone 16 Pro Max continues to perform strongly in photo mode while remaining the top performer in video. It produces bright images with a wide dynamic range, preserving both detail and contrast even in difficult lighting conditions.

Xiaomi 15 Ultra

Under our Camera v6 protocol, the Xiaomi 15 Ultra benefits from the increased emphasis on portrait color and telephoto zoom performance, resulting in a higher ranking.

Highlights on the performance of the product evaluated under the v6 Camera protocol:

    • Portrait: In our new protocol, the Xiaomi 15 Ultra was capable of capturing pleasing portraits with realistic skin tones and good exposure across all lighting conditions.
    • Zoom photo & video: The Xiaomi 15 Ultra delivers a strong performance in telephoto, keeping a high level of detail and sharpness across the entire zoom range. It also performs strongly in video zoom, offering stable and clear results.
    • Lowlight: The device provides a good low-light imaging experience, featuring a warm white balance that preserves the ambient atmosphere, along with impressive noise reduction.

Samsung Galaxy S25 Ultra

In our new protocol, the Samsung Galaxy S25 Ultra is mostly affected by the reduced weight given to noise and the growing weight of telephoto zoom.

Highlights of the performance of the product evaluated under the v6 Camera protocol:

    • Portrait: The Samsung Galaxy S25 Ultra delivers strong portrait photography performance across default, bokeh, and tele modes, with good subject detail, accurate edge detection, and versatile features like realistic blur effects and adjustable lighting.
    • Zoom: The Samsung Galaxy S25 Ultra offers impressive telephoto performance with sharp, detailed images across medium to long zoom ranges, supported by fast and reliable autofocus. While it delivers strong overall quality, some softness and noise appear at extreme zoom levels, placing it just behind top competitors like the Oppo Find X8 Ultra and Xiaomi 15 Ultra.
    • Lowlight: In low-light conditions, the camera generally delivered good exposure and accurate white balance, though occasional underexposure and unnatural tones were observed. Testers also noted inconsistencies in noise and detail between shots, highlighting a lack of consistency in performance across challenging lighting conditions.

Conclusion

With the launch of DXOMARK’s sixth-generation Smartphone Camera Evaluation Protocol, we reaffirm our commitment to providing the most accurate, relevant, and user-centric assessments in the mobile imaging space. By integrating cutting-edge testing tools, global user insights, and real-world use cases into our methodology, Camera v6 marks a significant step forward in how smartphone camera performance is measured. As innovation in mobile photography accelerates, this new protocol ensures our rankings remain not only scientifically robust but also truly reflective of the everyday experiences and expectations of users worldwide.

Portrait Photography Preferences: What European Users Expect from Their Smartphones   https://www.dxomark.com/portrait-photography-preferences/ https://www.dxomark.com/portrait-photography-preferences/#respond Thu, 22 May 2025 15:58:41 +0000 https://www.dxomark.com/?p=184569 At DXOMARK, our mission goes beyond technical analysis — we are deeply committed to capturing real user experiences. In 2023, we launched DXOMARK Insights, a global initiative aimed at understanding user preferences and pain points when it comes to smartphone imaging, particularly portrait photography. As we are preparing to release the next version of our [...]

At DXOMARK, our mission goes beyond technical analysis — we are deeply committed to capturing real user experiences. In 2023, we launched DXOMARK Insights, a global initiative aimed at understanding user preferences and pain points when it comes to smartphone imaging, particularly portrait photography.

As we are preparing to release the next version of our camera protocol, Camera v6, we conducted a blind survey in Paris to explore how flagship smartphones meet (or fall short of) user expectations. This study focused on evaluating performance in a variety of scenes—especially the most demanding ones, such as low-light environments, high dynamic range (HDR) settings, and backlit conditions. In addition to helping us understand user preferences for portrait photography in Europe, the scenes used in this Insights study are the same ones that will be applied in the implementation of our new camera protocol.

In this article, we present findings that highlight the image quality attributes shaping the preferences of European consumers. We’ll also look at how these compare with the preferences of Chinese users.

3 key takeaways:

  • Despite strong improvements from the latest flagships, there is still room for improvement across all lighting conditions
  • Portrait pictures taken by the Huawei Mate 70 Pro+ and Oppo Find X8 Pro were generally preferred across all lighting conditions
  • Three major factors stand out in both Europe and China when evaluating portraits: accurate exposure of the face and the scene, a neutral white balance, and natural skin tones

 

Our methodology


This DXOMARK Insights study on smartphone HDR portrait photography focuses on assessing the perceived image quality of HDR photos themselves, rather than how they appear on specific smartphone displays.

The study involved seven of the most popular flagship smartphones: Apple iPhone 16 Pro Max, Honor Magic7 Pro, Samsung Galaxy S25 Ultra, Oppo Find X8 Pro, Google Pixel 9 Pro XL, Huawei Mate 70 Pro+ and the Xiaomi 15 Ultra.

The shooting plan was designed to feature a variety of everyday scenes familiar to European consumers, with a focus on challenging ones, including indoor, low-light and night scenes, as well as some backlit and high-contrast situations, covering a total of 50 scenes.

As in previous Insights surveys, blind comparisons run by the user panel were performed with HDR visualization tools. For this study, we ensured HDR image files were viewed as they would be through third-party software, aligning with real-world usage. HDR formats that are either documented or follow ISO specifications were processed using gain map information, while non-standard formats were handled according to ITU guidelines. This approach emphasizes the importance of interoperability—highlighting which devices produce images that are easily shareable and viewable across various smartphones. Notably, all tested devices use open HDR formats, apart from the Honor Magic 7 Pro. Meanwhile, the Huawei Mate 70 Pro Plus supports ISO-compliant HDR output on the latest HarmonyOS version, ensuring broad compatibility.

In addition, due to the technical constraints in displaying HDR content on the web, please note that the photos used in this article are for illustration only. To view the HDR format rendering, these visuals need to be viewed with the proper HDR visualization tools.

To better understand user perception, the models featured in the photos were asked to provide feedback on their own portraits. The panel consisted of 39 demanding users, including flagship smartphone owners and photography enthusiasts. The survey was structured into two key phases:

  1. Blind Pairwise Comparison – Participants were shown two images of the same scene, each taken with a different smartphone. Through a series of side-by-side comparisons, they selected their preferred photo until a consistent JOD (Just Objectionable Difference) scale was established across all seven devices (a toy example of this scaling is sketched after this list).
  2. Photo Series Rejection – In this step, participants were presented with a full set of photos from a single scene and asked to identify any images they disliked or would avoid sharing on social media.
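DXOMARK does not detail its exact fitting procedure here, but the pairwise data from the first step is typically scaled with a Thurstone-style model, using the common convention that a 1 JOD gap corresponds to roughly 75% of observers preferring one image over the other. The toy Python sketch below illustrates the idea on made-up preference counts; real studies use maximum-likelihood fits and handle incomplete comparison matrices.

```python
from statistics import NormalDist
from itertools import combinations

def jod_scores(wins, devices):
    """Toy Thurstone-style scaling: wins[a][b] = times a was preferred over b."""
    probit_1jod = NormalDist().inv_cdf(0.75)   # ~0.674: probit distance taken as 1 JOD
    scores = {d: 0.0 for d in devices}
    for a, b in combinations(devices, 2):
        total = wins[a][b] + wins[b][a]
        p = min(max(wins[a][b] / total, 0.01), 0.99)   # clamp so the probit stays finite
        d = NormalDist().inv_cdf(p) / probit_1jod      # signed JOD distance between a and b
        scores[a] += d / (len(devices) - 1)
        scores[b] -= d / (len(devices) - 1)
    return scores

# Made-up counts for three hypothetical phones, 40 votes per pair.
devices = ["phone_A", "phone_B", "phone_C"]
wins = {"phone_A": {"phone_B": 30, "phone_C": 35},
        "phone_B": {"phone_A": 10, "phone_C": 25},
        "phone_C": {"phone_A": 5,  "phone_B": 15}}
print(jod_scores(wins, devices))
```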

This two-step approach allowed us to capture detailed data per scene, including:

  • The overall rejection rate across all respondents
  • The rejection rate within the specific user group being analyzed
  • The calculated JOD scale for each comparison

From these insights, we derived a Satisfaction Index for every image, a score from 0 to 100 that reflects how often users accept or reject an image, providing a clear view of user preferences. A score of 0 indicates that more than half of respondents rejected the image, while a score of 100 indicates strong acceptance by the user panel, with no rejections for the specific scene or group of scenes. To enrich our understanding, participants were also asked to explain the reasons behind their rejections, giving us valuable qualitative feedback on what makes or breaks a portrait photo.
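The exact formula behind the Satisfaction Index is not spelled out here, but one simple mapping consistent with the description above can be sketched as follows (illustrative only, not DXOMARK’s actual computation):

```python
def satisfaction_index(rejections, respondents):
    """0-100 index: 100 when nobody rejects the image, 0 when half or more of the panel does.
    Illustrative mapping only; the published description fixes the endpoints, not the curve."""
    rejection_rate = rejections / respondents
    index = 100.0 * (1.0 - rejection_rate / 0.5)
    return max(0.0, min(100.0, index))

print(satisfaction_index(0, 39))    # 100.0 -> unanimously accepted by the 39-user panel
print(satisfaction_index(8, 39))    # ~59   -> about 20% rejection
print(satisfaction_index(20, 39))   # 0.0   -> more than half rejected
```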

What are the pain points identified by European users when it comes to portrait pictures?


In blind tests conducted across Europe, users consistently identified three major pain points affecting image quality in portrait photography.

  • First and foremost, underexposure was frequently cited—faces and entire scenes often appeared too dark, diminishing the overall appeal of the image.
  • Second, white balance inconsistencies were a common complaint, with images sometimes looking unnaturally cold or overly warm.
  • Lastly, unnatural skin tones emerged as a critical factor; even subtle shifts in tone were enough to significantly lower user satisfaction, underscoring how sensitive viewers are to color accuracy in portraits.

Among these, exposure stood out as the most persistent issue, clearly noticeable not just in outdoor shots, but also in more complex lighting scenarios such as backlit or high-contrast scenes. While exposure is an important factor, this doesn’t imply that European users are simply seeking brighter images. Instead, they place significant emphasis on the relative brightness of the face in comparison to the overall image. We’ll explore this topic in more detail later in the article.

To expand our understanding, we compared these findings with a similar study conducted in China the previous year. Interestingly, users in both Europe and China exhibited remarkably similar preferences, highlighting a growing global convergence in what people expect from smartphone portrait photography.

In terms of lighting conditions, low-light scenes remain the most challenging, because users are more demanding in these settings, even more so than in night scenes.

Our satisfaction ranking in Europe


The results from Europe clearly highlighted two top performers: Huawei and OPPO. These brands led in nearly every lighting condition — from bright daylight to dim indoor scenes — and scored consistently high across several key criteria:

  • Top 3 in exposure and white balance across the board
  • Very low rejection rates, particularly in low-light scenarios
  • Steady performance across all portrait scenes

According to the JOD (Just Objectionable Difference) analysis, Huawei and OPPO stood out as the most preferred devices across the whole shooting plan.

When analyzing the overall shooting plan and aggregating user preferences across all scenes, Huawei and Oppo emerged as the most favored smartphones. The graph below presents the devices ranked by their average performance across all scenes and lighting conditions.

Similarly, the ranking obtained in Europe matched the results of the China Insights study (check out the last part of the article for more details). One important point to highlight is that, although these devices were not initially designed for the European market and are not commercially available in the region, their image quality tuning appears to align well with the preferences of European users.

Among all lighting conditions, smartphones struggle more in night & low-light photography


While most flagship devices perform well in good lighting, our study revealed that low-light and night scenarios remain major weak points. These settings continue to produce significantly lower satisfaction scores, even for premium models.

For easy and bright outdoor scenes, users are increasingly demanding

Even under well-lit conditions, users demonstrated high sensitivity to subtle differences. Minor variations in exposure or white balance could significantly sway the Satisfaction Index.

  • Devices with slightly underexposed faces were consistently rated lower (Apple & Honor in the following example)
  • Bright, high-contrast renderings performed better and were preferred among our panel of users (Oppo & Huawei in the following example).

This tells us that users are becoming more refined in their tastes and more critical, even of images that are objectively “good.” They expect not just technical accuracy, but visual appeal and emotional resonance in every shot.

Indoor Low-Light Scenes Fall Short

Indoor low-light conditions proved especially difficult for most smartphones. Satisfaction scores dropped sharply, often below 70, even though the photos were technically acceptable.

Why? Users simply do not perceive these scenes as difficult and therefore expect image quality to remain high. When devices underexpose faces or produce strange hues under artificial lighting, the mismatch between expectations and results creates frustration.

In these scenes, Huawei and Oppo stood out with more consistent exposure stability and natural-looking tones. Apple and Google devices also performed strongly in low light and at night, respectively. Across all devices, however, users pointed to white balance inconsistency and contrast instability as major drawbacks.

Night Photography Quality Is Still Lacking

In dark environments, the performance gap widened, and user satisfaction remained far from ideal. Overall, Huawei, OPPO, and Google delivered the most acceptable results, though many of their shots still struggled with low JOD scores. All devices were challenged by the night shots, and we still observed many pain points, with high rejection rates and low JOD scores across many use cases.

  • Issues like clipped highlights, underexposed faces, and unnatural skin tones were common.
  • Devices like the Samsung Galaxy S25 showed inconsistency in facial exposure even with advanced hardware.

Overall, facial exposure, white balance, and skin tone rendering remain the top drivers of user dissatisfaction in night photography.

Interestingly, in night photography, a brighter face isn’t always seen as better. What truly matters to users is the balance between the face and the background. A well-managed exposure ratio between these elements plays a crucial role in perceived image quality. Users consistently favored images where the face stands out naturally without overpowering or being lost in the background, highlighting the importance of cohesive, well-balanced lighting in low-light portrait scenarios.
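One simple way to quantify that face-to-background balance is to compare the mean luminance inside the face region with the mean luminance of the rest of the frame, expressed as an EV difference. The sketch below is a deliberately naive illustration: the face box is assumed to be known, the image is assumed to be linear, and the acceptable range would need to be calibrated against viewer preferences.

```python
import numpy as np

def face_background_ev_gap(image_linear, face_box):
    """image_linear: HxWx3 linear RGB array; face_box: (top, bottom, left, right)."""
    t, b, l, r = face_box
    # Rec. 709 luma weights applied to linear RGB as a rough luminance proxy.
    luma = (0.2126 * image_linear[..., 0] +
            0.7152 * image_linear[..., 1] +
            0.0722 * image_linear[..., 2])
    mask = np.zeros(luma.shape, dtype=bool)
    mask[t:b, l:r] = True
    return float(np.log2(luma[mask].mean() / luma[~mask].mean()))

# Toy example: a dim frame with a brighter face patch gives a positive EV gap.
img = np.full((120, 160, 3), 0.05)
img[40:80, 60:100] = 0.20
print(f"face - background = {face_background_ev_gap(img, (40, 80, 60, 100)):+.1f} EV")
```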

Still room for improvement for most devices in the study


Expectations are rising, with even the smallest flaws in top-tier devices now being easily detected by discerning users. Fine-tuning product details has become more critical than ever for manufacturers.

Each device showed its strengths — but also clear areas needing improvement:

Honor Magic7 Pro

Its HDR format is not supported by third-party devices, resulting in exposure and contrast issues. Performance under complex lighting conditions also lacks stability. Skin tones tend to appear yellowish-green, leaving users with a sense that the images lack warmth.

Samsung Galaxy S25 Ultra

Highlights are overly clipped in low-light HDR scenes, and there is noticeable underexposure in night-time shooting scenarios. Skin tone rendering is sometimes unnatural as well.

Apple iPhone 16 Pro

Insufficient facial brightness is a core subjective complaint. Skin tones lean towards orange and green hues, which makes the images less pleasing than expected.

Huawei Mate 70 Pro Plus

Performs consistently overall, but tends to overexpose faces in high backlight conditions. Occasional inaccuracies in skin tone reproduction are observed in portrait mode.

Google Pixel 9 Pro XL

Shows instability in HDR rendering, with excessive highlight clipping in low-light environments. Noticeable facial noise remains a key issue for users.

OPPO Find X8 Pro

In certain HDR scenes, images are often perceived as overly bright or washed out. At times, users also notice a pinkish tint in the white balance.

Xiaomi 15 Ultra

Dim facial exposure is one of the key challenges for this device, and its HDR failure rate is slightly higher compared to other tested models.

By anchoring our evaluations in real user feedback, we highlight issues that matter most to consumers, offering manufacturers a clearer path to meaningful improvements.

Converging user expectations between Europe and China


One of the most striking findings from our survey was the similarity in user expectations across Europe and China. We were able to make this comparison thanks to the earlier Insights study run in Shanghai, which qualified trends in user preferences specifically in China.

A shared focus on the same key aspects in both China and Europe

In both markets, when it comes to portrait pictures, users ranked the same priority criteria:

✅ Subtle shifts in white balance can make or break satisfaction
✅ Proper exposure — especially on the face — is non-negotiable
✅ And yes, natural-looking skin tones are a must

This convergence underscores a growing global alignment in what users value most in smartphone photography. Regardless of cultural differences, people want accurate, natural, and well-exposed portraits.

Interestingly, Huawei Mate 70 Pro+ and OPPO Find X8 Pro were also the top-ranked devices in China — mirroring the preferences in Europe. Their strengths in delivering consistent exposure, accurate white balance, and lifelike skin tones clearly resonated across regions.

Conclusion: The road ahead for smartphone photography


As smartphone users grow more discerning, their expectations for image quality continue to rise—particularly when it comes to portrait photography. The DXOMARK Insights survey underscores this shift: even subtle visual imperfections, such as slight exposure missteps or unnatural skin tones, can significantly impact overall satisfaction. This increasing demand means that smartphone manufacturers can no longer rely solely on hardware improvements; fine-tuning image processing and delivering consistent results across lighting conditions has become critical.

While a few brands, notably Huawei and OPPO, are setting the benchmark with strong performance across varied scenarios, the study highlights that key challenges remain—particularly in low-light, night, and HDR photography, where user satisfaction still drops considerably. These pain points reflect not just technical limitations, but also a gap between user expectations and current image processing capabilities.

Looking ahead, our upcoming Camera v6 protocol, launching in June, will address these issues head-on. By introducing more complex and demanding test scenarios, especially in extreme lighting environments, the updated benchmark aims to better reflect real-world usage and provide even deeper insights into what users truly value. This evolution marks a crucial step toward helping the industry deliver photography experiences that meet—and exceed—modern user expectations.

Laptop Webcam Image Quality: What Can We Learn After 2 Years of Testing? https://www.dxomark.com/laptop-webcam-image-quality-what-can-we-learn-after-2-years-of-testing/ https://www.dxomark.com/laptop-webcam-image-quality-what-can-we-learn-after-2-years-of-testing/#respond Sun, 18 May 2025 17:30:22 +0000 https://www.dxomark.com/?p=184483 Since launching our laptop testing protocol in 2023, DXOMARK has evaluated over 30 devices, uncovering critical insights into what makes a great integrated webcam. Two years ago, Apple’s MacBooks dominated with no real competition, but today, the landscape is shifting. Here’s what our data reveals about the state of laptop webcams, from hardware design to [...]

Since launching our laptop testing protocol in 2023, DXOMARK has evaluated over 30 devices, uncovering critical insights into what makes a great integrated webcam. Two years ago, Apple’s MacBooks dominated with no real competition, but today, the landscape is shifting. Here’s what our data reveals about the state of laptop webcams, from hardware design to software optimization.

Apple’s Historical Lead, and the New Challengers

When we published our first laptop webcam rankings in 2023, Apple’s MacBook Pro M2 led the pack with an impressive score of 135, showcasing its dominance in image quality. At the time, the best Windows devices lagged behind: both the Lenovo ThinkPad X1 Carbon Gen 11 and the Microsoft Surface Pro 9 (the highest-scoring Windows devices) scored a modest 100, highlighting a significant gap in webcam performance. The quality of most Windows PCs we tested, including some flagship models, was disappointing, with the majority delivering an underwhelming video experience.

For example, in our early tests, a MacBook Pro with the M2 chip excelled in challenging scenarios, such as a scene with a brightly lit window behind the user, delivering vibrant colors and stable exposure without fluctuations, a performance that Windows laptops struggled to match at the time.

Over the past two years, Windows devices have made remarkable progress in webcam quality, driven by concerted efforts from manufacturers like HP, Dell, Lenovo, and Microsoft. These companies have invested heavily in hardware and software optimization, steadily closing the gap with Apple. Meanwhile, Apple’s webcam quality has remained relatively stagnant, with only marginal improvements since 2023.

Windows devices have since made significant strides, catching up through thorough hardware and software optimization. With mobile giant Qualcomm entering this market with extensive ISP experience, this trend has been further accelerated. The Microsoft Surface Laptop 13-inch (Snapdragon-based), for instance, now handles HDR scenarios with clarity, maintaining detail in both bright backgrounds and dim foregrounds, rivaling the MacBook’s output.

Comparison 1 (video conference scene): Apple MacBook Pro 14” (M4, 2024) vs. Microsoft Surface Laptop 13-inch
Comparison 2 (backlit scene): Apple MacBook Pro 14” (M4, 2024) vs. Microsoft Surface Laptop 13-inch

The new Microsoft Surface Laptop 13-inch provides similar quality to the latest MacBook Pro 14” (M4, 2024). Face exposure and skin tones are well balanced for typical video conference use cases (as illustrated by the first comparison). For more complex use cases, such as strong backlit scenes (single or duo, as in the second comparison), both devices still struggle to correctly expose faces with deep skin tones while keeping good contrast and preservation of bright areas.

Comparison: Apple MacBook Pro 14” (M4, 2024) vs. Lenovo ThinkPad X9 Aura

Other devices, like the Lenovo ThinkPad X9 Aura, now provide great results, even if we still notice some drawbacks, as illustrated in the frame above. Face exposure and overall color rendering have improved compared with previous devices. Still, skin color and contrast are not yet at the level we measured on the latest MacBook.

As a result, the performance gap has narrowed significantly. Here’s a simple table showing how these devices scored in our tests (higher scores indicate better performance):

This near parity indicates that Apple’s lead in webcam quality is no longer unchallenged, as Windows OEMs are making significant investments in imaging technology. Analyzing the progression of scores across both categories, it’s clear that Windows laptops are steadily closing the gap with their Apple counterparts.

What explains this shift?

Sensor Resolution Matters Less Than You Think

Conventional wisdom suggests that higher sensor resolution means better image quality. Our data challenges that assumption:

While higher-resolution sensors generally correlate with improved detail scores, the relationship is relatively weak—some 2MP webcams outperform 8MP counterparts.

This is largely due to the heavy compression and the 1080p output cap imposed by the hardware video pipeline, which diminish the potential advantages of larger or higher-resolution sensors.

Notably, smaller sensors can still deliver excellent results when paired with effective image signal processing (ISP) and tuning. For example, the Microsoft Surface Laptop 13-inch achieved a detail score of 135 using a modest sensor, thanks to Snapdragon’s advanced ISP and AI optimization, closely rivaling the Apple MacBook Pro M4’s 8MP sensor, which scored 136.

Conversely, even high-resolution hardware can underperform—devices equipped with 4K sensors have scored below 100 when tuning was subpar, highlighting the critical role of software optimization in image quality.

As seen in this graph, although there is some correlation between hardware and score, plenty of 2 MP sensors outperform higher-resolution 4+ MP sensors when it comes to overall image quality.
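To make the weak-correlation point concrete, here is a small sketch with entirely made-up megapixel counts and detail scores (not DXOMARK data): a modest positive Pearson correlation can coexist with a 2 MP camera beating an 8 MP one.

```python
# Hypothetical values for illustration only.
mp     = [2,   2,   2,  5,   5,   8,   8,  12]
detail = [135, 100, 95, 105, 120, 136, 98, 125]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(f"Pearson r = {pearson(mp, detail):.2f}")   # ~0.30 for this toy data set
```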

Comparison: Microsoft Surface Laptop 13” (2 MP camera) vs. HP Dragonfly Pro Chromebook (8 MP camera)

In this video capture, we compare the Microsoft Surface Laptop 13-inch, equipped with a 2 MP camera, to an HP Dragonfly Pro Chromebook equipped with an 8 MP camera.
At 1,000 lux, with a 4 EV dynamic between the face and the bright box, we can easily observe a strong quality gap in exposure, contrast, and dynamic range. Texture is also very low in the HP Dragonfly Pro capture.

Therefore, a key finding from our extensive body of test data is that ISP performance and software tuning have a greater impact on image quality than sensor specifications alone.

MIPI vs. USB: Why Connection Matters

Our testing reveals a clear performance hierarchy among webcam interfaces, with the Mobile Industry Processor Interface (MIPI) consistently enabling superior image quality in laptop cameras. The table below outlines the key advantages associated with MIPI-based implementations:

  • Image Processing: Direct ISP pipeline in the application processor (not a third-party USB camera chip), with joint OEM/SoC tuning for optimized image quality.
  • Data Throughput: 2-4x higher bandwidth, enabling seamless high-resolution data transfer (see the rough calculation below).
  • Power Efficiency: Built on newer technology nodes, MIPI offers superior efficiency for always-on AI tasks.
  • AI-Friendliness: Bayer processing retains richer image data compared to USB’s YUV-centric approach, enhancing AI-driven features.
  • Resolution Support: Superior image quality at 5MP and above, where USB struggles to keep up.
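A back-of-the-envelope throughput check shows why the data-throughput point matters. Assuming an uncompressed 1080p30 stream in YUV 4:2:0 (12 bits per pixel) and a USB 2.0-class link, which many webcam modules still use, the raw stream already exceeds the bus’s 480 Mbit/s theoretical ceiling, which is why USB designs typically compress inside the module.

```python
# Illustrative calculation; actual link budgets depend on the specific camera module and bus.
WIDTH, HEIGHT, FPS = 1920, 1080, 30
BITS_PER_PIXEL_YUV420 = 12

stream_mbit_s = WIDTH * HEIGHT * BITS_PER_PIXEL_YUV420 * FPS / 1e6
usb2_mbit_s = 480

print(f"Uncompressed 1080p30 (YUV 4:2:0): {stream_mbit_s:.0f} Mbit/s")
print(f"USB 2.0 theoretical maximum:      {usb2_mbit_s} Mbit/s")
print("Fits without compression:", stream_mbit_s < usb2_mbit_s)
```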

The graph below presents a comparison of our scores between the best-performing USB webcam and a MIPI-based solution, highlighting the quality gap between the two interfaces.

The advantages of MIPI interfaces are clearly reflected in our evaluation results. All 15 top-performing webcams in our testing utilize MIPI connections, consistently achieving higher scores than their USB counterparts.

Mobile Industry Expertise

MIPI sensors benefit from advancements in mobile imaging, allowing devices to achieve remarkable camera quality. OEMs using ARM chipsets—such as Apple and Qualcomm—lead the pack, with the MacBook Pro M4 scoring 136 and the Surface Laptop 13-inch close behind at 135, thanks to years of smartphone camera tuning expertise now applied to laptops. This technical advantage manifests in various attributes depending on products.

Overall, we did not measure any USB-connected device that managed to provide, in the same camera:

  • Exposure: a good exposure target, wide dynamic range, and temporal stability
  • Color: accurate color and white balance, and good skin tone rendering
  • Texture: overall, the level of detail is systematically lower than on the MIPI competition

The Lenovo ThinkPad T14 Gen 4 provides good face exposure but struggles to preserve the bright parts of the picture (1,000 lux, EV4).
The Apple MacBook Pro 14” (M4, 2024) provides good face exposure while preserving the brighter parts of the scene (1,000 lux, EV4).

Implementation Quality Matters 
While USB-connected webcams typically face performance constraints—often dependent on the image quality capabilities of module manufacturers—strong tuning can still deliver competitive outcomes. For example, the MSI Prestige 16 AI Evo achieved a score of 92, outperforming some earlier MIPI implementations.

Future Outlook 
USB connectivity remains prevalent in low- to mid-range devices. However, MIPI is rapidly establishing itself as the standard for high-end laptops, supported by a mature ecosystem and superior integration capabilities. For applications requiring premium image quality, MIPI provides a clear and compelling advantage.

Conclusion: Software and MIPI Redefine Quality

After two years of rigorous testing, three key insights have emerged:

  • First, Apple’s dominance in webcam performance is no longer unchallenged, with Windows OEMs rapidly closing the gap.
  • Second, hardware alone is insufficient; successful webcam performance hinges on the integration of advanced software and ISP partnerships.
  • Lastly, MIPI has established itself as the standard for delivering premium performance, particularly in high-end devices.

For businesses, this signifies that webcam quality is no longer a Mac-exclusive advantage. Leaders in this space will be those who combine cutting-edge hardware with strong imaging expertise, regardless of whether their platform is based on x86 or ARM architecture.

DXOMARK’s Smart Choice label: Guiding consumers with pragmatic camera options https://www.dxomark.com/dxomarks-smart-choice-label-guiding-consumers-with-pragmatic-camera-options/ https://www.dxomark.com/dxomarks-smart-choice-label-guiding-consumers-with-pragmatic-camera-options/#respond Fri, 28 Feb 2025 09:40:36 +0000 https://www.dxomark.com/?p=182910 DXOMARK has long been a trusted authority in the evaluation of smartphone camera quality. Over the years, the global leader in camera evaluation has rigorously tested hundreds of devices, observing firsthand the rapid evolution of imaging technology. In 2024 alone, 50 smartphones were assessed across multiple price segments, ranging from the entry-level market to ultra-premium [...]

DXOMARK has long been a trusted authority in the evaluation of smartphone camera quality. Over the years, the global leader in camera evaluation has rigorously tested hundreds of devices, observing firsthand the rapid evolution of imaging technology. In 2024 alone, 50 smartphones were assessed across multiple price segments, ranging from the entry-level market to ultra-premium devices over $800. This comprehensive testing gives consumers valuable insights into the camera performance of devices at various price points, making DXOMARK’s assessments a key resource for anyone in the market looking for a new smartphone.

DXOMARK’s standard Gold, Silver, and Bronze labels, introduced a few years ago, provide a fast and straightforward indicator of the performance of devices.

In today’s rapidly evolving market, consumers are faced with an overwhelming number of product choices. As segments continue to advance and products improve, identifying the best option can be challenging. DXOMARK is proud to introduce a new label, the Smart Choice label, to provide clarity and guidance in this landscape, helping consumers quickly and confidently select high-quality, well-developed products. Highlighting devices that offer exceptional camera performance for their price range, the Smart Choice label gives consumers the confidence to make an informed and enduring purchasing decision.

To better understand how this new label was established and the rules we put in place, let’s have a look at how price segments have recently evolved.

Main evolutions in imaging quality per segment in 2024

Over the past year, the smartphone camera landscape has experienced remarkable advancements, redefining photography standards across various price segments. While the Ultra-Premium category led the charge with superior performance, significant strides were also made in the Premium ($600–$800) and High-End ($400–$600) segments. These mid-tier categories have seen marked improvements across all key camera features, narrowing the gap with their Ultra-Premium counterparts.

When we examine the evolution of quality within each segment, it becomes evident that disparities are diminishing over time. This trend highlights how OEMs are aligning their offerings with the specific needs and expectations of users within each price bracket. As a result, a clear “quality standard” has emerged within each segment, setting a baseline for features. However, some standout devices continue to diverge from this standard by delivering exceptional quality that exceeds the segment’s norms, establishing themselves as references and raising the bar for what users can expect in terms of smartphone camera performance.

We’ll detail below the evolution we observed for each segment in 2024.

Ultra-Premium

The substantial evolution was especially visible in the Ultra-Premium category. Ultra-Premium smartphones continue to push the boundaries of camera technology, but the pace of improvement has slowed compared to previous years.

A significant factor contributing to this was the influx of foldable and flip smartphones in 2024. These devices, while impressive in their form factor, have shifted the focus of innovation toward design and functionality, rather than pushing the boundaries of camera quality.

Despite this, candy bar-style devices have continued to evolve, showing significant improvements in their overall image quality, with a notable increase of 8 points from 2023 to 2024. This increase, although lower than the 12-point jump seen from 2022 to 2023, demonstrates steady progress in smartphone camera capabilities.

With regards to functionalities and quality evolution, video quality has seen fewer enhancements, while photo capabilities have improved, with a particular emphasis on portraiture and skin tone rendering. Despite these advancements, capturing the perfect moment in sports photography or other fast-moving scenes remains a challenge for these devices.

Zoom, however, is where Ultra-Premium smartphones have made the most progress, with many now offering telephoto lenses that deliver much improved zoom performance.

The introduction of features like ultra-zoom (up to 100x in some cases) has further differentiated this category from others.

Interestingly, individualization and signature styles—where brands are incorporating unique camera features, AI-powered photo editing, and distinct imaging characteristics—have become an emerging trend.

Among the devices that we’ve evaluated, the Huawei Pura 70 Ultra is an example of best-in-class performance, providing top-notch performance in nearly all key imaging features.

Meanwhile, devices in the High-End and Premium categories have experienced impressive gains in their imaging features, including significant improvements in photo, video, and particularly zoom performance, and narrowing the gap with Ultra-Premium models.

Premium

Premium devices have advanced more rapidly than Ultra-Premium ones in the last year, with a marked improvement in the average camera score, which rose by 9 points compared to 2023. The gap between Premium and Ultra-Premium smartphones has narrowed, particularly in photo and video performance, though Zoom remains the key differentiator, with Premium devices still lagging behind their Ultra-Premium counterparts.

In the Premium segment, we identified devices that offer performance well above their class. Notably, Premium devices are now incorporating camera modules similar to those found in Ultra-Premium models. Devices providing an outstanding experience in the Premium price range include the Google Pixel 9, which shows strong performance in photo, video, and zoom quality, challenging the upper-tier flagships. Apple’s iPhone 16 is another strong contender in the Premium category, excelling in photo and video capabilities, though still falling short in zoom performance when compared to Ultra-Premium devices.

High End

The High-End segment has seen some of the most dramatic changes in camera technology in 2024. Historically, zoom quality has been a feature that sets Ultra-Premium smartphones apart, as they are typically equipped with one or two telephoto lenses. However, even lower-priced devices, especially in the High-End segment, are improving their zoom capabilities, often with the inclusion of a dedicated telephoto sensor.

With advancements in photo and zoom performance, many High-End devices are now challenging the quality of both Premium and Ultra-Premium smartphones. Video quality has also improved, though at a slower pace compared to other features.

One trend that has emerged in this segment is the increasing presence of refurbished flagship models at competitive prices. These devices, typically 1-2 years old, can offer near-flagship quality at a lower price point, giving consumers more options in terms of camera performance.

In terms of evolution, High-End devices are catching up with Premium models, providing similar photo quality, enhanced zoom capabilities, and faster, more responsive cameras, especially in good lighting conditions. For example, the Honor 200 Pro and Google Pixel 8a are two good contenders in their class. The Pixel 8a stands out in particular, providing excellent photo and video performance for its price segment and even competing with devices from higher price segments.

However, considering the disparity in performance within this segment, devices like these further highlight the need to identify products performing well above their class.

Entry level: Advanced and Essential

While the Essential and Advanced categories have seen some improvement, they still lag significantly behind higher-end devices. Essential devices, with an average score of 68, offer a user experience reminiscent of smartphones from over a decade ago. Zoom quality in these devices remains poor, and there is no standout performer in this category.

In 2024, devices in the Advanced segment did not show exceptional quality improvements. Quality across devices within the segment was very consistent, but still far from the average quality provided by High-End devices in 2023. These devices struggle to compete with slightly pricier models in terms of photo and zoom capabilities.

Introducing the Smart Choice label

While image quality has steadily improved across all smartphone segments, choosing the best device for your budget remains a complex task. Although narrowing, significant performance gaps still exist between the lowest- and highest-quality devices, and it is sometimes challenging to identify the best option for your needs.

To address this challenge, DXOMARK has introduced the Smart Choice label, which highlights devices that deliver exceptional imaging experience within their segment and that rival the performance of higher-tier devices. These products are designed to stay relevant over time, ensuring enduring value and competitive image quality even as new models emerge.

Consumers can easily identify Smart Choice devices on DXOMARK’s website through product pages and a dedicated Smart Choice label section, making it easier to make informed, budget-conscious decisions.

Our Smart Choice label is built on well-defined thresholds, ensuring that only products meeting specific performance criteria receive the label. We established the Smart Choice label thresholds using 2024 data from DXOMARK-tested products launched worldwide. These thresholds focus exclusively on candybar form-factor devices, as other form factors have distinct performance goals and cannot be directly compared. This approach ensures accurate, segment-specific quality benchmarks for consumers.

Specifically, which devices will qualify?

Eligible products must either match or exceed the average performance of the upper segment from the previous year.
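In code form, the rule reads as a simple threshold check. The segment averages below are made-up placeholders; the real thresholds are derived from DXOMARK’s 2024 test data for candybar devices.

```python
# Hypothetical previous-year averages of the segment directly above each price bracket.
PREV_YEAR_UPPER_SEGMENT_AVG = {
    "advanced": 95,    # upper segment = High-End average in the previous year
    "high_end": 125,   # upper segment = Premium average
    "premium": 140,    # upper segment = Ultra-Premium average
}

def is_smart_choice(segment, camera_score):
    """A device qualifies if it matches or exceeds last year's average of the segment above it."""
    threshold = PREV_YEAR_UPPER_SEGMENT_AVG.get(segment)
    return threshold is not None and camera_score >= threshold

print(is_smart_choice("high_end", 131))   # True: beats the hypothetical Premium average
print(is_smart_choice("premium", 128))    # False: below the hypothetical Ultra-Premium average
```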

 

By introducing the Smart Choice label, we aim to simplify purchasing decisions while encouraging brands to meet higher standards. This initiative not only benefits consumers by offering transparency but also motivates manufacturers to innovate responsibly, ultimately raising the overall quality of products in the market.
