Tech articles - DXOMARK
https://www.dxomark.com/category/tech-articles/
The leading source of independent audio, display, battery, and image quality measurements and ratings for smartphones, cameras, lenses, wireless speakers, and laptops since 2008.

What’s New in DXOMARK’s Camera protocol?
https://www.dxomark.com/dxomark-smartphone-camera-protocol-v6/
Wed, 25 Jun 2025


At DXOMARK, the evolution of our protocols is a continuous process, aimed at keeping pace with the accelerating innovation in smartphone imaging. With each generation of devices introducing new technologies and user-centric features, our testing methodologies adapt accordingly, not only to stay up to date but also to ensure our scores remain relevant and meaningful for real-world users.

Today, we officially unveiled the 6th version of our Smartphone Camera Protocol, the most advanced and user-aligned protocol to date. This release is the product of our traditional multi-phase development strategy that reflects both technical rigor and human-centric evaluation.

A methodological framework grounded in real-world use

Each update to our protocol follows a structured methodology built on three key pillars: identifying user needs and preferences, developing representative and repeatable test scenarios, and, finally, thoroughly evaluating and scoring the products.

Understanding user needs and preferences

Our foundation lies in in-depth research, including multi-year investigations about user preferences through DXOMARK Insights. These studies, conducted with large panel groups, explore key user pain points, particularly in challenging domains such as HDR portrait photography. Since 2023, we have carried out extensive studies across China, India and Europe, uncovering detailed insights about user expectations in portrait photography.

To enrich this broad understanding, DXOMARK regularly collaborates with independent experts in their fields (photographers, video makers…). Ahead of the launch of our Camera v6 protocol, we are deepening our engagement through the creation of the DXOMARK Expert Committee, a body of professionals and academic experts who provide valuable perspectives on emerging trends and real-world usage scenarios.

Designing representative and repeatable test scenarios

At our state-of-the-art labs in Boulogne-Billancourt, we design tests that mirror real-world usage with scientific precision. Our approach combines objective measurements and perceptual testing, covering a wide range of lighting conditions, motion scenarios, and diverse skin tones. Purpose-built setups and proprietary tools enable us to achieve a new level of granularity in testing. Beyond the lab, we also conduct tests and analyses in varied natural environments. Each device is tested extensively with over 4000 photos captured and 200 minutes of video recorded under a wide range of conditions.

Scoring: The tip of the iceberg

While our scores are publicly visible, they represent just the surface of a comprehensive evaluation process. These scores distill the end-user experience into a clear, comparable format. With the launch of our sixth-generation protocol, we introduce a revamped scoring architecture and weighting system, aligned with an updated testing matrix and newly refined quality metrics.

What’s new in this version of our protocol?

The sixth version of our protocol introduces updates across three key areas:

    • Enhanced HDR evaluation, featuring a new testing process and refined scoring methodology.
    • Updated portrait testing, informed by recent global studies to better reflect real-world user expectations.
    • Expanded focus on zoom performance, with particular attention to video zoom capabilities.

Now, let’s dive into the details of our protocol updates.

Portrait evaluation: adapting to user trends and expectations

Taking portrait photos is one of the most common and emotionally resonant use cases in smartphone photography. Guided by the extensive Insights studies we ran between 2023 and 2025 in different parts of the world, we identified three key elements users look for in a good portrait picture: a consistently well-exposed face and overall picture, natural skin tones, and an accurate, neutral white balance. We also found that low-light and night photography still represent major challenges, with users consistently unhappy with the results in these conditions.

These insights provided us with clear guidelines on users’ expectations as well as general trends on preferences, that directly drive our methodology and our tests.

What have we changed in our evaluations?

To better reflect real-world usage, including both everyday and challenging situations, we’ve significantly upgraded our portrait testing protocols:

    • 50 new portrait scenes, covering a full range of lighting conditions, from moonlight to sunlight, captured both in natural environments and simulated lab settings. In total, we now evaluate 9 lighting conditions in photo and 13 in video.
    • A broader spectrum of skin tones, ensuring inclusive and comprehensive evaluation across diverse subjects.
    • Three motion profiles, simulating real user behavior, from static handheld shooting (two-handed grip) to walking scenarios (for video), to test performance in typical portrait capture situations.

 

Our tools and test methods have evolved to deliver deeper, more meaningful insights into image quality. At the core of this is our newly developed All-in-One Portrait Lab setup, powered by Analyzer, designed to simulate real-world challenges in a controlled environment. It includes:

    • Two high-fidelity mannequins representing deep and fair skin tones, used to assess facial detail preservation.
    • Dynamic lighting simulation, covering a wide range from 0.1 lux to 10,000 lux, to evaluate performance under various illumination levels.
    • Motion simulation tools, including moving objects, a hexapod (six-axis motion platform), and a time box to rigorously test autofocus accuracy and motion blur.
    • Reflective and transmissive gray scales, supporting in-depth analysis of noise and contrast behavior.

To complete the evaluation of portraits, we now run a systematic perceptual evaluation of flare, which can affect facial clarity and background rendition.

HDR: A consistent evaluation of the HDR formats

As outlined in our recent publications, HDR is reshaping the landscape of smartphone photography. As brands explore various approaches to HDR integration (see our China Insights), new creative opportunities are emerging, alongside fresh technical challenges. The evolution of HDR formats now includes standardized versions that are compatible across a wide range of smartphones, ensuring more consistent user experiences.

To reflect the growing importance of HDR, we have now integrated a dedicated and systematic evaluation of HDR performance into our testing protocol, applicable when the tested device supports a publicly documented format that is compatible with common HDR viewing tools. This enhancement ensures a more accurate and comprehensive understanding of how HDR impacts image quality across devices.

It includes:

    • Expanded HDR scenes coverage: our testing now includes a broader range of natural scenes (across all lighting conditions, including night scenes) as well as in controlled lab environments using our AF-HDR setup.
    • New lab-based metrics: Additional objective measurements offer finer granularity in assessing HDR performance under reproducible conditions.
    • Perceptual analysis with professional reference HDR display: Evaluations are conducted using our AQuA tool (which brings an objective perspective on perceptual analysis) on an ISO 22028-5 reference HDR monitor.

When a supported HDR format is detected, images are processed with the appropriate gain maps and evaluated through our HDR visualization pipeline using dedicated scoring criteria. If the image is in a non-HDR format or an unsupported HDR format, it is analyzed as an SDR image using the same tools, ensuring consistency and fairness across all devices.
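The gain-map mechanism at the heart of these formats can be illustrated with a minimal sketch. The simple multiplicative model, function name, and default headroom below are simplifying assumptions for illustration, not DXOMARK's actual pipeline or any specific format's exact math:

```python
def apply_gain_map(sdr_base, gain, headroom_stops=2.0):
    """Boost one linear-light SDR pixel value toward HDR.

    gain is the per-pixel gain-map value in [0, 1]; a value of 1
    brightens the pixel by the full headroom (in photographic stops).
    Illustrative model only, not a specific format's exact math.
    """
    return sdr_base * (2.0 ** (gain * headroom_stops))

# A pixel at 0.25 with full gain and 2 stops of headroom quadruples:
hdr_boosted = apply_gain_map(0.25, 1.0)    # -> 1.0
hdr_untouched = apply_gain_map(0.25, 0.0)  # -> 0.25
```

A real gain-map decoder also handles per-channel maps, log-domain offsets, and adaptation to the display's available headroom, but the core principle is the same: an SDR base image plus a multiplicative boost map yields the HDR rendition.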

Zoom: An increased focus on growing user features

Zoom capabilities have emerged as a key differentiator among flagship smartphones. Increasingly valued by users, zoom is now widely used across a variety of scenarios, from close and mid-range portraits to long-range landscape and wildlife photography. In recent years, we’ve observed significant advancements across devices, enabling users to capture high-quality images even in the most demanding conditions.

In response to evolving user behavior, we have redefined our zoom testing protocol with a stronger emphasis on emerging use cases, such as video zoom, which is increasingly used during live events and concerts to capture subjects from a distance.

Key evolutions in our testing include:

    • A focus on the 85–300mm zoom range, which is especially relevant for medium to long-range portrait photography.
    • Simulation of user motion
    • Evaluation criteria covering a broad set of attributes: from static elements like face exposure, contrast, dynamic range, and texture, to temporal aspects such as stabilization and autofocus consistency, as well as usability metrics like zoom smoothness.

While we’ve refined our protocol for close- to medium-range zoom, representing most of everyday use cases, ultra zoom (200 mm and beyond) continues to be evaluated through a dedicated protocol. We’ll soon publish updated results from this specialized testing.

Video: Simulating Real-Life Movement and Light

Smartphone video performance has advanced significantly in the past few years, with results now approaching professional standards. Videos now show significantly richer color, enhanced contrast, and greater detail. For the past eight years, devices like the Apple iPhone have consistently set the benchmark for mobile video quality, delivering reliable performance and excellent detail retention across a wide range of lighting conditions.

To stay aligned with the rapid advancements in smartphone videography, we have significantly updated our evaluation protocols. In the sixth version of our video testing protocol, we’ve introduced several key enhancements:

    • Simulated user motion in the lab: We are the first to incorporate a protocol that evaluates video quality using captures recorded under controlled, simulated user movement, bringing greater realism and reproducibility to our tests.
    • Broader range of use cases: We’ve expanded scene diversity to include a wider variety of skin tones, better reflecting real-world usage.
    • Extended lighting scenarios: Our automated lab setup now covers four distinct lighting levels (from 5 to 1000 lux), each paired with systematic HDR scene simulations. Additionally, we’ve implemented a dedicated night-shooting plan, designed to evaluate performance across a variety of low-light situations and user scenarios.

Revised Architecture and Scoring System

In the latest version of our protocol, we have revised the scoring methodology to provide a more detailed and user-relevant evaluation of device performance. The updated framework now includes two main sub-scores: Photo and Video, each assessing the performance of the device’s primary focal lengths: main, tele, and ultra-wide.

This structure offers a clearer view of how each focal length performs in both still and motion capture. Additionally, we’ve introduced use-case scores to reflect real-world scenarios, providing insights into the device’s capabilities in specific contexts such as portrait photography, zoom performance (across both photo and video), and low-light shooting—a persistently challenging condition identified in our previous research.
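This two-level structure can be sketched as a weighted aggregation: per-focal-length results roll up into the Photo and Video sub-scores, which in turn roll up into an overall score. Every weight and example value below is a hypothetical placeholder; DXOMARK does not publish its exact weighting:

```python
# Sketch of a two-level weighted scoring architecture.
# All weights and values here are hypothetical, not DXOMARK's.

def weighted_score(scores, weights):
    """Combine named sub-scores using normalized weights."""
    total = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total

# Per-focal-length results (hypothetical values).
photo_by_focal = {"main": 88.0, "tele": 76.0, "ultra_wide": 70.0}
video_by_focal = {"main": 85.0, "tele": 70.0, "ultra_wide": 66.0}
focal_weights = {"main": 0.5, "tele": 0.3, "ultra_wide": 0.2}

photo = weighted_score(photo_by_focal, focal_weights)  # approx. 80.8
video = weighted_score(video_by_focal, focal_weights)  # approx. 76.7

# The top-level score combines the two main sub-scores.
overall = weighted_score({"photo": photo, "video": video},
                         {"photo": 0.5, "video": 0.5})
```

The same helper can aggregate the use-case scores (portrait, zoom, low light) from their underlying attribute measurements; only the dictionaries of inputs and weights change.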

Initial Results from Camera v6 protocol

With the new protocol version comes an updated camera ranking, resulting in some shifts in smartphone positions compared to previous rankings.

To give you a clearer idea of what to expect, this section presents an overview of the evaluation of three popular devices.

Apple iPhone 16 Pro Max

In our Camera v6 protocol, the iPhone 16 Pro Max is most affected in its Photo score. While fine noise now has a reduced impact compared with earlier versions, faces frequently appear underexposed. This lower brightness is generally less appreciated by users, so it weighs more heavily on perceived exposure quality.

Highlights on the performance of the product evaluated under the v6 Camera protocol:

    • Portrait: in our new protocol, the iPhone 16 Pro Max remains an excellent choice for portrait pictures, whether capturing a single person or a group on the same focal plane. Thanks to effective HDR management, portrait images appear immersive, vibrant and visually appealing.
    • Zoom Video: Video continues to be a strong area for the iPhone 16 Pro Max under our new testing protocol. When analyzing zoom performance during video recording, the device delivers smooth transitions and maintains high image quality throughout the zoom range.
    • Lowlight: With the inclusion of more low-light and challenging scenes in our protocol, the iPhone 16 Pro Max continues to perform strongly in photo mode while remaining the top performer in video. It produces bright images with a wide dynamic range, preserving both detail and contrast even in difficult lighting conditions.

Xiaomi 15 Ultra

Under our Camera v6 protocol, the Xiaomi 15 Ultra benefits from the increased emphasis on portrait color and telephoto zoom performance, resulting in a higher ranking.

Highlights on the performance of the product evaluated under the v6 Camera protocol:

    • Portrait: In our new protocol, the Xiaomi 15 Ultra captured pleasing portraits with realistic skin tones and good exposure across all lighting conditions.
    • Zoom photo & video: The Xiaomi 15 Ultra delivers a strong performance in telephoto, keeping a high level of detail and sharpness across the entire zoom range. It also performs strongly in video zoom, offering stable and clear results.
    • Lowlight: The device provides good low-light imaging experience, featuring a warm white balance that preserves the ambient atmosphere, along with impressive noise reduction.

Samsung Galaxy S25 Ultra

In our new protocol, the Samsung Galaxy S25 Ultra is mostly affected by the reduced weighting of noise and the growing weight of telephoto zoom.

Highlights of the performance of the product evaluated under the v6 Camera protocol:

    • Portrait: The Samsung Galaxy S25 Ultra delivers strong portrait photography performance across default, bokeh, and tele modes, with good subject detail, accurate edge detection, and versatile features like realistic blur effects and adjustable lighting.
    • Zoom: The Samsung Galaxy S25 Ultra offers impressive telephoto performance with sharp, detailed images across medium to long zoom ranges, supported by fast and reliable autofocus. While it delivers strong overall quality, some softness and noise appear at extreme zoom levels, placing it just behind top competitors like the Oppo Find X8 Ultra and Xiaomi 15 Ultra.
    • Lowlight: In low-light conditions, the camera generally delivered good exposure and accurate white balance, though occasional underexposure and unnatural tones were observed. Testers also noted inconsistencies in noise and detail between shots, highlighting a lack of consistency in performance across challenging lighting conditions.

Conclusion

With the launch of DXOMARK’s sixth-generation Smartphone Camera Evaluation Protocol, we reaffirm our commitment to providing the most accurate, relevant, and user-centric assessments in the mobile imaging space. By integrating cutting-edge testing tools, global user insights, and real-world use cases into our methodology, Camera v6 marks a significant step forward in how smartphone camera performance is measured. As innovation in mobile photography accelerates, this new protocol ensures our rankings remain not only scientifically robust but also truly reflective of the everyday experiences and expectations of users worldwide.

Portrait Photography Preferences: What European Users Expect from Their Smartphones
https://www.dxomark.com/portrait-photography-preferences/
Thu, 22 May 2025


At DXOMARK, our mission goes beyond technical analysis — we are deeply committed to capturing real user experiences. In 2023, we launched DXOMARK Insights, a global initiative aimed at understanding user preferences and pain points when it comes to smartphone imaging, particularly portrait photography.

As we are preparing to release the next version of our Camera v6 protocol, we conducted a blind survey in Paris to explore how flagship smartphones meet (or fall short of) user expectations. This study focused on evaluating performance in a variety of scenes—especially the most demanding ones, such as low-light environments, high dynamic range (HDR) settings, and backlit conditions. In addition to helping us understand user preferences for portrait photography in Europe, the scenes used in this Insights study are the ones that will be applied in the implementation of our new camera protocol.

In this article, we present findings that highlight the image-quality areas shaping the preferences of European consumers. We’ll also look at how these compare with the preferences of Chinese users.

3 key takeaways:

  • Despite strong improvements from the latest flagships, there is still room for improvement across all lighting conditions
  • Portrait pictures taken by the Huawei Mate 70 Pro+ and Oppo Find X8 Pro were generally preferred across all lighting conditions
  • Three major factors guide portrait evaluation in both Europe and China: accurate exposure of the face and the scene, a neutral white balance, and natural skin tones

 

Our methodology


This DXOMARK Insights study on smartphone HDR portrait photography focuses on assessing the perceived image quality of HDR photos themselves, rather than how they appear on specific smartphone displays.

The study involved seven of the most popular flagship smartphones: the Apple iPhone 16 Pro Max, Honor Magic7 Pro, Samsung Galaxy S25 Ultra, Oppo Find X8 Pro, Google Pixel 9 Pro XL, Huawei Mate 70 Pro+, and Xiaomi 15 Ultra.

The shooting plan was designed to feature a variety of everyday scenes from European consumers, with a focus on challenging ones including indoor, lowlight and night scenes, as well as some backlit and high contrast situations, covering a total of 50 scenes.

As in previous Insights surveys, blind comparisons by the user panel were performed with HDR visualization tools. For this study, we ensured HDR image files were viewed as they would be through third-party software, aligning with real-world usage. HDR formats that are either documented or follow ISO specifications were processed using gain map information, while non-standard formats were handled according to ITU guidelines. This approach emphasizes the importance of interoperability, highlighting which devices produce images that are easily shareable and viewable across various smartphones. Notably, all tested devices use open HDR formats, apart from the Honor Magic7 Pro. Meanwhile, the Huawei Mate 70 Pro+ supports ISO-compliant HDR output on the latest HarmonyOS version, ensuring broad compatibility.

In addition, due to the technical constraints in displaying HDR content on the web, please note that the photos used in this article are for illustration only. To view the HDR format rendering, these visuals need to be viewed with the proper HDR visualization tools.

To better understand user perception, the models featured in the photos were asked to provide feedback on their own portraits. The panel consisted of 39 demanding users, including flagship smartphone owners and photography enthusiasts. The survey was structured into two key phases:

  1. Blind Pairwise Comparison – Participants were shown two images of the same scene, each taken with a different smartphone. Through a series of side-by-side comparisons, they selected their preferred photo until a consistent JOD (Just Objectionable Difference) scale was established across all seven devices.
  2. Photo Series Rejection – In this step, participants were presented with a full set of photos from a single scene and asked to identify any images they disliked or would avoid sharing on social media.

This two-step approach allowed us to capture detailed data per scene, including:

  • The overall rejection rate across all respondents
  • The rejection rate within the specific user group being analyzed
  • The calculated JOD scale for each comparison
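A JOD scale of this kind is typically derived from pairwise preference counts via Thurstonian scaling, with 1 JOD conventionally corresponding to a 75% preference rate. The sketch below illustrates that general principle under those assumptions; it is not DXOMARK's published method:

```python
from statistics import NormalDist

def jod_scale(wins):
    """wins[i][j] = number of times device i was preferred over j.

    Returns one relative JOD value per device (mean-centered).
    Illustrative Thurstone Case V sketch, not DXOMARK's method.
    """
    n = len(wins)
    norm = NormalDist()
    z75 = norm.inv_cdf(0.75)  # ~0.6745 converts z-scores to JOD units
    jods = []
    for i in range(n):
        zs = []
        for j in range(n):
            if i == j:
                continue
            total = wins[i][j] + wins[j][i]
            # Clamp to avoid infinite z-scores on unanimous outcomes.
            p = min(max(wins[i][j] / total, 0.01), 0.99)
            zs.append(norm.inv_cdf(p) / z75)
        jods.append(sum(zs) / len(zs))
    return jods

# Three devices; device 0 wins most of its comparisons.
jods = jod_scale([[0, 30, 35],
                  [10, 0, 22],
                  [5, 18, 0]])
```

Because the scale is relative, only differences between devices are meaningful; anchoring the mean at zero is an arbitrary but common convention.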

From these insights, we derived a Satisfaction Index for every image: a score from 0 to 100 that reflects how often users accept or reject an image, providing a clear view of user preferences. A score of 0 indicates that more than half of respondents rejected the image, while a score of 100 indicates strong acceptance by the user panel, with no rejections for the specific scene or group of scenes. To enrich our understanding, participants were also asked to explain the reasons behind their rejections, giving us valuable qualitative feedback on what makes or breaks a portrait photo.
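Given the two anchor points described above (0 once more than half of the panel rejects an image, 100 when nobody does), a simple linear mapping from rejection rate to Satisfaction Index could look like the sketch below. The linear interpolation between the anchors is our assumption; the exact formula is not published:

```python
def satisfaction_index(rejection_rate):
    """Map a rejection rate (0.0-1.0) onto a 0-100 index.

    Linear between two anchors: 100 when nobody rejects the image,
    0 once at least half of the respondents reject it.
    Assumed interpolation; the published anchors are only 0 and 100.
    """
    return max(0.0, 100.0 * (1.0 - 2.0 * rejection_rate))

full_acceptance = satisfaction_index(0.0)    # -> 100.0
quarter_reject = satisfaction_index(0.25)    # -> 50.0
majority_reject = satisfaction_index(0.6)    # -> 0.0
```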

What are the pain points identified by European users when it comes to portrait pictures?


In blind tests conducted across Europe, users consistently identified three major pain points affecting image quality in portrait photography.

  • First and foremost, underexposure was frequently cited—faces and entire scenes often appeared too dark, diminishing the overall appeal of the image.
  • Second, white balance inconsistencies were a common complaint, with images sometimes looking unnaturally cold or overly warm.
  • Lastly, unnatural skin tones emerged as a critical factor; even subtle shifts in tone were enough to significantly lower user satisfaction, underscoring how sensitive viewers are to color accuracy in portraits.

Among these, exposure stood out as the most persistent issue, clearly noticeable not just in outdoor shots but also in more complex lighting scenarios such as backlit or high-contrast scenes. While exposure is an important factor, this doesn’t imply that European users are simply seeking brighter images. Instead, they place significant emphasis on the relative brightness of the face compared to the overall image. We’ll explore this topic in more detail later in the article.

To expand our understanding, we compared these findings with a similar study conducted in China the previous year. Interestingly, users in both Europe and China exhibited remarkably similar preferences, highlighting a growing global convergence in what people expect from smartphone portrait photography.

In terms of lighting conditions, low-light scenes remain the most challenging, as users are even more demanding in these settings than in night scenes.

Our satisfaction ranking in Europe


The results from Europe clearly highlighted two top performers: Huawei and OPPO. These brands led in nearly every lighting condition — from bright daylight to dim indoor scenes — and scored consistently high across several key criteria:

  • Top 3 in exposure and white balance across the board
  • Very low rejection rates, particularly in low-light scenarios
  • Steady performance across all portrait scenes

According to the JOD (Just Objectionable Difference) analysis, Huawei and OPPO stood out as the most preferred devices across the whole shooting plan.

When analyzing the overall shooting plan and aggregating user preferences across all scenes, Huawei and Oppo emerged as the most favored smartphones. The graph below presents the devices ranked by their average performance across all scenes and lighting conditions.

Notably, the ranking obtained in Europe matches the results of the China Insights study (see the last part of the article for more details). One important point to highlight is that, although these devices were not initially designed for the European market and are not commercially available in the region, their image quality tuning appears to align well with the preferences of European users.

Among all lighting conditions, smartphones struggle more in night & low-light photography


While most flagship devices perform well in good lighting, our study revealed that low-light and night scenarios remain major weak points. These settings continue to produce significantly lower satisfaction scores, even for premium models.

For easy and bright outdoor scenes, users are increasingly demanding

Even under well-lit conditions, users demonstrated high sensitivity to subtle differences. Minor variations in exposure or white balance could significantly sway the Satisfaction Index.

  • Devices with slightly underexposed faces were consistently rated lower (Apple and Honor in the following example)
  • Bright, high-contrast renderings performed better and were preferred among our panel of users (Oppo & Huawei in the following example).

This tells us that users are becoming more refined in their tastes and more critical, even of images that are objectively “good.” They expect not just technical accuracy, but visual appeal and emotional resonance in every shot.

Indoor Low-Light Scenes Fall Short

Indoor low-light conditions proved especially difficult for most smartphones. Satisfaction scores dropped sharply, often below 70, even though the photos were technically acceptable.

Why? Users simply do not perceive these scenes as difficult and therefore expect image quality to remain high. When devices underexpose faces or produce strange hues under artificial lighting, the mismatch between expectations and results creates frustration.

In these scenes, Huawei and Oppo stood out with more consistent exposure stability and natural-looking tones. Apple and Google devices also performed strongly in low light and at night, respectively. Across all devices, however, users pointed to white balance inconsistency and contrast instability as major drawbacks.

Night Photography Quality Is Still Lacking

In dark environments, the performance gap widened, and user satisfaction remained far from ideal. Overall, Huawei, OPPO, and Google delivered the most acceptable results, but all devices were challenged by the night shots, with high rejection rates and low JOD scores across many use cases.

  • Issues like clipped highlights, underexposed faces, and unnatural skin tones were common.
  • Devices like the Samsung Galaxy S25 showed inconsistency in facial exposure even with advanced hardware.

Overall, facial exposure, white balance, and skin tone rendering remain the top drivers of user dissatisfaction in night photography.

Interestingly, in night photography, a brighter face isn’t always seen as better. What truly matters to users is the balance between the face and the background. A well-managed exposure ratio between these elements plays a crucial role in perceived image quality. Users consistently favored images where the face stands out naturally without overpowering or being lost in the background, highlighting the importance of cohesive, well-balanced lighting in low-light portrait scenarios.

Still room for improvement for most devices in the study


Expectations are rising, with even the smallest flaws in top-tier devices now being easily detected by discerning users. Fine-tuning product details has become more critical than ever for manufacturers.

Each device showed its strengths — but also clear areas needing improvement:

Honor Magic7 Pro

Its HDR format is not supported by third-party devices, resulting in exposure and contrast issues. Performance under complex lighting conditions also lacks stability, and skin tones tend to appear yellowish-green, leaving users with a sense that the images lack warmth.

Samsung Galaxy S25 Ultra

Highlights are overly clipped in low-light HDR scenes, and underexposure is a noticeable issue in night-time shooting. Skin tone rendering is sometimes unnatural as well.

Apple iPhone 16 Pro Max

Insufficient facial brightness is a core subjective complaint. Skin tones lean towards orange and green hues, which makes the images less pleasing than expected.

Huawei Mate 70 Pro Plus

Performs consistently overall, but tends to overexpose faces in high backlight conditions. Occasional inaccuracies in skin tone reproduction are observed in portrait mode.

Google Pixel 9 Pro XL

Shows instability in HDR rendering, with excessive highlight clipping in low-light environments. Noticeable facial noise remains a key issue for users.

OPPO Find X8 Pro

In certain HDR scenes, images are often perceived as overly bright or washed out. At times, users also notice a pinkish tint in the white balance.

Xiaomi 15 Ultra

Dim facial exposure is one of the key challenges for this device, and its HDR failure rate is slightly higher compared to other tested models.

By anchoring our evaluations in real user feedback, we highlight issues that matter most to consumers, offering manufacturers a clearer path to meaningful improvements.

Converging user expectations between Europe and China


One of the most striking findings from our survey was the similarity in user expectations across Europe and China. This comparison was made possible by the earlier Insights study run in Shanghai, which identified trends in user preferences specific to China.

Key aspects valued in both China and Europe

In both markets, when it comes to portrait pictures, users prioritized the same criteria:

✅ Subtle shifts in white balance can make or break satisfaction
✅ Proper exposure — especially on the face — is non-negotiable
✅ And yes, natural-looking skin tones are a must

This convergence underscores a growing global alignment in what users value most in smartphone photography. Regardless of cultural differences, people want accurate, natural, and well-exposed portraits.

Interestingly, the Huawei Mate 70 Pro+ and the OPPO Find X8 Pro were also the top-ranked devices in China, mirroring the preferences in Europe. Their strengths in delivering consistent exposure, accurate white balance, and lifelike skin tones clearly resonated across regions.

Conclusion: The road ahead for smartphone photography


As smartphone users grow more discerning, their expectations for image quality continue to rise—particularly when it comes to portrait photography. The DXOMARK Insights survey underscores this shift: even subtle visual imperfections, such as slight exposure missteps or unnatural skin tones, can significantly impact overall satisfaction. This increasing demand means that smartphone manufacturers can no longer rely solely on hardware improvements; fine-tuning image processing and delivering consistent results across lighting conditions has become critical.

While a few brands, notably Huawei and OPPO, are setting the benchmark with strong performance across varied scenarios, the study highlights that key challenges remain—particularly in low-light, night, and HDR photography, where user satisfaction still drops considerably. These pain points reflect not just technical limitations, but also a gap between user expectations and current image processing capabilities.

Looking ahead, our upcoming Camera v6 protocol, launching in June, will address these issues head-on. By introducing more complex and demanding test scenarios, especially in extreme lighting environments, the updated benchmark aims to better reflect real-world usage and provide even deeper insights into what users truly value. This evolution marks a crucial step toward helping the industry deliver photography experiences that meet—and exceed—modern user expectations.

The post Portrait Photography Preferences: What European Users Expect from Their Smartphones appeared first on DXOMARK.

]]>
https://www.dxomark.com/portrait-photography-preferences/feed/ 0
Laptop Webcam Image Quality: What Can We Learn After 2 Years of Testing? https://www.dxomark.com/laptop-webcam-image-quality-what-can-we-learn-after-2-years-of-testing/ https://www.dxomark.com/laptop-webcam-image-quality-what-can-we-learn-after-2-years-of-testing/#respond Sun, 18 May 2025 17:30:22 +0000 https://www.dxomark.com/?p=184483 Since launching our laptop testing protocol in 2023, DXOMARK has evaluated over 30 devices, uncovering critical insights into what makes a great integrated webcam. Two years ago, Apple’s MacBooks dominated with no real competition, but today, the landscape is shifting. Here’s what our data reveals about the state of laptop webcams, from hardware design to [...]

The post Laptop Webcam Image Quality: What Can We Learn After 2 Years of Testing? appeared first on DXOMARK.

]]>
Since launching our laptop testing protocol in 2023, DXOMARK has evaluated over 30 devices, uncovering critical insights into what makes a great integrated webcam. Two years ago, Apple’s MacBooks dominated with no real competition, but today, the landscape is shifting. Here’s what our data reveals about the state of laptop webcams, from hardware design to software optimization.

Apple’s Historical Lead, and the New Challengers

When we published our first laptop webcam rankings in 2023, Apple’s MacBook Pro M2 led the pack with an impressive score of 135, showcasing its dominance in image quality. At the time, the best Windows devices lagged behind: the Lenovo ThinkPad X1 Carbon Gen 11 and the Microsoft Surface Pro 9, the highest-scoring Windows devices, each scored a modest 100, highlighting a significant gap in webcam performance. The quality of most Windows PCs we tested, including some flagship models, was disappointing, with the majority delivering an underwhelming video experience.

For example, in our early tests, a MacBook Pro with the M2 chip excelled in challenging scenarios, such as a scene with a brightly lit window behind the user, delivering vibrant colors and stable exposure without fluctuations, a performance that Windows laptops struggled to match at the time.

Over the past two years, Windows devices have made remarkable progress in webcam quality, driven by concerted efforts from manufacturers like HP, Dell, Lenovo, and Microsoft. These companies have invested heavily in hardware and software optimizations, steadily closing the gap with Apple. Meanwhile, Apple’s webcam quality has remained relatively stagnant, with only marginal improvements since 2023.

Windows devices have since caught up through thorough hardware and software optimization, a trend further accelerated by mobile giant Qualcomm entering this market with extensive ISP experience. The Microsoft Surface Laptop 13-inch (Snapdragon-based), for instance, now handles HDR scenarios with clarity, maintaining detail in both bright backgrounds and dim foregrounds, rivaling the MacBook’s output.

Apple MacBook Pro 14” (M4,2024)
Microsoft Surface Laptop 13-inch
Apple MacBook Pro 14” (M4,2024)
Microsoft Surface Laptop 13-inch

The new Microsoft Surface Laptop 13-inch provides quality similar to the latest MacBook Pro 14” (M4, 2024). Face exposure and skin tones are well balanced for typical video-conference use cases (as illustrated by the first comparison). For more complex use cases, such as strongly backlit scenes (single or duo, as in the second comparison), both devices still struggle to correctly expose faces with deep skin tones while preserving good contrast and highlights.

Apple MacBook Pro 14” (M4,2024)
Lenovo Thinkpad X9 Aura

Other devices, like the Lenovo ThinkPad X9 Aura, now deliver great results, even if we still notice some drawbacks, as illustrated in the frame above. Face exposure and overall color rendering have improved with respect to previous devices. Still, skin color and contrast are not yet at the level we measured on the latest MacBook.

As a result, the performance gap has narrowed significantly. Here’s a simple table showing how these devices scored in our tests (higher scores indicate better performance):

This near parity indicates that Apple’s lead in webcam quality is no longer unchallenged, as Windows OEMs are making significant investments in imaging technology. Analyzing the progression of scores across both categories, it’s clear that Windows laptops are steadily closing the gap with their Apple counterparts.

What explains this shift?

Sensor Resolution Matters Less Than You Think

Conventional wisdom suggests that higher sensor resolution means better image quality. Our data challenges that assumption:

While higher-resolution sensors generally correlate with improved detail scores, the relationship is relatively weak—some 2MP webcams outperform 8MP counterparts.

This is largely due to the heavy compression (1080p) imposed by the hardware video pipeline, which diminishes the potential advantages of larger or higher-resolution sensors.

Notably, smaller sensors can still deliver excellent results when paired with effective image signal processing (ISP) and tuning. For example, the Microsoft Surface Laptop 13-inch achieved a detail score of 135 using a modest sensor, thanks to Snapdragon’s advanced ISP and AI optimization, closely rivaling the Apple MacBook Pro M4’s 8MP sensor, which scored 136.

Conversely, even high-resolution hardware can underperform—devices equipped with 4K sensors have scored below 100 when tuning was subpar, highlighting the critical role of software optimization in image quality.

As seen in this graph, although there is some correlation between hardware and score, plenty of 2MP sensors outperform higher-resolution 4MP+ sensors when it comes to overall image quality.

Microsoft Surface Laptop 13”
HP Dragonfly Pro Chromebook

In this video capture, we compare the Microsoft Surface Laptop 13-inch, equipped with a 2MP camera, to an HP Dragonfly Pro Chromebook, equipped with an 8MP camera.
At 1,000 lux and with a 4 EV dynamic between the face and the bright box, we can easily observe a strong quality gap in exposure, contrast, and dynamic range. Texture is also very low on the HP Dragonfly Pro capture.
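For context, each EV step doubles the light level, so a 4 EV span between the face and the bright box corresponds to a 2^4 = 16x luminance ratio. A minimal sketch of that conversion:

```python
def ev_to_luminance_ratio(ev_span: float) -> float:
    """Each EV step doubles the luminance, so an N-EV span is a 2**N ratio."""
    return 2.0 ** ev_span

# The 4 EV face-to-highlight span of the test scene above:
print(ev_to_luminance_ratio(4))  # → 16.0
```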

Therefore, a key finding from our extensive data is that ISP performance and software tuning have a greater impact on image quality than sensor specifications alone.

MIPI vs. USB: Why Connection Matters

Our testing reveals a clear performance hierarchy among webcam interfaces, with the Mobile Industry Processor Interface (MIPI) consistently enabling superior image quality in laptop cameras. The table below outlines the key advantages associated with MIPI-based implementations:

Performance Factor | MIPI Advantage
Image Processing | Direct ISP pipeline in the application processor (not a third-party USB camera chip), with joint OEM/SoC tuning for optimized image quality.
Data Throughput | 2–4x higher bandwidth, enabling seamless high-resolution data transfer.
Power Efficiency | Built on newer process nodes, MIPI offers superior efficiency for always-on AI tasks.
AI-Friendliness | Bayer processing retains richer image data than USB’s YUV-centric approach, enhancing AI-driven features.
Resolution Support | Superior image quality at 5MP and above, where USB struggles to keep up.

The graph below presents a comparison of our scores between the best-performing USB webcam and a MIPI-based solution, highlighting the quality gap between the two interfaces.

The advantages of MIPI interfaces are clearly reflected in our evaluation results. All 15 top-performing webcams in our testing utilize MIPI connections, consistently achieving higher scores than their USB counterparts.

Mobile Industry Expertise

MIPI sensors benefit from advancements in mobile imaging, allowing devices to achieve remarkable camera quality. OEMs using ARM chipsets—such as Apple and Qualcomm—lead the pack, with the MacBook Pro M4 scoring 136 and the Surface Laptop 13-inch close behind at 135, thanks to years of smartphone camera tuning expertise now applied to laptops. This technical advantage manifests in different attributes depending on the product.

Overall, we did not measure any USB-connected device that managed to deliver all of the following in the same camera:

  • Exposure: good target exposure, dynamic range, and temporal stability
  • Color: accurate color and white balance, and good skin-tone rendering
  • Texture: overall, the level of detail is systematically lower than on the MIPI competition

The Lenovo ThinkPad T14 Gen 4 provides good face exposure but struggles to preserve the bright parts of the picture (1,000 lux, EV4).
The Apple MacBook Pro 14” (M4, 2024) provides good face exposure while preserving the brighter parts of the scene (1,000 lux, EV4).

Implementation Quality Matters 
While USB-connected webcams typically face performance constraints—often dependent on the image quality capabilities of module manufacturers—strong tuning can still deliver competitive outcomes. For example, the MSI Prestige 16 AI Evo achieved a score of 92, outperforming some earlier MIPI implementations.

Future Outlook 
USB connectivity remains prevalent in low- to mid-range devices. However, MIPI is rapidly establishing itself as the standard for high-end laptops, supported by a mature ecosystem and superior integration capabilities. For applications requiring premium image quality, MIPI provides a clear and compelling advantage.

Conclusion: Software and MIPI Redefine Quality

After two years of rigorous testing, three key insights have emerged:

  • First, Apple’s dominance in webcam performance is no longer unchallenged, with Windows OEMs rapidly closing the gap.
  • Second, hardware alone is insufficient; successful webcam performance hinges on the integration of advanced software and ISP partnerships.
  • Lastly, MIPI has established itself as the standard for delivering premium performance, particularly in high-end devices.

For businesses, this signifies that webcam quality is no longer a Mac-exclusive advantage. Leaders in this space will be those who combine cutting-edge hardware with strong imaging expertise, regardless of whether their platform is based on x86 or ARM architecture.


]]>
https://www.dxomark.com/laptop-webcam-image-quality-what-can-we-learn-after-2-years-of-testing/feed/ 0
DXOMARK’s Smart Choice label: Guiding consumers with pragmatic camera options https://www.dxomark.com/dxomarks-smart-choice-label-guiding-consumers-with-pragmatic-camera-options/ https://www.dxomark.com/dxomarks-smart-choice-label-guiding-consumers-with-pragmatic-camera-options/#respond Fri, 28 Feb 2025 09:40:36 +0000 https://www.dxomark.com/?p=182910 DXOMARK has long been a trusted authority in the evaluation of smartphone camera quality. Over the years, the global leader in camera evaluation has rigorously tested hundreds of devices, observing firsthand the rapid evolution of imaging technology. In 2024 alone, 50 smartphones were assessed across multiple price segments, ranging from the entry-level market to ultra-premium [...]

The post DXOMARK’s Smart Choice label: Guiding consumers with pragmatic camera options appeared first on DXOMARK.

]]>
DXOMARK has long been a trusted authority in the evaluation of smartphone camera quality. Over the years, the global leader in camera evaluation has rigorously tested hundreds of devices, observing firsthand the rapid evolution of imaging technology. In 2024 alone, 50 smartphones were assessed across multiple price segments, ranging from the entry-level market to ultra-premium devices over $800. This comprehensive testing gives consumers valuable insights into the camera performance of devices at various price points, making DXOMARK’s assessments a key resource for anyone in the market looking for a new smartphone.

DXOMARK’s standard Gold, Silver, and Bronze labels, introduced a few years ago, provide a fast and straightforward indicator of the performance of devices.

In today’s rapidly evolving market, consumers are faced with an overwhelming number of product choices. As segments continue to advance and products improve, identifying the best option can be challenging. DXOMARK is proud to introduce a new label, the Smart Choice label, to provide clarity and guidance in this landscape, helping consumers quickly and confidently select high-quality, well-developed products. Highlighting devices that offer exceptional camera performance for their price range, the Smart Choice label gives consumers the confidence to make an informed and enduring purchasing decision.

To better understand how this new label was established and the rules we put in place, let’s have a look at how price segments have recently evolved.

Main evolutions in imaging quality per segment in 2024

Over the past year, the smartphone camera landscape has experienced remarkable advancements, redefining photography standards across various price segments. While the Ultra-Premium category led the charge with superior performance, significant strides were also made in the Premium ($600–$800) and High-End ($400–$600) segments. These mid-tier categories have seen marked improvements across all key camera features, narrowing the gap with their Ultra-Premium counterparts.

When we examine the evolution of quality within each segment, it becomes evident that disparities are diminishing over time. This trend highlights how OEMs are aligning their offerings with the specific needs and expectations of users within each price bracket. As a result, a clear “quality standard” has emerged within each segment, setting a baseline for features. However, some standout devices continue to diverge from this standard by delivering exceptional quality that exceeds the segment’s norms, establishing themselves as references and raising the bar for what users can expect in terms of smartphone camera performance.

We’ll detail below the evolution we observed for each segment in 2024.

Ultra-Premium

This evolution was especially visible in the Ultra-Premium category. Ultra-Premium smartphones continue to push the boundaries of camera technology, but the pace of improvement has slowed compared to previous years.

A significant factor contributing to this was the influx of foldable and flip smartphones in 2024. These devices, while impressive in their form factor, have shifted the focus of innovation toward design and functionality, rather than pushing the boundaries of camera quality.

Despite this, candy bar-style devices have continued to evolve, showing significant improvements in their overall image quality, with a notable increase of 8 points from 2023 to 2024. This increase, although lower than the 12-point jump seen from 2022 to 2023, demonstrates steady progress in smartphone camera capabilities.

In terms of functionality and quality evolution, video quality has seen fewer enhancements, while photo capabilities have improved, with a particular emphasis on portraiture and skin-tone rendering. Despite these advancements, capturing the perfect moment in sports photography or other fast-moving scenes remains a challenge for these devices.

Zoom, however, is where Ultra-Premium smartphones have made the most progress, with many now offering telephoto lenses that deliver much improved zoom performance.

The introduction of features like ultra-zoom (up to 100x in some cases) has further differentiated this category from others.

Interestingly, individualization and signature styles—where brands are incorporating unique camera features, AI-powered photo editing, and distinct imaging characteristics—have become an emerging trend.

Among the devices that we’ve evaluated, the Huawei Pura 70 Ultra is an example of best-in-class performance, providing top-notch performance in nearly all key imaging features.

Meanwhile, devices in the High-End and Premium categories have experienced impressive gains in their imaging features, including significant improvements in photo, video, and particularly zoom performance, and narrowing the gap with Ultra-Premium models.

Premium

Premium devices have advanced more rapidly than Ultra-Premium ones in the last year, with a marked improvement in the average camera score, which rose by 9 points compared to 2023. The gap between Premium and Ultra-Premium smartphones has narrowed, particularly in photo and video performance, though Zoom remains the key differentiator, with Premium devices still lagging behind their Ultra-Premium counterparts.

In the Premium segment, we identified devices that offer performance well above their class. Notably, Premium devices are now incorporating camera modules similar to those found in Ultra-Premium models. Devices providing an outstanding experience in the Premium price range include the Google Pixel 9, which shows strong performance in photo, video, and zoom quality, challenging the upper-tier flagships. Apple’s iPhone 16 is another strong contender in the Premium category, excelling in photo and video capabilities, though still falling short in zoom performance when compared to Ultra-Premium devices.

High-End

The High-End segment has seen some of the most dramatic changes in camera technology in 2024. Historically, zoom quality has been a feature that sets Ultra-Premium smartphones apart, as they are typically equipped with one or two telephoto lenses. However, even lower-priced devices, especially in the High-End segment, are improving their zoom capabilities, often with the inclusion of a dedicated telephoto sensor.

With advancements in photo and zoom performance, many High-End devices are now challenging the quality of both Premium and Ultra-Premium smartphones. Video quality has also improved, though at a slower pace compared to other features.

One trend that has emerged in this segment is the increasing presence of refurbished flagship models at competitive prices. These devices, typically 1-2 years old, can offer near-flagship quality at a lower price point, giving consumers more options in terms of camera performance.

In terms of evolution, High-End devices are catching up with Premium models, providing similar photo quality, enhanced zoom capabilities, and faster, more responsive cameras, especially in good lighting conditions. For example, the Honor 200 Pro and Google Pixel 8a are two good contenders in their class. The Pixel 8a stands out, delivering excellent photo and video performance for its price segment and even competing with devices from higher price segments.

However, given the performance disparity within this segment, such devices further highlight the need to single out products that perform well above their class.

Entry level: Advanced and Essential

While the Essential and Advanced categories have seen some improvement, they still lag significantly behind higher-end devices. Essential devices, with an average score of 68, offer a user experience reminiscent of smartphones from over a decade ago. Zoom quality in these devices remains poor, and there is no standout performer in this category.

In 2024, devices in the Advanced segment did not show exceptional quality improvements. Quality within the segment was very uniform, but still far from the average quality that High-End devices provided in 2023. These devices struggle to compete with slightly pricier models in terms of photo and zoom capabilities.

Introducing the Smart Choice label

While image quality has steadily improved across all smartphone segments, choosing the best device for your budget remains a complex task. Although narrowing, significant performance gaps still exist between the lowest- and highest-quality devices, and it is sometimes challenging to identify the best option for your needs.

To address this challenge, DXOMARK has introduced the Smart Choice label, which highlights devices that deliver exceptional imaging experience within their segment and that rival the performance of higher-tier devices. These products are designed to stay relevant over time, ensuring enduring value and competitive image quality even as new models emerge.

Consumers can easily identify Smart Choice devices on DXOMARK’s website through product pages and a dedicated Smart Choice label section, making it easier to make informed, budget-conscious decisions.

Our Smart Choice label is built on well-defined thresholds, ensuring that only products meeting specific performance criteria receive the label. We established the Smart Choice label thresholds using 2024 data from DXOMARK-tested products launched worldwide. These thresholds focus exclusively on candybar form-factor devices, as other form factors have distinct performance goals and cannot be directly compared. This approach ensures accurate, segment-specific quality benchmarks for consumers.

Specifically, which devices will qualify?

Eligible products must either match or exceed the average performance of the upper segment from the previous year.
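This rule can be sketched in a few lines of code. The segment names and threshold values below are illustrative assumptions for the sake of the example, not DXOMARK’s published figures:

```python
# Hypothetical previous-year average scores of the segment ABOVE each one;
# these numbers are placeholders, not DXOMARK's published thresholds.
UPPER_SEGMENT_AVG_PREV_YEAR = {
    "high-end": 140,   # avg of last year's Premium devices (illustrative)
    "premium": 150,    # avg of last year's Ultra-Premium devices (illustrative)
}

def qualifies_for_smart_choice(segment: str, camera_score: int) -> bool:
    """A device qualifies if it matches or exceeds the previous-year
    average of the segment directly above its own (candybar form factor only)."""
    threshold = UPPER_SEGMENT_AVG_PREV_YEAR[segment]
    return camera_score >= threshold

print(qualifies_for_smart_choice("high-end", 142))  # True under these placeholder numbers
```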

 

By introducing the Smart Choice label, we aim to simplify purchasing decisions while encouraging brands to meet higher standards. This initiative not only benefits consumers by offering transparency but also motivates manufacturers to innovate responsibly, ultimately raising the overall quality of products in the market.


]]>
https://www.dxomark.com/dxomarks-smart-choice-label-guiding-consumers-with-pragmatic-camera-options/feed/ 0
Lens flare unveiled: The challenges in the latest flagship devices https://www.dxomark.com/lens-flare-unveiled-the-challenges-in-the-latest-flagship-devices/ https://www.dxomark.com/lens-flare-unveiled-the-challenges-in-the-latest-flagship-devices/#respond Wed, 18 Dec 2024 17:57:20 +0000 https://www.dxomark.com/?p=180515 In recent years, smartphone cameras have been offering impressive photography capabilities, rivaling traditional cameras in many aspects. As people rely more and more on their smartphones to capture everyday moments, understanding the nuances of smartphone photography becomes increasingly important. One such nuance is lens flare, a scattered light that creates various visual artifacts in the [...]

The post Lens flare unveiled: The challenges in the latest flagship devices appeared first on DXOMARK.

]]>
In recent years, smartphone cameras have been offering impressive photography capabilities, rivaling traditional cameras in many aspects. As people rely more and more on their smartphones to capture everyday moments, understanding the nuances of smartphone photography becomes increasingly important.

One such nuance is lens flare, a scattered light that creates various visual artifacts in the image. While lens flare can sometimes add a creative touch to photos and videos, it often results in unwanted effects that can diminish the overall quality of the image.

Understanding and managing lens flare is crucial for any mobile photographer looking to improve their smartphone photography. By learning how this phenomenon occurs and how to control it, photographers can ensure that their images maintain clarity, contrast, good dynamic range, and color accuracy. Manufacturers, meanwhile, need to know how to measure, limit, and control the occurrence of flare during the design and development of the smartphone camera.

What is lens flare?

Lens flare is a phenomenon that occurs when a bright light source, such as the sun or an artificial light, enters the camera lens and scatters in various shapes, sizes, and patterns in the image.

This scattering of light can result in different types of flare, including ghosting and veiling glare. Ghosting appears as multiple, often circular, bright spots or shapes in the image, caused by reflections between the lens elements. Veiling glare, on the other hand, looks like a haze over the image, with low contrast across the image, reducing overall clarity.

Colored spots and ghosting
Luminous halo

Veiling glare

Lens flare can range from artistic, dreamy effects to distracting artifacts that degrade image quality, which is the kind of flare DXOMARK studies. Understanding these types of flare helps smartphone photographers manage or creatively use them to enhance their photos.

Causes of lens flare in smartphone cameras

Light sources such as direct sunlight and strong artificial lights are the primary culprits behind flare in a photo. Their interaction with the lens elements produces artifacts you would not see with the naked eye. Scratches, dust, or even fingerprints on the lens are examples of such interactions, which can also occur when lens elements or infrared filters are not perfectly aligned.

The design and coatings of the lens play a crucial role in managing these effects. High-quality lenses often feature advanced coatings intended to reduce reflections and improve light transmission, thereby minimizing flare. However, even with these coatings, internal reflections within the camera module, including those off the sensor, can still contribute to flare, even increasing it. These internal reflections can create unwanted artifacts and reduce overall image clarity.

Effects of lens flare on photos

Lens flare can have both positive and negative effects on photographs. On the positive side, flare can be used creatively to add an artistic touch to images. Photographers often exploit flare to create dreamy, ethereal effects or to emphasize the warmth and intensity of a light source, adding a sense of drama and atmosphere to their shots.

However, the negative consequences of flare are reductions in contrast, color saturation, and overall image clarity. This can result in photos that appear washed out or hazy, detracting from the subject’s details and vibrancy. For instance, a landscape photo taken with the sun directly in the frame might show beautiful, artistic flare patterns, but the image might also lack sharpness and have muted colors. Comparing photos with and without flare highlights these differences, showing how flare can both enhance and detract from the visual impact of an image.

As smartphone image quality continues to push the limit of high dynamic range capture with (multi-frame) processing that aims to enhance shadows and darktone contrasts, it can also make the flare effects more visible, making it more important than ever to reduce flare at the source.

That is why flare remains a key challenge even for the most advanced cameras and why manufacturers are trying to overcome this defect as effectively as possible.

Analyzing flare the DXOMARK way

DXOMARK specializes in measuring and analyzing this image phenomenon thanks to its flare bench, a dedicated laboratory setup just for flare. This equipment can generate sun-like flare images by incorporating a bright light source close in apparent size and color temperature to the sun. This lab provides reliable metrics on flare based on real use cases, such as specific measurements of the flare’s intensity based on processing the RAW images taken on the device being tested.

The laboratory setup to test flare

We recently tested several of the latest flagships from the top brands by measuring single-frame RAW files with very limited processing. We observed how each device managed a light source that was either in or out of the lens’s field of view, and at various angles. We also evaluated light sources that were collimated (with parallel beams of light) and non-collimated (with divergent beams).

Our study focused on the image quality of the latest flagship devices, with measurements of the image degradation caused by flare. For comprehensive results, our image quality engineers evaluated the following devices in the lab and out in the field:

• Oppo Find X8 Pro
• Vivo X200 Pro
• Xiaomi 15 Pro
• Huawei Pura 70 Ultra
• Apple iPhone 16 Pro Max

The perceptual study on flare was conducted in the following manner:

  • Outdoors, in various light conditions, including the sun in the field of view, a strong or subtle backlight, and a powerful artificial light source placed outside the field of view but toward the camera.
  • Indoors, with a strong to low-intensity diffused light placed outside the field of view.

Results

Flare is nothing new. Every smartphone camera produces flare to some extent, and we aimed to study the conditions under which these devices produce flare, and at what intensity. Our results clearly showed that some flagships were more prone to flare than others, some with very distinct patterns, and we were also able to determine the precise angles that would generate flare.

Outdoors in natural light, with no additional light source:

With sunlight, flare was visible on the  Oppo and Vivo images, and barely visible on the Xiaomi.

 

 

Indoors:

With a low-intensity, non-collimated (divergent) light source, flare was visible on the Oppo, Vivo, and Xiaomi, although in this example the Oppo's flare is barely noticeable.

 

To further test flare behavior, we also put these devices through unusual, extremely challenging cases not normally encountered in the real world by shining a powerful artificial light source at them, as in the following indoor scene:

 

With a source of light such as a flashlight, sunlight, or city lights, flare was quite noticeable. In this outdoor example, we also placed a powerful artificial light source outside the field of view, aimed toward the camera. The latest Vivo and Oppo flagships show visible flare, slightly more pronounced on the Oppo.

In the following outdoor examples, a powerful artificial light source (a torch light) used outside the field of view generated haze flare in the image.

The measurements and analyses also demonstrated how flare affected an image of a high dynamic range scene. Flare, particularly from strong sunlight or city lights, can limit a device’s HDR capabilities by degrading the raw information that gets captured.

The following example illustrates this:

Results from the laboratory provide further details about each device’s handling of flare.

The following graph shows the presence and intensity of flare produced by the latest flagships as a function of the angle of the light source to the lens, in or out of the field of view.

Flare intensity is the ratio between the average unwanted signal in the image (i.e., outside of the light source) and the normalized signal of the source, expressed in decibels (dB). Flare readings below -60 dB are not visible in an image and have no impact on image quality. As flare intensity rises above -60 dB, however, it can become visible in the image, depending on the light conditions when the photo was taken.

Flare readings of -40 dB and above indicate significant image quality degradation.
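As a rough illustration of the metric described above (our lab pipeline is more involved), the dB thresholds map to simple signal ratios. The sketch below assumes a power-ratio convention (10·log10), under which the -60 dB visibility threshold corresponds to an unwanted signal one-millionth as strong as the source:

```python
import math

# Illustrative sketch of the flare-intensity metric described above
# (the actual DXOMARK lab pipeline is more involved). We assume a
# power-ratio convention, i.e. dB = 10 * log10(ratio).
def flare_intensity_db(unwanted_mean, source_signal=1.0):
    return 10 * math.log10(unwanted_mean / source_signal)

print(flare_intensity_db(1e-6))  # -60.0: at the visibility threshold
print(flare_intensity_db(1e-4))  # -40.0: significant degradation
```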

In the following graph, we can see that when the light source is in the camera's field of view at a 40- to 45-degree angle to the lens, flare intensity measures considerably above -40 dB on all the flagship devices, meaning image quality is severely compromised.

However, where flare behavior among the flagships becomes more divergent is when the angle of the light source increases and starts moving out of the camera’s field of view.

Our measurements showed the Oppo and Vivo devices producing more flare than the other flagships, even when the light source was at higher angles and out of the camera's field of view.

Flare Intensity measurements

The Xiaomi 15 Pro lands between those devices and the Apple iPhone 16 Pro Max. All four devices (Apple, Xiaomi, Oppo, and Vivo) are affected by flare, especially in HDR scenes, as it degrades the quality of the darker signal.

A graphic depiction of flare

The following graphs help to scientifically illustrate the presence and intensity of flare at specific angles. In these examples, we can also see the characteristics of the flare produced.

The colored bar to the right of each graph indicates the intensity of flare: for example, the darker the blue, the less flare in the image. Using -30 dB as the threshold at which flare begins to significantly degrade image quality, we can see that the intensity of the flare at 47-, 55-, and 63-degree angles, with the light source out of the field of view, never exceeded -30 dB on these flagships. What is notable, however, is how the flare differs among these devices.

Flare at 47 degrees
Oppo Find X8 Pro
Vivo X200 Pro

 

Flare at 55 degrees
In the following graphs, we see that both the Vivo and Oppo devices' flare follows the curvature of the lens.

Oppo Find X8 Pro
Vivo X200 Pro

 

Xiaomi 15 Pro
Huawei Pura 70 Ultra
Apple iPhone 16 Pro Max

In the range between 45° and 55°, the Xiaomi 15 Pro's flare was barely visible, while the Huawei Pura 70 Ultra and Apple iPhone 16 Pro Max kept their maximum flare intensity below -49 dB and -46 dB, respectively, and their average flare measurement below -53 dB.

Between 60° and 65°, the Huawei showed more flare than the competition, but the impact on the quality of the final image was more limited in this range than between 45° and 55°.

Flare at 63 degrees
Oppo Find X8 Pro
Vivo X200 Pro

 

Xiaomi 15 Pro
Huawei Pura 70 Ultra
Apple iPhone 16 Pro Max

 

Harnessing lens flare

 

The Huawei Pura 70 Ultra’s burst of sunlike flare

While a burst of flare can sometimes appear aesthetically pleasing in a photo, the previous graphs and some of the real-life images show that, most of the time, flare adds little artistically because of its intrusive and random nature. In any case, whether intended or not, flare reduces the basic technical quality of an image by interfering with contrast and the collection of light.

In this regard, most smartphone lenses are also still far from producing the pleasing forms of flare that can sometimes be created with professional lenses.

So what does all this mean for the average mobile photographer? While there might not be much anyone can do about lens flare itself, there are steps one can take to minimize it and enhance image quality, such as avoiding shooting directly into strong light sources, adjusting camera angles, keeping lenses clean, and employing software solutions.

However, mobile photographers also shouldn’t shy away from experimenting with flare creatively, as it can add a unique artistic touch to photos. Balancing technological advancements with artistic expression is key in photography; while technology provides tools to enhance image quality, the photographer’s vision and creativity ultimately define the art.

DXOMARK is continuing to do deep dives on other pertinent aspects of smartphone photography in the latest flagships. Keep checking dxomark.com for the results of those studies that are coming soon.

The post Lens flare unveiled: The challenges in the latest flagship devices appeared first on DXOMARK.

Putting the Vivo X200 Pro to the test in China https://www.dxomark.com/putting-the-vivo-x200-pro-to-the-test-in-china/ Wed, 04 Dec 2024 15:36:03 +0000

The post Putting the Vivo X200 Pro to the test in China appeared first on DXOMARK.

The much-awaited Vivo flagship, the X200 Pro, has been available on the Chinese market for several weeks, and we know you have been eagerly awaiting a DXOMARK evaluation.

Our engineers and technicians haven’t wasted any time in examining the performance of the X200 Pro’s camera.

DXOMARK’s ongoing commitment to identify and study consumer preferences, particularly in technically challenging areas such as smartphone portrait photography, has allowed us to enrich the way we are evaluating the latest smartphone flagships, particularly those from China.

To ensure that our tests and protocols reflect daily use cases and local practices, DXOMARK performed localized photo sessions covering thorough, typical use cases for devices made specifically for the Chinese market.

We are pleased to share localized images, laboratory measurements, and consumer surveys on the Vivo X200 Pro, a flagship that was recently released only in China.

Our methodology for this hands-on evaluation of the Vivo X200 Pro involved four days of shooting in Shanghai, two days of consumer portrait surveys, 30 local participants, three brands for comparison (four devices), 13 shooting modes, and three days of intensive testing in our Paris labs. More than 3,000 images and 250 real-life scenes were taken for this evaluation.

We focused on five main areas:

1. Zoom
2. Portrait and landscape quality testing
3. Photo styles
4. Flare
5. Zero shutter lag

We compared the Vivo X200 Pro’s performances with the Vivo’s predecessor, the Vivo X100 Pro; the current No. 1 phone in our ranking, the Huawei Pura 70 Ultra; and the Apple iPhone 16 Pro Max.

While each model has its own strengths and weaknesses, our results, detailed further below, showed that the Vivo X200 Pro smartphone proved to be a formidable flagship performer in tele zoom and portraits, with renderings that were highly favored by Chinese consumers.

Zoom: Excellent details

A main area of competition among smartphone camera makers in the ultra-premium category is the quality of their zoom performance. The image quality of a smartphone camera's zoom, from ultrawide to the longest tele, depends largely on the device's ability to maintain a good amount of image detail.

Based on our evaluation, the Vivo X200 Pro’s zoom capabilities were class-leading, showing outstanding quality for medium tele and long ranges, starting from 85mm.

Combined with impressive hardware (a 1/1.4-inch, 200-megapixel sensor), the tele module delivered outstanding results from 3.7x all the way to 10x, as the following examples show.

Huawei Pura 70 Ultra at 5x
Loss of details, no noise
Vivo X200 Pro at 5x
Lots of details, no noise
Apple iPhone 16 Pro Max at 5x
Strong noise, loss of details

 

Huawei Pura 70 Ultra
Vivo X200 Pro
Apple iPhone 16 Pro Max

 

In our lab, we measured detail preservation at various distances, and the Vivo X200 Pro clearly surpassed the others, particularly from 3.7x to 8x in all lighting conditions.

However, despite its stellar medium- to long-range tele performance, the Vivo X200 Pro's image detail rendition was a bit behind some of its competitors at close-range zoom, which we consider the 35mm to 70mm equivalent, and at ultrawide focal lengths. In this regard, the Huawei Pura 70 Ultra sets the bar high for detail rendering at all focal lengths by using its tele module to enhance details in the center of images at intermediate zoom levels.

The device's performance at close-range zoom, from 35mm to 70mm, and even at ultrawide, was behind some of its competitors, as seen in the graph and photos below:

Close-range zoom
Huawei Pura 70 Ultra - close range
Vivo X200 Pro
Apple iPhone 16 Pro Max

At close-range zoom (top photos), the Vivo X200 Pro's detail and resolution were better than the iPhone 16 Pro Max's, but not as good as the Huawei Pura 70 Ultra's. At ultrawide (below), details don't reach Huawei's levels, but the X200 Pro's noise handling offers good results.

Ultrawide
Huawei Pura 70 Ultra
Vivo X200 Pro
Apple iPhone 16 Pro Max

 

Low-light zoom

In more challenging lighting conditions, such as a dimly lit room, late twilight, or early dawn, the Vivo X200 Pro's medium- to long-range tele performance (from 85mm up) was equally impressive in terms of detail preservation, outdoing both the Huawei and the Apple. While noise levels were well controlled on the X200 Pro, they did not match those of the Apple and Huawei flagships.

 

At 3.7x, 20 lux measurements show even higher detail preservation than several competitors, like the iPhone, achieve in outdoor conditions
In low light, the Vivo X200 Pro performs very well in our noise metric.

 

High-quality portraits

The Vivo's outstanding tele zoom kicks in at 3.7x, an 85mm equivalent, which happens to be the preferred focal length of portrait photographers and photo enthusiasts, making the device an excellent performer for this type of usage. This hardware choice allows the device to capture high-quality portraits with minimal perspective distortion regardless of the lighting conditions, whether outdoors or indoors.

At 3.7x, however, the iPhone 16 Pro Max continues to use its main lens; it switches to its tele lens at 5x. The following photos were taken at the equivalent of 85mm to 95mm (equal to or slightly more than 3.7x on the Vivo device), with the Huawei and Vivo using their tele lenses but the Apple camera using its main lens.
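As a side note on the focal-length arithmetic above, a zoom factor converts to a 35mm-equivalent focal length by simple multiplication, so the 3.7x = 85mm figure implies a main camera of roughly 23mm equivalent. A quick sketch (the ~23mm value is inferred from that figure, not a published spec):

```python
# Quick sketch of the zoom-factor arithmetic above. The ~23mm main
# camera equivalent is inferred from the 3.7x = 85mm figure, not a
# published spec.
MAIN_EQUIV_MM = 85 / 3.7  # ≈ 23mm equivalent

def equiv_focal_mm(zoom_factor):
    return zoom_factor * MAIN_EQUIV_MM

print(round(equiv_focal_mm(3.7)))  # 85
print(round(equiv_focal_mm(5.0)))  # 115
```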

The Vivo X200 Pro provided the best long-range zoom performance, capturing high-quality portraits with minimal perspective distortion thanks to its powerful telephoto lens.

Apple iPhone 16 Pro Max
Huawei Pura 70 Ultra
Vivo X200 Pro

 

Photos at 3.7x

 

Portraits taken at 3.7x
Huawei Pura 70 Ultra
Vivo X200 Pro
Apple iPhone 16 Pro Max

 

In the following examples, all devices were using the tele camera:

All photos taken with tele module at 5x
Huawei Pura 70 Ultra - Original
Vivo X200 Pro - Vivid
Apple iPhone 16 Pro Max - Standard

 

Bright faces, soft contrast

In portraits, the Vivo X200 Pro showed a significant improvement over its predecessor, the Vivo X100 Pro. The X200 Pro's default Vivid mode provided bright face renderings with soft contrast, and it was largely preferred over the other Vivo modes, Natural and Textured.

Vivo X200 Pro
Vivo X100 Pro

 

Huawei Pura 70 Ultra
Apple iPhone 16 Pro Max - Standard

The Vivo's bokeh performance was class-leading, with high image quality in outdoor and night settings. The device produced very limited segmentation artifacts around the subject.

Vivo X200 Pro – Bokeh

 

More on tones and styles

A key to good portraits, and images in general, is the way mood can be conveyed through tone. Device manufacturers are providing mobile photographers with a selection of options for personalizing their images or giving their photography a certain "look." The Vivo X200 Pro, for example, offers three modes: Vivid (default), Natural, and Textured, the last of which is billed as giving the image the mood of a masterpiece. We saw that in bright light, the Textured mode produced subtle differences in the images, but as the light decreased, the device wavered with exposure, producing an effect that was more intense and sometimes harsh or too strong.

A comparison of Vivo’s default “Vivid” mode and the “Textured” mode.

 

The Huawei Pura 70 Ultra also offers three modes: Original, Vivid, and Bright. The Apple iPhone 16 Pro Max offers six modes, or undertones, that it says are "specific to the skin undertones your camera captures": Standard (default), Neutral, Amber, Gold, Rose Gold, and Cool Rose.

The Vivo X200 Pro's Natural look provided a softer rendering, with less local and global contrast, less saturation, and a slightly cooler tone, yielding a neutral image to which users could easily apply filters from social media or other third-party apps to personalize their photos.

Vivo says that its Textured mode is inspired by postwar humanist photography, adding drama to the overall image with a generally lower target exposure for a darker, contrast-heavy result.

Let’s see how the X200 Pro’s modes compared with the skin tone renderings of the iPhone and the Huawei for the following portrait:

Portrait
Default modes for our comparison devices
Huawei Pura 70 Ultra – Original
Vivo X200 Pro – Vivid
Apple iPhone 16 Pro Max -Standard
Vivo X200 Pro's 3 styles
Vivo X200 Pro – Vivid
Vivo X200 Pro – Natural
Vivo X200 Pro – Textured

 

Apple iPhone 16 Pro Max's 6 undertones
Apple iPhone 16 Pro Max – Standard
Apple iPhone 16 Pro Max – Neutral
Apple iPhone 16 Pro Max – Rose Gold

 

Apple iPhone 16 Pro Max – Cool Rose
Apple iPhone 16 Pro Max – Gold
Apple iPhone 16 Pro Max – Amber

 

Huawei Pura 70 Ultra's 3 styles
Huawei Pura 70 – Original
Huawei Pura 70 – Vivid
Huawei Pura 70 – Bright

 

In a one-on-one comparison between the Vivo X200 Pro and the Apple iPhone 16 Pro Max, our study of local participants showed that the renderings of the Vivo X200 Pro’s Vivid mode had a tangible edge over the iPhone 16 Pro Max’s Neutral rendering.

Still, it should be noted that 12 of the 30 participants individually ranked one of the iPhone's modes higher than or equal to the Vivo X200 Pro's Vivid mode.

When we added the Huawei Pura 70 Ultra's renderings into the mix, for a total of 13 tested renderings, four people ranked one of the iPhone modes as No. 1, while one participant ranked Vivo's Vivid mode as No. 1. Overall, our blind comparison study showed that the Chinese consumers surveyed preferred Huawei's Original and Bright modes over the Vivo X200 Pro regardless of its mode, putting the X200 Pro behind the Huawei's rendering.

Comparing user satisfaction on average

 

Calculating the Satisfaction Index

Consumers were asked to provide feedback on the photos. The survey was divided into two parts:

1. Blind pairwise comparison: Participants were asked to choose between two photos of the same scene taken with different cameras. They took part in successive side-by-side comparisons between two of the eight renderings of a scene until a consistent JOD (Just Objectionable Difference) scale was achieved across all eight images (from seven smartphones plus one professional camera).
2. Photo series rejection: Participants were shown multiple photos of the same scene and asked to identify which ones they disliked or would not post on their social media.

This two-step survey allows us to collect the following information for each scene:
• the overall rejection rate for all respondents
• the rejection rate for the group being studied
• the JOD scale

A Satisfaction Index is then calculated for each picture, allowing us to determine user preferences and more. To delve deeper, we included a questionnaire asking our panelists to specify why they might reject a particular picture. To learn more about the Satisfaction Index, read Smartphone portrait photography: How did we measure user preference?
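To make the two-step aggregation concrete, here is a purely hypothetical sketch of how a per-image index could combine the two survey outputs, a JOD preference score and a rejection rate. The actual Satisfaction Index formula is not disclosed here, so the normalization and weighting below are assumptions for illustration only.

```python
# Purely hypothetical sketch: combining a JOD preference score with
# a rejection rate into a single per-image index. The real
# Satisfaction Index formula is not disclosed; this weighting is an
# assumption for illustration only.
def satisfaction_index(jod, rejection_rate, jod_max=8.0):
    preference = jod / jod_max          # normalize JOD to 0..1
    return 100 * preference * (1 - rejection_rate)

# A well-liked rendering: high JOD, rarely rejected.
print(round(satisfaction_index(6.4, 0.10), 1))  # 72.0
# A divisive rendering: decent JOD but often rejected.
print(round(satisfaction_index(5.0, 0.45), 1))  # 34.4
```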

 

Sharp landscape images

The quality of landscape photos depends a lot on the details captured.

In technical terms, edge acutance, which refers to how sharp and clear the edges in an image appear, is an important attribute for landscape shots of cityscapes and architecture. Edge acutance measures how quickly the brightness changes from dark to light at the edges of objects in a photo. The higher the edge acutance, the crisper and sharper the image appears.
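To illustrate the idea (this is a generic sketch, not DXOMARK's measurement protocol), acutance can be approximated from a 1D edge profile as the RMS of the local gradient normalized by the mean level, so an abrupt dark-to-light transition scores higher than a gradual one with the same total rise:

```python
import math

# Generic illustration of edge acutance (not DXOMARK's protocol):
# RMS of the local gradient along an edge profile, normalized by
# the mean level. Sharper transitions yield higher scores.
def acutance(profile):
    grads = [b - a for a, b in zip(profile, profile[1:])]
    rms = math.sqrt(sum(g * g for g in grads) / len(grads))
    mean = sum(profile) / len(profile)
    return rms / mean

sharp = [0.1] * 16 + [1.1] * 16           # abrupt edge
soft = [0.1 + i / 31 for i in range(32)]  # same rise, but gradual
print(acutance(sharp) > acutance(soft))  # True
```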

The Vivo's landscape images from the main camera displayed a high level of detail, particularly in the center of the image, but with a tendency toward a visible loss of sharpness at the extreme corners. The Huawei Pura 70 Ultra and Apple iPhone 16 Pro Max did better at preserving detail throughout the image, especially in outdoor conditions.

Lab measurements showed that the Vivo X200 Pro’s edge acutance was nearly on the level of the Huawei and Apple, but the real-life photos helped to distinguish the difference among the devices in more detail.

 

Huawei Pura 70 Ultra
Vivo X200 Pro
Apple iPhone 16 Pro Max

Landscape photos produced vivid and pleasant colors. In cloudy weather, the Vivo X200 Pro's white balance tended to lean slightly toward green, while the comparison devices were more blue in tone.

But in sunny conditions, the X200 Pro’s colors were a bit more saturated than the colors on the comparison devices.

Intrusive flare

The device’s weak point was the presence of the artifact of flare, which was widely discussed on Chinese social media because of the flare’s obvious impact on image quality.

DXOMARK was able to precisely evaluate the characteristics of flare on its dedicated flare setup in its laboratory. With the device held at various angles to the light source, either in the field of view or not, we were able to reproduce the issues spotted by consumers in very repeatable conditions and measure the flare. Other devices were also evaluated under exactly the same testing conditions for comparison.

Flare was significantly more visible than on some competitors, mainly when the light source was at particular angles. The presence of flare is mainly linked to the device's optical design and coating, and the chances of fixing it with a software solution are relatively low.

In our tests, all devices were vulnerable to flare in certain conditions; however, it was more noticeable on the Vivo and the iPhone.

The Vivo X200 Pro experienced more flare than the Huawei Pura 70 Ultra for light sources at 50-degree and 70- to 75-degree angles. The Vivo's flare remained significantly high even when the light source moved out of the field of view, while the Huawei Pura 70 Ultra's flare decreased faster as the angle of the light source increased.

The following graph depicts the device’s flare behavior at various other angles to the light source.

 

Capturing the moment

How good is the Vivo X200 Pro at capturing the moment? In our tests, we measured a slight shutter lag (the delay between triggering the shutter and the actual image capture) on the X200 Pro across varying lighting conditions. When the delay exceeds 100 milliseconds, users can easily notice it when photographing moving subjects, which affects the user experience. The following example shows the milliseconds to capture after the shutter button is pressed.
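The shutter-lag criterion can be stated as a simple rule; the timestamps in this sketch are made-up examples, not measured values from any device:

```python
# Simple illustration of the shutter-lag criterion above: lag is the
# delay between the shutter trigger and the actual capture, and
# delays beyond roughly 100 ms become noticeable with moving
# subjects. The timestamps here are made-up examples.
def shutter_lag_ms(trigger_ms, capture_ms):
    return capture_ms - trigger_ms

def is_noticeable(lag_ms, threshold_ms=100):
    return lag_ms > threshold_ms

print(is_noticeable(shutter_lag_ms(0, 65)))   # False
print(is_noticeable(shutter_lag_ms(0, 140)))  # True
```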

 

Color adjustment technology: a lack of consensus in the smartphone industry https://www.dxomark.com/color-adjustment-technology-a-lack-of-consensus-in-the-smartphone-industry/ Mon, 21 Oct 2024 13:56:45 +0000

The post Color adjustment technology : a lack of consensus in the smartphone industry appeared first on DXOMARK.


Most premium smartphones today come equipped with a feature that adjusts the display's Correlated Color Temperature (CCT) based on ambient lighting. Since ambient light color can vary depending on the type of light bulb (warm, neutral, or cold white LED; incandescent; etc.) or the time of day (for natural light), this technology aims to create a more natural viewing experience by shifting the screen's color temperature closer to that of the environment. CCT adaptation can help minimize cognitive dissonance, a type of discomfort or tension that viewers can experience when what they expect to see on the display does not match what they actually see.

Some manufacturers even suggest that their displays are "paper-like," providing the viewer with a readability experience that resembles the naturalness of paper. That said, this feature is rarely activated by default, with the notable exception of True Tone on the iPhone.

Apple True Tone: The iPhone screen automatically adapts to ambient light conditions to optimize color display according to the environment.

 

Honor Natural Tone: Automatically adjust color temperature based on ambient lighting for a consistent, paper-like viewing experience.

 

Device | CCT adaptation feature | Activated by default
Apple iPhone 16 Pro Max | True Tone | Yes
Honor Magic6 Pro | Natural Tone | No
Samsung Galaxy S24 Ultra | Adaptive Color Tone | No
Xiaomi 14 Ultra | Adaptive Colors | No
Oppo Find X7 Ultra | Natural Tone Display | No
Vivo X100 Pro | Color Temperature Adjustment | No
Google Pixel 9 Pro | None | N/A
Huawei Mate 60 Pro+ | None | N/A

 

As part of our constant exploration of features contributing to user comfort, we compared color adaptation to ambient lighting of several flagship devices: the Apple iPhone 16 Pro Max, Honor Magic6 Pro, Xiaomi 14 Ultra, Vivo X100 Pro, Oppo Find X7 Ultra, and Samsung Galaxy S24 Ultra.

Lab Findings on Color Temperature Adaptation:

Although each manufacturer says that their colorimetry is optimized, our study showed that the experience was quite different with each device when it came to adapting to ambient lighting.

To gain deeper insights, we conducted a series of laboratory tests:

    • Evaluation in a use-case office environment (mixed lighting)
    • Evaluation in our perceptual booth with controlled lighting conditions, CCT ranging from 2700K (warm white) to 10000K (extreme case of bluish white, not reflecting natural lighting)

Here are the key observations:

    1. No Consensus Among Devices on CCT: 
      A notable observation was that the smartphones all had different CCT values, regardless of the lighting conditions. Each device had its own response to ambient light, resulting in varying color tones from one device to the next, which could affect the user experience, for example, when switching between devices. OEMs have undoubtedly studied user preferences regarding CCT adaptation; however, their conclusions differ widely, as if preferences depended more on the brand than on the users.
    2. CCT Never Drops Below 5000K: 
      None of the devices we tested provided a viewing experience fully consistent with ambient lighting. However, there is one point on which all OEMs agree: a CCT under 5000K is not pleasant. Even in the warmest tested environment, all of the smartphones kept their CCT above 5000K. In particular, the viewing experience remains very far from a paper-reading experience.
    3. No adaptation from the Samsung Galaxy S24 Ultra: 
      One of the most striking findings was the behavior of the Samsung Galaxy S24 Ultra. Although the rendering visibly changes when the feature is activated, dropping from 7000K to 6000K, the device consistently displayed the latter CCT in all tested conditions, showing no signs of adapting to the ambient light. This points to a possible hardware or software issue, because the device did not adjust in scenarios where other smartphones at least attempted some level of adaptation.
    4. No Adaptation to Unnatural Lighting Conditions: 
      It's no surprise that most devices will not adapt to an extreme 10000K lighting condition, which is rarely encountered in natural or artificial lighting. However, we wanted to test the limits of these devices to see how far they would adjust. Most smartphone displays demonstrated a bluish-white adaptation up to around 7500K. A notable exception was the Vivo X100 Pro display, which adapted as high as 9000K, delivering a consistently smooth viewing experience. While this is a notable achievement for the Vivo device, most people will not notice it.
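The behavior our measurements suggest can be summarized as a clamp, sketched below. This is a descriptive model of what we observed, not any vendor's actual algorithm, and the floor and ceiling values are approximations drawn from the findings above:

```python
# Descriptive model of the observed behavior, not any vendor's
# actual algorithm: display CCT tracks ambient CCT but never drops
# below ~5000K, and most devices stop following above ~7500K
# (the Vivo X100 Pro tracked up to ~9000K).
def display_cct(ambient_cct_k, floor_k=5000, ceiling_k=7500):
    return max(floor_k, min(ambient_cct_k, ceiling_k))

print(display_cct(2700))   # 5000: never warmer than the floor
print(display_cct(6500))   # 6500: tracks ambient
print(display_cct(10000))  # 7500: stops adapting
```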

Conclusion: 

While many premium smartphones advertise advanced ambient light adaptation technology, our tests suggest that there is still significant room for improvement in how devices handle color temperature adjustments. The choice to stay above 5000K, inconsistent behavior in warm lighting, and hardware or software limitations, such as the static CCT on the Samsung Galaxy S24 Ultra, highlight the need for further refinement in this area.

For users seeking the most natural and comfortable viewing experience, these differences in CCT adaptation could play a crucial role in their overall satisfaction with a device. Manufacturers will need to focus more on fine-tuning these algorithms through user-preference studies (which might differ regionally, as TV white-point settings do) to ensure that their devices can more accurately provide a comfortable experience in real time.

While all these devices promise to bring the best experience to the user, it is nonetheless surprising that there remains no technical consensus on adaptation. Who do you think provides the best CCT adaptation?

Feature focus: diving into the Apple iPhone 16 series undertones functionality https://www.dxomark.com/feature-focus-diving-into-the-apple-iphone-16-series-undertones-functionality/ Fri, 18 Oct 2024 09:27:54 +0000

The post Feature focus : diving into the Apple iPhone 16 series undertones functionality appeared first on DXOMARK.


The user experience is a key focus in all of DXOMARK's testing, and portrait photography from smartphones has recently been a topic of great interest at DXOMARK. In addition to testing, we have also been running various Insights across the globe about user preferences when it comes to portraits. While our Insights documented clear regional preferences, our studies also identified that in each region, individual factors such as age or photography experience had a significant impact on preferences, indicating that the more users can truly individualize their photos, the more satisfied they will be with them.

With its latest flagship, the iPhone 16 Pro Max, and the other 16-series models, Apple has introduced an innovative feature that addresses this very topic by adding more base tones to its "Photographic Styles." The addition of tones such as Cool Rose, Neutral, Rose Gold, Gold, and Amber can give images a very personalized look via tone and warmth. Apple's website says that "the Photographic Style you select will be specific to the skin undertones your camera captures," and that "you can make adjustments to it in Camera or edit it in the Photos app."

We ran a further study on this new feature to see how it works from a technical standpoint. We also wanted to see how the feature would resonate with a small group of everyday consumers.

How we tested the undertones feature

In a very informal two-day experiment that applied the principles of our Insights methodology (but on a smaller scale), we gathered 10 people in Paris representing a range of skin tones to test what they thought of their portraits when applying these undertones to their photos and asked them to choose their preferred rendering using a blind test comparison method.  The experiment involved shooting 30 scenes in which we took individual and group photos using each of the iPhone’s new undertone settings and compared them with the standard image results from the Samsung Galaxy S24 Ultra (released in January) and the Google Pixel 9 Pro XL (released in August), the flagship phones from the iPhone’s closest competitors.

We asked the participants to indicate their preferred image twice: once on the smartphone screen immediately after capture and then on a computer screen. For the latter, participants viewed the HDR pictures under standardized conditions (ISO-22028-5) with compatible HDR screens. In addition to being displayed according to each manufacturer’s HDR settings, the pictures were also shown at smartphone-size dimensions.

What happens to the color of the image when the new photographic styles are applied? In our brief experiment, we saw that there was no significant impact on the face brightness and that the undertone modified not just the skin tones, but also had a global impact on colors in the image to a lesser extent.

For all images in the study, we were able to measure how skin tone (on cheeks and forehead) is rendered, and we also included an 18% gray patch in part of the images to measure the impact of different undertone settings on an image’s white balance. In this way, we were able to measure the degree to which each style changed the image’s grays and skin tones from the standard. We averaged indoor scenes with different skin tones to show the impact the feature had on the colors in the image.
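The white balance measurement described above can be sketched in a few lines. This is our own simplified illustration, not DXOMARK's actual tooling: once the gray patch's chroma coordinates are measured (here generic `(ct, cp)` values, echoing the ICtCp space we use for HDR images; the RGB-to-ICtCp conversion itself is omitted), the white balance shift introduced by an undertone setting is simply the patch's distance from the neutral point.

```python
import math

# Hedged sketch: summarize a white balance shift as the chromatic distance
# of an 18% gray patch from neutral. The (ct, cp) coordinate names are an
# assumption for illustration; a neutral gray sits at (0, 0) in the chroma plane.
def chroma_shift(ct: float, cp: float) -> float:
    """Distance of a measured gray patch from the neutral point (0, 0)."""
    return math.hypot(ct, cp)
```

A larger value means the undertone setting pulled the image's grays further from neutral.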

The following graph is a chromaticity map (in the ICtCp color space that we use for HDR images). It shows how the different devices and the iPhone’s undertone settings render gray areas in the image, as well as the skin tones measured on the models’ skin in these scenes. What was striking was how much the color of these new tones deviated, particularly on the skin tones. The graph shows the magnitude of the undertone settings’ impact on color, comparing the gap in skin color among the standard iPhone, Google, and Samsung renderings with the gap among all the undertone renderings.

The chart illustrates the effect and divergence of the undertones on the skin tone compared with the other colors in an image.

In our Insights studies, we observed that regardless of the region, there were scenes in which color played a deciding role in users’ preferences. We believe that Apple’s new undertone settings would have affected those user-preference rankings.

We also observed in our Insights the large role that exposure played in users’ preference rankings. This aspect remains unchanged across the different undertone settings, so in cases where exposure was the main criterion, users’ pain points are unlikely to be resolved solely with the default undertone modes. Exposure can be adjusted, but only by diving deeper into the feature and playing with the tone pad.

The tone pad on the Apple iPhone 16 Pro Max.

 

Scene evaluation example 1

 

Scene evaluation example 2

 

What did our small group of individuals prefer?

The results of our experiment in Paris showed that our 10 participants generally preferred the Apple iPhone’s Standard rendering as well as the Cool Rose, Neutral and Rose Gold tones over the default tones of the Samsung Galaxy S24 Ultra and the Google Pixel 9 Pro XL.

 

The consumer satisfaction (the percentage of scenes for which the Satisfaction Index was higher than 70) among the four preferred iPhone tones was generally equal. What was more revealing was the overall low preference for the iPhone’s Gold and Amber tones in our test group, even though some individuals selected those renderings as the best in certain circumstances.
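The parenthetical definition above (the percentage of scenes for which the Satisfaction Index exceeds 70) is straightforward to compute; a minimal sketch, with the threshold and function name as our own conventions:

```python
def consumer_satisfaction(satisfaction_indices: list[float], threshold: float = 70.0) -> float:
    """Percentage of scenes whose Satisfaction Index exceeds the threshold."""
    above = sum(1 for s in satisfaction_indices if s > threshold)
    return 100.0 * above / len(satisfaction_indices)
```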

It was also interesting to note that undertone preferences differed in some cases when viewing an image directly on the device versus on a computer screen, confirming that the monitor on which an image is viewed can influence a user’s preferences.

Scene evaluation example 1

 

Scene evaluation example 2

When looking at the results per individual, the preferred renderings varied widely, sometimes even from scene to scene, showing that sometimes there was no dominant preference even for a single individual. The ability to experiment with and fine-tune an image’s tones before or after capture on a smartphone shows the degree of importance that some manufacturers are now placing on individual preferences. Providing individuals with the tools necessary to personalize every image the way they want is sure to lead to a more satisfying consumer experience.

Advanced mobile photographers were likely already using third-party apps such as Lightroom Mobile or other filter apps to personalize their images, but we’ve rarely seen a feature so deeply integrated into a device’s default camera app that it opens this degree of personalization to every user.

The iPhone 16 Pro’s undertone feature, however, did show some limitations when it came to group photos. In photos with multiple people, our experiment showed that the iPhone 16 Pro’s Standard and Neutral tones were almost as preferred as the images from the Google Pixel 9 Pro XL and the Samsung Galaxy S24 Ultra.

But when questioned separately, each person in the photo had a different preference, and the current feature does not appear to allow fine-tuning specific portions of the photo with different undertones.

 

In our Insights study in Paris last year, we identified specific indoor scenes on the iPhone 14 Pro Max that were clearly not the favored renderings, such as the example below.

Apple
Google
Samsung

In this example, the subject in the scene preferred the Google rendering, while most other participants preferred the Samsung rendering. Looking at the measurements for these images, we see that face exposure was similar between the iPhone and Google, but that the skin-tone color measurement showed a more saturated rendering for the iPhone.*

*The Insights Paris study was performed in SDR format at that time, which explains the use of L*a*b* color space in this case.

For scenes like the one above from our Paris Insights study, in which the iPhone 14 Pro Max’s rendering was not preferred (most likely because of color), our recent small study makes us confident that the iPhone 16 Pro Max’s undertones would have been both significant and subtle enough to provide a rendering with much higher satisfaction.

The need for further study 

It has been over a month since the iPhone 16 series was released, and it will take some time to see how people end up using these undertone features and how they impact user satisfaction.

But our small experiment in Paris showed that people have varying degrees of individual preferences depending on the scene, a finding that corresponds with the results from our much larger Insights study in Paris, which showed the huge impact that undertone settings can have on a user’s satisfaction with a portrait.

This indicates that smartphones that give users the tools to adjust for those preferences will always hold a slight advantage over products that don’t.

In addition, some participants changed their minds about their preferred undertone renderings, choosing one after viewing the images “live” on the devices but choosing another undertone preference when viewing the image later in standardized conditions. The benefit of Apple’s undertone feature is that it gives users the means to change their minds about their preferred renderings at any moment —  before or after capture — and regardless of the visualization conditions. This is a step in the right direction when it comes to increasing consumer satisfaction.

In its quest to continuously improve the end-user experience, DXOMARK will continue to monitor and measure the impact of user preferences on the innovative features from the latest flagship devices.

 

We found out what Chinese consumers really think of HDR smartphone portraits https://www.dxomark.com/we-found-out-what-chinese-consumers-really-think-of-hdr-smartphone-portraits/ https://www.dxomark.com/we-found-out-what-chinese-consumers-really-think-of-hdr-smartphone-portraits/#respond Tue, 03 Sep 2024 20:46:34 +0000 https://www.dxomark.com/?p=177748 Following our research in Paris, DXOMARK conducted its Insights study about smartphone portrait photography in Shanghai, China. The goal of the study remains to understand user preferences, expectations and pain points with portraits captured in everyday scenarios. At DXOMARK, we observe and analyze any technologies and advancements that are meaningful in improving for the user [...]

The post We found out what Chinese consumers really think of HDR smartphone portraits appeared first on DXOMARK.

]]>

Following our research in Paris, DXOMARK conducted its Insights study about smartphone portrait photography in Shanghai, China. The goal of the study remains to understand user preferences, expectations, and pain points with portraits captured in everyday scenarios. At DXOMARK, we observe and analyze technologies and advancements that meaningfully improve the user experience. When it comes to smartphone photography, we naturally turned our attention and research to the HDR format, a feature that all manufacturers are integrating into various products, with a particular focus on portraits, one of the most common but challenging use cases.

In this article, our findings highlight the image quality attributes that drive the preferences of Chinese consumers.

FOUR KEY TAKEAWAYS

  1. Pictures taken with Huawei and Vivo flagships are generally preferred by the panel in most lighting conditions, surpassing even the photographer’s rendering.
  2. HDR imaging is a game changer, but the formats are not perfected, and brands are still exploring different strategies.
  3. A brighter face rendering is generally preferred by Chinese consumers, but only to a point (the skin tone should not look too oily or too bright).
  4. Flagship devices still face issues when it comes to specific use cases.

 

Methodology in brief

This DXOMARK Insights study on smartphone HDR portrait photography was conducted to evaluate the perceived quality of images captured in HDR formats, rather than their playback qualities on proprietary smartphone screens. We applied the same methodology used in our Insights Portrait study in Paris, with a focus on the visualization of HDR images under standardized conditions.

The study involved seven of the latest flagship devices popular in China: Apple iPhone 15 Pro Max, HONOR Magic6 Pro, Huawei Pura 70 Ultra, OPPO Find X7 Ultra, Samsung Galaxy S24 Ultra, Vivo X100 Pro, and Xiaomi 14 Ultra.  A full-frame mirrorless professional camera was also included for comparison, with images edited by a professional photographer using an HDR pipeline in Photoshop.

A group of 80 consumers, who also served as models in the photos, and 10 local professional photographers participated in the survey. The aim was to capture the general public’s diversity in age, gender, and skin tone.

See below for more details on how we conducted this survey.

A guide to our China study

The HDR experience, from capture to display

HDR stands for High Dynamic Range. It describes a scene with a high ratio between its brightest and darkest parts.

The term “HDR” can be used in different contexts:

  • An HDR scene is a “real world” setting with a high ratio between its brightest and darkest parts, that is, a large difference in the amount of light between bright and dark areas.
  • Image capture HDR technologies allow smartphone cameras to create images that retain details in both shadows and highlights that might otherwise be lost, partly because smartphones have smaller sensors with less dynamic range than a full-frame professional camera and so cannot capture the whole dynamic range of an HDR scene in a single exposure. This is achieved by combining multiple shots taken at different exposures into a single image, resulting in better contrast and color accuracy. This technology is especially useful in scenes with high contrast, such as landscapes or backlit portraits.
  • This captured image is then stored in a file whose type, format, and bit depth determine how much the captured scene’s dynamic range is compressed in the file: for instance, dynamic range is more compressed in 8-bit .jpeg files (the common SDR image format) than in a 10-bit .heif file. When referring to HDR formats, we usually mean a file of 10 bits or more, or an 8-bit JPEG file complemented with metadata called a “gain map” that describes, in terms of local brightness, how to display the image on an HDR display.
  • HDR displays increase brightness to levels of 1,000 nits or more, up from the 200 to 300 nits of Standard Dynamic Range. This allows images to be displayed with higher overall brightness and dynamic range, and therefore improved contrast, but it requires the specific file formats or image metadata mentioned above to optimize image rendering on the displays.
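The gain map mechanism mentioned above can be sketched roughly as follows. This is our own simplified illustration, assuming a per-pixel gain expressed in stops and a display-reported headroom; real gain-map formats carry additional metadata (minimum/maximum content boost, encoding gamma, and so on) that we omit here.

```python
# Hedged sketch: reconstruct an HDR rendition from an SDR base image plus
# a per-pixel gain, clamped to the headroom the HDR display can show.
# Function and parameter names are our assumptions for illustration.
def apply_gain(sdr_linear: float, gain_stops: float, display_headroom_stops: float) -> float:
    """Boost a linear SDR pixel value by a per-pixel gain (in stops),
    limited by the display's available headroom."""
    boost = min(gain_stops, display_headroom_stops)
    return sdr_linear * (2.0 ** boost)
```

On an SDR display the headroom is zero, so the boost collapses to nothing and the image falls back to its SDR rendering, which is what makes the format backward-compatible.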

The third and fourth aspects mark key differences between the Shanghai and Paris Insights, with the study in China placing more emphasis on the HDR experience in flagship devices, especially by using images in their HDR formats. Some are “in-house” manufacturer-specific HDR formats; others are more publicly available. As a new and fast-developing technology, not all HDR formats can yet be visualized outside the OEM’s ecosystem, so when images are shared (through social media or messaging apps), they are often converted or viewed as SDR images (as JPEGs).

The primary objective of this DXOMARK Insights study on smartphone HDR portrait photography was to assess the perceived quality of HDR images captured on smartphones, focusing on their HDR formats rather than the playback quality on proprietary smartphone screens. Consequently, all participants viewed the HDR pictures under standardized conditions (ISO-22028-5) with compatible HDR screens. In addition to being displayed according to each manufacturer’s HDR settings, the pictures were also shown at smartphone-size dimensions.

In addition, due to the technical constraints in displaying HDR content on the web, please note that the photos used in this article are for illustration only. They are shown either as SDR versions or with a linear tone mapping applied to approximate the appearance of HDR on displays with limited brightness; we indicate when that is the case. The overall brightness differences between images are therefore roughly preserved relative to their original HDR versions.
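The linear tone mapping described above amounts to one global scale factor applied to every pixel, so the brightness ratios between pixels (and between images) survive the conversion. A hedged sketch, with the nit values and function names as our own assumptions:

```python
# Illustration only: map HDR luminances (in nits) into a limited SDR range
# with a single global scale, preserving relative brightness between images.
def linear_tone_map(hdr_nits: list[float], sdr_peak: float = 250.0, hdr_peak: float = 1000.0) -> list[float]:
    """Linearly scale HDR pixel luminances so the HDR peak fits the SDR peak."""
    scale = sdr_peak / hdr_peak
    return [min(v * scale, sdr_peak) for v in hdr_nits]
```

Because every pixel is divided by the same factor, an image that was twice as bright as another in HDR remains twice as bright after the mapping, which is the property the comparisons in this article rely on.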

Read our DXOMARK Decodes: Understanding HDR imaging. 

The technical framework

The shooting plan was designed to feature the common use cases of Chinese consumers, such as portraits taken in restaurants or against the backdrop of the Shanghai skyline. A total of 400 scenes, featuring a combination of staged settings and 80 models, were photographed under various setups and lighting conditions.

The seven latest flagship smartphones popular in China were used (with public firmware versions available in China when we took images in Shanghai at the end of May 2024) as well as a full-frame mirrorless camera for comparison; the camera’s renderings were edited by a professional photographer using the HDR pipeline in Photoshop.

The models were asked to provide feedback on the photos of themselves. The panel consisted of 80 models and 10 professional Chinese photographers. The survey was divided into two parts:

  1. Blind pairwise comparison: Participants are asked to choose between two photos of the same scene taken with different cameras. They take part in successive side-by-side comparisons between 2 of the 8 renderings of one scene until they achieve a consistent JOD (Just Objectionable Difference) scale across all 8 images (from 7 smartphones + 1 professional camera).
  2. Photo series rejection: Participants are shown multiple photos of the same scene and asked to identify which ones they dislike or would not post on their social media.

This two-step survey allows us to collect the following information for each scene:

  • the overall rejection rate for all respondents
  • the rejection rate for the group being studied
  • the JOD scale

A Satisfaction Index is then calculated for each picture, allowing us to determine user preferences and more.
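The JOD scale above is derived from the blind pairwise choices. DXOMARK's exact pipeline is not published, but the general family of methods (Thurstonian scaling) can be sketched: the fraction of observers preferring one rendering over another is mapped through an inverse normal CDF, conventionally normalized so that a 75% preference corresponds to a difference of 1 JOD. The code below is our illustration under those assumptions.

```python
from statistics import NormalDist

# Hedged sketch of Thurstonian scaling for a single pair of renderings.
def jod_difference(p_pref: float) -> float:
    """JOD-style scale difference implied by the fraction of observers
    preferring rendering A over rendering B (0 < p_pref < 1)."""
    nd = NormalDist()
    return nd.inv_cdf(p_pref) / nd.inv_cdf(0.75)  # 75% preference -> 1 JOD
```

A full scale across all eight renderings would jointly fit one score per rendering to all pairwise probabilities; this snippet shows only the pairwise building block.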

To delve deeper, we included a questionnaire asking our panelists to specify why they might reject a particular picture.

To learn more about the Satisfaction Index, read Smartphone portrait photography: How do we measure user preference?

Participants who are typical consumers in China

We gathered 80 consumers, who also served as the models in the photos, and 10 local professional photographers for the survey. The goal was to reflect the general public’s diversity in age, gender, and skin tone. Using scientific studies on skin-tone distribution in Shanghai[1] and measuring the skin-tone reflectance of our models, we defined six distinct skin-tone groups and built our panel.

The six representative skin tones that formed the panel from darkest (far left) to lightest.

The satisfaction ranking in China

How did the most popular devices in China perform? Let’s take a closer look at the results.

DXOMARK experts developed the Satisfaction Index, a metric that quantifies user preferences when viewing an image and measures the level of satisfaction of respondents. It takes into account several factors and is scored on a scale of 0 to 100, with 0 indicating that the image was rejected by more than 50% of respondents and 100 indicating no rejection at all.
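One plausible mapping consistent with that description is linear in the rejection rate; the sketch below is our own hypothetical reading (the actual DXOMARK formula also factors in the JOD results and other criteria).

```python
# Hypothetical illustration: 0% rejection -> 100, >=50% rejection -> 0,
# linear in between. Not DXOMARK's actual formula.
def satisfaction_index(rejection_rate: float) -> float:
    """Map a rejection rate (0.0..1.0) to a 0..100 satisfaction score."""
    return max(0.0, 100.0 * (1.0 - rejection_rate / 0.5))
```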

The overall mean Satisfaction Index for all the portrait pictures reviewed in Shanghai is a high 87.

The pictures taken with the flagship devices are generally well-received by Chinese users. The Huawei Pura 70 Ultra and the Vivo X100 Pro are solid performers in all conditions, with overall Satisfaction Index scores of 93 and 90, respectively. They are even preferred over the photographer’s renderings.

Our survey found that an astounding 76% of the respondents preferred images from the Huawei device, ranking it No. 1 on average for all 400 scenes, followed by Vivo at 11%.

In outdoor conditions, the Huawei Pura 70 Ultra is the preferred choice by a significant margin, achieving a Satisfaction Index of 98, 15 points more than its closest competitor. The Huawei Pura 70 Ultra and the Vivo X100 Pro also perform well in low light and night conditions, taking the leading positions as the most-favored devices, with the Samsung Galaxy S24 Ultra also ranking highly.

Overall, the level of satisfaction remains lower in challenging conditions such as low light and night-time scenes even as manufacturers continue to promote image-quality improvements in those environments.

Participants are not completely satisfied with Honor Magic6 Pro’s and Oppo Find X7 Ultra’s HDR renderings: More than 20% of images achieve a Satisfaction Index below 70.

The game changer in HDR strategy: Where to brighten the image

The Chinese study confirms that every OEM has a unique approach to leveraging the extra luminance provided by HDR screens. Specifically, not all OEMs use the HDR display headroom, or the available dynamic range, in the same way.

What do we mean by “headroom”? SDR displays typically show images within a brightness range of about 200 to 300 nits, with highlights reaching the display’s maximum luminosity. In contrast, HDR displays can exceed 1,000 nits in brightness. Initial industry-standard discussions suggest setting pure white at around 200 nits and reserving the remaining luminosity (200 to 1,000 nits) for the highlights. This reserved brightness range is referred to as headroom.
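Using the figures above, the headroom can be expressed in photographic stops (doublings of brightness); a quick back-of-the-envelope sketch:

```python
import math

# With SDR white at ~200 nits and an HDR peak of ~1000 nits (figures from
# the paragraph above), the headroom is log2(1000/200) ~= 2.32 stops.
def headroom_stops(sdr_white_nits: float, hdr_peak_nits: float) -> float:
    """Display headroom above SDR white, in stops."""
    return math.log2(hdr_peak_nits / sdr_white_nits)
```

Those roughly two-plus stops are the budget the two rendering strategies described next spend differently: on highlights only, or on the subject as well.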

Currently, manufacturers employ two main strategies:

  1. Render the majority of the image content within the SDR range (200 to 300 nits) while using the headroom only for scene highlights. This approach enhances overall contrast but results in lower subject brightness.
  2. Use the headroom to enhance the face or subject brightness. This approach allocates less of the luminosity range to highlights, resulting in slightly reduced contrast but increased brightness for faces and subjects.

Oppo and Honor primarily boost the highlights, resulting in an overall lower target exposure compared to other flagships (following strategy 1 above). Additionally, Honor and Oppo do not always fully use the headroom, which means they sometimes fail to take full advantage of their HDR displays’ capabilities.

In contrast, Huawei and Apple use part of the headroom to increase the overall brightness of their images (following strategy 2 ). As a result, Huawei and Apple rank higher and are preferred over the Oppo and Honor flagships.

There is a strong correlation between user preference and face/overall image brightness. Huawei and Apple enhance subject brightness by utilizing part of the headroom, a rendering strategy that users find more appealing.

From left: Huawei Pura 70 Ultra, Apple iPhone 15 Pro Max, Oppo Find X7 Ultra, Honor Magic6 Pro.

The illustrations above show the HDR histogram for each image. The blue pixels in the images highlight the areas rendered within the headroom range of the display. As observed, Huawei and Apple boost certain parts of the face and background to enhance facial brightness and overall image brightness. Oppo and Honor, however, offer minimal boost in these areas, opting instead to partially utilize the headroom only for white parts.

From left: Huawei Pura 70 Ultra, Honor Magic6 Pro, Apple iPhone 15 Pro Max

In the example above, we can see that the Huawei and Apple devices use the headroom (highlighted in green) across most of the pixels in the image, while Honor primarily boosts the brightness of the sky in the background. These images, which are for illustration only, have a linear tone map applied to them in order to approximate the appearance of HDR.

The brighter the face the better, but how bright?

What are the main reasons a picture gets eliminated? We asked our panelists to select the images they wouldn’t post on their social media—an accessible criterion for everyone—and then provide the reasons for their rejection.

The top two reasons participants rejected images were that the image was too dark (36%) or the face was too dark (24%). Participants also rejected photos when the image was too bright (12%) or the face was too bright (12%), and when “the skin tone of the model looks unnatural” (19%). It is interesting to note that Chinese users sometimes identified images as “photoshopped” (5%) when, in fact, they were overprocessed by smartphone algorithms or had contrast problems.

This is the case in the example below. Respondents rejected the renderings of the Apple iPhone 15 Pro Max and Honor Magic6 Pro and thought the photographer’s rendering was the best image.

From left: Apple iPhone 15 Pro Max, Photographer rendering, Honor Magic6 Pro

Participants prefer brighter images, especially when it comes to faces. However, brightness has its limits. When exposure is high, skin can appear oily or shiny, especially if overexposure also causes clipping or specular reflections on parts of the face. The appearance of oily areas on the skin can lead to rejection by Chinese users. Although the Vivo device was second in the overall ranking, it was only the third preferred device for outdoor conditions. Users often perceived the face as being “too bright” in these cases, and we consistently observed an oily rendering due to reflections on the skin, which was particularly noticeable on the Vivo device.

As noted above, the following images are for illustration only and have a linear tone map applied to approximate the appearance of HDR on displays with limited brightness; the overall brightness differences between images are roughly preserved compared with their original HDR versions.

From left: Huawei Pura 70 Ultra, Vivo X100 Pro, Xiaomi 14 Ultra, Honor Magic6 Pro.

“Not white enough”: A local beauty standard persists

Another reason for rejection is that the skin tone is “not white enough” (13%), confirming that white skin remains a standard of beauty for Chinese consumers. This reason was commonly cited by both men and women, but more so by younger panelists.

This explains the rejection of the iPhone, which provides a warmer rendering of faces.

From left: Apple iPhone 15 Pro Max, Huawei Pura 70 Ultra, Xiaomi 14 Ultra, Samsung Galaxy S24 Ultra

 

Pro photographers vs. consumer expectations

Interestingly, the overall Satisfaction Index for the professional rendering is 89, which is lower than the scores of the Huawei and Vivo devices. In the Paris study, the pictures taken with a professional camera almost always achieved the highest Satisfaction Index. In Shanghai, we observed that consumer expectations can differ from those of professional photographers.

In the example below, the photographer’s rendering highlights its subject with a lot of contrast. The sense of night in the background is better preserved compared to the Huawei Pura 70 Ultra’s rendering, which has a softer overall contrast. In this case, Chinese consumers largely preferred the smartphone rendering, while Chinese professional photographers preferred the professional camera rendering.

Photographer rendering (left) and Huawei Pura 70 Ultra

When people view their portraits on a smartphone screen, they want their face to be clearly visible and the image to be bright, which might help explain why photographer renderings in this study, which are slightly darker and tend to focus on capturing a mood, get lower ratings.

“Our study shows that face visibility and user satisfaction are strongly correlated when using HDR formats, and that most people preferred images with bright faces, even if it altered the original scene’s atmosphere. Still, it is worth noting that the same is not always true for professional photographers, whose preferences are influenced more by subtle colors and contrasts than by brightness.”

Pierre-Yves Maître, Image Quality Expert at DXOMARK

Chinese consumer satisfaction is more influenced by face brightness than color, which is not the case for Chinese professional photographers.

The preference ranking between Chinese consumers and Chinese photographers changes slightly for night photography. While Chinese consumers prefer Vivo’s rendering for night pictures, Chinese photographers prefer the Huawei Pura 70 Ultra and the photographer rendering: Photographers show a stronger sensitivity to color and contrast.

The challenges of Chinese use cases for flagships

A portrait with the city skyline illuminated in the background is the most typical use case in our shooting plan. This HDR scene with a backlit model is quite challenging. It appears that this is the use case where flagship devices still face difficulties. In this scenario, the photographer’s rendering demonstrates the best approach, as smartphones struggle to properly illuminate both the subject and the environment.

From left: Honor Magic6 Pro, photographer rendering, Xiaomi 14 Ultra

Another challenge is indoor photography in restaurants or clubs with mixed lighting. Flagship devices still encounter difficulties in these situations. Professional photographers can achieve finer saturation and vibrance tuning with a localized approach, showcasing bright areas while distinguishing the face from the background in terms of brightness.

Photographer rendering (left), Honor Magic6 Pro

 

Going beyond cultural preferences

The Huawei and Vivo flagships’ renderings are widely preferred in most conditions, even over the professional photo renderings. With an impressive overall Satisfaction Index, this generation of smartphones provides a generally very satisfying experience for Chinese users.

However, flagship devices still face challenges and shortcomings with key Chinese use cases, such as indoor scenes in clubs and restaurants, and city walks at night. When examining the reasons for picture rejections, besides the universal perception that “the brighter the face, the better,” there are specific aspects of Chinese consumer preferences to consider. This study confirms that results that satisfy a professional photographer might not meet the expectations of a regular consumer using their device.

The new DXOMARK Insights study details the crucial technical factors that ensure high-quality portrait photography and enhance user satisfaction in the HDR ecosystem of every flagship device.

Based on the results observed in the various countries where the study was conducted (France, India, and China), it appears that location is among the influential factors, but not the only one, when it comes to preferences. Instead, there are a host of other factors to consider such as the user’s age, personal ideas about photography, and people’s complexions, just to name a few. In the end, preferences are very personal, and our in-depth and global studies suggest that OEMs should open the way to greater personalization and customization of their devices in all markets.

Learn more about DXOMARK Insights

[1] A review of the evidence for intrinsic ethnic differences as important determinants of skin aging and carcinogenesis: a hope for all ethnicities – Stephanie Ying Chan

https://www.dxomark.com/we-found-out-what-chinese-consumers-really-think-of-hdr-smartphone-portraits/feed/ 0 Insights_China_articlekeyvisual Skintone_range Slide19_graph Slide21_graph Slide33_heatmap (1) Slide29_heatmap Slide48_graph Slide23_3photovisual Slide45_4photovisual Slide51_graph Slide52_4photovisual Slide54_2photovisual Slide22_graphs_2 Slide55_Graphs Slide59_3photovisual 2photovisual_new