HTML: Responsive images cause performance issues on high-resolution devices

Created on 7 Mar 2019  ·  16 comments  ·  Source: whatwg/html

tl;dr: Correctly marked-up responsive images (srcset + sizes) result in high-resolution devices (2x, 3x, 4x, 5x, 6x DPR) downloading and using very large files, causing significant performance hits, especially on mobile devices on mobile networks.

Scenario

  • Layout with full-bleed (edge-to-edge) image, served in a responsive layout to all device widths.
  • srcset list includes image sources at varying widths, from small (~400px wide) to very large (~3000px wide), to account for mobile, tablet, desktop, large monitors, high-resolution laptops, etc.
  • sizes attribute marked up to spec with natural breakpoints.

Under these circumstances, a high-resolution smartphone will download larger image files to meet screen-resolution demands, thus negating the original intent (as I understand it) of the RICG responsive images work, which was to _lower_ the bandwidth hit on mobile devices when viewing images.

This scenario is further complicated by modern design trends that favor full-bleed images as described above. Because the srcset list must include very large image sources to account for large high-resolution screens, small devices with high-resolution screens have access to, and will use, these same sources, resulting in wasted bandwidth and significantly degraded performance.
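To make the failure mode concrete, here is a simplified model of the density-based selection most engines implement today. The spec leaves the actual choice up to the user agent, so this is an illustrative sketch, not any browser's real code; all names below are mine.

interface Candidate {
  url: string;
  width: number; // the w descriptor, in pixels
}

// Assumes the sizes attribute has already been evaluated against the
// viewport, yielding the slot's CSS width.
function pickCandidate(
  candidates: Candidate[],
  slotCssWidth: number,    // e.g. 360 CSS px for a full-bleed phone layout
  devicePixelRatio: number // e.g. 3 on a Pixel 3
): Candidate {
  // Each w descriptor becomes an effective density relative to the slot.
  const byDensity = candidates
    .map((c) => ({ ...c, density: c.width / slotCssWidth }))
    .sort((a, b) => a.density - b.density);

  // Take the smallest candidate whose density covers the device's DPR,
  // falling back to the largest one if none does.
  return byDensity.find((c) => c.density >= devicePixelRatio)
    ?? byDensity[byDensity.length - 1];
}

// A 360-CSS-px full-bleed slot on a 3x phone needs 1080 physical px, so
// the phone ends up with the 1568w source (density 1568/360 ≈ 4.36):
const choice = pickCandidate(
  [
    { url: "fish-1-300x200.jpg", width: 300 },
    { url: "fish-1-768x513.jpg", width: 768 },
    { url: "fish-1-1024x684.jpg", width: 1024 },
    { url: "fish-1-1568x1047.jpg", width: 1568 },
  ],
  360,
  3
);
console.log(choice.url); // fish-1-1568x1047.jpg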

Practical use case

This issue came to my attention while working on the new content editor for WordPress ("Gutenberg"). The content editor allows images to be aligned wide and full, the latter meaning the image takes up the full available width, which often means the width of the viewport.

When updating WordPress to generate appropriate srcset and sizes attributes to account for these layouts, I found significant performance regressions on mobile devices, caused by the download of larger-than-required images as dictated by the sizes attribute.

A functional example, with detailed explanation, is available here:
https://gutenberg.mor10.com/responsive-images-demo-corrected-code/

The result of this discovery was a sub-optimal workaround for WordPress core which is live on every WordPress site running version 5.0 or above: to avoid the performance hit on mobile devices, the largest image source in the srcset list is 1024px wide unless the theme explicitly states otherwise. In other words, browsers are served smaller-than-necessary images, causing a poor user experience.

A functional example of the current auto-generated output as of WordPress 5.1 is available here:
https://gutenberg.mor10.com/image-test-current/

Summation: Due to the performance issues introduced by the srcset + sizes combination, 33.3% of the web is currently shipping the wrong image sources _on purpose_ as a workaround. That said, this same issue will be experienced by anyone setting up a site with full-bleed images as described above, WordPress or not.

Possible solutions

From my perspective as a front-end developer, the ideal solution would be to amend the spec to allow developers to declare, via an attribute or similar on each individual image, the pixel density the sizes attribute should be evaluated against. Something like this:

<img 
  src="fish-1-1024x684.jpg" alt="" 
  srcset="
    fish-1-300x200.jpg 300w, 
    fish-1-768x513.jpg 768w, 
    fish-1-1024x684.jpg 1024w, 
    fish-1-1568x1047.jpg 1568w" 
  sizes="
    (min-width: 1168px) calc(6 * (100vw / 12) - 28px), 
    (min-width: 768px) calc(8 * (100vw / 12) - 28px), 
    calc(100vw - 2rem)"
  resolution="1x"
/>

Alternatively, the browser could detect available bandwidth and other factors and actively throttle the srcset list accordingly, so high-resolution mobile devices on mobile networks would receive the appropriate resolution based on available data and performance. This of course raises the question of how to measure bandwidth and download limits, especially for users on max-megabytes-per-month plans.
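As a rough illustration of what such throttling could look like from the client side today, here is a sketch using Chrome's Network Information API (navigator.connection, which is non-standard and absent from other browsers; the cap values are arbitrary assumptions, not a recommendation):

// Sketch: derive a DPR cap from connection quality. navigator.connection
// is Chrome-only and missing from standard TypeScript DOM typings, hence the cast.
function effectiveDprCap(): number {
  const conn = (navigator as any).connection;
  if (!conn) return window.devicePixelRatio; // no signal: don't cap

  if (conn.saveData) return 1; // the user explicitly asked for reduced data
  switch (conn.effectiveType) {
    case "slow-2g":
    case "2g":
      return 1;
    case "3g":
      return 1.5;
    default: // "4g" or unknown
      return Math.min(window.devicePixelRatio, 2);
  }
}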

I know this type of throttling is technically possible using client hints, but configuring client hints and server-side solutions is beyond the capacity of most CMSes and site owners, and puts the onus of making the web work as expected on the individual user.
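For completeness, this is roughly what the server side of (legacy) client hints involves, which hints at why it is beyond most CMS setups. An Express-style sketch: the Accept-CH opt-in and the DPR/Width request headers are real (Chrome-only) behavior, while pickVariant is a hypothetical helper:

import express from "express";

const app = express();

// Opt in: ask the browser to send DPR and width hints on subsequent requests.
app.use((_req, res, next) => {
  res.set("Accept-CH", "DPR, Width, Viewport-Width");
  next();
});

app.get("/images/:name", (req, res) => {
  const dpr = Number(req.get("DPR") ?? "1");     // device pixel ratio
  const width = Number(req.get("Width") ?? "0"); // intended display width, physical px
  const cappedDpr = Math.min(dpr, 2);            // server-side 2x cap
  const targetWidth = width > 0 ? (width / dpr) * cappedDpr : 1024 * cappedDpr;

  res.set("Content-DPR", String(cappedDpr)); // tell the browser the served density
  res.sendFile(pickVariant(req.params.name, targetWidth)); // absolute path in practice
});

// Hypothetical: picks the stored variant closest to the target physical width.
declare function pickVariant(name: string, physicalWidth: number): string;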

In lieu of the bandwidth-throttling suggestions above, a third option could be to put browser limits on how high the image resolution can go (a 2x limit on a 4x screen, as an example).
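Notably, both the resolution attribute proposed above and a browser-enforced ceiling reduce to the same operation: clamping the DPR that feeds into candidate selection. A minimal sketch, reusing the pickCandidate model from earlier (the ceiling value is a placeholder, not a recommendation):

const UA_DPR_CEILING = 2; // hypothetical browser default: "2x on a 4x screen"

function selectionDpr(resolutionAttr?: number): number {
  let dpr = window.devicePixelRatio;
  if (resolutionAttr !== undefined) dpr = Math.min(dpr, resolutionAttr); // per-image opt-in
  return Math.min(dpr, UA_DPR_CEILING); // UA-wide ceiling
}

// On a 4x phone with resolution="1x": min(4, 1, 2) = 1, so a ~400w source
// suffices for a 400-CSS-px slot instead of a ~1600w one.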

cc @yoavweiss @joemcgill @getsource



All 16 comments

Alternatively, the browser could detect available bandwidth and other factors and actively throttle the srcset list accordingly so high-resolution mobile devices on mobile networks would receive the appropriate resolution interpretation based on available data and performance.

^ This. IMHO, the spec is about as art-direction-ish as it should be, and all of this should be left to the browsers. But I wholeheartedly agree that this needs to be addressed.

I discussed this with @mor10 last week. It seems the ubiquity of high-resolution screens has created a need for a cap on the DPR levels browsers take into account, either implicit or explicit.

From my perspective, the main hurdle to an on-by-default cap, automatically enforced by the browser, is the lack of data on the cut-off ratio beyond which higher resolution no longer means a better user experience. If such data could be provided, that would be helpful on that front.

Barring that, the fastest route towards a solution here would be an opt-in cap (e.g. a maxresolution attribute).
Those are not mutually exclusive, and we could start out with an opt-in and change its default from the current effective infinity to a more reasonable value once we have data to back it up.

Regarding bandwidth based restrictions, it's possible that @tarunban has looked into that.

/cc @tabatkins @eeeps

If memory serves, most browsers used to cap the selection process at 2x. If browsers are no longer specifically doing this, then I would agree that a markup solution would be nice so that developers could control the resolution caps themselves. I would suggest doing this via a meta value that applied to the whole page, rather than on an individual image basis, but I could see a use case for both options.

The entire point of this feature is that browsers should be smart enough to be able to intelligently choose the "best" resolution to download; they have access to more information than the page author does, and whatever heuristics they come up with to intelligently pick between options get automatically applied to every page using this feature, rather than only being for whatever tiny % of people use a particularly good image-choosing JS library.

If browsers aren't making good decisions, that's a bug on browsers. There shouldn't be any need for the page to further intervene here. It's definitely plausible that browsers are being naive here, so please file bugs showing bad results. ^_^

First of all: I agree with @tabatkins. The spec gives browsers great power here, and with that power comes all of the responsibility. The bugs belong with them.

While you're all here though... I barfed out some initial research and thoughts re: Hi-DPR’s diminishing quality returns. https://observablehq.com/@eeeps/visual-acuity-and-device-pixel-ratio

TL;DR @joemcgill I don't remember any browser limiting DPR to 2, but by the looks of it, maybe that's not such a terrible rule of thumb to start with.
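As a back-of-envelope check on that rule of thumb (my arithmetic, not the notebook's; it assumes 20/20 acuity of roughly one arcminute and a ~160 CSS px per inch phone baseline):

const ARCMIN_RAD = (1 / 60) * (Math.PI / 180); // one arcminute in radians

// The finest physical pixel pitch worth shipping at a given viewing distance.
function maxUsefulPpi(viewingDistanceMm: number): number {
  const smallestFeatureMm = viewingDistanceMm * Math.tan(ARCMIN_RAD);
  return 25.4 / smallestFeatureMm; // convert mm pitch to pixels per inch
}

const ppi = maxUsefulPpi(300); // phone held ~30 cm away: ≈ 291 ppi
const usefulDpr = ppi / 160;   // ≈ 1.8x, close to the 2x rule of thumb
console.log(ppi.toFixed(0), usefulDpr.toFixed(1));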

I too agree browsers _should_ be doing this, but from what I see in the real world, they are not. That leaves us with an impossible dilemma: either serve gigantic images to fancy 5x screens, making people on expensive data plans pay for image data they neither want nor need, or serve too-small images to all devices, causing people on larger screens to question their eyesight.

How do we move forward in a constructive way here? Is the next step reaching out to browser manufacturers and asking them where bandwidth / dataplan throttling is in the pipeline?

Just to reiterate, this issue is currently causing 33% of the web to receive incorrectly sized images via WordPress (through no fault of WordPress). Finding a path toward a solution is non-trivial and matters for the overall UX of the web.

While you're all here though... I barfed out some initial research and thoughts re: Hi-DPR’s diminishing quality returns. https://observablehq.com/@eeeps/visual-acuity-and-device-pixel-ratio

@eeeps - That's amazing work. Thank you for that!!
My read is that the mean 25YO can't see much more than 2x, but that probably means that some chunk of that population can. Is there more data/research on the distribution within that group?

@tabatkins - thoughts on the above and the conclusion? Should browsers use that to cut-off DPR calculations at ~2x? Higher?

Is the next step reaching out to browser manufacturers and asking them where bandwidth / dataplan throttling is in the pipeline?

@mor10 - you already have 😺
As a browser implementor, I'd say the next step is to gather data on what the ideal cut-off is, which @eeeps has already started above. Once we reach conclusions on that research, I can probably send an intent to modify the current behavior.

Once Chrome ships this and proves that it's useful behavior, I suspect it won't take too long for other browser vendors to follow.

As an aside, I do think that while a default should be spec'ed, it should be overridable by the UA and/or markup.

I can imagine scenarios where the UA might desire a non-standard max DPR above 2x. I think of TVs as an example. While mobile devices (phones, laptops, watches) have a generally accepted interaction distance, TVs have less standardized use. Home viewing is one thing, but dashboards in an office or campus, or signage in a store or trade hall, have very different viewing and interaction distances. For this reason I could reasonably see a TV manufacturer wanting a 5x DPR to compensate for those viewing and interaction distances.

As an aside, I do think that while a default should be spec'ed, it should be overridable by the UA and/or markup.

I'm not sure we even want to specify a default, although we may add some data-backed recommendation of a good cut-off value, which browsers can then adopt as they see fit.

@mor10 do you have any more detailed data on the “costs” side of things?

I ran your test page through Chrome DevTools on a 1024x768px viewport at the following DPRs:

1x: 839K
2x: 1.4MB
3x: 1.4MB

...which, 1.4MB for a page with three images on it is not great, but I feel like there's a better story here about the sort of widespread performance impact that WordPress is trying to avoid by limiting images to 1024w. What would happen to a typical WP site if it adopted the new Twenty-Nineteen theme and all limits were removed, so that the largest srcset resources were the full, user-uploaded versions?
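(For anyone reproducing these numbers, a quick console sketch using the Resource Timing API; note that transferSize reads 0 for cross-origin resources unless the server sends Timing-Allow-Origin:)

const imageBytes = (performance.getEntriesByType("resource") as PerformanceResourceTiming[])
  .filter((entry) => entry.initiatorType === "img") // only <img>-initiated requests
  .reduce((sum, entry) => sum + entry.transferSize, 0);
console.log(`${(imageBytes / 1024).toFixed(0)} KB of image transfer`);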

I'll update the example with larger image sizes. The original was put together to demonstrate the negative effects of small images on wide screens.

@eeeps Here's a new post with an extended range of srcset options, most of which are wider than what WordPress now outputs. The new generated image sizes are based on this proposal which also holds the rationale for the size breakdown based on popular viewport widths.

For reference, my Pixel 3 in horizontal mode now pulls down the 2304px images for all three images in the example.

https://gutenberg.mor10.com/responsive-images-demo-extended-srcset-range/

I asked for this, or a similar feature, back in 2014 (an optimumdensity attribute, I think).

After it was shut down, I created a JS plugin for lazySizes.

@eeeps
There is also a demo page that lets you constrain the pixel density and outputs the resulting sizes. If you think this could be useful, I can tweak the tool with other images and a wider range of densities. Note: you currently need a high-density device to use this tool.

@yoavweiss
It would also be nice to tweak the source selection algorithm with this, because the simple arithmetic middle isn't that good either. My algorithm looked shit because I'm bad at math, but an algorithm that tends to select the smaller image when both candidates are high-DPI would be better, and of course it should tend to select the higher-DPI image when both are low-DPI.
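Roughly what I mean, as an illustrative sketch; the bias factors are made up and only the shape of the idea matters:

// Choosing between the two candidates that bracket the target DPR: round
// down when both are already high-density, round up when both are low-density.
function pickBetween(
  lowerDensity: number,
  upperDensity: number,
  targetDpr: number
): "lower" | "upper" {
  // Geometric mean as the neutral threshold between the two candidates.
  let threshold = Math.sqrt(lowerDensity * upperDensity);
  if (lowerDensity >= 2) threshold *= 1.25;       // both high-DPI: favor the smaller file
  else if (upperDensity <= 1.5) threshold *= 0.8; // both low-DPI: favor the sharper file
  return targetDpr < threshold ? "lower" : "upper";
}

// Between 2.13x and 2.84x candidates on a 2.5x screen: the neutral threshold
// is ≈ 2.46 ("upper"), but the high-DPI bias raises it to ≈ 3.07 ("lower").
console.log(pickBetween(2.13, 2.84, 2.5)); // "lower"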

For reference, the WordPress core ticket to find a solution has been punted to a future release: https://core.trac.wordpress.org/ticket/45407#comment:33

Hi, has anything changed on this topic?
