mpv-x86_64-20180519-git-05b392b.7z, Windows 10 (1803)
When playing a 4K HDR movie (any of them), the brighter parts of the picture appear too bright and show color banding (see picture).

It may have something to do with this commit?
https://github.com/mpv-player/mpv/commit/05b392bc949e918aaedb6383193edfd667bba646
To reproduce, I can play any HDR movie with the same result. Previous builds play fine.
Log file
Portable mpv log.txt
It has nothing to do with that commit. Try enabling dither-depth=auto?
Thanks for the suggestion. Sadly, no change. So this is a dithering issue?
And I was mistaken. I can reproduce this on the last release 2017-12-25
> Thanks for the suggestion. Sadly, no change. So this is a dithering issue?
It's either an issue with the source or an issue with the display. Since you are not using FBOs during processing, the only lossy step in the output path is the quantization, which happens twice (once to 16-bit, and once to 8-bit).
If dithering doesn't fix it, I'd begin to suspect an issue with the source. Does disabling hwdec solve anything?
My TV is a Vizio M55-E0, so it has the smartcast app. If I cast the movie, and play with the smartcast app, it plays perfectly.
I'll test with hwdec disabled. I'll also test in a different player and see if I get the same result.
Actually, looking at your screenshot again, I realized there are far worse artifacts than just the banding; specifically, it contains a lot of what I can only imagine is clipping.
You could try enabling --gamut-warning. Does that highlight anything? What about if you set --target-peak=200? Finally, what about setting --hdr-compute-peak=no?
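For reference, those diagnostic options could be tried together on a single command line (the file name here is just a placeholder):

```
mpv --gamut-warning --target-peak=200 --hdr-compute-peak=no movie.mkv
```

Trying them one at a time instead makes it easier to see which one changes the picture.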
I'm narrowing it down. I have this line in my mpv.conf, and removing it stops the color banding. I shall experiment further. Any suggestions?
```
vf=d3d11vpp=deint=yes:interlaced-only=yes:mode=adaptive
```
If removing that vf solves the issue, then I can only imagine something in your platform's d3d11va implementation is insufficiently equipped to handle high-bit-depth video.
But you could try finding another player that lets you use the equivalent of d3d11vpp post-processing and see whether it reproduces the same clipping artifacts.
Ah, yeah... that might be Windows not detecting my display as an HDR display. I figured that, using PQ, that wouldn't be a factor?
I don't think that has anything to do with it. d3d11va is a decoder, not a display engine.
You've pointed me in the direction of my GPU settings, so I changed a few of them. Changing the input range to Full made a big difference, but didn't completely solve it.
Enabling --deband just about gets me there.
OK, well this doesn't appear to be an mpv issue. So feel free to close the ticket.
Thank you for the assist, haasn
It still seems like the best solution would be not using d3d11vpp. I doubt you have any HDR interlaced sources anyway.
d3d11vpp filters are often up to the driver vendor to get right (just the interfaces are more or less specified by MS), and whether they handle things like 10-bit is up for debate (as is how they convert to 8-bit for processing if they don't). HDR content is generally 10-bit because of the added requirements of a wider range of values and such. @haasn, thank you for taking the time and helping with this one.
edit: Actually, looking at the log you can actually spot the p010-to-nv12 conversion in there :) .
```
[ 0.122][v][vf] [in] 3840x1608 d3d11[p010] bt.2020-ncl/bt.2020/pq/limited/display SP=10.000000 CL=mpeg2/4/h264
[ 0.122][v][vf] [userdeint] 3840x1608 d3d11[p010] bt.2020-ncl/bt.2020/pq/limited/display SP=10.000000 CL=mpeg2/4/h264
[ 0.122][v][vf] [userdeint] (disabled)
[ 0.122][v][vf] [d3d11vpp] 3840x1608 d3d11[p010] bt.2020-ncl/bt.2020/pq/limited/display SP=10.000000 CL=mpeg2/4/h264
[ 0.125][v][d3d11vpp] Found 1 rate conversion caps. Looking for caps=0x4.
[ 0.125][v][d3d11vpp] - 0: 0x0000001f
[ 0.125][v][d3d11vpp] (matching)
[ 0.134][v][vf] [autorotate] 3840x1608 d3d11[nv12] bt.2020-ncl/bt.2020/pq/limited/display SP=10.000000 CL=mpeg2/4/h264
```
And yes, PQ in 8bit will look bad.
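To illustrate why PQ at 8 bits bands so badly, here is a quick sketch of the SMPTE ST 2084 (PQ) EOTF. The luminance jump between adjacent codes near the top of an 8-bit signal is several times larger than the 10-bit jump at the same signal level, which is exactly the kind of gap that shows up as banding in bright scenes:

```python
# SMPTE ST 2084 (PQ) EOTF constants.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(e: float) -> float:
    """Absolute luminance in nits for a normalized PQ signal value e in [0, 1]."""
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def step_nits(code: int, bits: int) -> float:
    """Luminance difference between two adjacent codes at a given bit depth."""
    full = (1 << bits) - 1
    return pq_eotf((code + 1) / full) - pq_eotf(code / full)

# Compare the step size at roughly the same normalized signal level
# (200/255 is approximately 800/1023): the 8-bit step is far larger.
print(step_nits(200, 8))    # 8-bit step near the bright end
print(step_nits(800, 10))   # 10-bit step at the same relative level
```

This is only an illustration of the transfer function's steepness, not of mpv's internal processing.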
Yeah, but mpv is being used in Emby Theater, so it's being used for Live TV too. Deinterlacing is needed, and --deinterlace=yes is causing a lot of dropped frames. Using d3d11vpp, I don't get the dropped frames. This is another issue I've been trying to remedy: finding the right balance of settings. Nothing I've tried is perfect. I've also discovered that if I enable --deband, some of my 4K HDR movies will start dropping frames. It seems as though, for the best picture with 4K HDR content, I need to disable hwdec, d3d11vpp, and debanding. For Live TV, I need to enable hwdec=auto/d3d11va and d3d11vpp. Changing the settings each time for different content is a big problem.
And a side note: for that commit increasing the contrast, I think it may be too high. I've been running a bunch of tests (after dialing in my settings), and some aspects are noticeably very bright compared to having my TV handle the HDR.
Oh, so with the d3d11vpp enabled, it's converting to 8 bit?
@Doofussy2, for the record, you shouldn't have to disable hwdec for HDR playback. I use it constantly, as CPU decoding of HEVC is painfully slow. As long as the decoder is giving out the picture unmodified from the decoder interface, you're fine (I'm generally using d3d11va-copy for it, although I guess in many cases d3d11va itself might be OK as well - call it an "old habit").
And yes, NV12 is an 8-bit surface format, the original format on which P010 and P016 were later based. The filter functionality was converting to 8-bit. I can check later whether it's a code limitation, but I would guess it's just what that filtering interface supports.
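For anyone inspecting raw surfaces: P010 packs each 10-bit sample into the most significant bits of a 16-bit word, while NV12 holds plain 8-bit samples, so the conversion necessarily merges neighboring 10-bit values. A small sketch with synthetic values (not a real surface dump):

```python
# P010 stores each 10-bit sample in the upper bits of a 16-bit word;
# NV12 stores plain 8-bit samples, so a round trip loses 2 bits.
def p010_word_to_10bit(word: int) -> int:
    """Extract the 10-bit sample from a 16-bit P010 word."""
    return word >> 6

def ten_bit_to_8bit(sample: int) -> int:
    """Truncate a 10-bit sample to 8 bits (what an 8-bit path implies)."""
    return sample >> 2

# Four distinct 10-bit values collapse to a single 8-bit value:
samples = [512, 513, 514, 515]
print([ten_bit_to_8bit(s) for s in samples])  # [128, 128, 128, 128]
```

Those merged levels are precisely where the banding steps come from once the signal is a steep curve like PQ.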
Thank you. I have experimented with copy-back, and I get many dropped frames. I have an i7 7700K. I can play HDR content just fine with --hwdec=auto or d3d11va, or no hwdec (as long as no filters are being used). I'm struggling with this.
It's good to know that NV12 is 8-bit. That'll help me when I check the logs.
You can use auto-profiles to change the settings based on the content, FWIW
I don't think that would help. Interlaced content is the snag I need to fix. Unless a profile can be made that will only be used for interlaced content? Is that possible?
This is separate issue. I shouldn't be asking in this thread. Thanks again for the help
> Unless a profile can be made that will only be used for interlaced content? Is that possible?
Yes, unless the information mpv has isn't accurate (e.g. container-fps can be garbage). The property https://mpv.io/manual/master/#command-interface-video-frame-info does exist, so assuming it's reliable, you could do something like this:
```
[interlaced]
profile-desc=cond:get('video-frame-info/interlaced')
...options for this profile

[non-interlaced]
profile-desc=cond:not get('video-frame-info/interlaced')
...options for this profile
```
(For simplicity, I'm assuming video-frame-info/interlaced returns a boolean. I have no idea what it really returns.)
These conditions can use all available properties: https://mpv.io/manual/master/#property-list. The condition itself is a Lua expression so you can do comparisons and stuff. More info: https://github.com/wm4/mpv-scripts/blob/master/auto-profiles.lua
There's one catch though. Profiles don't get "un-applied", so in a case like this, where you have two mutually exclusive profiles, each needs to revert the stuff the other one did.
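As a sketch of that mutual reverting (assuming deinterlacing is the option being toggled; the condition syntax is from the example above):

```
[interlaced]
profile-desc=cond:get('video-frame-info/interlaced')
deinterlace=yes

[non-interlaced]
profile-desc=cond:not get('video-frame-info/interlaced')
# explicitly revert what [interlaced] set, since profiles are never un-applied
deinterlace=no
```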
edit: profile conditions are re-evaluated whenever the used properties change. Not sure but video-frame-info could change upon every frame and therefore re-evaluate the conditions constantly. In that case, video-frame-info might not be a good choice.
Thanks, Argon. I just gave this a quick test, and deinterlacing wasn't applied.
```
[interlaced]
profile-desc=cond:get('video-frame-info/interlaced')
deinterlace=yes
```
Also, with that in my mpv.conf, no log is generated. What am I doing wrong? I haven't made any profiles before now. Should they be separate from the mpv.conf?
If I use hwdec=d3d11va-copy with no deinterlacing, I get great playback. If I add deinterlace=yes, then I get lots of frame dropping. If I use hwdec=auto with deinterlace=yes, d3d11va is used, no frame dropping, but of course it gets converted to 8 bit and I get banding. hwdec=auto/d3d11va with no deinterlacing, also produces great playback, no conversion to 8 bit. Deinterlacing has to be disabled when I watch 4k HDR, regardless. Is this just my hardware, or can anyone else reproduce this? Should I start another issue?
Auto-profiles is a script you have to download first: https://github.com/wm4/mpv-scripts/blob/master/auto-profiles.lua
Oh, right. I knew I was doing something wrong. Thanks, I'll take a look
OK, put that in my scripts folder. Still no joy, and no log generated.
OK guys, the deinterlacing setting is the problem. It cannot be used with HDR 10 bit. There needs to be an intelligent config so it gets disabled when not needed. I think if I were able to use hwdec=dxva2, with vf=lavfi=yadif=field:yes, it might work. But yadif isn't working with dxva2.
I've switched back to the last release, 2017-12-25. Using hwdec=auto-copy and deinterlacing=yes, it works just fine. I guess I'll just keep using that until this is fixed.
Thanks again, guys.
> Also, with that in my mpv.conf, no log is generated.
The profile applies to everything below it. You should have that bit as the last thing in your mpv.conf; if it's anywhere else, everything below it will also get lumped in with the profile.
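A minimal sketch of that layout, reusing the [interlaced] profile from earlier in the thread (the top-level options here are just placeholders):

```
# top-level options first
hwdec=auto

# conditional profiles go last, so nothing below gets lumped into them
[interlaced]
profile-desc=cond:get('video-frame-info/interlaced')
deinterlace=yes
```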
But if the auto-detection doesn't work for you in this case, you're out of luck.
Gotcha! Thanks haasn.
Do note that the last release is in various ways sub-optimal. For example, you're not getting tone mapping with proper peak calculation unless you are using the native OpenGL context; the default D3D11 one doesn't work. Since you're watching HDR content, I would say that will affect you.
There were other improvements to tone mapping etc as well, but getting the peak brightness data is probably the most important thing in that stuff.
Thanks jeeb. I did notice the difference, and I changed my mind; I am using the latest build. A large amount of what I watch is Live TV and recorded TV, which is interlaced. And I remembered why I started using vf=d3d11vpp=deint=yes:interlaced-only=yes:mode=adaptive instead of deinterlace=yes: with the latter, I get many dropped frames. I only have a small library of 4K HDR movies, so I'm opting to use the latest build, and when I want to watch HDR, I'll just disable the deinterlacing or watch it in my portable mpv. The deinterlacing is a big problem for me. I have to use it, but it causes problems. If you guys have a solution for the deinterlacing, please let me know.