Hello,
I noticed that radv shows lower input latency than the proprietary drivers of both AMD and Nvidia in some games. I suspect radv limits the number of frames that are allowed to be rendered ahead, which should be a good thing, as the AMD DX11/9 driver on Windows does the same (Nvidia's DX11/9/OGL driver can be configured to do so).
Would it be possible for dxvk to limit the number of frames rendered ahead to one, to reduce input latency with the Nvidia driver as well?
This is already possible with the dxgi.maxFrameLatency knob, as explained on the Wiki page for the new configuration file. You'll need to run the latest master though, as the changes only landed today.
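For reference, a minimal dxvk.conf sketch (the option name is taken from this thread; see the Wiki for the full syntax and option list):

```ini
# dxvk.conf -- place it next to the game's executable,
# or point DXVK at it via the DXVK_CONFIG_FILE environment variable.

# Allow at most one frame to be rendered ahead to reduce input latency.
dxgi.maxFrameLatency = 1
```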
Thanks for pointing me to it and sorry for the inconvenience.
Have you observed any positive changes related to this?
After trying out a value of 1 in Hitman, I don't think I noticed an effect. It still feels quite laggy compared to Windows with a DX11 pre-render limit of 1, or DX12.
What's your GPU? I'm hearing a lot of complaints about exceptionally poor performance on Nvidia and Vega GPUs.
I'm on a GTX 1070 Ti, and a few days ago I had an RX 560 installed for testing purposes.
With Nvidia, I get around 70 fps in a representative scene in Hitman at 1440p. Once the shader-compile stuttering is over, it runs okay, but the input latency is quite high. I don't notice a difference between dxgi.maxFrameLatency = 1 and = 3, unlike on Windows with DX11 and the driver's pre-render limit. With radv and the RX 560, the input latency was always low without further adjustments.
With Nvidia, I tried both storing dxvk.conf in the game's folder and setting its path via a global environment variable; neither made a difference.
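Concretely, the environment-variable route looked roughly like this (the path is just an example):

```sh
# Point DXVK at an explicit config file instead of relying on
# a dxvk.conf in the game's directory.
export DXVK_CONFIG_FILE=/home/user/dxvk.conf
```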
The mouse jitter issue in wine-staging is really nasty; it clearly makes Hitman worse than on regular wine. But subjectively I'd still say that even regular wine doesn't feel as direct as Windows at similar fps (I have my window compositor disabled).
Still, it's at least very playable, and a much better choice than the OpenGL port.
You can build wine-staging without the server-send_hardware_message patchset, which should solve the mouse issue.
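If it helps, a sketch of how that exclusion is typically done with wine-staging's patchinstall.sh (the script location and paths may differ between versions; adjust to your checkout):

```sh
# Apply all staging patchsets except server-send_hardware_message
# to a plain wine source tree, then configure and build wine as usual.
cd wine-staging
./patches/patchinstall.sh DESTDIR=../wine --all -W server-send_hardware_message
```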
Yeah, I saw that, but thanks anyway. :)
Is there anything known about a problem with mouse sensitivity changing at high fps? I already noticed this a year or so ago in D3D9 games (e.g. Counter-Strike: Source or Global Offensive): when the fps gets too high, the mouse sensitivity suddenly increases a lot.
I could also provoke this in Hitman at 720p: when I look at the sky and the fps goes up, so does the sensitivity. I haven't yet analyzed what the magic fps threshold for this to happen is, or whether there is any connection to the display refresh rate.
My mouse input issues are gone with wine-staging 3.15, and other users have already confirmed this:
https://github.com/ValveSoftware/Proton/issues/147#issuecomment-418149442