Hi,
Since there are quite a number of games that are locked to 30 fps, many of which even have their game logic tied to that cap, I was thinking of solutions to get around this. Obviously, per-game hacks/mods/patches are a possibility, but those are understandably left to third parties, and they aren't a universal solution either.
As I haven't heard of a universal solution yet, I thought that maybe a universal workaround could do the trick. This is when frame interpolation came to my mind (to be fair, I was also watching some Drakengard 3 frame-interpolated gameplay in the meantime).
I do have a limited understanding of gfx programming, so this will either be just rough around the edges, or completely wrong. The idea however is as follows: we put ourselves 1 frame behind, and then interpolate between that frame and the current one, relying on the velocity data that was left behind. This should produce a better interpolation than post-production video interpolation, as the per-object velocity data is not guessed, but known. This of course comes at the expense of precision (minor artifacts will still appear) and responsiveness (not sure if it'll be 33.3 ms or 16.67 ms behind though, I'd say the former), so it should be made optional, and not the default. It essentially doubles whatever framerate one might have, although this technique suffers noticeably more under 30 fps (see video #1).
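To make the idea a bit more concrete, here's a minimal sketch (hypothetical Python/NumPy, not anything from RPCS3's actual codebase) of reprojecting the held-back frame halfway along a per-pixel velocity buffer to synthesize the in-between frame:

```python
import numpy as np

def interpolate_midframe(prev_frame, motion_vectors):
    """Reproject the previous frame halfway along its per-pixel
    motion vectors to synthesize an in-between frame.

    prev_frame:     (H, W, 3) float array, the last rendered frame
    motion_vectors: (H, W, 2) float array, per-pixel (dy, dx) motion
                    from the previous frame to the current one
    """
    h, w = prev_frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Move every pixel half of its full-frame displacement.
    mid_y = np.clip(ys + motion_vectors[..., 0] * 0.5, 0, h - 1).astype(int)
    mid_x = np.clip(xs + motion_vectors[..., 1] * 0.5, 0, w - 1).astype(int)
    mid = np.zeros_like(prev_frame)
    # Forward splat: holes and overlaps are exactly the "minor
    # artifacts" mentioned above; a real implementation would fill them.
    mid[mid_y, mid_x] = prev_frame[ys, xs]
    return mid
```

This is the easy half; the hard part (as discussed below) is getting the velocity buffer in the first place.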
Do note that, if possible, this feature should remain disabled during cutscenes. Not sure if that part can be applied universally though.
What made me confident about this whole technique however, was these two videos:
https://www.youtube.com/watch?v=swI8fb4V2c4
This one showcases the technique implemented at engine level. The 30->60 conversion looks decent enough to me, and given the resolution people usually render games at, the artifacting should be reduced even further. I'd definitely use it. Making this work well with games' built-in motion blur might need some further brainstorming though.
https://www.youtube.com/watch?v=sK8FoFzgPDw
This is an example of post-production video interpolation. Even though the video's quality is not that great, it's convincing enough to make me think this idea might be worth something.
That's it, thanks for reading. I hope these ideas make sense, at least partially.
Sounds more like a beta level feature. Could be a long time waiting. If ever.
Adding frame interpolation to an emulator is difficult, if not impossible.
It would also add input lag.
https://www.reddit.com/r/emulation/comments/7cjmol/could_realtime_frame_interpolation_ever_be_a_thing/
Your best bet would be to use a TV to get that effect.
Since RPCS3 already utilizes the GPU at an effectively safe level, this couldn't hurt if implemented on the GPU. Could it maybe be placed among the long-term goals?
@Margen67 In that thread they wrote that realtime interpolation is not quite possible, but 1-frame-behind might very well be. Also, the TV would do roughly the same, except it would take the load off the emulator and would also lose the velocity data, thus requiring more samples to render the next in-between frame.
tl;dr: as far as I can understand, it'd be essentially the same.
I don't think there's a post processor built into rpcs3, and this would basically require one.
The longterm label should be added to this issue
Due to the way the PS3 works, we only have the graphics hardware's view of the system, not the software's. That means draws have no context, and therefore no information outside of the final frame is available to play with. All you have is the final product, complete with HUD elements, etc. Frame interpolation is possible, but I'm not hopeful about TAA. In short, we can only do what a TV does to interpolate: there is no per-object information, since the hardware has no concept of an object. This is also why there is no postfx pipeline on RPCS3 (the core is actually there and used for other things like custom UI interfaces).
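For contrast with the velocity-buffer approach above, here's a rough sketch (hypothetical Python/NumPy, not RPCS3 code) of the kind of block-matching motion estimation a TV-style interpolator has to fall back on when only the two final frames are available:

```python
import numpy as np

def estimate_block_motion(frame_a, frame_b, block=8, search=4):
    """Crude block matching: for each block of frame_a, find the
    (dy, dx) offset within +/-search whose block in frame_b matches
    best (minimum sum of absolute differences). This guesses the
    motion that a velocity buffer would have given us for free."""
    h, w = frame_a.shape
    vecs = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(h // block):
        for bx in range(w // block):
            y0, x0 = by * block, bx * block
            ref = frame_a[y0:y0 + block, x0:x0 + block]
            best_err, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y1, x1 = y0 + dy, x0 + dx
                    if 0 <= y1 and y1 + block <= h and 0 <= x1 and x1 + block <= w:
                        cand = frame_b[y1:y1 + block, x1:x1 + block]
                        err = np.abs(ref - cand).sum()
                        if best_err is None or err < best_err:
                            best_err, best_v = err, (dy, dx)
            vecs[by, bx] = best_v
    return vecs
```

The estimated vectors are per-block guesses rather than per-object truth, which is exactly why HUD edges and overlapping motion smear with this class of technique.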
Also, for the sake of discussion, note that any addition would be global, i.e. there is no such thing as AA removal or cutscene detection without intrusive hacks. If something is enabled, it will affect every frame.
Tbf, some TAA variants (like SMAA T2x) only operate on the final frames, without any knowledge of geometry
Yes, but when done in-engine you usually do this before adding the HUD elements to avoid any smearing. The problem isn't that the image cannot be anti-aliased, it's that it will AA everything, including text, subtitles, health bars, etc. It always looks off even when velocity is taken into account. UI element edges tend to be partially translucent, which will mess with the velocity estimation. Temporal reprojection itself is "easily" doable though, for both TAA and frame interpolation.
Temporal anti-aliasing by design requires the frame's motion vectors, so it's not as plug-and-play as FXAA, for example, and would require RPCS3 to hack into the software to output this buffer. SMAA can be used without temporal filtering too, though (it's less clean but still better than FXAA imo).
I don't really see the need for additional AA at all, tbh. Supersampling works great for me.
I thought maybe we could utilize that extra frame for it, since:
Of course, if no velocity data is available (based on what kd-11 said), TAA is out of the question. Other methods are still a possibility, but I'm definitely not well-versed enough in the topic to pick a different one that would fit.
Actually, taking into account that a better-than-TV interpolation technique is not realistically possible, I'd rather think of this as an experiment than an actual feature request, since it's doubtful whether it would ever work well enough to be playable and enjoyable.
Slight update on frame interpolation: NVIDIA's AI-enhanced interpolation technique. It's pretty interesting stuff, although I'm not quite sure if it would be viable or even a good idea to implement here. In the video and the blog post they don't mention how many frames they look ahead either, so it might be tricky. There's a whitepaper though, which might contain some further info, although I haven't read it.
It's 2 years on and we've now got the VBlank Frequency and Clocks Scale option.
Although those two options don't work for every game, I thought I'd bump the post to gauge the general consensus on, and feasibility of, frame interpolation being used as a vector for fixing frame caps vs the two existing options.
Could this still ever be a possibility or is VBlank + Clocks the best compromise?
VBlank and clock scaling cover far less ground than one would assume. There are also some really incredible interpolation techniques, but almost all of them are not realtime, and interpolation would also incur an input lag penalty (when active).
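On the lag point, the penalty is easy to quantify: holding one native frame back delays presentation by one native frame time, regardless of the interpolated output rate. A trivial sketch:

```python
def added_latency_ms(native_fps):
    """Extra presentation delay from holding one native frame back
    for interpolation: one full native frame time, independent of
    the (higher) interpolated output framerate."""
    return 1000.0 / native_fps

# A 30 fps game interpolated up to 60 fps still pays ~33.3 ms of
# extra latency, because the frame being held back is a 30 fps frame.
```

This matches the earlier guess in the thread that the delay would be 33.3 ms rather than 16.67 ms for a 30 fps game.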
Some examples with DAIN (depth-aware interpolation, not a realtime applicable technique):
original: https://cdn.discordapp.com/attachments/529347626475323402/707291012107599902/RDRHorse.mp4
dain: https://cdn.discordapp.com/attachments/529347626475323402/707291025999134750/59fps_audio_RDRHorse.mp4
original: https://cdn.discordapp.com/attachments/529347626475323402/707050977877950484/InfamousFunnyFootage.mp4
dain: https://cdn.discordapp.com/attachments/529347626475323402/707052225087930438/59fps_audio_InfamousFunnyFootage.mp4
And these were created from YouTube videos at half resolution. Interpolation like this could work both as a "fake 60 fps mode" and as framerate compensation, even if it produces some minor artifacts.
I'm not sure, however, if there's anything of even remotely comparable quality that can be applied in realtime. Its potential usefulness is hardly deniable though, imo.
Unfeasible, closing.