RetroArch: AI upscaling integration (ESRGAN, for example)

Created on 11 Sep 2020 · 5 comments · Source: libretro/RetroArch

First and foremost consider this:

  • Only RetroArch bugs should be filed here. Not core bugs or game bugs
  • This is not a forum or a help section, this is strictly developer oriented

Description

AI upscaling as a video setting

Expected behavior

A nice AI-upscaled image for all games.

Actual behavior

Not implemented yet.

Steps to reproduce the bug

  1. Run RetroArch
  2. Go to Settings > Video
  3. Observe that there is no AI upscaling setting

Bisect Results

Not applicable; this is a feature request rather than a regression.

Version/Commit

You can find this information under Information/System Information

  • RetroArch: latest version as of this issue.

Environment information

  • OS: Linux
  • Compiler: gcc-10.2.0-203-g127d693955


All 5 comments

AI upscaling is typically slower than real time. However, the NNEDI3 shader is an existing implementation of neural-net-derived weights.
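For context, NNEDI3 is already usable as a regular shader preset. A minimal single-pass preset along these lines could select it; the shader filename and directory below are assumptions based on the usual shaders_slang layout and may differ between installs, so treat this as a sketch rather than an exact path:

```ini
# Hypothetical single-pass slang preset (e.g. saved as nnedi3-2x.slangp).
# The shader path is an assumption; check your local shaders_slang checkout.
shaders = "1"
shader0 = "shaders_slang/nnedi3/shaders/nnedi3-nns16-2x-luma.slang"
filter_linear0 = "false"
scale_type0 = "source"
scale0 = "2.0"
```

Loading a preset like this from Settings > Shaders applies the neural-net weights per frame, which is why it can run in real time where heavier models such as ESRGAN cannot.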

How about running it ahead of time and caching the results before the game runs?

That only works in cores that can dump game textures, and each console and core dumps them with its own quirks and limitations. There is no standard that would recognize a 'texture' on all platforms, and there isn't even a need for one, since curated projects that are not automatic will always produce better results.

Just look at the texture-replacement projects for particular emulators/cores (like the recent Resident Evil 2 and 3 projects). There is no possible general solution here that isn't real-time, and real-time is impossible right now (and probably will remain so), and it would be inferior anyway for various reasons.

If it were standardized, there wouldn't be a problem. Good point, though.

That's the point: it can't be standardized, because the input differs too much from console to console (and sometimes from game to game on certain older consoles). Other runtime filters can be standardized because they only need to operate on the final image that goes to the screen; AI upscaling is too slow to run that way, so it needs to work from dumped textures.

