Imgui: Implementing into Unreal Engine?

Created on 23 Apr 2016 · 28 comments · Source: ocornut/imgui

Hey, awesome library, I really like it!! One question:
I'd like to implement this into the Unreal Engine 4, and I had a look on the directx11 sample. In the example it creates the swap chain and does all that d3d related stuff. In UE, the device, the swapchain, and all that is already existing. Is there a way of using imgui without having all that stuff? I don't want to dig inside the source code for it..

backends


All 28 comments

The render function you tell ImGui to use is just handed a set of vertices (each with a position, texture coordinate and color) every frame. So that's all you need to tell Unreal Engine to render.

Just choose the example/ folder you are most comfortable reading and adapt it for Unreal. It is very little code. Then you can submit an unreal_example! :)
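
For reference, this is the renderer-agnostic shape of the function ImGui hands its output to each frame (a sketch, not tested against UE4). Only the ImGui types and fields are real library API; the Engine* calls in the comments are placeholders for whatever UE4 path you end up picking.

void MyRenderDrawLists(ImDrawData* draw_data)   // registered via io.RenderDrawListsFn
{
    for (int n = 0; n < draw_data->CmdListsCount; n++)
    {
        const ImDrawList* cmd_list = draw_data->CmdLists[n];
        // cmd_list->VtxBuffer: vertices (screen-space position, UV, packed colour)
        // cmd_list->IdxBuffer: indices into VtxBuffer
        const ImDrawIdx* idx_offset = cmd_list->IdxBuffer.Data;
        for (int cmd_i = 0; cmd_i < cmd_list->CmdBuffer.Size; cmd_i++)
        {
            const ImDrawCmd* pcmd = &cmd_list->CmdBuffer[cmd_i];
            // Each command means: "draw pcmd->ElemCount indices with texture
            // pcmd->TextureId, clipped to pcmd->ClipRect".
            // EngineSetScissor(pcmd->ClipRect);                          // placeholder
            // EngineDrawTriangles(cmd_list->VtxBuffer.Data, idx_offset,  // placeholder
            //                     pcmd->ElemCount, pcmd->TextureId);
            idx_offset += pcmd->ElemCount;
        }
    }
}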

So in general ImGui only creates a texture each frame? And the rendering is in the imgui_impl_dx11 files? So I could basically just write some pretty cheap render code which puts it into a Texture2D and draws that to the screen using Unreal's methods?

No, it creates a pixel array (the font atlas) once, and for each frame a vertex and index buffer plus a bunch of command data. You create that one texture after asking for the pixel data. Look at the code in the samples; it is very straightforward. ImGui doesn't care which renderer you use (whether OpenGL, DirectX, Unreal or otherwise).
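
Concretely, the one-time texture setup looks like this. The ImGui calls are real API; creating the UE4-side texture from the pixel data and the MyEngineTexturePointer handle are placeholders.

// One-time setup: ask ImGui for the font atlas pixels and remember your texture handle.
unsigned char* pixels;
int width, height;
ImGuiIO& io = ImGui::GetIO();
io.Fonts->GetTexDataAsRGBA32(&pixels, &width, &height);   // 32-bit RGBA font atlas
// ...create a width x height engine texture from 'pixels' here (UE4-specific)...
io.Fonts->TexID = MyEngineTexturePointer;                  // any pointer-sized handle you can cast back later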

Yeah, and if I iterate through that array, get the color values, add them into a new array containing pixels, and convert that array to a Texture2D, shouldn't that be enough? I think I misunderstood the way it works...

They are triangles to render, not per-pixel manipulation.
" ImGui outputs vertex buffers and simple command-lists that you can render in your application. The number of draw calls and state changes is typically very small. "

Read the documentation at the top of imgui.cpp and follow one of the example codes, e.g. opengl_example/.
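
For clarity, this is the default vertex layout those triangles use, as defined in imgui.h: each vertex is a screen-space position in pixels, a UV into the font atlas, and a packed 32-bit colour.

struct ImDrawVert
{
    ImVec2  pos;    // position in screen space (pixels)
    ImVec2  uv;     // texture coordinate (font atlas by default)
    ImU32   col;    // packed 32-bit RGBA colour
};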

Hmm, it's very hard to render something like this in UE4... I found a function which takes a command list, but I think it's too hard for me...

I think @paultech wrote such a wrapper for UE4. It is probably simple to write if you can hook into the engine at a low level.

I found a function, DrawPrimitiveUP, which takes an FRHICommandList, a uint32 PrimitiveType, a uint32 NumPrimitives, a const void* VertexData, and a uint32 VertexDataStride. The full implementation:

/**
 * Draw a primitive using the vertices passed in.
 * @param PrimitiveType The type (triangles, lineloop, etc) of primitive to draw
 * @param NumPrimitives The number of primitives in the VertexData buffer
 * @param VertexData A reference to memory preallocated in RHIBeginDrawPrimitiveUP
 * @param VertexDataStride Size of each vertex
 */
inline void DrawPrimitiveUP(FRHICommandList& RHICmdList, uint32 PrimitiveType, uint32 NumPrimitives, const void* VertexData, uint32 VertexDataStride)
{
    void* Buffer = NULL;
    const uint32 VertexCount = GetVertexCountForPrimitiveCount( NumPrimitives, PrimitiveType );
    RHICmdList.BeginDrawPrimitiveUP(PrimitiveType, NumPrimitives, VertexCount, VertexDataStride, Buffer);
    FMemory::Memcpy( Buffer, VertexData, VertexCount * VertexDataStride );
    RHICmdList.EndDrawPrimitiveUP();
}

Could that be the one I search for?
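
For illustration, here is a rough, untested sketch of how the inner loop from the earlier sketch could feed one ImDrawCmd through the DrawPrimitiveUP helper quoted above. The de-indexing is needed because DrawPrimitiveUP takes raw (non-indexed) vertex data; binding a matching shader/vertex declaration, the texture from pcmd->TextureId and a scissor rect from pcmd->ClipRect is engine-specific and not shown.

// Inside the per-command loop (cmd_list, pcmd and idx_offset as in the earlier sketch):
const ImDrawVert* Vtx = cmd_list->VtxBuffer.Data;
TArray<ImDrawVert> Flat;
Flat.Reserve(pcmd->ElemCount);
for (unsigned int i = 0; i < pcmd->ElemCount; i++)
    Flat.Add(Vtx[idx_offset[i]]);                  // expand indexed triangles into a flat list

DrawPrimitiveUP(RHICmdList, PT_TriangleList,       // PT_TriangleList is the UE4 EPrimitiveType value
                pcmd->ElemCount / 3,               // three indices per triangle
                Flat.GetData(), sizeof(ImDrawVert));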

Possibly! I'm sorry, I don't know anything about Unreal so it's hard to guess. I don't know what RHI stands for, but yes, you want to render triangle primitives with texturing and vertex coloring.

Basically, to get ImGui going you need to upload a texture, render textured triangles and wire up mouse/keyboard inputs.
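
As a sketch of the per-frame wiring (the ImGuiIO fields and ImGui calls are the real 2016-era API; the viewport/input variables are placeholders for whatever the engine gives you):

// Somewhere in a game-thread tick, before building your UI:
ImGuiIO& io = ImGui::GetIO();
io.DisplaySize  = ImVec2(ViewportWidth, ViewportHeight);   // placeholder: viewport size in pixels
io.DeltaTime    = DeltaSeconds;                            // placeholder: frame time from the engine
io.MousePos     = ImVec2(MouseX, MouseY);                  // placeholder: cursor position in viewport pixels
io.MouseDown[0] = bLeftMouseButtonDown;                    // placeholder: from your input handling

ImGui::NewFrame();
ImGui::ShowTestWindow();   // or your own ImGui::Begin()/End() calls
ImGui::Render();           // invokes the io.RenderDrawListsFn callback with the frame's draw data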

Maybe I can get some feedback from @paultech on the rendering stuff.. Thanks anyways!

I know you're not into UE4 development, but take a look at this. It seems to draw triangles on the canvas (the screen) with a given texture and an array of FCanvasUVTri. If you look at that, one FCanvasUVTri has V0_Color, V0_Pos, and V0_UV (the same for V1 and V2). Could that work?

Try it? :)

I'm on it. I'm actually just a bit confused about the locations of the vertices and what the vertex color means...
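
To clarify what those fields mean: ImGui vertex positions are absolute screen coordinates in pixels, UVs index the font atlas, and col is a packed 32-bit RGBA value (red in the lowest byte by default). An untested sketch of filling one FCanvasUVTri from three ImGui vertices, reusing the cmd_list/pcmd/idx_offset names from the earlier sketches:

// For each group of three indices i, i+1, i+2 inside one ImDrawCmd:
const ImDrawVert* Vtx = cmd_list->VtxBuffer.Data;
const ImDrawVert& A = Vtx[idx_offset[i + 0]];
const ImDrawVert& B = Vtx[idx_offset[i + 1]];
const ImDrawVert& C = Vtx[idx_offset[i + 2]];

FCanvasUVTri Tri;
Tri.V0_Pos   = FVector2D(A.pos.x, A.pos.y);                 // screen-space pixels
Tri.V0_UV    = FVector2D(A.uv.x, A.uv.y);                   // font atlas UV
Tri.V0_Color = FLinearColor(FColor((A.col >>  0) & 0xFF,    // R
                                   (A.col >>  8) & 0xFF,    // G
                                   (A.col >> 16) & 0xFF,    // B
                                   (A.col >> 24) & 0xFF));  // A
// ...fill V1_* from B and V2_* from C the same way, collect the triangles of one
// draw command into a TArray<FCanvasUVTri>, then draw the whole array with the
// canvas triangle call in a single batch.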

A few caveats with Unreal Engine that I ran into:

  • Game & render thread frame timing. You submit data to be rendered via the game thread, and the render thread may not show your content for 1-2 frames. This was not an issue for me, but if you have strict requirements on frame timing you may need to find another method.
  • Due to the engine setup, I could not do the normal simple xxx_render .cpp files and had to wrap everything in a plugin interface in order to hook into pre/post Tick() (Tick is where I hook into ImGui).
  • Textures in UE4 are best referenced by a path string (as they end up in their own format after import). ImGui is friendly in that a texture is just a naked pointer, so we can use it as a char pointer (see the sketch after these notes). This is an area I want to look into more, as it's not overly flexible.

You are using UE4.11.2?

_Edit_ This is paultech on the wrong account! :)

  • You were correct to use the RHI (Render Hardware Interface) series of calls on FRHICommandList. You are looking for the SetShaderTexture and *IndexedPrimitive methods.
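
Regarding the "naked pointer" texture note above, a small illustrative sketch (the variable names are made up): anything pointer-sized can be stored in ImGui's texture ID and cast back while executing draw commands.

UTexture2D* FontTexture = /* the texture you created from the font atlas pixels */ nullptr;
ImGui::GetIO().Fonts->TexID = (void*)FontTexture;           // ImTextureID is just a void* by default

// Later, when executing an ImDrawCmd:
UTexture2D* TextureToBind = (UTexture2D*)pcmd->TextureId;
// ...bind TextureToBind (e.g. its RHI resource) before issuing the draw call...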

Yeah, I'm using 4.11.2! When I started trying out ImGui + UE4 I already created a plugin for that, and I wanted to have the logic in two blueprintable functions: InitImGui and TickImGui.

  • Not a problem for me, I think.
  • I also used a plugin, but as you describe it, you hook into some RHI interface, I think.
  • Currently not fully understanding that :joy:

I will look into SetShaderTexture, but to run that, I need a shader and something to draw it on (the canvas).. How did you do that?

I'm still stuck with it... @techcompliant, could you help me a bit with the RHI implementation? That would be really kind :)

Did you ever get somewhere with the UE4 integration?

@githubChrys Not really, but I also didn't try anymore. I'll try that at some point again and report here ^^

@iUltimateLP @githubChrys @techcompliant
Sebastien started working on integration for Unreal Engine 4 here:
https://github.com/sronsse/UnrealEngine_ImGui

Reading the README there, it seems like the thing is currently super clunky (rendering individual triangles...). @sronsse, have you looked at DrawPrimitiveUP? I think @techcompliant used imgui in their project; maybe they'd have pointers on what to use to make this good perf-wise :)

Disclaimer: this is off the top of my head and not tested.

I think a better way to approach imgui rendering in Unreal would be to either:

1) Make a custom FPrimitiveSceneProxy that accepts dynamic geometry each frame. This is kinda involved to set up, but if you look at how the custom mesh component works that will get you started. You then need to look at examples of using MarkRenderDynamicDataDirty() to set up a way to update the data each frame. I'm not sure how you would get the scissor rect working with this approach though.

2) A possibly better approach that I know less about is to make a custom Slate widget; this can accept geometry each frame and would tie into the UI system better. BUT I have not investigated this approach, so there could be roadblocks (a bare-bones widget skeleton follows below).
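
For anyone exploring option 2, here is a bare-bones shape of a custom Slate widget, assuming the 4.11-era OnPaint signature. The actual ImGui-to-Slate vertex submission (e.g. FSlateDrawElement::MakeCustomVerts) is only referenced in a comment because its exact signature varies between engine versions.

// Minimal custom Slate widget skeleton (4.11-era signatures assumed).
class SImGuiWidget : public SLeafWidget
{
public:
    SLATE_BEGIN_ARGS(SImGuiWidget) {}
    SLATE_END_ARGS()

    void Construct(const FArguments& InArgs) {}

    virtual int32 OnPaint(const FPaintArgs& Args, const FGeometry& AllottedGeometry,
                          const FSlateRect& MyClippingRect, FSlateWindowElementList& OutDrawElements,
                          int32 LayerId, const FWidgetStyle& InWidgetStyle,
                          bool bParentEnabled) const override
    {
        // Walk the ImDrawData captured from ImGui's render callback and emit Slate
        // draw elements here, e.g. via FSlateDrawElement::MakeCustomVerts (signature
        // differs per engine version, so it is only mentioned, not called).
        return LayerId;
    }

    virtual FVector2D ComputeDesiredSize(float) const override
    {
        return FVector2D(0.0f, 0.0f);
    }
};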

Hey @techcompliant, some insights might be helpful indeed!

@beepdavid Thanks for the feedback as well. I'll look into the FPrimitiveSceneProxy stuff; my issue is that there seems to be no example whatsoever on this - maybe I am just blind though. I haven't looked into Slate either, but came across it as I was struggling to figure out ways to get an initial proof of concept. If you have any pointers to a working FPrimitiveSceneProxy example, that would be sweet, thanks!

@sronsse hi, I had a moment to look at what you have already, and I think for now it would actually be better to get the UCanvas pointer from your AHUD class and use the drawing interface it has (as others have mentioned) to pass a triangle list in one function call.

If that doesn't work out I would probably try the custom Slate widget approach next. You can use the OnPaint function to try and draw what you need. (I'm still not sure this supports everything though). Here is an example of a custom Slate widget, also wrapped so you can use it in the designer tool:

https://wiki.unrealengine.com/UMG,_Custom_Widget_Components_And_Render_Code,_Usable_In_UMG_Designer

Otherwise you are back to making a FPrimitiveSceneProxy which is more complicated. Make sure you have an understanding of the Unreal Render pipeline first, and then look for examples of use in the engine source.

https://docs.unrealengine.com/latest/INT/Programming/Rendering/Overview/

Hi @beepdavid, thanks for the pointers! I was indeed using the Canvas triangle drawing mechanism before, and didn't switch back to it. I would most likely get a performance boost out of that, but it would still force me to do the clipping and list building manually. I'll update this issue and the code base shortly. Thanks for the Slate pointers as well; I'll see if this helps this scenario, otherwise I will have to dig more into the underlying rendering pipeline (I now have source access to the engine).

I updated the code base to use the Canvas K2_DrawMaterialTriangle method instead. I have not measured the performance boost fully apart from just checking the FPS between the two methods, and while it is a bit better, it's still far from optimal.

Another Unreal 4 backend, courtesy of @segross:
https://github.com/segross/UnrealImGui

"This is a work in progress but it support all the key features of Unreal Editor, like separate contexts for sessions with multiple worlds/viewports. There are things that need more work, like better documentation and usability and right now I'm cleaning a change that allows invisible context switching. But in general I feel that it can be already opened... it should give me some boost forward as closed repository is a bit prone to stalls."

(Coincidentally, @sronsse and @segross are both named Seb and have similar-looking logins and Unreal repos, but they are two different people/backends.)

Closing this old thread.
AFAIK the de facto Unreal back-end today is https://github.com/segross/UnrealImGui
(I haven't used either myself and don't know how they fare)

