Godot: Status of C++ in Godot

Created on 18 Jul 2017  ·  65 Comments  ·  Source: godotengine/godot

In Godot we are still using C++98/03. The aim of this issue is to decide whether we should stick with it or whether we can start using some modern C++ (C++11/14/17, or even 20) features.

When Godot was initially written, C++03 was the most current version, and years later that style (version of the standard, avoidance of the standard library, etc.) was kept to avoid compatibility problems with certain compilers (on some consoles, as far as I know). But we should recheck these concerns.

Items to consider (this list is intended to keep growing or having some entries discarded; just brainstorming at first):

  • Smart pointers: Is Godot custom pointer/object mechanism compatible with that? Would Godot need to be rewritten in a big part to use them? Would the benefits be bigger than the cost?
  • Move semantics: Questions to ask here are more or less the same as for smart pointers.
  • auto, constexpr, lambdas, etc.: In general, any modern C++ feature that makes writing code easier and/or provides some chance of optimization without disrupting the current core.
  • Standard multiprocessing primitives: C++ now provides standard threading, atomics, memory barriers, etc. Do we have to keep custom implementations and maintain them for different platforms?
  • STL/Boost/other: Would we get some benefit by switching from custom containers to a good, well-known implementation? Advantages/disadvantages regarding maintenance, performance, compatibility, etc.
  • register, inlining "tricks", etc.: Stuff added to the code to try to make the compiler generate more performant code. Some of them either are deprecated or have no actual impact, or even can worsen performance. Which to drop? Which to keep?
  • etc.

Small changes could be made early. But when should big changes (if any) be made? Godot 4.0? Or as soon as Godot 3.0 becomes stable?

Please let me invite some people explicitly: @reduz, @punto-, @akien-mga, @karroffel, @bojidar-bg, @BastiaanOlij, @Faless.

discussion


All 65 comments

This was discussed many times, the usual answers are:

1) No desire to move the codebase to anything above 03. Features are not worth it. For GDNative C++, it's perfectly possible to use it though.
2) No desire to use STL/Boost/etc. The rationale has always been the same: a) Godot templates do small things they usually don't (i.e. atomic refcounts and copy-on-write); b) STL generates huge debug symbols and debug binaries due to its extremely long mangled symbols.

We'll keep everything the way it is.

I got some opinions in private conversations, but I wanted to make the discussion more public.

I didn't know it had already been discussed so much and settled.

Do you rule out even auto, constexpr and such?

Drop 'register'?


I raised the issue that it would be safer to use smart pointers and that this code suffers from NIH syndrome, but the higher-ups have no intention to upgrade, so I suggest you give up as well.

It would probably take many man-hours to upgrade the code, which the devs prefer to spend on implementing new features. I somewhat understand this - engine users prefer getting new features over "refactoring" (without understanding things like technical debt).

@Marqin Sorry but smart pointers are a bad idea because:

1) the only useful one is the refcounted one, the rest is mental masturbation for features already present in the language.
2) Using smart pointers everywhere is a terrible idea; you risk having reference cycles that leak memory (see the sketch after this list)
3) we already have something similar, Ref<>, with the added advantage that we control how the reference counting works, so we can add special cases to handle bindings to languages such as C#, which have their own GC
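
For illustration only, a minimal sketch of the reference-cycle problem mentioned in point 2, using plain standard C++ (the Node type here is made up, not a Godot class):

#include <memory>

// Hypothetical scene nodes holding shared_ptr links in both directions.
struct Node {
    std::shared_ptr<Node> parent;
    std::shared_ptr<Node> child;
};

int main() {
    auto a = std::make_shared<Node>();
    auto b = std::make_shared<Node>();
    a->child = b;
    b->parent = a; // cycle: a and b now keep each other alive
    // When a and b go out of scope, both reference counts stay at 1, so
    // neither destructor runs and the memory leaks. Breaking the cycle
    // requires std::weak_ptr (e.g. for the parent link) or manual cleanup.
}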

So please, before arguing that we decide stuff on a whim, just ask. We usually make informed decisions and share the process with everyone.

What could easily be done if you want to improve quality is to start writing unit tests (add some optional job to SCons for that, and just make devs install gmock/gtest themselves [to not clog ./thirdparty]).

the rest is mental masturbation for features already present in the language

@reduz, how is unique_ptr, with its explicit ownership and RAII memory freeing, already present in the language?

Also, not only memory management but also threading gets easier (e.g. lock_guard).
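
For reference, a minimal sketch of the two features being asked about, using only the standard library (Image and cache_mutex are placeholders, not Godot types):

#include <memory>
#include <mutex>

struct Image { /* ... */ };

std::mutex cache_mutex; // placeholder, not Godot's Mutex wrapper

void process() {
    // unique_ptr: sole ownership, the Image is freed on every exit path.
    std::unique_ptr<Image> img(new Image);

    // lock_guard: the mutex is released when the guard goes out of scope,
    // even if the code below returns early or throws.
    std::lock_guard<std::mutex> lock(cache_mutex);
    // ... work with *img while holding the lock ...
} // lock released, img deleted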

@RandomShaper I'm not against upgrading at some point in the future but, even though there are many useful features, newer C++ versions have a tendency to make the programmer write less explicit code. This results in code generally being more difficult to read by others (typical with auto keyword abuse).

The other advantage of not using more advanced features (e.g. we don't use exceptions, and RTTI can be disabled at little cost) is that compilers produce much smaller binaries.

That's fine. I respect your word. I just find that discussing this is a healthy thing.

I know some big videogame companies (Ubisoft, IIRC) have migrated big code bases to modern C++, so it should be perfectly feasible for Godot as well. Of course, as you pointed out, that requires work/time.

That's why we could:

  • pick a reasonable subset for a painless as possible all-code migration;
  • or at least pick a reasonable subset as "approved" for new code.

In both cases a coding style would probably need to be defined so as not to abuse features, as you said happens with auto.

Of course, now there are things with a higher priority, but maybe when 3.0 becomes stable or at some other point in the future it would be good to have decided already.

@Marqin As mentioned, Godot uses its own memory allocation scheme for nodes, which makes more sense than anything provided by the new standard libraries.

Also, a lock guard is virtually 3 lines of code, and we already have it implemented without using C++11+.
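
For context, the guard being described really is tiny. A C++03-compatible sketch over a hypothetical platform Mutex wrapper (not Godot's actual class) could look like this:

// Sketch only: "Mutex" stands in for a per-platform wrapper with lock()/unlock().
class Mutex {
public:
    void lock() { /* platform-specific lock */ }
    void unlock() { /* platform-specific unlock */ }
};

class MutexLock {
    Mutex &mutex;
public:
    explicit MutexLock(Mutex &p_mutex) : mutex(p_mutex) { mutex.lock(); }
    ~MutexLock() { mutex.unlock(); } // released automatically at end of scope
};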

Again, why should we assume that anything "standard" 1) needs to be better than what we already have, or 2) is better if we use it anyway? I think it's a common fallacy.

What's next, dropping our print functions and use ostream/istream? Change our String class, which is pretty awesome, and replace it with the much more crippled std::string or std::wstring?

Standard libraries don't serve all purposes and don't work better than everything else just because they are standard libraries. They are just there for those who need them, and it's fine to ignore them and write your own implementations if you have a valid reason to do it. We do and we are confident about it.

@RandomShaper The problem is that:

1) It's more cost than benefit.
2) It will open the window to a lot of arguments about what the standard way of writing C++11+ is. It may be worth having them at some point, and rewriting a large part of the engine to take advantage of these features may be useful one day, but I think there are far more important things to focus on.
3) Also, as mentioned before, as cool as porting to a new C++ version sounds, it might result in a much larger binary size. In that case it may be undesirable.

It's seriously not worth it at the moment; we can discuss it again in a few years.

What I think would make sense is to make sure that Godot compiles with options like --std=c++17, so you can easily bring in a library written in modern C++ if you need to. For example, I would vote for the removal of the register keyword from the codebase: https://github.com/godotengine/godot/issues/9691

Somewhat related: I remember having read somewhere that gcc-6.3 is not supported (Google says it was in https://github.com/godotengine/godot/issues/7703). It bothers me since gcc-6.3 is the default compiler in my distro (Debian stable). Can anyone confirm that? And why is that?

@efornara some third-party libraries already require newer C++ versions, and that's fine because SCons takes care of that with cloned build environments. Check the etc2comp thirdparty code to see how it works.

@karroffel Thanks, I didn't know that.

It would just be a nice feature then, but it isn't needed for importing a library (though, if the glue code needs to include more of Godot, you might stumble across a header file that doesn't compile).

By the way, if anyone needs to do something similar and has found this post, the relevant file is: https://github.com/godotengine/godot/blob/master/modules/etc/SCsub . I have found it using grep and it looks like the only place where this is needed at the moment.

It's not that safe to link C++11 with non-C++11 code - http://gcc.gnu.org/wiki/Cxx11AbiCompatibility

@Marqin Unless I misunderstand the link, this actually seems to support the case for Godot _not_ starting to use components from the Standard Library, but sticking to custom components instead:

The C++98 language is ABI-compatible with the C++11 language, but several places in the Standard Library break compatibility.

Mixing languages looks fairly safe (not nice, I admit, and it might stop working in the future), but mixing stuff like pairs, vectors, lists, etc. is known to cause problems.

@efornara The link is about linking third-party libs that use C++11 with Godot, which uses C++03.

@Marqin Yes, but the way I understand the link is that you can do it. What you cannot do is, for example, to pass a std::list<> from godot to the third-party library.

@reduz where can I find documentation on Godot's internal Ref<>? Where can I find documentation on how Godot's internal containers differ from those in the STL?

@Marqin For containers there isn't much:
http://docs.godotengine.org/en/stable/development/cpp/core_types.html#containers
For Ref<> I realized there isn't any. Should probably be added.
Any suggestions on how to improve the doc to add these things is very welcome.

Personally I find the new C++ standard quite awesome, but I wouldn't propose any Godot internal refactoring because it would require too much effort for too little gain.

Also, backward compatibility is one of the hallmarks of C++ for this exact reason. But I would still like to be able to use it for new Godot features and in GDNative.
I'd prefer Godot to be compiled with modern C++ support for this reason.

@reduz As you said, C++11/14/17 lets you write less explicit code. Even though this is a drawback for C++ newbies, it is a good thing for C++ power users.
As for auto, I personally think it is good for power users too. Not only does it let you avoid typing types over and over when not strictly necessary, it can also help you avoid some bugs.

FYI, I did compile a recent master (godot3) with:

scons platform=x11 target=debug tools=yes builtin_openssl=true CCFLAGS=-std=c++17

on a debian stretch (gcc-6.3). Annoyingly, the option is also set when compiling C files, so you are flooded with warnings if you enable them, but, apart from that, it all went smoothly. Even the register keyword doesn't seem to cause trouble.

I wouldn't go as far as to suggest that the official builds be compiled this way, but it's good to know that the option is there if you need it in your project. What I would suggest is that any changes that break this are treated as a regression.

EDIT: Corrected some typos, making the statement about warnings clearer.

Annoyingly, the option is also set when compiling C files

Did you try with CPPFLAGS?

@Hinsbart No, I didn't. Maybe there's a way, but since I don't know scons that well, I simply went with what seemed possible without tinkering:

$ scons platform=x11 builtin_openssl=true -h | grep FLAGS
CCFLAGS: Custom flags for the C and C++ compilers
CFLAGS: Custom flags for the C compiler
LINKFLAGS: Custom flags for the linker

EDIT: By the way, I don't know how it's used in the Godot build system, but CPPFLAGS would make me think of the preprocessor options. Personally, I've always used CXXFLAGS for C++.

Did you try with CPPFLAGS?

I think CPPFLAGS affects every language that uses the preprocessor, so both C and C++ are included.

@m4nu3lf you can use whatever form of C++ you want with GDNative.

In my experience, any opportunity to delete code is a good opportunity.

Perhaps we could set up some sort of wiki page documenting which files could be deleted and replaced with C++11 variants. This would probably include threading primitives and such.

Change for the sake of change is no good (the proverbial koolaid), but in this case the LLVM project and many other FOSS projects have moved on in favor of some of the clearer syntax patterns, i.e. the newer for-iterator notation, but also in favor of offloading separation of concerns to the respective language runtimes, because (let's be honest) maintaining as little platform-specific code as possible is ideal for a game engine.

The best code you'll ever write is the code you un-write. :)

Out of curiosity, is it possible to write code that compiles with both older C++ and newer C++? Are there significant breaking changes between C++ versions?

Out of curiosity, is it possible to write code that compiles with both older C++ and newer C++? Are there significant breaking changes between C++ versions?

Newer C++ is backwards compatible with older C++ in 99.99...% of cases (only things that should not have been used or were ill-defined are no longer supported, and in general those only cause compile warnings but still work nowadays).

However, newer C++ has significant features, and those will obviously not work in older C++ versions. Those features aren't just usability features like auto, lambdas and variadic templates, but also efficiency features like move semantics and rvalue references, as well as safety features such as the newer ownership pointers (especially with the C++ Core Guidelines support library), among many others.
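
As a small illustration of the efficiency point (standard C++ only, nothing Godot-specific): returning and moving containers in C++11 transfers the underlying buffer instead of deep-copying every element.

#include <string>
#include <utility>
#include <vector>

std::vector<std::string> make_names() {
    std::vector<std::string> names;
    names.push_back("player");
    names.push_back("enemy");
    return names; // moved (or elided) in C++11 and later, not deep-copied
}

int main() {
    std::vector<std::string> a = make_names();
    // Explicit move: b steals a's buffer instead of copying every string.
    std::vector<std::string> b = std::move(a);
    // a is left in a valid but unspecified state; only assign to it or destroy it.
    (void)b;
}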

Considering that even Visual Studio supports modern C++17 now, there's really no reason not to use newer versions.

I wanted to comment: I was able to compile Godot with the C++17 flag, with the only issue being one of the third-party libraries that used auto_ptr (which is removed in C++17 due to how bad it is).

The problem is not that it doesn't compile in C++17; the problem is people who want to use C++17 features, or even worse, start PRs using those... only to discover that the project is on C++03.

The lack of lambdas alone is a HUGE issue that I don't see mentioned here.
Several times, Godot has to store a temporary list of data and then iterate over it.
In every single one of those cases, it could be done with a "for_each" or similar structure that iterates by itself, simplifying code a huge amount, lowering memory usage, and improving performance thanks to the lambda getting inlined/optimized (and this is C++11, usable everywhere). The exact same thing applies to all of the general loops over linked lists.

Move operators would also allow better data structures, which could replace the current, absolutely terrible ones with better-optimized alternatives.

String datatypes (like "mystring"_hs) or similar could give us a good way to spread hashed strings everywhere in the code, making string checking (very common in script and event code) much faster.
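
A minimal sketch of that idea: a constexpr hash exposed as a user-defined literal, so string "keys" become integer constants usable in a switch. The _hs suffix and the FNV-1a hash are just for illustration, not an existing Godot API, and a real implementation would have to deal with collisions.

#include <cstddef>
#include <cstdint>
#include <cstdio>

// constexpr FNV-1a so the hash of a literal is computed at compile time.
constexpr uint64_t fnv1a(const char *s, uint64_t h = 14695981039346656037ull) {
    return *s ? fnv1a(s + 1, (h ^ (uint64_t)(unsigned char)*s) * 1099511628211ull) : h;
}

constexpr uint64_t operator"" _hs(const char *s, std::size_t) {
    return fnv1a(s);
}

void handle_event(const char *name) {
    switch (fnv1a(name)) {              // runtime hash of the incoming string
        case "update"_hs:  std::puts("update");  break; // compile-time constants
        case "physics"_hs: std::puts("physics"); break;
        default:           std::puts("unknown"); break;
    }
}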

Even if std::thread is not used (I think it's a fairly terrible abstraction myself), the std atomics library is amazing and extremely useful: std::atomic and the like.

And let's not even talk about how much forcing people onto C++03 hurts the project itself, when people can't easily integrate modern libraries themselves, because Godot is just about the only non-abandoned open source C++ project on such an old version of C++ (as far as I know).

Personally, I agree with being conservative and not going to the absolute latest C++ standard, but I think something like C++11 with some vetted features of C++14 is what would work best and improve Godot significantly. C++11/14 is good enough for Unreal Engine, which ports to PS4, Xbox, Switch, PC, low-end PC, Android, iOS, and HTML5 WebAssembly. I don't see why Godot should limit itself when it supports far fewer platforms.

String datatypes (like "mystring"_hs) or similar, could give us a good way to spread hashed strings everywhere in the code, making string checking (very common in script and event code) much faster.

Plus it lets you use them in things like switches. Like I made an Atom type years ago to replace a flyweight string implementation:
https://github.com/OvermindDL1/OverECS/blob/master/StringAtom.hpp

It has both a 32-bit version (5-character max, or 6 if you recode the table a bit) and a 64-bit version (10 or 12 character max, depending on whether you choose the tight encoding or not). It is fully reversible, happens at compile time or dynamically at run time in either direction, is fully usable in switch statements, etc. Example usage of that file:

switch(blah) {
  case "UPDATE"_atom64: ... pass
  case "PHYSUPDATE"_atom64: ... pass
  ...
}

When interacting with Lua code I just used strings and performed the conversion at the boundary. That file isn't the "latest" version of it; I added functions to do the conversions with no memory allocation when going back from integer to string (it's alloc-less on string->integer regardless, since it runs at compile time). It is trivial to make a Visual Studio or GDB filter that reverses the encoding when displaying the Atom64/Atom32/whatever type as well (which is just an integer underneath), so you can see the string instead of some weird hashed value.

But things like this are extremely useful, especially in performance sensitive code and for making code easy to read, and this just required C++11, not even anything newer.

At the very least I'd say C++14 should be the Godot standard. C++17 would be nice (it has some very useful performance enhancements in new code), but C++14 is rather a universal minimum now. Of course, since gcc/clang/Visual Studio and whatever else all support C++17 fine now (and even large chunks of C++20), C++17 seems good as well. I'd probably still opt for C++14 just in case.

@OvermindDL1 That Atom thing is awesome, love it. It would definitely fit really well with Godot, which is doing that exact thing quite a lot of the time.

That Atom thing is awesome, love it. It would definitely fit really well with godot, where its doing that exact thing quite a lot of times.

@vblanco20-1 Well, if Godot wants it they are free to absorb the code. It (and its predecessor, flyweight string) saw long use in my old C++ engine. It was so useful for little event tags, just short strings to use as 'keys' into things, and so trivially easy to move back and forth over Lua/LuaJIT, plus the debugger filters were a massive help.

The lack of lambdas alone is a HUGE issue that I don't see mentioned here.

Am I the only one who thinks lambdas make the code hard to read in most cases?

@Faless: No, you're not the only one, I think lambdas being hard to read is one of the reasons Akien hasn't upped the c++ version.

Lambdas are just small inline functions; I don't know why you call them "hard to read". They allow things like a sort algorithm where you pass in the comparison function, or the rest of the std algorithms library. They also allow you to save memory and improve performance significantly by removing the need for temporary arrays (which come up multiple times in the source code, with the octree and other systems).

And with the std algorithms library, that's the easiest way to multithread a program you can get, just through parallel for, parallel sort, and parallel accumulate.
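
For reference, this is roughly what that looks like with C++17 execution policies (it needs a standard library with <execution> support, e.g. recent GCC plus TBB, so treat it as illustrative):

#include <algorithm>
#include <execution>
#include <numeric>
#include <vector>

int main() {
    std::vector<float> distances(1000000, 1.0f);

    // Parallel sort and parallel reduce ("accumulate") with one extra argument.
    std::sort(std::execution::par, distances.begin(), distances.end());
    float total = std::reduce(std::execution::par, distances.begin(), distances.end(), 0.0f);

    // Parallel for over every element.
    std::for_each(std::execution::par, distances.begin(), distances.end(),
            [](float &d) { d *= 0.5f; });
    (void)total;
}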

It's an absolute shame people write them off as "weird" when they can improve code quality, performance, and reusability so much.

Actual example of something that I already implemented:

//old linked list iteration
while (scenario->instances.first()) {
    instance_set_scenario(scenario->instances.first()->self()->self, RID());
}
//new (just abstracts the above; the lambda captures by reference so it can call the member function)
scenario->instances.for_each([&](RID &item) {
    instance_set_scenario(item, RID());
});

There is also this other case, repeated way too many times

//old. Note the 1024 array, which is a hard-coded size and is wasting memory

int culled = 0;
Instance *cull[1024];
culled = scenario->octree.cull_aabb(p_aabb, cull, 1024);
for (int i = 0; i < culled; i++) {

    Instance *instance = cull[i];
    ERR_CONTINUE(!instance);
    if (instance->object_ID == 0)
        continue;

    instances.push_back(instance->object_ID);
}

//NEW. Not implemented yet. Zero extra memory usage, can be inlined, and doesn't have a maximum size.
//It's also shorter and will be faster in absolutely every case compared to the old version.
//(Inside the lambda, the early exits become returns instead of continues.)

scenario->octree.for_each_inside_aabb(p_aabb, [&](Instance *instance) {
    ERR_FAIL_COND(!instance);
    if (instance->object_ID == 0)
        return;
    instances.push_back(instance->object_ID);
});

Actual example of something that i already implemented:

Not sure what is the gain here...

Its an absolute shame people write them off as "weird", when they can improve code quality, performance, and reusability so much.

@vblanco20-1 well, I'll try to explain myself.
People usually end up writing things like:

my_arr.push(
    [par1, par2, par3]{
      somefunc(par1, par2, par3);
    }
);

and then, in some other places:

func = my_arr.front();
func();

Which, when you read it, gives you no clue about what function is executed, nor what to look for. It could be fine if it's in the same file, but if you pass that array across multiple files the whole code becomes unreadable.

@Faless your example is an example of the worst you can do with lambdas (and that kind of usage should definitely be banned).

Using lambdas "across time" is not a good use of them, as they will have to allocate, and they also become a huge hazard, as the captured variables can stop being valid by the time you execute the lambda (for example, if you capture a pointer and then delete the object before calling the lambda, the captured pointer will dangle). I'm only really defending lambdas for their use alongside data structures and algorithms, where you use them "instantly".

In the for_each example, the for_each is immune to changes in the internal data structure (for example, if you reorder its variables a bit), and it allows more "opaque" data structures, which lets a dev "easily-ish" switch from one data structure to another to test which one works better. By embracing that, you can implement much more complicated data structures that work opaquely, while keeping the "use" layer easy to use.

It also reduces boilerplate and makes it clearer what the code is actually doing (iterating the linked list). Just getting rid of the "first()->self()->self" is an improvement in itself.

Keep in mind the benefits of this actually accumulate with more usage, as you will be able to have things like "unordered_for_each", which iterates the nodes in the order they are in memory (through the allocator, or if the linked list is stored on top of an array), or "reverse_for_each", or even things like "find" or "sort". (I don't recommend the std algorithms that much, at least until C++20 when ranges get merged; implementing your own algorithms as part of your data structure is much better to use.)
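
A minimal sketch of what such a container-owned for_each can look like; the linked list here is made up for illustration and is not Godot's SelfList or any real engine type:

#include <cstdio>

// Hypothetical singly linked list, just enough to show the pattern.
template <typename T>
struct MyList {
    struct Node { T value; Node *next; };
    Node *head = nullptr;

    // The container owns the iteration; callers only say what to do per element,
    // so the internal layout can change without touching any call site.
    template <typename F>
    void for_each(F &&f) {
        for (Node *n = head; n; n = n->next) {
            f(n->value);
        }
    }
};

int main() {
    MyList<int> list;
    MyList<int>::Node c = {3, nullptr}, b = {2, &c}, a = {1, &b};
    list.head = &a;
    list.for_each([](int &v) { std::printf("%d\n", v); });
}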

Lambdas were basically the first thing Epic Games allowed from C++11 into Unreal Engine, alongside their own library of algorithms like Sort, Partition, Filter, RemoveAll, Find, etc. Since then they are quite often used in the source code, both in engine code and gameplay code.

Cool, at least we agree that lambdas are not some holy grail of programming, and we already defined one rule on how not to use them.

I'll do more research on performance.


C++11 features that would improve maintainability of the codebase:

  • override and final
  • Range-based for loop
  • nullptr
  • Strongly typed enumerations
  • explicit keyword

C++11 Features that would improve quality of life:

  • Right angle bracket (no more Vector<Vector<T> > spacing)

[[nodiscard]] (C++17 ?) see here
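
A minimal sketch of how a few of the items above catch mistakes at compile time (the types are made up for illustration):

enum class DrawMode { WIREFRAME, SOLID }; // strongly typed: no implicit conversion to int

struct Shape {
    virtual void draw(DrawMode p_mode) = 0;
    virtual ~Shape() {}
};

struct Circle final : Shape {              // final: nothing may derive from Circle
    void draw(DrawMode p_mode) override {} // override: a signature typo becomes a compile error
};

void register_shape(Shape *p_shape) { (void)p_shape; }

int main() {
    register_shape(nullptr); // nullptr cannot be confused with an integer, unlike NULL/0
    // DrawMode m = 1;       // would not compile: enum class values don't convert implicitly
}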

I don't think we should adopt a modern version for the sake of using new things or because there are one or two features that could be used. I wouldn't propose using anything beyond C++11 because the benefit is not worth it.
As C++ evolves, the STL grows bigger and bigger, and hundreds of lines are added to the project during the preprocessing stage. In most cases it has a notable impact on performance.

In most cases it has a notable impact on performance.

It absolutely shouldn't? Code that is not executed should have no performance impact unless you have a compiler bug or so, and in release builds it shouldn't be in the final binaries at all. In many cases just having proper move semantics alone increases performance, in some areas significantly, and though that's an older example, the updated directives can also bring performance boosts, among other features.

@OvermindDL1 Yes, that's how it should be in theory.
Check out this: https://twitter.com/zeuxcg/status/1085781851568914432
I've seen various cases like this one on social media recently. It's not as "zero cost abstractions" as it should be.

@lupoDharkael That Twitter thread (once it finally loaded; Twitter is a horrible interface, I keep forgetting that) only talks about compile-time speed: math.h in libstdc++ is larger in C++17 mode (libc++ does not have that issue) because of a greater number of overloads and features for better runtime speed, so any compile that brings in math.h takes about 300 ms longer with that specific older stdlib instead of the newer one. It says nothing about runtime performance (which I've only ever seen get faster in higher C++ modes, depending on the features used, and stay identical in the worst cases). So as for "It's not as zero cost abstractions as it should be", I'm still not sure what you are referencing. Do you have any links to actual reports of runtime performance issues? The thread you linked seems to have nothing to do with that whatsoever, as it was only about a 300 ms increase in compile time when compiling against math.h in the older stdlib (and I'm not sure a flat 300 ms increase in the compile time of one object is an issue anyway).

I'll investigate more about it.
I understand that the cost has a reason, but longer compilation times are really an anti-feature for me. I only have a laptop, and working on features for the engine takes time because of how long I have to wait for the compile+link process after every change. Increasing that even more without a really justified benefit, just for the sake of using a newer version, is not a good idea.

I only have a laptop, and working on features for the engine takes time because of how long I have to wait for the compile+link process after every change.

If you have ccache installed, a lot of the time in the incremental build process will be spent on linking. You could shave off a second or so by using gold instead of ld for linking, and it's likely possible to optimize it even further by switching to lld.
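
For example, on a GCC toolchain whose binutils ship ld.gold, the flag could be passed through the LINKFLAGS option listed above (illustrative only; adjust to your setup):

scons platform=x11 target=debug LINKFLAGS="-fuse-ld=gold"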

Oh definitely, I cannot say enough good things about both ccache and ninja as the backing builder (if using CMake or so, for example); they both save so much time!

Plus, unity builds can be surprisingly amazing. That's where you make a new unity.cpp (or so) file that just includes all the cpp files in the project; realistically you usually have one unity cpp file per "module" of a project, so you only have a dozen or so and keep memory down. In exchange for that extra memory at compile time, it both compiles and links so much faster. They are less useful for incremental rebuilding and more useful for release builds, though.

To add one to the pile:
static_assert

For example, the union SpatialMaterial::MaterialKey assumes that the struct has the same size as a uint64_t, but that's not asserted anywhere as far as I can tell.

Wanted to slap a static_assert(sizeof(MaterialKey) == sizeof(uint64_t)) in there, but couldn't.

The other thing is using unique_ptr to ensure proper cleanup on destruction without having to write too much manual boilerplate, and without using unnecessary ref-counting.
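
A short sketch of both suggestions, using a made-up key union and buffer type rather than the real SpatialMaterial::MaterialKey:

#include <cstdint>
#include <memory>

// Hypothetical stand-in for a packed key union like MaterialKey.
union FakeKey {
    struct {
        uint32_t flags;
        uint32_t variant;
    } fields;
    uint64_t key;
};

// Fails to compile the moment someone adds a field that grows the union past 64 bits.
static_assert(sizeof(FakeKey) == sizeof(uint64_t), "FakeKey must stay 64 bits wide");

struct Buffer { /* ... */ };

void use_buffer() {
    // The buffer is released on every return path, with no manual cleanup boilerplate
    // and no reference counting involved.
    std::unique_ptr<Buffer> buf(new Buffer);
    // ... early returns or errors here no longer leak buf ...
}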

I'd like to offer my 2 (or 20) cents as someone new to Godot and its codebase. I'm currently overseeing and working on the effort to port _Battle for Wesnoth_ to Godot. Now, the front end (the editor and GDScript API) is great! Besides a few rough edges, it's so far allowed us to progress at a good pace. But we also imagined we (the team) would contribute patches for the back end (the engine) at some point. To that end, earlier this week I cloned the git repo and started poking around in the C++, and honestly... I'm a bit dismayed.

I do have experience managing a large C++ codebase in the form of Wesnoth's old custom engine. It too started off as C++03, but was modernized to use C++11 and later features and now is C++14 compliant. I spearheaded that modernization effort (often a bit too zealously) and I feel it made our codebase a lot more readable and easier to work with. Granted, I never worked extensively with a purely C++03 codebase; I learned C++ using modern C++ features. But to me, the idea that things like auto, range-for, and lambdas make your code less readable is just... very odd indeed.

Take auto: it is certainly possible to abuse it and overuse it, but it also has a ton of legitimate uses. One of the most common places we deployed auto when we updated the Wesnoth codebase was in for loops using iterators. Previously, we would have something like this:

for(std::vector<T>::iterator foo = container.begin(); foo != container.end(); ++foo) {}

Which is messy! In the cases we actually needed an iterator, we did this:

for(auto foo = container.begin(); foo != container.end(); ++foo) {}

Yes, now you don't know the explicit iterator type, but you almost never need to know it. I read a comment some posts up saying it makes things harder if the container is declared some files away, but really, with modern code editors and IntelliSense that's not much of a problem.

In most cases we'd instead just switch to range-for:

for(const auto& foo : container) {}

Much quicker to type and more concise too. You don't need to worry about dereferencing iterators inside the loop or about keeping track of indices. And it's abundantly clear that you're looping over the entire container. With iterators, people unfamiliar with the code need to double check that the loop is indeed going from beginning to end.

Using auto in range-for loops here also has an additional benefit. It makes one less thing you need to remember to update if you ever change the container's type! I really cannot understand Juan's argument that these things make your code less readable or less easy to understand. To me, it's the exact opposite.

In the State of Godot video he also mentioned lambdas. Again, it is certainly possible to abuse them, but they're also an incredibly useful tool! Here's a common paradigm I saw in Wesnoth's codebase prior to using C++11:

struct sort_helper {
    bool operator()(const T& a, const T& b) {
        return a < b;
    }
};

void init() {
    std::vector<T> foo;
    foo.push_back(T(1));
    foo.push_back(T(2));
    foo.push_back(T(3));

    std::sort(foo.begin(), foo.end(), sort_helper());
}

Long, messy, code bloat. And here's what we used with C++11:

void init() {
    std::vector<T> foo {
        T(1),
        T(2),
        T(3),
    };

    std::sort(foo.begin(), foo.end(), [](const T& a, const T& b) { return a < b; });
}

That's just the most common case! And yes, I know you can also implement operator< for T and that std::sort uses that by default, but it illustrates my point. Again, maybe it's just having only learned and almost exclusively worked with modern C++, but I think disregarding tools like auto, range-for, and lambdas where appropriate is shooting yourself in the foot.

Which leads me to my next point. I've noticed Godot internally uses many of its own custom types instead of the STL ones. If your concern is code readability with regards to things like auto, using custom core types over STL ones is absolutely a detriment! A few days ago, I was browsing through the codebase and came across a lot of code that looked like this:

container.push_back(T(args));

Now, this is inefficient. push_back (at least for std::vector) takes a const reference; therefore, this pattern results in an unnecessary copy. I wanted to make a patch to make them use emplace_back... but then I realized the entire codebase was using custom container types 😐
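
To make the push_back/emplace_back point concrete, with a plain std::vector and a made-up element type:

#include <string>
#include <utility>
#include <vector>

struct Monster {
    std::string name;
    int hp;
    Monster(std::string p_name, int p_hp) : name(std::move(p_name)), hp(p_hp) {}
};

int main() {
    std::vector<Monster> monsters;

    // Constructs a temporary Monster, then copies/moves it into the vector.
    monsters.push_back(Monster("slime", 10));

    // Constructs the Monster directly inside the vector's storage: no temporary at all.
    monsters.emplace_back("goblin", 25);
}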

One of the big problems with Wesnoth's design and one of the major contributing factors to deciding to go with a new engine (in this case, Godot) was that Wesnoth suffered from Not Invented Here syndrome to a large degree. While we did use libraries like Boost, our rendering pipeline was custom, our UI toolkit was custom, our data/scripting language was custom.... you get the gist. Before we started work on the Godot port I tried (and failed) to implement hardware-accelerated rendering (up until this point we had been using CPU-based surface/sprite rendering. In 2019. Yes, I know X_X). SDL's Texture API didn't have shader support, and shader support was needed. In the end, I decided that implementing our own renderer, while possible, would impose an unnecessary maintenance burden on the project down the line. We already had few core devs, and finding someone able to write OpenGL (or Vulkan if we ever needed to drop OGL) would have just been an unnecessary pain when an engine like Godot has a perfectly good, well-maintained renderer that we could use instead.

I bring that up since I think it's a good example why implementing everything in-house can be a bad idea. Yes, you might reduce your binary size a bit by not using the standard library, but you also incur a massive maintenance burden. Converting those push_back calls to emplace_back is a very low-hanging fruit that can make the code cleaner and more performant. But because you have a custom vector type, if you want in-place construction someone will need to implement it manually. And in all the other custom types too!

An even bigger issue is that it raises the barrier to entry. I looked at the Godot codebase expecting C++ STL types. Not finding those means I or anyone else now needs to learn exactly which types the engine provides and what API goes along with each. I want std::map? No, I need to use Dictionary. It just makes life harder for the maintainers and complicates things for new contributors. And really, isn't code that looks like one thing but is actually another... unreadable?

About a year ago, as we closed in on Wesnoth's 1.14 release and launch on Steam, I undertook a refactoring project to eliminate a custom container type that was essentially std::list except that erase() didn't invalidate iterators. Here's the principal commit in question. It needed further work after that to get right, but the result was code that was a lot simpler, easier to understand, and less opaque.

In conclusion, I think outlawing certain very useful C++11 features and sticking with custom types over STL ones will be a hindrance for Godot in the long term. I know refactoring things takes a long time (trust me, I know), but the way things are now it seems very likely you'll end up with a catch-22. By trying to avoid the downsides of using the STL (larger binary sizes, etc.), you'll end up making it harder and harder to switch to better-performing and cleaner code in newer C++ standards. I'm sure no one is particularly looking forward to implementing in-place construction in all the custom types. 😬

I know my opinion doesn't mean much here, but I figured I'd give my thoughts from the perspective of someone who has worked with a large C++ codebase that moved to modern C++ from C++03. It is a lot of work, but in the long term I feel it's worth it.

Excuse the massive textwall!

@Vultraz Fully agree.

While I have almost no experience with C++, after working some time with GDNative + C++ I share your view.

Just a few days ago I posted on Reddit in the "Is contributing to Godot a good way to learn C++?" thread:

I wouldn't recommend it. From my experience working with GDNative and C++, they reinvent the wheel at least several times (custom collections and pointers). These custom classes are undocumented (at least from the GDNative C++ bindings/headers POV - barring one guide, which led to a non-compiling demo project the last time I tried, there is no documentation in the code [headers, bindings] nor in the official docs) and have confusing/unintuitive behaviour (e.g. wrapping some engine class late in Ref causes random crashes, even though Ref intuitively should be transparent and thus should not change the behaviour of the class being wrapped).

I am also not a fan of preferring (IMO) less readable boilerplate over new C++ features. I like Haskell (and its libraries) and its succinctness, not pandering to beginners in a way that cripples the language for advanced users.

Because of these decisions/values I doubt I will ever contribute to the engine. (It's funny, because engine devs state the reason behind these decisions is to encourage contributions. In my case it had totally opposite effect.)

@Vultraz a well articulated write up, great read, thanks for that. It is good to hear a perspective like that.

I'm an old-style C++ programmer, and as such I haven't had many problems understanding the Godot source code; I've found it well structured and much to my liking. I think having been able to add VR support to the core of this engine relatively quickly, while having very little pre-existing experience with the code base, is a testament to how readable and understandable this engine is. It may be old school in some ways, but I'm continuously surprised by how well certain parts are built. Yes, there are parts that need modernization in order to be less memory hungry and more performant, but as a whole, I've been impressed by it.

When I hear people talking about newer syntax in C++, often I just feel old and really wonder what the fuss is all about. But I kind of get that when you learn more modern C++ and then decide to contribute to Godot, it's weird that it seems to re-invent the wheel. But for people like me at least, we're so used to seeing classes like these implemented that we're just impressed by how well most are implemented here :)

Now, all that said, there are so many different ways to look at this problem it's not funny. From the origins of Godot predating newer C++ syntax, to what the core developers feel most at home with, to limitations imposed by the cross-platform nature of Godot and the devices (some not public) it can (or could) run on. I don't feel there is a right or wrong here; it's a matter of what you're used to and where you're coming from, and of whether the extra learning curve of understanding how Godot works outweighs the benefits of refactoring a large code base.

I don't know if you've watched it but Juan held a talk at GDC that was put online: https://www.youtube.com/watch?v=C0szslgA8VY
During the Q&A towards the end he talks a bit about the decision making around this.
It's a good watch.

As for the reactions above on GDNative: come on guys, GDNative is a recent addition that fills a particular need; it's not indicative of how the core product is structured and built.
Try building a module (https://docs.godotengine.org/en/3.1/development/cpp/custom_modules_in_cpp.html); yes, that requires compiling your changes into the core product, which is the issue GDNative attempts to solve, but it gives you a much better understanding of how the engine is really structured. Many of the arguments given above are shortcomings of GDNative, not of Godot itself.
I'd love a dynamic module solution that allows me to build modules the same way we build static modules; maybe some day that will be possible, but until then GDNative suffices.

@Vultraz @mnn I can actually see the point about STL containers, but purely because some implementations (MSVC, mostly) have horrendously slow performance in debug mode. But one could simply use EA's STL and be good (theirs is fast, and portable).

Also: I personally found the lack of RAII to be most distressing. The amount of manual cleanup that is needlessly performed all over the code is weird, since RAII isn't new to C++11.

As for the reactions above on GDNative: come on guys, GDNative is a recent addition that fills a particular need; it's not indicative of how the core product is structured and built.

That's of course true, but shouldn't GDNative be more user-friendly than the engine itself, since GDNative is used to create "scripts" and is therefore targeted at an audience expected to have even lower C++ skills than those willing to sink into the internals of the engine?

Because of these decisions/values I doubt I will ever contribute to the engine. (It's funny, because engine devs state the reason behind these decisions is to encourage contributions. In my case it had totally opposite effect.)

I am a C++ noob (around two months of working two days a week in C++), so I should have been the target demographic profiting from this decision. I find Godot containers lacking in basic functionality (compared to the STL, which isn't too rich itself). I don't think containers are GDNative-related, but I might be mistaken. I am doing my best to avoid Godot containers, because they are always a pain to work with. I thought the inconsistent and unexpected behaviour of Ref was the engine's responsibility, but I guess I am wrong.

I realize my programming background is probably rather unusual - years of professional work in JS/TS, a year in Scala, smaller hobby projects in Haskell (a few thousand lines, which isn't actually that small considering how concise Haskell is - many times I write code in C++ which in Haskell would be at least 5 times shorter and more readable). I wonder if I am the only one being discouraged by the use of archaic, overly verbose tech.

@mnn, GDNative was created to enable the creation of C-based modules as dynamic libraries so you don't have to recompile the entire engine. On top of that, several language bindings were created so you can write modules in C++, Python, Rust, etc. that again do not require compiling the entire engine.

The other goal was for developers to be able to create modules where you deliver just the module itself and use it with a stable build of the engine. Many modules have become defunct because they aren't being maintained and yet are tied to a specific version of the engine.

So yeah, it has made it much easier and simpler to create modules from a compilation and deployment perspective. But due to its approach it has its limitations. When you stay within those limitations, writing C++ code in GDNative can remain simple, insofar as the subject matter of the module you're building is simple.

Try to break out of those limitations and you're in for headaches. My current headache is around trying to implement OpenGL logic while that is all encapsulated inside the visual server architecture and not really available inside GDNative in the way I need it to be. But that is more a factor of me wanting to do something GDNative was never designed for.
When you use it for what it was designed for, you can do some really cool things with it.

Note also that GDNative was meant as a way to rewrite GDScript code that needs to be more performant. As a result you can't access anything inside of the engine that GDScript doesn't have access to either (like OpenGL)

What about the latest coroutine support, i.e. co_await? From a procgen perspective, this is HUGE.

Coroutine support is part of the C++20 standard, which hasn't been finalized yet, for the record.
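
For the curious, a minimal generator built on that machinery might look like the sketch below. It needs a C++20-capable compiler, the Generator type and the procgen use case are invented purely for illustration (not anything the engine exposes), and std::generator itself only arrives later, in C++23.

#include <coroutine>
#include <cstdio>
#include <exception>

template <typename T>
struct Generator {
    struct promise_type {
        T value{};
        Generator get_return_object() {
            return Generator(std::coroutine_handle<promise_type>::from_promise(*this));
        }
        std::suspend_always initial_suspend() noexcept { return {}; }
        std::suspend_always final_suspend() noexcept { return {}; }
        std::suspend_always yield_value(T v) { value = v; return {}; }
        void return_void() {}
        void unhandled_exception() { std::terminate(); }
    };

    explicit Generator(std::coroutine_handle<promise_type> h) : handle(h) {}
    Generator(const Generator &) = delete;
    Generator(Generator &&other) noexcept : handle(other.handle) { other.handle = {}; }
    ~Generator() { if (handle) handle.destroy(); }

    bool next() { handle.resume(); return !handle.done(); }
    T current() const { return handle.promise().value; }

    std::coroutine_handle<promise_type> handle;
};

// Hypothetical procgen use: lazily yield room x-positions one at a time.
Generator<int> room_positions(int count) {
    int x = 0;
    for (int i = 0; i < count; i++) {
        x += 16; // made-up room spacing
        co_yield x;
    }
}

int main() {
    auto rooms = room_positions(4);
    while (rooms.next()) {
        std::printf("room at x = %d\n", rooms.current());
    }
}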

Allow me to chime in. I'm currently writing advanced (think Source's Hammer editor) brush/CSG tools for the 3D spatial editor, and this talk about not allowing auto is really confusing to me. Sometimes _even the use of auto can be explicit_. Consider the following:

I'm inside the SpatialEditorViewportContainer and I want to get the SpatialEditor, which is three items up the parent hierarchy (before anyone points out that there are better ways to access the SpatialEditor from the viewport container, please consider that I started looking at this codebase yesterday)

auto sp = Object::cast_to<SpatialEditor>(get_parent()->get_parent()->get_parent());

As you can see, the dynamic downcast _already explicitly states the type of sp_. Without auto, I would have to write redundant junk like:

SpatialEditor *sp = Object::cast_to<SpatialEditor>(get_parent()->get_parent()->get_parent());

Please, for the love of god, allow the use of auto!

On the topic of which C++ standard to use: the proposed reflection and metaclass features of C++20 and beyond would be really useful to reduce macro clutter like

GDCLASS(SpatialEditorViewportContainer, Container);

Also, I would like to echo that these restrictions make me really second-guess my decision to contribute at all. Around 2012 I taught myself C++11, and not being able to use this standard feels like a slap in the face. C++11 and C++03 are completely different languages, and most of C++'s reputation for being hard to learn, hard to read and hard to write is the fault of C++03 (or rather 98). Not using at least C++11 or 14 is _detrimental_ to maintainability and readability. I grew up with C++11 and the project lead evidently did not (when he began working on Godot in 2007, I was 12 years old), so I guess this is more a case of preference and baby duck syndrome. I feel like not using C++11 is just to comfort people who are used to old-school (a.k.a. terrible) C++, at the expense of people like me who took the time to learn modern C++.

As time goes on, more and more junior programmers such as myself will be raised on modern C++11 and beyond, and will find the fact that the project is forever stuck in a language that doesn't even have lambdas rather discouraging.

In short: C++11 or bust!

To be clear, I do not advocate the use of STL. Rolling your own containers is fine, but rejecting features like lambdas and auto seems asinine.

I'll close the echo chamber for now as it's pointless to discuss further here. We already discussed months ago what kind of features of C++11 and/or some later versions we plan to use.

We're just waiting on @hpvb to have time to finalize the guidelines he's writing based on our consensus, and we can then discuss some more about those guidelines once they are published. Until then, this is not constructive.
