There are recipes out there, even merged to conan-center-index, like this abseil one, that are activating the C++17 standard in the CMakeLists.txt, but it is not defined in any input.
The recipe is doing tricks such as:
    def _supports_cpp17(self):
        supported_compilers = [("gcc", "7"), ("clang", "5"), ("apple-clang", "10"), ("Visual Studio", "15.7")]
        compiler = self.settings.compiler
        version = Version(self.settings.compiler.version)
        return any(compiler == e[0] and version >= e[1] for e in supported_compilers)
to check whether the compiler provides capabilities for that C++ standard. But `cppstd` is not used there at all.
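A minimal sketch (hypothetical, not the actual conan-center recipe) of how a recipe could also take an explicit `compiler.cppstd` input into account instead of only probing the compiler version:

    from conans.errors import ConanInvalidConfiguration

    def configure(self):
        # If the profile explicitly requests a standard below C++17, fail early
        # instead of silently hardcoding the standard in the CMakeLists.txt.
        cppstd = self.settings.get_safe("compiler.cppstd")
        if cppstd and cppstd not in ("17", "gnu17", "20", "gnu20"):
            raise ConanInvalidConfiguration("This recipe requires at least C++17")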
It can be deduced that in reality different C++ standards for the same compiler version are binary compatible:
In that case, our model would be inaccurate. Maybe we should provide `package_info()` helpers to define compatibility between them, or do erasure of the input setting.

Comes from https://github.com/conan-io/conan/pull/5997/files (cc @uilianries)
We are revising our approach to Conan-ify our repositories (and found out we have been doing several things wrong, see #6176 you already linked ;)
In our opinion, the C++ standard should likely still introduce package variability by default (i.e., change the package ID). Even though it was very instructive to read the linked SO answer regarding the 3 major compilers, it might not be future-proof to remove `compiler.cppstd` completely:
It feels like Conan should map its model on the standard, not on specific implementations, so we think your model is accurate here : )
You also make the argument that some recipe might _pretend_ to use this setting (by declaring it in the `settings` attribute), but not actually forward it to the toolchain, which hardcodes a value. This is exactly what we did wrong.
Yet, the same argument might be made for most settings the implementer has to actively forward: for example, we also realized we were disregarding the `compiler.libcxx` setting in our recipe, which came back to bite us. This setting remains very useful nonetheless.
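As an illustration, a minimal sketch (hypothetical recipe, not our actual one) of actively forwarding `compiler.libcxx` when the recipe drives CMake directly and `conan_basic_setup()` is not used:

    from conans import ConanFile, CMake

    class ExampleRecipe(ConanFile):
        settings = "os", "arch", "compiler", "build_type"

        def build(self):
            cmake = CMake(self)
            # Forward the libcxx choice explicitly; with a plain CMakeLists.txt nothing
            # else will do it for us (values shown are illustrative for gcc/libstdc++).
            libcxx = self.settings.get_safe("compiler.libcxx")
            if libcxx == "libstdc++":
                cmake.definitions["CMAKE_CXX_FLAGS"] = "-D_GLIBCXX_USE_CXX11_ABI=0"
            elif libcxx == "libstdc++11":
                cmake.definitions["CMAKE_CXX_FLAGS"] = "-D_GLIBCXX_USE_CXX11_ABI=1"
            cmake.configure()
            cmake.build()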
Taking a step back, maybe this is a more general problem, and probably a tricky one.
As we understand the current model, Conan recipes define _package-variability_ via settings and options.
Through that, the Conan tool offers highly consistent builds of a project alongside its complete dependency graph.
On the opposite end, there is the build system (CMake, b2, makefiles, ...) for each repository in the dependency graph, which actually builds the binaries to produce packages.
In the middle sits the recipe: it is notably responsible for transforming each setting and option into the correct input for the build system.
It is very easy for the recipe to lie about that: novice recipe implementers (like ourselves) will not always properly forward the variability.
Conan provides a great Python `CMake` helper to ease some of that forwarding, and the `cmake` generator offers the `conan_basic_setup()` CMake macro to handle some more of it.
Yet, we are reluctant to use this macro for a few reasons [we use `cmake_paths` instead]. In our situation, we ended up with just some of the settings/options forwarded (the ones that the `CMake` Python helper maps to `CMAKE_` variables, not the ones mapped to `CONAN_` variables).
Maybe this discussion regarding the `conan_basic_setup()` approach should actually go in a separate question?
This new blog post by @christoph-cullmann is very interesting: https://cullmann.io/posts/cpp-standard-version-mix-up/
They found a binary incompatibility in Windows MSVC between C++14 and C++17. It seems our hypothesis that the C++ standard should be consistent among dependencies holds, at least for Windows MSVC.
As feedback to @Adnn, yes, we agree with your comments. This is the reason we are trying to move to a "toolchain" approach: https://github.com/conan-io/conan/pull/5919, less invasive and under control directly from Conan recipes, not inside generated CMake scripts. I think you want to track that effort, and indeed it seems `cppstd` is something that could also be managed directly in the toolchain.
The problem I found could even be worked around if LLVM would move the implementation of the alloc/dealloc functions to a .cpp file instead of having them inline in the .h including the feature check macros.
There is more incompatible stuff lurking regarding alignment, though, that is not that easy to fix, besides disabling the Zc:alignedNew stuff if you mix pre-17 with C++17 code, see: https://docs.microsoft.com/en-us/cpp/build/reference/zc-alignednew?view=vs-2019
Thanks very much for the info. Yes, this seems to indicate that the default Conan model, which assumes that binaries generated with different `cppstd` versions are in the general case incompatible, is correct. A different story is that it is not extremely pragmatic, as people are mixing `cppstd` versions without conflicts and issues in practice, and also that we cannot handle the combinatorics of different `cppstd` versions, at least while generating the binaries for ConanCenter, which are already 100+ for every build. On the other hand, there are already some libraries in ConanCenter that need C++17 activated (and not the default of the compiler), which is not a default setting (the default is the default of the compiler version), so we need to discuss what to do with these cases.
Thanks again for the info and the interesting blog post!
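For those ConanCenter libraries that need C++17 activated even when the profile does not set `cppstd`, a minimal sketch (hypothetical, not an actual ConanCenter recipe) of forcing the standard in the build while still honoring an explicit setting:

    from conans import ConanFile, CMake

    class NeedsCpp17Recipe(ConanFile):
        settings = "os", "arch", "compiler", "build_type"

        def build(self):
            cmake = CMake(self)
            # If the profile does not define compiler.cppstd, request C++17 from CMake
            # explicitly, since the compiler default may be lower.
            if not self.settings.get_safe("compiler.cppstd"):
                cmake.definitions["CMAKE_CXX_STANDARD"] = "17"
                cmake.definitions["CMAKE_CXX_STANDARD_REQUIRED"] = "ON"
            cmake.configure()
            cmake.build()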
supporting multiple C++ standards by library might be challenging. usually, each standard library feature goes into the C++ standard via its own life-cycle. typical example of such a life-cycle:

- at first it is available as a 3rd-party library (`range v3`, `fmt`, etc.), often as a boost library (`boost::shared_ptr`, `boost::filesystem`, `boost::variant`, `boost::optional`, etc.)
- then it shows up in vendors' standard libraries under non-standard names (`hash_map` vs `unordered_map`). often, it's provided as a compiler extension and requires a specific flag to activate it (for instance, `-fcoroutines-ts`)
- then it lands as an experimental implementation (`std::experimental::filesystem`, `std::experimental::optional`, etc.) or in the `std::tr1` namespace (`std::tr1::shared_ptr`, `std::tr1::tuple`, etc.)
- finally, it becomes part of the standard and is activated by the standard switch (`-std=c++17`)

given the above, a library trying to support builds for different C++ standards often needs to adapt, detecting available compiler features and using various implementations: own implementation, boost/3rd-party implementation, experimental implementation or final implementation, depending on compiler and flags.
in that way, even if the compiler itself keeps backward compatibility between various C++ language standards, you can't safely mix random libraries built for various standards. it's easy to get a situation where the implementation expected `std::string_view` but the client supplied `std::string`, or vice versa, just because the library uses `std::string_view` only if it's currently available.
that's the reason why `cppstd` should produce different packages and different package IDs by default in conan. if a specific library is guaranteed to be compatible in its interfaces, it can easily erase `cppstd` from its `package_id()` method, but it should be a careful per-library choice. also, a conan user always has the possibility to mix C++ standards between packages for his own needs (by writing `-s fmt:compiler.cppstd=17 -s gtest:compiler.cppstd=20` or something similar)
few real-life examples:
Thanks @SSE4 for this post, however it seems that while `-s fmt:compiler.cppstd` indeed works, it has the huge disadvantage that it is impossible to do this from a recipe perspective. The issue has been open since 2018:
https://github.com/conan-io/conan/issues/3000
memsharded says: Explicitly specifying things in a profile is the best way to go.
Being unable to specify this in recipes means that even transitive builds would have to specify this setting, not to mention installs.
This means that if I want to mix cppstd I cannot do this in consumer recipes, but rather in the recipes that have some restrictions on cppstd themselves. This works, and is probably what I will go with personally, so something like:
    def configure(self):
        cppstd = self.settings.get_safe("compiler.cppstd")
        # clamp anything newer than C++14 down to 14, leaving 98/gnu98 and unset values alone
        if cppstd and "98" not in str(cppstd) and int(str(cppstd).replace("gnu", "")) > 14:
            self.settings.compiler.cppstd = 14
Alternatively this can be hacked around by providing a dummy option that propagates into the setting, something like:

    options = {"cppstd": "ANY"}
    default_options = {"cppstd": None}

    ...

    def configure(self):
        if self.options.cppstd:
            self.settings.compiler.cppstd = self.options.cppstd
Oh well, I'll think about this and choose between these two, but both of them are pretty bad, as they require changing the source conanfile.py
Related to https://github.com/conan-io/conan-center-index/issues/54
Hi! During the last days I've been thinking about this issue and I've prepared a POC modifying some Conan behavior to propose an implementation to deal with `cppstd`. It is highly speculative, just trying to make it work without taking care of the actual implementation: https://github.com/jgsogo/conan/pull/36
Big disclaimer: this is something to think about, it can be very risky to assume ABI compatibility among different standards, different libraries and different vendors... Maybe it is better for Conan to provide tools and let the users do their best.
Main ideas: provide an opt-out (in `conan.conf`) and work as-if all the different standards were ABI incompatible.

Issues along the way: … `cppstd`.

The proposal:
- Use the `compatible_packages` feature to declare (automatically) that a package ID is compatible with all the package IDs obtained using different `cppstd` values (only the stable ones). This will be done by Conan under the hood, the user doesn't need to declare these compatibilities. The opt-out will prevent this automatization.
- A new exception (or a new behavior for the `ConanInvalidConfiguration`) that will be caught if raised in the `configure()` method, but it will be raised only if the `conanfile::build()` method is executed (only if the user tries to compile the recipe). This will allow a recipe that requires C++17 to raise if the profile provides `settings.compiler.cppstd=11`. But if the user just wants to consume the package, Conan will look for and find a _compatible package_ that was compiled using `cppstd=17`.
- A new CLI command (or argument) to retrieve the package ID given a profile, and all the _compatible packages_ ones. This will allow ConanCenter to iterate all the profiles for all the C++ standards and compile only those that generate incompatible package IDs. ConanCenter will compile the default for the compiler version and those incompatible.
For recipes with weird ABI regarding C++ standard, something like the following could work:
    class Recipe(ConanFile):
        settings = "os", "arch", "compiler", "build_type"
        cppstd_compatibility = [(cppstd.CPPSTD_98,),
                                (cppstd.CPPSTD_11, cppstd.CPPSTD_14, cppstd.CPPSTD_17),
                                (cppstd.CPPSTD_20,)]

        def configure(self):
            cppstd_value, _ = cppstd.get_cppstd(self)
            if cppstd_value == cppstd.CPPSTD_17:
                raise ConanInvalidCppstd("Cannot compile using c++17")  # ConanInvalidCppstd will raise in the build method
In this recipe we have:

- in the `configure()` method the user declared that this recipe doesn't compile with C++17
- `cppstd_compatibility` declares which standards are compatible among them (https://github.com/conan-io/conan/issues/6157#issuecomment-570552095):
  - `cppstd=98` is only compatible with itself
  - `11`, `14` and `17` are ABI compatible
  - `20` defines a new ABI

I would love to hear your feedback. Thanks!
some random notes on @jgsogo's feedback
A recipe requires C++17 to compile, but it can be consumed/linked by libraries compiled with a lower standard.
I don't think it's a safe use-case. if a library used e.g. `std::filesystem`, which was provided by a separate library (`-lc++fs`), and you consume such C++17 code in a regular C++11/C++14 application, you will get many undefined references (unresolved symbols).
moreover, in many cases, it might be problematic to mix, as a compiler using a lower standard may use an older standard library implementation (they are separately shipped and versioned very often on Linux distros, both libstdc++ and libc++).
probably, introducing such a use case is like opening a Pandora's box: we will have way more issues and reports around strange linker errors.
on Apple platforms, I have seen even more compatibility-breaking things! e.g. usage of `std::future` (C++11) was impossible when compiling for macOS versions earlier than 10.8. that was because the `std::future` implementation used API functions introduced only in macOS 10.8. therefore, it depended on the targeted OS version (`-mmacosx-version-min`), and that simply disallowed mixing C++11 code with arbitrary code for an earlier standard (it just didn't link). (for the record, the exact error was `error: 'future<void>' is unavailable: introduced in macOS 10.8`, it can be googled). I can easily imagine more cases like that on different platforms. it seems like the C++ standard is not just a simple switch on its own, but it may affect other compatibility aspects. I am sure the C++ standard has very strong implications on mobile platforms as well (iOS, Android).
for the other things suggested, I personally find it very complicated, I would use Occam's razor here - don't add new exceptions, compatibility modes, etc., unless really needed. from my point of view it is:

if the recipe cannot be built with the given `cppstd` value, raise `ConanInvalidConfiguration`, as early as possible. such an implementation may look like:

    def configure(self):
        if self.settings.compiler == "gcc" and self.settings.compiler.cppstd == "11":
            raise ConanInvalidConfiguration("the recipe doesn't support GCC and -std=c++11")

if the recipe is ABI compatible across different `cppstd` values, either just remove the `cppstd` setting completely (as we do for `libcxx` for pure C projects), or use the approach described at Defining Package ABI Compatibility:

    def configure(self):
        del self.settings.compiler.cppstd  # I swear I am fully ABI compatible across different C++ standards

or the second one:

    def package_id(self):
        if self.settings.compiler == "gcc" and self.settings.compiler.cppstd in ["11", "14"]:
            self.info.settings.compiler.cppstd = "11/14"
            # I swear I am fully ABI compatible for GCC -std=c++11/-std=c++14!
both cases are strongly discouraged IMO, and only for very experienced users (not a generic thing to be recommended for daily usage). for most usages we should assume by default that C++ standards aren't compatible across random recipes.
I've commented on the idea in private, here my general points.
I think that the `compatible_packages` feature should be the tool that we should try to build upon in next iterations instead. I'd very much like to have wrapping helper functions for common compatibility patterns. It'd be much less intrusive to the current Conan model, less complex, and an opt-in.
While I do understand that we'd very much like to have more friendly defaults for compatibility issues - the problem is the same as with package_id_modes for different library types. It depends on many things that Conan ideally should support and resolve natively, but it's too complex to do now.
Quick note: any ABI compatibility between C++ standards will require other settings to be the same (all the `compiler.*`).
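For illustration, a minimal sketch of the kind of wrapping helper pattern mentioned above, using the Conan 1.x `compatible_packages` API (the recipe and the `_ABI_COMPATIBLE_CPPSTD` list are hypothetical): a binary built with one of the listed `cppstd` values accepts binaries built with the others as fallbacks, and because the whole package info is cloned, every other setting stays identical, as the quick note above requires:

    from conans import ConanFile

    # standards this (hypothetical) library considers ABI compatible
    _ABI_COMPATIBLE_CPPSTD = ["11", "14", "17"]

    class ExampleRecipe(ConanFile):
        settings = "os", "arch", "compiler", "build_type"

        def package_id(self):
            current = self.settings.get_safe("compiler.cppstd")
            if current in _ABI_COMPATIBLE_CPPSTD:
                for other in _ABI_COMPATIBLE_CPPSTD:
                    if other == current:
                        continue
                    compatible = self.info.clone()
                    # only cppstd changes; all other settings/options are kept as-is
                    compatible.settings.compiler.cppstd = other
                    self.compatible_packages.append(compatible)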
I've made a tool solution to the `compatible_packages` problem: https://github.com/conan-io/conan/blob/e642aef902b649b6e757b44c99bf7ba934032d01/conans/client/tools/settings.py#L271
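Related to that, Conan 1.x also ships a `tools.check_min_cppstd` helper (presumably what the linked code refers to; if not, treat this as a generic sketch) covering the common "this recipe needs at least standard X" validation:

    from conans import ConanFile, tools

    class Cpp17Recipe(ConanFile):
        settings = "os", "arch", "compiler", "build_type"

        def configure(self):
            # Raises ConanInvalidConfiguration if compiler.cppstd (or, when unset,
            # the default standard of the configured compiler) is below C++17.
            tools.check_min_cppstd(self, "17")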
at the moment, we are going to try to generate more binaries for different cppstd values in ConanCenter, while leaving the current default model in Conan, which is one binary per cppstd, which is the most robust and safe. Not doing anything now for Conan 1.26, removing from milestone.