I researched the current situation with finding shared library dependencies in Conan (as of version 0.22.3) a bit, and found that there are some major inconsistencies and areas where the experience could probably be improved.
This is a summary of the current situation, as I understand it (please correct me if something is wrong):
- Dependency link information (e.g. `B.cpp_info.libs`) is used even when linking/running temporary programs in the configure phase, so failure to find L at run time simply prevents A from being built at all.
- `import()` can be an alternative solution, except that it does not work on Linux by default (the dynamic linker searches ONLY system-wide paths unless told otherwise) and is half-broken on OS X (#1238). It also seems to be a non-optimal solution (#776), except for deploying self-contained final applications. It is probably also the only way to go on Windows.
- Any dependency information must be propagated out-of-band due to format limitations (so `import()` does not help here either). Legacy libtool archives (`*.la`) are the related mechanism for Autotools. This could likely be handled by storing their dependency info in autogenerated pkg-config files (#334), including proper RPATH ldflags.
- Relative RPATH entries exist for exactly this purpose: `$ORIGIN` on Linux/Solaris, `@loader_path` on Mac OS X (see #337).

So based on these observations, I suggest the following behavior:
- When a library lacks a relative RPATH, it could be patched using the `chrpath` or `patchelf` utilities (either by the package itself, or automatically).
- On Windows, `import()` is the only way that should work, as far as I understand. Maybe it can somehow be used by default there?

With these changes, depending on shared libraries should work seamlessly in many cases, and the goal of supporting both reusable and relocatable binary packages should not be compromised.
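The relocatable-RPATH idea boils down to computing the path from a binary's directory to the library directory and prefixing it with the platform token (`$ORIGIN` on Linux/Solaris, `@loader_path` on OS X). A minimal sketch of that computation; the helper name and signature are mine, not an existing Conan API:

```python
import os

def relative_rpath(bin_dir, lib_dir, system="Linux"):
    """Build a relocatable RPATH entry from a binary's directory to a
    library directory, e.g. ('/pkg/bin', '/pkg/lib') -> '$ORIGIN/../lib'."""
    # $ORIGIN is understood by the Linux/Solaris dynamic linker;
    # OS X uses @loader_path instead.
    token = "@loader_path" if system == "Darwin" else "$ORIGIN"
    rel = os.path.relpath(lib_dir, bin_dir)
    return token if rel == "." else os.path.join(token, rel)
```

The result would be passed to the linker as e.g. `-Wl,-rpath,$ORIGIN/../lib`.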
Hi @himikof
Your analysis is good. In the CMake case, we set `set(CMAKE_SKIP_RPATH 1)` by default on OSX, so CMake doesn't mess with rpaths. On Linux the problem still exists, but the difference from OSX is that Linux doesn't crash if the library is not found in a declared rpath; it just continues searching in the specified paths.
We (as package creators, in our recipes) also patch the `install_name` parameter in autotools script files to remove the absolute paths.
About the tools to alter the rpaths: yes, that is definitely a possibility, but we have not wanted to mess with external tools and rpaths. Maybe we could study whether we can provide some Python helper to alter the rpaths.
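Such a helper could be a thin wrapper over `patchelf`. A sketch under those assumptions (`patchelf` installed and on PATH; the function names are hypothetical, not an existing Conan API):

```python
import subprocess

def rpath_command(binary_path, rpath):
    # Build the patchelf invocation that rewrites the RPATH in place.
    # chrpath could be substituted, but it cannot insert an RPATH longer
    # than the existing one, so patchelf is the safer default here.
    return ["patchelf", "--set-rpath", rpath, binary_path]

def set_rpath(binary_path, rpath):
    # Hypothetical helper entry point: requires patchelf on PATH.
    subprocess.check_call(rpath_command(binary_path, rpath))
```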
In general, we don't force any single strategy; you can package your libraries using relative rpaths if that fits better for you. In our experience there is no single/perfect approach, so we try to keep it as simple as possible and let the user take the decision. But of course, we can think about tools, or document different alternatives. It's certainly a pending task ;).
So thank you for your comments, they will help.
Another possible solution that I thought of is to make it easy to create symlinks/hardlinks when doing `import()` instead of copying, and then `import()` all the dependencies everywhere (and use trivial relative RPATHs, like `$ORIGIN` or `$ORIGIN/../lib` for binaries). Maybe packages could declare some of their files to be 'auto-imported'.
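A minimal sketch of the symlink-based import idea, using plain `os` calls rather than any existing Conan API (the function name and the extension filter are mine):

```python
import os

def import_as_symlinks(src_dir, dst_dir, patterns=(".so", ".dylib")):
    # Instead of copying shared libraries out of the package cache,
    # create symlinks in dst_dir pointing back at them, so binaries
    # with an $ORIGIN-relative RPATH can find them without duplicating
    # the files on disk.
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        if any(ext in name for ext in patterns):
            link = os.path.join(dst_dir, name)
            if os.path.lexists(link):
                os.remove(link)  # refresh a stale link from an earlier import
            os.symlink(os.path.join(src_dir, name), link)
```

Hardlinks would avoid the dangling-link problem when the cache moves, at the cost of not following package upgrades automatically.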
Also, regarding python helpers to alter the rpaths: there are at least https://github.com/rmcgibbo/pypatchelf and https://github.com/jimporter/patchelf-wrapper to help with the distribution/dependency.