Dear Conan community,
I think this has come up a few times; it is about the development of packages vs. the deployment of them, and the fact that there is only one conanfile.py which serves both purposes.
This especially applies to recipes that have their conanfile in source.
Consider I want to develop a library.
When I develop my package, I need or want to build a whole lot of stuff: the library itself, tests, documentation, samples, and so on.
When I deploy my package, I just want to build the library itself.
Needless to say, when I develop the library I almost certainly have additional requirements, such as GTest, Doxygen, ...
Options I have considered to model this in the conanfile:
- Add an option (e.g. build_development = ON / OFF) and set requirements / CMake options based on that. Downside: it misuses what options are for.
- Introduce a new concept next to requires and build_requires, such as development_requires. conan install would have to gain a development option and then install all additional development requirements. Development requirements would only be considered for conan install, but not for build or anything else.
How do you tackle this problem in your development workflow?
During development, do you get your dependencies also with Conan?
I'm not sure; personally, I think that options are a fine approach, kinda.
But you can try out the in_local_cache attribute: https://docs.conan.io/en/latest/reference/conanfile/attributes.html#in-local-cache
Same for build_requires: you can add additional ones based on the in_local_cache flag in the build_requirements() method (https://docs.conan.io/en/latest/reference/conanfile/methods.html#build-requirements).
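For illustration, a minimal sketch of what that could look like (package names and versions are placeholders, not recommendations):

from conans import ConanFile

class MyLibConan(ConanFile):
    name = "mylib"
    version = "0.1"
    settings = "os", "compiler", "build_type", "arch"

    def build_requirements(self):
        # always needed to build the library itself
        self.build_requires("cmake/3.16.3")
        if not self.in_local_cache:
            # recipe consumed from a local folder -> development flow,
            # so pull in the test/doc tooling as well
            self.build_requires("gtest/1.10.0")
            self.build_requires("doxygen/1.8.17")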
I think that the approach with build_requires and options should be enough. IMO, there are three different scenarios for Conan packages:
1. Package creation: you want to build the library itself, build and run tests, documentation, samples, ... You can build them all (think about the local development flow, where you would want to run the examples) and choose what to include in the package: probably just the library and the docs.
If you like, you can use options to speed up your CI by not compiling the samples.
Everything needed to build the tests and docs should be a build_requires.
2. Consuming packages: the package is already created and binaries are available; the consumer should just build the graph taking into account _public_ dependencies and download the binaries/docs for that configuration. build_requires are not included in the graph if the binary is available, so everything related to building tests or docs won't exist from the POV of the consumer.
3. Deployment: here you will choose, from the packages already available, what to include in the final bundle: libraries, license files, some docs (probably not all of them). Conan doesn't provide a built-in feature to generate this bundle, but you can build a script on top of the deploy generator (it retrieves the content of all the packages in your graph) to select the proper artifacts.
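As a rough illustration of such a script (the kept folders are my assumption, not an official feature): after conan install <ref> -g deploy, every package of the graph is copied into the current folder, one subfolder per package, and a small script can cherry-pick the artifacts for the bundle:

import os
import shutil

BUNDLE = "bundle"
KEEP = ("lib", "bin", "licenses")  # assumption: what the final bundle should contain

os.makedirs(BUNDLE, exist_ok=True)
for pkg in os.listdir("."):
    if pkg == BUNDLE or not os.path.isdir(pkg):
        continue
    for sub in KEEP:
        src = os.path.join(pkg, sub)
        if os.path.isdir(src):
            # copytree also creates the intermediate destination directories
            shutil.copytree(src, os.path.join(BUNDLE, pkg, sub))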
Thanks for the elaborate answer, but for me there are not three but four different scenarios, because development and package creation are indeed different scenarios.
Development: the only contact of the developer with Conan in this stage should be a "conan install" to retrieve all the packages necessary. From there on, the developer will work in their IDE of choice. All files (sources / build files) should not be in the Conan cache directory but in their own workspace.
It does have build_requires (such as CMake) and additional requirements, such as maybe GTest or Doxygen.
CI will also build like this to ensure everything compiles directly.
Package creation: in general this is not done by the developer but by the CI. It builds the package and deploys it for a given release so that other users can depend on it.
It still has build_requires (such as CMake), but none of the development_requires.
For me it's quite important that development and package creation are indeed different steps.
Let's take the test/sample example: if I add build_tests and build_samples as options, it doesn't make my library any different from a consumer's point of view. But building with/without tests generates different package hashes (unless, of course, I remove these options from computing the hash).
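For illustration, removing the options from the hash could look like this (a sketch using the option names from the example above):

from conans import ConanFile

class MyLibConan(ConanFile):
    name = "mylib"
    version = "0.1"
    options = {"build_tests": [True, False], "build_samples": [True, False]}
    default_options = {"build_tests": False, "build_samples": False}

    def package_id(self):
        # tests and samples don't change the packaged library,
        # so all option values map to the same package-id
        del self.info.options.build_tests
        del self.info.options.build_samples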
The in_local_cache flag might indeed help out here.
When I manually call conan install from source code checked out somewhere, does it evaluate to False?
And when I do a conan create from source code checked out somewhere, does it evaluate to True?
In which functions is this variable available?
@jgsogo:
Is it not true that there is a fuzzy boundary between the Package creation and Consuming packages steps? You mention there are binaries available in the Consuming packages step, but this is not guaranteed to be the case: when a rebuild is triggered for whatever reason, is it always necessary to build the docs, tests, samples, etc.?
Hi!
@KerstinKeller, I totally agree with you that development of the library and creation of the Conan package are two different steps done by different people in a company: the first one is done by developers, while the second one is done by build engineers (who write the recipe, CI scripts, ...) and is usually run by the CI (the actual creation of the binaries and upload to the server).
We try to make Conan usage as transparent as possible to the developers; that's the reason we introduced the local workflow, editable packages and workspaces, and we keep trying to reinvent them all because we think that they are not yet good enough for developers. Any feedback related to this will be more than welcome.
I'll share the idea of adding development_requires with the team, but right now the best option is to go with build_requires and play with the options as you commented before. The attribute in_local_cache might be useful too, to differentiate between a recipe being in a local folder (only local commands) and in the Conan cache (every conan create ... or conan install <ref>).
@acmeijer
Every library is consumed from the Conan cache (unless you are working with _editable packages_ or _workspaces_), so the binaries will always be there: sometimes you will download them to the cache from the server, sometimes you will build them locally (this process takes place in the cache) and copy them to the proper cache folder. That's what I mean when I say the _"binaries are always available"_.
Of course, if the binaries are not available in the cache, you should build them in order to be able to consume the libraries, but usually the CI should provide all the package-ids that should be consumed by the developers in the company... unless you are developing the library itself, in which case you don't want to consume the package but work with the sources locally (to help with this we try to provide _local workflow commands_ and _workspaces_, although the current approach can probably be improved).
@jgsogo: thanks for your comment.
Regarding your point that CI should usually provide all package-ids that should be consumed: I totally agree, but it seems difficult to know in advance what these could be. Especially in a case where there are many dependencies, the number of combinations quickly explodes (different dependency versions, build types, options, compilers, etc.).
For example, if you are using version ranges, an update to a lower-level library needs to trigger a rebuild of all libraries that depend on it. Is there a mechanism to figure out which those are? (We are currently using nightly builds for this, but it's not a great compromise.)
Another example is the conan-opencv package, which has 25 options. Even just those options alone lead to at least 2^25 different package-ids, which is completely infeasible for the CI to build.
Not sure if there is a good/easy answer to this type of problem. If CI does not build the package, you could argue that it is not an "official" version (i.e. built in a controlled environment, with all tests, docs, etc. created). However, it also seems infeasible to me for CI to build all possible package-ids. Maybe some kind of mechanism for requesting CI to build a package would be the answer? I.e. a "conan upload" which does not upload but could notify/query Artifactory about a missing package-id, and which Artifactory in turn requests from CI... (Disclaimer: I have no idea if anything like that is remotely possible.)
@KerstinKeller: sorry, this is a little off-topic and I don't want to hijack your issue, if the discussion continues I will create a new issue instead. I placed my initial comment as it was supporting your issue.
So I think, with the local conan commands in combination with the in_local_cache variable, we can have a proper workflow for now.
E.g. the developers can prepare their environment with
mkdir _build && cd _build
conan install -pr=XXXX ..
conan build --configure ..
Then they have a usable environment in which they can start developing away.
CI can also test development builds, not only package builds.
I have not yet tested what implications this has on runtime / debugging on Windows, e.g. whether all DLLs will be found, but I will experiment with this.
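One thing I plan to try for that (assuming the standard generators apply here) is the virtualrunenv generator, which creates activate_run scripts that put the dependencies' runtime folders on the path:

conan install .. -g virtualrunenv
activate_run.bat

(On Windows, activate_run.bat prepends the dependencies' bin directories to PATH, so their DLLs should be found at runtime.)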
However, I am not sure if using the in_local_cache variable will fail us at some point. Maybe we do want to test the actual package creation out of source?
It doesn't have to be development_requires, but some indication (maybe injected from the outside) that this is a development build, not a package build / install, could be helpful in the future.
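Just to sketch what I mean by "injected from the outside" (the variable name is made up):

import os
from conans import ConanFile

class MyLibConan(ConanFile):
    name = "mylib"
    version = "0.1"

    def build_requirements(self):
        # development mode is toggled externally instead of being modeled
        # as an option, so it never influences the package-id
        if os.environ.get("MYLIB_DEVELOPMENT") == "1":
            self.build_requires("gtest/1.10.0")
            self.build_requires("doxygen/1.8.17")

A developer would then run MYLIB_DEVELOPMENT=1 conan install .. while CI runs a plain conan create.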
How do other package managers handle this (Java, Python, ...)? Or do they not handle it at all?
Python developers usually have two different files in their repo, requirements.txt and requirements_dev.txt, that install a different set of dependencies. Sometimes the setup.py script (the one used to install/build the package) has _different flavors_, so, according to some options, it will install some additional requirements and enable some features. Other languages have a standardized ecosystem, so they don't have to deal with compilers, CMake, ... (I think this is the case for Go or Rust). And Java might be the closest to C++: it's more standardized, but there are alternatives. Anyway, I don't have enough experience with it to have/give an opinion 😞
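For illustration, the typical layout (package names and versions are just examples):

# requirements.txt - what the package needs at runtime
requests==2.22.0

# requirements_dev.txt - everything above, plus development tooling
-r requirements.txt
pytest==5.3.0
sphinx==2.2.0

Developers run pip install -r requirements_dev.txt, while deployment only needs pip install -r requirements.txt.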
We are trying to solve the developers' pain, take it for granted, but it is hard to find a solution that fits _every_ use-case. We keep iterating and learning, and I'm sure that we will be able to improve the solution that we actually have.
@acmeijer, the combinatorial explosion is real and intractable, but IMO in a company you usually have a fixed set of dependencies, options are _hardcoded_, the compiler version is fixed, flags, ... so you only generate a few binaries out of all the possible ones. At least that's my experience from working in big companies: developers are not free to choose dependencies, options, platforms or compilers; those are fixed, and it usually requires some bureaucracy to change any of them.
About version ranges, dependencies being updated and package-ids being affected by them (there are several package-id modes): you might find the usage of _lockfiles_ interesting; they can help you keep version ranges (and package revisions) under control.
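For example (Conan 1.x lockfile commands; the exact flags depend on your client version):

conan lock create conanfile.py --lockfile-out=conan.lock
conan install . --lockfile=conan.lock
conan lock build-order conan.lock --json=build_order.json

The last command computes which dependent packages have to be rebuilt, and in which order, which also relates to the "what to rebuild" question above.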
> We are trying to solve the developers' pain, take it for granted, but it is hard to find a solution that fits _every_ use-case. We keep iterating and learning, and I'm sure that we will be able to improve the solution that we actually have.
And many, many thanks for that!
The requirements.txt vs. requirements_dev.txt files already show that we're not alone :wink:
If you Conan guys keep this issue in mind and address it in the future, I will be more than happy with whichever solution you think is best. In the meanwhile, I will stick with in_local_cache.
Thanks as always for fast and constructive feedback!
@acmeijer Yeah, maybe we should move your issue to another thread, because it's not completely related. But as @jgsogo mentioned, in our corporate environment it's the same: we will "offer" the open-source libraries (and also our in-house ones) in a certain set of flavors. E.g. for OpenCV, we won't have the 25 options that are specified in the conan-opencv recipe; we will decide on one configuration which the developers will be using. These are offered for a standard set of compilers, and maybe there is a shared option. But often we set that one as fixed, too.
On the upside, it greatly simplifies our recipes. The developers can't specify other options any more, and that makes our lives much easier.
How and what to rebuild if we had previously deactivated some features which turn out to be essential for some developers, we have yet to figure out.