Conan: Questions about development workflow

Created on 18 Mar 2019 · 10 comments · Source: conan-io/conan

Hi,

I have some questions about how to implement our development workflow on top of Conan. Here is what we have currently:

  • A large number of so-called 'bundles': separate components that each compile to a static library. They form a dependency chain at the header level: B depends on A, C depends on B and A, D depends on C, and so on.
  • Bundles use CMake to build and cross-compile to 10 different platforms, including RTOS targets.
  • Currently, 'dependency management' is done at the source-code level using SVN externals, which is of course far from ideal.

I'd like to use Conan to build packages for each of the bundles. The conanfile.py should be committed with the sources, i.e. live in the same repository. However, the header file organization in the current bundle projects is not standard-conformant and I cannot change it (for compatibility reasons), so the preferred exports_sources approach does not work for me. The technical reason is that headers are always located in the include/ folder directly in the repo root, but include statements in the code start with bundleName/include/ (reusing the working-copy folder name, which I cannot fake with the exports_sources method)... Currently I'm using the scm attribute and set its subfolder property to get that stupid extra folder into place:

scm = {
    "type": "svn",
    "url": "auto",            # captured from the local working copy at export time
    "revision": "auto",       # captured from the working-copy revision at export time
    "subfolder": bundleName,  # check the sources out into an extra bundleName/ folder
}
no_copy_source = True

In order to make local Conan builds and builds from the exported recipe work at the same time, I needed the following workaround in the build() method:

# Local build ("conan build" in the working copy): the working copy itself is
# the bundleName folder, so the sources are directly below it.
if self.source_folder.endswith("/" + bundleName):
    real_source_folder = "src"
else:
    # Cache build from the exported recipe: the scm subfolder puts the
    # sources one level down, under bundleName/.
    real_source_folder = bundleName + "/src"

cmake.configure(source_folder=real_source_folder)

Now the questions:

  1. Is there a way to avoid this workaround and/or make exports_sources work for my non-standard project layout? If I could specify the layout used when unpacking the sources exported with exports_sources, I could insert the extra bundleName/ folder to fix our includes.
  2. How should local development work in this scenario? It is often necessary to modify sources from multiple bundles at the same time because of the header-level interdependencies. My preferred workflow would be something like this:

    • Check out/clone the application repo, which contains a conanfile.txt with dependencies on a number of bundles (application-developer case), OR check out/clone a bundle repo, which contains a conanfile.py with dependencies on other bundles (bundle-developer case)

    • Execute a Conan command that sets up a local development environment, including the sources of a selected set of dependencies, so that the whole project can be compiled without going through the packaging workflow during development

  3. Which CMake generator approach would be best for that workflow? I would say cmake_find_package together with cmake_paths, so that I don't have to inject the module paths manually.
  4. I don't get the versioning concept in Conan. It confuses me even more when I think about how it interacts with our packaging workflow. Usually we would like to export packages only for bundle releases, so we'd use a reference like bundle/1.2.3.4@company/releases. But on the other hand, we have cases where a specific set of bundle revisions needs to be built over and over again, e.g. if two developers are working on the same base revision of bundle B, which depends on a specific revision of bundle A that never changes. Developer 2 could then reuse the binary package for bundle A that developer 1 already built. What version number/user/channel would developer 1 export this package with? Maybe something like bundle/development-rev12345@company/testing?
  5. How would you go about switching from development packages to release packages then? The channel would have to be adapted for the packages themselves and all of their dependencies, e.g. I'd have to replace every company/testing in the requirements with company/releases (one possible way to handle this is sketched right after this list).
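To make question 5 concrete, this is roughly what I imagine (a rough, untested sketch; the bundle names, versions and the environment variable are made up): resolve the channel in the requirements() method instead of hard-coding it in every require, so the same recipe can consume company/testing during development and company/releases when releasing:

import os

from conans import ConanFile


class BundleBConan(ConanFile):
    name = "bundleB"
    version = "1.2.3.4"
    settings = "os", "compiler", "build_type", "arch"

    def requirements(self):
        # Hypothetical: pick the channel once (e.g. from an environment variable)
        # instead of hard-coding company/testing or company/releases per require.
        channel = os.getenv("BUNDLE_CHANNEL", "testing")
        self.requires("bundleA/2.0.1@company/%s" % channel)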

That was a lot of text and many questions. I'm sure that all of this can be done with Conan; probably I'm just too stupid to extract the required information from the docs.

Best regards,
Johannes

question


All 10 comments

  1. Could self.copy("*.h", dst="include/bundle_name/include", src="bundle_name/include") work? (inside the package() method)
    EDIT: Never mind, you need this done before the build.
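Building on that thought, one direction might be to recreate the missing folder from the exported sources before the build, e.g. in source() (a rough, untested sketch; it assumes the exports_sources contents are already present in the source folder when source() runs, and bundleName is a placeholder for the actual working-copy name):

import os
import shutil

from conans import ConanFile

bundleName = "bundleA"  # placeholder for the actual working-copy/bundle name


class BundleConan(ConanFile):
    name = bundleName
    version = "1.2.3.4"
    exports_sources = "include/*", "src/*", "CMakeLists.txt"
    no_copy_source = True

    def source(self):
        # exports_sources places include/ at the root of the source folder;
        # copy it under bundleName/ so '#include "bundleA/include/x.h"' resolves.
        dst = os.path.join(bundleName, "include")
        if not os.path.isdir(dst):
            shutil.copytree("include", dst)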

Regarding question 3: we are now using cmake_find_package only. Adding cmake_paths on top does not help, because conan_paths.cmake then has to be injected instead of setting the module path, which is not much of an improvement. If I understood correctly, we could also switch to cmake_paths exclusively once our component CMakeLists.txt files export their targets properly using install(EXPORT ...).
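For context, a minimal consumer sketch of what this setup looks like (the package reference is a placeholder, and the comments reflect my understanding of the Conan 1.x generators):

from conans import ConanFile


class AppConan(ConanFile):
    name = "app"
    settings = "os", "compiler", "build_type", "arch"
    requires = "bundleC/1.2.3.4@company/releases"  # placeholder reference
    # cmake_find_package generates Find<package>.cmake modules, so the build only
    # needs CMAKE_MODULE_PATH to point at the folder where they are generated.
    generators = "cmake_find_package"
    # Adding "cmake_paths" would generate conan_paths.cmake (which sets
    # CMAKE_MODULE_PATH and CMAKE_PREFIX_PATH), but that file still has to be
    # injected, e.g. via -DCMAKE_TOOLCHAIN_FILE=conan_paths.cmake.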

Hi, sorry for the late response.
Many questions and different aspects of your issue!

Hope it helps!

Thank you for your response!

The workspace feature looks promising; however, I find it quite awkward that I have to create a workspace definition file. I would expect the workspace to already be defined sufficiently by the requirements of my top-level/consumer package. Package references can be deduced from the requirements definition (transitively?), and local paths could be chosen according to the package names. So why not provide a default that can be overridden by an optional workspace definition?

I don't understand what the layout is for. It looks like it's just duplicating information that is already available somewhere else because without that information Conan would be unable to build the packages standalone in the first place...?

I would expect that the workspace is already defined sufficiently by the requirements of my top-level/consumer package

Well, you need to specify where the code to work with is, for every dependency you want to develop. That's also why you need to specify a layout: you don't know in advance where the build/bin/include folders are when you develop the package locally.
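For reference, a workspace definition looks roughly like this (simplified sketch; the package references and paths are placeholders, and layout_cmake stands for a separate .ini-style layout file that describes where the include/build/lib folders live inside each local checkout):

# conanws.yml (sketch)
editables:
    bundleA/1.2.3.4@company/testing:
        path: bundleA
    bundleB/1.2.3.4@company/testing:
        path: bundleB
layout: layout_cmake
workspace_generator: cmake
root: bundleB/1.2.3.4@company/testing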

That's true, but the dependency recipes know how to retrieve the source repo and can be built independently, so why not check out the dependencies into subfolders of the consumer directory tree in that case and build them locally? This should work as long as the packages have source() methods that check out some repository. If the recipes live in the same repo as the sources and use exports_sources, you have no way of obtaining a working copy that you can commit back to...

As a sidenote: I think that Conan needs a mechanism like the snapshot dependencies in Maven. That way you could have your dependencies set to 'development versions' and make Conan warn during the release phase if a recipe is exported that still references snapshot versions in its requirements. I don't think that the revisions feature mentioned earlier is sufficient for that use-case.

Yes, that's an idea we have had in mind, but it's not there yet. Something like a "conan clone" of a dependency.

+1 for the conan clone feature. What about the snapshot mechanism I mentioned in my last post? Does it make sense to have something like that built in, or is there a way to implement it with what's already there?

Can't the channel be used for that?
Also on our roadmap is a lockfile mechanism, so that you get exactly the same packages when you use it.
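For illustration, a channel-based check could perhaps be wired into a custom pre-export hook (a very rough sketch; the hook signature follows the Conan 1.x hooks docs, but the channel string, the way the declared requires are read, and the file name are all assumptions to verify):

# hooks/snapshot_check.py (sketch): warn when an exported recipe still
# requires packages from a development ("snapshot-like") channel.

def pre_export(output, conanfile, conanfile_path, reference, **kwargs):
    declared = getattr(conanfile, "requires", None) or ()
    if isinstance(declared, str):
        declared = (declared,)
    for req in declared:
        if "@company/testing" in str(req):
            output.warn("'%s' still points at a development channel" % req)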

I think we could implement it on top of the channels. I'd still prefer a built-in workflow because it's a common problem and there should be a standard pattern for it. The lockfile mechanism you mentioned looks interesting. Looking forward to seeing it in action.

I have enough information now to continue, so I'll close this issue. Thank you for taking the time to explain these things to me, and keep up the good work!
