Conan: [question] How to handle quality artefacts?

Created on 7 Mar 2019 · 6 comments · Source: conan-io/conan

Short:
When creating Conan packages for your own components, do you have best practices for how to handle (store, distribute, trace) quality artefacts, like test reports or generated Doxygen pages?

More details:
I want to release my C components via Conan packages. I see how I can achieve traceability between a Conan package and the sources in git.
But I also have to ship Quality Artefacts like Test Reports or Doxygen HTML Pages together with my release.

So my initial idea was:
Wouldn't it be cool to add these artefacts to the Conan package, similar to the pre-compiled binaries? But just using the package method does not 'feel' right (mixing up compiler artifacts with quality artifacts).
What do you think about this? Or do you even have better ideas or approaches?

I could not find information about this topic in the docs for Conan 1.12.

To help us debug your issue please explain:

  • [x] I've read the CONTRIBUTING guide.
  • [x] I've specified the Conan version, operating system version and any tool that can be relevant.
  • [x] I've explained the steps to reproduce the error or the motivation/use case of the question/suggestion.
Labels: Feedback please, triaging


All 6 comments

Hi @mjbayer!

I had the same situation some time ago. But in my case I had GitLab, where I could post all results as CI artifacts.

I don't think it's a good idea to publish your artifacts with the packages, since they are just quality results and your users are more interested in libraries and headers. But you can use hooks to upload them after packaging.

If you have a website where you need to upload the artifacts, you could use the post_upload event and add some lines to upload your results.
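A minimal sketch of that idea, assuming the Conan 1.x hook signature for post_upload; REPORTS_URL and the report.html location are placeholder assumptions for illustration, not Conan or Artifactory defaults (check the hooks docs for your Conan version):

```python
# hooks/upload_reports.py -- sketch only; REPORTS_URL and report.html
# are hypothetical names, not real defaults.
import os

REPORTS_URL = "https://artifactory.example.com/artifactory/reports-generic"

def report_destination(reference):
    # reference behaves like Conan's ConanFileReference (has .name, .version)
    return "{}/{}/{}/report.html".format(REPORTS_URL, reference.name,
                                         reference.version)

def post_upload(output, conanfile_path, reference, remote, **kwargs):
    """Conan 1.x hook: runs after `conan upload` finishes for a recipe."""
    report = os.path.join(os.path.dirname(conanfile_path), "report.html")
    if os.path.exists(report):
        # e.g. push it with requests.put(report_destination(reference), ...)
        output.info("Report would go to %s" % report_destination(reference))
```

Activating the hook is then a matter of dropping the file into the hooks folder and enabling it in conan.conf.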

Regards!

Hi @uilianries
Thanks for the fast feedback! 👍

In my case the packages would be distributed just within one company (via our artifactory servers). For this we have some (automotive) requirements to fulfill.
You're right, many users (e.g. developers, integrators) might not be interested in these artefacts, while other users (e.g. QA) insist on having them. So I thought having them directly in the pre-compiled "packages" is not a good idea, because then the developers get the test reports and the QA guy gets the library files :D
With Hooks we would have this issue, right?

On the other hand, having them inside the package _context_ has a big advantage: Tests often run just for specific combinations of Conan settings/options ("variants") - we should make it transparent for which variant the tests run.

Other reasons for not putting it into git:

  • The reports and documentation are often in a non-text/binary format. Putting them on Artifactory feels right.
  • By checking the test reports/docs into git, you create a new commit or alter an existing one. But we have to release exactly the commit that we tested.

One idea could be:

  • For the component recipes, add optional package_docs and package_reports methods. These artefacts go inside the package folder in the Conan cache / Artifactory as conan_docs.tgz and conan_reports.tgz
  • The package structure in the Conan cache / artifactory could look like this:
owner/component/version/channel/package/hash-sum1/
                                            conan_package.tgz
                                            conaninfo.txt
                                            conanmanifest.txt
                                            conan_docs.tgz           <- optional
                                            conan_reports.tgz        <- optional
  • When running conan install, everything should behave as it does right now (this command is usually used in the build process; you don't want the docs/reports there)
  • Add additional options to the conan download -p command: --docs and --reports, to retrieve the docs and reports as well (and maybe also a --skip-package option, for people who are not interested in the pre-compiled binaries)

What do you think?

In my case the packages would be distributed just within one company (via our artifactory servers). For this we have some (automotive) requirements to fulfill.

So you are not able to expose the results on a website, are you? You really need to distribute them as a standalone product.

For the component recipes, add an optional package_docs and package_reports method. These artefacts go inside the package folder in the conan cache / artifactory as conan_docs.tgz and conan_reports.tgz

I don't recommend changing anything in the cache. All Conan commands are prepared to handle specific files. What you can do is create a generic repository and upload those artifacts there. It could work now, but you can't be sure about the next Conan version: it could break on the next release, and we probably won't fix it because it's a workaround on the side.

If sharing the reports is mandatory, you can use the package method in the recipe to include all the files, or even add a hook to keep the recipe separate. Also, any QA engineer or developer will be able to add a hook to download all those artifacts when installing a package.

Add additional option to conan download -p command: --docs and --reports, to retrieve the docs and reports as well (maybe also add a --skip-package option, for the people who are not interested in the pre-compiled binaries)

Since your case is very specific, I don't think adding new options is a good idea.

@memsharded any ideas how to handle report files (test results, docs) with a package, without changing the package_id?

I would talk about _assets_: having a conan_assets.tgz file that doesn't affect the package_id and is built from an export_assets member in the conanfile.py... This is just a suggestion; I'm not sure about it, it scares me a little bit (new file, new commands, new integrity checks, people mixing the content of both zipped files, ...)

I would explore the _hooks_ approach along with the Artifactory JFrog CLI. This is a more flexible one, as you can have different hooks activated for developers and the QA team; each team will upload/download the files it is interested in:

  1. post_package (maybe pre_upload) hook: generate quality artifacts, test reports, Doxygen, ... If any of these processes fails, you can block the upload of the package itself.
  2. post_upload hook: now you can gather all the files and artifacts, compress them and upload them to Artifactory using the JFrog CLI.
  3. post_download hook: your QA team will have this hook activated; it will retrieve from Artifactory the file you uploaded in the previous hook and make it available for them to consume.

I think an implementation through hooks could be more flexible and satisfy what you are trying to achieve. What do you think about it?
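The three steps above could be sketched as a single hook file. The function names and signatures follow the Conan 1.x hooks documentation (verify them against your version); the shell commands in the comments are placeholders, as the Doxyfile, archive name and `generic-local` repo are assumptions:

```python
# hooks/quality_assets.py -- sketch of the three-step pipeline; the
# commented-out commands are placeholders, not a working integration.
def post_package(output, conanfile, conanfile_path, **kwargs):
    # Step 1: generate quality artifacts; raising an exception here
    # blocks the flow, so a failed report generation stops the upload.
    output.info("Generating test reports and Doxygen docs")
    # subprocess.check_call(["doxygen", "Doxyfile"])

def post_upload(output, conanfile_path, reference, remote, **kwargs):
    # Step 2: gather the generated files and push them with the JFrog CLI.
    output.info("Uploading assets for %s" % str(reference))
    # subprocess.check_call(["jfrog", "rt", "u", "assets.tgz", "generic-local/..."])

def post_download(output, conanfile_path, reference, remote, **kwargs):
    # Step 3: only QA machines activate this hook; it pulls the assets back.
    output.info("Downloading assets for %s" % str(reference))
    # subprocess.check_call(["jfrog", "rt", "dl", "generic-local/...", "assets/"])
```

Developers and QA would enable different subsets of these hooks in their conan.conf, which is what makes the approach flexible.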

Thanks again for the very fast feedback, I really like how you react and how the project evolves 😍

So you are not able to expose the results on a website, are you? You really need to distribute them as a standalone product.

Yes, that's basically the point. Well, there is no hard requirement not to put it on a website, but there is, e.g., the requirement to store it for the next 15 years. Having everything together (e.g. in one package) makes it easier, in my opinion.

I don't recommend changing anything in the cache. All Conan commands are prepared to handle specific files. What you can do is create a generic repository and upload those artifacts there. It could work now, but you can't be sure about the next Conan version: it could break on the next release, and we probably won't fix it because it's a workaround on the side.

You're right, that only makes sense if you support this use case.

Since your case is very specific, I don't think adding new options is a good idea.

I'm not sure if it is so specific. I guess all automotive companies (and other regulated industries) have similar requirements.

I would talk about _assets_: having a conan_assets.tgz file that doesn't affect the package_id and is built from an export_assets member in the conanfile.py... This is just a suggestion; I'm not sure about it, it scares me a little bit (new file, new commands, new integrity checks, people mixing the content of both zipped files, ...)

Perfect! "Assets" is the name that I was searching for ;)
You're right, the assets shall not change the package_id.

I can understand your doubts about the new command.
@ all: Does anybody else need such a feature?

I would explore the hooks approach along with the Artifactory JFrog CLI. This is a more flexible one, as you can have different hooks activated for developers and the QA team; each team will upload/download the files it is interested in.

I like the idea of the hooks approach that both of you suggested. Thanks! But I'm not sure I understand it completely. Is this right?

  • I create a new generic Artifactory repository for my assets
  • In that repo, I need the same folder structure as the Conan repo, i.e. owner/component/version/channel/package/hash-sum/ (I need the hash sum in order to be able to map a test report to the specific combination of settings/options that was used during the test)
  • Does this have any interference with upcoming Conan changes regarding revisions?
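The mirrored layout from the first two bullets could be captured in a tiny helper; asset_path and the folder names are just this thread's proposed convention, not anything Conan defines:

```python
def asset_path(owner, component, version, channel, package_id):
    """Mirror owner/component/version/channel/package/hash-sum/ in the
    generic assets repo (proposed layout from this thread, not Conan API)."""
    return "/".join([owner, component, version, channel, "package", package_id])

# asset_path("bb", "aa", "0.1.0", "cc", "123abc")
# -> "bb/aa/0.1.0/cc/package/123abc"
```

Keeping the package_id as the last segment is what ties each report to one specific settings/options variant.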

For the creators of a package:

  • I have a pre_upload_package hook in place (because I only want the assets on the server when somebody is calling conan upload -p/--all; I don't want the assets when somebody is just calling conan create)
  • In this hook, for each package that shall be uploaded, I create the folder structure above in Artifactory and upload the assets. Afterwards, Conan will upload my Conan package.

    • What if uploading the assets succeeds but uploading the package fails? I would somehow have to remove the assets again. Is there an "error hook" or something similar that I could use?

    • When using separate Conan repositories per channel (e.g. to protect certain channels against overwriting/deletion), I would also have to create separate generic Artifactory repositories for the assets in order to have the same protection mechanism. That's not optimal. Or do you have another idea?
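The creator side could look roughly like this, assuming the Conan 1.x pre_upload_package hook signature and the JFrog CLI's `jfrog rt u` upload command; the `assets-generic` repo name and the local `assets/` folder are placeholders:

```python
# hooks/assets_producer.py -- sketch; "assets-generic" and "assets/*" are
# placeholders for your generic repo and local asset folder.
import subprocess

def asset_target(reference, package_id):
    # Mirror the Conan repo layout, so the assets map to exactly one
    # settings/options variant of the package.
    return "assets-generic/{}/{}/{}/{}/package/{}/".format(
        reference.user, reference.name, reference.version,
        reference.channel, package_id)

def pre_upload_package(output, conanfile_path, reference, package_id,
                       remote, **kwargs):
    """Conan 1.x hook: runs before each binary package upload."""
    # check_call raises on failure, which also aborts the package upload
    subprocess.check_call(
        ["jfrog", "rt", "u", "assets/*", asset_target(reference, package_id)])
```

Note this uploads the assets first, so the "assets succeed but package upload fails" question from above still needs a cleanup strategy.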

For the consumers of a package:

  • A consumer calls the conan download aa/0.1.0@bb/cc -p 123abc command. I have a post_download_package hook in place that downloads the assets for the given package from the Artifactory assets repo
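And the consumer side, again assuming the Conan 1.x post_download_package hook signature and the JFrog CLI's `jfrog rt dl` download command; the repo name and the local `quality-assets/` destination are placeholders:

```python
# hooks/assets_consumer.py -- sketch; activated only on QA machines.
import subprocess

def asset_source(reference, package_id):
    # Same mirrored layout the producer hook uploaded to.
    return "assets-generic/{}/{}/{}/{}/package/{}/*".format(
        reference.user, reference.name, reference.version,
        reference.channel, package_id)

def post_download_package(output, conanfile_path, reference, package_id,
                          remote, **kwargs):
    """Conan 1.x hook: runs after a binary package is downloaded."""
    subprocess.check_call(
        ["jfrog", "rt", "dl", asset_source(reference, package_id),
         "quality-assets/"])
    output.info("Assets for %s:%s placed in quality-assets/"
                % (str(reference), package_id))
```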

If I understood it correctly, this would be feasible, but I'd still prefer having it directly in the Conan package as conan_assets.tgz, in case this is a use case that other people also have 😄

But I also have to ship Quality Artefacts like Test Reports or Doxygen HTML Pages together with my release.

Isn't that what build info does? It links artifacts with the CI job / build that produced them, so you can trace them back.
(As described here: https://docs.conan.io/en/latest/howtos/generic_ci_artifactory.html).

It is also true that most CI systems do some kind of processing of artifacts (for example, parsing jUnit reports or highlighting uncovered code in the pull / merge request), so having them in a Conan package will just make them invisible.

Besides, the consumers of a Conan package and of a quality artifact are different (except when you need to finally aggregate everything into a "deliverable"), and there is no 1:1 matching between quality artifacts and Conan packages (for example, static analysis shouldn't run for every binary).
