Short:
When creating Conan packages for your own components, are there best practices for handling (= storing, distributing, tracing) quality artefacts, like test reports or compiled Doxygen pages?
More details:
I want to release my C components via Conan packages. I see how I can achieve traceability between a Conan package and the sources in git.
But I also have to ship Quality Artefacts like Test Reports or Doxygen HTML Pages together with my release.
So my initial idea was:
Wouldn't it be cool to add these artefacts to the Conan package, similar to the pre-compiled binaries? But just using the package method does not 'feel' right (mixing up compiler artifacts with quality artifacts).
What do you think about this? Or do you even have better ideas or approaches?
I could not find information about this topic in the docs of Conan 1.12
Hi @mjbayer!
I had the same situation some time ago. But in my case I had GitLab, where I could post all results as CI artifacts.
I don't think it's a good idea to publish your artifacts with the packages, since they are just quality results and your users are more interested in libraries and headers. But you can use Hooks to upload them after packaging.
If you have a website where you need to upload the artifacts, you could use the post_upload event and add some lines to upload your results.
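To make that concrete: a hook reacting to the upload event might look roughly like the sketch below. This assumes the Conan 1.x hook interface (a plain Python file with functions named after the events); the report server URL, the `reports/` folder and the `curl` invocation are hypothetical placeholders, not anything Conan provides:

```python
# Sketch of a Conan 1.x hook file (e.g. ~/.conan/hooks/upload_reports.py),
# assuming the documented hook interface where `reference` carries
# name/version/user/channel. REPORT_SERVER and the layout are hypothetical.
import os
import subprocess

REPORT_SERVER = "https://reports.example.com"  # hypothetical endpoint


def report_destination(base_url, user, name, version, channel):
    """Build the URL under which the quality artifacts are published,
    mirroring the owner/component/version/channel layout."""
    return "{}/{}/{}/{}/{}".format(base_url, user, name, version, channel)


def post_upload(output, conanfile_path, reference, remote, **kwargs):
    dest = report_destination(REPORT_SERVER, reference.user, reference.name,
                              reference.version, reference.channel)
    report_dir = os.path.join(os.path.dirname(conanfile_path), "reports")
    if os.path.isdir(report_dir):
        # Push the reports with any HTTP client, e.g. curl (placeholder call)
        subprocess.check_call(
            ["curl", "-T", os.path.join(report_dir, "results.xml"), dest])
        output.info("Quality reports published to %s" % dest)
```

Dropped into `~/.conan/hooks/` and activated in `conan.conf`, the function would run once after each `conan upload`.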
Regards!
Hi @uilianries
Thanks for the fast feedback! 👍
In my case the packages would be distributed just within one company (via our artifactory servers). For this we have some (automotive) requirements to fulfill.
You're right, many users (e.g. developers, integrators) might not be interested in these artefacts, while some other users (e.g. QA) insist on having them. So I thought having them directly in the pre-compiled "packages" is not a good idea, because then the developers get the test reports and the QA guy gets the library files :D
With Hooks we would still have this issue, right?
On the other hand, having them inside the package _context_ has a big advantage: tests often run only for specific combinations of Conan settings/options ("variants") - we should make it transparent for which variant the tests ran.
Other reasons for not putting it into git:
One idea could be:
For the component recipes, add optional package_docs and package_reports methods. These artefacts go inside the package folder in the Conan cache / Artifactory as conan_docs.tgz and conan_reports.tgz:

owner/component/version/channel/package/hash-sum1/
    conan_package.tgz
    conaninfo.txt
    conanmanifest.txt
    conan_docs.tgz <- optional
    conan_reports.tgz <- optional
What do you think?
In my case the packages would be distributed just within one company (via our artifactory servers). For this we have some (automotive) requirements to fulfill.
So you are not able to expose the results on a website, are you? You really need to distribute them as a standalone product?
For the component recipes, add an optional package_docs and package_reports method. These artefacts go inside the package folder in the conan cache / artifactory as conan_docs.tgz and conan_reports.tgz
I don't recommend changing anything in the cache. All Conan commands are prepared to handle specific files. What you can do is create a generic repo and upload those artifacts there. It could work now, but you can't be sure about the next Conan version. It could break on the next release, and we probably won't fix it, because it's a workaround on the side.
If it's mandatory to share the reports, you can use the package method in the recipe to include all the files, or even add a hook, keeping the recipe separate. Also, any QA engineer or developer will be able to add a hook to download all those artifacts when installing a package.
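For illustration, the package()-method route could look like this sketch of a Conan 1.x recipe. The source paths (build/test-reports, build/doxygen/html) are a hypothetical project layout, and the import guard is only there so the sketch can be read and executed without Conan installed:

```python
# Sketch of including quality artifacts via package(), assuming the
# Conan 1.x recipe API. Source paths below are hypothetical.
try:
    from conans import ConanFile
except ImportError:  # allow running this sketch where Conan is not installed
    ConanFile = object


class MyComponentConan(ConanFile):
    name = "mycomponent"
    version = "0.1.0"
    settings = "os", "compiler", "build_type", "arch"

    def package(self):
        # The usual binary artifacts...
        self.copy("*.h", dst="include", src="src")
        self.copy("*.a", dst="lib", keep_path=False)
        # ...plus the quality artifacts, side by side in the same package,
        # which is exactly the mixing that felt wrong in the question above
        self.copy("*.xml", dst="reports", src="build/test-reports")
        self.copy("*.html", dst="docs", src="build/doxygen/html")
```

The downside discussed in this thread applies: every consumer of the binary package also receives the reports and docs.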
Add additional options --docs and --reports to the conan download -p command, to retrieve the docs and reports as well (maybe also add a --skip-package option, for the people who are not interested in the pre-compiled binaries)
Since your case is very specific, I don't think adding new options is a good idea.
@memsharded any ideas how to handle report files (test result, docs) with package, without changing the package id?
I would talk about _assets_, about having a conan_assets.tgz file that doesn't affect the package_id and it is built from an export_assets member in the conanfile.py... this is just a suggestion, not sure about it, it scares me a little bit (new file, new commands, new integrity checks, people mixing the content of both zipped files,...)
I would explore the _hooks_ approach along with the Artifactory JFrog CLI, this is a more flexible one as you can have different hooks activated for developers and QA team, each team will upload/download the files it is interested in:
post_package (maybe pre_upload) hook: generate quality artifacts, test reports, doxygen,... if some of these processes fail you can block the upload of the package itself.
post_upload hook: now you can gather all the files and artifacts, compress them and upload them to Artifactory using the JFrog CLI.
post_download hook: your QA team will have this hook activated; it will retrieve from Artifactory the file you uploaded in the previous hook, and make it available for them to consume.
I think that implementation through hooks could be more flexible and satisfy what you are trying to achieve, what do you think about it?
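The hooks above could be sketched in one hook file roughly as follows, assuming Conan 1.x hook signatures and an installed JFrog CLI (`jfrog rt u`). The repo name "quality-assets", the doxygen call and the reports/ folder are hypothetical:

```python
# Sketch of the producer-side hook workflow, assuming Conan 1.x hook
# signatures. Repo name, doxygen step and folder layout are placeholders.
import os
import subprocess
import tarfile


def jfrog_upload_command(archive, repo, reference_str, package_id):
    """Build the JFrog CLI command that pushes an assets archive next to
    the binary package (owner/name/version/channel/package_id layout)."""
    target = "{}/{}/{}/".format(repo, reference_str.replace("@", "/"),
                                package_id)
    return ["jfrog", "rt", "u", archive, target]


def post_package(output, conanfile, conanfile_path, reference, package_id,
                 **kwargs):
    # Generate the quality artifacts; a failure here blocks a broken upload.
    # The doxygen call stands in for your real report generation.
    subprocess.check_call(["doxygen", "Doxyfile"],
                          cwd=os.path.dirname(conanfile_path))


def post_upload_package(output, conanfile_path, reference, package_id,
                        remote, **kwargs):
    # Gather and compress the assets, then push them via the JFrog CLI.
    assets = os.path.join(os.path.dirname(conanfile_path), "reports")
    archive = "conan_assets.tgz"
    with tarfile.open(archive, "w:gz") as tgz:
        tgz.add(assets, arcname="reports")
    subprocess.check_call(jfrog_upload_command(archive, "quality-assets",
                                               str(reference), package_id))
```

Because the QA team activates a matching download hook while developers do not, each group only transfers the files it cares about.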
Thanks again for the very fast feedback, I really like how you react and how the project evolves 😍
So you are not able to expose the results on a website, are you? You really need to distribute them as a standalone product?
Yes, that's basically the point. Well, there is no hard requirement not to put it on a website, but there is e.g. the requirement to store it for the next 15 years. Having everything together (e.g. in one package) makes it easier, in my opinion.
I don't recommend changing anything in the cache. All Conan commands are prepared to handle specific files. What you can do is create a generic repo and upload those artifacts there. It could work now, but you can't be sure about the next Conan version. It could break on the next release, and we probably won't fix it, because it's a workaround on the side.
You're right, that only makes sense if you support this use case.
Since your case is very specific, I don't think adding new options is a good idea.
I'm not sure if it is so specific. I guess all automotive companies (and other regulated industries) have similar requirements.
I would talk about _assets_, about having a conan_assets.tgz file that doesn't affect the package_id and is built from an export_assets member in the conanfile.py... this is just a suggestion, not sure about it, it scares me a little bit (new file, new commands, new integrity checks, people mixing the content of both zipped files,...)
Perfect! "Assets" is the name that I was searching for ;)
You're right, the assets shall not change the package_id.
I can understand your doubts about the new command.
@ all: Does anybody else need such a feature?
I would explore the hooks approach along with the Artifactory JFrog CLI, this is a more flexible one as you can have different hooks activated for developers and QA team, each team will upload/download the files it is interested in
I like the idea of the hooks approach, that both of you suggested. Thanks! But I'm not sure if I understand it completely. Is this right?
On Artifactory: an assets repo with the layout owner/component/version/channel/package/hash-sum/ (I need the hash sum, in order to be able to map a test report to the specific combination of settings/options that were used during the test).
For the creators of a package: a pre_upload_package hook is in place (because I only want the assets on the server when somebody is calling conan upload -p / --all; I don't want the assets when somebody is just calling conan create).
For the consumers of a package: a conan download aa/0.1.0@bb/cc -p 123abc command. I have a post_download_package hook in place that downloads the assets for the given package from the Artifactory assets repo.
If I understood it correctly, then this would be feasible, but I'd still prefer having it directly in the Conan package as conan_assets.tgz, in case this is a use case that other people also have 😄
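The consumer side of that workflow could be sketched like this, again assuming Conan 1.x hook signatures. The Artifactory base URL and the path layout are hypothetical and must mirror whatever the upload hook used:

```python
# Sketch of the consumer-side hook, assuming Conan 1.x hook signatures.
# ASSETS_BASE and the repo layout are hypothetical placeholders.
import os
import urllib.request

ASSETS_BASE = "https://artifactory.example.com/quality-assets"  # hypothetical


def assets_url(base, reference_str, package_id):
    """Map a reference and package_id to the assets archive location,
    so a report can be traced back to its exact settings/options hash."""
    return "{}/{}/{}/conan_assets.tgz".format(
        base, reference_str.replace("@", "/"), package_id)


def post_download_package(output, conanfile_path, reference, package_id,
                          remote, **kwargs):
    url = assets_url(ASSETS_BASE, str(reference), package_id)
    target = os.path.join(os.path.dirname(conanfile_path), "conan_assets.tgz")
    try:
        urllib.request.urlretrieve(url, target)
        output.info("Assets downloaded to %s" % target)
    except OSError:
        output.warn("No assets found for this package: %s" % url)
```

Only the QA team would activate this hook, so developers installing the same package never pay for the extra download.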
But I also have to ship Quality Artefacts like Test Reports or Doxygen HTML Pages together with my release.
Isn't that what build info does? It links artifacts with the CI job / build that produced them, so you can trace them back.
(As described here: https://docs.conan.io/en/latest/howtos/generic_ci_artifactory.html).
It is also true that most CI systems do some kind of processing of artifacts (for example, parsing jUnit reports or highlighting uncovered code in the pull / merge request), so having them in a Conan package will just make them invisible.
Besides, the consumers of a Conan package and of a Quality Artifact are different (except when you need to finally aggregate everything into a "deliverable"), and there is no 1:1 matching between Quality Artifacts and Conan packages (for example, a static analysis shouldn't run for every binary).