Conan: CONAN_USER_HOME issues with the Artifactory Jenkins plugin

Created on 28 Mar 2018 · 12 comments · Source: conan-io/conan

Conan version: 1.1.1
Jenkins version: 2.89.4
Artifactory plugin version: 2.15.0

In each scenario that follows, I describe what changes are made to this pipeline:

pipeline {
  agent none
  stages {
    stage('Build') {
      parallel {
        stage('Build Linux') {
          agent {
            docker {
              image '<internal_image>'
              label 'linux'
            }
          }
          environment {
            CONAN_USER_HOME = "${env.WORKSPACE}/conan_home".toString()
          }
          steps {
            script {
                def server = Artifactory.server env.ARTIFACTORY_ID
                def client = Artifactory.newConanClient userHome: "${env.WORKSPACE}/conan_home".toString()
                def server_name = client.remote.add server: server, repo: env.ARTIFACTORY_REPO

                client.run(command: "config install ${CONAN_SETTINGS_URL}".toString())
                client.run(command: "create --profile Linux-Release . team/unstable")
            }
          }
        }
      }
    }
  }
  environment {
    CONAN_SETTINGS_URL = '<git_repo_URL>'
    ARTIFACTORY_ID     = '<some_ID>'
    ARTIFACTORY_REPO   = '<conan_repo>'
  }
}

Scenario 1:

  • Set CONAN_USER_HOME to ${env.WORKSPACE}/conan_home
  • Don't set userHome in Artifactory.newConanClient (sketched below)
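
In other words, the pipeline keeps the environment block from above but creates the client without a userHome argument, roughly:

environment {
  CONAN_USER_HOME = "${env.WORKSPACE}/conan_home".toString()
}
// ...
script {
  def server = Artifactory.server env.ARTIFACTORY_ID
  def client = Artifactory.newConanClient()   // no userHome argument
  // remote.add and client.run calls as in the pipeline above
}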

Fails at: def client = Artifactory.newConanClient

Stacktrace:

java.nio.file.NoSuchFileException: <Jenkins_job_Workspace>/conan_home/conan_log.log
    at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
    at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
    at java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:434)
    at java.nio.file.Files.newOutputStream(Files.java:216)
    at hudson.FilePath$22.invoke(FilePath.java:1475)
    at hudson.FilePath$22.invoke(FilePath.java:1470)
    at hudson.FilePath.act(FilePath.java:997)
    at hudson.FilePath.act(FilePath.java:975)
    at hudson.FilePath.touch(FilePath.java:1470)
    at org.jfrog.hudson.pipeline.steps.conan.InitConanClientStep$Execution.getConanClient(InitConanClientStep.java:86)
    at org.jfrog.hudson.pipeline.steps.conan.InitConanClientStep$Execution.run(InitConanClientStep.java:61)
    at org.jfrog.hudson.pipeline.steps.conan.InitConanClientStep$Execution.run(InitConanClientStep.java:38)
    at org.jenkinsci.plugins.workflow.steps.AbstractSynchronousStepExecution.start(AbstractSynchronousStepExecution.java:42)
    at org.jenkinsci.plugins.workflow.cps.DSL.invokeStep(DSL.java:229)
    at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:153)
    at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:108)
    at org.jfrog.hudson.pipeline.dsl.ArtifactoryPipelineGlobal.newConanClient(ArtifactoryPipelineGlobal.java:137)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
    at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
    at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1213)
    at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
    at org.codehaus.groovy.runtime.callsite.PojoMetaClassSite.call(PojoMetaClassSite.java:47)
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
    at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:157)
    at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:23)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:133)
    at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:155)
    at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:159)
    at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
    at WorkflowScript.run(WorkflowScript:19)
    at ___cps.transform___(Native Method)
    at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:57)
    at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:109)
    at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixName(FunctionCallBlock.java:77)
    at sun.reflect.GeneratedMethodAccessor509.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
    at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
    at com.cloudbees.groovy.cps.Next.step(Next.java:83)
    at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
    at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
    at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:122)
    at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:261)
    at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
    at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:19)
    at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:35)
    at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:32)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.runInSandbox(GroovySandbox.java:108)
    at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:32)
    at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:174)
    at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:331)
    at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$200(CpsThreadGroup.java:82)
    at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:243)
    at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:231)
    at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:64)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:112)
    at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Scenario 2:

  • Don't set CONAN_USER_HOME
  • Don't set userHome in Artifactory.newConanClient

Fails at: client.run(command: "config install ${CONAN_SETTINGS_URL}".toString())

Stacktrace:

Also:   hudson.remoting.Channel$CallSiteStackTrace: Remote call to mt-jenkins-slh1
        at hudson.remoting.Channel.attachCallSiteStackTrace(Channel.java:1693)
        at hudson.remoting.UserResponse.retrieve(UserRequest.java:310)
        at hudson.remoting.Channel.call(Channel.java:908)
        at hudson.FilePath.act(FilePath.java:986)
        at hudson.FilePath.act(FilePath.java:975)
        at hudson.FilePath.touch(FilePath.java:1470)
        at org.jfrog.hudson.pipeline.steps.conan.RunCommandStep$Execution.persistBuildProperties(RunCommandStep.java:121)
        at org.jfrog.hudson.pipeline.steps.conan.RunCommandStep$Execution.run(RunCommandStep.java:84)
        at org.jfrog.hudson.pipeline.steps.conan.RunCommandStep$Execution.run(RunCommandStep.java:59)
        at org.jenkinsci.plugins.workflow.steps.AbstractSynchronousNonBlockingStepExecution$1$1.call(AbstractSynchronousNonBlockingStepExecution.java:47)
        at hudson.security.ACL.impersonate(ACL.java:260)
        at org.jenkinsci.plugins.workflow.steps.AbstractSynchronousNonBlockingStepExecution$1.run(AbstractSynchronousNonBlockingStepExecution.java:44)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.nio.file.NoSuchFileException: /tmp/conan5616832223008880803/.conan/artifacts.properties
    at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
    at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
    at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
    at java.nio.file.spi.FileSystemProvider.newOutputStream(FileSystemProvider.java:434)
    at java.nio.file.Files.newOutputStream(Files.java:216)
    at hudson.FilePath$22.invoke(FilePath.java:1475)
    at hudson.FilePath$22.invoke(FilePath.java:1470)
    at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2760)
    at hudson.remoting.UserRequest.perform(UserRequest.java:207)
    at hudson.remoting.UserRequest.perform(UserRequest.java:53)
    at hudson.remoting.Request$2.run(Request.java:358)
    at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
Caused: java.io.IOException: remote file operation failed: /tmp/conan5616832223008880803/.conan/artifacts.properties at hudson.remoting.Channel@66bf50d8:<Jenkins slave Hostname>
    at hudson.FilePath.act(FilePath.java:993)
    at hudson.FilePath.act(FilePath.java:975)
    at hudson.FilePath.touch(FilePath.java:1470)
    at org.jfrog.hudson.pipeline.steps.conan.RunCommandStep$Execution.persistBuildProperties(RunCommandStep.java:121)
    at org.jfrog.hudson.pipeline.steps.conan.RunCommandStep$Execution.run(RunCommandStep.java:84)
    at org.jfrog.hudson.pipeline.steps.conan.RunCommandStep$Execution.run(RunCommandStep.java:59)
    at org.jenkinsci.plugins.workflow.steps.AbstractSynchronousNonBlockingStepExecution$1$1.call(AbstractSynchronousNonBlockingStepExecution.java:47)
    at hudson.security.ACL.impersonate(ACL.java:260)
    at org.jenkinsci.plugins.workflow.steps.AbstractSynchronousNonBlockingStepExecution$1.run(AbstractSynchronousNonBlockingStepExecution.java:44)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Scenario 3:

  • Don't set CONAN_USER_HOME
  • Set userHome in Artifactory.newConanClient to ${env.WORKSPACE}/conan_home

This seems to work. However, there is this bug report for this scenario: https://github.com/lasote/skynet_example/issues/2

I've also filed a bug report for this issue: https://issues.jenkins-ci.org/browse/JENKINS-50218



All 12 comments

I have the same issue. I'm using the workspace directory directly (without your 'conan_home').

My current workaround:

//////////////////////////////////////////////////////////////////////////////
// hack because of a bug in artifactory plugin when using custom conan user home
dir('.conan') {
    echo "Created .conan directory"
    touch file: "artifacts.properties"
}
touch file: "conan_log.log"
//////////////////////////////////////////////////////////////////////////////

This creates the missing files and directories (something the plugin should do itself), and the plugin then works.

But after that you will face the next problem when using the conan config install command.
When using def server_name = client.remote.add server: server, repo: env.ARTIFACTORY_REPO you get a generated UUID as the remote name, and the conan remote add ... and conan user ... commands are invoked with that UUID and the remote URL. That is a very nice feature, because you don't have to care about credentials and user handling. But if there is a remotes.txt file in your settings.zip that defines your private remotes, Conan will report an error saying that a remote with the same URL already exists and can't be added.
I'm not sure whether one can split Conan configs into different packages (e.g. one zip for profiles, one for CI settings and one for developer settings) and install them consecutively depending on the current need.

So you have to install your config first, remove your remotes with client.run, and only then call client.remote.add.

By the way, if you plan to publish the build info later, you have to reset the trace file path with conan config set "log.trace_file=<YOUR_PATH>" AFTER conan config install ..., because that setting gets removed by the config install.
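
Putting those steps together, a rough sketch of the sequence (the remote name artifactory-remote and the trace-file path are placeholders for whatever your remotes.txt and setup define):

def server = Artifactory.server env.ARTIFACTORY_ID
def client = Artifactory.newConanClient userHome: "${env.WORKSPACE}/conan_home".toString()

// 1. Install the shared configuration (profiles, settings, remotes.txt, ...)
client.run(command: "config install ${env.CONAN_SETTINGS_URL}".toString())

// 2. Remove the remote defined by remotes.txt so the plugin can re-add it
client.run(command: "remote remove artifactory-remote")

// 3. Let the plugin add the remote again and handle the credentials
def serverName = client.remote.add server: server, repo: env.ARTIFACTORY_REPO

// 4. Restore the trace file path wiped out by the config install,
//    so the build info can still be published later
client.run(command: "config set log.trace_file=${env.WORKSPACE}/conan_home/conan_log.log".toString())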

Best
Aalmann

Thanks so much for your feedback too, @Aalmann.

You are welcome. ;-)

Set userHome in Artifactory.newConanClient to ${env.WORKSPACE}/conan_home

Is there an attribute to set the CONAN_USER_HOME_SHORT on the newConanClient construction?

Update: It seems like the combination:

    environment {
        CONAN_USE_ALWAYS_SHORT_PATHS = 'True'
        //CONAN_USER_HOME = "${env.WORKSPACE}\\.conan"
        CONAN_USER_HOME_SHORT = "${env.WORKSPACE}\\.co"
        CONAN_NON_INTERACTIVE = 1
    } // environment

   [...]

                    def server = Artifactory.server artifactory_server_id

                    def conanHome = "${env.WORKSPACE}\\.conan".toString()
                    def conanClient = Artifactory.newConanClient userHome: conanHome

works.

Hi, there are a couple of issues here (or more); let's try to answer and organize some of them:


Original issue reported by @ovidiub13: we'll move the conversation to JIRA. If it is an easy fix I'll open a PR to the jenkins-artifactory-plugin 👍


@Aalmann says (https://github.com/conan-io/conan/issues/2690#issuecomment-379701231):

But after that you will face the next problem when using the conan config install command.
When using def server_name = client.remote.add server: server, repo: env.ARTIFACTORY_REPO you get a generated UUID as the remote name, and the conan remote add ... and conan user ... commands are invoked with that UUID and the remote URL. That is a very nice feature, because you don't have to care about credentials and user handling. But if there is a remotes.txt file in your settings.zip that defines your private remotes, Conan will report an error saying that a remote with the same URL already exists and can't be added.
I'm not sure whether one can split Conan configs into different packages (e.g. one zip for profiles, one for CI settings and one for developer settings) and install them consecutively depending on the current need.

It is the same issue (or feature request) as the one reported here: https://github.com/jfrog/jenkins-artifactory-plugin/issues/122. I've already opened a PR to the repo; I think it is something easy and nice to have. I'm waiting for feedback from the maintainers.

With this change you can install your configuration using conan config install and, in order to use the Artifactory credentials from the plugin, just call the new function, which executes only the conan user ... command.


@michaelmaguire, if you feel that an attribute to set CONAN_USER_HOME_SHORT via the newConanClient constructor is useful on its own, tell me. If it is just a workaround to bypass the original issue reported in this thread, then I think it is enough to fix that before adding more arguments to what we already have.


Thanks, everyone, for the feedback about the plugin, Jenkins and Artifactory; we will do our best to move this plugin forward.

@jgsogo thanks for asking. I don't think Windows path length issues are going away any time soon, so I think it would still be useful to add an attribute to set CONAN_USER_HOME_SHORT via the newConanClient constructor.

I'm moving the topic about CONAN_USER_HOME_SHORT here: https://github.com/conan-io/conan/issues/5541. Thanks for the quick feedback.

Hi @jgsogo,
first of all, thanks for moving this issue forward. Your fix in jfrog/jenkins-artifactory-plugin#122 looks good so far, and having the "normal" remote naming instead of UUID naming will be much more readable.
I hope it will be merged and published as soon as possible.

Concerning CONAN_USER_HOME_SHORT: I'm completely with @michaelmaguire.
Currently there are too many things that require setting up CONAN_USER_HOME_SHORT.

Another important point, concerning the .conan directory and artifacts.properties, is already described in https://github.com/conan-io/conan/issues/2690#issuecomment-379701231
It would be nice to get that fixed too.

Best Aalmann

Hi @Aalmann, I don't think I'm understanding the issue with the artifacts.properties and conan_log.log files you mention in your comment. Having a look at the current sources of the plugin and Conan, this is what I see:

conan_log.log:

Given the implementation in this file, I can see that the CONAN_LOG_FILE is touched for every client.run: the function getConanClient is called at the very beginning of the run method and it touches the file:

conanHomeDirectory.child(ConanClient.CONAN_LOG_FILE).touch(Calendar.getInstance().getTimeInMillis());

Anyway, I'm reproducing the issue @ovidiub13 reported in the Jenkins JIRA to see what is going on. I get the same failure, but I need to debug the plugin to know what is happening there.

artifacts.properties:

Conan creates this file, if it doesn't exist, before running any command: this line of code is executed for every command and creates the file when it is missing.

So, from my understanding, this file is always created if it is not already there. Am I missing something (probably)? Or is it an issue from a previous version?

I've submitted a PR for the issue related to conan_log.log: https://github.com/jfrog/jenkins-artifactory-plugin/pull/178 👍 The problem was the directory not being created, not the file itself.

I've also created this PR (https://github.com/jfrog/jenkins-artifactory-plugin/pull/179); it adds force and remoteName arguments to client.remote.add, working the same as conan remote add ..., so a Jenkinsfile reusing the cache in the workspace would work 😉

def server = Artifactory.server artifactory_name
def client = Artifactory.newConanClient(userHome: "${env.WORKSPACE}/conan_home".toString())
def remoteName = client.remote.add server: server, repo: artifactory_repo, force: true

Hi! The new release (3.4.0) of the Artifactory Jenkins plugin contains a couple of PRs that should resolve the issues posted here.

An important warning for those of you sharing the same cache between different concurrent jobs: the Conan cache can store only one revision, so if those jobs require different revisions of the same recipe/package, the best that can happen is a failure; more likely, the behavior is undefined.
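
For example, one way to avoid sharing a cache between concurrent builds (just an illustration, not a plugin feature) is to derive a separate Conan user home per build:

// Illustration only: isolate the Conan cache per build by deriving the
// user home from the Jenkins build number.
def conanHome = "${env.WORKSPACE}/conan_home_${env.BUILD_NUMBER}".toString()
def client = Artifactory.newConanClient userHome: conanHome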

I'm closing this issue; if something is not working as expected with the new 3.4.0 version, please open a new one.

Thank you all for your patience.
