Cypress: Xvfb is already in use when running in parallel with a Jenkins declarative pipeline

Created on 31 Oct 2018 · 3 comments · Source: cypress-io/cypress

Current behavior:

When running two test groups in parallel in a Jenkins pipeline, one of the groups fails while the other works fine. The agent uses a Docker image based on cypress/base:8.

I get the following logs when running with DEBUG='cypress:xvfb,xvfb':

$ cypress run --spec 'cypress/integration/e2e/mobile_spec.js' --group e2e-mobile --parallel --record

It looks like this is your first time using Cypress: 3.1.0

[17:08:21]  Verifying Cypress can run /root/.cache/Cypress/3.1.0/Cypress [started]
2018-10-31T17:08:21.985Z xvfb lock filename /tmp/.X99-lock
2018-10-31T17:08:21.985Z xvfb lock filename /tmp/.X99-lock
2018-10-31T17:08:21.986Z xvfb setting DISPLAY :99
2018-10-31T17:08:21.988Z xvfb all Xvfb arguments [ ':99' ]
2018-10-31T17:08:21.998Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.010Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.021Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.032Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.042Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.053Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.063Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.074Z xvfb checking if started by looking for the lock file /tmp/.X99-lock
2018-10-31T17:08:22.075Z xvfb lock file /tmp/.X99-lock found after 70 ms
2018-10-31T17:08:25.286Z xvfb restoring process.env.DISPLAY variable
2018-10-31T17:08:25.287Z xvfb lock filename /tmp/.X99-lock
2018-10-31T17:08:25.287Z xvfb lock file /tmp/.X99-lock
2018-10-31T17:08:25.298Z xvfb lock file /tmp/.X99-lock not found when stopping
[17:08:25]  Verifying Cypress can run /root/.cache/Cypress/3.1.0/Cypress [completed]

Opening Cypress...
2018-10-31T17:08:25.304Z xvfb lock filename /tmp/.X99-lock
2018-10-31T17:08:25.304Z xvfb setting DISPLAY :99
2018-10-31T17:08:25.304Z xvfb spawn process error
2018-10-31T17:08:25.305Z xvfb Error: Display :99 is already in use and the "reuse" option is false.
    at Object._spawnProcess (/project/node_modules/@cypress/xvfb/index.js:161:15)
    at /project/node_modules/@cypress/xvfb/index.js:36:16
    at FSReqWrap.cb [as oncomplete] (fs.js:312:19)

Your system is missing the dependency: XVFB

Install XVFB and run Cypress again.

Read our documentation on dependencies for more information:

https://on.cypress.io/required-dependencies

If you are using Docker, we provide containers with all required dependencies installed.

----------

Error: Display :99 is already in use and the "reuse" option is false.

----------

Platform: linux (Debian - 8.10)

Cypress Version: 3.1.0

error Command failed with exit code 1.

The issue looks similar to this one.
Our Jenkins is set up with two hosts available for testing, but it seems that sometimes both parallel groups start on the same host. Is there a way to prevent Cypress from sharing the same Xvfb instance between two containers?

Desired behavior:

  • Provide a better error message, since Xvfb is actually installed.
  • Allow the parallel tests to run on one host.
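Regarding keeping two parallel runs from fighting over the same Xvfb when they land on the same host: one option (a minimal sketch, not from the original report, assuming the xvfb-run utility from the Debian xvfb package is available in the image and that Cypress 3.x skips spawning its own Xvfb whenever DISPLAY is already set) is to wrap each cypress run call in xvfb-run -a, which allocates a free display number per process instead of every process racing for :99:

stage('E2E') {
    steps {
        parallel(
            // the mobile spec and group name come from the report above;
            // the desktop ones are hypothetical, shown only for illustration
            'e2e-mobile': {
                // -a (--auto-servernum) picks the first free display and sets DISPLAY
                // for the wrapped command, so this cypress run never touches :99 itself
                sh "xvfb-run -a cypress run --spec 'cypress/integration/e2e/mobile_spec.js' --group e2e-mobile --parallel --record"
            },
            'e2e-desktop': {
                sh "xvfb-run -a cypress run --spec 'cypress/integration/e2e/desktop_spec.js' --group e2e-desktop --parallel --record"
            }
        )
    }
}

With per-process displays there is no shared /tmp/.X99-lock to collide on, whether the groups end up in one container or two.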


All 3 comments

Just hit this issue too.

It doesn't seem to be consistent, though; when re-running the build, it passed.

[2019-08-07T14:10:25.315Z] Your system is missing the dependency: Xvfb
[2019-08-07T14:10:25.316Z] Install Xvfb and run Cypress again.
[2019-08-07T14:10:25.316Z] Read our documentation on dependencies for more information:
[2019-08-07T14:10:25.316Z] https://on.cypress.io/required-dependencies
[2019-08-07T14:10:25.316Z] If you are using Docker, we provide containers with all required dependencies installed.
[2019-08-07T14:10:25.316Z] ----------
[2019-08-07T14:10:25.316Z] Error: Display :99 is already in use and the "reuse" option is false.

I'm using the Docker image quay.io/clearscore/jenkins-node-browsers:cypress_v3_4_0.

Our Jenkinsfile (for this stage) looks like:

stage('Quality') {
    agent {
        docker {
            reuseNode true
            args '--ipc=host'
            image 'quay.io/clearscore/jenkins-node-browsers:cypress_v3_4_0'
        }
    }
    steps {
        parallel(
            'test: functional-1': {
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },
            'test: functional-2': {
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },
            'test: functional-3': {
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },
            'test: functional-4': {
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            }
        )
    }
}

(I have edited the Jenkinsfile a bit for simplicity; the [*] placeholders stand in for internal environment values.)
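The inconsistency fits a race: when several parallel branches share one container and none of them has DISPLAY set, each cypress run tries to spawn its own Xvfb on the default display :99, and whichever process loses the race fails with "Display :99 is already in use". A hypothetical diagnostic tweak to one of the branches above (not part of the original pipeline) makes the race visible by logging the X lock files and the Xvfb debug output:

            'test: functional-2': {
                // an existing /tmp/.X99-lock here means another branch already claimed display :99
                sh 'ls -l /tmp/.X*-lock || true'
                sh 'DEBUG=cypress:xvfb,xvfb CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },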

A workaround for parallel stages in Jenkins is to add sleep 5 to N-1 of the parallel calls, so that the first pod can take ownership of the Xvfb setup and the others simply connect to it once it is running:

stage('Quality') {
    agent {
        docker {
            reuseNode true
            args '--ipc=host'
            image 'quay.io/clearscore/jenkins-node-browsers:cypress_v3_4_0'
        }
    }
    steps {
        parallel(
            'test: functional-1': {
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },
            'test: functional-2': {
                sh 'sleep 5'
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },
            'test: functional-3': {
                sh 'sleep 5'
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },
            'test: functional-4': {
                sh 'sleep 5'
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run  --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            }
        )
    }
}
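A variation on the same idea (a sketch, not from the thread, assuming Cypress 3.x reuses an existing display whenever DISPLAY is set and that a backgrounded Xvfb survives for the duration of the stage) is to start a single Xvfb before the parallel step and point every branch at it, which removes the race instead of merely staggering it:

stage('Quality') {
    agent {
        docker {
            reuseNode true
            args '--ipc=host'
            image 'quay.io/clearscore/jenkins-node-browsers:cypress_v3_4_0'
        }
    }
    environment {
        // every cypress run connects to this display instead of spawning its own Xvfb
        DISPLAY = ':99'
    }
    steps {
        // start one Xvfb for the whole stage; the short sleep gives it time to create /tmp/.X99-lock
        sh 'Xvfb :99 -screen 0 1280x1024x24 >/dev/null 2>&1 & sleep 1'
        parallel(
            'test: functional-1': {
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            },
            'test: functional-2': {
                sh 'CYPRESS_BASE_URL=http://localhost:8080 yarn cypress run --record --key [key] --parallel --ci-build-id $BUILD_TAG --spec [list of specs]'
            }
        )
    }
}

Jenkins' normal process cleanup terminates the backgrounded Xvfb when the build finishes.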

It looks like @daviddyball's solution is a good workaround in this situation.

We've updated our docs to include better instructions on working with Xvfb, which we hope will help with this kind of setup: https://on.cypress.io/continuous-integration#Xvfb

I'll be closing this issue since this is more about the setup required on the machine than an actual bug in Cypress.
