Shields: Fix jenkins cobertura coverage

Created on 17 Nov 2018 · 9 comments · Source: badges/shields

This may be a legitimate failure. If nothing else, it should prompt a change to the error handling.

  12) JenkinsCoverage
       cobertura: valid coverage

    [ GET /c/https/updates.jenkins-ci.org/job/hello-project/job/master.json ]:

      AssertionError: expected { Object (name, value) } to deeply equal { name: 'coverage', value: '64%' }
      + expected - actual

       {
         "name": "coverage"
      -  "value": "invalid response data"
      +  "value": "64%"
       }

      at Object.pathMatch.matchJSON (node_modules/icedfrisby/lib/pathMatch.js:138:38)
      at _expect (node_modules/icedfrisby/lib/icedfrisby.js:583:10)
      at IcedFrisbyNock._invokeExpects (node_modules/icedfrisby/lib/icedfrisby.js:1261:26)
      at start (node_modules/icedfrisby/lib/icedfrisby.js:1244:12)
      at Request.runCallback [as _callback] (node_modules/icedfrisby/lib/icedfrisby.js:1131:16)
      at Request.self.callback (node_modules/request/request.js:185:22)
      at Request.<anonymous> (node_modules/request/request.js:1161:10)
      at IncomingMessage.<anonymous> (node_modules/request/request.js:1083:12)
      at endReadableNT (_stream_readable.js:1064:12)
      at _combinedTickCallback (internal/process/next_tick.js:139:11)
      at process._tickDomainCallback (internal/process/next_tick.js:219:9)

  13) JenkinsCoverage
       cobertura: valid coverage (badge URL without leading /job after Jenkins host)

    [ GET /c/https/updates.jenkins-ci.org/hello-project/job/master.json ]:

      AssertionError: expected { Object (name, value) } to deeply equal { name: 'coverage', value: '64%' }
      + expected - actual

       {
         "name": "coverage"
      -  "value": "invalid response data"
      +  "value": "64%"
       }

      at Object.pathMatch.matchJSON (node_modules/icedfrisby/lib/pathMatch.js:138:38)
      at _expect (node_modules/icedfrisby/lib/icedfrisby.js:583:10)
      at IcedFrisbyNock._invokeExpects (node_modules/icedfrisby/lib/icedfrisby.js:1261:26)
      at start (node_modules/icedfrisby/lib/icedfrisby.js:1244:12)
      at Request.runCallback [as _callback] (node_modules/icedfrisby/lib/icedfrisby.js:1131:16)
      at Request.self.callback (node_modules/request/request.js:185:22)
      at Request.<anonymous> (node_modules/request/request.js:1161:10)
      at IncomingMessage.<anonymous> (node_modules/request/request.js:1083:12)
      at endReadableNT (_stream_readable.js:1064:12)
      at _combinedTickCallback (internal/process/next_tick.js:139:11)
      at process._tickDomainCallback (internal/process/next_tick.js:219:9)

Ref: #1359

Labels: bug, good first issue, keep-service-tests-green, service-badge

All 9 comments

One thing I've found: if I remove the Conditionals coverage object from the elements array in the mocked reply object, these two tests start passing again.

That is, the current mocked response fails:

.reply(200, {
  results: {
    elements: [
      {
        name: 'Conditionals',
        ratio: 95.146,
      },
      {
        name: 'Lines',
        ratio: 63.745,
      },
    ],
  },
})

but the same response object with only the Lines object in elements passes:

.reply(200, {
  results: {
    elements: [
      {
        name: 'Lines',
        ratio: 63.745,
      },
    ],
  },
})

When I hit a live instance of the API with the same query params Shields uses, I get the following JSON response with various objects in the elements array:

{
  "_class": "hudson.plugins.cobertura.targets.CoverageResult",
  "results": {
    "elements": [
      { "name": "Packages", "ratio": 92.47312 },
      { "name": "Files", "ratio": 90.55556 },
      { "name": "Classes", "ratio": 90.94693 },
      { "name": "Methods", "ratio": 77.76264 },
      { "name": "Lines", "ratio": 74.3159 },
      { "name": "Conditionals", "ratio": 65.21124 }
    ]
  }
}

I also see that the joi schema for the cobertura response has name: 'Lines'. I don't have much experience with joi: does that line filter the objects in elements to exclude those with a name other than Lines, or does the schema require the name of _each_ object to be Lines?

Ooh, thanks for looking into this!

I see. Yeah, array.items() wants each item to match the schema.

I think the thing to do is change that name: 'Lines' to name: Joi.string().required(). We could chain on an array.has(schema) to make sure one of the elements has name: 'Lines'.

Then we can filter the results in the handler.
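A rough sketch of that direction (function names are hypothetical, not the actual Shields implementation): loosen the schema so any element name is accepted, require that one element is Lines, then pick it out in the handler. The loosened joi schema would be roughly `Joi.array().items(Joi.object({ name: Joi.string().required(), ratio: Joi.number().required() })).has(Joi.object({ name: 'Lines' }).unknown())`; after validation, the handler filters:

```javascript
// Hypothetical handler-side filtering after the loosened joi validation passes.
// Finds the Lines element and formats its ratio as a percentage.
function extractLineCoverage(elements) {
  const lines = elements.find(e => e.name === 'Lines');
  if (!lines) {
    throw new Error('no Lines element in coverage response');
  }
  return `${Math.round(lines.ratio)}%`;
}

const elements = [
  { name: 'Conditionals', ratio: 95.146 },
  { name: 'Lines', ratio: 63.745 },
];
console.log(extractLineCoverage(elements)); // "64%"
```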

That's what I was thinking as well. I'll happily take this one and implement that change

I didn't want to open a new issue for this (yet) but as an FYI..

The live test I added failed in the most recent daily build, but I'm positive this was simply due to the test taking longer than the mocha timeout period of 5 seconds. I've seen this a couple of times when running the test locally, where it occasionally takes 5 to 7 seconds.

I'll keep an eye on this test. If it keeps failing intermittently, I can try to find a different live Jenkins endpoint that will hopefully respond more quickly, unless there's a possibility of increasing the mocha timeout period for the service tests in the daily build? I think this particular test would be fine as-is with a timeout of 7500 ms instead of the current 5000 ms.

(5) JenkinsCoverage
       cobertura: job found

    [ GET /c/https/builds.apache.org/job/olingo-odata4-cobertura.json ]:
     Error: Request timed out after 5000 ms
      at Timeout.handleRequestTimeout [as _onTimeout] (node_modules/icedfrisby/lib/icedfrisby.js:1098:35)

@calebcartwright you can increase the timeout on a per test basis. Simply add .timeout(10000) as we do in the F-Droid service for example.

Excellent! Semi-random question for you all: do you mind if I tack a change like this onto a different, existing PR I already have open (say #2477), or do you prefer separate PRs for separate changes?

I would recommend opening a new PR, as the currently open one is for an unrelated service. 😉

Gotcha, thanks @PyvesB!

