Dep: panic: canary - shouldn't be possible package

Created on 26 Jan 2017 · 17 comments · Source: golang/dep

checkout github.com/kubernetes/kubernetes @ db2dc78e6398ba673e68e94bd17d28a97dd4a902
run godep restore to make GOPATH == what's in vendor
inside kubernetes rm -rf vendor Godeps
dep init from version 21f357ac6ce7cb25b95bc5b8e4216bb2639084d4

panic: canary - shouldn't be possible package github.com/chai2010/gettext-go/gettext depends on some other package within github.com/chai2010/gettext-go with errors

goroutine 1 [running]:
panic(0x784ce0, 0xc421249dd0)
    /usr/local/go/src/runtime/panic.go:500 +0x1a1
github.com/golang/dep/vendor/github.com/sdboyer/gps.(*solver).selectAtom(0xc4209f8f00, 0xc42043d0a0, 0x1e, 0x0, 0x0, 0x9ba760, 0xc4216cf2f0, 0xc4211bf160, 0x1, 0x1, ...)
    /storage/gopath/src/github.com/golang/dep/vendor/github.com/sdboyer/gps/solver.go:1046 +0x10d5
github.com/golang/dep/vendor/github.com/sdboyer/gps.(*solver).solve(0xc4209f8f00, 0x0, 0x0, 0xc420b26e10)
    /storage/gopath/src/github.com/golang/dep/vendor/github.com/sdboyer/gps/solver.go:389 +0x5aa
github.com/golang/dep/vendor/github.com/sdboyer/gps.(*solver).Solve(0xc4209f8f00, 0x25, 0xc4200100a8, 0x11, 0xc420158ff0)
    /storage/gopath/src/github.com/golang/dep/vendor/github.com/sdboyer/gps/solver.go:317 +0x8e
main.(*initCommand).Run(0x9f41c8, 0xc42000c380, 0x0, 0x0, 0x0, 0x0)
    /storage/gopath/src/github.com/golang/dep/init.go:153 +0x102f
main.main()
    /storage/gopath/src/github.com/golang/dep/main.go:101 +0x68b
bug

Most helpful comment

@ericchiang @cblecker looks like it's time to try dep again!

All 17 comments

btw there was an old issue too from when I tried: https://github.com/golang/dep/issues/110

https://github.com/chai2010/gettext-go/blob/master/gettext/hello.go has 2 interesting things:
import "github.com/chai2010/gettext-go"
and it has:
// +build ignore

So the repo is not really broken, since the hello.go file doesn't actually ever get built, but that file does claim to import its parent directory. Its parent does not, in fact, have any .go files.
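Schematically, that file looks something like the sketch below (a paraphrase based on the description above, not the verbatim source; the gettext alias and the Gettext call are placeholders). Note that the file can't actually build, because the imported parent directory has no Go files, which is exactly the situation described:

// +build ignore

package main

import (
	"fmt"

	gettext "github.com/chai2010/gettext-go" // the parent directory, which has no buildable .go files
)

func main() {
	// placeholder body; the real example program exercises the gettext API
	fmt.Println(gettext.Gettext("Hello, world!"))
}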

So while I'm guessing this could be fixed by some changes in hello.go, the solver presumably shouldn't panic either way...

So while I'm guessing this could be fixed by some changes in hello.go, the solver presumably shouldn't panic either way...

For sure not :) We're trying to hash out a bunch of questions right now related to which imports we include from static analysis. It'd probably be good for me/someone to open an issue for this more specifically on gps.

On balance, though, this is kinda good - I hit that panic once six months ago while experimenting with k8s, but I hadn't yet eliminated nondeterminism in solve order, so I couldn't replicate it. It's rankled me ever since, but without a panic sitting in front of me, other things took priority. This gives me a concrete case that hits it, so I can finally get rid of that sucker :)
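To illustrate the static-analysis question above (this is not dep/gps's actual code, just a rough standalone sketch): a scan that parses every .go file in a directory with go/parser will report imports from build-ignored files too, since go/parser does not evaluate +build constraints the way go/build does.

package main

import (
	"fmt"
	"go/parser"
	"go/token"
	"log"
	"os"
	"path/filepath"
	"strings"
)

func main() {
	// directory to scan, e.g. .../src/github.com/chai2010/gettext-go/gettext
	dir := os.Args[1]
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatal(err)
	}
	fset := token.NewFileSet()
	for _, e := range entries {
		if e.IsDir() || !strings.HasSuffix(e.Name(), ".go") {
			continue
		}
		// parse only the import block; build tags like "// +build ignore" are not honored here
		f, err := parser.ParseFile(fset, filepath.Join(dir, e.Name()), nil, parser.ImportsOnly)
		if err != nil {
			log.Print(err)
			continue
		}
		for _, imp := range f.Imports {
			fmt.Printf("%s imports %s\n", e.Name(), imp.Path.Value)
		}
	}
}

Run against the gettext directory, a scan like this would list hello.go's import of the parent package even though a normal build skips that file entirely.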

@eparis we've made a bunch of improvements here, and I _think_ this one should be fixed now. Could you give it another whirl, and let me know if this particular problem still exists? Thanks!

@ericchiang @cblecker looks like it's time to try dep again!

Well, the following script didn't panic

#!/bin/bash -x

export GOPATH=$( mktemp -d)

# trap "{ rm -rf $GOPATH ; exit 255; }" EXIT

git clone --depth 1 https://github.com/kubernetes/kubernetes.git $GOPATH/src/k8s.io/kubernetes

cd $GOPATH/src/k8s.io/kubernetes

./hack/godep-restore.sh
rm -rf Godeps vendor

dep init

But dep init has just been stuck hanging for about half an hour. Output here: https://gist.github.com/ericchiang/afaa2a5a0dbbcfe4bc9544a7d6eab9b4

Anything I can run to help debug?

@ericchiang try it with dep init -v. That should at least provide some indication as to where it's hanging.

The first run is often quite expensive, as there's a ton of data it's cloning down. Subsequent runs should be faster. 30 minutes seems much too long, though.

@sdboyer it hung for about an hour after seeming to download the results. For some reason rerunning prevented the hanging.

Here are the results from a second run:

$ time dep init -v 2> dep.txt
         select-root: 846.899885ms
         b-list-pkgs:   3.645943ms
  b-deduce-proj-root:    143.729µs
     b-list-versions:     72.594µs
             satisfy:      54.56µs
            new-atom:     41.166µs
               other:      7.871µs
          b-pair-rev:      4.912µs
           b-matches:       2.99µs
     b-source-exists:      2.584µs

  TOTAL: 850.876234ms

solve error: No versions of bitbucket.org/bertimus9/systemstat met constraints:
    1468fd0db20598383c9393cccaa547de6ad99e5e: failed to create repository cache for https://bitbucket.org/bertimus9/systemstat with err:
Unable to get repository: Cloning into '/tmp/tmp.yn03IL1QS4/pkg/dep/sources/https---bitbucket.org-bertimus9-systemstat'...
error: copy-fd: write returned No space left on device
fatal: cannot copy '/usr/share/git-core/templates/description' to '/tmp/tmp.yn03IL1QS4/pkg/dep/sources/https---bitbucket.org-bertimus9-systemstat/.git/description': No space left on device

    master: Could not introduce bitbucket.org/bertimus9/systemstat@master, as it is not allowed by constraint 1468fd0db20598383c9393cccaa547de6ad99e5e from project k8s.io/kubernetes.
    dev: Could not introduce bitbucket.org/bertimus9/systemstat@dev, as it is not allowed by constraint 1468fd0db20598383c9393cccaa547de6ad99e5e from project k8s.io/kubernetes.

real    0m25.968s
user    0m25.766s
sys 0m38.168s

Log file attached.

dep.txt

Sorry, I ran this outside my /tmp dir to avoid the disk space issue.

Result is: (logs here: https://gist.github.com/ericchiang/40820536d099af696622d50a21f99457)

  ✗ solving failed

Solver wall times by segment:
         b-list-pkgs: 9.473040424s
         select-root: 601.381685ms
              b-gmal: 435.467961ms
             satisfy:  35.575819ms
            unselect:  31.706354ms
         select-atom:  30.368119ms
            new-atom:   5.066916ms
           backtrack:   2.900965ms
     b-list-versions:    957.612µs
  b-deduce-proj-root:    512.472µs
               other:    174.433µs
     b-source-exists:     76.279µs
            add-atom:      44.08µs
          b-pair-rev:      4.701µs
           b-matches:       3.57µs

  TOTAL: 10.61728139s

solve error: No versions of github.com/chai2010/gettext-go met constraints:
    c6fed771bfd517099caf0f7a961671fa8ed08723: "github.com/chai2010/gettext-go/gettext" imports "github.com/chai2010/gettext-go", which contains malformed code: no buildable Go source files in /home/eric/work/dep/tmp/pkg/dep/sources/https---github.com-chai2010-gettext-go
    master: Could not introduce github.com/chai2010/gettext-go@master, as it is not allowed by constraint c6fed771bfd517099caf0f7a961671fa8ed08723 from project k8s.io/kubernetes.
Cached github.com/lpabon/godbc
Cached k8s.io/kubernetes
No versions of github.com/chai2010/gettext-go met constraints:
    c6fed771bfd517099caf0f7a961671fa8ed08723: "github.com/chai2010/gettext-go/gettext" imports "github.com/chai2010/gettext-go", which contains malformed code: no buildable Go source files in /home/eric/work/dep/tmp/pkg/dep/sources/https---github.com-chai2010-gettext-go
    master: Could not introduce github.com/chai2010/gettext-go@master, as it is not allowed by constraint c6fed771bfd517099caf0f7a961671fa8ed08723 from project k8s.io/kubernetes.

Seems like the original issue is solved, but the solver still reports unresolvable constraints with the Kubernetes repo. Probably a different issue.

"github.com/chai2010/gettext-go/gettext" imports "github.com/chai2010/gettext-go"

Well that's weird; it clearly doesn't. The only thing I see even vaguely like that is the ignored main. But that's just importing itself, which I'm pretty sure we got handled months ago. I'll investigate!

it hung for about an hour after seeming to download the results. For some reason rerunning prevented the hanging.

Very weird. Seems we need more instrumentation. Do you have the output from the long-hanging run?

Oh, yeah, so - the revision y'all currently have locked in - 1468fd0db20598383c9393cccaa547de6ad99e5e - contains uncompilable code in that ignored main file. We do currently look at a small subset of such files - in particular, ignored mains - which causes it to choke. master then gets rejected because it's trying to honor that commit SHA. The overzealous (?) enforcement of that SHA1 will be remedied by the changes in #277; on the other one...well, maybe we should change our rule about not ignoring ignored things, but what dep's telling you is fundamentally true about that dependency.

Also, you need to actually put it in the real GOPATH-relative basedir (we're not rid of GOPATH yet) - so, $GOPATH/src/k8s.io/kubernetes. Otherwise, it'll mistake all of k8s' internal packages for external ones.

(Once that problem is solved, there's a panic with https://bitbucket.org/ww/goautoneg that I'm working out - my hg implementation was wrong, I didn't realize it was possible to _only_ have the "tip" version).

The issue with goautoneg turned out to be a totally different bug than I thought (fixed by #514); my original model of hg versions was OK.

Where it's sticking for me now is with golang.org/x/exp, because you're relying on golang.org/x/exp/inotify. That's now a dummy package, as the pkg has been moved out to live elsewhere. This would likely be addressed by doing a godep restore, which would put a commit on disk that still has the package there, and dep would then pick it up. Just haven't had a chance to try that yet.

@sdboyer thanks I can see about updating the inotify dependency.

I'm confused why that would impact dep's analysis though, since godep will restore it in the GOPATH at the commit specified by Kubernetes. Mind elaborating a bit?

sure - i was just saying that because i hadn't _yet_ run godep restore to repopulate the GOPATH, the solver wasn't being given any information that it should try a previous revision. so, it just went with the tip of master, which obviously doesn't work, then didn't have any more versions to try, so it failed out.

Once godep restore finished, though, dep init -v made it all the way through to a solution 😄

k8s.io/client-go did have some missing package complaints about what was in k/k (my local k/k may be a week out of date now, so...), and that showed some definite bugginess in the trace output. But it still fundamentally appeared to work. I didn't try compiling anything, though.

Closing this out; I don't think this qualifies as an actual bug anymore.
