A helpful error message.
I can't tell whether it's coming from skaffold or the docker CLI:
$ skaffold dev -f foo/skaffold.yaml
WARN[0001] Error processing base image for onbuild triggers: Get https://auth.docker.io/token?scope=repository%3Alibrary%2Fpython%3Apull&service=registry.docker.io: exit status 1. Dependencies may be incomplete.
WARN[0001] Error processing base image for onbuild triggers: Get https://auth.docker.io/token?scope=repository%3Amicrosoft%2Fdotnet%3Apull&service=registry.docker.io: exit status 1. Dependencies may be incomplete.
WARN[0001] Error processing base image for onbuild triggers: Get https://auth.docker.io/token?scope=repository%3Alibrary%2Fnode%3Apull&service=registry.docker.io: exit status 1. Dependencies may be incomplete.
Skaffold version: v0.5.0
Operating system: macOS
Content of skaffold.yaml:
apiVersion: skaffold/v1alpha2
kind: Config
build:
  artifacts:
  - imageName: vote
    workspace: voting-app/vote/
  - imageName: worker
    workspace: voting-app/worker/
  - imageName: result
    workspace: voting-app/result/
deploy:
  kubectl:
    manifests:
    - ./voting-app/k8s-specifications/*
~/.docker/config contents:
{
  "credHelpers" : {
    "us.gcr.io" : "gcloud",
    "asia.gcr.io" : "gcloud",
    "gcr.io" : "gcloud",
    "staging-k8s.gcr.io" : "gcloud",
    "eu.gcr.io" : "gcloud"
  },
  "auths" : {
    "https://gcr.io" : {},
    "us.gcr.io" : {},
    "https://staging-k8s.gcr.io" : {},
    "k8s.gcr.io" : {},
    "https://eu.gcr.io" : {},
    "staging-k8s.gcr.io" : {},
    "gcr.io" : {},
    "asia.gcr.io" : {},
    "https://k8s.gcr.io" : {},
    "https://asia.gcr.io" : {},
    "https://us.gcr.io" : {},
    "eu.gcr.io" : {}
  },
  "HttpHeaders" : {
    "User-Agent" : "Docker-Client/18.03.0-ce (darwin)"
  },
  "experimental" : "enabled",
  "credsStore" : "osxkeychain",
  "orchestrator" : "swarm"
}
The skaffold dev command is the one above. A ~/.docker/config.json change made things proceed, but I get different warnings:
$ skaffold dev -f 10-local-build-run-with-minikube/skaffold.yaml
2018/05/01 23:31:46 Unable to read "/Users/ahmetb/.docker/config.json", falling back on anonymous: open /Users/ahmetb/.docker/config.json: no such file or directory
2018/05/01 23:31:47 manifest digest: "sha256:ee8b362b5afc1e9db00b79c4db7d7744986507a8ba1d88f53e531e46e93454ec" does not match Docker-Content-Digest: "sha256:a346d13efede199fe161e5146acf080a205256431d82cc26f845eeb9c47b7bf7" for "index.docker.io/library/python:2.7-alpine"
2018/05/01 23:31:48 Unable to read "/Users/ahmetb/.docker/config.json", falling back on anonymous: open /Users/ahmetb/.docker/config.json: no such file or directory
2018/05/01 23:31:49 manifest digest: "sha256:01528b7048c72d7a36f4c216ce1b2c60751308b27538c140f12e0b10ba392246" does not match Docker-Content-Digest: "sha256:f4ea9cdf980bb9512523a3fb88e30f2b83cce4b0cddd2972bc36685461081e2f" for "index.docker.io/microsoft/dotnet:2.0.0-sdk"
2018/05/01 23:31:49 Unable to read "/Users/ahmetb/.docker/config.json", falling back on anonymous: open /Users/ahmetb/.docker/config.json: no such file or directory
2018/05/01 23:31:50 manifest digest: "sha256:14b627a91c92566d489d9d9073e465563be0e0c598c9537aa32e871a812018f5" does not match Docker-Content-Digest: "sha256:6bb963d58da845cf66a22bc5a48bb8c686f91d30240f0798feb0d61a2832fc46" for "index.docker.io/library/node:8.9-alpine"
Starting build...
Found [minikube] context, using local docker daemon.
WARN[0004] run: build: build step: running build for artifact: running build: read auth configs: docker config: opening docker config: open /Users/ahmetb/.docker/config.json: no such file or directory
Watching for changes...
/cc @dekkagaijin
This is happening to me too.
@ahmetb are you using a multi-stage Docker build by any chance?
That error is emitted from the function GetDockerfileDependencies(), which is documented as:
// GetDockerfileDependencies parses a dockerfile and returns the full paths
// of all the source files that the resulting docker image depends on.
I'm using a multistage build which is something like:
FROM node:10.1.0-alpine AS dependencies
...
FROM dependencies AS builder
...
FROM dependencies
COPY --from=builder a b
Here is the debug output I'm getting:
Starting build...
Found [minikube] context, using local docker daemon.
DEBU[0004] Running docker build: context: ., dockerfile: Dockerfile
DEBU[0004] Checking base image node:10.1.0-alpine for ONBUILD triggers.
DEBU[0004] Found onbuild triggers [] in image node:10.1.0-alpine
DEBU[0004] Checking base image dependencies for ONBUILD triggers.
WARN[0005] Error processing base image for onbuild triggers: Get https://auth.docker.io/token?scope=repository%3Alibrary%2Fdependencies%3Apull&service=registry.docker.io: exit status 1. Dependencies may be incomplete.
DEBU[0005] Checking base image dependencies for ONBUILD triggers.
WARN[0005] Error processing base image for onbuild triggers: Get https://auth.docker.io/token?scope=repository%3Alibrary%2Fdependencies%3Apull&service=registry.docker.io: exit status 1. Dependencies may be incomplete.
INFO[0005] Found dependencies for dockerfile [src package.json yarn.lock tsconfig.json]
DEBU[0005] Recursively adding src
INFO[0005] deps [src/index.test.ts src/index.ts package.json yarn.lock tsconfig.json]
This made me do a double-take to verify that this is actually valid (referencing a previous stage in a FROM), and it is:
FROM can appear multiple times within a single Dockerfile to create multiple images or use one build stage as a dependency for another.
My guess is that skaffold is treating these as real dependencies from a registry, so it fails when it queries the registry's API. If so, I'm not sure what the specific rules are for when/how/whether a previous stage name shadows a real registry dependency.
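To illustrate that guess: a previous-stage name is indistinguishable from a short image name, and short names get normalized to Docker Hub references under the library namespace. A rough, much-simplified sketch of that normalization (hypothetical code, not Docker's actual reference parser; registry hosts, ports, and digests are ignored):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeReference sketches Docker's short-name normalization. A bare name
// with no registry and no namespace defaults to Docker Hub's "library"
// namespace -- which is why a stage named "dependencies" shows up in the
// failing token request as scope repository:library/dependencies:pull.
func normalizeReference(name string) string {
	tag := "latest"
	if i := strings.LastIndex(name, ":"); i != -1 {
		tag = name[i+1:]
		name = name[:i]
	}
	if !strings.Contains(name, "/") {
		name = "library/" + name // official images live under "library"
	}
	return "index.docker.io/" + name + ":" + tag
}

func main() {
	fmt.Println(normalizeReference("dependencies"))               // index.docker.io/library/dependencies:latest
	fmt.Println(normalizeReference("microsoft/dotnet:2.0.0-sdk")) // index.docker.io/microsoft/dotnet:2.0.0-sdk
}
```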
I am using https://github.com/dockersamples/example-voting-app, so it's not really multi-stage.
@blaenk looks to me like skaffold isn't correctly handling FROM <previous-stage>.
I think you're getting to here: https://github.com/GoogleContainerTools/skaffold/blob/cb6c48c35a8fb27ad0c9f89e2a4409846f87e782/pkg/skaffold/docker/parse.go#L221
But we should probably check to see if it's the name of a previous stage before calling out to a registry... maybe around here? https://github.com/GoogleContainerTools/skaffold/blob/cb6c48c35a8fb27ad0c9f89e2a4409846f87e782/pkg/skaffold/docker/parse.go#L86
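A minimal sketch of that check (hypothetical code, not skaffold's actual parse.go, which walks the Dockerfile AST): record each stage alias as the FROM lines are walked, and skip the registry lookup when the base name matches a previous stage.

```go
package main

import (
	"fmt"
	"strings"
)

// stageAwareBaseImages returns only the base images that refer to external
// registries, skipping FROM references that name a previous build stage.
// This line-based version just illustrates the idea.
func stageAwareBaseImages(dockerfile string) []string {
	stages := map[string]bool{} // aliases declared via "FROM <image> AS <alias>"
	var external []string
	for _, line := range strings.Split(dockerfile, "\n") {
		fields := strings.Fields(line)
		if len(fields) < 2 || !strings.EqualFold(fields[0], "FROM") {
			continue
		}
		base := strings.ToLower(fields[1])
		if !stages[base] {
			// Not a previous stage: this is a real registry image.
			external = append(external, fields[1])
		}
		if len(fields) == 4 && strings.EqualFold(fields[2], "AS") {
			stages[strings.ToLower(fields[3])] = true
		}
	}
	return external
}

func main() {
	df := "FROM node:10.1.0-alpine AS dependencies\n" +
		"FROM dependencies AS builder\n" +
		"FROM dependencies"
	fmt.Println(stageAwareBaseImages(df)) // prints [node:10.1.0-alpine]
}
```

With this check, only node:10.1.0-alpine would trigger a registry query for the multi-stage Dockerfile above; the "dependencies" stage would be resolved locally.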
Maybe worth opening a separate issue to track it? I don't think this is the same thing ahmetb is hitting.
Thanks, I agree. If someone doesn't beat me to it, I'll write up an issue later today and possibly look into it further.
In my case, it's due to https://github.com/google/go-containerregistry/issues/156
This should be fixed when we update the library, right? @dgageot
Yes. Closing this
Seeing this again in 0.9.0 (and 0.8.0 I believe). Maybe we didn't do dep right at some point and clobbered the changes?
I can confirm that this is still an issue in v0.12.0.
The Dockerfile structure looks like:
FROM microsoft/dotnet:2.1-aspnetcore-runtime AS base
...
FROM microsoft/dotnet:2.1-sdk AS build
...
FROM build AS publish
...
FROM base AS final
...
Once my service comes online, Skaffold will output the following two errors periodically until I stop the process:
Error processing base image for onbuild triggers: getting remote config: getting image: Get https://auth.docker.io/token?scope=repository%3Alibrary%2Fbuild%3Apull&service=registry.docker.io: exit status 1. Dependencies may be incomplete.
Error processing base image for onbuild triggers: getting remote config: getting image: Get https://auth.docker.io/token?scope=repository%3Alibrary%2Fbase%3Apull&service=registry.docker.io: exit status 1. Dependencies may be incomplete.
@danwkennedy that looks like what I described in my comment. Skaffold is trying to fetch the image library/base from Docker Hub instead of using the previous stage.
@jonjohnsonjr I agree. I just wanted to confirm that it's happening in my project on the latest Skaffold (and that it's kind of annoying). Luckily we didn't need all those extra stages, so the issue is mitigated for me. However, the Dockerfile was auto-generated by Visual Studio, so anyone who wants to use VS + Skaffold will hit this issue.
Oh, sorry @priyawadhwa, I hadn't seen that you were assigned to this issue before I submitted a PR.
@dgageot no worries! :)
Still seeing this on v0.16.0
Also in v0.25.0.