The amazon/aws-sam-cli-build-image-python3.7 Docker image cannot be downloaded on Travis while running sam build --use-container. This issue doesn't happen on macOS Catalina (10.15.5) and Ubuntu Eoan (19.10).
Running docker pull amazon/aws-sam-cli-build-image-python3.7 works fine everywhere, and if the image is pulled explicitly this way on Travis, sam build --use-container completes successfully.
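For reference, the workaround can be sketched as a CI script fragment (the image name matches the one from the report above; adapt the runtime suffix to your project):

```shell
# Pre-pull the build image so `sam build --use-container` finds it locally
# instead of failing to download it on Travis.
docker pull amazon/aws-sam-cli-build-image-python3.7
sam build --use-container
```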
Source: https://github.com/BR0kEN-/aws-sam-cli--build-travis (created specifically to reproduce this behavior)
Example build: https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/builds/176709868
The sam build --use-container should download the Docker image on Travis.
sam --version: 1.0.0

This is reproducible for at least two runtimes: python3.7 and nodejs12.x. See https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/builds/176727250
This also happens on the latest Docker version available at the time of writing, 19.03.12. See https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/jobs/363867522.
This build shows that running docker pull before sam build works around the problem. See https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/builds/176733726
Hello,
I hope to add some information to this issue, as I am experiencing the same or a very similar problem.
I tried a simple hello-world example in a Bitbucket pipeline (probably a user-mapping-related issue) and got the following log output.
6b6e3df282b0: Download complete
6b6e3df282b0: Pull complete
docker: failed to register layer: Error processing tar file(exit status 1): Container ID 1230228 cannot be mapped to a host ID.
@BR0kEN- Can you put the command docker run amazon/aws-sam-cli-build-image-python3.7 echo 'hello world' into your Travis pipeline and see if you get the same or similar output? If so, this is surely related; otherwise I will open a new issue. A docker pull on Bitbucket Cloud does not fix the issue.
Found a good description of the issue here: https://circleci.com/docs/2.0/high-uid-error/
Cheers,
Tobias
I am afraid these are two different issues, @tobwiens.
In this run https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/jobs/363879074, you can find docker pull "amazon/aws-sam-cli-build-image-$SAM_RUNTIME" that downloads amazon/aws-sam-cli-build-image-python3.7:latest.
The problem I have in a nutshell:
- sam build --use-container doesn't work on Travis since 1.0.0.
- Running docker pull "amazon/aws-sam-cli-build-image-$SAM_RUNTIME" before sam build --use-container avoids the above problem.
- Before sam 1.0.0 I was using sam 0.53.0, and the Docker images came from https://hub.docker.com/r/lambci/lambda instead of https://hub.docker.com/r/amazon.
These issues produce an error message very similar to the one I see: RuntimeError: Container does not exist. Cannot get logs for this container.
I added builds with sam:0.53.0 to show the problem does not exist on that version. See https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/builds/176831893.
@BR0kEN- Thank you for your input. Yes, it seems these are not related.
Opened a new issue here: https://github.com/awslabs/aws-sam-cli/issues/2125
@tobwiens did you try to reproduce the behavior outside a CI service? I am starting to think there may be a single root cause: both of us observed the error on CI (Bitbucket and Travis), and I didn't manage to reproduce it anywhere else.
@BR0kEN- Exactly the same in Bitbucket Cloud: it cannot be reproduced anywhere else.
I suspected it is linked to this:
Docker user namespace configurations are stored within two files, '/etc/subuid' and '/etc/subgid', which look as follows on Bitbucket Pipelines (https://jira.atlassian.com/browse/BCLOUD-17319)
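To illustrate the mapping failure, here is a minimal sketch that checks whether the container UID from the error above falls inside a subordinate-ID range of the form user:start:length (the range value below is an assumed example entry, not Bitbucket's actual configuration; real values come from /etc/subuid):

```shell
# Hypothetical /etc/subuid entry: <user>:<start-subordinate-id>:<range-length>
range="dockremap:100000:65536"
start=$(echo "$range" | cut -d: -f2)
len=$(echo "$range" | cut -d: -f3)

# The container UID from the "cannot be mapped to a host ID" error.
uid=1230228

# A layer entry is mappable only if its UID fits inside the allotted range.
if [ "$uid" -ge "$start" ] && [ "$uid" -lt $((start + len)) ]; then
  echo "mappable"
else
  echo "not mappable"
fi
```

With the example range, 1230228 lies far beyond 100000 + 65536, so the layer registration fails exactly as in the log.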
I have set up a workaround image here: https://hub.docker.com/repository/docker/tobwiens/aws-sam-cli-build-image-python3.7-fix2125/general
I added RUN chown -R root:root /var && chown -R root:root /THIRD-PARTY-LICENSES.txt at the end of the Dockerfile.
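As a sketch of that workaround (not the official fix; the tag name is made up, and the base image and paths are taken from the error message and the comment above), the patched image could be built like this:

```shell
# Write a minimal Dockerfile that reassigns the high-UID files to root,
# so every layer entry can be mapped into the CI host's subordinate UID range.
cat > Dockerfile.fix <<'EOF'
FROM amazon/aws-sam-cli-build-image-python3.7
RUN chown -R root:root /var && chown -R root:root /THIRD-PARTY-LICENSES.txt
EOF

# Build the patched image locally (hypothetical tag).
docker build -f Dockerfile.fix -t aws-sam-cli-build-image-python3.7-fix .
```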
A search with find / \( -uid 1230228 \) -ls 2>/dev/null reveals no files with that UID anymore. However, the same error failed to register layer: Error processing tar file(exit status 1): Container ID 1230228 cannot be mapped to a host ID persists inside the Bitbucket pipeline.
Running into the same issue with python 3.6.
Additionally, when using the pre-downloaded container to run tests with sam local invoke, I had to explicitly include boto3 in my custom dependency manifest file when running sam build.
> Running into the same issue with python 3.6.

This does not seem to depend on the runtime; it happens for all of them. See https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/builds/177681130.

> Additionally, when using the pre-downloaded container to run tests with sam local invoke, I had to explicitly include boto3 in my custom dependency manifest file when running sam build.

I didn't experience that issue. It looks odd, since boto3 is a dependency of aws-sam-cli.
We have a proposed fix for this that we're working on. It will not require an upgrade to the SAM CLI, as the issue would be with the images themselves.
@awood45 That's good news. Could you please share more details? I am eager to understand the background: why does the image fail to download only on cloud CI? What trait of the image could cause this behavior?
Can you please re-test this with Python 3.7? We have a proposed fix for this out for that image now. You should not need to change your version of AWS SAM CLI. (This also has been updated for Python 3.6, Python 2.7, and .NET Core 2.1)
Still doesn鈥檛 work. Cron triggers a build on a daily basis so you can open Travis next time to see whether the issue is fixed.
The last build is https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/builds/179466969
@BR0kEN- we think this should be fixed, yet I can see that your Travis builds continue to fail. We want to rule out the possibility that images are somehow being cached and confirm that you're getting the latest images. Can you add this command to your build before AND after you run your sam commands?
docker images
Just want to see what the output is.
@awood45 added. sam build doesn't download the image; docker images after sam build shows nothing. You can compare with the other builds in the matrix (sam 0.53.0 and the explicit docker pull), where the output exists.
This may happen if the system doesn't have enough free space. Make sure you have at least 3 GB free (1-2 GB is not enough in some cases).
I see this was resolved 5 days ago: https://travis-ci.com/github/BR0kEN-/aws-sam-cli--build-travis/builds. I don't know what changed, but this issue can now be closed.