Question, Bug, or Feature?
Type: Question
Enter Task Name: CacheBeta
Can we use the CacheBeta task to cache Docker images instead of pulling them every time? I tried it with the cache path set to /var/lib/docker on a Hosted Ubuntu agent; as expected, it failed with permission denied. Is there any way to do this, or is there any documentation?
I would also like a mechanism for speeding up the acquisition of my Docker image layers, whether via the cache pipeline task or something else.
FWIW I had started experimenting with this and there are a few options I came across:
buildctl build ... --export-cache type=local,dest=path/to/output-dir
buildctl build ... --import-cache type=local,src=path/to/input-dir
The directory layout conforms to OCI Image Spec v1.0.
--export-cache options:
- mode=min (default): only export layers for the resulting image
- mode=max: export all the layers of all intermediate steps
- ref=docker.io/user/image:tag: reference for the registry cache exporter
- dest=path/to/output-dir: directory for the local cache exporter
--import-cache options:
- ref=docker.io/user/image:tag: reference for the registry cache importer
- src=path/to/input-dir: directory for the local cache importer
- digest=sha256:deadbeef: digest of the manifest list to import for the local cache importer. Defaults to the digest of the "latest" tag in index.json.
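To make those flags concrete, here is a minimal sketch of how a build script could combine the local import and export options. The cache directory and Dockerfile location are assumptions, and the command is only assembled and echoed here rather than executed, so the sketch does not need buildctl installed:

```shell
#!/bin/sh
# Sketch: assemble a buildctl invocation that reuses a local layer cache.
# CACHE_DIR is a placeholder path; the command is echoed, not run.
CACHE_DIR="${CACHE_DIR:-/tmp/buildkit-cache}"

CMD="buildctl build --frontend dockerfile.v0 --local context=. --local dockerfile=."

# Only import when a previous run has actually populated the cache directory
# (a local-cache export writes an index.json there).
if [ -f "$CACHE_DIR/index.json" ]; then
  CMD="$CMD --import-cache type=local,src=$CACHE_DIR"
fi

# mode=max also exports layers from intermediate build stages.
CMD="$CMD --export-cache mode=max,type=local,dest=$CACHE_DIR"

echo "$CMD"
```

The cache directory would then be handed to the CacheBeta task so it survives between builds.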
- docker save/load (I've found the performance of this to be pretty bad)
- use buildctl --import-cache and --export-cache
Hey @johnterickson, any feedback regarding these options and outcome?
@Rod-Sychev - I'd recommend going the buildctl route. For whatever reason, "docker load" from a _local_ file was slower than doing a pull from a _remote_ registry. If you don't have a private registry and you have super slow-to-run layers (e.g. building lots of native dependencies) then "docker load/save" might still pay off.
The challenge with buildctl is getting it installed on the agent.
@Rod-Sychev btw here's my YML for buildctl tinkering: https://dev.azure.com/codesharing-SU0/cachesandbox/_git/Scripts?path=%2Fdocker.yml&version=GBmaster
Here's what I see:
No cache: Total ~3m

Run 1 (Cache Miss): https://dev.azure.com/codesharing-SU0/cachesandbox/_build/results?buildId=16043&view=results
- docker build: 3 minutes + 30 seconds exporting layers

Run 2 (Cache Hit): https://dev.azure.com/codesharing-SU0/cachesandbox/_build/results?buildId=16048&view=logs&j=12f1170f-54f2-53f3-20dd-22fc7dff55f9
- restore 'buildkit': 10s
- restore docker layers: 12s
- docker build: 1m 12s
- Total: 1m 46s

Run 3 (Deps hit, src change): https://dev.azure.com/codesharing-SU0/cachesandbox/_build/results?buildId=16049&view=results
- restore 'buildkit': 9s
- restore docker layers: 19s
- docker build: 1m 30s
- Total: 2m 8s
Not sure why it's taking docker build so long when all layers are present.
@Rod-Sychev, @ygnr, @tjhowse - Can you please refer to https://github.com/fadnavistanmay/Scripts/blob/master/docker.yml for caching the Docker base image layers? Let us know how it goes.
I have implemented the steps and scripting from the docker.yml file for a .NET Core microservice application, but it doesn't seem to be any faster, even though caching is used.
In particular, dotnet restore seems to take a long time (unpacking 48.0s done), even though it looks like it should have been cached (given the CACHED output at the end):
#16 [build 5/9] RUN dotnet restore
#16 pulling sha256:e6220b144b9c6c6328e87c1889462088917561bdb4fa73a878066f3f8a89878a
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a
#16 pulling sha256:c156b8c92f5448b42c13d4a1fb43a5517e7cdd078fc0306575c6567f6a60feca
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.3s done
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a 0.3s done
#16 pulling sha256:c156b8c92f5448b42c13d4a1fb43a5517e7cdd078fc0306575c6567f6a60feca 0.3s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.5s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 1.0s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 1.2s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 3.2s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 4.2s done
#16 pulling sha256:e6220b144b9c6c6328e87c1889462088917561bdb4fa73a878066f3f8a89878a 6.0s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 10.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 17.4s done
#16 unpacking
#16 unpacking 48.0s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 0.1s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a
#16 pulling sha256:c156b8c92f5448b42c13d4a1fb43a5517e7cdd078fc0306575c6567f6a60feca 0.0s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 0.1s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 0.1s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.1s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 0.1s done
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.1s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 0.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 0.1s done
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a 0.1s done
#16 unpacking 0.1s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 0.1s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 0.1s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 0.1s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.1s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 0.1s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 0.1s done
#16 pulling sha256:86446b9f2ecb592aac25f6fa852b4deff130a6dd5681e84c00a55e295c08bc8a 0.0s done
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 0.1s done
#16 unpacking 0.1s done
#16 pulling sha256:6ef14aff1139e1065ec0928ae1c07f2cff8c2b35e760f4b463df5c64e6ea1101 0.1s done
#16 pulling sha256:c527443ccfede68672bb54490c4ee256c1746af05b291928c2cdb0c97beb8545 0.0s done
#16 pulling sha256:c05081985e91e6b5326feeb7606d42deb420a10bfa800b4446158093b7b7b8f8 0.1s done
#16 pulling sha256:9a0b0ce99936ce4861d44ce1f193e881e5b40b5bf1847627061205b092fa7f1d 0.1s done
#16 pulling sha256:db3b6004c61a0e86fbf910b9b4a6611ae79e238a336011a1b5f9b177d85cbf9d 0.1s done
#16 pulling sha256:f8f0759202953be4b156f44bba90b682b61f985f9bbc60e7262b216f70dabb96 0.1s done
#16 pulling sha256:6c5e96b85e8cee8b200325cdc8d5cc1ca84678c37eac127ba163f5c94b5d2a60 0.1s done
#16 pulling sha256:d39c626fbbd181fbf5328dc0979c98498454de2180ea8a73e438f00391201759 0.1s done
#16 unpacking 0.1s done
#16 CACHED
@stefankip Can you share some YAML and Dockerfile to give some context?
I'll check if I can create a test scenario somewhere next week...
So a small update: I did some testing with an Angular project with npm as the package manager.
I used this script for running the build: https://dev.azure.com/codesharing-SU0/cachesandbox/_git/Scripts?path=%2Fdocker.yml&version=GBmaster.
Thanks for the tip @stefankip! I've added mode=max to my script.
4m30s down to 1m30s sounds like a good improvement!
Yeah it does. But my .NET Core project didn't have such a performance improvement. I'll do some more tests with that.
@fadnavistanmay Do you have a docker sample included in your samples repo?
Good point @johnterickson . Let me add it.
Sample is added to https://github.com/fadnavistanmay/azure-pipelines-caching-yaml which we will later align with docs
@johnterickson Thank you. That worked partially for us, but in our case, in order to run some integration tests, we spin up some dependencies in Docker containers (via docker-compose), so these are pulled every time. Any ideas on how I can cache these images?
@ygnr The hosted agents have no persistent state. The images/layers need to be pulled from somewhere on each build (either from Pipeline Caching or elsewhere).
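One possible bridge, as an untested sketch with placeholder image names and paths: save the docker-compose dependency images into a single tarball that Pipeline Caching can persist, and load it back before the tests run. The docker calls are echoed as a dry run here so the sketch does not need a Docker daemon:

```shell
#!/bin/sh
# Sketch: cache docker-compose dependency images with docker save/load.
# IMAGES and CACHE_DIR are placeholders; docker commands are echoed (dry run).
IMAGES="postgres:12 redis:5"
CACHE_DIR="${CACHE_DIR:-/tmp/compose-image-cache}"
TARBALL="$CACHE_DIR/deps.tar"
mkdir -p "$CACHE_DIR"

if [ -f "$TARBALL" ]; then
  # Cache hit: restore the images before 'docker-compose up'.
  echo "docker load -i $TARBALL"
else
  # Cache miss: after pulling, save all dependency images into one tarball
  # (docker save accepts multiple image references).
  echo "docker save $IMAGES -o $TARBALL"
fi
```

The Cache task would then be pointed at the cache directory. As noted earlier in the thread, docker load from a local file can be slower than a registry pull, so whether this pays off depends on the images.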
@johnterickson Is it possible to demonstrate how we can push the built image after it's built from buildctl? We rely on pushing our image to a private registry through an Azure Pipelines service connection
Yeah, I'm struggling with the same thing: getting the resulting image pushed to the ACR.
I tried using the --output type=image,name=acr-url.io/repo:tag,push=false argument and then ran docker images, but the image is not listed :-(
So I found out how we can use the created image.
Add this to the docker command:
--output type=docker,name=some-registry.com/repository:tag
And replace sudo $DOCKER_COMMAND with sudo $DOCKER_COMMAND | docker load.
This way the created image is loaded by docker and you can simply push the image to the registry.
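Put together, that flow would look roughly like this. It is a sketch: the image reference is a placeholder, and the commands are assembled and echoed rather than executed, since running them needs buildctl and a Docker daemon on the agent:

```shell
#!/bin/sh
# Sketch: build with buildctl, emit a docker-format tarball on stdout,
# stream it into the local daemon, then push. IMAGE is a placeholder.
IMAGE="some-registry.com/repository:tag"

BUILD="buildctl build --frontend dockerfile.v0 --local context=. --local dockerfile=. --output type=docker,name=$IMAGE"

# 'type=docker' writes a docker-loadable tarball to stdout, so the build
# output can be piped straight into the daemon:
echo "$BUILD | docker load"
echo "docker push $IMAGE"
```

After the docker load step the image shows up in docker images under the given name, so the usual push (e.g. via an Azure Pipelines service connection) works unchanged.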
Can someone explain what the GOOFS=0 part is in the Buildkit cache key?
And what the purpose is of the CACHE_KEY_FALLBACK restoreKey?
Stefan, I'm trying to follow your suggestions. In this block of code, how should I add those options to the docker command?
DOCKER_COMMAND="$(DOCKER_COMMAND)"
if [ -d "$(BUILD_KIT_CACHE)" ]; then
  echo "Will use cached layers from $(BUILD_KIT_CACHE)"
  find $(BUILD_KIT_CACHE)
  DOCKER_COMMAND="$DOCKER_COMMAND --import-cache type=local,src=$(BUILD_KIT_CACHE)"
fi
if [ "$(BuildKitLayersHit)" != "true" ]; then
  echo "Will store cached layers to $(BUILD_KIT_CACHE)"
  DOCKER_COMMAND="$DOCKER_COMMAND --export-cache mode=max,type=local,dest=$(BUILD_KIT_CACHE)"
fi
sudo $DOCKER_COMMAND
EDIT:
When I append it to DOCKER_COMMAND where the variables are defined, I get:
error: could not read /home/vsts/work/1/buildkitcache/index.json: open /home/vsts/work/1/buildkitcache/index.json: no such file or directory
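For what it's worth, one way the --output flag from Stefan's comment could be slotted into that script, as a sketch with a placeholder image reference: append it to DOCKER_COMMAND after the cache flags, and pipe the final invocation into docker load. The index.json error above also looks like --import-cache being pointed at a cache directory that no --export-cache run had populated yet, so guarding the import on that file may help:

```shell
#!/bin/sh
# Sketch: where the --output flag fits relative to the cache flags.
# IMAGE and BUILD_KIT_CACHE stand in for the pipeline variables; the final
# command is echoed rather than run.
IMAGE="some-registry.com/repository:tag"
BUILD_KIT_CACHE="${BUILD_KIT_CACHE:-/tmp/buildkitcache}"

DOCKER_COMMAND="buildctl build --frontend dockerfile.v0 --local context=. --local dockerfile=."

# Import only when a previous export actually wrote an index.json there.
if [ -f "$BUILD_KIT_CACHE/index.json" ]; then
  DOCKER_COMMAND="$DOCKER_COMMAND --import-cache type=local,src=$BUILD_KIT_CACHE"
fi
DOCKER_COMMAND="$DOCKER_COMMAND --export-cache mode=max,type=local,dest=$BUILD_KIT_CACHE"

# The output flag goes last, and the plain 'sudo $DOCKER_COMMAND' line
# becomes a pipe into the daemon:
DOCKER_COMMAND="$DOCKER_COMMAND --output type=docker,name=$IMAGE"
echo "sudo $DOCKER_COMMAND | docker load"
```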