I'm trying to exclude multiple files/folders by giving multiple --exclude
arguments to an aws s3 sync
command, but it seems that only the first one is taken into account.
If multiple arguments aren't supported, how can I put multiple exclude expressions into a single --exclude argument?
Here is a command I'm trying to run:
aws s3 sync /path/to/local/folder s3://bucket-name/folder-on-bucket --delete --exclude=*.svn/* --exclude=cache --exclude=.restricted --exclude=tmp --exclude=*.php
In the end, the .svn
folders at any depth are not present, but the other directories/files are still there.
Maybe there is some problem with recognizing the dot (.) in directory names, and this breaks the whole exclude expression.
If your shell is bash, you'll have to quote the wildcard values, as the shell will expand them before they're passed to the CLI. So:
aws s3 sync /path/to/local/folder s3://bucket-name/folder-on-bucket --delete --exclude="*.svn/*" --exclude=cache --exclude=".restricted" --exclude=tmp --exclude="*.php"
Can you try that and see if that works?
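To see what the shell actually does to an unquoted glob, here is a small standalone demo (it only uses a scratch directory from mktemp and echo, nothing AWS-specific; the filenames are made up):

```shell
#!/bin/sh
# Create a scratch directory with two hypothetical .php files.
dir=$(mktemp -d)
cd "$dir" || exit 1
touch a.php b.php

# Unquoted: the shell expands *.php to the matching filenames before
# the command (here, echo) ever runs.
echo unquoted: *.php      # prints "unquoted: a.php b.php"

# Quoted: the literal pattern reaches the command untouched, which is
# what aws s3 sync needs to see to do its own matching.
echo quoted: "*.php"      # prints "quoted: *.php"

# Clean up the scratch directory.
cd / && rm -rf "$dir"
```

The same substitution happens to an unquoted --exclude=*.php argument, which is why the quoted form above behaves differently.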
Actually, it did work.
I also tried quoting the non-wildcard values, e.g. "tmp", and as a result the tmp folder was still uploaded. But why do I need to quote a folder name if it contains a "."? Maybe there is some kind of regex support here that isn't explained in the docs?
Sorry, you don't need to quote a directory name if it contains a '.' character; only special characters that are consumed by the shell, such as the wildcard '*', need to be quoted. To exclude a directory's contents, you can say --exclude "tmp/*".
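The difference between excluding "tmp" and "tmp/*" can be sketched with plain shell pattern matching. The shell's case statement uses glob patterns, which is only an approximation of the CLI's own matcher, but it illustrates the point; the match helper and the file paths below are made up for the demo:

```shell
#!/bin/sh
# match PATTERN PATH -> prints yes/no depending on whether PATH matches
# the glob PATTERN (a rough stand-in for how an exclude filter is tested
# against each file's path).
match() {
  case "$2" in
    $1) echo yes ;;
    *)  echo no ;;
  esac
}

match "tmp"   "tmp/file.txt"   # prints "no"  - "tmp" matches only a path named exactly tmp
match "tmp/*" "tmp/file.txt"   # prints "yes" - contents of tmp/ are matched
```

This is why quoting "tmp" made no difference: the pattern never matched the files inside the folder in the first place.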