Aws-cli: Leading dashes in parameter values are interpreted as arguments

Created on 9 Feb 2015 · 2 comments · Source: aws/aws-cli

Hey there,

my command is

AWS_ACCESS_KEY_ID=**** AWS_SECRET_ACCESS_KEY=**** aws s3api put-object --bucket MY_BUCKET --no-verify-ssl --region us-east-1 --output json --body /srv/shared/tmp/upload/a53ba1bd98d6abb597d51e9784431198.png --key -YlA6-jsSVG9OFAAGsnhCw.doc --acl public-read --content-type application\/msword

and I get:

usage: aws [options] <command> <subcommand> [parameters]
aws: error: argument --key: expected one argument

So far so good, that's expected. However, when I add double quotes around the --key value, or escape the leading dash, the backslashes and quotes become part of the key name and end up that way in the bucket.

Is there a way to get a file named -YlA6-jsSVG9OFAAGsnhCw.doc into the bucket via aws-cli?

bug

All 2 comments

I can reproduce this; it looks like a parsing issue.

As for getting the file into a bucket, the s3 commands work. Try something like:

aws s3 cp foo.txt s3://mybucket/-foo.txt

This will transfer the local file foo.txt to the bucket mybucket as the key -foo.txt.

That was working for me. Let me know how that works for you.

Another option you have is to use the --option=value form when specifying parameters:

aws s3api put-object --key=-file-with-dash --bucket j

This is an issue with the stdlib argument parsing library we use (http://bugs.python.org/issue9334). Given the two workarounds shown here, we should wait for the bug to be fixed upstream in argparse.
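The upstream behavior is easy to reproduce with a minimal argparse script (a sketch, not the aws-cli source; the `--key` option here just mirrors the CLI parameter). A value passed as a separate token is mistaken for a new option because it starts with a dash, while the `--option=value` form keeps the dash inside the same token:

```python
import argparse

parser = argparse.ArgumentParser(prog="demo")
parser.add_argument("--key")
parser.add_argument("--bucket")

# Separate-token form: argparse sees "-YlA6-file.doc" as an unknown
# option, so --key appears to have no value and parsing errors out
# (argparse calls sys.exit, hence SystemExit).
try:
    parser.parse_args(["--key", "-YlA6-file.doc", "--bucket", "b"])
except SystemExit:
    print("separate-token form rejected")

# --option=value form: the dash stays inside the token, so it parses.
args = parser.parse_args(["--key=-YlA6-file.doc", "--bucket", "b"])
print(args.key)  # -YlA6-file.doc
```

This is why `--key=-file-with-dash` works while `--key -file-with-dash` does not, independent of any shell quoting.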
