Made the derp of muscle memory-ing a command I use on a different machine. If the target destination of an s3 cp points to a nonexistent directory, you get a silent failure. Not sure if this is intended to punish me for the 30 seconds it took to realize I was on a different box with different dirs, but it would have been nice to have been told off by the CLI.
(on macOS High Sierra)
This issue has been automatically closed because there has been no response to our request for more information from the original author. With only the information that is currently in the issue, we don't have enough information to take action. Please reach out if you have or find the answers we need so that we can investigate further.
@mtopolski - Thanks for reaching out. Sorry this issue was closed without a response. In looking into this issue, I found a few other related issues: #2430, #1069, and #1645. In order to investigate this issue better, please provide an example of the command and version of the CLI in use so I can try to reproduce the issue.
Thank you for reaching out, no worries. I experienced this specifically on macOS High Sierra. I haven't tried it with Mojave yet, and I'm on Debian now, where I get a nice little error.
aws s3 cp s3://my.stuff.com/file.tar.zstd /nonexistantpath
@mtopolski - Thanks for your feedback. The error behavior appears to be expected behavior from the service, which the CLI does not control; however, I cannot reproduce the same results to confirm this. Please rerun the command with the --debug
option added at the end and post the sanitized output. I would like to analyze the CLI version in use, the actual command that is getting passed, and the output in the response body.
@mtopolski @justnance
I have tried a similar action with the two versions below and found this to be working. The CLI automatically creates the local folder if it does not exist and then copies the file to that location.
Versions verified:
shabeeb@Shabeeb-LT:/home/shabeeb/workspace/repos/aws-cli$ uname -a
Linux Shabeeb-LT 4.15.0-42-generic #45-Ubuntu SMP Thu Nov 15 19:32:57 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
shabeeb@Shabeeb-LT:/home/shabeeb/workspace/repos/aws-cli$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 18.04.1 LTS
Release: 18.04
Codename: bionic
shabeeb@Shabeeb-LT:/home/shabeeb/workspace/repos/aws-cli$ aws --version
aws-cli/1.14.44 Python/3.6.7 Linux/4.15.0-42-generic botocore/1.8.48
ubuntu@XXXXXX:/home/ubuntu/$ uname -a
Linux oscar 3.13.0-119-generic #166-Ubuntu SMP Wed May 3 12:18:55 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
ubuntu@XXXXXX:/home/ubuntu/$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 14.04.5 LTS
Release: 14.04
Codename: trusty
ubuntu@XXXXXX:/home/ubuntu/$ aws --version
aws-cli/1.11.170 Python/2.7.6 Linux/3.13.0-119-generic botocore/1.8.13
However, the aws cli fails to print an error message when it tries to copy a file to a path that does not exist and is not owned by the user. Specifically, the CLI is unable to create the path due to a permission issue, but it does not report that failure.
Run the same command with sudo and it works, behaving just as it does in a folder owned by the user.
What needs to be fixed is: _print a message if the CLI fails to cp a file to a folder due to a permission issue_
Command snippets
shabeeb@Shabeeb-LT:$ aws s3 cp s3://amagi-vault-test-storage/core/test.txt /shabeebkhalidtestnofolder/ --profile aws_profile
shabeeb@Shabeeb-LT:$ ls /shabeebkhalidtestnofolder/
ls: cannot access '/shabeebkhalidtestnofolder/': No such file or directory
shabeeb@Shabeeb-LT:$ sudo aws s3 cp s3://amagi-vault-test-storage/core/test.txt /shabeebkhalidtestnofolder/ --profile aws_profile
[sudo] password for shabeeb:
download: s3://amagi-vault-test-storage/core/test.txt to ../../shabeebkhalidtestnofolder/test.txt
shabeeb@Shabeeb-LT:$ ls /shabeebkhalidtestnofolder/
test.txt
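The requested fix could look something like the sketch below. This is hypothetical helper code, not the actual aws-cli implementation: before downloading, attempt to create the destination directory and surface a PermissionError instead of failing silently. The function name `ensure_dest_dir` is an assumption for illustration.

```python
import os
import sys

def ensure_dest_dir(dest_path):
    """Create the destination directory for an s3 cp download,
    reporting a clear error instead of failing silently.
    (Hypothetical sketch, not actual aws-cli code.)"""
    dest_dir = os.path.dirname(dest_path) or "."
    try:
        # exist_ok=True: succeed quietly if the directory already exists
        os.makedirs(dest_dir, exist_ok=True)
        return True
    except PermissionError:
        # This is the message the CLI currently never prints
        print("download failed: cannot create directory %r: "
              "permission denied" % dest_dir, file=sys.stderr)
        return False
```

With a check like this, the first command above would print a permission-denied message and exit nonzero instead of silently downloading nothing.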
@mtopolski @justnance Please let me know if I can help with implementing this.
This issue has been automatically closed because there has been no response to our request for more information from the original author. With only the information that is currently in the issue, we don't have enough information to take action. Please reach out if you have or find the answers we need so that we can investigate further.
@justnance Can you re-open this issue? It looks like a fair one, and I have supplied some more information above as I am also facing the same issue.
@shabeebk - Thanks for your feedback. I am reopening this issue pending further review and investigation.