Aws-cli: Can't copy folders: FileNotFoundError: [Errno 2] No such file or directory

Created on 1 Jun 2020 · 5 comments · Source: aws/aws-cli

Describe the bug

This is the same issue as https://github.com/aws/aws-cli/issues/2690 but it has been closed.

I just installed AWS CLI version 2 on Windows 10.

I am able to upload files with aws s3 cp "C:\Users\a\Desktop\test\my.txt" "s3://mybucket/test/"
But not folders: aws s3 cp "C:\Users\a\Desktop\test" "s3://mybucket/test"

I tried with --debug 2> log.txt
The error is: FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\a\\Desktop\\test\\'
But the folder exists (I am sure that there is no typo).

I tried this fix, but I get this error: The system cannot find the path specified


Version and platform
aws-cli/2.0.17 Python/3.7.7 Windows/10 botocore/2.0.0dev21

Logs/output
Relevant part of the log

2020-06-01 11:58:20,063 - ThreadPoolExecutor-0_0 - botocore.awsrequest - DEBUG - Unable to rewind stream: [Errno 2] No such file or directory: 'C:\\Users\\a\\Desktop\\test\\'
2020-06-01 11:58:20,063 - ThreadPoolExecutor-0_0 - s3transfer.tasks - DEBUG - Exception raised.
Traceback (most recent call last):
  File "lib\site-packages\botocore\awsrequest.py", line 518, in reset_stream
  File "lib\site-packages\s3transfer\utils.py", line 503, in seek
  File "lib\site-packages\s3transfer\upload.py", line 89, in seek
  File "lib\site-packages\s3transfer\utils.py", line 367, in seek
  File "lib\site-packages\s3transfer\utils.py", line 350, in _open_if_needed
  File "lib\site-packages\s3transfer\utils.py", line 261, in open
FileNotFoundError: [Errno 2] No such file or directory: 'C:\\Users\\a\\Desktop\\test\\'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "lib\site-packages\s3transfer\tasks.py", line 126, in __call__
  File "lib\site-packages\s3transfer\tasks.py", line 150, in _execute_main
  File "lib\site-packages\s3transfer\upload.py", line 692, in _main
  File "lib\site-packages\botocore\client.py", line 208, in _api_call
  File "lib\site-packages\botocore\client.py", line 514, in _make_api_call
  File "lib\site-packages\botocore\client.py", line 533, in _make_request
  File "lib\site-packages\botocore\endpoint.py", line 102, in make_request
  File "lib\site-packages\botocore\endpoint.py", line 143, in _send_request
  File "lib\site-packages\botocore\awsrequest.py", line 521, in reset_stream
botocore.exceptions.UnseekableStreamError: Need to rewind the stream <s3transfer.utils.ReadFileChunk object at 0x0000022E760B82C8>, but stream is not seekable.

All 5 comments

Same problem, I can't upload folders.

I also cannot upload folders. No error, just no output.
This is my version: aws-cli/2.0.17 Python/3.7.3 Linux/5.3.0-1022-azure botocore/2.0.0dev21

To add to this: this is not working on an Azure release pipeline.

The following version was working previously on the Azure release pipeline:
aws-cli/2.0.10 Python/3.7.3 Linux/5.0.0-1035-azure botocore/2.0.0dev14

If I download the old version 2.0.10, it still says Linux/5.3.0-1022-azure, and it then gives an error:
An HTTP Client raised and unhandled exception: Invalid header value b'AWS4-HMAC-SHA256 Credential=*\r/20200601/us-east-1/s3/aws4_request, SignedHeaders=content-md5;content-type;host;x-amz-content-sha256;x-amz-date, Signature=a522ec3f20d0af03a256fc8c48848d29fe68922d680cc06aac6914538c1367f5

I tried this on Ubuntu 20.04 Agent:
aws-cli/2.0.17 Python/3.7.3 Linux/5.4.0-1012-azure botocore/2.0.0dev21
Again, no error, just no output.

I wanted to add my command so you can see the --recursive, and again this worked on the older version:
aws s3 cp /home/vsts/work/r1/a/_tableau-wdc-build/drop/ s3://ourbucket.tableau.dev/5c13923122f19ec060cfed11202dd065072877c1/ --recursive --exclude ".git/*"
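
For anyone else hitting the same silent failure, a small debugging sketch (not an official workaround): re-run the same copy with --debug and capture the exit code, to see whether the CLI is failing without printing anything. The local path and bucket are just the ones from my command above; the log file name s3cp-debug.log is arbitrary, and this assumes a bash step in the pipeline.

# Capture the full debug log and the command's exit code
aws s3 cp /home/vsts/work/r1/a/_tableau-wdc-build/drop/ s3://ourbucket.tableau.dev/5c13923122f19ec060cfed11202dd065072877c1/ --recursive --exclude ".git/*" --debug 2> s3cp-debug.log
echo "exit code: $?"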

Hi @MagTun ,
When copying folders using the aws s3 cp command, you need to add the --recursive option to the command; see the "Recursively copying local files to S3" section in the command reference:
aws s3 cp myDir s3://mybucket/ --recursive

Let me know if this doesn't work for you either.

@KaibaLopez, thanks a lot for your help. Yes, with --recursive it's working... but the command isn't doing what I wanted to do.

Let's say I have this folder structure:
myDir:
- filea
- fileb
- Folder1
  - file1a
  - file1b

aws s3 cp myDir s3://mybucket/ --recursive will upload the whole of myDir. But I only want to upload filea and fileb, not Folder1, so intuitively I removed the --recursive. But apparently that's not possible...

Also, as a side note, it would be great to have a concrete example, because myDir isn't explicit enough for a beginner. Should it be a relative or absolute path, with / or \, or ...?

Thanks !

Hi @MagTun ,
Then add --exclude "Folder1/*" to the command; that should work for what you're trying to do.
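
To make that concrete, here is a sketch combining the Windows path from your original report with the folder layout above (the destination prefix is only illustrative):

aws s3 cp "C:\Users\a\Desktop\test" "s3://mybucket/test/" --recursive --exclude "Folder1/*"

That should copy filea and fileb but skip everything under Folder1.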

As for the examples, there are a lot of them in the command reference, and you can also find some under the "aws-cli/awscli/examples/" folder.
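
On the relative vs. absolute path question: as far as I know, both are accepted for the local argument, and on Windows the usual backslash form works (forward slashes generally do too). A couple of illustrative forms, reusing the path from the original report:

aws s3 cp "C:\Users\a\Desktop\test" "s3://mybucket/test/" --recursive
aws s3 cp test "s3://mybucket/test/" --recursive

The second form assumes the shell's current directory is C:\Users\a\Desktop.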
