The library supports downloading files from GCS to the local host.
However, it doesn't support transferring files directly to an FTP server.
It would be nice if this were possible directly.
We're unlikely to add a dependency to google-cloud-storage to support this use case. However, `Blob.download_to_file` takes a file-like object, not a filename, which means you should be able to do the copy using paramiko, via something like (untested):
```python
import paramiko

ssh_client = paramiko.SSHClient()
ssh_client.load_system_host_keys()  # trust hosts listed in ~/.ssh/known_hosts
ssh_client.connect(hostname, username=username)  # add password/key args as needed

sftp_client = ssh_client.open_sftp()
# Blob.download_to_file accepts any file-like object, so the blob's
# contents are written straight to the remote SFTP file.
with sftp_client.file('blob_copy.txt', 'wb') as target:
    blob.download_to_file(target)

sftp_client.close()
ssh_client.close()
```
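The reason this avoids holding the whole file in memory is that `Blob.download_to_file` writes to the target in chunks. The same bounded-memory pattern can be illustrated with standard-library stand-ins — here `io.BytesIO` objects are hypothetical placeholders for the real GCS download stream and SFTP file handle, untested against either service:

```python
import io
import shutil

# Stand-ins for the real streams: in practice `source` would be the GCS
# download and `target` the file returned by sftp_client.file(...).
source = io.BytesIO(b"x" * (1024 * 1024))  # 1 MiB of example data
target = io.BytesIO()

# Copy in fixed-size chunks; peak memory is one chunk, not the whole file.
shutil.copyfileobj(source, target, length=64 * 1024)

assert target.getvalue() == source.getvalue()
```

Any pair of objects exposing `read()`/`write()` works the same way, which is why the SFTP file handle above can serve as the download target.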
@tseaver Does this solution support streaming? That is, could I use it to download and upload a file that exceeds the maximum memory of a GCF instance?