Install Method: How did you install the CLI? (e.g. pip, interactive script, apt-get, Docker, MSI, nightly)
Answer here: apt-get
CLI Version: What version of the CLI and modules are installed? (Use `az --version`)
Answer here:
azure-cli (2.0.6)
acr (2.0.4)
acs (2.0.6)
appservice (0.1.6)
batch (2.0.4)
cdn (0.0.2)
cloud (2.0.2)
cognitiveservices (0.1.2)
command-modules-nspkg (2.0.0)
component (2.0.4)
configure (2.0.6)
core (2.0.6)
cosmosdb (0.1.6)
dla (0.0.6)
dls (0.0.6)
feedback (2.0.2)
find (0.2.2)
interactive (0.3.2)
iot (0.1.5)
keyvault (2.0.4)
lab (0.0.4)
monitor (0.0.4)
network (2.0.6)
nspkg (3.0.0)
profile (2.0.4)
rdbms (0.0.1)
redis (0.2.3)
resource (2.0.6)
role (2.0.4)
sf (1.0.1)
sql (2.0.3)
storage (2.0.6)
vm (2.0.6)
OS Version: What OS and version are you using?
Answer here: Debian 8 (Jessie)
Shell Type: What shell are you using? (e.g. bash, cmd.exe, Bash on Windows)
Answer here: Bash
This seems useful, and we are open to contributions. Currently, this is a low priority, as there is a clear workaround of piping to a file and using the current upload command.
This feature is nice to have. Closing for now as low priority.
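For reference, a minimal sketch of that workaround: stage the piped data in a temporary file, then upload it with the existing command. The storage account, container, and paths below are placeholders, and authentication flags are omitted for brevity:

```bash
# Stage the piped data on local disk first.
tar czf /tmp/dataset.tar.gz ./dataset

# Then upload the staged file with the existing upload command.
# (Account/container names are placeholders; auth flags omitted.)
az storage blob upload \
    --account-name mystorageaccount \
    --container-name backups \
    --name dataset.tar.gz \
    --file /tmp/dataset.tar.gz

# Clean up the temporary copy.
rm /tmp/dataset.tar.gz
```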
> This seems useful, and we are open to contributions. Currently, this is a low priority, as there is a clear workaround of piping to a file and using the current upload command.
It's worth noting that this is not a good workaround, because:
I just ran into this as well.
When uploading a big data set that is piped through zip and encryption, it would be great to pipe it straight on to blob storage, rather than back to disk and then out to blob again.
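For illustration, a sketch of the requested end-to-end pipeline. This is hypothetical: reading the upload from stdin (shown here as `--file -`) is precisely the missing feature this issue asks for, and the recipient address and storage names are placeholders:

```bash
# HYPOTHETICAL: '--file -' (read from stdin) is the feature being
# requested in this issue; the current CLI does not support it.
tar czf - ./dataset \
    | gpg --encrypt --recipient backup@example.com \
    | az storage blob upload \
        --account-name mystorageaccount \
        --container-name backups \
        --name dataset.tar.gz.gpg \
        --file -
```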
@limingu, can this be re-opened please (instead of me creating yet another issue)? The proposed workaround totally misses the point of using pipes. I'd like to use this feature for the exact same reasons as @diepes. We have very large datasets that we would like to upload to our datalake without having to attach large temporary disks to our VMs.
I think this feature would be a great and useful addition.
add to S170
add to S172
Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @xgithubtriage.
@xiafu-msft Is this supported in the track 2 SDK?
@zezha-msft May I know if it is supported in AzCopy already?
@Juliehzl yes for AzCopy.
Writing to a file is not a workaround. Due to current speed limitations with Azure disks, it is highly preferable to pre-size disk images. I can't keep 100 GB of free space lying around to hold nearly empty images to upload to Azure.
This scenario _does not work_ for page blobs in azcopy: see https://github.com/Azure/azure-storage-azcopy/issues/1119. I lost interest in trying to worm my way through the code to figure out how hard it would be to fix.
blobxfer seems nearly abandoned and uses a very old storage SDK.
Can this please be prioritized? This is a basic feature to support for disk images.
@Juliehzl please let us know how you'd like to proceed.
As @colemickens pointed out, AzCopy already supports block blobs in piping mode. We don't have support for page blobs yet.
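As a sketch of that piping mode (assuming AzCopy v10, whose `--from-to PipeBlob` option tells it to read the upload from stdin; the account, container, and SAS token below are placeholders):

```bash
# Works today: stream stdin into a block blob with AzCopy v10.
tar czf - ./dataset \
    | azcopy copy \
        "https://<account>.blob.core.windows.net/<container>/dataset.tar.gz?<SAS>" \
        --from-to PipeBlob

# Page blobs (e.g. disk images) cannot be uploaded this way yet;
# see https://github.com/Azure/azure-storage-azcopy/issues/1119.
```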
Thanks @zezha-msft for the confirmation. We will support it soon.