Azure Storage Explorer fails to upload large files

Keith Melmon 0 Reputation points Microsoft Employee
2024-09-23T17:32:41.93+00:00

I have been consistently running into failures in AzCopy jobs when trying to upload large files to our blob storage container. From the logs I see that HTTP requests stop after getting partway through the upload:

2024/09/21 18:21:25 ==> REQUEST/RESPONSE (Try=1/147.4961ms, OpTime=817.1439ms) -- RESPONSE SUCCESSFULLY RECEIVED

PUT https://gfxmltrainingstore1.blob.core.windows.net/cadmus/1600%2Fdownsample_text_bilinear.hdf5?blockid=MDAwMDCts3iZpEpEOXZG0SyjwbzcMDAwMDAwMDAwMDI4MzY0&comp=block

X-Ms-Request-Id: [49fffadd-801e-000f-0953-0c2c87000000]

2024/09/21 18:21:26 PERF: primary performance constraint is Unknown. States: X: 0, O: 0, M: 0, L: 0, R: 1, D: 0, W: 764, F: 0, B: 4, E: 0, T: 769, GRs: 4

2024/09/21 18:21:26 61.2 %, 0 Done, 0 Failed, 1 Pending, 0 Skipped, 1 Total, 2-sec Throughput (Mb/s): 124.4796

2024/09/21 18:21:28 PERF: primary performance constraint is Unknown. States: X: 0, O: 0, M: 0, L: 0, R: 1, D: 0, W: 764, F: 0, B: 4, E: 0, T: 769, GRs: 4

2024/09/21 18:21:28 61.2 %, 0 Done, 0 Failed, 1 Pending, 0 Skipped, 1 Total,

2024/09/21 18:21:30 PERF: primary performance constraint is Unknown. States: X: 0, O: 0, M: 0, L: 0, R: 1, D: 0, W: 764, F: 0, B: 4, E: 0, T: 769, GRs: 4

2024/09/21 18:21:30 61.2 %, 0 Done, 0 Failed, 1 Pending, 0 Skipped, 1 Total,

2024/09/21 18:21:32 PERF: primary performance constraint is Unknown. States: X: 0, O: 0, M: 0, L: 0, R: 1, D: 0, W: 764, F: 0, B: 4, E: 0, T:

The log entries continue like this indefinitely. This clearly looks like an AzCopy problem.


1 answer

  1. Nehruji R 7,811 Reputation points Microsoft Vendor
    2024-09-24T06:39:04.4066667+00:00

    Hello Keith Melmon,

    Greetings! Welcome to Microsoft Q&A Forum.

    I understand that you are encountering an issue with uploading data to a storage container using AzCopy.

    Please consider the suggestions below to resolve the issue:

    • Make sure that the machine running AzCopy can access the destination storage account. You might have to use IP network rules in the firewall settings of the source or destination account to allow access from the machine's public IP address (a sketch follows this list).
    • If a large file repeatedly fails because certain chunks fail each time, try limiting the number of concurrent network connections or capping throughput, depending on your specific case. We suggest lowering performance drastically at first, observing whether that solves the initial problem, and then ramping performance back up until an overall balance is reached.
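
    A minimal sketch of the firewall suggestion, assuming the Azure CLI is installed; the resource group placeholder and the IP lookup service are illustrative, not from this thread:

    ```
    # Allow this machine's public IP through the storage account firewall.
    # <resource-group> is a placeholder; the account name is from the log above.
    MYIP=$(curl -s https://api.ipify.org)   # look up the machine's public IP
    az storage account network-rule add \
      --resource-group "<resource-group>" \
      --account-name gfxmltrainingstore1 \
      --ip-address "$MYIP"
    ```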

    In low-bandwidth or intermittent network conditions, you can try adjusting the following parameters:

    1. Set the environment variable AZCOPY_CONCURRENCY_VALUE to "AUTO". That helps a lot in low-bandwidth cases, since it results in AzCopy using far fewer connections than normal.
    2. Set the environment variable AZCOPY_CONCURRENT_FILES to 1. Adjusting the file-level concurrency can be beneficial, especially when dealing with very large or very small files. Lowering it (e.g., to 1 or a small number) might reduce the chance of failures and give better control over the transfer, as in the sketch after this list.
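
    A minimal sketch combining those settings with AzCopy's throttling flag; the account, container, SAS token, and the 50 Mbps cap are placeholders for your own values:

    ```
    # Throttle AzCopy for a low-bandwidth or unstable link.
    export AZCOPY_CONCURRENCY_VALUE=AUTO   # let AzCopy pick a conservative connection count
    export AZCOPY_CONCURRENT_FILES=1       # transfer one file at a time

    # --cap-mbps caps throughput so individual chunk uploads stop stalling.
    azcopy copy "./downsample_text_bilinear.hdf5" \
      "https://<account>.blob.core.windows.net/<container>/<path>?<SAS>" \
      --cap-mbps 50
    ```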

    For further information:
    https://zcusa.951200.xyz/en-us/azure/storage/common/storage-use-azcopy-optimize
    https://zcusa.951200.xyz/en-us/troubleshoot/azure/azure-storage/storage-use-azcopy-troubleshoot

    Also, you have several options for moving data into or out of Azure Storage. Which option you choose depends on the size of your dataset and your network bandwidth. For more information, see Choose an Azure solution for data transfer.

    Have you tried uploading the same file via another method, such as the Azure Portal, from the same machine?

    Workaround: Uploading large files can be challenging due to limitations on network bandwidth and server resources. One approach is to split them into smaller chunks and upload the chunks in parallel, which can reduce the likelihood of errors and improve upload speed.

    I would recommend the following:

    While uploading from the Portal, there is an option to increase the block size to 100 MiB. You can select that and see if it helps; the sketch below shows the equivalent knob in AzCopy.
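
    A minimal sketch using AzCopy's --block-size-mb flag; the arithmetic assumes GNU stat (Linux), the file path is a placeholder, and 50,000 blocks is the documented maximum for a block blob:

    ```
    # Pick a block size so a very large file stays under the 50,000-block limit.
    FILE="./downsample_text_bilinear.hdf5"
    SIZE_MB=$(( $(stat -c%s "$FILE") / 1024 / 1024 ))   # file size in MiB (GNU stat)
    BLOCK_MB=$(( SIZE_MB / 50000 + 1 ))                 # round up to stay under the limit
    [ "$BLOCK_MB" -lt 100 ] && BLOCK_MB=100             # mirror the Portal's 100 MiB option
    azcopy copy "$FILE" \
      "https://<account>.blob.core.windows.net/<container>/?<SAS>" \
      --block-size-mb "$BLOCK_MB"
    ```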

    Please refer to this article for more information on setup verification.

    If the transfer failures did not result from the SAS token or authentication, you could try the commands below.

    Show the error messages of the failed job with azcopy jobs show <job-id> --with-status=Failed, then execute the resume command if a large number of transfers failed. For the scoped scenario of populating an initially blank directory, a simple solution is to run the azcopy copy command again with the --overwrite=false flag. Both are sketched below.
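
    A minimal sketch of that workflow; <job-id> and the SAS tokens are placeholders, and the commands shown are standard AzCopy v10 commands:

    ```
    azcopy jobs list                                 # find the ID of the stalled job
    azcopy jobs show <job-id> --with-status=Failed   # show only the failed transfers
    azcopy jobs resume <job-id> \
      --source-sas "<source-SAS>" --destination-sas "<destination-SAS>"

    # Or, when populating an initially empty directory, simply re-run the copy;
    # --overwrite=false skips everything that already transferred successfully.
    azcopy copy "<local-path>" "<container-URL-with-SAS>" --recursive --overwrite=false
    ```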

    Make sure you are using the latest version of AzCopy. Updates often include bug fixes and performance improvements.
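
    To confirm which version you are on:

    ```
    azcopy --version
    ```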

    Hope that helps! Please let us know if you have any further queries; I'm happy to assist you further.


    Please "Accept the answer” and “up-vote” wherever the information provided helps you, this can be beneficial to other community members.

