Efficiently export files to Microsoft Azure using cURL

Exporting files to Microsoft Azure Storage using cURL provides a flexible and efficient way to interact with cloud storage directly from the command line. This guide demonstrates how to export files to Azure Blob Storage using cURL, highlighting the recommended use of Microsoft Entra ID authentication, secure token management, and robust error handling practices.
Setting up an Azure Storage account
Before using cURL with Azure Storage, create an Azure Storage account and container through the Azure Portal or Azure CLI. We recommend enabling Microsoft Entra ID (formerly Azure AD) authentication for superior security and streamlined token management.
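A minimal setup sketch using the Azure CLI is shown below; the resource group, account, and container names are placeholders that you should replace with your own:
# Create a resource group, a storage account, and a container (names are placeholders)
az group create --name myresourcegroup --location westeurope
az storage account create \
--name youraccount \
--resource-group myresourcegroup \
--location westeurope \
--sku Standard_LRS
az storage container create \
--name container \
--account-name youraccount \
--auth-mode login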
Authentication methods
Azure Blob Storage supports two primary authentication methods:
Microsoft Entra ID (recommended)
Obtain an access token using the Azure CLI. Ensure you have logged in first using az login:
token=$(az account get-access-token --resource https://storage.azure.com/ --query accessToken -o tsv)
Use the token in cURL requests:
curl -fsSL -X PUT \
-H "Authorization: Bearer $token" \
-H "x-ms-version: 2023-11-03" \
-H "x-ms-blob-type: BlockBlob" \
-H "Content-Type: application/octet-stream" \
--data-binary "@localfile.txt" \
"https://youraccount.blob.core.windows.net/container/remotefile.txt"
Shared access signature (alternative)
For scenarios where Microsoft Entra ID is not feasible, use SAS tokens with strict security controls. This method also supports additional integrity checks using the Content-MD5 header:
curl -fsSL -X PUT \
-H "x-ms-blob-type: BlockBlob" \
-H "x-ms-version: 2023-11-03" \
-H "Content-Type: application/octet-stream" \
-H "Content-MD5: $(openssl dgst -md5 -binary localfile.txt | base64)" \
--data-binary "@localfile.txt" \
"https://youraccount.blob.core.windows.net/container/remotefile.txt?your_sas_token"
Handling large file uploads
For files that exceed the limit of 5,000 MiB per write operation, use the Block Blob API to upload large files in chunks. With API versions 2019-12-12 and later, the maximum block size is 4,000 MiB, allowing a maximum blob size of nearly 190.7 TiB when using up to 50,000 blocks.
#!/bin/bash
# Upload a large file in blocks using the Put Block and Put Block List operations.
# $token is the Entra ID access token obtained earlier.
file="largefile.txt"
block_size=$((4000*1024*1024)) # 4,000 MiB blocks
base_url="https://youraccount.blob.core.windows.net/container/$file"
# Split the file into fixed-size chunks (block_aa, block_ab, ...)
split -b $block_size "$file" block_
# Upload each chunk with Put Block. Block IDs must be base64-encoded strings of
# equal length, and base64 padding characters must be percent-encoded in the URL.
# (x-ms-blob-type is only needed on Put Blob, not on Put Block.)
for block in block_*; do
  block_id=$(printf '%s' "$block" | base64)
  encoded_id=$(printf '%s' "$block_id" | sed 's/+/%2B/g; s|/|%2F|g; s/=/%3D/g')
  curl -fsSL -X PUT \
    -H "Authorization: Bearer $token" \
    -H "x-ms-version: 2023-11-03" \
    --data-binary "@$block" \
    "$base_url?comp=block&blockid=$encoded_id"
done
# Build the block list XML that tells Azure how to assemble the blob
echo '<?xml version="1.0" encoding="utf-8"?><BlockList>' > blocklist.xml
for block in block_*; do
  block_id=$(printf '%s' "$block" | base64)
  echo "<Latest>$block_id</Latest>" >> blocklist.xml
done
echo '</BlockList>' >> blocklist.xml
# Commit the uploaded blocks with Put Block List
curl -fsSL -X PUT \
-H "Authorization: Bearer $token" \
-H "x-ms-version: 2023-11-03" \
-H "Content-Type: application/xml" \
--data-binary @blocklist.xml \
"$base_url?comp=blocklist"
Error handling and retries
Implement robust error handling with exponential backoff for transient errors. The function below retries on HTTP 408, 500, 502, 503, and 504, while immediately aborting on authentication failures (HTTP 403):
upload_with_retry() {
  local url="$1"
  local file="$2"
  local max_attempts=5
  local attempt=1
  local wait_time=2

  while [ $attempt -le $max_attempts ]; do
    # Capture only the HTTP status code; the response body is discarded
    status=$(curl -sS -o /dev/null -w "%{http_code}" -X PUT \
      -H "Authorization: Bearer $token" \
      -H "x-ms-version: 2023-11-03" \
      -H "x-ms-blob-type: BlockBlob" \
      --data-binary "@$file" \
      "$url")

    if [ "$status" -eq 201 ]; then
      return 0
    elif [ "$status" -eq 403 ]; then
      echo "Authentication failed. Check credentials."
      return 1
    elif [ "$status" -eq 408 ] || [ "$status" -eq 500 ] || [ "$status" -eq 502 ] || [ "$status" -eq 503 ] || [ "$status" -eq 504 ]; then
      echo "Transient error $status encountered. Retrying in $wait_time seconds..."
    else
      echo "Upload failed with status $status."
      return 1
    fi

    attempt=$((attempt + 1))
    sleep $wait_time
    wait_time=$((wait_time * 2))
  done

  echo "File upload failed after $max_attempts attempts."
  return 1
}
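For example, the helper can be invoked like this (URL and filename are placeholders):
# Example invocation of the retry helper above
if upload_with_retry "https://youraccount.blob.core.windows.net/container/remotefile.txt" "localfile.txt"; then
  echo "Upload succeeded."
else
  echo "Upload failed." >&2
fi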
Monitoring and validation
Check the upload status and blob metadata using the following command:
curl -fsSL -I \
-H "Authorization: Bearer $token" \
-H "x-ms-version: 2023-11-03" \
"https://youraccount.blob.core.windows.net/container/file.txt"
Monitor transfer progress with the progress bar option (note that the silent flag -s would suppress the progress meter, so it is omitted here):
curl -fL --progress-bar -X PUT \
-H "Authorization: Bearer $token" \
-H "x-ms-version: 2023-11-03" \
-H "x-ms-blob-type: BlockBlob" \
--data-binary "@largefile.txt" \
"https://youraccount.blob.core.windows.net/container/largefile.txt"
Security best practices
- Use Microsoft Entra ID authentication whenever possible to leverage improved security and token management.
- Always use HTTPS for secure data transfers with Azure Storage.
- Restrict permissions granted by SAS tokens and use short expiration times.
- Enforce IP address restrictions to limit access to your storage account.
- Set the minimum required permissions for each operation.
- Monitor and log all operations for audit and troubleshooting purposes.
- Regularly rotate credentials to mitigate potential security breaches.
- Validate file integrity using the Content-MD5 header.
Performance optimization
To maximize transfer performance:
- Choose appropriate block sizes (up to 4,000 MiB per block).
- Enable concurrent uploads where possible for multiple files (see the sketch after this list).
- Select the closest Azure region to reduce latency.
- Monitor network bandwidth and latency to optimize transfers.
- Use compression when appropriate, considering that some files are already compressed.
- Implement comprehensive error handling and retries for robust uploads.
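Below is a sketch of concurrent uploads for multiple files using xargs; the container URL and the file glob are placeholders, and it reuses the $token variable from earlier:
# Upload several local files concurrently (four at a time)
upload_one() {
  local f="$1"
  curl -fsS -X PUT \
    -H "Authorization: Bearer $token" \
    -H "x-ms-version: 2023-11-03" \
    -H "x-ms-blob-type: BlockBlob" \
    --data-binary "@$f" \
    "https://youraccount.blob.core.windows.net/container/$(basename "$f")"
}
export -f upload_one
export token
printf '%s\0' *.txt | xargs -0 -P 4 -I{} bash -c 'upload_one "$1"' _ {}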
Conclusion
Integrating Azure Blob Storage with cURL offers a powerful strategy for managing cloud storage operations. By following these authentication methods, security best practices, and performance optimization tips, you can build a reliable and secure file transfer solution. For additional advanced file handling capabilities, consider exploring Transloadit's services, which integrate with tools such as Uppy and Tus for enhanced file processing.