1. Introduction
Azure Blob Storage is optimized for storing massive amounts of unstructured data like files, images, and documents. For large files (100 MB+), it's best to use block blob uploads with chunked transfers: stage the file in pieces with Put Block and commit them with Put Block List, which avoids browser crashes and server timeouts.
2. Core Components
To implement large file uploads from a portal:
- Frontend Web App: HTML + JavaScript/React/Angular (Power Pages, if applicable)
- Backend API or Azure Function: Generates secure upload URLs
- Azure Blob Storage: Storage account with container access
- Security: Shared Access Signature (SAS) tokens for secure, time-limited access
3. Azure Setup
Step 1: Create a Storage Account
- Go to Azure Portal → Storage Accounts → Create
- Choose Standard performance and General-purpose v2 (or the legacy BlobStorage account kind)
- Create a container (e.g., uploads) and set its access level to Private
Step 2: Generate a SAS Token (Server Side)
Create a server-side endpoint (Azure Function, Node.js API, etc.) that:
- Accepts file metadata (name, size)
- Returns a SAS URL for secure, direct upload
Sample SAS Generation (Node.js):
const { generateBlobSASQueryParameters, BlobSASPermissions, StorageSharedKeyCredential } = require("@azure/storage-blob");

// Credential built from the storage account name and key (e.g. from app settings)
const sharedKeyCredential = new StorageSharedKeyCredential(process.env.AZURE_STORAGE_ACCOUNT, process.env.AZURE_STORAGE_KEY);

const sasToken = generateBlobSASQueryParameters({
  containerName: "uploads",
  blobName: "largefile.mp4",
  permissions: BlobSASPermissions.parse("cwr"), // create, write, read
  startsOn: new Date(),
  expiresOn: new Date(new Date().valueOf() + 3600 * 1000) // valid for 1 hour
}, sharedKeyCredential).toString();
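If the API is a plain Node.js service, a minimal Express handler wrapping the SAS generation above could look like the sketch below; the route name, query parameter, and environment variables are illustrative assumptions, not fixed requirements:

const express = require("express");
const app = express();

// Hypothetical endpoint: accepts a file name and returns a short-lived, blob-scoped SAS URL
app.get("/api/get-upload-sas", (req, res) => {
  const blobName = req.query.fileName;
  const sasToken = generateBlobSASQueryParameters({
    containerName: "uploads",
    blobName,
    permissions: BlobSASPermissions.parse("cw"), // create + write only
    startsOn: new Date(),
    expiresOn: new Date(Date.now() + 15 * 60 * 1000) // 15 minutes
  }, sharedKeyCredential).toString();

  const sasUrl = `https://${process.env.AZURE_STORAGE_ACCOUNT}.blob.core.windows.net/uploads/${encodeURIComponent(blobName)}?${sasToken}`;
  res.json({ sasUrl });
});

app.listen(3000);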
4. Frontend: HTML/JS Upload Logic
Step 1: Select and Read File
<input type="file" id="fileInput" />
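Before starting the chunked upload, it is worth validating the selection client-side (this also covers the size/MIME check recommended in section 8). A minimal sketch; the 2 GB limit and alert UI are arbitrary examples:

const MAX_SIZE_BYTES = 2 * 1024 * 1024 * 1024; // example limit: 2 GB

document.getElementById('fileInput').addEventListener('change', (event) => {
  const selected = event.target.files[0];
  if (!selected) return;
  if (selected.size > MAX_SIZE_BYTES) {
    alert('File is too large for browser upload.');
    event.target.value = ''; // clear the selection
  }
});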
Step 2: Chunk the File in JavaScript
const file = document.getElementById('fileInput').files[0];

// Wrapped in an async function so await is valid; returns the block IDs needed for the commit step
async function uploadFileInChunks(file, sasUrl) {
  const chunkSize = 4 * 1024 * 1024; // 4 MB per block
  const blockIds = [];
  let start = 0;
  let blockIndex = 0;
  while (start < file.size) {
    const end = Math.min(start + chunkSize, file.size);
    const chunk = file.slice(start, end);
    // All block IDs for a blob must be the same length, so pad the index before base64-encoding
    const blockId = btoa("block-" + String(blockIndex).padStart(6, "0"));
    blockIds.push(blockId);
    await uploadChunk(chunk, sasUrl, blockId);
    start = end;
    blockIndex++;
  }
  return blockIds;
}
Step 3: Upload Each Chunk via Put Block
async function uploadChunk(chunk, sasUrl, blockId) {
  // Put Block: block IDs are base64 and may contain '+', '/', or '=', so URL-encode them
  const url = `${sasUrl}&comp=block&blockid=${encodeURIComponent(blockId)}`;
  const response = await fetch(url, {
    method: "PUT",
    headers: { "x-ms-blob-type": "BlockBlob" },
    body: chunk
  });
  if (!response.ok) throw new Error(`Block upload failed: ${response.status}`);
}
Step 4: Commit All Blocks (Put Block List)
async function commitBlockList(sasUrl, blockIds) {
  // Put Block List: commits the staged blocks, in order, as the final blob
  const url = `${sasUrl}&comp=blocklist`;
  const xmlBody = `<?xml version="1.0" encoding="utf-8"?><BlockList>${blockIds.map(id => `<Latest>${id}</Latest>`).join('')}</BlockList>`;
  await fetch(url, {
    method: "PUT",
    headers: {
      "Content-Type": "application/xml"
    },
    body: xmlBody
  });
}
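Putting steps 1–4 together: the sketch below assumes a backend endpoint that issues the SAS URL (for example, the hypothetical /api/get-upload-sas route from section 3) and reuses uploadFileInChunks and commitBlockList from above:

async function uploadLargeFile(file) {
  // Ask the backend for a scoped, short-lived SAS URL (endpoint name is an assumption)
  const response = await fetch(`/api/get-upload-sas?fileName=${encodeURIComponent(file.name)}`);
  const { sasUrl } = await response.json();

  // Stage all blocks, then commit them in order
  const blockIds = await uploadFileInChunks(file, sasUrl);
  await commitBlockList(sasUrl, blockIds);
}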
5. Power Pages Integration
In Power Pages:
- Use a custom JavaScript/HTML component to handle file selection and chunked uploads.
- Use Power Automate or Azure Function as a secure SAS generator endpoint.
- Store uploaded file metadata in Dataverse post-upload.
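For the last point, one low-code option is an HTTP-triggered Power Automate flow that creates the Dataverse row. The sketch below assumes such a flow exists; the flow URL and payload field names are placeholders:

// Hypothetical call from the portal page to a Power Automate flow (HTTP request trigger)
// that adds a row to a Dataverse table with the upload metadata.
async function recordUploadInDataverse(file, blobUrl) {
  await fetch("<power-automate-http-trigger-url>", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      fileName: file.name,
      fileSizeBytes: file.size,
      blobUrl: blobUrl,
      uploadedOn: new Date().toISOString()
    })
  });
}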
6. Monitoring and Logging
Enable Storage Logging in Azure:
- Navigate to your storage account > Monitoring > Diagnostic settings
- Log to Azure Monitor or Storage Analytics
Use Azure Application Insights on the server/API side to log:
- File size
- Upload time
- Errors
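On the Node.js side, a minimal sketch with the applicationinsights package (the event name, property names, and values are illustrative):

const appInsights = require("applicationinsights");
appInsights.setup(process.env.APPLICATIONINSIGHTS_CONNECTION_STRING).start();
const telemetry = appInsights.defaultClient;

// Log a custom event when the client reports a completed upload
telemetry.trackEvent({
  name: "LargeFileUploadCompleted",
  properties: { fileName: "largefile.mp4", fileSizeBytes: "734003200", uploadTimeMs: "95000" }
});

// Log failures so they show up alongside request telemetry
telemetry.trackException({ exception: new Error("chunk upload failed after retries") });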
7. Handling Timeouts, Retries, and Progress
- Implement retries with exponential backoff on failed chunk uploads (see the retry sketch after this list).
- Show a progress bar based on uploaded bytes versus total file size.
- Set upload timeout thresholds to detect stalled sessions.
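A minimal retry wrapper around the uploadChunk function from section 4 (the retry count and backoff delays are arbitrary examples):

// Sketch: retries a failed block upload with exponential backoff
async function uploadChunkWithRetry(chunk, sasUrl, blockId, maxRetries = 5) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    try {
      return await uploadChunk(chunk, sasUrl, blockId);
    } catch (err) {
      if (attempt === maxRetries - 1) throw err;
      const delayMs = Math.pow(2, attempt) * 1000; // 1s, 2s, 4s, 8s, ...
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
}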
Progress Example:
let uploadedBytes = 0;
// Call this after each chunk's PUT succeeds, passing the chunk's byte length
function reportChunkUploaded(chunkLength) {
  uploadedBytes += chunkLength;
  const percent = (uploadedBytes / file.size) * 100;
  updateProgressBar(percent);
}
8. Security Best Practices
- Use short-lived SAS tokens (15 mins – 1 hour)
- Scope SAS to a specific blob path only
- Prefer a user delegation SAS when using Azure AD authentication (see the sketch after this list)
- Validate MIME type and size before upload
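For the user delegation SAS option, a server-side sketch using @azure/identity; the 15-minute window and permission scope mirror the recommendations above, and the account name comes from an assumed environment variable:

const { BlobServiceClient, generateBlobSASQueryParameters, BlobSASPermissions } = require("@azure/storage-blob");
const { DefaultAzureCredential } = require("@azure/identity");

const accountName = process.env.AZURE_STORAGE_ACCOUNT;
const serviceClient = new BlobServiceClient(`https://${accountName}.blob.core.windows.net`, new DefaultAzureCredential());

// Sketch: SAS signed with a user delegation key (Azure AD) instead of the account key
async function createUserDelegationSas(blobName) {
  const startsOn = new Date();
  const expiresOn = new Date(Date.now() + 15 * 60 * 1000); // 15 minutes
  const delegationKey = await serviceClient.getUserDelegationKey(startsOn, expiresOn);
  return generateBlobSASQueryParameters({
    containerName: "uploads",
    blobName,
    permissions: BlobSASPermissions.parse("cw"),
    startsOn,
    expiresOn
  }, delegationKey, accountName).toString();
}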
9. Post-Upload Actions
After a successful upload:
- Call backend to log the upload
- Update UI to show success
- Optionally, trigger a Logic App / Power Automate to:
- Move file to another container
- Send a notification email
- Parse or process the file contents
10. Considerations for Very Large Files (1GB+)
- Use Azure Data Lake Gen2 for big data files
- Prefer AzCopy or the Azure Blob Storage SDK (see the SDK sketch after this list)
- Avoid client-side uploads if bandwidth is low; use Azure Data Factory or hybrid solutions
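If the SDK route is chosen, its uploadData method handles block splitting, parallelism, and retries internally, so the manual Put Block logic from section 4 is not needed. A sketch, assuming a bundler for the browser and reusing the SAS URL and updateProgressBar from earlier:

import { BlockBlobClient } from "@azure/storage-blob";

async function uploadWithSdk(file, sasUrl) {
  const blockBlobClient = new BlockBlobClient(sasUrl); // blob URL that already includes the SAS token
  await blockBlobClient.uploadData(file, {
    blockSize: 8 * 1024 * 1024, // 8 MB blocks
    concurrency: 4,             // parallel block uploads
    onProgress: (ev) => updateProgressBar((ev.loadedBytes / file.size) * 100)
  });
}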