In Microsoft Power Pages (formerly Power Apps Portals), users can submit forms and upload documents. While the default storage is Dataverse (formerly CDS), many enterprises prefer Azure Blob Storage for its cost-effectiveness, scalability, and advanced storage capabilities.
This guide walks you step-by-step through how to store uploaded documents from Power Pages directly into Azure Blob Storage, using Power Automate, Dataverse, and Azure Storage APIs.
Architecture
User (via Portal)
↓
Power Pages Form (with file upload)
↓
Dataverse Table (records file metadata)
↓
Power Automate Flow (fetches file)
↓
Azure Blob Storage (stores actual file)
Why Use Azure Blob Instead of Dataverse for Files?
- Dataverse file storage is expensive and limited
- Blob Storage offers massive capacity at a fraction of the cost
- Integration with Azure Data Factory, Logic Apps, Synapse, and AI
- Advanced features like SAS Tokens, Lifecycle Management
Step-by-Step Implementation
1. Prepare Azure Blob Storage
A. Create a Storage Account
- Go to Azure Portal
- Search “Storage Accounts”
- Click + Create
- Select your subscription and resource group, then enter:
  - Storage account name: e.g., portaluploadsstorage
  - Region: as per your location
  - Performance: Standard
  - Redundancy: LRS (locally redundant storage)
- Click Review + Create
B. Create a Container
- Open your newly created storage account
- Under Data Storage, select Containers
- Click + Container
- Name it, e.g., uploaded-documents
- Set Public Access Level to Private (no anonymous access)
2. Set Up Azure Access Key or SAS Token
Option A: Access Key (for backend usage)
- Go to the storage account → Access keys
- Copy Key 1 and its Connection string
Option B: SAS Token (for limited access)
- Go to storage account → Shared Access Signature
- Configure permissions (Write, Read, Create)
- Set expiry (e.g., 1 year)
- Click Generate SAS Token & URL
- Copy the Blob SAS token and Blob service URL
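To make the pieces concrete, here is a minimal Python sketch of how the final upload URL is assembled from the account name, container, blob name, and the portal-generated SAS token. The account and container names are the example values from this guide; the token is placeholder text, not a real signature:

```python
from urllib.parse import quote

def blob_sas_url(account: str, container: str, blob_name: str, sas_token: str) -> str:
    # The SAS token copied from the portal may start with '?'; strip it
    # so the URL ends up with exactly one query separator.
    token = sas_token.lstrip("?")
    # Percent-encode the blob name (spaces etc.); the host and container
    # are used as-is.
    return (f"https://{account}.blob.core.windows.net/"
            f"{container}/{quote(blob_name)}?{token}")

# Example with placeholder SAS values:
url = blob_sas_url("portaluploadsstorage", "uploaded-documents",
                   "my report.pdf", "?sv=2022-11-02&sig=abc")
```

This is the URL shape the HTTP upload option in step 5 expects.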
3. Create a Dataverse Table for Uploads
In Power Pages:
- Go to Dataverse → Tables → + New Table
- Name: DocumentUploads
- Columns:
- Title
- File (File type column)
- FileName (Text)
- Status (Text)
- UploadedToBlob (Yes/No)
Publish the table.
4. Create a Basic Form in Power Pages
- Go to Power Pages Studio
- Add a form connected to DocumentUploads
- Enable the File upload field
- Add instructions for users about file size/type limits
- Set Web Roles if needed
5. Create a Power Automate Flow
This is the engine that moves the uploaded document from Dataverse to Azure Blob.
A. Trigger – “When a row is added, modified or deleted”
- Create an Automated cloud flow
- Use the Dataverse trigger When a row is added, modified or deleted
- Change type: Added
- Table name: DocumentUploads
- Scope: Organization
B. Step – Get file content
- Add the Get file or image content action (Dataverse connector)
- Row ID: from the trigger
- Column Name: the File column name (case-sensitive)
C. Step – Upload to Azure Blob
Option 1: Using Azure Blob Storage Connector
- Add action: Create blob (Azure Blob Storage connector)
- Sign in with your Azure account
- Fill in:
  - Folder path: uploaded-documents
  - Blob name: Document-@{triggerOutputs()?['body/Title']}.pdf (or use the FileName column)
  - Blob content: output of the Get file or image content step
Option 2: Using HTTP POST with SAS URL
- Add action: HTTP
- Method: PUT
- URI: https://<youraccount>.blob.core.windows.net/<container>/<filename>?<SAS_token>
- Headers:
  - x-ms-blob-type: BlockBlob
  - Content-Type: application/octet-stream
- Body: the file content from the previous step (if the flow holds it as a base64 string, convert it with base64ToBinary() so raw bytes are uploaded, not base64 text)
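The HTTP action above maps onto a plain Put Blob REST call. As a sanity check, here is a hedged Python sketch that builds (but does not send) the same request with the required headers; the URL and file bytes are placeholders:

```python
import urllib.request

def build_put_request(sas_url: str, file_bytes: bytes) -> urllib.request.Request:
    # Put Blob requires the x-ms-blob-type header; without it the
    # service rejects the upload.
    return urllib.request.Request(
        sas_url,
        data=file_bytes,
        headers={
            "x-ms-blob-type": "BlockBlob",
            "Content-Type": "application/octet-stream",
        },
        method="PUT",
    )

# Sending it (requires a valid SAS URL):
#   urllib.request.urlopen(build_put_request(url, data))
req = build_put_request(
    "https://portaluploadsstorage.blob.core.windows.net/uploaded-documents/report.pdf?sv=placeholder",
    b"%PDF-1.4 ...",
)
```

If the upload returns 403, recheck the SAS permissions (Write and Create) and expiry configured in step 2.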
D. Optional: Update Dataverse Row
- Add an Update a row step (Dataverse)
- Set UploadedToBlob = Yes, Status = Uploaded
6. Test Your Setup
- Open your portal
- Fill and submit form with a file
- Wait for flow to complete
- Check Azure Blob Storage:
  - The file should appear inside uploaded-documents
  - Metadata (date/time/size) should be visible
Security & Governance
- Use Private containers
- Use SAS tokens with expiry limits
- Log all flows and enable alerting for failed uploads
- Use Azure Policies for data classification and retention
- Monitor Blob Storage using Azure Monitor + Log Analytics
Advanced Features
A. Generate Unique Filenames
Use expression in Blob Name:
concat('Upload-', utcNow(), '-', triggerOutputs()?['body/FileName'])
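The same naming scheme can be sketched in Python to show the pattern (an illustration, not part of the flow — the timestamp format is an assumption chosen to keep names filesystem-friendly):

```python
from datetime import datetime, timezone

def unique_blob_name(original_name: str) -> str:
    # Mirrors concat('Upload-', utcNow(), '-', <FileName>):
    # a UTC timestamp prefix keeps names sortable and collision-resistant.
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H-%M-%SZ")
    return f"Upload-{stamp}-{original_name}"

name = unique_blob_name("report.pdf")
```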
B. Add Virus Scanning with Logic Apps
- Use Azure Logic Apps + VirusTotal/Cloudmersive API
- Scan files before confirming upload
C. Use Power Pages + JavaScript + Direct Upload
- Bypass Dataverse and use JavaScript to call Azure REST API
- Requires SAS Token generation and JS-based form customization
Cost Optimization Tips
- Use Hot access tier for frequently accessed files
- Use Lifecycle Management Rules to delete or archive files
- Track ingress/egress bandwidth to manage costs
- Store small documents only; move media to Azure Media Services
Troubleshooting
| Problem | Solution |
| --- | --- |
| File not appearing in Blob | Check SAS permissions and folder path |
| Flow fails at upload | Validate content type and blob URI |
| Access denied | Review container privacy and connection credentials |
| Invalid filename | Sanitize filenames in Power Automate |
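For the last row, the sanitization itself is done with replace()/expression chains in Power Automate; the logic is sketched below in Python (the allow-list and length cap are assumptions — adjust them to your naming policy):

```python
import re

def sanitize_filename(name: str, max_len: int = 100) -> str:
    # Replace anything outside a conservative allow-list with '_'.
    cleaned = re.sub(r"[^A-Za-z0-9._-]", "_", name)
    # Guard against empty results and overly long blob names.
    return cleaned[:max_len] or "unnamed"
```

Applying this before the Create blob or HTTP PUT step avoids most "invalid filename" failures caused by spaces, parentheses, or non-ASCII characters.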