Storing uploaded documents in Azure Blob Storage

In Microsoft Power Pages (formerly Power Apps Portals), users can submit forms and upload documents. While the default storage is Dataverse (formerly CDS), many enterprises prefer Azure Blob Storage for its cost-effectiveness, scalability, and advanced storage capabilities.

This guide walks you step-by-step through how to store uploaded documents from Power Pages directly into Azure Blob Storage, using Power Automate, Dataverse, and Azure Storage APIs.


Architecture

User (via Portal)
   ↓
Power Pages Form (with file upload)
   ↓
Dataverse Table (records file metadata)
   ↓
Power Automate Flow (fetches file)
   ↓
Azure Blob Storage (stores actual file)

Why Use Azure Blob Instead of Dataverse for Files?

  • Dataverse file storage is expensive and limited
  • Blob Storage offers massive capacity at a fraction of the cost
  • Integration with Azure Data Factory, Logic Apps, Synapse, and AI
  • Advanced features like SAS Tokens, Lifecycle Management

Step-by-Step Implementation

1. Prepare Azure Blob Storage

A. Create a Storage Account

  1. Go to Azure Portal
  2. Search “Storage Accounts”
  3. Click + Create
  4. Select your subscription, resource group, and enter:
    • Storage account name: e.g., portaluploadsstorage
    • Region: the region closest to your users
    • Performance: Standard
    • Redundancy: LRS
  5. Click Review + Create

B. Create a Container

  1. Open your newly created storage account
  2. Under Data Storage, select Containers
  3. Click + Container
  4. Name it, e.g., uploaded-documents
  5. Set Public Access Level to Private (no anonymous access)
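
If you'd rather script this step than click through the portal, here is a minimal sketch using the @azure/storage-blob SDK (assuming accountName and accountKey hold the credentials you will copy in step 2 below):

    import { BlobServiceClient, StorageSharedKeyCredential } from "@azure/storage-blob";

    // Create the container used by the portal uploads. Containers are private
    // (no anonymous access) by default unless you explicitly opt in to public access.
    async function createUploadContainer(accountName: string, accountKey: string): Promise<void> {
      const credential = new StorageSharedKeyCredential(accountName, accountKey);
      const service = new BlobServiceClient(
        `https://${accountName}.blob.core.windows.net`,
        credential
      );
      await service.getContainerClient("uploaded-documents").createIfNotExists();
    }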

2. Set Up Azure Access Key or SAS Token

Option A: Access Key (for backend usage)

  • Go to the storage account → Access keys
  • Copy Key 1 and the corresponding connection string

Option B: SAS Token (for limited access)

  • Go to storage account → Shared Access Signature
  • Configure permissions (Write, Read, Create)
  • Set expiry (e.g., 1 year)
  • Click Generate SAS Token & URL
  • Copy the Blob SAS token and Blob service URL
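
If you want to issue SAS tokens programmatically rather than from the portal, a minimal sketch with @azure/storage-blob (assuming the same account name and key as in Option A):

    import {
      ContainerSASPermissions,
      generateBlobSASQueryParameters,
      StorageSharedKeyCredential,
    } from "@azure/storage-blob";

    // Issue a container-scoped SAS token with Read, Create and Write permissions,
    // valid for roughly one year (mirrors the portal settings above).
    function createContainerSas(accountName: string, accountKey: string): string {
      const credential = new StorageSharedKeyCredential(accountName, accountKey);
      return generateBlobSASQueryParameters(
        {
          containerName: "uploaded-documents",
          permissions: ContainerSASPermissions.parse("rcw"),
          expiresOn: new Date(Date.now() + 365 * 24 * 60 * 60 * 1000),
        },
        credential
      ).toString(); // append to a blob URL after '?'
    }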

3. Create a Dataverse Table for Uploads

In Power Pages:

  1. Go to Dataverse → Tables → + New Table
  2. Name: DocumentUploads
  3. Columns:
    • Title
    • File (File type column)
    • FileName (Text)
    • Status (Text)
    • UploadedToBlob (Yes/No)

Publish the table.


4. Create a Basic Form in Power Pages

  1. Go to Power Pages Studio
  2. Add a form connected to DocumentUploads
  3. Enable the File upload field
  4. Add instructions for users about file size/type limits
  5. Set Web Roles if needed

5. Create a Power Automate Flow

This is the engine that moves the uploaded document from Dataverse to Azure Blob.

A. Trigger – “When a row is added”

  1. Create an Automated cloud flow
  2. Use the trigger When a row is added, modified or deleted, with Change type set to Added
  3. Table name: DocumentUploads
  4. Scope: Organization

B. Step – Get file content

Add Get file or image content (Dataverse connector)

  • Row ID: From trigger
  • Column Name: the logical name of the File column (case-sensitive)
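
Under the hood, this action is roughly equivalent to the Web API call sketched below; cr_documentuploads and cr_file are hypothetical logical names for your table and File column, so substitute your own:

    // Fetch the raw bytes of a Dataverse file column via the Web API.
    async function getDataverseFile(
      orgUrl: string,      // e.g. https://yourorg.crm.dynamics.com (placeholder)
      accessToken: string, // bearer token for Dataverse
      rowId: string        // row GUID from the trigger
    ): Promise<ArrayBuffer> {
      const res = await fetch(
        `${orgUrl}/api/data/v9.2/cr_documentuploads(${rowId})/cr_file/$value`,
        { headers: { Authorization: `Bearer ${accessToken}` } }
      );
      if (!res.ok) throw new Error(`Dataverse returned ${res.status}`);
      return res.arrayBuffer();
    }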

C. Step – Upload to Azure Blob

Option 1: Using Azure Blob Storage Connector

  1. Add Action: Create blob (Azure Blob Storage)
  2. Sign in with your Azure account
  3. Fill:
    • Folder path: uploaded-documents
    • Blob name: Document-@{triggerOutputs()?['body/Title']}.pdf (or use the FileName column to keep the original name and extension)
    • Blob content: From Get file content step

Option 2: Using HTTP PUT with SAS URL

  1. Add Action: HTTP
  2. Method: PUT
  3. URI:
    https://<youraccount>.blob.core.windows.net/<container>/<filename>?<SAS_token>
  4. Headers:
    • x-ms-blob-type: BlockBlob
    • Content-Type: application/octet-stream
  5. Body: the file content from the Get file content step (if your flow passes base64 text, convert it with the base64ToBinary() expression so the blob receives raw bytes)
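
For clarity, here is what that HTTP action amounts to, written as a TypeScript sketch (sasUrl is the full URI from step 3, including the SAS token; fileBytes is the decoded file content):

    // Single-shot "Put Blob" upload: PUT the raw bytes to the SAS URL.
    async function putBlob(sasUrl: string, fileBytes: ArrayBuffer): Promise<void> {
      const response = await fetch(sasUrl, {
        method: "PUT",
        headers: {
          "x-ms-blob-type": "BlockBlob",              // required by the Put Blob operation
          "Content-Type": "application/octet-stream",
        },
        body: fileBytes,                              // raw bytes, not a base64 string
      });
      if (!response.ok) {
        throw new Error(`Upload failed: ${response.status} ${response.statusText}`);
      }
    }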

D. Optional: Update Dataverse Row

  • Add step Update row in Dataverse
  • Set UploadedToBlob = Yes and Status = Uploaded

6. Test Your Setup

  1. Open your portal
  2. Fill and submit form with a file
  3. Wait for flow to complete
  4. Check Azure Blob Storage:
    • File should appear inside uploaded-documents
    • Metadata (date/time/size) should be visible

Security & Governance

  • Use Private containers
  • Use SAS tokens with expiry limits
  • Log all flows and enable alerting for failed uploads
  • Use Azure Policy for data classification and retention
  • Monitor Blob Storage using Azure Monitor + Log Analytics

Advanced Features

A. Generate Unique Filenames

Use an expression in the Blob name field:

concat('Upload-', utcNow(), '-', triggerOutputs()?['body/FileName'])
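
Note that utcNow() returns a timestamp containing ':' characters, which are legal in blob names but awkward in URLs; a variant that formats the timestamp first (assuming the same FileName column) is often safer:

concat('Upload-', formatDateTime(utcNow(), 'yyyyMMdd-HHmmss'), '-', triggerOutputs()?['body/FileName'])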

B. Add Virus Scanning with Logic Apps

  • Use Azure Logic Apps + VirusTotal/Cloudmersive API
  • Scan files before confirming upload

C. Use Power Pages + JavaScript + Direct Upload

  • Bypass Dataverse and use JavaScript to call Azure REST API
  • Requires SAS Token generation and JS-based form customization
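
A minimal client-side sketch of that approach; getUploadSas is a hypothetical endpoint you would implement yourself to return a short-lived SAS URL, since account keys and long-lived tokens must never ship in browser code:

    // Hypothetical helper: ask your own backend for a short-lived SAS upload URL.
    async function getUploadSas(fileName: string): Promise<string> {
      const res = await fetch(`/api/upload-sas?name=${encodeURIComponent(fileName)}`);
      if (!res.ok) throw new Error("Could not obtain a SAS URL");
      return res.text();
    }

    // PUT the selected file straight to Blob Storage, bypassing Dataverse entirely.
    async function uploadDirect(file: File): Promise<void> {
      const sasUrl = await getUploadSas(file.name);
      const response = await fetch(sasUrl, {
        method: "PUT",
        headers: {
          "x-ms-blob-type": "BlockBlob",
          "Content-Type": file.type || "application/octet-stream",
        },
        body: file,
      });
      if (!response.ok) throw new Error(`Direct upload failed: ${response.status}`);
    }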

Cost Optimization Tips

  • Use Hot access tier for frequently accessed files
  • Use Lifecycle Management Rules to delete or archive files
  • Track ingress/egress bandwidth to manage costs
  • Store small documents only; move media to Azure Media Services

Troubleshooting

  • File not appearing in Blob → Check SAS permissions and folder path
  • Flow fails at upload → Validate the content type and blob URI
  • Access denied → Review container privacy and connection credentials
  • Invalid filename → Sanitize filenames in Power Automate
