Understanding Dataverse Storage Architecture

1. Introduction to Dataverse Storage

Microsoft Dataverse (formerly Common Data Service) is the cloud-based data storage layer for Dynamics 365 and Power Platform. It provides a structured, secure, and scalable database for business applications.

Key Features of Dataverse Storage:

  • Relational database (tables, relationships, business logic)
  • Security & compliance (role-based access, encryption)
  • Scalability (handles large datasets with performance optimization)
  • Integration-ready (APIs, Power Automate, Azure Synapse)


2. Dataverse Storage Components

A. Table (Entity) Storage

  • Standard Tables: Pre-built (e.g., Account, Contact, Opportunity)
  • Custom Tables: User-defined (e.g., “Project Task,” “Inventory Item”)
  • Storage Consumption:
      • Each record consumes ~2 KB (varies by number and type of columns)
      • File/image data is stored separately (see “File & Image Storage”)

B. File & Image Storage

  • File Columns: Attach documents (PDF, Excel, etc.)
      • Stored in Azure Blob Storage, not in SQL (see the upload sketch below)
      • Up to 4 TB of file storage per environment (separate from database storage)
  • Image Columns: Profile pictures, product images
      • Optimized for thumbnails & fast loading
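To make the Blob-backed behavior concrete, here is a minimal Python sketch that uploads a document into a file column through the Dataverse Web API. The environment URL, record ID, and the custom column name new_contract are hypothetical, and token acquisition (e.g., via MSAL) is omitted; very large files would instead need the Web API's chunked upload protocol.

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<azure-ad-bearer-token>"             # acquire via MSAL; omitted here

def upload_file(table: str, record_id: str, file_column: str, path: str) -> None:
    """Upload a local file into a Dataverse file column (backed by Azure Blob)."""
    with open(path, "rb") as f:
        data = f.read()
    resp = requests.patch(
        f"{ORG_URL}/api/data/v9.2/{table}({record_id})/{file_column}",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/octet-stream",
            "x-ms-file-name": path.rsplit("/", 1)[-1],  # name stored with the blob
        },
        data=data,
    )
    resp.raise_for_status()

# Hypothetical custom file column "new_contract" on the account table
upload_file("accounts", "00000000-0000-0000-0000-000000000000",
            "new_contract", "contract.pdf")
```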

C. Log & Audit Data

  • Audit Logs: Track record changes (configurable retention)
  • Plug-in Tracing: Debug logs for custom code
  • Both consume log storage but can be archived

D. Managed & Unmanaged Solutions

  • Solutions package customizations (forms, workflows, etc.)
  • Storage impact: Minimal (~MBs per solution)

3. Dataverse Capacity & Limits

A. Storage Allocation

Storage Type | Included (Per Environment) | Additional Purchase
Database     | 1 GB (default)             | Available in increments
File         | 4 GB                       | Expandable up to 4 TB
Log          | 500 MB                     | Configurable retention

B. Key Storage Limits

  • Max Database Size: Up to 4 TB (Enterprise plans)
  • Max File Storage: 4 TB per environment
  • Record Limit: No hard cap, but query performance can degrade past roughly 1M rows per table without indexing or archiving
  • Column Limit: ~1,000 columns per table


4. How Dataverse Storage is Calculated

A. Database Storage (SQL-based)

  • Each record ≈ 2 KB (varies by columns)
  • Example Calculation:
      • 100,000 Account records × 2 KB = 200 MB
      • 1,000,000 Contact records × 2 KB = 2 GB

B. File Storage (Azure Blob)

  • Files consume their actual size (a 1 MB PDF = 1 MB of storage)
  • Example:
      • 10,000 invoices (avg. 500 KB each) = 5 GB

C. Log Storage

  • Audit logs ≈ 1 KB per change
  • Example:
      • 100,000 audit entries × 1 KB = 100 MB (all three calculations are scripted in the sketch below)
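Since each estimate is a straight multiplication, the whole section scripts in a few lines. Here is a minimal sketch that reproduces the three worked examples above, using decimal units (1 MB = 1,000 KB) to match the article's rounding; the ~2 KB-per-record and ~1 KB-per-audit-entry figures are rough planning averages, not exact sizes.

```python
KB, MB, GB = 1, 1_000, 1_000_000   # decimal units, matching the examples above

RECORD_KB = 2   # rough average per record (section 4A); varies by columns
AUDIT_KB = 1    # rough average per audit entry (section 4C)

def estimate_storage(record_counts, files, audit_entries):
    """Return (database_kb, file_kb, log_kb) planning estimates."""
    database = sum(record_counts.values()) * RECORD_KB
    file_kb = sum(count * avg_kb for count, avg_kb in files)
    logs = audit_entries * AUDIT_KB
    return database, file_kb, logs

db, files_kb, logs = estimate_storage(
    {"account": 100_000, "contact": 1_000_000},  # section 4A examples
    files=[(10_000, 500)],                       # 10,000 invoices, ~500 KB each
    audit_entries=100_000,                       # section 4C example
)
print(f"Database: {db / GB:.1f} GB")        # 2.2 GB (200 MB + 2 GB)
print(f"Files:    {files_kb / GB:.1f} GB")  # 5.0 GB
print(f"Logs:     {logs / MB:.0f} MB")      # 100 MB
```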

5. Managing & Optimizing Dataverse Storage

A. Monitoring Storage Usage

  • Power Platform Admin Center → Analytics → Capacity
  • Key Metrics:
      • Database used vs. allocated (for per-table counts, see the sketch below)
      • File storage consumption
      • Log storage trends
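Beyond the Admin Center charts, per-table consumption can be approximated from record counts. Here is a minimal Python sketch using the Web API's RetrieveTotalRecordCount function; the environment URL is hypothetical and token acquisition is omitted. Note the function returns periodically refreshed snapshots, not live counts.

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<azure-ad-bearer-token>"

def record_counts(tables):
    """Fetch cached per-table record counts via RetrieveTotalRecordCount."""
    names = ",".join(f"'{t}'" for t in tables)
    resp = requests.get(
        f"{ORG_URL}/api/data/v9.2/RetrieveTotalRecordCount(EntityNames=[{names}])",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    result = resp.json()["EntityRecordCountCollection"]
    return dict(zip(result["Keys"], result["Values"]))

for table, count in record_counts(["account", "contact"]).items():
    # ~2 KB/row is the rough average from section 4A
    print(f"{table}: {count:,} rows (~{count * 2 / 1_000:.0f} MB)")
```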

B. Reducing Database Storage

  • Archive old records (e.g., closed cases, historical data); see the archive-and-delete sketch below
  • Use alternate storage (Azure SQL, Data Lake) for analytics
  • Delete unused tables (custom entities)
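A minimal sketch of the archive-then-delete pattern from the first bullet, using plain OData queries and per-record deletes. The table, ID column, cutoff date, and archive file are placeholders; for very large volumes, Dataverse's built-in bulk-delete jobs are the better tool.

```python
import json
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<azure-ad-bearer-token>"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def archive_then_delete(table, id_column, cutoff_iso, archive_path):
    """Export records last modified before the cutoff, then delete them.

    Re-queries the first page after each pass so deletions never
    invalidate a paging cursor.
    """
    url = f"{ORG_URL}/api/data/v9.2/{table}"
    params = {"$filter": f"modifiedon lt {cutoff_iso}", "$top": "500"}
    with open(archive_path, "a") as archive:
        while True:
            page = requests.get(url, headers=HEADERS, params=params)
            page.raise_for_status()
            rows = page.json()["value"]
            if not rows:
                break
            for record in rows:
                archive.write(json.dumps(record) + "\n")  # keep a copy first
                requests.delete(
                    f"{ORG_URL}/api/data/v9.2/{table}({record[id_column]})",
                    headers=HEADERS,
                ).raise_for_status()

# Hypothetical run: purge cases untouched since 2022
archive_then_delete("incidents", "incidentid",
                    "2022-01-01T00:00:00Z", "cases.jsonl")
```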

C. Optimizing File Storage

  • Set retention policies (auto-delete old files)
  • Use SharePoint for large document libraries
  • Compress files before upload

D. Managing Log Storage

  • Adjust audit log retention (default = 90 days)
  • Export & purge old logs (via Power Automate, or scripted as in the sketch below)
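The purge can also be scripted directly against the Web API rather than Power Automate. This sketch calls the DeleteAuditData action to remove audit records created before a cutoff date; exact behavior varies by environment (older environments delete whole audit partitions), so treat the details as an assumption to verify against your version.

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<azure-ad-bearer-token>"

# Ask Dataverse to delete audit records created before the cutoff date.
resp = requests.post(
    f"{ORG_URL}/api/data/v9.2/DeleteAuditData",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    json={"EndDate": "2023-01-01T00:00:00Z"},  # hypothetical cutoff
)
resp.raise_for_status()
```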


6. Advanced Storage Strategies

A. Tiered Storage (Hot/Cold Data)

  • Hot data: Frequently accessed (keep in Dataverse)
  • Cold data: Archived (move to Azure Blob or Data Lake)

B. Data Partitioning (Large Tables)

  • Vertical partitioning: Split columns into related tables
  • Horizontal partitioning: Archive old records

C. External Data Sources

  • Virtual Tables: Query external DBs (SQL, SharePoint) as if they were native tables (see the sketch below)
  • Azure Synapse Link: Replicate Dataverse data to a data warehouse for analytics
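Because a virtual table resolves its rows from the external source at query time, it consumes no Dataverse database storage, and clients query it exactly like a native table. A minimal sketch against a hypothetical virtual table with entity set name new_externalorders:

```python
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<azure-ad-bearer-token>"

# "new_externalorders" is a hypothetical virtual table entity set; rows come
# from the external source at query time and use no Dataverse DB storage.
resp = requests.get(
    f"{ORG_URL}/api/data/v9.2/new_externalorders?$top=10",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
for row in resp.json()["value"]:
    print(row)
```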

7. Troubleshooting Storage Issues

Issue                 | Solution
“Storage Full” errors | Purchase more capacity or archive data
Slow performance      | Optimize queries, index columns
File upload failures  | Check file storage limits
Audit logs missing    | Adjust retention policy

Key Takeaways:

  • Dataverse storage is split between Database (SQL), Files (Blob), and Logs
  • Monitor usage in the Power Platform Admin Center
  • Optimize by archiving, compressing, and partitioning

Best Practices:

  1. Regularly audit storage (avoid surprises)
  2. Use external storage (SharePoint, Azure) for large files
  3. Purge unused data (old logs, test records)
  4. Plan for growth (scale storage as needed)
