Performance Load Testing for Dataverse

As organizations increasingly adopt Microsoft Dataverse—the underlying data platform for Power Apps, Dynamics 365, and other Power Platform solutions—ensuring performance at scale has become more crucial than ever. Applications built on Dataverse often handle critical business functions, and users expect them to respond quickly and scale seamlessly.

This is where performance and load testing comes into play. These types of tests help evaluate how well Dataverse-based apps behave under expected (and unexpected) workloads, ensuring system reliability, scalability, and optimal user experience.

In this guide, we’ll explore what performance/load testing means in the context of Dataverse, tools to use, methodologies, metrics, and best practices for executing meaningful tests.


What Is Dataverse?

Microsoft Dataverse is a cloud-based, low-code data platform that stores data in tables and supports business logic, workflows, integrations, and security models. It powers apps built with Power Apps, Dynamics 365 apps (like Sales, Service), and custom connectors or API integrations.

Because Dataverse is a shared platform in the Microsoft cloud (hosted on Azure), performance testing must consider factors like:

  • API rate limits
  • Server response times
  • Custom plugins or Power Automate flows
  • Security roles and data volume
  • Background operations

Why Performance and Load Testing Is Important for Dataverse

Here are some key reasons to include performance testing in your Dataverse ALM (Application Lifecycle Management) strategy:

  • Validate Scalability: Will your solution handle a growing user base?
  • Identify Bottlenecks: Are custom plugins, business rules, or flows slowing things down?
  • Improve User Experience: Users expect fast responses across apps and portals.
  • Avoid Throttling: Stay within Microsoft’s API limits and service protection policies.
  • Optimize Integrations: Ensure APIs and background services perform well under load.
  • Support Business Continuity: High-performing apps reduce downtime and increase reliability.

Types of Tests to Run

1. Load Testing

  • Simulates normal to peak user traffic
  • Measures response time, resource usage, and throughput

2. Stress Testing

  • Pushes the system beyond its limits to observe failure behavior

3. Soak Testing

  • Sustains load over an extended time to identify memory leaks or performance degradation

4. Spike Testing

  • Applies a sudden increase in load to see how the system reacts and recovers

5. Concurrency Testing

  • Tests multiple users performing the same or different tasks simultaneously

Tools for Performance Testing Dataverse

Although Dataverse doesn’t offer a built-in performance test suite, you can use external tools with API-based and browser-based strategies.

Common Tools:

  • Postman + Collection Runner: Basic API load simulation
  • JMeter: Scripted performance and stress testing
  • Azure Load Testing: Scalable cloud-based load testing for web apps and APIs
  • k6 (k6.io): Lightweight scripting for API load testing
  • Power Platform Performance Toolkit: Preview tool for performance benchmarks
  • Playwright or Selenium: UI-based end-to-end performance testing
  • Application Insights: Telemetry for Dataverse plugins and flows

API-Based Load Testing with JMeter (Example)

Step 1: Set Up Authentication

Dataverse APIs use OAuth 2.0 with Azure AD. You’ll need:

  • Tenant ID
  • Client ID (App Registration)
  • Client Secret
  • Resource URL (e.g., https://org.crm.dynamics.com)

In JMeter, use a setUp Thread Group or a JSR223 PreProcessor to obtain an access token via a POST request to the token endpoint, then pass the token to subsequent requests in the Authorization header.
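
Outside JMeter, the token request is easy to prototype. Below is a minimal Python sketch using the OAuth 2.0 client-credentials grant; the tenant ID, client ID, secret, and org URL are placeholders for your own app registration values.

import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
RESOURCE = "https://org.crm.dynamics.com"  # your Dataverse org URL

def get_access_token() -> str:
    """Request a bearer token from the Microsoft identity platform."""
    token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
    resp = requests.post(token_url, data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": f"{RESOURCE}/.default",  # request a Dataverse-scoped token
    })
    resp.raise_for_status()
    return resp.json()["access_token"]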

Step 2: Define HTTP Requests

Create test cases for:

  • Retrieving records (GET)
  • Creating records (POST)
  • Updating records (PATCH)
  • Deleting records (DELETE)

Example GET request:

GET /api/data/v9.2/accounts?$top=10
Host: org.crm.dynamics.com
Authorization: Bearer <access_token>
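
For prototyping outside JMeter, here is a hedged Python sketch of the corresponding create, update, and delete calls against the Dataverse Web API. It reuses the get_access_token() helper sketched above; the account payload is purely illustrative.

import requests

BASE = "https://org.crm.dynamics.com/api/data/v9.2"
HEADERS = {
    "Authorization": f"Bearer {get_access_token()}",
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
    "Content-Type": "application/json",
}

# Create: Dataverse replies 204 No Content with the record URL in OData-EntityId
r = requests.post(f"{BASE}/accounts", headers=HEADERS, json={"name": "Load Test Account"})
record_url = r.headers["OData-EntityId"]

# Update the record just created, then delete it to keep the test environment clean
requests.patch(record_url, headers=HEADERS, json={"telephone1": "555-0100"})
requests.delete(record_url, headers=HEADERS)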

Step 3: Set Thread Groups

Define:

  • Number of virtual users
  • Ramp-up time
  • Loop count (how many iterations each virtual user executes)
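
In JMeter these settings live on the Thread Group element. As a rough Python analogue, the sketch below spins up virtual users with a staggered ramp-up, reusing BASE, HEADERS, and requests from the earlier sketches; request_once() is a stand-in for whichever call you want to exercise.

import time
from concurrent.futures import ThreadPoolExecutor

VIRTUAL_USERS = 50      # number of concurrent virtual users
RAMP_UP_SECONDS = 30    # spread user start times across this window
LOOP_COUNT = 10         # iterations each virtual user executes

def request_once() -> tuple[float, bool]:
    # One sample: elapsed seconds plus a success flag for a simple GET
    start = time.perf_counter()
    r = requests.get(f"{BASE}/accounts?$top=10", headers=HEADERS)
    return time.perf_counter() - start, r.ok

def virtual_user(index: int) -> list[tuple[float, bool]]:
    time.sleep(index * RAMP_UP_SECONDS / VIRTUAL_USERS)  # staggered start
    return [request_once() for _ in range(LOOP_COUNT)]

wall_start = time.perf_counter()
with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
    samples = [s for user in pool.map(virtual_user, range(VIRTUAL_USERS)) for s in user]
wall_seconds = time.perf_counter() - wall_start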

Step 4: Add Listeners and Metrics

Track:

  • Response times (min, max, avg)
  • Error rate
  • Throughput (requests/sec)
  • Latency
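
JMeter listeners such as the Aggregate Report collect these automatically. With the samples and wall_seconds gathered in the previous sketch, the same headline figures reduce to a few lines; note that the requests library measures full round-trip response time, whereas JMeter reports latency (time to first byte) separately.

elapsed = [t for t, _ in samples]
errors = sum(1 for _, ok in samples if not ok)

print(f"response time min/avg/max: {min(elapsed):.3f}/{sum(elapsed)/len(elapsed):.3f}/{max(elapsed):.3f} s")
print(f"error rate: {errors / len(samples):.1%}")
print(f"throughput: {len(samples) / wall_seconds:.1f} req/s")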

Using Azure Load Testing for Dataverse

Azure Load Testing allows you to simulate high user loads from Microsoft’s cloud. While it’s designed for web apps and APIs, it can be extended to test Dataverse Web API endpoints.

Benefits:

  • Integration with Azure DevOps Pipelines
  • Scalable test execution
  • Visual analysis dashboards
  • Correlation with Application Insights telemetry

Example Scenario:

  • 500 users create contact records over 5 minutes
  • Monitor plugin execution times via telemetry
  • Identify any failed or throttled requests (see the retry sketch below)
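
Throttled requests surface as HTTP 429 responses carrying a Retry-After header under Dataverse's service protection limits. A minimal sketch for counting and honoring them in a custom harness, assuming the requests-based helpers above:

import time
import requests

def post_with_retry(url: str, payload: dict, max_retries: int = 3) -> requests.Response:
    """POST, backing off when Dataverse throttles with HTTP 429."""
    for _ in range(max_retries):
        r = requests.post(url, headers=HEADERS, json=payload)
        if r.status_code != 429:
            return r
        # The Retry-After header says how long to wait before retrying
        time.sleep(float(r.headers.get("Retry-After", "5")))
    return r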

Key Metrics to Monitor

  • Response Time: Time taken to receive a response from the Dataverse API
  • Throughput: Number of requests processed per second
  • Error Rate: Percentage of failed requests
  • Latency: Time from request submission until the first byte of the response arrives
  • CPU/Memory Usage: From plugin telemetry or flow metrics
  • SQL Query Execution Time: If using custom SQL via the TDS endpoint or virtual tables
  • Plugin Step Duration: Captured via Plugin Trace Logs or Application Insights

Testing Flows, Plugins, and Business Logic

It’s not just about the API—Dataverse performance also depends on customizations.

What to Test:

  • Synchronous Plugins: How fast do they execute under load?
  • Power Automate Flows: Do they queue or error under concurrent submissions?
  • Custom APIs: Built with Power Platform Custom Connectors or Azure Functions
  • Business Rules: Is client-side logic slowing down forms?

Use Plugin Trace Logs, App Insights, or Telemetry Logging to capture real execution metrics.
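
For example, once Plugin Trace Logs are enabled, the slowest recent plugin executions can be pulled straight from the Web API. The sketch below assumes the standard plugintracelogs table and its performanceexecutionduration column (milliseconds); adjust the column names if your environment differs.

r = requests.get(
    f"{BASE}/plugintracelogs"
    "?$select=typename,messagename,performanceexecutionduration"
    "&$orderby=performanceexecutionduration desc&$top=10",
    headers=HEADERS,
)
for log in r.json()["value"]:  # ten slowest logged plugin steps
    print(log["typename"], log["messagename"], f"{log['performanceexecutionduration']} ms")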


Example Scenario: Load Testing a Lead Capture API

Goal:

Test how Dataverse performs when 200 users submit lead forms concurrently via a portal or custom web app.

Steps:

  1. Create a JMeter or K6 script to simulate form POSTs (a Python analogue is sketched after these steps)
  2. Use OAuth token for authentication
  3. Monitor:
    • POST request latency
    • Plugin execution times
    • Power Automate flow trigger delays
    • API failure or timeout rates
  4. Add Application Insights logging to track plugin memory and exceptions
  5. Scale to 1000 submissions/hour to test soak performance
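
As a rough harness for step 1, the sketch below fires 200 concurrent lead creations using asyncio and the aiohttp client library. The leads payload fields are illustrative, and BASE and HEADERS come from the earlier sketches.

import asyncio
import aiohttp

CONCURRENT_USERS = 200

async def submit_lead(session: aiohttp.ClientSession, i: int) -> int:
    payload = {"subject": f"Load test lead {i}", "lastname": "Tester"}
    async with session.post(f"{BASE}/leads", json=payload) as resp:
        return resp.status  # expect 204; 429 means throttling

async def main() -> None:
    async with aiohttp.ClientSession(headers=HEADERS) as session:
        statuses = await asyncio.gather(
            *(submit_lead(session, i) for i in range(CONCURRENT_USERS))
        )
    print({code: statuses.count(code) for code in set(statuses)})  # status histogram

asyncio.run(main())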

Best Practices for Performance Testing Dataverse

  1. Use Non-Production Environments: Always test on dedicated performance/UAT environments to avoid interfering with real users.
  2. Isolate Bottlenecks: Separate testing between:
    • API performance
    • Custom logic (plugins/flows)
    • Frontend performance (model-driven/canvas apps)
  3. Use Telemetry: Enable Plugin Trace Logs or Application Insights to capture detailed metrics.
  4. Simulate Realistic Scenarios: Test expected user patterns such as burst writes, report loads, and data integrations.
  5. Follow API Limits: Respect Dataverse service protection limits (e.g., 6,000 requests per user per 5-minute sliding window); a simple client-side limiter is sketched after this list.
  6. Test in Waves: Gradually increase load to mimic real usage patterns and discover thresholds.
  7. Automate Load Tests: Integrate JMeter or Azure Load Testing into CI/CD pipelines for regular validation.
  8. Track Changes Over Time: Monitor performance across solution deployments, schema changes, and feature rollouts.
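
A minimal sketch of such a client-side guard: it blocks before any call that would exceed a configurable budget within a rolling 5-minute window (a simplification of the real per-user, per-server limits).

import collections
import time

class WindowLimiter:
    """Block before a call would exceed max_requests in a rolling window."""
    def __init__(self, max_requests: int = 6000, window_seconds: int = 300):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.timestamps = collections.deque()

    def wait(self) -> None:
        now = time.monotonic()
        # Forget requests that have aged out of the window
        while self.timestamps and now - self.timestamps[0] > self.window_seconds:
            self.timestamps.popleft()
        if len(self.timestamps) >= self.max_requests:
            time.sleep(self.window_seconds - (now - self.timestamps[0]))
        self.timestamps.append(time.monotonic())

limiter = WindowLimiter()
# Call limiter.wait() before each request in the load loop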

Challenges and Considerations

  • API Throttling: Dataverse may throttle clients that exceed usage limits
  • Caching Effects: First-time queries may perform differently than cached ones
  • Licensing and Entitlements: Ensure your load test doesn’t violate terms (e.g., via overuse of Dataverse API entitlements)
  • Simulated Users ≠ Real Users: Test scenarios should mimic actual workflows, not just raw requests

Reporting & Analysis

After testing, compile metrics into meaningful dashboards or reports:

  • Use Power BI to visualize throughput, errors, and response trends
  • Export JMeter or K6 test logs to Excel for deeper analysis
  • Compare performance before/after deployments
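
For the JMeter route, pandas makes quick work of a results file. This sketch assumes JMeter's default CSV output (a .jtl file with label, elapsed, and success columns):

import pandas as pd

df = pd.read_csv("results.jtl")  # JMeter CSV results
summary = df.groupby("label").agg(
    requests=("elapsed", "size"),
    avg_ms=("elapsed", "mean"),
    p95_ms=("elapsed", lambda s: s.quantile(0.95)),
    error_rate=("success", lambda s: (s.astype(str).str.lower() != "true").mean()),
)
print(summary.sort_values("p95_ms", ascending=False))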

Ask questions like:

  • Which entity requests are the slowest?
  • Are plugin times increasing with more data?
  • Does API performance degrade during peak business hours?

When Should You Run Performance Tests?

  • Before go-live
  • After significant changes (e.g., new entities, plugins, flows)
  • After integrating external systems
  • Before scaling users or regions
  • Regularly as part of DevOps pipelines
