Power Automate has memory usage limits for flows. If a flow consumes too much memory, it fails with the error:

“Flow failed due to excessive memory usage.”

This guide explains:
Why this error occurs.
How to check memory usage in Power Automate.
How to optimize memory usage to prevent failures.


Step 1: Understanding Power Automate Memory Limits

Power Automate allocates a fixed amount of memory per flow execution. If the flow exceeds this limit, it crashes.

Power Automate Plan              Max Memory Usage per Flow
Microsoft 365 Plan (Standard)    100 MB
Per-User Plan / Per-Flow Plan    500 MB
Power Automate Process Plan      Higher limits
Azure Logic Apps (alternative)   Scalable limits

1.1. Common Causes of Excessive Memory Usage

Processing too much data (large lists, tables, or arrays).
Handling large files (e.g., PDFs, Excel sheets, JSON payloads).
Too many nested loops or conditions.
Working with complex objects or deeply nested JSON structures.
Using too many parallel executions (Concurrency).


Step 2: Identifying Memory Usage Issues

2.1. Check Flow Run History for Memory Issues

  1. Open Power Automate (https://make.powerautomate.com).
  2. Click on My Flows → Select the affected flow.
  3. Go to Run History → Open a failed run.
  4. Look for errors such as: Flow failed due to excessive memory usage.
  5. Expand the Flow Execution Details to check:
    • Action count
    • Loop iterations
    • Size of processed data

2.2. Identify Actions Consuming Excessive Memory

  1. Open the failed flow run.
  2. Look for large data payloads under Trigger Outputs and Action Outputs.
  3. Identify loops or array processing steps handling too much data.

Step 3: Optimizing Memory Usage

3.1. Limit the Amount of Data Processed

If the flow retrieves too much data, limit the amount before processing.

Example: Optimize SharePoint Data Retrieval

Inefficient:

  • Retrieving all items from a SharePoint list and filtering them in the flow.

Optimized:

  • Use OData Filters to fetch only relevant items: Filter Query: Status eq 'Approved'
  • Use the “Top Count” setting to limit the number of records (e.g., 500).
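
Power Automate's "Filter Query" and "Top Count" settings are configuration, not code, but the OData query string they correspond to can be sketched in Python. This is an illustration only; `build_odata_query` is a hypothetical helper, and the Get Items action assembles the same parameters for you:

```python
from urllib.parse import urlencode

# Hypothetical helper: builds the query-string equivalent of the
# "Filter Query" and "Top Count" settings on a Get Items action.
def build_odata_query(filter_expr: str, top: int) -> str:
    return urlencode({"$filter": filter_expr, "$top": top})

params = build_odata_query("Status eq 'Approved'", 500)
print(params)  # %24filter=Status+eq+%27Approved%27&%24top=500
```

Because the filter and the row cap are applied server-side, only the matching items ever enter the flow's memory.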

3.2. Reduce File Sizes Before Processing

If handling large files, reduce their size before using them in the flow.

Example: Compress Large Files Before Uploading

Inefficient:

  • Processing large PDFs or Excel files directly in Power Automate.

Optimized:

  • Compress files before uploading to SharePoint or OneDrive.
  • If working with Excel, extract only necessary rows and columns before processing.
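
If the workbook can be exported to CSV, a small pre-processing script can strip it down before the flow ever touches it. A minimal sketch using only the standard library; the column names (`Title`, `Status`, `Notes`) are illustrative, not from a real list:

```python
import csv
import io

def keep_columns(csv_text: str, wanted: list[str]) -> str:
    """Return a CSV containing only the columns named in `wanted`."""
    reader = csv.DictReader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=wanted)
    writer.writeheader()
    for row in reader:
        # Copy just the wanted fields; everything else is dropped.
        writer.writerow({k: row[k] for k in wanted})
    return out.getvalue()

src = "Title,Status,Notes\nProject A,Approved,very long free text...\n"
slim = keep_columns(src, ["Title", "Status"])
print(slim)
```

The trimmed file carries only the fields the flow actually needs, so every downstream action handles a smaller payload.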

3.3. Optimize Loops and Array Processing

Loops consume significant memory, especially if handling large datasets.

Example: Avoid Unnecessary Looping

Inefficient:

Apply to Each: Loops through 10,000 records and applies an action to each.

Optimized:

  • Use filtering before looping to reduce iterations.
  • Batch process data instead of handling it all at once.
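
The difference between the two approaches can be illustrated in Python (the record shape and field names are invented for the example):

```python
# Sample data standing in for 10,000 retrieved records.
records = [{"id": i, "status": "Approved" if i % 10 == 0 else "Pending"}
           for i in range(10_000)]

# Filter first, then loop: the loop body now runs 1,000 times instead
# of 10,000, and only the matching records stay in memory.
approved = [r for r in records if r["status"] == "Approved"]
print(len(approved))  # 1000
```

In a flow, the same effect comes from a Filter Array action (or a server-side OData filter) placed before the Apply to Each.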

3.4. Use Pagination to Process Data in Chunks

Instead of processing large datasets at once, use pagination.

Example: Enable Pagination in SharePoint or Dataverse Queries

  1. Open the “Get Items” action.
  2. Click on Settings.
  3. Turn on Pagination and set a limit (e.g., 1000).
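
Conceptually, pagination just splits one large result set into fixed-size chunks. A minimal sketch of that idea (the connector does this for you; the numbers are arbitrary):

```python
def pages(items, page_size=1000):
    """Yield fixed-size chunks so only one page is handled at a time."""
    for start in range(0, len(items), page_size):
        yield items[start:start + page_size]

data = list(range(2500))  # stand-in for 2,500 retrieved rows
sizes = [len(page) for page in pages(data)]
print(sizes)  # [1000, 1000, 500]
```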

3.5. Reduce JSON Payload Size

If a flow works with large JSON objects, reduce payload size before processing.

Example: Extract Only Required Data from JSON

Inefficient:

  • Processing entire JSON responses when only a few fields are needed.

Optimized:
Use Select action to extract only required fields before processing:

{
  "Title": "Project A",
  "Status": "Approved"
}

This prevents unnecessary memory usage.
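
What the Select action does can be mirrored in Python with a comprehension. In this sketch, "Attachments" stands in for a large field the flow does not need downstream:

```python
# Full responses as a flow might receive them.
items = [
    {"Title": "Project A", "Status": "Approved", "Attachments": "<large blob>"},
    {"Title": "Project B", "Status": "Pending", "Attachments": "<large blob>"},
]

# Equivalent of the Select action: keep only the required fields.
slim = [{"Title": i["Title"], "Status": i["Status"]} for i in items]
print(slim[0])  # {'Title': 'Project A', 'Status': 'Approved'}
```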


3.6. Enable Concurrency Control to Limit Parallel Executions

If a flow processes too many items simultaneously, memory usage increases.

Steps to Enable Concurrency Control:

  1. Open the Apply to Each action.
  2. Click on Settings.
  3. Turn on Concurrency Control.
  4. Set a lower Degree of Parallelism (e.g., 5).

This ensures fewer simultaneous executions, reducing memory usage.
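
The trade-off is the same one a thread pool makes. In this Python analogy, `max_workers` plays the role of Degree of Parallelism; `process` is a stand-in for the per-item work inside Apply to Each:

```python
from concurrent.futures import ThreadPoolExecutor

def process(item):
    return item * 2  # stand-in for the per-item work

items = list(range(20))

# At most five items are in flight at once, capping peak memory use
# at the cost of some throughput.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(process, items))
print(results[:3])  # [0, 2, 4]
```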


3.7. Offload Heavy Processing to Azure Logic Apps

If a flow requires more memory than Power Automate allows, use Azure Logic Apps:

  1. Open Azure Portal → Go to Logic Apps.
  2. Create a Logic App for processing large datasets or files.
  3. Use Power Automate to trigger the Logic App when needed.

Azure Logic Apps handle larger payloads and more memory-intensive operations.


3.8. Upgrade to a Higher Power Automate Plan

If the flow frequently runs out of memory, consider upgrading your plan.

  1. Open Microsoft 365 Admin Center (https://admin.microsoft.com).
  2. Navigate to Billing → Purchase Services.
  3. Choose a Power Automate Per-User Plan or Per-Flow Plan for higher memory limits.

Step 4: Preventing Future Memory Issues

4.1. Best Practices to Reduce Memory Usage

Limit the data retrieved – Use filters and pagination.
Reduce file sizes – Compress large files before processing.
Optimize loops – Minimize “Apply to Each” iterations.
Extract only necessary JSON data – Avoid unnecessary fields.
Limit concurrency – Reduce simultaneous executions.
Use Azure Logic Apps for high-memory tasks – Handle large datasets efficiently.

4.2. Set Up Alerts for High Memory Usage

  1. Open Power Platform Admin Center.
  2. Set up alerts for flows consuming excessive memory.

This helps detect memory issues before failures occur.
