Power Automate is a powerful automation tool, but inefficient flows can lead to slow execution, timeouts, and unnecessary resource consumption. Optimizing your flows ensures faster processing and better reliability.
In this guide, we’ll cover:
How to reduce execution time
Ways to optimize loops and triggers
Best practices for handling large datasets
1. Optimize Triggers for Better Performance
Why?
Flows that trigger too frequently or unnecessarily can slow down performance and consume unnecessary resources.
Best Practices
✔️ Use Trigger Conditions to filter out unnecessary runs.
✔️ Avoid Recurring Triggers unless absolutely necessary.
✔️ Use “When an Item is Modified” Instead of “Created or Modified” to prevent redundant executions.
Example Trigger Condition (Only Run When “Status” Changes to Approved):
@equals(triggerOutputs()?['body/Status'], 'Approved')
Benefit: This ensures the flow only runs when the specific condition is met, avoiding unnecessary triggers.
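In ordinary code, a trigger condition is just a predicate evaluated against the trigger payload before any work starts. A minimal Python sketch of the condition above (the payload shape is an assumption for illustration):

```python
# Sketch of the trigger condition as a plain predicate:
# run the flow only when the item's Status is 'Approved'.
# The dict shape stands in for triggerOutputs()?['body'].
def should_trigger(trigger_body: dict) -> bool:
    """Analogue of @equals(triggerOutputs()?['body/Status'], 'Approved')."""
    return trigger_body.get("Status") == "Approved"

print(should_trigger({"Status": "Approved"}))  # True: flow runs
print(should_trigger({"Status": "Draft"}))     # False: run is skipped
```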
2. Reduce API Calls and Minimize Actions
Why?
Each action in Power Automate requires processing time. Reducing the number of actions speeds up execution.
Best Practices
✔️ Use Variables Instead of Repeated Actions
✔️ Combine Multiple Actions Using Expressions
✔️ Avoid Unnecessary “Get Items” or “Get Rows” Actions
Example: Using Variables Instead of Fetching Data Multiple Times
Inefficient Approach:
Get Item (Step 1) → Modify Value → Get Item Again (Step 2)
Optimized Approach:
Get Item Once → Store in Variable → Modify Value → Use Variable
Benefit: This eliminates redundant API calls, reducing execution time.
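The difference between the two approaches can be made concrete with a small sketch. Here `get_item` is a hypothetical stand-in for a "Get Item" action, and a call counter shows how many round trips each approach costs:

```python
# Why caching a fetched record in a variable beats refetching it.
call_count = 0

def get_item(item_id):
    """Simulated 'Get Item' action; each call is one API round trip."""
    global call_count
    call_count += 1
    return {"id": item_id, "Status": "Pending", "Title": "Demo"}

# Inefficient: fetch the same record before every use.
title = get_item(42)["Title"]
status = get_item(42)["Status"]      # second, redundant round trip
print(call_count)                    # 2 round trips

# Optimized: fetch once, store in a variable, reuse it.
call_count = 0
item = get_item(42)                  # single round trip
title, status = item["Title"], item["Status"]
print(call_count)                    # 1 round trip
```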
3. Optimize Loops (Apply to Each, Do Until)
Why?
Loops can slow down your flow, especially if they process large datasets.
Best Practices
✔️ Use Parallelism for Faster Execution
✔️ Limit the Number of Loop Iterations
✔️ Filter Data Before Looping
Example: Enabling Parallelism for Loops
1️⃣ Click on the “Apply to Each” action.
2️⃣ Go to Settings and enable “Concurrency Control”.
3️⃣ Set a higher degree of parallelism (e.g., 20).
Benefit: This allows multiple iterations to run simultaneously, reducing overall execution time.
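A rough analogue of Concurrency Control in ordinary code is a worker pool: iterations run in parallel up to a fixed degree instead of strictly one after another. A sketch with a pool of 20, mirroring the setting suggested above (the `process` body is a placeholder for the work inside one iteration):

```python
from concurrent.futures import ThreadPoolExecutor
import time

def process(item):
    """Stand-in for the work done inside one loop iteration."""
    time.sleep(0.01)          # simulate a short API call
    return item * 2

items = list(range(100))

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=20) as pool:   # ~degree of parallelism 20
    results = list(pool.map(process, items))
elapsed = time.perf_counter() - start

# 100 iterations finish in roughly (100 / 20) * 0.01 s
# instead of the sequential 100 * 0.01 s.
print(f"{len(results)} items in {elapsed:.2f}s")
```

Note that parallel iterations must not depend on each other's results; if one iteration writes a value the next one reads, keep concurrency off.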
4. Use Filter Queries Instead of Fetching All Data
Why?
Fetching all data and then filtering it within the flow is inefficient. Instead, apply filtering at the source.
Best Practices
✔️ Use OData filters for SharePoint, Dataverse, and SQL queries.
✔️ Apply “Top Count” to limit the number of records retrieved.
Example: Filtering Data Directly in “Get Items” (SharePoint List)
Inefficient Approach:
Get all items → Apply "Condition" in Flow
Get all items → Apply "Condition" in Flow
Optimized Approach:
Use Filter Query: Status eq 'Approved'
Benefit: This retrieves only relevant records, reducing load time.
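What a filter query does is push the condition into the request itself rather than into the flow. A sketch of assembling such a request, where `build_filter` is a hypothetical helper that produces the same OData `$filter` clause used above:

```python
# Push the filter to the data source instead of filtering in the flow.
from urllib.parse import urlencode

def build_filter(field, value):
    """Build an OData equality filter, e.g. Status eq 'Approved'."""
    return f"{field} eq '{value}'"

query = urlencode({
    "$filter": build_filter("Status", "Approved"),
    "$top": 100,                      # Top Count: cap records retrieved
})
print(query)                          # URL-encoded query string
```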
5. Avoid Infinite Loops and Redundant Triggers
Why?
Flows that modify data in a list and trigger themselves again can cause unnecessary execution.
Best Practices
✔️ Add a Trigger Condition to prevent unwanted re-execution.
✔️ Check whether the item was last modified by the automation’s own system/service account, and skip the run if so, to stop infinite loops.
Example Condition to Prevent Recursive Triggers:
@not(equals(triggerOutputs()?['body/ModifiedBy'], 'System Account'))
Benefit: This prevents self-triggering loops and saves execution time.
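The guard amounts to one comparison before any work happens: skip runs where the last editor is the automation's own account. A minimal Python sketch (the account name and payload shape are assumptions matching the expression above):

```python
# Recursion guard: run only for edits made by someone other than
# the automation's own service account.
SERVICE_ACCOUNT = "System Account"   # assumed account name

def should_run(trigger_body: dict) -> bool:
    """Analogue of @not(equals(...ModifiedBy, 'System Account'))."""
    return trigger_body.get("ModifiedBy") != SERVICE_ACCOUNT

print(should_run({"ModifiedBy": "System Account"}))  # False: skip
print(should_run({"ModifiedBy": "Alice"}))           # True: run
```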
6. Batch Processing for Large Data Sets
Why?
Processing data one record at a time is slow. Batch processing improves performance significantly.
Best Practices
✔️ Use Pagination in “Get Items” for SharePoint.
✔️ Use Power Automate’s “Batch Requests” where supported.
Example: Enabling Pagination in “Get Items”
1️⃣ Click on “Settings” inside “Get Items”
2️⃣ Enable “Pagination” and set a higher limit (e.g., 5000).
Benefit: A single “Get Items” action can return records beyond the default item limit, so you don’t have to build a manual paging loop.
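Under the hood, pagination is a loop that keeps requesting pages until the source runs out. A sketch of that loop, with `fetch_page` standing in for one paged request (all names and the data are illustrative):

```python
# What pagination automates: request page after page until a short page
# signals the end of the dataset, yielding every record along the way.
ALL_ROWS = [{"id": i} for i in range(12)]   # pretend server-side data

def fetch_page(skip, page_size):
    """Simulated paged request: one slice of the dataset."""
    return ALL_ROWS[skip:skip + page_size]

def get_all_items(page_size=5):
    skip = 0
    while True:
        page = fetch_page(skip, page_size)
        yield from page
        if len(page) < page_size:     # last (short or empty) page reached
            break
        skip += page_size

items = list(get_all_items())
print(len(items))  # 12
```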
7. Use Scope and Try-Catch for Error Handling
Why?
Flows can fail because of missing data or transient timeouts; proper error handling keeps those failures from derailing the whole run or forcing manual reruns.
Best Practices
✔️ Wrap actions in “Scope” for better error tracking.
✔️ Add a “Retry Policy” to handle transient failures.
Example: Applying Retry Policy
{
  "type": "exponential",
  "interval": "PT5S",
  "count": 3
}
Benefit: This automatically retries failed actions, reducing manual intervention.
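A hand-rolled analogue of that policy in ordinary code makes the behavior explicit: up to 3 retries, starting at a 5-second interval that grows (roughly doubling here) between attempts:

```python
# Exponential-backoff retry, mirroring the policy above.
import time

def with_retries(action, max_retry_count=3, interval_seconds=5.0):
    """Call `action`; on failure, retry with exponential backoff."""
    delay = interval_seconds
    for attempt in range(max_retry_count + 1):
        try:
            return action()
        except Exception:
            if attempt == max_retry_count:
                raise                  # out of retries: surface the error
            time.sleep(delay)
            delay *= 2                 # exponential backoff

# Flaky action that fails twice, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky, interval_seconds=0.01))  # "ok" after 2 retries
```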
8. Reduce Run History Log Size
Why?
Flows with extensive history logs take longer to load.
Best Practices
✔️ Set a retention policy for flow history.
✔️ Delete unnecessary logs with the Power Automate Management connector’s “Delete Flow Run” action.
Example: Automating Log Cleanup
1️⃣ Create a Scheduled Flow
2️⃣ Use the “List Runs” API to fetch flow history
3️⃣ Delete old records programmatically
Benefit: Keeps Power Automate lightweight and fast.
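The core of step 3 is a retention check: given a list of run records, keep only those older than the retention window for deletion. A hypothetical sketch; the field names are assumptions, not the real API shape:

```python
# Pick flow runs older than a retention window for deletion.
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 28   # assumed retention window

def runs_to_delete(runs, now=None):
    """Return runs whose startTime is past the retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in runs
            if datetime.fromisoformat(r["startTime"]) < cutoff]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
runs = [
    {"id": "run1", "startTime": "2024-01-01T00:00:00+00:00"},  # old
    {"id": "run2", "startTime": "2024-05-30T00:00:00+00:00"},  # recent
]
print([r["id"] for r in runs_to_delete(runs, now)])  # ['run1']
```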
9. Optimize Expressions and Conditions
Why?
Unoptimized conditions and complex expressions can slow execution.
Best Practices
✔️ Use “coalesce()” to handle null values efficiently.
✔️ Prefer “Switch” over multiple “If” conditions.
Example: Using Coalesce to Prevent Null Errors
coalesce(triggerOutputs()?['body/Name'], 'Default Name')
Benefit: This ensures a default value is used instead of causing errors.
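The semantics of `coalesce()` are easy to state in ordinary code: return the first value that is not null, falling back through the arguments in order. A Python analogue:

```python
# Python analogue of the coalesce() expression: first non-None value.
def coalesce(*values):
    """Return the first value that is not None, else None."""
    return next((v for v in values if v is not None), None)

print(coalesce(None, "Default Name"))       # 'Default Name'
print(coalesce("Alice", "Default Name"))    # 'Alice'
```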
10. Choose the Right Connector Type
Why?
The connector and data source you pick can matter as much as the flow logic itself; in some scenarios, premium connectors handle large datasets more efficiently than standard ones.
Best Practices
✔️ If possible, use Dataverse instead of SharePoint for large datasets.
✔️ Use SQL Server instead of Excel for data-heavy operations.
✔️ Prefer Graph API over Microsoft 365 actions when handling multiple records.
Example: Using Microsoft Graph API for Bulk Processing
Instead of looping through emails in Outlook, use:
https://graph.microsoft.com/v1.0/me/messages?$filter=contains(subject,'Invoice')
Benefit: Faster retrieval and processing of data.
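Calling that endpoint directly is a single HTTP request rather than a per-message loop. A sketch using only the standard library; the token is a placeholder (real calls need an OAuth bearer token), and the `$filter` clause is taken verbatim from the example above:

```python
# Build and prepare a single Graph request for the bulk query above.
from urllib.parse import quote
import urllib.request

BASE = "https://graph.microsoft.com/v1.0/me/messages"
url = BASE + "?$filter=" + quote("contains(subject,'Invoice')")

req = urllib.request.Request(url, headers={
    "Authorization": "Bearer <ACCESS_TOKEN>",   # placeholder token
})
# urllib.request.urlopen(req) would return all matching messages at once.
print(url)
```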