
Power Automate & Concurrency

6 min read By Benjamin Cloughessy

Understanding Flow Concurrency

Power Automate flows run with unlimited concurrency out of the box. While that default suits many use cases, it introduces risk when multiple instances of the same flow attempt to modify shared resources, or when work must be processed sequentially.

Controlling Concurrency

You can control how many instances of your flow can run simultaneously through the trigger settings:

  1. Open your flow in edit mode
  2. Select the trigger action
  3. Click on the ellipsis (…) menu
  4. Choose “Settings”
  5. Configure the concurrency control options

If you choose to allow concurrency, you also have to think more carefully about how your flow uses resources.

Protecting Shared Resources

When multiple flow instances run simultaneously, protecting shared resources becomes crucial. Here are key strategies to implement:

Implementing Resource Locks

A lock mechanism ensures that only one instance of your flow can modify a shared resource at a time. Personally, I think the cleanest way to do this in Power Automate is to use a Dataverse table to manage your locks across the Power Platform.

For example, before updating a shared configuration file, your flow should:

  1. Attempt to acquire a lock
  2. Proceed only if the lock is successfully acquired
  3. Release the lock after completing the operation
  4. Include error handling and timeout mechanisms to prevent indefinite locks
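The steps above can be sketched in plain Python. This is a minimal illustration, not a Power Automate implementation: the in-memory dictionary stands in for the Dataverse lock table, and in a real flow the reads and writes would be Dataverse "List rows" and "Add/Update a row" actions.

```python
import time

# In-memory stand-in for the Dataverse lock table;
# maps processUid -> timestamp when the lock was taken.
lock_table = {}
LOCK_TIMEOUT_SECONDS = 300  # expire stale locks so a crashed run can't block forever

def acquire_lock(process_uid, now=None):
    """Return True if the lock was acquired, False if another run holds it."""
    now = now if now is not None else time.time()
    held_since = lock_table.get(process_uid)
    if held_since is not None and now - held_since < LOCK_TIMEOUT_SECONDS:
        return False  # a live lock exists; skip or retry later
    lock_table[process_uid] = now  # take over an expired lock, or create a new one
    return True

def release_lock(process_uid):
    lock_table.pop(process_uid, None)

# Usage: guard the critical section, releasing the lock even on failure.
if acquire_lock("site-config-update"):
    try:
        pass  # ...update the shared configuration file here...
    finally:
        release_lock("site-config-update")
```

The timeout is the key detail: without it, a cancelled or failed run leaves the lock held indefinitely, which is exactly the failure mode step 4 is guarding against.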

Managing Shared Resources

Shared resources in Power Automate could include things like:

  • SharePoint lists and libraries
  • Dataverse tables
  • Azure resources
  • External APIs and resources

I most commonly run into these challenges when working with SharePoint.

Building Reentrant Flows

Reentrancy is a computer science term that doesn’t exactly apply to Power Automate, but the concept carries over well. A reentrant flow will behave predictably even when:

  • Cancelled mid-execution
  • Run multiple times in succession
  • Executed concurrently

Resource locks help a lot here. Additionally, we can check whether an action still needs to be performed before performing it (has the site already been created, has the group already been deleted, etc.).

Another consideration is to structure your flows for reentrancy.

A pile of nested actions and conditions makes it complicated to set up reentrant flows. Instead, consider a flat flow architecture: each action-group sits at the top level in its own scope block. At the start of each action-group, we check whether it should be performed. If not, we can also decide whether the flow should continue to the next action-group or simply terminate.

For example: if the action-group creates a SharePoint site, you might first check whether the site already exists. If it does, simply continue to the next action-group. However, if some critical piece of data is unavailable, or this particular action is already being processed by another flow, we might terminate the whole run.
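A flat structure like this is easy to express as a sequence of gated steps. The sketch below is illustrative Python, not flow JSON: each function models one top-level scope block that decides to run, skip, or terminate, and the names (`create_site_group`, the `state` dictionary) are invented for the example.

```python
# Outcomes an action-group can report back to the flow.
SKIP, RUN, TERMINATE = "skip", "run", "terminate"

def create_site_group(state):
    """Models one action-group: create a SharePoint site if needed."""
    if state.get("site_exists"):
        return SKIP            # work already done; fall through to the next group
    if not state.get("site_name"):
        return TERMINATE       # critical data missing; stop the whole run
    state["site_exists"] = True  # ...create the SharePoint site here...
    return RUN

def run_flow(state, action_groups):
    """Walk the flat list of action-groups in order."""
    for group in action_groups:
        if group(state) == TERMINATE:
            return "terminated"
    return "completed"
```

Because every group is a top-level step with its own entry check, re-running the flow after a cancellation simply skips the groups that already finished.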

Practical Implementation: The Automation State Table

One effective approach for managing things like resource locks, shared resources, and reentrancy is creating a dedicated automation state tracking system. Here’s generally how I would implement it:

Table Structure

Create a Dataverse table with some key fields:

  • processUid (unique identifier for the automation, process, or resource)
  • state (Choice: Running, Complete, Failed)
  • errorDetails (Text)

Example State Query

Before performing critical operations, your flow should check the automation state:

  1. Query the state table, filtering by processUid (this could be a value specific to that flow, or a uid generated from the resource it's modifying; the point is something that consistently identifies that process)
  2. Verify if the automation is already running
  3. Update the state accordingly

It’s outside the scope of this topic, but if needed we can also handle failures and act differently based on the state of the last process run.
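The query-then-update pattern can be sketched as follows. Again this is illustrative Python, with a dictionary standing in for the Dataverse state table and invented helper names; in a flow, these would be Dataverse "List rows" and "Add/Update a row" actions.

```python
# In-memory stand-in for the automation state table;
# maps processUid -> {"state": ..., "errorDetails": ...}.
state_table = {}

def try_start(process_uid):
    """Mark the process Running, unless another instance already is."""
    row = state_table.get(process_uid)
    if row is not None and row["state"] == "Running":
        return False  # already running elsewhere; back off
    state_table[process_uid] = {"state": "Running", "errorDetails": ""}
    return True

def finish(process_uid, error=None):
    """Record the outcome so the next run can inspect it."""
    state_table[process_uid] = {
        "state": "Failed" if error else "Complete",
        "errorDetails": error or "",
    }
```

Keeping the last run's state and error details around is what makes the failure-handling mentioned above possible later, even though it's out of scope here.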

Running Flows in a Queue

Why a Queue

Imagine we have some flow that is receiving many calls. Some of these calls can safely be processed simultaneously, while other calls are related to each other and must be handled sequentially.

If we limit the flow concurrency, performance across the board will suffer.

This is a problem that a queue can solve.

Queue Implementation

Let’s make it so that our flow now requires a processUid text parameter. Whoever calls this flow must provide the processUid.

The processUid should consistently represent a single process or resource. If our requests are linked to a SharePoint site, we might concatenate the URL with some other metadata like a list name or item ID. The idea is that two calls representing the same source/process/resource will always have the same processUid, while unrelated calls will have different ones.
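A small sketch of that concatenation idea (the inputs and normalization choices are illustrative; the only requirement is that identical resources always produce identical uids):

```python
def make_process_uid(site_url, list_name, item_id):
    """Build a deterministic processUid from the resource's identifying parts.

    Normalizing case and trailing slashes keeps two callers that describe
    the same resource slightly differently from producing different uids.
    """
    return f"{site_url.rstrip('/').lower()}|{list_name.lower()}|{item_id}"
```

Determinism is the whole point: hashing or random GUIDs would break the queue, because two related calls would no longer collide on the same uid.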

So far, this is really no different a concept from the automation state table. For this use case, I'm going to call it the requests table. Let's make sure it has the following fields:

  • processUid
  • state (available, running, complete, failed)
  • processing start time (date + time column)
  • processing completion time (date + time column)
  • processing delay (formula column - time difference between request creation and processing start time)
  • processing time (formula column - time difference between processing start and completion times)
  • total processing time (formula column - time difference between request creation and processing completion)

Rather than call our flow directly, we now call a wrapper flow that stores the requests in our requests table. By default, Dataverse will also store the request creation timestamp.

Whenever a request is added to the table, we query our requests table for all available requests. The key here is that when we query the requests table, we ensure the results are ordered by when they were created - first to last. We then run through the query results in a loop configured to run sequentially, to preserve that order.

Because each request has a processUid, we can use our automation state table to control processing.

By default, each request is sent to our original flow to be processed concurrently as we loop through. We should adjust the original flow so that when a request starts processing, its processUid is locked in the automation state table.

As we loop through the requests and send them off to be processed, we check the automation state table for an item matching each processUid. If a request's processUid is locked in the automation state table, we skip processing it.

We continue with this pattern until there are no more available requests in the requests table. This ensures that all requests with the same processUid are processed sequentially while allowing maximum concurrency for requests of different processUids.
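One dispatch pass over the queue can be sketched like this. The list of dictionaries stands in for the ordered Dataverse query results, the set stands in for locks in the automation state table, and `process_request` stands in for the call to the original flow - all illustrative names.

```python
def dispatch_pass(requests, locked_uids, process_request):
    """Walk available requests oldest-first, dispatching each unless its
    processUid is already locked; skipped requests wait for the next pass.

    requests: list of dicts with 'processUid' and 'state', ordered by creation.
    """
    for req in requests:
        if req["state"] != "available":
            continue  # already running or finished
        if req["processUid"] in locked_uids:
            continue  # same resource is mid-flight; keep ordering, try next pass
        locked_uids.add(req["processUid"])  # lock via the automation state table
        process_request(req)                # hand off to the original flow
        req["state"] = "running"
    return requests
```

Note how two requests sharing a processUid are forced into separate passes (sequential), while requests with distinct processUids dispatch in the same pass (maximum concurrency) - which is exactly the property the queue exists to provide.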

Loop Parallelization

Power Automate executes loop iterations in parallel by default. This behavior can be modified through the loop action’s settings.

When to Use Sequential Loops

Consider sequential execution when:

  • Operations within the loop must happen in a specific order
  • You’re working with rate-limited resources
  • Debugging complex operations

Configuring Loop Execution

To change a loop to sequential execution:

  1. Select the loop action
  2. Open its settings
  3. Enable the "Concurrency Control" option and set the degree of parallelism to 1

Conclusion

Remember that building robust automated processes requires careful consideration of these factors from the start. Taking the time to implement proper concurrency controls and resource protection measures will save countless hours of troubleshooting and prevent data integrity issues in the future.


For more information on Power Automate best practices, visit the official Microsoft Power Automate documentation and community forums.


About the Author

Benjamin Cloughessy

Adventurer

Benjamin is a software developer and student of the Word, passionate about both knowing and believing the Bible. He is particularly passionate about bringing biblical literacy to the charismatic part of Christ's body.