If you need to get more than 5000 items from a SharePoint list, there are a number of different approaches. I’ve written up three different ways to solve the problem, which return exactly the same result, but have very different execution times. To get an overall understanding of this problem, check out the YouTube video I produced:
My requirement is to return 17,000 SharePoint list items to a PowerApp. I used Power Automate to get the data but found the standard “Get Items” action to be too slow. A good user experience is essential for application acceptance.
Eventually, I was able to reduce the time the flow takes to retrieve the items from 111 seconds down to 8 seconds. That’s almost 14 times faster!
Since I originally wrote this post, I have figured out a method that is even easier and faster! Check it out here.
Links to my sample data and all the flows created are at the end of the post.
My example data looks like this:

It’s 7 columns wide and 17,000 rows long. Here is how long Power Automate took to execute the flow using each method:
Method | Time to Execute |
---|---|
Standard “Get Items” | 111 Seconds |
Standard “Get Items” with Select Action | 67 Seconds |
Retrieving the same data with the SharePoint API | 31 Seconds |
Retrieving the same data with the SharePoint API and two parallel queries | 19 Seconds |
Retrieving the same data with the SharePoint API and four parallel queries | 13 Seconds |
Same as above with an optimised flow | 8 Seconds |
The table shows the results from six variations on three different themes. The fundamental principle behind each method is a “Do until” loop which collects the entire contents of a SharePoint list into an array. Each method has its own pros and cons.
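As a rough sketch of that pattern (the action and variable names here are illustrative rather than the exact ones used in my sample flows, and paging on the indexed ID column is just one common way to step through the list):

```
Initialize variable  arrItems  (Array)   = []    -- accumulates every item retrieved so far
Initialize variable  intLastID (Integer) = 0     -- highest ID collected so far

Do until   length(body('Get_items')?['value']) is less than 5000
    Get items       -- Order By: ID, Top Count: 5000, Filter Query: ID gt @{variables('intLastID')}
    Compose         -- union(variables('arrItems'), body('Get_items')?['value'])
    Set variable    -- arrItems = outputs('Compose')   (a variable cannot reference itself directly)
    Set variable    -- intLastID = last(body('Get_items')?['value'])?['ID']
```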
Which method to use?
The right method to use depends on your circumstances. Here are some suggestions on which method may work best for you:
- If your flow is not time sensitive, just use the standard “Get Items” action with a “Do Until” loop in Power Automate. The flow is simple and easy to build: Get more than 5000 items from SharePoint with Power Automate
- If you are already using the “Get Items” method but need it to be a bit faster, enhance its speed by adding the “Select” action: Get more than 5000 items from SharePoint with Power Automate
- If the performance gained from the “Select” action still does not meet your requirements, switch to using the SharePoint API. It is almost the same as using “Get Items”, but drastically faster: Use the REST API in Flow to get over 5000 items from SharePoint
- Finally, using parallel execution to retrieve chunks of SharePoint list data simultaneously increases the speed further, but it does make the flow more complicated: Combine SharePoint API & Flow Parallel branches for Performance
Why is the SharePoint Get Items action so slow?
The “Get Items” action in Power Automate is very slow. The primary reason is the size of the payload that it returns from SharePoint: in addition to the columns defined in your SharePoint view, it also returns many other columns and metadata.
A single row of the sample data above results in this from SharePoint:
{ "@odata.etag": "\"1\"", "ItemInternalId": "1", "ID": 1, "Title": "Terry", "EmpNo": 1.0, "Last": "Carter", "Furloughed": 1.0, "Telephone": "(462) 723-7695", "Email": "[email protected]", "Brand": "PowerBI", "{Identifier}": "Lists%252f5000%2bItem%2bTest1%252f1_.000", "{IsFolder}": false, "{Thumbnail}": { "Large": null, "Medium": null, "Small": null }, "{Link}": "https://accendo1.sharepoint.com/sites/DocumentManagementTest/.....", "{Name}": "Terry", "{FilenameWithExtension}": "Terry", "{Path}": "Lists/5000 Item Test1/", "{FullPath}": "Lists/5000 Item Test1/1_.000", "{HasAttachments}": false, "{VersionNumber}": "1.0" }
That’s a huge number of extra fields, which results in a massive payload. 5,000 rows of my sample data created a JSON response of 3,545 KB.
The same 5,000 rows of data when passed through the “Select” action produces a JSON response which looks like this:
{ "Id": 1, "Title": "Terry", "EmpNo": 1.0, "Last": "Carter", "Furloughed": 1.0, "Telephone": "(462) 723-7695", "Email": "[email protected]", "Brand": "PowerBI", "ID": 1 }
For the same 5,000 records, the JSON response in the flow was only 772 KB, a massive reduction. So passing the response from “Get Items” through the “Select” action can make a big difference to how quickly your flow will run, and it is super easy to do.
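The “Select” action itself is nothing more than the “Get Items” output plus a map of the columns to keep. A minimal sketch, assuming the SharePoint action is named “Get items” and using the column names from my sample data:

```
From:  body('Get_items')?['value']

Map:
  Id          item()?['ID']
  Title       item()?['Title']
  EmpNo       item()?['EmpNo']
  Last        item()?['Last']
  Furloughed  item()?['Furloughed']
  Telephone   item()?['Telephone']
  Email       item()?['Email']
  Brand       item()?['Brand']
```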

Use the SharePoint API for maximum speed
Although the “Select” action in Power Automate described above improves the speed of a flow within a do…until loop, the “Get Items” action is still retrieving a lot of unwanted data from SharePoint.
Using the SharePoint API allows you to get exactly the data you want without retrieving any unwanted columns or metadata.
Fortunately, this is very easy to do with the “Send an HTTP request to SharePoint” action. Compare these two flow actions:


These two actions essentially perform the same operation, but the second one returns only the requested columns. Here is the difference in execution time:

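For reference, the query behind the second action looks something like this (the list title and column names come from my sample data, so adjust them to match your own list; the Accept header asks SharePoint to return lightweight JSON without the extra metadata):

```
Method:   GET
Uri:      _api/web/lists/getbytitle('5000 Item Test1')/items?$select=ID,Title,EmpNo,Last,Furloughed,Telephone,Email,Brand&$top=5000
Headers:  Accept: application/json; odata=nometadata
```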
Use Parallel Branching to get batches of 5000 items simultaneously
Using the SharePoint API to get list data into Power Automate makes a huge difference to the speed of the operation, but it still retrieves the data in blocks of 5,000 items. With my 17,000 row sample, the do-until loop has to execute four times to collect all of the data, and each iteration takes valuable time.
By using parallel branches in Microsoft Power Automate, we can engineer a solution which works much faster.
I tried collecting the SharePoint data with two and four parallel branches. With four branches (and a little further optimisation), I was able to get all 17,000 items in 8 seconds. The downside is that the flow is more complicated.
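One way to split the work across branches (a sketch of the idea rather than the exact configuration of my sample flows) is to give each branch its own ID range. ID is indexed by default, so a range filter that returns no more than 5,000 items stays under the list view threshold. Each branch runs the same $select query shown earlier with an extra $filter, appends its results to its own array variable, and a final compose merges the arrays:

```
Branch 1:  $filter=ID le 5000
Branch 2:  $filter=ID gt 5000 and ID le 10000
Branch 3:  $filter=ID gt 10000 and ID le 15000
Branch 4:  $filter=ID gt 15000 and ID le 20000

Combine:   union(variables('arrItems'), variables('arrItems2'),
                 variables('arrItems3'), variables('arrItems4'))
```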
Conclusion
The standard “Get Items” action in Power Automate is fine for basic operations, but if you need more control or speed, there are better options.
I’m happy with the performance improvements I managed to achieve. Retrieving data from SharePoint is key to many applications, and SharePoint handles the simultaneous requests well.
Example Data and Flows
You can download both my sample data and the flows discussed in these posts from here:
- Test Data to upload to SharePoint list
- v1 – Using the Get Items Action
- v2 – Using the Get Items Action with Select action
- v3 – Using SharePoint API to retrieve items
- v4 – Using SharePoint API and Two Parallel Execution Branches
- v6 – Using SharePoint API and Four Parallel Execution Branches
Adapting the flow to return SharePoint data to a jQuery grid
In the YouTube video above I also provided a demo of a jQuery grid that drew data from the SharePoint list using Power Automate. You can view that demo here:
Demo of jQuery based grid retrieving SharePoint data via Flow
You can easily copy the source code from that page; the flow has been left unsecured so that you can try it out.
Noor says
Hi
Can you please share the flow that was successfully optimised and completed within 8 seconds?
“Same as above with optimised flow 8 Seconds”
Thank you sooo much
Really appreciated.
Best regards
Paulie says
It’s already in the post, in the section titled “Example Data and Flows”, and it’s called “v6 – Using SharePoint API and Four Parallel Execution Branches”.
Ed says
I’m attempting to use this solution to export a SharePoint list with far more than 5000 items to a CSV file. It all works until I need to take the data returned from the flow and put it into some type of table format so I can then create a CSV file on OneDrive.
Any idea how to add the last bits to the end of the flow so I can return all the items to a file?
Paulie says
I helped someone else who was trying to do the same thing and it worked fine. As there is obviously a need, do you want me to publish a flow that does it?
Ed says
Paulie,
That would be so great of you if you would do a write-up on it.
Many times I’ve found that there is a need to automate the export of lists greater than 5000 records.
I’m new to the Power Platform and I’ve been able to do the export to a file with the 1st and 2nd flow examples, but using the API gives me an error when I attempt to create the file, so any guidance you can provide on exporting this information to a CSV/Excel file via flow would be greatly appreciated.
Srisahithi says
Hi Paul,
We are trying to use Microsoft Power Automate to insert/update items from a SharePoint list into an Excel file. The flow is taking a long time to run; for example, 3,000 records takes 12 hours.
Why is the flow so slow? Is it because of the Standard license? Will upgrading the license increase the speed?
Paulie says
For the purposes of clarity, are you trying to write to an existing Excel file or create a new one with the contents of the SharePoint data? It seems like a lot of people have this requirement, and I did say before that I was going to do a blog post on it (CSV/Excel). The license you are using is not the cause of your problem; if you look at the flow run history you will be able to see why it is slow. I will have a look tonight and see if I can work out how to achieve it.
Paulie says
I tried exporting the 17,000 rows of SharePoint data to a CSV file and it took 30 seconds. An Excel file is more complicated, so I would need more details to provide you with meaningful information.
Noor says
Hi, can you please advise how to use the output array of the 4 parallel branches to add/update list items?
Usually we Get items > apply a condition > create a new item or update an item in SharePoint lists.
Thanks a lot
KJ says
An excellent way to operate on more than 5000 items in so little time!!
To collect more than 5000 items from a SharePoint list into a CSV (e.g. 100,000 in my case), use a Compose action with union(variables('arrItems'), variables('arrItems2'), variables('arrItems3'), variables('arrItems4')) followed by the Create CSV table and Create file actions.
Hope it helps!
Paulie says
KJ, this is a really useful addition as many people have already asked me how to use this method to create CSV files (and I was going to do a separate post on it). So this is something that a lot of people want to do.
RAJITHA POTLA says
Hi, my array is not retrieved in Power Apps, instead it shows only value. Can you show me your binding code in Power Apps?
Paulie says
Sure, it is like this:
ClearCollect(collectionName, 'NameOfFlow'.Run())
RASHMI RANJAN SATAPATHY says
Hello, I have a similar requirement. Can you please help and guide me on how to bind the array from the flow to a Power Apps data table? I think I am just getting true/false in Power Apps from flowname.Run().
Pascal Garcia Esteves says
Good afternoon Paul, first of all thanks for the docs and YouTube explanation. I have a problem: I imported the flow and changed the directory, file and items that I want to send to Power Apps, but when I follow all the steps in Power Apps I get your headers and not the headers that I have in my file. Can you help?
Many thanks
Paulie says
Hi Pascal,
I think you just need to adjust the Response body schema to suit your column names (you can use “Generate from sample” and supply it with the output that you are generating). Then it should be correct.
Paul
Kavin says
Hey Paul, great blog. I am trying to create an automated flow that’s linked to an on-premises SharePoint list which exceeds the 5000 item limit. The flow is intended to send an email upon modifying an item in the on-premises SharePoint list. However, the flow has failed to execute with the below error:
“status”: 502,
“message”: “The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator.”
Is there a way to make this flow work?
NOTE: I am using an on-premises gateway to access SharePoint.
James Fraderic says
Hi Paul,
Really, I found this a great article. Thanks for sharing the knowledge. I was trying to implement the same in my case, but I am facing an issue in the “Combines arrays and respond” area. I have changed the schema from:
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "Id": { "type": "integer" },
      "Title": { "type": "string" },
      "EmpNo": { "type": "integer" },
      "Last": { "type": "string" },
      "Furloughed": { "type": "integer" },
      "Telephone": { "type": "string" },
      "Email": { "type": "string" },
      "Brand": { "type": "string" },
      "ID": { "type": "integer" }
    },
    "required": ["Id", "Title", "EmpNo", "Last", "Furloughed", "Telephone", "Email", "Brand", "ID"]
  }
}
to
{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "Position_x0020_Title": { "type": "string" }
    },
    "required": ["Position_x0020_Title"]
  }
}
since I only want one column’s values. But it’s not accepting the new schema. Could you please guide me?
While saving, I am facing an issue like: “The schema definition for action with status code ‘200’ is not valid. The schema definitions for actions with the same status code must match.”
Paulie says
It’s not accepting the new schema because there are two Response actions; you need to update both of them at the same time. If you update both, it will let you save it.