Solutions to the SharePoint 5000 item limit using Power Automate

April 19, 2020 by Paulie

If you need to get more than 5000 items from a SharePoint list, there are a number of different approaches. I’ve written up three different ways to solve the problem, which return exactly the same result, but have very different execution times. To get an overall understanding of this problem, check out the YouTube video I produced:

My requirement is to return 17,000 SharePoint list items to a PowerApp. I used Power Automate to get the data but found the standard “Get Items” action to be too slow. A good user experience is essential for application acceptance.

Eventually, I was able to reduce the amount of time that the flow takes to retrieve the items from nearly two minutes down to 8 seconds – almost 14 times faster!

Since I originally wrote this post, I have figured out a method that is even easier and faster! Check it out here.

Links to my sample data and all the flows created are at the end of the post.

My example data looks like this:

Image showing data from SharePoint list which is going to be used as a DataSource for a flow

It’s 7 columns wide and 17,000 rows long. Here is how long it took Power Automate to execute the flow with each method:

Method                                                                      Time to Execute
Standard “Get Items”                                                        111 seconds
Standard “Get Items” with Select action                                     67 seconds
Retrieving the same data with the SharePoint API                            31 seconds
Retrieving the same data with the SharePoint API and two parallel queries   19 seconds
Retrieving the same data with the SharePoint API and four parallel queries  13 seconds
Same as above with optimised flow                                           8 seconds

The table shows the results from six variations on three different themes. The fundamental principle behind each method is a “Do until” loop which collects the entire contents of a SharePoint list into an array. Each method has some pros and cons.
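To make that loop concrete, here is a rough sketch of its structure. The action and variable names are my own, not necessarily those used in the downloadable flows at the end of the post:

	Initialize variable: arrItems (Array) = []
	Initialize variable: intLastID (Integer) = 0
	Do until: length(outputs('Get_items')?['body/value']) is less than 5000
		Get items: Order By: ID, Top Count: 5000, Filter Query: ID gt @{variables('intLastID')}
		Compose: union(variables('arrItems'), outputs('Get_items')?['body/value'])
		Set variable: arrItems = @{outputs('Compose')}
		Set variable: intLastID = @{last(outputs('Get_items')?['body/value'])?['ID']}
	Respond to a PowerApp or flow: return arrItems

Each pass collects the next block of up to 5,000 items, and the loop exits when a block comes back short, meaning the end of the list has been reached. (The Compose step is needed because a Set variable action cannot reference its own variable.)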

Which method to use?

The right method to use depends on your circumstances. Here are some suggestions about which method may work best for you:

  • If your flow is not time sensitive, just use the standard “Get Items” action with a “Do Until” Loop in Power Automate. The flow is simple and easy to build.
    Get more than 5000 items from SharePoint with Power Automate
  • If you are already using the “Get Items” method but need it to be a bit faster, speed it up by adding the “Select” action.
    Get more than 5000 items from SharePoint with Power Automate
  • If the performance gained from the “Select” action still does not meet your requirements, switch to using the SharePoint API. It is almost the same as using “Get Items”, but drastically faster.
    Use the REST API in Flow to get over 5000 items from SharePoint
  • Finally, using parallel execution to retrieve chunks of SharePoint list data simultaneously increases speed further. But it does make the flow more complicated.
    Combine SharePoint API & Flow Parallel branches for Performance

Why is the SharePoint Get Items action so slow?

The “Get Items” action from SharePoint in Power Automate is very slow. The primary reason is the size of the payload that it returns from SharePoint. In addition to returning the columns defined in your SharePoint view, it also returns many other columns and metadata.

A single row of the sample data above results in this from SharePoint:

{
	"@odata.etag": "\"1\"",
	"ItemInternalId": "1",
	"ID": 1,
	"Title": "Terry",
	"EmpNo": 1.0,
	"Last": "Carter",
	"Furloughed": 1.0,
	"Telephone": "(462) 723-7695",
	"Email": "[email protected]",
	"Brand": "PowerBI",
	"{Identifier}": "Lists%252f5000%2bItem%2bTest1%252f1_.000",
	"{IsFolder}": false,
	"{Thumbnail}": {
		"Large": null,
		"Medium": null,
		"Small": null
	},
	"{Link}": "https://accendo1.sharepoint.com/sites/DocumentManagementTest/.....",
	"{Name}": "Terry",
	"{FilenameWithExtension}": "Terry",
	"{Path}": "Lists/5000 Item Test1/",
	"{FullPath}": "Lists/5000 Item Test1/1_.000",
	"{HasAttachments}": false,
	"{VersionNumber}": "1.0"
}

That’s a huge number of extra fields, which results in a massive payload. 5,000 rows of my sample data created a JSON response which was 3,545 KB.

The same 5,000 rows of data, when passed through the “Select” action, produce records which look like this:

{
	"Id": 1,
	"Title": "Terry",
	"EmpNo": 1.0,
	"Last": "Carter",
	"Furloughed": 1.0,
	"Telephone": "(462) 723-7695",
	"Email": "[email protected]",
	"Brand": "PowerBI",
	"ID": 1
}

For the same 5,000 records, the JSON response in the flow was only 772 KB – a massive reduction. So passing the response from Get Items through the Select action can make a big difference to how quickly your flow runs, and it is super easy to do.

Image of output of SharePoint Get Items action in Power Automate being reduced down with the Select action.
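If you have not used the “Select” action before, the mapping is just a set of key/expression pairs, one per column you want to keep. Here is a minimal sketch for my sample columns, assuming the preceding action is named ‘Get items’:

	Select
		From: @{outputs('Get_items')?['body/value']}
		Map:
			Id: @{item()?['ID']}
			Title: @{item()?['Title']}
			EmpNo: @{item()?['EmpNo']}
			Last: @{item()?['Last']}
			Furloughed: @{item()?['Furloughed']}
			Telephone: @{item()?['Telephone']}
			Email: @{item()?['Email']}
			Brand: @{item()?['Brand']}

Anything not named in the map is dropped from the output, which is where the payload reduction comes from.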

Use the SharePoint API for maximum speed

Although the “Select” action described above improves the speed of a flow, the “Get Items” action within the do…until loop is still retrieving a lot of unwanted data from SharePoint.

Using the SharePoint API allows you to get exactly the data you want without retrieving any unwanted columns or metadata.

Fortunately, this is very easy to do with the “Send an HTTP request to SharePoint” action. Compare these two flow actions:

Standard “Get Items” action.
The same request using the SharePoint API

These two actions essentially perform the same operation, but the second one returns only the requested columns. Here is the difference in execution time:

Image displaying the difference in execution time between Power Automate Get Items from SharePoint Action and the Same request to the SharePoint API.
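For reference, the kind of request shown above can be built in the “Send an HTTP request to SharePoint” action along these lines. The list name and columns come from my sample data; treat the URI as a sketch to adapt rather than the exact flow definition:

	Method: GET
	Uri: _api/web/lists/getbytitle('5000 Item Test1')/items?$select=ID,Title,EmpNo,Last,Furloughed,Telephone,Email,Brand&$orderby=ID&$top=5000&$filter=ID gt 0
	Headers: Accept: application/json; odata=nometadata

On each pass of the do until loop the filter becomes ID gt <last ID retrieved>, so every request returns the next block of up to 5,000 items, and the odata=nometadata header strips out the metadata bloat shown earlier.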

Use Parallel Branching to get batches of 5000 items simultaneously

Using the SharePoint API to get list data into Power Automate makes a huge difference to the speed of the operation, but it still retrieves the data in blocks of 5,000 items. With my 17,000 row sample, the do until loop has to execute four times to collect all of the data, which takes valuable time.

By using the parallel branch actions in Microsoft Power Automate we can engineer a solution which works much faster.

I tried collecting SharePoint data with two and four parallel branches. With four, I was able to get all 17,000 items in 8 seconds. The downside is that the flow is more complicated.
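The essence of the technique is that each branch collects its own chunk into its own array variable, for example by filtering on ID ranges (ID gt 0 and ID le 5000 in the first branch, and so on, assuming reasonably contiguous IDs). Once all branches complete, a single expression in a Compose action stitches the chunks back together:

	union(variables('arrItems'), variables('arrItems2'), variables('arrItems3'), variables('arrItems4'))

The variable names here are illustrative. The combined array can then be returned to a PowerApp, or fed into a create CSV table action if you want a file instead.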

Conclusion

The standard Get Items action in Power Automate is fine for basic operations, but if you need more control or more speed, there are better options.

I’m happy with the performance improvements I managed to create. Retrieving data from SharePoint is key to many applications and SharePoint handles the simultaneous requests well.

Example Data and Flows

You can download both my sample data and the flows discussed in these posts from here:

  • Test Data to upload to SharePoint list
  • v1 – Using the Get Items Action
  • v2 – Using the Get Items Action with Select action
  • v3 – Using SharePoint API to retrieve items
  • v4 – Using SharePoint API and Two Parallel Execution Branches
  • v6 – Using SharePoint API and Four Parallel Execution Branches

Adapting the flow to return SharePoint data to a jQuery grid

In the YouTube video above I also provided a demo of a jQuery grid that drew data from the SharePoint list using Power Automate. You can view that demo here:

Demo of jQuery based grid retrieving SharePoint data via Flow

You can easily copy the source code from that page; the flow has been left unsecured so that you can try it out.

Filed Under: Power Platform Tagged With: Power Automate, PowerApps, SharePoint Online

Comments

  1. Noor says

    July 12, 2020 at 8:15 pm

    Hi

    Can you please share the flow that was successfully optimized and completed within 8 seconds:
    “Same as above with optimised flow 8 Seconds”

    Thank you sooo much

    Really appreciated.
    Best regards

  2. Paulie says

    July 13, 2020 at 10:17 am

    It’s already in the post, in the section titled “Example Data and Flows”, and it’s called “v6 – Using SharePoint API and Four Parallel Execution Branches”.

  3. Ed says

    July 17, 2020 at 3:26 am

    I’m attempting to use this solution to export a SharePoint list with far more than 5000 items to a CSV file, and it all works until I need to take the data returned from the flow and put it into some type of table format so I can then create a CSV file on OneDrive.

    Any idea how to add the last bits to the end of the flow so I can return all the items to a file?

  4. Paulie says

    July 17, 2020 at 8:58 am

    I helped someone else who was trying to do the same thing and it worked fine. As there is obviously a need, do you want me to publish a flow that does it?

  5. Ed says

    July 17, 2020 at 1:17 pm

    Paulie,

    That would be so great of you if you would do a write-up on it.

    Many times I’ve found that there is a need to automate export of lists greater than 5000 records.

    I’m new to the Power Platform and I’ve been able to do the export to file with the 1st and 2nd flow examples, but using the API gives me an error when I attempt to create the file, so any guidance you could provide on exporting this information to a CSV/Excel file via flow would be greatly appreciated.

  6. Srisahithi says

    August 17, 2020 at 8:43 am

    Hi Paul,
    We are trying to use Microsoft Power Automate to insert/update items from a SharePoint list into an Excel file. The flow is taking a long time to run; for example, 3,000 records takes 12 hours.
    Why is the flow so slow? Is it because of the Standard license? Will upgrading the license increase the speed?

  7. Paulie says

    August 17, 2020 at 10:37 am

    For the purposes of clarity, are you trying to write to an existing Excel file or create a new one with the contents of the SharePoint data? It seems like a lot of people have this requirement, and I did say before that I was going to do a blog post on it (CSV/Excel). The license you are using is not the cause of your problem; if you look at the flow history you will be able to see why it is slow. I will have a look tonight and see if I can see how to achieve it.

  8. Paulie says

    August 17, 2020 at 3:23 pm

    I tried exporting the 17,000 rows of SharePoint data to a CSV file and it took 30 seconds. An Excel file is more complicated, so I would need more details to provide you with meaningful information.

  9. Noor says

    September 1, 2020 at 10:16 pm

    Hi, can you please advise how to use the output array of the 4 parallel branches to add/update list items.
    Usually we Get items > apply condition > create new item or update item in SharePoint lists.
    Thanks a lot

  10. KJ says

    September 4, 2020 at 1:58 pm

    Excellent way to operate on more than 5000 items in such a short time!!
    For collecting more than 5000 items of a SharePoint list to CSV (e.g. 100,000 in my case), use a Compose with union(variables('arrItems'),variables('arrItems2'),variables('arrItems3'),variables('arrItems4')) followed by a create CSV and a create file action.

    Hope it helps!

  11. Paulie says

    September 4, 2020 at 2:56 pm

    KJ, this is a really useful addition as many people have already asked me how to use this method to create CSV files (and I was going to do a separate post on it). So this is something that a lot of people want to do.

  12. RAJITHA POTLA says

    September 17, 2020 at 4:27 pm

    Hi, my array is not retrieved in PowerApps; instead it shows only value. Can you show me your binding code in PowerApps?

  13. Paulie says

    September 17, 2020 at 4:36 pm

    Sure, it is like this:

    ClearCollect(
        collectionName,
        'NameOfFlow'.Run()
    )

  14. RASHMI RANJAN SATAPATHY says

    September 18, 2020 at 6:23 pm

    Hello, I have a similar requirement. Can you please help and guide me on how to bind the array from the flow to a Power Apps data table? I think I am just getting true/false in Power Apps from flowname.Run().

  15. Pascal Garcia Esteves says

    November 10, 2020 at 4:10 pm

    Good afternoon Paul. First of all, thanks for the docs and YouTube explanation. I have a problem: I imported the flow and changed the directory, file and items that I want to select into PowerApps, but when I follow all the steps in PowerApps I get your headers and not the headers that I have in my file. Can you help?

    Many thanks

  16. Paulie says

    November 10, 2020 at 5:04 pm

    Hi Pascal,

    I think you just need to adjust the response body to suit your column names (you can use generate from sample and supply it with the output that you are generating). Then it should be correct.

    Paul

  17. Kavin says

    March 29, 2021 at 3:56 pm

    Hey Paul, great blog. I am trying to create an automated flow that's linked to an on-premises SharePoint list which exceeds the 5000 limit. The flow is intended to send an email upon modifying an item in the on-premises SharePoint list. However, the flow has failed to execute with the below error:

    "status": 502,
    "message": "The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator."

    Is there a way to make this flow work?

    NOTE: I am using an on-premises gateway to access the SharePoint.

  18. James Fraderic says

    April 19, 2021 at 9:55 pm

    Hi Paul,
    Really a great article. Thanks for sharing the knowledge. I was trying to implement the same in my case, but I'm facing an issue in the Combine arrays and respond area. I have changed the schema from

    {
        "type": "array",
        "items": {
            "type": "object",
            "properties": {
                "Id": { "type": "integer" },
                "Title": { "type": "string" },
                "EmpNo": { "type": "integer" },
                "Last": { "type": "string" },
                "Furloughed": { "type": "integer" },
                "Telephone": { "type": "string" },
                "Email": { "type": "string" },
                "Brand": { "type": "string" },
                "ID": { "type": "integer" }
            },
            "required": ["Id", "Title", "EmpNo", "Last", "Furloughed", "Telephone", "Email", "Brand", "ID"]
        }
    }

    to

    {
        "type": "array",
        "items": {
            "type": "object",
            "properties": {
                "Position_x0020_Title": { "type": "string" }
            },
            "required": ["Position_x0020_Title"]
        }
    }

    since I want only one column's values. But it's not accepting the new schema. Could you please guide me?

    While saving I am facing an error like "the schema definition for action with status code '200' is not valid. the schema definitions for actions with same status code must match."

  19. Paulie says

    April 19, 2021 at 10:12 pm

    It’s not accepting the new schema, because there are two response actions, you need to update both of them at the same time. If you update both, it will let you save it.

  20. Robert Wilson says

    June 28, 2021 at 4:56 pm

    Is there any way of making the number of arrays dynamic? I am looking to use this to take NEW data from an SQL query and update a SharePoint list with this information, then ultimately have this updated data pulled into a PowerApp to be used as a data source for another process.

    The issue is I will not necessarily always know what size the SharePoint list will be as it will be updated every hour.

  21. Paulie says

    June 28, 2021 at 5:45 pm

    Yes. I haven't used the SQL connector in depth; is there a 5,000 limit on that too? It's possible to make a dynamic number of arrays. I will demonstrate later on; I'm not at my computer at the moment.

  22. Jaime says

    June 29, 2021 at 7:18 pm

    Hi Paulie, I am new to Power Automate and using version 6. I inserted a create CSV file action and it is working fine. My question is: can you provide me with a way to filter on some criteria, say for all color='Red' and region='Asia'?

  23. Robert Wilson says

    June 30, 2021 at 12:41 pm

    Paulie,

    The SQL Get items seems to pull all the 23000+ items in one operation, and quickly. The main issue I have is that the data I need to pull will increase every time I run the flow, but the amount will differ each time. I am in effect trying to check through all the SQL data and ignore those values already in the SharePoint list, and update the list with the SQL data which ISN'T in the list. BUT as the list is over 23000 items I cannot get this to work, and each time I've tried I end up with tens of thousands of duplicates.

    The flow also needs to run on an hourly basis, making it even more critical to get the performance right.

  24. Paulie says

    June 30, 2021 at 2:36 pm

    Does the SQL data have a unique ID? If so, could you not change the SQL get items query to exclude any ID which is already in SharePoint? I'd be tempted to make a table in SQL, populate it with all the items that are already in SharePoint, and then create a view that gets everything from SQL where the ID is not in your SharePoint items table. Then it would only bring out the added items.

  25. robert Wilson says

    June 30, 2021 at 3:20 pm

    Paulie,

    Yes, there is a unique key field, BUT the SQL server is not in the Azure environment; it's on a different server in our organisation, so I am using an on-prem gateway to access the table. So there isn't really, as far as I know, an easy way of achieving what you suggested above, as the connection is one way to an ERP system.

  26. Paulie says

    June 30, 2021 at 3:23 pm

    Then I think you would have to do it like this:

    • Get all the items from SQL
    • Get all the items from SharePoint
    • Filter the list from SQL so that it does not contain any of the items found in SharePoint
    • Create items for whatever remains

    It’s not ideal, but it should work.

  27. Robert Wilson says

    June 30, 2021 at 3:30 pm

    That's the issue: the SharePoint list will be over 23000 items and increasing all the time.

    So I guess I am going to have to use your multiple concurrent get items option, then combine again into one array and compare the combined array with the SQL list?

    Or am I off on a tangent here?

  28. Paulie says

    June 30, 2021 at 3:53 pm

    I think you are along the right lines, but I don't really like the idea of it. Perhaps there is a better solution using a Power BI dataflow.

  29. Robert Wilson says

    June 30, 2021 at 4:14 pm

    To be honest I don't like the thought of it either. I actually found your videos after my first attempt created 347000 items, of which almost all were duplicate entries, as the flow wasn't checking the SQL against the SharePoint items properly. Your mass deletion took 4 hours but everything was cleared, and it was WAAAY faster than doing this manually or the normal get items and delete items method.

    But given the set of circumstances I have at the moment until the server is transferred into Azure, I cannot see any other way to get this done.

    Thanks – IF I get it to work I will post here for everyone to see.

  30. Rob says

    September 1, 2021 at 7:10 pm

    None of these work for me anymore. I know the ID of the 5,000th item in the list. The send HTTP request will fail if I try to filter for ID greater than the 5,000th item’s ID with top=200. The message says: The attempted operation is prohibited because it exceeds the list view threshold. I can select items just fine up to that 5,000th item, it just won’t go past that no matter the filter.

  31. Paulie says

    September 1, 2021 at 7:15 pm

    That's the list view threshold that has kicked in. Are you filtering on a property that isn't indexed?

  32. Rob says

    September 1, 2021 at 7:24 pm

    ID isn’t indexed? I checked indexed fields and ID isn’t an option to add.

  33. Paulie says

    September 1, 2021 at 10:20 pm

    ID should be indexed. I'd be interested to have a look to see why you're getting this problem.

  34. Rob says

    September 2, 2021 at 5:56 pm

    There may be something wrong with the list. I’ll try to recreate the list and see if I get the same results.

  35. Rick says

    December 8, 2021 at 5:33 pm

    Thanks for your help on this. I used your concept to read my list that has grown past 5000 items. Everything worked great until I tried to filter on anything other than ID. I ran some tests, and once the list went past 5000 items no filters would work, even if the filter column was indexed. I always get a 'bad gateway' response. The strange thing is the filters do work for the Get Items action, even on the big list.

    Any thoughts?

  36. Russ Herald says

    December 15, 2021 at 2:50 pm

    Paul,
    Many thanks for this post. I'm kind of a newbie with Power Automate, but I have made a flow that moves documents (including metadata) from one library to another, based on a Get Files (Properties Only) action that filters out documents older than a year based on a date field. The query works, but can I use one of your methods to scale it up to move an undetermined number of documents, perhaps growing to around 100,000?
    We're still using the free license for Power Automate on an E5 license.

  37. Paulie says

    December 15, 2021 at 3:18 pm

    Hi Russ,

    You certainly could use this method, but your move actions would still take a very long time. If you combined the method described in this post with my SharePoint batch actions post, you could move the files in batches of 1,000 at a time, which would be much faster. But if you are a beginner it might be quite tricky; the batch method is somewhat involved.

    Paul

  38. Adarsh Sharma says

    January 5, 2022 at 12:21 pm

    Hello Paul,

    I am quite new to Power Automate and so far I've created a few basic flows for my organisation. It would be great if you could help me out by telling me how I can create an HTML table using jQuery in Power Automate.

    Thank You!

  39. Paulie says

    January 5, 2022 at 12:24 pm

    Hi, I already made a demo of doing this, which is linked in this post; see here:
    Demo of jQuery grid from Power Automate
    You can use the source of that page to build your own.

  40. Russ Herald says

    January 5, 2022 at 1:35 pm

    Paul,
    Thanks for your reply. Gives me something new to puzzle over in this new year, as we’re rehashing the basic requirements. The SharePoint Batch suggestion is interesting.

  41. Adarsh Sharma says

    January 7, 2022 at 4:51 am

    Hello Paul,
    Thanks for your reply. I created an HTML table using the Compose action in Power Automate. Now what I want is to remove the borders from the rows of that table which are empty or contain no data. I am aware that this is possible through jQuery, but I don't know how to do it in Power Automate. It would be great if you could help me out with this.
    Thank You!

  42. Duncan says

    April 20, 2022 at 12:30 pm

    Hello again, Paul.
    I'm using your solution version 2 (get items with select) on my lists, but now I have trouble with a new large list of over 220,000 items because the flow runs in an endless loop. I would try to use the REST API (solution 3) but I have some errors in my flow because I'm a noob with Power Automate. Please, can you publish the entire flow like the previous solutions?
    Thank you.

  43. Duncan says

    May 2, 2022 at 6:51 pm

    Excuse me, I’m here again.
    You can forget my previous post because I tried your faster flow too.
    It's incredibly fast and it works for me, but my SharePoint lists are very large. One of them has more than 100,000 items and the other has more than 200,000 items. On both responses I can only get around 77% to 79% of the items, not all of them. And when I try to export to CSV by modifying the flow, my CSV file has only the first 20,000 items. Any advice or help is welcome. Thank you.

  44. Paulie says

    May 2, 2022 at 6:57 pm

    Hi Duncan – I'd be interested to take a look. I suspect that you're going over the 100 MB size limit, so you might have to subdivide the result set somehow. I don't know it for sure; it's just a hunch.

  45. Duncan says

    May 5, 2022 at 9:01 am

    Hi Paul, I have checked the size of the lists on my site. The largest (220,000 items) is 70 MB. I can successfully back up a smaller list (118,000 items and 34 MB) with your version 3 flow, but it never finishes with the other one. If you want, tell me how and I can send you any information you need. Thank you.

  46. Duncan says

    May 5, 2022 at 9:12 am

    Sorry, I forgot to say that I have another site with a list of 95,000 items and a size of 2.3 GB. I can also make a backup with your same flow. I suppose the key could be the number of items.

