I have an Excel file with nearly 17,000 rows, and I want to add all the rows to a queue, each with a unique reference.
I looped through the DataTable and added the items to the queue one by one, but it is taking more than 2 hours. Is there a faster way to add the queue items?
Hi @M1288 ,
Please try the Bulk Add Queue Items activity to add the items to the queue.
Also, instead of relying on the unique reference, can you try filtering the values in Excel before reading the file?
Hi @Dharunya_Devi, I want to add a reference to each item. And the Bulk Add Queue Items activity only accepts up to 15,000 items, correct?
As per the documentation, yes, it allows 15,000 items at a time. Can the Excel file be split into 2 files?
You need to build a flow that splits the DataTable into smaller chunks and uploads the chunks one by one using Bulk Add Queue Items, so that each chunk holds fewer than 15,000 rows.
To take the first N rows of the DataTable, use: `dt.AsEnumerable().Take(N).CopyToDataTable()`
Then skip the first N rows using: `dt.AsEnumerable().Skip(N).CopyToDataTable()`
You can adapt these expressions to your own logic; you will need some looping to process each chunk in turn.
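The chunking logic above can be sketched language-neutrally as follows. This is a minimal illustration of splitting 17,000 rows into slices under the 15,000-item Bulk Add Queue Items limit; the function name `chunk_rows` and the stand-in row list are hypothetical, and the UiPath activity call itself is not shown (in Studio you would apply the `Skip`/`Take` expressions above inside a While loop instead).

```python
def chunk_rows(rows, chunk_size=15000):
    """Yield successive slices of at most chunk_size rows.

    Mirrors the Skip(start).Take(chunk_size) pattern you would
    use on a DataTable in UiPath."""
    for start in range(0, len(rows), chunk_size):
        yield rows[start:start + chunk_size]

rows = list(range(17000))           # stand-in for the 17,000 DataTable rows
chunks = list(chunk_rows(rows))
print([len(c) for c in chunks])     # → [15000, 2000]
```

Each chunk is then small enough to pass to Bulk Add Queue Items in a single call, so 17,000 rows need only two bulk uploads instead of 17,000 individual Add Queue Item calls.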
17,000 rows isn't large enough to explain that runtime. What data manipulations do you perform while adding the data? Can you share them?
I'm asking because with Bulk Add Queue Items you will need to prepare the data in the same way.
Also, if you are doing a speed test, especially in flows that use loops, run them with the Run command rather than in Debug mode.
Hi @saurabhB, let me try it and I will come back. Thank you.
Hi @muhammedyuzuak, I am validating a single column across the 17,000 rows; if a value is valid, I add the row to a DataTable, and then I have to add that data to the queue. But it is taking a long time.
Any sample workflow you can share would be very helpful.
Can anyone please let me know how to split the DataTable so the items can be uploaded to the queue using Bulk Add Queue Items? A sample workflow would be very helpful.
Hi @saurabhB, can you please share a sample workflow? It would be very helpful.