Usually, 16,000 rows won't take that much time even with a For Each Row.
I'm wondering what kind of operations are carried out on each row.
And is it necessary to process all the rows?
If possible, first filter down to only the records that actually have to be processed, and then iterate over that subset with the For Each Row activity.
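A minimal sketch of the filter-first idea, in Python for illustration (in UiPath the equivalent would be a Filter Data Table activity or `dt.Select(...)` before the For Each Row). The `Status`/`Pending` column and values are hypothetical:

```python
import csv
import io

# Hypothetical input: a table with a "Status" column; only "Pending" rows need work.
raw = io.StringIO(
    "Id,Status\n"
    "1,Pending\n"
    "2,Done\n"
    "3,Pending\n"
)

rows = list(csv.DictReader(raw))

# Filter first, then loop only over the rows that actually need processing,
# instead of iterating all rows and checking the condition inside the loop.
pending = [r for r in rows if r["Status"] == "Pending"]

for row in pending:
    # ... per-row processing goes here ...
    print(row["Id"])
```

The win is that the expensive per-row work only runs on the subset that needs it, and the loop body stays simple.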
There are many other approaches to handle such a large input. We can either split the input and keep it as multiple Excel files in a folder, or keep all the records in a queue in Orchestrator with the help of a Dispatcher process and let multiple bots process them simultaneously, which will reduce the total time taken to process those records.
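A rough sketch of that dispatcher/performer pattern, using Python threads and an in-memory queue as a stand-in for the Orchestrator queue and multiple bots. The record contents and the per-record work are placeholders:

```python
import queue
import threading

work = queue.Queue()   # stand-in for the Orchestrator queue
results = []
lock = threading.Lock()

def dispatcher(records):
    # The Dispatcher's only job: load every work item into the shared queue.
    for rec in records:
        work.put(rec)

def performer():
    # Each Performer ("bot") pulls items until the queue is drained.
    while True:
        try:
            rec = work.get_nowait()
        except queue.Empty:
            return
        processed = rec * 2  # placeholder for the real per-record work
        with lock:
            results.append(processed)

dispatcher(range(100))
bots = [threading.Thread(target=performer) for _ in range(4)]
for b in bots:
    b.start()
for b in bots:
    b.join()

print(len(results))  # all 100 records processed across 4 "performers"
```

Because each item is an independent queue transaction, adding more performers scales the throughput roughly linearly, which is the whole point of splitting dispatcher from performer.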
I did some tests and had to get to 1 million rows before any noticeable performance difference appeared, and even then it was only a few seconds. 5 million rows still only took 7 seconds for a For Each with a column-value update.
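A scaled-down analogue of that kind of test, in Python: update one "column" on every row and time the loop. The row count here is 100,000 (not millions) just to keep the sketch quick; the takeaway is that a plain per-row update is linear and cheap, so 16,000 rows should be nowhere near a bottleneck:

```python
import time

# Build a table of rows with a "Serial" column and an empty "Flag" column.
rows = [{"Serial": i, "Flag": ""} for i in range(100_000)]

start = time.perf_counter()
for row in rows:
    row["Flag"] = "processed"  # the per-row column-value update being timed
elapsed = time.perf_counter() - start

print(f"{len(rows)} rows updated in {elapsed:.3f}s")
```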
Show us each individual thing it is doing, and the expression being processed. For example, in the row["Serial_Numb… Assign, what's the expression on the right-hand side? Show us each expression. Maybe there is a more efficient way.