Hi all,
I’m automating a process that involves reading and processing large Excel files (50,000+ rows). The current workflow using Read Range and looping through rows is very slow.
What are some ways to optimize performance when working with big Excel datasets in UiPath?
A few options to try:

- Use Workbook Read Range instead of Excel Read Range; it reads the file directly and doesn't need the Excel application installed or open.
- Do a single Read Range and then filter or query the resulting DataTable with LINQ or DataTable.Select, instead of looping over every row.
- Split large files into smaller chunks, or import the data into a database or CSV for faster processing.
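To illustrate the second point in a language-agnostic way, here is a minimal Python sketch (not UiPath code; the column names and filter condition are hypothetical). The idea is the same as `DataTable.Select("Status = 'Open' AND Amount > 100")` or an equivalent LINQ query: express the filter once and apply it in a single pass, rather than writing per-row logic inside a loop of activities.

```python
# Rows as dictionaries, standing in for DataTable rows (column names hypothetical)
rows = [
    {"Status": "Open", "Amount": 120},
    {"Status": "Closed", "Amount": 80},
    {"Status": "Open", "Amount": 300},
    {"Status": "Open", "Amount": 50},
]

# One declarative filter pass, analogous to
# DataTable.Select("Status = 'Open' AND Amount > 100") in .NET
selected = [r for r in rows if r["Status"] == "Open" and r["Amount"] > 100]
print(len(selected))  # 2
```

In UiPath the same shape applies: a single Read Range followed by one Filter Data Table / LINQ expression is far cheaper than a For Each Row loop with an If inside it.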
I created a sample Studio project for this a while back (tested with Studio/Robot 2025.0.166).
The workflow reads a large CSV file as text, splits it into smaller batches of a configurable row count, saves each batch as a separate CSV file, and then reads each generated file in turn. It also forces a garbage collection and releases memory before reading each new CSV file. This should help anyone who needs to process very large CSV files and read them into a DataTable.
For a CSV with 100,001 rows and 12 columns, the execution takes 5-6 seconds.
For this sample CSV with 2,000,000 rows and 12 columns ( Google Drive - Virus scan warning ), it takes 3 minutes and 5 seconds.