Handling larger Excel files

The dispatcher retrieves an .xlsx file from an automated email. The file contains approximately 100,000 rows of data across 10 columns. Before uploading the data to Orchestrator, some filtering is required. Would it be more efficient to import the entire file into UiPath Studio as a data table and apply the filtering there, or should I consider performing part or all of the filtering in Excel before loading it into Studio? I’d appreciate any suggestions on which approach might be more efficient.

It’s recommended to use Workbook activities to read the data and apply the filtering directly within Studio. This method is generally more efficient because it reads the file directly without opening the Excel application. If you use Excel activities instead, the processing time can increase significantly, especially with large datasets. However, if the data size grows beyond a certain threshold, filtering directly in Excel might become more practical. If you notice that the reading or filtering process is taking longer than anticipated, you might want to explore treating Excel like a database. That lets you run SQL-like queries against the file, which are much faster and could significantly improve performance.
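For illustration, here is a minimal sketch of the "Excel as a database" idea, assuming a C# Invoke Code activity, the Microsoft ACE OLEDB provider installed on the machine (plus the System.Data.OleDb package on modern .NET), and placeholder names for the file path, sheet (`Sheet1`) and columns (`Status`, `Amount`):

```csharp
using System.Data;
using System.Data.OleDb;

// Connection string for an .xlsx file; HDR=YES treats the first row as headers.
// The file path is a placeholder for illustration only.
var connectionString =
    @"Provider=Microsoft.ACE.OLEDB.12.0;" +
    @"Data Source=C:\Data\input.xlsx;" +
    @"Extended Properties=""Excel 12.0 Xml;HDR=YES""";

var filtered = new DataTable();

using (var connection = new OleDbConnection(connectionString))
using (var command = new OleDbCommand(
    // SQL-like query against the sheet; column names are hypothetical.
    "SELECT * FROM [Sheet1$] WHERE [Status] = 'Open' AND [Amount] > 1000",
    connection))
{
    connection.Open();
    using (var adapter = new OleDbDataAdapter(command))
    {
        // Only the rows matching the WHERE clause are loaded into memory,
        // instead of the full 100,000-row sheet.
        adapter.Fill(filtered);
    }
}
```

With this approach the filtering happens while reading, so the full data table never has to be built in Studio before being trimmed down.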

Hi @Pratiksha_Kadam:

Please see the suggestion below. If it helps, please mark the topic as resolved.

Generally, for very large spreadsheets or databases, the most recommended way to improve performance is to use LINQ queries. Here is a video tip that can help you with the first steps: https://www.youtube.com/watch?v=QGjlSeEVmws&list=PL8Przw6Rdrj553wll-n9veqGS-Y9WtlaT&index=2
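As a rough sketch of what such a LINQ filter can look like (assuming a C# project, a data table `inputDt` produced by a Workbook Read Range, and hypothetical column names `Status` and `Amount`):

```csharp
using System;
using System.Data;
using System.Linq;

// inputDt is the DataTable returned by a Workbook Read Range activity.
var matching = inputDt.AsEnumerable()
    .Where(row => row.Field<string>("Status") == "Open"
               && row["Amount"] != DBNull.Value
               && Convert.ToDouble(row["Amount"]) > 1000);

// CopyToDataTable throws when nothing matches, so fall back to an empty
// table with the same schema in that case.
DataTable filteredDt = matching.Any()
    ? matching.CopyToDataTable()
    : inputDt.Clone();
```

The filtered table can then be passed on to the Orchestrator upload step.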

@Pratiksha_Kadam

Welcome to the community

If the amount of data shrinks a lot after filtering, it is better to filter in Excel first and then read the smaller result into Studio.

Also, since you are only reading a huge amount of data, it is better to use Excel as a database to avoid memory and performance issues.

Cheers
