Handling large Excel records

Hello, I have an API call which takes an Excel file (the input file) as the body resource to process.
The input file is created from raw API GET calls, which eventually make an HTTP request to fetch the output resource file.

Now my concern is that the input .xlsx file has a huge number of records, around 40k. I need to split it into chunks of about 5k each and send each chunk to the HTTP call. Can someone give me the logic to split this Excel file?

Appreciate your help.

@adhishc - you can use Read/Write Range activities to split the Excel file based on row count:

Read the Excel file with Read Range to get a DataTable

If DT.Rows.Count > 0 Then
    If firstSplit Then
        ' first chunk: rows 0 to 4999
        DT_5k = DT.AsEnumerable.Skip(0).Take(5000).CopyToDataTable
        firstSplit = False
    Else
        ' next chunk: rows 5000 to 9999
        DT_5k = DT.AsEnumerable.Skip(5000).Take(5000).CopyToDataTable
    End If
End If

Use Write Range to write each DT_5k DataTable out to Excel
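The Skip/Take idea above can be sketched outside UiPath like this (a minimal Python sketch; a plain list stands in for the DataTable, and `records` and `CHUNK` are illustrative names, not UiPath APIs):

```python
# Minimal sketch: list slicing plays the role of Skip(...).Take(...).
records = [f"row{i}" for i in range(12_000)]  # stand-in for the DataTable rows
CHUNK = 5_000

first_chunk = records[0:CHUNK]             # Skip(0).Take(5000)
second_chunk = records[CHUNK:2 * CHUNK]    # Skip(5000).Take(5000)

print(len(first_chunk), len(second_chunk))  # 5000 5000
```

Each slice is what a Write Range call would then write to its own file.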

Hi @adhishc

Please find the below workflow
chunks.zip (19.0 KB)


Mark as solution if found helpful :slight_smile:

This copies only 5k records. The thing is, I have 40,000 records; when I use Write Range I need to write 5k records into 8 different Excel files with different names (e.g. Book1…Book8.xlsx).

Please use a loop based on the row count, e.g. number of iterations = Ceiling(row count / 5000) (round up, so a partial last chunk is not dropped),
and write the filename like book_iteration_no
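The loop-based split described above can be sketched as follows (a Python sketch; a plain list stands in for the DataTable, and the `Book<i>.xlsx` naming follows the example earlier in the thread):

```python
import math

# Split 40,000 stand-in rows into 5,000-row chunks, one file name per chunk.
records = list(range(40_000))
CHUNK = 5_000

# Round UP, not round(): 40,001 rows would need 9 files, not 8.
iterations = math.ceil(len(records) / CHUNK)

files = {}
for i in range(iterations):
    chunk = records[i * CHUNK:(i + 1) * CHUNK]   # Skip(i*5000).Take(5000)
    filename = f"Book{i + 1}.xlsx"               # "book_iteration_no" naming
    files[filename] = chunk                      # a real workflow would Write Range here

print(iterations, sorted(files)[0])  # 8 Book1.xlsx
```

With exactly 40,000 rows this yields 8 files of 5,000 rows each, and the ceiling makes sure any leftover rows still get a final, smaller file.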


This works!! I only need to handle the header part.
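For the header point above, one way to handle it is to prepend the header row to every chunk before writing, so each output file is self-describing (a Python sketch; the column names are made up for illustration):

```python
# Sketch: repeat the header row in every chunk.
rows = [["Name", "Amount"]] + [[f"n{i}", i] for i in range(11_000)]
header, data = rows[0], rows[1:]
CHUNK = 5_000

# Each chunk = header + up to 5,000 data rows.
chunks = [[header] + data[i:i + CHUNK]
          for i in range(0, len(data), CHUNK)]

print(len(chunks), len(chunks[0]))  # 3 5001  (header + 5000 data rows)
```

In UiPath the same effect is usually achieved by leaving the column names on the DataTable and enabling the headers option on Write Range, so every chunk file starts with the header row.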

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.