Read extra-large Excel sheet in chunks

I am trying to read an Excel sheet with 350k+ rows, and I keep getting an out-of-memory error.
I have seen similar issues on this forum where people solved it by reading in chunks, but I can't quite figure out the workflow: how do I read the Excel sheet in chunks and still end up with one DataTable variable that I can then manipulate according to my requirements?

Can someone help? I need all the data from the Excel file in a single DataTable variable.

Hi,
I haven't tried this myself, so I can't say for certain that it will work, but I have read on some forums that ExcelDataReader is a lightweight library that helps with reading larger files.

There is a thread on Stack Overflow which explains how to use it -
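The core idea behind lightweight readers like ExcelDataReader is streaming: process one row at a time instead of loading the whole workbook into memory. As a rough illustration of that idea (not ExcelDataReader itself, which is a .NET library), the sketch below streams rows from an .xlsx file using only the Python standard library, since an .xlsx file is just a zip archive of XML parts. The `iter_rows` name, the fixed sheet path, and the inline-values-only assumption are all mine:

```python
import zipfile
import xml.etree.ElementTree as ET

# SpreadsheetML namespace used inside xlsx worksheet XML
NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def iter_rows(xlsx_path, sheet="xl/worksheets/sheet1.xml"):
    """Stream rows from an .xlsx worksheet one at a time, so memory use
    stays flat no matter how many rows the sheet has."""
    with zipfile.ZipFile(xlsx_path) as zf:
        with zf.open(sheet) as f:
            # iterparse yields elements as their closing tags are read
            for _, elem in ET.iterparse(f):
                if elem.tag == NS + "row":
                    # Caveat: this reads inline <v> values only; real sheets
                    # often store text in a separate shared-strings part.
                    yield [c.findtext(NS + "v") for c in elem.iter(NS + "c")]
                    elem.clear()  # release the parsed row immediately
```

Because `iter_rows` is a generator, you can consume 350k+ rows in a `for` loop without ever holding more than one row in memory.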

Let me know if you need any help.

Thanks,
Nishant

Hello @kasey.betts
Kindly refer to this thread; it will help you.

Regards,
Gokul Jai

I am not even sure where to start with integrating and testing this. Can you please help?

Hi,
Give me some time, I will try this and let you know.

Meanwhile, I suspect that 350k rows in a single DataTable variable might still throw an OutOfMemoryException anyway. Would it not be better to load X rows at a time into a database, process them, then fetch the next X rows, and so on?
You can find the highest workable X through trial and error (large enough to be fast, small enough that it doesn't slow down your system or throw OutOfMemory exceptions).
This is easy to do with the Read Range activity by specifying its Range property:
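The "load X rows at a time into a database" step above can be sketched with SQLite and fixed-size batches; only one batch is ever held in memory. The function name, the staging table, and its two text columns are made up for the sketch:

```python
import sqlite3

def load_in_batches(row_iter, db_path, batch_size=5000):
    """Insert rows from any iterator into SQLite in fixed-size batches,
    so at most `batch_size` rows are held in memory at once."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS staging (a TEXT, b TEXT)")
    batch = []
    for row in row_iter:
        batch.append(row)
        if len(batch) >= batch_size:
            con.executemany("INSERT INTO staging VALUES (?, ?)", batch)
            con.commit()
            batch.clear()  # drop the flushed rows from memory
    if batch:  # flush the final, partial batch
        con.executemany("INSERT INTO staging VALUES (?, ?)", batch)
        con.commit()
    con.close()
```

Once the data is staged in the database, you can process it with SQL queries instead of one giant in-memory table.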

First iteration: rows 1 to X (range A1 through row X)
Next iteration: rows X+1 to 2X (range A(X+1) through row 2X), and so on…
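The iteration above can be sketched as a small helper that yields the Range strings to feed into each Read Range call. The function name, the `A`–`Z` column span, and the single header row are illustrative assumptions, not part of the original post:

```python
def chunk_ranges(total_rows, chunk_size, first_col="A", last_col="Z", header_row=1):
    """Yield Excel range strings (e.g. "A2:Z5001") covering `total_rows`
    data rows in chunks of `chunk_size`, skipping `header_row` header rows."""
    start = header_row + 1           # first data row sits below the header
    last = header_row + total_rows   # absolute row number of the last data row
    while start <= last:
        end = min(start + chunk_size - 1, last)  # final chunk may be shorter
        yield f"{first_col}{start}:{last_col}{end}"
        start = end + 1
```

For example, `chunk_ranges(10, 4)` yields `"A2:Z5"`, `"A6:Z9"`, `"A10:Z11"`; in a loop, each string would go into the Range property of one Read Range call.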

Yes, I think that will be much easier, but I am unsure how to do that and then combine everything into a single DataTable variable holding the whole Excel spreadsheet.

Hi @kasey.betts ,

Could you check whether the below component will help you read the Excel file in one pass:

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.