How to: Create an Excel file from a DataTable without using a Write Range activity

Hello All

Since I have more than 500,000 rows in my DataTable, I get an error whenever I try to use the Write Range activity.

I am trying to create an Excel file without using the Write Range activity.

Can anyone help me out with this? I have no idea how to do it.

Thanks & regards
Aryan

Hi @AryanS

You could limit the DataTable size by using dt.AsEnumerable.Skip([number of rows]).Take([number of rows]).CopyToDataTable.

Write the first batch with Write Range, and the subsequent batches with Append Range.

Should Write Range still fail, write to CSV, then open Excel and import the CSV (e.g. via Send Hotkey).
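The Skip/Take batching above can be sketched as follows. This is a Python illustration of the idea (in UiPath you would do this with activities or VB.NET in Invoke Code); the function name, batch size, and CSV output are all my own assumptions, not part of the original suggestion. The first batch plays the role of Write Range (creates the file with a header); later batches play the role of Append Range.

```python
import csv

def write_in_batches(rows, header, path, batch_size=50000):
    """Write rows to a CSV in batches: the first batch creates the
    file and writes the header (like Write Range), subsequent batches
    append without a header (like Append Range)."""
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        first = start == 0
        # "w" creates/overwrites on the first batch; "a" appends after that
        with open(path, "w" if first else "a", newline="") as f:
            writer = csv.writer(f)
            if first:
                writer.writerow(header)
            writer.writerows(batch)
```

Each batch is opened, written, and flushed separately, so only one batch of rows needs to be held in the output buffer at a time.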


Hi, the first thing I would ask you is: why do you have that many rows in your robot's memory at the same time? Where do they come from? You should not read that many records at once…

@GreenTea

I am not trying to split the DataTable; that I can already do. I am just stuck writing the file, since there is too much data.

Can't we use the Invoke Code activity for this?

Thanks

Hi @bcorrea

I have seen users with 10 to 50 years of data from ERP systems, extracted for ML data analysis and visualisations… it takes about an hour to fully download.

Other times it is just plain text files with 100K to 500K lines of data, with file sizes ranging from 20 to 50 MB and memory requirements of 32 GB, 64 GB or more.


Still, I see no point in trying to handle that many rows inside a single process at once… If you need to generate that kind of report, it should be done in much smaller batches…


@bcorrea
@GreenTea

I understand your point, but it will be almost 1 million rows of data, maybe a bit more, so I just need to convert it to Excel and upload it.
I don't have to process it, since it is already processed in SQL; I am just retrieving the data.

Hope this clears up the queries.

Thanks

So, this is what I mean… Don't get all the rows from SQL at the same time… Fetch, say, 50,000 at a time, and then Write Range should have no problem…
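The paged fetch above could be sketched like this. It is a Python illustration only (the actual SQL engine, table name, and page size are assumptions): each LIMIT/OFFSET page is pulled separately and appended to the output file, so the full million rows are never in memory at once. In UiPath the same idea would be an Execute Query per page followed by Write Range/Append Range.

```python
import csv
import sqlite3

def export_paged(conn, table, path, page_size=50000):
    """Export a SQL table to CSV one page at a time, using
    LIMIT/OFFSET so only page_size rows are in memory at once."""
    cur = conn.cursor()
    offset = 0
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        while True:
            # NOTE: table name is interpolated for illustration only;
            # in real code it must come from a trusted source
            cur.execute(
                f"SELECT * FROM {table} LIMIT ? OFFSET ?",
                (page_size, offset),
            )
            rows = cur.fetchall()
            if not rows:
                break  # past the last page
            writer.writerows(rows)
            offset += page_size
```

An ORDER BY on a stable key would normally be added so the pages don't overlap or skip rows between queries; keyset pagination (WHERE id > last_id) also scales better than OFFSET on very large tables.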
