Hi, the first thing I would ask is: why do you have that many rows in your robot's memory at the same time? Where do they come from? You should not read that many records at once…
I have seen users with 10 to 50 years of data from ERP systems, extracted for ML data analysis and for visualisations… it takes about an hour to fully download.
Other times it is plain text files with 100K to 500K lines of data, file sizes ranging from 20 to 50 MB, and memory requirements of 32 GB, 64 GB, or more.
Still, I see no point in trying to handle that many rows inside a single process at once… If you need to generate that kind of report, it should be done in much smaller batches…
I understand your point, but it will be almost 1 million rows of data, maybe a little more, so I just need to convert it into Excel and upload it.
I don't have to process it, since it is already processed in SQL; I am just fetching the data.
So, this is what I mean… Don't get all the rows from SQL at the same time. Fetch, say, 50,000 at a time, and then Write Range should have no problem…
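Here is a minimal sketch of that batched approach in Python, assuming a pyodbc-compatible SQL Server connection. The DSN, the table name `dbo.Report`, the output path, and the batch size are all placeholders, not anything from the original thread:

```python
import csv
import pyodbc

BATCH_SIZE = 50_000  # rows fetched per round trip, as suggested above

# Hypothetical connection string; replace with your own
conn = pyodbc.connect("DSN=MyErpDb;Trusted_Connection=yes")
cursor = conn.cursor()
cursor.execute("SELECT * FROM dbo.Report")  # data is already processed in SQL

with open("report.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(col[0] for col in cursor.description)  # header row
    while True:
        # Only BATCH_SIZE rows are held in memory at any one time
        rows = cursor.fetchmany(BATCH_SIZE)
        if not rows:
            break
        writer.writerows(rows)

cursor.close()
conn.close()
```

The CSV output opens directly in Excel; if a real .xlsx file is required, the same streaming pattern works with openpyxl's write-only workbook mode, which appends rows without keeping the whole sheet in memory.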