What's the best way to handle large records in a DataTable?

Hello guys, I have about 8 CSVs, each containing about 380,000 records or more. I need to process them all the same way, as they are all transactions for a single day.
Although it was difficult to read them all into one DataTable, I managed to do it. I did this in chunks, but the workflow that handles it is not stable; sometimes it breaks with the error "Invoke workflow: Child job stopped with unexpected exit code 0x000003E8."
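
For reference, here is a minimal sketch of that kind of chunked load, assuming plain .NET (e.g. inside an Invoke Code activity), comma-delimited files with a header row and no quoted commas; the class name and paths are illustrative:

```csharp
// Sketch: stream each CSV line by line instead of loading whole files at once.
using System.Data;
using System.IO;

public static class CsvLoader
{
    public static DataTable LoadAll(string[] csvPaths)
    {
        var table = new DataTable();

        foreach (var path in csvPaths)
        {
            using (var reader = new StreamReader(path))
            {
                // Take the column names from the first file's header row.
                var header = reader.ReadLine();
                if (table.Columns.Count == 0 && header != null)
                {
                    foreach (var name in header.Split(','))
                        table.Columns.Add(name.Trim());
                }

                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    // Each row goes straight into the DataTable; only one line
                    // of text is held in memory at a time.
                    table.Rows.Add(line.Split(','));
                }
            }
        }

        return table;
    }
}
```

Even with streaming, all 8 files still end up in a single in-memory DataTable, so the total row count, rather than the read strategy, is usually what drives the memory pressure.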

But now the issue is that I have a lot to do on this DataTable; it contains 8 million transactions. I have to do some join manipulations to remove duplicates, plus a whole lot of other activities,
but my process keeps breaking at several workflows with the same error: "Invoke workflow: Child job stopped with unexpected exit code 0x000003E8."

I know it's due to the large DataTable, but how do I go about this?
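
For illustration, one way to get distinct rows out of a large DataTable without building a join is a single DataView pass; the column names below are placeholders:

```csharp
// Sketch: keep one row per unique (TransactionId, Amount, Timestamp) combination.
// Note: DataView.ToTable(true, ...) returns only the listed columns,
// so include every column still needed downstream.
using System.Data;

public static class Dedup
{
    public static DataTable DistinctRows(DataTable source)
    {
        // "TransactionId", "Amount", "Timestamp" are placeholder column names.
        return new DataView(source).ToTable(true, "TransactionId", "Amount", "Timestamp");
    }
}
```

This still creates a second table, so on 8 million rows it may be worth deduplicating each CSV's chunk before merging, rather than deduplicating the combined table at the end.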

Please help.


Have you got any solution for the same?

Hi @shashank.enugala

Welcome to the forum.

Can you explain your query, please?

Hi @NIVED_NAMBIAR

Requirement: I have to export a huge amount of data (>25k records) from a DataTable to Excel.
Logic used: Execute SQL Query (records fetched into a DataTable) -> Excel Application Scope -> Write Range
Error message: Job stopped with an unexpected exit code: 0xC0000005
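
A minimal sketch of one possible workaround, assuming the export can be broken into batches (the chunk size here is arbitrary), is to split the DataTable and write each piece separately, e.g. one Append Range per chunk instead of a single Write Range:

```csharp
// Sketch: split a DataTable into chunks of at most chunkSize rows,
// so each Excel write handles a bounded number of records.
using System;
using System.Collections.Generic;
using System.Data;

public static class TableSplitter
{
    public static IEnumerable<DataTable> Split(DataTable source, int chunkSize)
    {
        for (int start = 0; start < source.Rows.Count; start += chunkSize)
        {
            var chunk = source.Clone(); // copies the schema only, no rows
            int end = Math.Min(start + chunkSize, source.Rows.Count);
            for (int i = start; i < end; i++)
                chunk.ImportRow(source.Rows[i]);
            yield return chunk;
        }
    }
}
```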

It would be better if you could write a smaller number of records at a time, like <100,000 or <300,000 records, etc. If possible, also give us the number of columns.

Hi @AkshaySandhu

Number of columns: 6
The number of records fetched is dynamic; it comes from the DB.

What is the maximum number of records that is possible?