Bulk data processing: reading data from CSV files, each containing more than 3 lakh records

Hi,

I have to merge CSV files that each contain more than 3 lakh (300,000) records, and then compare that data with another CSV file that also contains 3 lakh records. In both files the column names are different, but the values are the same.

I tried doing this with looping and got an error like: RemoteException wrapping System.Exception: Job stopped with an unexpected exit code: 0xE0434352

I also tried joining the data tables and using a LINQ query to get the data. It works fine up to 30k records, but for 50k records it took 1.30 minutes.
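Roughly the kind of join I used, just as a sketch; the key column names "CustomerId" and "CustId" are placeholders, the real files use different headers:

```csharp
using System.Data;
using System.Linq;

// Sketch of the LINQ join: dt1 holds the merged CSV data, dt2 the other CSV.
// The key columns are named differently in each file but hold the same values
// ("CustomerId" / "CustId" are placeholder names).
static DataTable JoinTables(DataTable dt1, DataTable dt2)
{
    var matches = (from r1 in dt1.AsEnumerable()
                   join r2 in dt2.AsEnumerable()
                       on r1.Field<string>("CustomerId") equals r2.Field<string>("CustId")
                   select r1).ToList();

    // CopyToDataTable throws on an empty sequence, so fall back to an
    // empty clone of the schema when nothing matches.
    return matches.Count > 0 ? matches.CopyToDataTable() : dt1.Clone();
}
```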

Can anybody help me achieve this in a shorter time?

thank you.

@Kavitha_PB You can try it like this (a rough code equivalent of these steps is sketched below):

  1. Use a For Each Row over the data table from the 1st CSV file.
  2. Inside the For Each Row, use the Filter Data Table activity to find the matching rows in the 2nd data table.
  3. Then, if you want to write the matching data into another DT, use the Add Data Row activity.
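
A rough code equivalent of the three steps above, as a sketch only; the key columns "CustomerId" (1st table) and "CustId" (2nd table) are assumptions, so replace them with your actual column names:

```csharp
using System.Data;

// Sketch of the suggested pattern: For Each Row over the 1st table,
// filter the 2nd table on the key, then collect the matching rows.
// "CustomerId" / "CustId" are placeholder key column names.
static DataTable FindMatches(DataTable dt1, DataTable dt2)
{
    DataTable matched = dt1.Clone();          // same columns as dt1, no rows

    foreach (DataRow r1 in dt1.Rows)          // 1. for each row of the 1st CSV
    {
        string key = r1["CustomerId"].ToString();

        // 2. filter the 2nd data table on the key value
        //    (dt2.Select plays the role of the Filter Data Table activity here)
        DataRow[] hits = dt2.Select("CustId = '" + key.Replace("'", "''") + "'");

        // 3. add the row to the output table when a match is found
        if (hits.Length > 0)
        {
            matched.ImportRow(r1);
        }
    }

    return matched;
}
```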

Hope this will help you

Best Regards,
Vrushali

Thanks for your reply. I have tried this way too; it works fine for up to 50k records, but then the bot suddenly throws an exception like: RemoteException wrapping System.Exception: Job stopped with an unexpected exit code: 0xE0434352

@Kavitha_PB Kindly check this post:

Hope this will help you

Best Regards,
Vrushali

Tried it, but no luck… Is there any other way to achieve this?