How to read a comma-separated text file having 30 lakh (3 million) rows and 20 columns and add it into a data table for data manipulation?

I have an array, and I am going to filter the data based on it.

I have tried the following, but was unsuccessful:

  1. Generate Data Table activity, but it gives an error ("Cannot generate data table because the input file is too big or contains too many rows").

  2. ODBC, but it changes the number format.

  3. Read Text File, then Write CSV, and then Read CSV, but it gives an error because there are more than 10.48 lakh (about 1,048,000) rows.

  4. Filter the text file based on the array and keep the rows whose first column contains an array value, but this also does not seem to be possible.

When the array (we assume a string array) is already loaded, we can check whether we can filter directly, e.g. with LINQ Where, as in the sketch below.
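A minimal sketch of such a LINQ Where filter (VB.NET, as typically used in UiPath expressions or Invoke Code). The variable names accountNumbers, filePath and filteredLines are placeholders, and System.IO / System.Linq must be available in the imports:

```vbnet
' accountNumbers : String() with the account numbers to keep (already loaded)
' filePath       : String, path to the comma separated text file (no header)
' filteredLines  : String() receiving only the rows whose first column matches
filteredLines = File.ReadLines(filePath).
    Where(Function(line) accountNumbers.Contains(line.Split(","c)(0).Trim())).
    ToArray()
```

File.ReadLines streams the file lazily, so the full 30 lakh rows are never held in memory at once.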

Maybe you can share some samples.

We would suggest exploring the details more deeply. Also, segmenting the data and parsing it in chunks could be an option, for example:
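A rough illustration of such chunked processing (VB.NET sketch; filePath, accountNumbers and the chunk size of 100,000 are placeholder assumptions, and System.IO / System.Linq / System.Collections.Generic are needed in the imports):

```vbnet
' Read the file in batches of 100000 lines so that the 30 lakh rows are
' never held in memory all at once; each batch is filtered and then discarded.
Dim chunkSize As Integer = 100000
Dim keptRows As New List(Of String)
Dim chunk As New List(Of String)(chunkSize)

For Each line As String In File.ReadLines(filePath)
    chunk.Add(line)
    If chunk.Count = chunkSize Then
        ' filter the current chunk against the account number array
        keptRows.AddRange(chunk.Where(Function(l) accountNumbers.Contains(l.Split(","c)(0).Trim())))
        chunk.Clear()
    End If
Next

' do not forget the last, partially filled chunk
keptRows.AddRange(chunk.Where(Function(l) accountNumbers.Contains(l.Split(","c)(0).Trim())))
```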

You could read the numbers as strings, or check any other preventive action, for example by typing the DataTable columns as String:
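A small sketch of that idea when building the DataTable yourself (column names are placeholders; System.Data is required):

```vbnet
' A DataTable whose 20 columns are all String keeps the values exactly as
' they appear in the text file (no scientific notation, no lost leading zeros).
Dim dt As New DataTable("Accounts")
For i As Integer = 1 To 20
    dt.Columns.Add("Column" & i.ToString(), GetType(String))
Next
```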

You can use File.Read… or a buffered stream reader for reading the file content, for example:
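A sketch using a buffered StreamReader to walk the file line by line and fill a DataTable like the one sketched above (dt and filePath are placeholder names):

```vbnet
' Reads the file through the StreamReader's internal buffer, one line at a
' time, splitting on commas and adding each row to the 20-column DataTable dt.
Using reader As New StreamReader(filePath)
    Dim line As String = reader.ReadLine()
    While line IsNot Nothing
        dt.Rows.Add(line.Split(","c))
        line = reader.ReadLine()
    End While
End Using
```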

As mentioned above, LINQ Where can be checked.

The array is a set of account numbers and is also a variable, so I can't use a greater-than or less-than comparison. The data is in text format and has 19 columns without a header. The filter based on the array values is to be applied to the first column, and all the values in the filtered rows for the 20 columns should remain as they are, like a normal filter in Excel or on a data table.
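A possible end-to-end sketch combining the suggestions above for this requirement (VB.NET; all names are placeholders, the post mentions both 19 and 20 columns so 20 is used here following the title, and a HashSet is used only to speed up lookups over 30 lakh rows; System.Data, System.IO and System.Collections.Generic are assumed in the imports):

```vbnet
' accountNumbers : String() of account numbers (the filter values)
' filePath       : String, path to the comma separated text file without header
Dim wanted As New HashSet(Of String)(accountNumbers)   ' fast Contains lookups

Dim dtFiltered As New DataTable("Filtered")
For i As Integer = 1 To 20
    dtFiltered.Columns.Add("Column" & i.ToString(), GetType(String))
Next

For Each line As String In File.ReadLines(filePath)
    Dim fields As String() = line.Split(","c)
    ' keep the row only if its first column is one of the account numbers;
    ' all other column values are carried over unchanged
    If wanted.Contains(fields(0).Trim()) Then
        dtFiltered.Rows.Add(fields)
    End If
Next
```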

Solved using

Thanks @ppr Peter Preuss
