How to remove duplicate rows based on a column?

Hi all, I have 3 data tables that I’ve filtered based on the “Product” type of an input data table. The second and third tables need to be further filtered and then merged with the first for a total of 30 rows. So, if the first table has 17 rows, the second and third need to make up the remaining 13. The filtering criteria for the second and third tables are the same: for the “Location” column, each unique location must appear at least once, and the rest of the rows can be random. How can I filter the second and third tables using this condition and make sure that after merging tables 1, 2 and 3 the total count is always 30?

Please advise.

Regards

Anived

Hi @Anived_Mishra

You can remove duplicate rows based on the Location column values with the expression below before doing the merge, and then return only 30 rows using the Take method.

dt_variable.AsEnumerable().GroupBy(Function(r) r.Field(Of String)("Location")).Select(Function(s) s.First()).CopyToDataTable()

To return only 30 rows after the merge, you can use this:

dt_variable = dt_variable.AsEnumerable().Take(30).CopyToDataTable()
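
For example, here is a rough sketch of how the two could be combined in an Invoke Code activity. The names dt1, dt2 and dt3 are placeholders for your three filtered tables (not names from your workflow), and System.Data and System.Linq are assumed to be imported, as they are by default in UiPath.

' Keep one row per unique Location from tables 2 and 3.
Dim dtCombined As DataTable = dt2.Copy()
dtCombined.Merge(dt3)
Dim dtUniqueLoc As DataTable = dtCombined.AsEnumerable().
    GroupBy(Function(r) r.Field(Of String)("Location")).
    Select(Function(g) g.First()).
    CopyToDataTable()

' Merge with table 1 and trim the result to 30 rows.
Dim dtFinal As DataTable = dt1.Copy()
dtFinal.Merge(dtUniqueLoc)
dtFinal = dtFinal.AsEnumerable().Take(30).CopyToDataTable()

Note that CopyToDataTable throws an exception when the source has no rows, so this assumes none of the tables are empty.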

Cheers


Hey all, I did it. I created two lists of the unique locations in those two tables and then got the corresponding rows in filtered tables 2 and 3. To get the remaining rows, I sorted a table containing everything except tables 1, 2 and 3 and took the number of rows still needed (30 minus the rows in table 1 minus the rows in filtered tables 2 and 3), if that makes sense.

After that I merged the remaining rows with filtered tables 1, 2 and 3 and wrote the result to an Excel file.
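
In case it helps anyone, here is roughly what that looks like as an Invoke Code body. The names dt1, dt2, dt3 (my filtered tables), dtRest (the leftover rows that are not in tables 1, 2 or 3) and the Location column are placeholders, so treat this as a sketch rather than the exact code from my workflow.

' One row per unique Location from filtered tables 2 and 3.
Dim dtLoc2 As DataTable = dt2.AsEnumerable().
    GroupBy(Function(r) r.Field(Of String)("Location")).
    Select(Function(g) g.First()).
    CopyToDataTable()
Dim dtLoc3 As DataTable = dt3.AsEnumerable().
    GroupBy(Function(r) r.Field(Of String)("Location")).
    Select(Function(g) g.First()).
    CopyToDataTable()

' Rows still needed to reach 30 in total.
Dim needed As Integer = 30 - dt1.Rows.Count - dtLoc2.Rows.Count - dtLoc3.Rows.Count

' Merge everything and top up from the remaining data.
Dim dtFinal As DataTable = dt1.Copy()
dtFinal.Merge(dtLoc2)
dtFinal.Merge(dtLoc3)
If needed > 0 AndAlso dtRest.Rows.Count > 0 Then
    dtFinal.Merge(dtRest.AsEnumerable().Take(needed).CopyToDataTable())
End If

dtFinal is then what gets written out to Excel.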

Thanks all anyway.

Regards

Anived
