How to find the number of rows in a database that have the same data


#1

If I have multiple rows with the same data in one column, how can I find the number of rows that share that data after filtering?


#2

Hi there @4ndyg0h,
Firstly, you can retrieve only those unique column values from the DataTable:
Assign - YourDTUnique = YourDT.AsEnumerable.GroupBy(Function (drRows) drRows.Item("ColumnName")).Select(Function (drGroupedRows) drGroupedRows.First).CopyToDataTable

Then, you can iterate through those unique values, counting how many times each value appears in the original DataTable, for example:
dictMyResults = new Dictionary(Of String, Integer)
For Each row In YourDTUnique
Assign - dictMyResults(row.Item("ColumNName")).ToString = YourDT.AsEnumerable.Where(Function (drRows) drRows.Item("ColumnName").ToString.Equals(row.Item("ColumnName").ToString)).Count

At the end, you should end up with a dictionary containing all unique column values and their associated population within the original DT.
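As a more compact alternative (using the same YourDT and "ColumnName" placeholders as above), the grouping and counting can be combined into a single Assign, which skips the intermediate unique-rows DataTable entirely:

Assign - dictMyResults = YourDT.AsEnumerable.GroupBy(Function (drRows) drRows.Item("ColumnName").ToString).ToDictionary(Function (grp) grp.Key, Function (grp) grp.Count)

This produces the same Dictionary(Of String, Integer) of unique values and their counts in one step.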


#3

The workflow has validation errors. Review and resolve them first.

System.Activities.InvalidWorkflowException: The workflow has validation errors. Review and resolve them first. ---> System.Activities.ValidationException: Compiler error(s) encountered processing expression "dictMyResults(row.Item("ColumNName")).ToString".
Option Strict On disallows implicit conversions from 'Object' to 'String'.

--- End of inner exception stack trace ---

I got this error so what should I change from here?


#4

Hi there @4ndyg0h,
You will need to replace the “ColumNName” value with your referenced column.

But the error you are receiving relates to this part of the expression:

dictMyResults(row.Item("ColumNName")).ToString

This should be:

dictMyResults(row.Item("ColumNName").ToString)

The Item method returns an Object, so with Option Strict On the value must be explicitly converted to a String before it can be used as the dictionary key.
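Putting that fix into the full Assign from post #2 (again with "ColumnName" standing in for your actual column), the corrected line becomes:

Assign - dictMyResults(row.Item("ColumnName").ToString) = YourDT.AsEnumerable.Where(Function (drRows) drRows.Item("ColumnName").ToString.Equals(row.Item("ColumnName").ToString)).Count

Note that the .ToString now sits inside the dictionary indexer, converting the key, rather than after it.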

Thanks,
Josh


#5

@4ndyg0h

Can you please explain the requirement clearly? I didn't understand it properly.

Regards,
Mahesh


#6

Hi there @MAHESH1,
What part are you having difficulties with?

Thanks,
Josh


#7

@4ndyg0h

It would be helpful if you could explain your requirement with an example.

Regards,
Mahesh