Segment and group a datatable so that no group contains items with the same transaction ID/item name

Hello everyone!

I have a use case where I need to segment a datatable, but the challenge is that the grouping cannot be random: no two items with the same item name may end up in the same group, or the process will fail. I know how to segment, but is there a way to make sure no two items in a group share the same ID? Is there a script for that? Thanks

I have an example datatable here:

Hoping you can guide me.


Hi @wonderingnoname,

First, read your whole table into a datatable. Then extract each group you want into a new table using the Filter Data Table activity. Finally, write these tables to Excel with new sheet names.


Hi @MY, thank you for your reply. However, I will not know in advance what items are in the Excel file, and there will be thousands of items in it. My initial thought was to get the unique values from the original Excel file and then group based on those. I am just wondering if there is another, more efficient way, like datatable segmentation.


Maybe it will help to group the data by the Item value, so that each group becomes a datatable within a list of datatables.

Assign Activity:
LHS: TableList | List(Of DataTable)

(From d In dtData.AsEnumerable()
Group d By k = d("Item").ToString.Trim Into grp = Group
Select t = grp.CopyToDataTable).ToList
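For readers less familiar with LINQ, the same grouping logic can be sketched in Python. This is only an illustration: rows are modeled as dicts, and the column name "Item" and the sample values are assumptions, not taken from the actual workbook.

```python
from collections import defaultdict

def group_by_item(rows):
    """Group rows by the trimmed 'Item' value, one sub-table per item name."""
    groups = defaultdict(list)
    for row in rows:
        key = str(row["Item"]).strip()
        groups[key].append(row)
    # insertion order is preserved (Python 3.7+), like the LINQ Group By
    return list(groups.values())

rows = [
    {"Item": "A", "Qty": 1},
    {"Item": "B", "Qty": 2},
    {"Item": "A", "Qty": 3},
]
table_list = group_by_item(rows)
# table_list now holds one group for "A" (2 rows) and one for "B" (1 row)
```

Each element of `table_list` plays the role of one `DataTable` in the VB.NET `List(Of DataTable)`.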

For alternatives, just specify some more details on the expected output for the sample data given above.

Thank you for your reply @ppr

My main goal is to group the items in the Excel file in such a way that no group contains two items with the same item name. Like so:

Then the TableList approach has a chance to do it.

Once the TableList is retrieved, you can iterate over the tables (the groups) and pick up a group member from a particular position.

The size of the largest group can be retrieved with:
MaxGroupSize = TableList.Max(Function (x) x.Rows.Count)


Thank you for this, I think this is what I am looking for. However, can you let me know how I can retrieve items from the groups one by one? Sorry, I am not very familiar with datatables; the best I have done before is simple filtering.



Group Processing

midx = Enumerable.Range(0, MaxGroupSize)
gidx = Enumerable.Range(0, TableList.Count)

It also handles groups with different member counts.
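The idea behind the two index ranges can be sketched in plain Python. This is a hedged sketch under assumed data shapes: each group is a list of rows sharing one item name, and for each member index `m` (midx) we take that row from every group (gidx) that still has one, so each resulting set contains at most one row per item name.

```python
def compile_group_sets(table_list):
    """Build sets by taking the m-th member of every group that has one."""
    max_group_size = max(len(g) for g in table_list)  # like MaxGroupSize
    group_sets = []
    for m in range(max_group_size):                   # like midx
        # like gidx: visit each group, skipping groups already exhausted
        current = [g[m] for g in table_list if m < len(g)]
        group_sets.append(current)
    return group_sets

groups = [["A1", "A2"], ["B1"], ["C1", "C2"]]
sets_ = compile_group_sets(groups)
# no set contains two rows from the same group, i.e. no duplicate item names
```

Later sets naturally get smaller as the shorter groups run out of members, which is why unpaired leftovers can appear at the end.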

Find starter help here:
GroupBy_1Col_CompileGroupSets.xaml (12.0 KB)


Thank you! This worked perfectly.

An additional question, though: is there any way to prevent ending up with unpaired items? I ran the script on live Excel data and after a while I was left with 30+ (random) items with no pair.


group 1:

group 2:

group 3:

group 4:

What is meant by this?

Like if we have this example:

the 3rd group only has 1 item; I was wondering if we can avoid that somehow? For example, if we could group it like:


Looks like a different case, but we do feel that there will be a solution for this as well.

The confusing part of the last example (AB, AB, AC) is the unclear rule on how the pairing is defined. Can you elaborate on this?

The rule is just to make sure that every item has a pair, as much as possible.
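One way to sketch that rule in Python is a greedy approach (an assumption for illustration, not necessarily what the final workflow does): always pair items from the two most frequent remaining names, so two items with the same name are never paired and leftovers are kept to a minimum.

```python
import heapq
from collections import Counter

def pair_items(names):
    """Greedily pair items of different names, minimizing unpaired leftovers."""
    counts = Counter(names)
    heap = [(-count, name) for name, count in counts.items()]  # max-heap via negation
    heapq.heapify(heap)
    pairs, leftovers = [], []
    while len(heap) >= 2:
        c1, n1 = heapq.heappop(heap)   # most frequent remaining name
        c2, n2 = heapq.heappop(heap)   # second most frequent name
        pairs.append((n1, n2))
        if c1 + 1 < 0:                 # counts are negative; re-insert if any left
            heapq.heappush(heap, (c1 + 1, n1))
        if c2 + 1 < 0:
            heapq.heappush(heap, (c2 + 1, n2))
    while heap:                        # only one name remains: unavoidable solos
        count, name = heapq.heappop(heap)
        leftovers.extend([name] * -count)
    return pairs, leftovers

pairs, leftovers = pair_items(["A", "A", "A", "B", "B", "C"])
```

For the AB, AB, AC example this yields exactly those three mixed pairs with nothing left over; a solo item can only remain when one name accounts for more than half of all rows.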

Just create a sample data set where the columns (e.g. Material) contain discrete, identifiable values for the Item values that occur more than once. We will have a look at it. Currently we cannot derive the full rule; it also looks more like combining rows into different pairs.

Book1.xlsx (8.8 KB)
Thank you, I appreciate your help on this. I added a test sample. The goal is to partner/group each item with another item that has a different item name and, as much as possible, not leave a solo item without a group.

Thanks for the feedback and the sample Excel file. Referring to your original topic description at the very top, this request looks like a different case. We would recommend helping other researchers by scoping 1 topic = 1 case; it makes it easier for them to find solutions for their similar case.

Have a look at the Join Data Table activity and try some steps with the join operation set to !=. In case you need further help, just open a new topic and we will pick it up from there. Maybe you can also provide an expected output sample. Thanks


Thank you Peter for all the help! I will be closing this thread now. 🙂

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.