How to total up multiple cells of a specific column based on similar values in another column

In the first column there is an ID which can appear multiple times.
So first, I want to find all rows having the same ID (column “A”).
Second, I want to total up all values of another column belonging to the ID found before.
To make it clearer, see the screenshot below:
In column “A” I need to consolidate all cells with the same ID/value (marked in colour).
Then in column “B” (marked grey) I need to total up all corresponding values and merge them into one row.
Can someone help and give a step-by-step solution please? :slight_smile:

dTable.Compute("SUM([Column B])", "[Column A]='847108'")

I understand your requirement as follows: you want to sum up the grouped column B values for each distinct column A value. For your screenshot, the resulting DataTable will have 3 rows.
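The core logic (one total of column B per distinct column A value) can be sketched outside UiPath as well. The following is a minimal Python illustration, not the UiPath/VB.NET implementation from the attached XAML; the column names "A" and "B" and the sample values are assumptions based on the screenshot description:

```python
def sum_by_id(rows):
    """Group rows by the ID in column A and total the values in column B."""
    totals = {}
    for row in rows:
        # accumulate B per distinct A (dict preserves first-seen order)
        totals[row["A"]] = totals.get(row["A"], 0.0) + row["B"]
    return totals

# hypothetical sample data with three distinct IDs
rows = [
    {"A": "847108", "B": 10.5},
    {"A": "847108", "B": 4.5},
    {"A": "990001", "B": 7.0},
    {"A": "123456", "B": 1.0},
]
totals = sum_by_id(rows)
```

With three distinct IDs in the input, the result holds three entries, matching the "3 rows" expectation above.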

A comma used as the decimal separator often causes the datatype recognized for such a column to be String instead of e.g. Double. This has to be taken into account in the datatype conversions to prevent an exception.
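The conversion problem itself is language-independent. As a small sketch of the idea (in Python for illustration; in UiPath you would do the equivalent in a VB.NET expression or Assign), assuming the values contain only a comma decimal separator and no thousands separators:

```python
def parse_decimal(text):
    """Convert a string that uses ',' as the decimal separator into a float.

    Assumes no thousands separators are present (e.g. "1.234,56" is NOT handled).
    """
    return float(text.strip().replace(",", "."))

value = parse_decimal("12,5")
```

If thousands separators can occur, a culture-aware parse is the safer route instead of plain string replacement.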

Find attached a demo showcasing your scenario and handling the decimal separator character.
MarcJo.xaml (10.6 KB)

Let me know if you have any open questions.


Hi @ppr,
thanks for your support.
The advice about the comma as a decimal separator was good,
but it does not solve the actual problem.

First question is:
How can I group all IDs (in the first column) which are equal?

Second question:
How can I total up all values in a specific column belonging to the ID I received before?

→ all within a “For each row” activity

To avoid multiple ping-pong rounds, may I ask you the following: just provide us with sample data (e.g. in Excel format), with input and expected output. In general I feel we are close to a solution. Thanks.

Doing a group and a sum inside a For Each does not look good, because you would repeat the same calculation several times…

Input-Datei.xlsx (9.8 KB)

This is how the Input file could look like.
Now I need to group all rows with the same ID (column A) in order to total up all values in column B belonging to that ID.
At the end I want only one row per ID, including the total of column B and the first (or the last; it actually does not matter which of the values belonging to that ID is taken) value of columns C and D.
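The requirement above (one row per ID, sum of B, first-seen C and D) can be sketched as follows. This is a hedged Python illustration of the logic only, not the UiPath XAML from this thread; column names A–D mirror the sample file and the data values are made up:

```python
def aggregate(rows):
    """One output row per ID: sum of column B, first-seen values of C and D."""
    result = {}
    for row in rows:
        key = row["A"]
        if key not in result:
            # first occurrence of this ID: take C and D from this row
            result[key] = {"A": key, "B": 0.0, "C": row["C"], "D": row["D"]}
        result[key]["B"] += row["B"]
    return list(result.values())

# hypothetical sample rows
rows = [
    {"A": "847108", "B": 10.5, "C": "x", "D": "d1"},
    {"A": "847108", "B": 4.5,  "C": "y", "D": "d2"},
    {"A": "990001", "B": 7.0,  "C": "z", "D": "d3"},
]
out = aggregate(rows)
```

In UiPath the same shape is typically achieved with a LINQ `GroupBy` over the DataTable rows, done once outside any For Each, as noted earlier in the thread.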
I hope it’s clear enough now and anyone can help :slight_smile: :pray:

Do you want this to overwrite your Sheet1, or should a new sheet be created with the result? Or do you just need to use it in the process, without writing it back to Excel?

I need to write it to a (new) CSV file at the end.

Have a look at the updated XAML.
The DataTable can later be written to a CSV file with UiPath activities.
Demo XAML here: MarcJo.xaml (11.7 KB)
Just rewire the Excel path according to your environment.
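Writing the aggregated DataTable to CSV is done with the Write CSV activity in UiPath; purely for illustration of the expected file shape, here is a minimal Python sketch using the standard `csv` module (column names A–D and the sample row are assumptions):

```python
import csv

def write_rows_to_csv(rows, path):
    """Write aggregated rows (list of dicts with keys A, B, C, D) to a CSV file."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["A", "B", "C", "D"])
        writer.writeheader()
        writer.writerows(rows)

# hypothetical aggregated result: one row per ID
write_rows_to_csv(
    [{"A": "847108", "B": 15.0, "C": "x", "D": "d1"}],
    "result.csv",
)
```

The output is a header row followed by one line per consolidated ID.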

If you convert the values to numbers beforehand, you can use a custom activity written by our friend @balupad14, called Group By Aggregation, and it will be very easy, like this:


Hi @ppr,
unfortunately I receive this error message…

What do I have to adjust?

How can I get this activity?
Which package do I need to install?

Cross-check the DataTable variable name from the Read Range, its scope, and the variable names inside the Assign.
Check whether the Excel file matches your provided sample. The dump that I sent you via messages was from a real run, so the standalone XAML was working.

Or send me your XAML and I will have a look at it.


The sheet was wrong, sorry :see_no_evil:

But now I receive this error…

Can you send your XAML? That is the fastest way to a resolution.
Maybe you additionally tried to assign the summed-up value to a String variable later on. In that case a conversion is required.

@ppr thanks…
MarcJo(1).xaml (10.3 KB)

Nothing is wrong; I ran your provided XAML against the provided Excel file without any issue.
Is the Excel file the same?
Give this XAML a try: MarcJo(1)_B.xaml (11.5 KB)

It worked with your former XAML, thanks a lot…!!!

I will try to adapt it in my actual code.
In case it does not work I will come back to you.
But so far, many many thanks!!!