Remove Duplicate Rows Activity, accept Excel file as datatable?

Hi guys, I’ve imported values into the first column of an Excel file, and I want to use the ‘Remove Duplicate Rows’ activity to remove the duplicate values from that first column. How can I input the Excel file into this activity? Or, is there a way to remove duplicate values from the Matches variable so that they are gone before they are written to the Excel file? Thanks.


Hi @css

You can use the Read Range activity to read the data in the Excel file into a DataTable variable.

Then use that DataTable as the input for the Remove Duplicate Rows activity. The output of this activity will be another DataTable without the duplicate rows.

Finally, use the Write Range activity to write the data back to the Excel file.
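Strung together, the three steps look roughly like this in the workflow (a sketch; the file, sheet, and variable names are placeholders):

```
Excel Application Scope ("Book1.xlsx")
 ├─ Read Range              Sheet: "Sheet1", Range: ""         → dtData
 ├─ Remove Duplicate Rows   Input: dtData                      → dtDistinct
 └─ Write Range             Sheet: "Sheet1", Range: "A1", Input: dtDistinct
```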


First, use the Read Range activity to read the data in the Excel file into a DataTable variable (dt). After that, if you want to remove duplicate rows from the DataTable, use this:

dt = dt.DefaultView.ToTable(True)

If you want to remove the duplicate values based on a column and get that column’s unique values, go with this:

dt = dt.DefaultView.ToTable(True, {"Column_Name"})
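For anyone who wants to try the two expressions above outside a workflow, here is a standalone VB.NET sketch (the column name and sample values are illustrative):

```vbnet
Imports System
Imports System.Data

Module DedupDemo
    Sub Main()
        ' Build a one-column table with some duplicate values.
        Dim dt As New DataTable()
        dt.Columns.Add("Column_Name")
        For Each v In {"apple", "pear", "apple", "plum", "pear"}
            dt.Rows.Add(v)
        Next

        ' Distinct over every column of the table:
        Dim dtAllCols As DataTable = dt.DefaultView.ToTable(True)

        ' Distinct values of one named column only:
        Dim dtOneCol As DataTable = dt.DefaultView.ToTable(True, "Column_Name")

        Console.WriteLine(dtOneCol.Rows.Count)  ' 3 unique values remain
    End Sub
End Module
```

Both calls go through DataView.ToTable; the first keeps rows that are distinct across all columns, the second projects down to the named column first.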

Hope it will work for you!!!
Cheers @css :grinning:


Hi @Lahiru.Fernando, this is actually leaving me with more questions now that I think about how the bot is going to work. The bot will write a number of items into one column in Excel, but that number is variable, so it could be 10, 20, or 30 items. In that case, how do I set the Write Range with a variable range? I tried this, where my variable is just a number counter from the prior For loop, but it does not work: "A:" + "A" + vCount_Array.ToString

The second thing is that the bot will then get 10 more consecutive lists of items to put into that same “A” column, and I don’t want to overwrite any of the previous data. So is there a way to just have the duplicates in the Matches variable cleared out before anything is written to Excel? I think that would save me 4 extra steps.
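For what it’s worth, a dynamic range is usually built by concatenating the counter into a single "A1:A<n>" string rather than quoting each piece separately (a sketch using the vCount_Array variable from the question above; not tested against the actual workflow):

```vbnet
' Value for the Range property of Write Range, as a VB expression.
' For vCount_Array = 30 this evaluates to "A1:A30".
"A1:A" & vCount_Array.ToString
```

For the second part, the Append Range activity writes below any existing data in the sheet, which avoids both overwriting and computing a range at all.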

Hi @css,

You can use the activity called “Remove duplicates” from the package below.
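If adding a package is not an option, the duplicates can also be stripped from the Matches output with LINQ before anything is written to Excel (a sketch; vMatches stands for the output variable of the Matches activity, and System.Linq must be among the imports, which it is by default in UiPath):

```vbnet
' Keep only the unique matched strings, preserving first-seen order.
Dim uniqueValues As String() =
    vMatches.Select(Function(m) m.Value).Distinct().ToArray()
```

You can then write uniqueValues into the DataTable (or build the DataTable directly from it) and pass that to Write Range.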
