"Get Attribute" to click on a certain row on a web page?

Hello,

I have a situation where I extract a data table from a webpage, and it returns 2, 3, 4, or however many rows. I then run a “For Each Data Row” activity to match the text against a Google Sheet and click on mismatched addresses to delete them. If the bot sees two addresses that are exactly the same, a “correctcount” variable counts the number of correct addresses. If it finds two, it deletes one of them. That all works great.

The issue is, sometimes the bot revisits an account page as expected, but the duplicate address has already been deleted. The address text is now shown with a strikethrough, but the row is still counted in the extracted data table. The bot winds up deleting both good addresses.

Is there a way to have the bot run a “Get Attribute” activity against a specific ROW on the website, so I can check whether there is an attribute that would tell the bot the address is already deleted? And can I match the website’s row to the “index” count from the For Each loop? Right now the bot always clicks on the first address when the one below it matches.

Thanks!

@Josh_James

Can you share a screenshot of the webpage you are referring to?
Also, can you share the selector for the correct address and the selector for the strikethrough row?

Thanks,
Srini


Hello Srini, and thank you for the reply!

Here is an example screenshot from the webpage I extract the data table from. I cannot reproduce the duplication in test accounts; only production accounts have it. Pretend the second address, the one with the strikethrough, is the same address as the top one. That is what I see in production.

[screenshot: extracted data table of addresses, one shown with a strikethrough]

Here is what my Strict Selector properties look like. I added the “tablerow” part after reading a forum comment that suggested it, but that did not work. The variable “strCEAddress” is the address text I am matching, and “index” is the row counter from the “for each” loop.

[screenshot: Strict Selector properties]

Would adding idx=‘number’ to the strict selector instead of tablerow be the right approach? idx seems to let me click on the second correct address now, instead of always the first one. I don’t understand the logic, though, so I’m not sure I should trust it. Why is the idx number of the first address ‘3’ when my index variable is one lower, ‘2’… Weird.
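
For reference, the selector I am experimenting with looks roughly like this (typed out by hand, so the tag and attribute names are approximate and the top-level html line is just a placeholder):

    <html app='chrome.exe' title='*' />
    <webctrl tag='TD' tableRow='{{index}}' aaname='{{strCEAddress}}' />

The idx variant would swap tableRow='{{index}}' for idx='2' (or whatever number UIExplorer shows for that element).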

Hi @Josh_James - Adding idx is not recommended because the value is not stable; it’s better to stick with tableRow.

Just to confirm: will the duplicate/strikethrough address always be the second address? If that is the case, you can compare the addresses and, if they match, skip that row.

Ex: You are extracting the second row of data (store the second row’s data in a variable), and also save the previous row’s data in a variable. Now compare both values using an IF (a rough sketch of the condition follows the outline):

True
>>  Skip processing second row data

False
>> Process second row data
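
The IF condition itself could be something as simple as the expression below. The variable names are placeholders I made up, and this assumes your project uses VB expressions; adjust it to however you stored the row data:

    currentRowAddress.ToString.Trim.Equals(previousRowAddress.ToString.Trim, StringComparison.OrdinalIgnoreCase)

If it returns True, skip the row; if False, process it as usual.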

Hi Usha,

No, it will not always be the second row. Some accounts will have 3 or 4 addresses. The bot’s job is to delete mismatches and duplicates. The problem is that when it revisits the account page after everything has already been cleaned up, the bot still “sees” the two matching addresses, even though one of them has a strikethrough, and it deletes the remaining good one. So now all of them have been deleted, which is not ideal…

There is indeed a Get Attribute activity. It is capable of reading all of the element attributes you can see in UIExplorer (bottom-left pane).

There is a good chance the strikethrough variant has a different style class linked to it, which you can detect and then use to decide whether or not to click delete.
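
For example (purely a sketch; the selector, attribute, and class names here are guesses and will depend on your page): point Get Attribute at the row with a selector along the lines of

    <webctrl tag='TR' tableRow='{{index}}' />

and read the “class” attribute (or “outerHtml”). If the returned value contains something like “strikethrough”, “deleted”, or “line-through”, skip the delete click for that row.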


Correct. I’ve been going through each attribute one by one to try to find it. So far, the only one that returns something useful is “outerHtml”. That one at least appears to show me the address is deleted! Unfortunately, I might still have to use idx references to click on the correct duplicate, though…

Nope, it was not outerHtml… but I don’t think I need to find a solution anymore. We were able to reconfigure the source Google Sheet so we won’t revisit pages that have already been worked.

Thank you everyone for the help!
