How can I retrieve all the href links present in a webpage?

Hi All,

I want to fetch the href links present on a webpage, and the links I want follow a specific pattern. How can I go about getting them?

Data extraction and the HTTP Request activity are not options for this webpage. The links are also not present in the page source that I view with “CTRL + U”, and the Find Children activity doesn’t seem to work either, since there is no consistent pattern to where those links sit on the page.
Also, the relevant packages on the Marketplace are only compatible with legacy projects.

The links can point to .pdf, .png, .jpg, .eml, etc. files and appear on the UI as icons. I could use the plain old Find Element activity while paging down through the long web page, but that seems unreliable and slow. I want to exploit the fact that all the links I need to download share a specific ‘pattern’. If I could somehow get those links, I would be able to simply use them further.
Can we use XPath in any way?
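For reference, one way to apply such a pattern without scrolling the UI is to run a small script via the Inject Js Script activity and filter the collected hrefs with a regular expression. This is only a sketch under assumptions: the page allows script injection, and the pattern shown (extensions like .pdf/.png/.jpg/.eml) stands in for whatever your actual link pattern is.

```javascript
// Pure, testable helper: keep only hrefs matching the assumed pattern.
// Adjust the regex to your real link pattern.
function filterByPattern(hrefs) {
  var pattern = /\.(pdf|png|jpe?g|eml)(\?.*)?$/i;
  return hrefs.filter(function (h) { return pattern.test(h); });
}

// Hypothetical entry point for UiPath's "Inject Js Script" activity
// (it expects a function taking the target element and an input string,
// and returning a string to the workflow):
//
// function (element, input) {
//   var anchors = document.querySelectorAll("a[href]");
//   var hrefs = Array.prototype.map.call(anchors, function (a) {
//     return a.href;
//   });
//   // One href per line, easy to split in the workflow afterwards.
//   return filterByPattern(hrefs).join("\n");
// }
```

The returned string can then be split on newlines in the workflow and each link downloaded directly, bypassing the icons in the UI entirely.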

Please provide your insights on the same.
Thank You.

Hi,

How about using the Find Children activity, as in the following sample?

Sample20231022-3.zip (4.2 KB)

Regards,

