I am trying to automate a webpage containing multiple links with the same name. I have to click on each link, retrieve data, return to the previous page, then click the next link and do the same. All the retrieved information needs to be stored in an Excel file. I tried the Navigate To activity, but I am still unable to do it. Can anyone help with this?
You can use Data Scraping to get the URLs into a DataTable.
Then loop over it like this:
foreach (DataRow row in dataTable.Rows)
// open the web page at the URL in this row
// wait for the page to load and verify it is the right page
// store the extracted values in another DataTable
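The loop above can be sketched as follows (Python used purely for illustration; `fetch_page` and `extract_fields` are hypothetical placeholders for the "open web page" and scraping steps, not real UiPath calls):

```python
def fetch_page(url):
    # placeholder: in the real workflow this opens the page in the
    # browser and waits for it to finish loading
    return f"<html>data for {url}</html>"

def extract_fields(page):
    # placeholder: pull the values you need out of the loaded page
    return {"page": page}

# the list of URLs produced by Data Scraping (hypothetical sample values)
urls = ["http://example.com/claim/1", "http://example.com/claim/2"]

results = []                              # plays the role of the output DataTable
for url in urls:                          # one iteration per scraped row
    page = fetch_page(url)                # open the page for this row
    results.append(extract_fields(page))  # store the extracted values as a new row

print(len(results))  # one result row per input URL
```

Once `results` is filled, a Write Range activity (or equivalent) can dump it into Excel in one step.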
Thank you. But when I try to get the URL into Excel, the link comes out as ‘#’. I am unable to get the underlying link from it.
Can you share a screenshot? I'd like to check whether the URL is actually being extracted.
sample.pdf (5.3 KB)
This is the screenshot of the webpage. I have to click on each Claim Details link and then fetch the information from there.
screenshot.pdf (4.1 KB)
This is the screenshot of the scraped data.
On further review I found that the Claim Details link refers to a jQuery function rather than a real URL. How do I navigate through those links one by one?
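That would explain the ‘#’ you saw: the anchor's `href` is a dummy value and the actual navigation happens in a JavaScript click handler. If the handler's argument (e.g. a claim id) is visible in the markup, you can sometimes recover it from the scraped HTML. A minimal sketch, assuming the page uses an inline handler like `onclick="viewClaim(123)"` (the function name and markup here are assumptions; inspect the real page to confirm):

```python
import re
from html.parser import HTMLParser

# Hypothetical markup mimicking the Claim Details table rows
SAMPLE_HTML = """
<table>
  <tr><td><a href="#" onclick="viewClaim(101)">Claim Details</a></td></tr>
  <tr><td><a href="#" onclick="viewClaim(102)">Claim Details</a></td></tr>
</table>
"""

class ClaimLinkParser(HTMLParser):
    """Collects the argument passed to the assumed viewClaim(...) handler."""
    def __init__(self):
        super().__init__()
        self.claim_ids = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        onclick = dict(attrs).get("onclick", "")
        match = re.search(r"viewClaim\((\d+)\)", onclick)
        if match:
            self.claim_ids.append(match.group(1))

parser = ClaimLinkParser()
parser.feed(SAMPLE_HTML)
print(parser.claim_ids)  # the ids the page passes to its jQuery handler
```

If the ids are recoverable this way, you may be able to build the detail-page URL directly instead of clicking each link; otherwise you will need to click each anchor in turn (e.g. by index) and let the handler run.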