Complex web scraping

Hi @loginerror @ClaytonM

I’m able to successfully scrape multiple pages, but how do I scrape the content from the inner links of the index pages?

For example, take eBay: if I search for a book, I can scrape all the books that are displayed. But how do I scrape the details by clicking the first book, taking its information, going back, clicking the second book, getting into that page and scraping its information, going back to the third book, clicking, and so on?

You can try scraping all the HTTP links along with the names, then use a For Each Row loop: for each of the URLs captured, perform the required actions, store the results in a table if needed, and let the loop repeat.

Let us know if this helps,
Pavan H
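The two-step pattern above (scrape the index for links, then loop over each link) can be sketched outside UiPath as well. This is a minimal standard-library Python illustration, not a UiPath workflow: `SAMPLE_INDEX` is made-up HTML standing in for the search-results page, and `scrape_detail` is a hypothetical placeholder for fetching and parsing each inner page.

```python
from html.parser import HTMLParser

# Made-up HTML standing in for the scraped index (search results) page.
SAMPLE_INDEX = """
<ul>
  <li><a href="/item/1">Book One</a></li>
  <li><a href="/item/2">Book Two</a></li>
</ul>
"""

class LinkCollector(HTMLParser):
    """Collects (href, link text) pairs from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []      # list of (href, title)
        self._href = None
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
    def handle_data(self, data):
        if self._href and data.strip():
            self.links.append((self._href, data.strip()))
            self._href = None

def scrape_index(html):
    """Step 1: collect every result link from the index page."""
    parser = LinkCollector()
    parser.feed(html)
    return parser.links

def scrape_detail(url):
    """Hypothetical placeholder: a real implementation would fetch
    `url` and extract the fields you need from the detail page."""
    return {"url": url, "detail": f"details scraped from {url}"}

# Step 2: loop over the captured URLs (the "for each row" part)
# and scrape each detail page, storing the results in a table.
results = scrape_index(SAMPLE_INDEX)
table = [scrape_detail(href) for href, title in results]

for row in table:
    print(row["url"])
```

In UiPath terms, `scrape_index` corresponds to the initial Data Scraping of the results list, and the final loop corresponds to a For Each Row that navigates to each URL.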


Can you make an example file?

To extend @pavanh003’s answer, look into Queues and Transactions. One robot would open the browser, navigate to eBay, issue the search, and scrape all results. Each result (i.e. its URL) would become a Queue Item. Other robots could then use the Get Transaction Item activity to go to that specific page and scrape all metadata as required.
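The dispatcher/performer split described above can be sketched with a plain in-process queue. This is only an analogy, with made-up URLs: in UiPath the queue would be an Orchestrator queue, the dispatcher would use Add Queue Item, and each performer would call Get Transaction Item.

```python
from queue import Queue, Empty

# Stand-in for an Orchestrator queue.
work_queue = Queue()

def dispatcher(urls):
    """Dispatcher robot: scrape the result list, enqueue one item per URL."""
    for url in urls:
        work_queue.put(url)   # ~ Add Queue Item

def performer():
    """Performer robot: take one transaction at a time until none remain."""
    processed = []
    while True:
        try:
            url = work_queue.get_nowait()   # ~ Get Transaction Item
        except Empty:
            break
        # Placeholder for the real work: open the page, scrape metadata.
        processed.append({"url": url, "status": "done"})
    return processed

dispatcher(["https://example.com/item/1", "https://example.com/item/2"])
print(len(performer()))  # 2
```

Because each transaction is independent, several performers can drain the same queue in parallel, which is what makes this pattern scale beyond a single robot.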

If you’re looking for more examples, just head over to the UiPath Academy and finish its free courses. The Advanced training covers Queues and Transactions; everything else is covered in the Basic training.

