Scraping a datatable dynamically

Hello guys, can you help me with this complicated task? It's a must-do.

In the above web page I need to navigate to (click) every row so that a new page shows up, like below.

On the second page (second image) I have to scrape a datatable that I will insert into an Excel file, then click the return button so that the first page reappears, and so on.
The problem is that I need to do this for each row shown in the first image, and when I get to the bottom item on that page I have to go to the next page, and so on up to the final item. I know I have to use a dynamic selector, but how can I use it in this case?

Please help me guys, and thanks.

@Aymen_Ftiti
It looks like a task with the following pattern:

  • retrieve a high-level list with links
  • iterate over the link list and retrieve detail information

Give data scraping a try, and:

  • configure the extraction to retrieve the links from the first screenshot
  • configure the paging for the data scraping of the links

Then iterate over the retrieved links, reading in the details with another configured data scraping, and add a third column to the detail list to identify which client each detail row relates to (a rough sketch of this flow follows below).
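Just to make that two-pass idea concrete, here is a rough sketch of the same flow in plain Python (requests + BeautifulSoup), not UiPath. The URL, the CSS selectors and the paging parameter are placeholders I made up, so treat it only as an outline of the logic, not as working code for this particular site.

```python
# Sketch of the two-pass pattern: collect the links first, then visit each one.
# BASE_URL, the "page" parameter and all CSS selectors are assumptions.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

BASE_URL = "https://example.com/results"   # placeholder for the search-results page

def collect_links():
    """Pass 1: walk the paged result list and collect (client, detail link) pairs."""
    links, page = [], 1
    while True:
        html = requests.get(BASE_URL, params={"page": page}).text
        anchors = BeautifulSoup(html, "html.parser").select("table.results tr a")
        if not anchors:
            break                                     # no more result pages
        links += [(a.get_text(strip=True), urljoin(BASE_URL, a["href"]))
                  for a in anchors]
        page += 1
    return links

def collect_details(links):
    """Pass 2: open each detail page and tag every scraped row with its client."""
    records = []
    for client, href in links:
        detail = BeautifulSoup(requests.get(href).text, "html.parser")
        for tr in detail.select("table.details tr"):
            cells = [td.get_text(strip=True) for td in tr.find_all("td")]
            if cells:
                records.append([client] + cells)      # the extra "client" column
    return records
```

In UiPath the same shape is built with two Extract Structured Data configurations and a For Each over the first output, but the loop structure is the same.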

Is there an example I can rely on?

If this website is a public one, please share its URL. If not, I would not use data scraping; I would prefer to do it manually, since the fields seem to be fixed for each company.

@bcorrea you could be my savior, hhh. Yeah, it's a public site.
OK, this is the link: Annuaire des entreprises industrielles. It is first a search page where, for example, I select the "totalement exportatrice" item in the "Régime" dropdown list field; when I click "chercher", my first screenshot shows up. I'm stuck because the rows I have to click one by one to get to the second screenshot don't show any URL links, so each time I need to click the return button to get back to the first screenshot after scraping data from it.

OK, here is how I would do it: test1.xaml (23.9 KB)
Note that I am not getting all the columns and didn't go to the end of the table because it is too long, so it may have some bugs on the last page…
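For anyone who cannot open the xaml, the loop it has to implement looks roughly like the Python/Selenium sketch below. This is not the content of test1.xaml, just the same idea expressed in code: a row index drives a dynamic locator, the detail table is scraped, the browser goes back, and paging continues until there is no next page. The start URL, the CSS selectors and the "Suivant" link text are assumptions about the site, not verified values.

```python
# Illustrative only: click each result row by index, scrape the detail table,
# navigate back, then move to the next result page.
from selenium import webdriver
from selenium.webdriver.common.by import By
import pandas as pd

driver = webdriver.Chrome()
driver.implicitly_wait(5)                            # crude wait; real code should use explicit waits
driver.get("https://example.com/search-results")     # placeholder for the results page

all_rows = []
while True:
    row_count = len(driver.find_elements(By.CSS_SELECTOR, "table.results tbody tr"))
    for i in range(row_count):
        # Re-locate the rows on every pass: after driver.back() the old elements
        # are stale, so the index i plays the role of the dynamic selector.
        rows = driver.find_elements(By.CSS_SELECTOR, "table.results tbody tr")
        client = rows[i].text                        # keep which company this detail belongs to
        rows[i].click()                              # open the detail page

        for tr in driver.find_elements(By.CSS_SELECTOR, "table.details tr"):
            cells = [td.text for td in tr.find_elements(By.TAG_NAME, "td")]
            if cells:
                all_rows.append([client] + cells)

        driver.back()                                # same effect as the site's return button

    next_links = driver.find_elements(By.LINK_TEXT, "Suivant")  # assumed paging link
    if not next_links:
        break                                        # last page reached
    next_links[0].click()

pd.DataFrame(all_rows).to_excel("details.xlsx", index=False)
driver.quit()
```

In the UiPath version, that index typically becomes a variable inside the row part of the click selector (for example in the tableRow or idx attribute), which is the "dynamic selector" part of the original question.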
