How to scrape dynamic table data & save it to Excel

First of all, I know how to scrape data or a data table from a website,
and how to find an element even if it changes its value or position, by tweaking the selector.

But what about a data table on a website whose values change every few minutes?

Let me give you a scenario (please explain it to me).

There’s a site called

www.x-rates.com

Scrolling down, we see many currencies. Now choose one; let’s say Dollars.

Inside, there are 2 types of tables.

I need to scrape these 2 tables, whose values keep changing every now and then,
and save the data to an Excel file.

But we are not done yet.

One more question:

Coming back to the home page,

how do I build a workflow that individually selects each of the currencies present (56 in total)
and then scrapes the data table inside it?

Please help and guide me. Thanks!

Hello @Jiban_Kumar_Das ,

At least for the last problem, you should be able to write a flow that loops through the 56 currencies and opens the x-rates page directly to get to the table.

Example: For Euro, the URL would be:

https://www.x-rates.com/table/?from=EUR&amount=1

In general, the URLs would be:

https://www.x-rates.com/table/?from={{strCurrencySymbol}}&amount=1
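If you keep the currency symbols in a plain list, building those URLs is a one-liner. A minimal Python sketch (the symbols shown here are just examples; the real list is whatever you supply as input):

```python
# Build the per-currency x-rates URL from a list of symbols.
# The symbols below are examples; use your own universal list as input.
symbols = ["EUR", "USD", "INR"]
urls = [f"https://www.x-rates.com/table/?from={s}&amount=1" for s in symbols]

print(urls[0])  # https://www.x-rates.com/table/?from=EUR&amount=1
```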

If a currency on your list is also listed on x-rates, then that URL will be valid. You can confirm that the page is valid by checking the table title, like so:

(screenshot: the table title, with the currency name underlined)

The underlined part of the title selector would change with the currency you are looking up at that time. Again, this is one way of confirming that the currency is listed on x-rates.
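That confirmation step can be sketched in Python. `currency_is_listed` is a hypothetical helper, not part of any library; it simply checks whether the page HTML mentions the currency name, which is one cheap way to validate the page before scraping it.

```python
# Hypothetical validation helper: check that the fetched page actually
# mentions the currency name (e.g. in a "Euro Exchange Rates" title).
# A plain substring check is enough for this kind of sanity test.
def currency_is_listed(page_html: str, currency_name: str) -> bool:
    return currency_name.lower() in page_html.lower()

# Stand-in snippet of page HTML, for illustration only:
sample = "<h1>Euro Exchange Rates</h1>"
print(currency_is_listed(sample, "Euro"))  # True
print(currency_is_listed(sample, "Yen"))   # False
```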

The Robot has to know which URL to go to, so you must start with some universal list of currency symbols.

Currencies are independent entities, and it would not make sense for the Robot to depend on the x-rates drop-down to extract the list of currencies. Instead, it should rely on a general list that you provide as input.
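The overall loop could then look like the sketch below. It assumes pandas (with an HTML parser such as lxml) is available; `scrape_all` and `fetch` are names I made up for illustration. `fetch(sym)` should return something `pandas.read_html` can parse, e.g. a file-like object with that symbol's page HTML.

```python
# Sketch of the loop over your own currency list, assuming pandas.
import pandas as pd

def scrape_all(symbols, fetch):
    """Collect the rate tables for every symbol in the input list.

    fetch(sym) must return HTML that pandas.read_html can parse
    (a file-like object, or a URL when used over the network).
    """
    results = {}
    for sym in symbols:
        try:
            results[sym] = pd.read_html(fetch(sym))
        except ValueError:
            # pandas raises ValueError when no <table> is found,
            # i.e. the currency is probably not listed on x-rates
            results[sym] = None
    return results

# Real use would fetch each page over the network, e.g.:
#   from urllib.request import urlopen
#   rates = scrape_all(my_symbols,
#       lambda s: urlopen(f"https://www.x-rates.com/table/?from={s}&amount=1"))
```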

The solutions to the other two problems can be derived from the last one.
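For the save-to-Excel part of the original question, here is a minimal sketch, assuming pandas with an Excel engine such as openpyxl is installed. `pandas.read_html` returns one DataFrame per `<table>` on the page, so both rate tables can go to separate sheets of one workbook; `save_rate_tables` is a hypothetical helper name.

```python
import pandas as pd

def save_rate_tables(source, out_path):
    """Parse every <table> from `source` (a URL or file-like HTML)
    and write each one to its own sheet of an Excel workbook."""
    tables = pd.read_html(source)
    with pd.ExcelWriter(out_path) as writer:
        for i, table in enumerate(tables):
            table.to_excel(writer, sheet_name=f"table_{i + 1}", index=False)
    return tables

# For the Dollar page (needs network access):
# save_rate_tables("https://www.x-rates.com/table/?from=USD&amount=1",
#                  "usd_rates.xlsx")
```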

Thanks and I hope this helps.