Iterate search text on a website for data scraping

I have to use data scraping to collect data from a website.

I have a list of 14 texts that I need to search on the same website (say like )

What I have done separately (and it is working):
I used a For Each Row to cycle through the list.
I used the Data Scraping activity to collect the data from each search.

What I am not able to do is combine the two and automate the search.

Every time I enter a new search text, the search URL changes:

For hotel

For Restaurants
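The original URLs were not preserved in the post, but the underlying idea is that the search URL embeds the search term, so it can be built from the term instead of being hard-coded. A minimal sketch (the base URL below is hypothetical, standing in for the real site's search endpoint):

```python
from urllib.parse import quote_plus

def build_search_url(term: str) -> str:
    """Build the search URL for a given search text.

    The base URL is a placeholder for illustration only; substitute
    the real site's search URL pattern.
    """
    base = "https://www.example.com/search?find_desc="  # hypothetical base
    return base + quote_plus(term)  # URL-encode the term (spaces -> '+')

# Each of the 14 search texts produces its own URL the same way:
for term in ["hotel", "Restaurants"]:
    print(build_search_url(term))
```

In UiPath the same idea is expressed without code: pass the current row's text into a Navigate To or Open Browser activity as a string expression, rather than changing the URL by hand for each term.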

Every time, I have to change the Attach Browser activity so that it takes the new search text.

How can I automate it?
I used a flowchart.
I am very new to UiPath.
I have no coding background. I am using the Community edition. I have attached my file.
Yel_scrap_flow_readexcel_upload.xaml (24.9 KB)

Can anyone help out?



Can you share your CSV and Excel files?

Please use wildcards in your selectors and try.
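For illustration, a wildcard replaces the part of the selector that changes between searches. The attribute values below are hypothetical, not taken from the attached workflow, but show the idea:

```xml
<!-- Before: the search term is hard-coded, so the selector only
     matches the "hotel" results page -->
<html app='chrome.exe' title='hotel - Search Results' />

<!-- After: the changing part is replaced with a wildcard (*),
     so the same selector matches any search text -->
<html app='chrome.exe' title='* - Search Results' />
```

With the wildcard in place, the same Attach Browser activity works for every term in the list, so it no longer needs to be edited per search.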

Hi @monika.c,
The CSV and XLS files are in the zip folder attached below. (11.0 KB)

Could you explain this, please?

As I understand it, after searching it still gives me the same URL…

Hi @Sae_98,
Please remove Title or URL from your selectors. (16.9 KB)

Hi Pradeep,

Thank you for the file.
I added the screenshots.
I have a doubt: in the place where it says “Element exists”, what exactly should be added?

The flow stops there. I attached the browser page again to indicate whether the page is loaded or not. (19.2 KB)

Thank you

Here, “Element exists” is used to know whether the web page is loaded or not; it gives us a Boolean value.
I forgot to use that value in the flow.

Is it compulsorily needed, or can it be skipped?

The wait time can be given in DelayBetweenPages in the Extract Structured Data activity, right?

Yes, it is mandatory, because we cannot move forward without confirming whether the page is loaded or not.
For the Extract Structured Data activity: the timeout is given in DelayBetweenPages, so that the bot waits for the next page to load.

Oh, OK… thank you.
I will check that and get back to you.
Thanks for the reply!