I have built a workflow in which the bot downloads all the attachments in the browser (one set per search result).
The number of attachments varies per search result — some have four, some have seven. The way I built it is with seven Click activities, each with the "continue on error" option ticked, as per the screenshot below.
This means that if no Save element is found, the workflow just moves on to the next activity. May I know if there is a better way to build this? At the moment I have to wait quite a long time for the workflow to finish running.
Is there any way to improve it? I have also added a lot of Delay activities that are only applicable to certain processes.
How can I get UiPath to skip those steps when they don't apply to a given search result, instead of waiting for the bot to run through the full process?
If your search results are dynamic, don't hard-code 7 (or 4) Click activities.
First analyze the result page and see how the attachments are listed. There may be dynamic IDs in the selectors. Implement a dynamic workflow that downloads the attachments whether there are 4 or 7 of them.
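For instance (a sketch only — the tag and attribute names below are assumptions; inspect the real Save buttons with UiExplorer on your page), one Click activity can serve every row if its selector is parameterized with a row-index variable:

```xml
<!-- Hypothetical dynamic selector for the per-row Save button.
     rowIndex is an Int32 workflow variable injected via the
     {{variable}} syntax in the selector editor; only this value
     changes from row to row. -->
<webctrl tag='A' aaname='Save' tableRow='{{rowIndex}}' />
```

You would then loop rowIndex over however many rows were actually found, instead of keeping seven hard-coded clicks that silently fail when an element is missing.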
Is it possible to share the URL or the result page? It would help in giving an exact solution.
The URL requires access credentials, so I'm not able to share it. The screenshot below shows the result page where I need to click each button. I previously used Anchor Base, but the results were not good.
So what I did was detect the UI element by row (with the row number hard-coded as part of the selector criteria). It ran successfully.
Do you have any idea how I can build a more dynamic workflow that detects the Save button in each row?
In addition:
If you can get the download URL into the datatable (from scraping), or perhaps the selector for the download button, you could just use a For Each Row loop and add your actions directly.
I tried the scraping tool before, and no URL was found.
For the For Each Row approach, I don't know how to get UiPath to detect all the rows — that is, the UI element properties — because it always fails at a certain stage. Do you have any idea?
May I know whether Data Scraping can detect a variable number of results dynamically? That is, if eight rows appear in the search result it will detect all eight, and if only four rows appear it will detect all four.
Yes, Data Scraping will fetch all the rows. When you develop the data-scraping workflow, use a result set with more than two rows. Then, when the wizard asks for pattern examples, select the first element from the first row and the second element from the second row. You can scrape all the columns this way.
The reason is that the scraped data does not capture the empty rows. Based on my analysis, the structure looks like one data row followed by an empty row, then another data row followed by an empty row, with that pattern repeating for every data row. But the scraped data extracts only the data rows, without accounting for the empty rows.
It looks like you scraped the complete table. When you run Data Scraping, it always asks whether you want the complete table, Yes or No. It appears you chose Yes.
From the above screenshot, the actual rows start at row 5. So you can build the flow as follows:
Initialize a Count variable with the value 5.
I hope you have saved your scraped data into a datatable. By default it is named ExtractedDataTable; you can rename it if you want.
Open a For Each Row loop over the datatable.
Perform your Click activity, passing Count as the row id in the selector.
Increment the count by 1: Count = Count + 1.
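Put together, the loop might look like this (a sketch only — ExtractedDataTable is the default Data Scraping output name, and the selector tag/attribute names are assumptions to verify in UiExplorer):

```xml
<!-- Sketch of the flow:
       Assign : Count = 5                 (Int32, first real row id)
       For Each Row in ExtractedDataTable:
         Click  : Save button, using the selector below
                  (Count injected via {{Count}})
         Assign : Count = Count + 1
     Hypothetical selector for the Click activity: -->
<webctrl tag='A' aaname='Save' tableRow='{{Count}}' />
```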
Another way of doing it: when Data Scraping asks whether to get the full table, click No and scrape the data row by row, giving each column a proper name. Try this method and see how many rows you get and whether any are blank.