Hello all, I hope this post is appropriate and in the correct section… I searched and couldn’t find any similar posts. I have completed the Developer Foundation training and have been working my way through some of my own automations. One thing I noticed is the need to use Delay Before and Delay After on a Click activity to make sure my data scraping grabs the correct data.

A super high-level overview of the process I am trying to automate: I have an Excel file with ID numbers. I iterate through that collection, search for each number, click on additional details for that number, scrape the data presented on the screen, then go back to the search and do it again. Without any delay the robot seems to click back and forth too fast and the data is not scraped correctly; adding delays seems to alleviate this.

Has anyone else experienced issues when not using delays? Is relying on them against best practices, and are there better ways to solve the problem of incorrect scraping?
Most of the applications we are trying to automate are very old, so they can take time to load all their data, while UiPath works fast. There is a conflict between the two behaviours: if we don’t add delays, the robot searches for an item that is not yet present on the screen (often simply because the application is slow), which in turn throws exceptions.
So we should look at it from a human’s point of view. If the page has not loaded, we can’t see the data, so we wait until it appears. The robot should do the same (replicate human behaviour).
A best practice may be to use DataTables, because that reduces the burden of opening and closing Excel every single time.
And of course, use Delay Before and Delay After where needed.
For the Click activity, use SimulateClick.
Thanks @DAishwarya for your response. I agree the robot should replicate human behavior, but it should also be more efficient than a human. In my example, once I add a long enough delay it takes the robot about 2 hours to take about 500 ID numbers, search for each on a website, and scrape some data for each to Excel. Since I am still pretty new to the technology, it’s hard for me to gauge whether this is about average for a task like this or abnormally long.
(I’m guessing you may have already sorted this one, so this is perhaps more for advice of others who search for this post)
I suggest you try breaking this out into two separate processes:
- a first one (Dispatcher) to capture each search item (e.g. listed in Excel?) and create a queue transaction for it
- a second one (Performer) to do the scrape for each item, consuming the queue transactions.
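The Dispatcher/Performer split above can be sketched in plain Python for illustration — this is not UiPath code (in UiPath the queue would be an Orchestrator queue and the two loops would be separate processes), and the `scrape` function is a hypothetical stand-in:

```python
from queue import Queue

def dispatcher(id_numbers, queue):
    """Dispatcher side: add one queue transaction per ID read from Excel."""
    for id_number in id_numbers:
        queue.put(id_number)

def performer(queue, scrape):
    """Performer side: process transactions one at a time until the queue is empty."""
    results = {}
    while not queue.empty():
        id_number = queue.get()
        results[id_number] = scrape(id_number)
    return results

# Usage with a stand-in scrape function:
q = Queue()
dispatcher(["A100", "A101", "A102"], q)
print(performer(q, scrape=lambda i: f"details for {i}"))
```

The point of the split is that each transaction is independent, so a failure on one ID can be retried without re-running the whole batch.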
If you’ve done the Advanced Course (very highly recommended), the Enterprise Framework it teaches gives you the structure, and practice in using it.
Also, instead of fixed delays, use On Element Appear or Find Element, which wait until the element appears. Both have a default timeout, which should be long enough in most cases. If you’re working with a particularly slow app, wrap the On Element Appear / Find Element in a Try/Catch and repeat a few times if needed; if the element still doesn’t appear, set the Transaction Status to an ‘Application’ exception so the transaction can be retried.
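The wait-then-retry pattern described above can be sketched in plain Python for illustration — again not UiPath code; `find_element` here is a hypothetical stand-in for whatever check confirms the data has loaded:

```python
import time

class ApplicationException(Exception):
    """Raised when the element never appears, so the transaction can be retried."""

def wait_for_element(find_element, timeout=30.0, poll=0.5):
    """Poll until find_element() returns something truthy, or time out.
    This mirrors what On Element Appear / Find Element do for you."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        element = find_element()
        if element:
            return element
        time.sleep(poll)
    raise TimeoutError("element did not appear")

def scrape_with_retries(find_element, max_attempts=3):
    """Try/Catch the wait and repeat a few times before giving up."""
    for attempt in range(max_attempts):
        try:
            return wait_for_element(find_element, timeout=5.0)
        except TimeoutError:
            continue  # element not there yet; retry
    # Still not there: flag the transaction as an Application exception
    raise ApplicationException("element not found after retries")
```

The key difference from a fixed delay is that the robot proceeds the moment the element appears, so fast pages cost no extra time and only genuinely slow pages wait.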
Bear in mind also that to get the most out of automation, and the increased throughput that may be one of the business objectives, some investment in the old infrastructure underpinning a very slow application may be necessary.
Hope that helps (someone)