How to reduce robot execution time?

Hello Everyone,

Here I am explaining the process step by step:

Step 1 - Read 7,000 URLs from an Excel sheet
Step 2 - Iterate over them one by one
Step 3 - Open the browser and extract information from the HTML page
Step 4 - Download the HTML page as a text file and save it locally
Step 5 - Update the status report of the process
Step 6 - Generate a hash code if the file was downloaded/saved successfully
Then the loop repeats for the next URL.

The process is taking 28 hours for 7,000 URLs, and I need to reduce the bot's execution time.

Thanks in advance.

Load the spreadsheet into a queue and create your automation to pull from the queue. Then run multiple Jobs simultaneously. This is how you scale an automation.

28 hours for doing the above 7,000 times is actually quite good. That works out to about 14 seconds per URL.
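The queue-plus-multiple-jobs idea can be sketched with plain Python as a stand-in for Orchestrator queues and jobs (this is not UiPath code; the URLs, worker count, and the body of `process` here are hypothetical placeholders):

```python
# Sketch of "pull work items from a shared queue with N workers".
# In UiPath this maps to an Orchestrator Queue plus multiple Jobs.
from concurrent.futures import ThreadPoolExecutor

def process(url: str) -> str:
    # Placeholder for: open page, extract info, save HTML, hash it.
    return f"done: {url}"

# Stand-in for the 7,000 rows read from the Excel sheet.
urls = [f"https://example.com/page/{i}" for i in range(10)]

# Five workers consume the URL list concurrently. Because the work is
# I/O-bound (page loads), wall-clock time drops roughly by the worker count.
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(process, urls))
```

`pool.map` preserves input order, so the status report can still be written back row by row against the original sheet.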


Paul’s suggestion is perfect: you need a multi-bot architecture, because there isn’t much to optimise based on the description provided.
Along with that, a few things you can try:

  1. Don’t open a new browser instance every time. Keep the browser open (capture the browser variable) and navigate to the new URL in the same instance.
  2. Try using WebDriver. It works in the background with no UI rendering, so CPU/GPU and RAM consumption is much lower and page loading is much faster (see "About the WebDriver Protocol").
  3. If you only need to download the HTML into a text file, try the HTTP Request activity. First check in the browser's network tab how the page loads; if the initial request directly returns the whole HTML content, this will be the best option (https://docs.uipath.com/activities/docs/http-client).
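Suggestion 3 can be sketched in plain Python (again, not UiPath code — the URL and destination path are hypothetical): one HTTP GET replaces a full browser page load, and hashing the saved file covers the "downloaded successfully" check from the original process.

```python
# Sketch: fetch raw HTML over HTTP, save it locally, and hash the file
# to confirm the download completed.
import hashlib
import urllib.request
from pathlib import Path

def download_and_hash(url: str, dest: Path) -> str:
    # Single HTTP request instead of rendering the page in a browser.
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read()
    dest.write_bytes(html)
    # SHA-256 of the saved file doubles as the success/integrity check.
    return hashlib.sha256(dest.read_bytes()).hexdigest()
```

This only works when the server returns the full HTML in the initial response; pages built client-side by JavaScript still need a browser or WebDriver.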