UiPath web automation becomes very slow after each iteration

Hi,

I am working on a web automation with a list of URLs, and within each URL I have a nested loop that scrapes information. The problem is that it gets slower with each iteration.

I have tried:

  1. Cleared the logs
  2. Cleared the DataTable after each iteration
  3. Cleared the cache after every N scrolls
  4. Checked CPU and memory (constant at ~30%)
  5. Invoked code to flush memory with GC.Collect after each URL
  6. Tried running from Orchestrator and from Studio — makes no difference
  7. Added a delay after every N scrolls and before GC.Collect

I'm not sure what UiPath is doing in the background that causes this behavior, and I'd appreciate an explanation. I have cleared everything possible, but the issue persists. However, when I manually stop and rerun the process, it works fine again. Is there any way I could automate that manual stop-and-rerun after each iteration?

I noticed the same thing, and I checked RAM and CPU — neither is throttling. So it must be some bug in UiPath web automation. I'm using the Chrome browser and version 2022.10.

A possible solution is to close and reopen the web browser (or whatever application you're interacting with) after a certain number of iterations. This is equivalent to manually stopping and re-running the process, and it can help release memory or other system resources that were never freed.

In your UiPath automation flow, this could look like:

  1. Start a loop that will process your list of URLs.
  2. For each URL, start a second nested loop that handles the scraping for that URL.
  3. Inside the second loop, after N iterations (or scrolls, or however you’re measuring), close the application (web browser) and reopen it.
  4. Continue processing the rest of the URLs.
  5. At the end of each outer URL iteration, you could also call GC.Collect() to initiate a garbage collection and try to free up additional memory.
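UiPath workflows are usually built from activities rather than code, but the restart-every-N pattern in the steps above can be sketched in plain Python. Everything here (`open_browser`, `close_browser`, `scrape_page`, the restart interval) is a hypothetical stand-in for the corresponding UiPath activities, not a real UiPath API:

```python
import gc

RESTART_EVERY = 5  # restart the browser after this many URLs (tune as needed)

def open_browser():
    # Stand-in for UiPath's "Open Browser" / "Use Application/Browser" activity
    return {"open": True}

def close_browser(browser):
    # Stand-in for "Close Tab" or killing the browser process
    browser["open"] = False

def scrape_page(browser, url):
    # Stand-in for the nested scraping loop run against a single URL
    return f"data from {url}"

def process_urls(urls):
    results = []
    browser = open_browser()
    for i, url in enumerate(urls, start=1):
        results.append(scrape_page(browser, url))
        # After every N iterations, close and reopen the browser so the
        # OS reclaims whatever memory the session has accumulated.
        if i % RESTART_EVERY == 0:
            close_browser(browser)
            gc.collect()  # mirrors the GC.Collect() step in the list above
            browser = open_browser()
    close_browser(browser)
    return results

urls = [f"https://example.com/page/{n}" for n in range(12)]
print(len(process_urls(urls)))  # 12
```

In a real workflow, the `if i % RESTART_EVERY == 0` check would be an If activity wrapping Close Tab / Open Browser, with the counter kept in a workflow variable.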

The slowdown might not be related to UiPath and its processing. Instead, it could indeed be due to web browsers’ memory management. Web browsers (especially complex, modern ones) can consume significant system resources, especially during extended periods of use or when handling resource-intensive tasks (like web scraping).

Browsers can develop memory leaks, consuming more RAM over time because unused memory is never released. They can also use more CPU as they process JavaScript, render pages, handle extensions, and so on.

This is why the strategy of closing and reopening the browser after certain intervals can help to mitigate this issue. Each time the browser is closed, the system resources taken up by that browser are typically freed up again, which can then be utilized in the subsequent sessions.

This points to a browser-related issue (specifically, an issue with how the browser manages system resources) rather than an issue with UiPath itself. It's a useful distinction to make, and important for troubleshooting the root cause of problems like this.
