Robot Optimization

Hi All,

In my organization, we have recently started with UiPath automation, so I wanted to understand how to use robots in an optimized way.
Are there any guidelines or suggestions that UiPath provides for utilizing a robot across multiple processes? For example, should we assign processes to robots based on the applications used in the processes, based on running time / workload, or on anything else that would optimize robot usage?
Any suggestions would help.

Thanks…!!

Hello @shubham!

It seems that you have trouble getting an answer to your question in the first 24 hours.
Let us give you a few hints and helpful links.

First, make sure you have browsed through our Forum FAQ Beginner’s Guide. It will teach you what should be included in your topic.

You can check out some of our resources directly, see below:

  1. Always search first. It is the best way to quickly find your answer. Use the search icon for that.
    Clicking the options button will let you set more specific topic search filters, e.g. only topics with a solution.

  2. The topic that contains the most common solutions with example project files can be found here.

  3. Read our official documentation, where you can find a lot of information and instructions about each of our products.

  4. Watch the videos on our official YouTube channel for more visual tutorials.

  5. Meet us and our users on our Community Slack and ask your question there.

Hopefully this will let you easily find the solution/information you need. Once you have it, we would be happy if you could share your findings here and mark it as a solution. This will help other users find it in the future.

Thank you for helping us build our UiPath Community!

Cheers from your friendly
Forum_Staff

Hi @shubham

Let’s ask our @MVP2020 members about that one :slight_smile:

In my mind, the most important factors would be process priority vs. process runtime. If you have a bunch of processes that are time-sensitive, you don’t want them stuck behind a process that is lower priority but takes an hour to run.
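
For illustration, here is a tiny Python sketch of that ordering rule (priority first, shortest expected runtime next within the same priority). The process names and estimated runtimes are made up for the example, and nothing in it is UiPath-specific.

```python
# Toy illustration of "priority vs runtime": highest priority first, and
# within the same priority the shortest expected runtime first, so an
# hour-long low-priority job never blocks short time-sensitive ones.
# Process names and runtimes below are invented for the example.
pending_jobs = [
    {"process": "MonthlyReport",   "priority": 1, "est_minutes": 60},
    {"process": "InvoiceApproval", "priority": 3, "est_minutes": 5},
    {"process": "PasswordReset",   "priority": 3, "est_minutes": 2},
    {"process": "DataArchive",     "priority": 2, "est_minutes": 30},
]

# Sort: higher priority first, then shorter runtime first.
run_order = sorted(pending_jobs, key=lambda j: (-j["priority"], j["est_minutes"]))

for job in run_order:
    print(f'{job["process"]:16} priority={job["priority"]} ~{job["est_minutes"]} min')
```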

Hi @loginerror,

Thanks for the reply. I am also thinking that it may depend on runtime / workload, but I wanted to know if there are any other factors that should be considered, as I don’t want to waste another robot license if we can optimize what we have.
So maybe I’ll just wait and check if anyone has any other inputs to add.

Thanks!

Hi.

I would highly suggest gaining a good understanding of how Queues work and how a process can be broken down into transactions. This allows you to prioritize jobs/transactions. Eventually, if not already, the Queue will be able to trigger automatically when new transactions are added. However, there are some efficiency concerns with this, such as closing applications or logging off between each transaction, or slower, less intuitive error handling.

When multiple robots run in parallel, also keep in mind that applications and files are used by multiple users at the same time, so the process needs extra robustness to handle this, for example wrapping the step that outputs result data to Excel in a Retry Scope. There are various ideas surrounding this.
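
As a rough sketch of the dispatcher side of that idea, the snippet below adds transactions to an Orchestrator queue through the REST API so that any available robot can pick them up. It assumes a classic on-premises Orchestrator; the URL, tenant, credentials and the queue name InvoiceQueue are placeholders, and the exact endpoint paths can differ between Orchestrator versions.

```python
# Sketch: add transactions to an Orchestrator queue (dispatcher side).
# Assumes an on-premises Orchestrator; URL, tenant, credentials and the
# queue name "InvoiceQueue" are placeholders, and endpoint paths may vary
# by Orchestrator version.
import requests

ORCH_URL = "https://orchestrator.example.com"  # placeholder

def authenticate(tenant, user, password):
    """Get a bearer token from the classic on-prem authentication endpoint."""
    resp = requests.post(
        f"{ORCH_URL}/api/account/authenticate",
        json={"tenancyName": tenant,
              "usernameOrEmailAddress": user,
              "password": password},
    )
    resp.raise_for_status()
    return resp.json()["result"]

def add_queue_item(token, queue_name, reference, payload):
    """Add one transaction so any available robot can pick it up."""
    item = {
        "itemData": {
            "Name": queue_name,
            "Reference": reference,      # used for lookup / deduplication
            "Priority": "Normal",        # High/Normal/Low drives processing order
            "SpecificContent": payload,  # the actual transaction data
        }
    }
    resp = requests.post(
        f"{ORCH_URL}/odata/Queues/UiPathODataSvc.AddQueueItem",
        json=item,
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    token = authenticate("Default", "admin", "********")
    add_queue_item(token, "InvoiceQueue", "INV-0001",
                   {"InvoiceNumber": "INV-0001", "Amount": "125.40"})
```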

If automatic, trigger-based allocation of transactions will not be used, you should consider adding xaml code to your process that looks at the available robots and starts the job on several of them, so the transactions are handled more quickly in parallel. This can work well, but with this technique you will eventually need to leave a few robots free for other prioritized jobs that may need to run. To make this idea work well, it would be ideal to dynamically adjust the number of robots being used to accommodate times when robot utilization is very busy and times when it is not. For example, if a process can use 8 robots but the mornings are very busy, it might dynamically run on 2 or 3 and then add more as robots become available.
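
A minimal sketch of that kind of logic, written against the Orchestrator REST API in Python rather than in xaml, could look something like the following. The base URL, release key, reserve size and cap are placeholders, and the Sessions/StartJobs endpoints may differ between Orchestrator versions, so treat it as an outline only.

```python
# Sketch: check how many robots are free and start a job on only a subset,
# leaving some headroom for higher-priority processes. Assumes the same
# Orchestrator base URL and bearer token as in the earlier snippet; the
# release key, reserve size and cap are placeholders.
import requests

ORCH_URL = "https://orchestrator.example.com"  # placeholder

def count_available_robots(token):
    """Count robot sessions currently reported as 'Available'."""
    resp = requests.get(
        f"{ORCH_URL}/odata/Sessions",
        params={"$filter": "State eq 'Available'", "$count": "true"},
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()["@odata.count"]

def start_jobs(token, release_key, jobs_count):
    """Ask Orchestrator to start the given number of jobs for a release."""
    body = {"startInfo": {
        "ReleaseKey": release_key,
        "Strategy": "JobsCount",   # let Orchestrator choose which robots run them
        "JobsCount": jobs_count,
    }}
    resp = requests.post(
        f"{ORCH_URL}/odata/Jobs/UiPath.Server.Configuration.OData.StartJobs",
        json=body,
        headers={"Authorization": f"Bearer {token}"},
    )
    resp.raise_for_status()
    return resp.json()

def scale_out(token, release_key, reserve=2, cap=8):
    """Use free robots for this process, but keep 'reserve' robots untouched
    and never exceed 'cap' parallel jobs."""
    free = count_available_robots(token)
    to_start = max(0, min(cap, free - reserve))
    if to_start:
        start_jobs(token, release_key, to_start)
    return to_start
```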

Either way, your process development and ideas will evolve over time and as UiPath updates their products. So be prepared for older processes that are not utilized as well as newer ones as time goes on.

This is from my perspective using unattended robots that run on a server behind the scenes.

Regards
