I would highly suggest getting a good understanding of how the Queue works and how a process can be broken down into transactions. This lets you prioritize jobs and transactions across your robots. Eventually, if not already, the Queue will be able to automatically trigger a job when new transactions are added. However, there are some efficiency concerns with this, such as closing applications or logging off between each transaction, or slower, less intuitive error handling. Also, when multiple robots run in parallel, keep in mind that applications and files are being used by multiple users at the same time, so you need extra robustness to handle this, such as wrapping attempts to output results data to Excel in a Retry Scope. There are various ideas surrounding this.
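The Retry Scope pattern can be sketched outside of UiPath as well. Here is a minimal Python illustration of the same retry idea for contended shared files; the function names and the simulated locked-file error are my own for illustration, not part of any UiPath API:

```python
import time

def retry_scope(action, max_attempts=3, delay_seconds=0.0):
    """Re-run `action` until it succeeds or attempts run out,
    mirroring the behavior of a Retry Scope around a write step."""
    last_error = None
    for attempt in range(1, max_attempts + 1):
        try:
            return action()
        except Exception as err:
            last_error = err
            if attempt < max_attempts:
                time.sleep(delay_seconds)  # pause between retries
    raise last_error

# Simulate a shared results file that is locked by another robot
# for the first two attempts, then becomes available.
state = {"calls": 0}

def write_results():
    state["calls"] += 1
    if state["calls"] < 3:
        raise IOError("results.xlsx is locked by another user")
    return "written"

print(retry_scope(write_results, max_attempts=5, delay_seconds=0))  # -> written
```

In a real process the retry interval would be longer, and you might also randomize it slightly so two robots that collide don't retry in lockstep.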
If automatic triggering of transactions won't be used, you should consider adding xaml code to your process that looks at the available robots and triggers the job on several of them, so the transactions are handled more quickly in parallel. This can work well, but with this technique you will eventually need to leave a few robots available for other prioritized jobs that may need to run. To make this work well, it would be ideal to dynamically adjust the number of robots being used based on how busy overall robot utilization is. For example, if you have a process that can use 8 robots but the environment is very busy in the morning, it might run on 2 or 3 and then add more as robots become available.
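The allocation rule described above can be expressed as a small piece of logic. This is a hypothetical Python sketch of the decision only; the function name, parameters, and reserve count are my own assumptions, and in practice you would get the available-robot count from Orchestrator and start the jobs through it:

```python
def robots_to_launch(available_robots, process_cap, reserve=2):
    """Decide how many robots this process should claim right now,
    leaving `reserve` robots free for higher-priority jobs."""
    usable = max(available_robots - reserve, 0)  # never dip into the reserve
    return min(usable, process_cap)              # never exceed the process's cap

# Busy morning: only 5 robots free, so claim 3 and leave 2 in reserve.
print(robots_to_launch(5, 8))   # -> 3
# Quieter afternoon: 10 robots free, so run at the full cap of 8.
print(robots_to_launch(10, 8))  # -> 8
```

Re-running this check periodically while the queue still has items is what gives you the "start small, add more as robots free up" behavior.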
Either way, your process development and ideas will evolve over time and as UiPath updates their products. So be prepared to have older processes that aren't utilized as well as newer ones, as time goes on.
This is from my perspective using unattended robots that run on a server behind the scenes.