Performance Advantage of the High-Density Robot Model?

I’ve successfully used the high-density robot model for about six months now. The model is touted as “maximizing resources,” since the server environment is reused instead of spinning up new client VMs alongside the server. As I prepare to build a brand-new system, I’m wondering whether there is any performance advantage to the high-density setup compared with a standard setup of one bot per VM.

I do find some disadvantages in the high-density robot model:

  • Changes to installed applications apply system-wide, which may not be desired — for example, settings for Office products affect every robot.
  • With Studio open, especially multiple instances, it’s fairly easy for a single user to consume all of the server’s processing resources.
  • It’s extremely easy for log files to consume all server storage, which ultimately causes every robot on the server to fail.
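On the last point, one mitigation is a scheduled cleanup job that prunes old logs before the disk fills. Here is a minimal POSIX-shell sketch; the `LOG_DIR` path and 14-day retention window are assumptions to adjust for your environment (on a Windows robot server you’d write the equivalent in PowerShell or Task Scheduler):

```shell
#!/bin/sh
# Hypothetical log directory -- point this at wherever your robots write logs.
LOG_DIR="${LOG_DIR:-/var/log/robots}"
# Assumed retention window; tune to your audit requirements.
RETENTION_DAYS="${RETENTION_DAYS:-14}"

mkdir -p "$LOG_DIR"

# Delete robot log files older than the retention window.
find "$LOG_DIR" -type f -name '*.log' -mtime +"$RETENTION_DAYS" -delete

# Report remaining usage so a monitoring job can alert before the disk fills.
du -sh "$LOG_DIR"
```

Running this from cron (or the Windows equivalent) once a day keeps a runaway robot from taking down every other robot on the shared server.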

Is there a performance advantage to running robots in a high-density setup? And what about when the robots share the same server with Orchestrator or Elasticsearch/Kibana?