I thought I would throw this idea out there, because it already happens occasionally and could get worse down the road.
When someone deletes a Robot or simply changes its Environment, every Schedule set up to run on that specific Robot has it silently unchecked and must be fixed afterwards. This forces Robot managers and developers to put additional processes in place to prevent changes that would stop scheduled jobs from running the next time. The current workarounds are limited:
- Live with the known issue until you can use the “Number of Robots” parameter instead of a Specific Robot.
- Simply never delete a Robot or change its Environment.
- Document all the Robot assignments for each Scheduled Job as a fallback, and after any edit, use those details to re-check the Robots on every affected Schedule.
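The last workaround above can be partly automated. Here is a minimal sketch, assuming you keep a documented snapshot of Schedule-to-Robot assignments before making Environment edits; all names and data shapes below are illustrative, not the actual Orchestrator API:

```python
# Hypothetical helper: detect Schedules whose Robot assignments were lost
# after a Robot was deleted or moved to another Environment.
# The schedule and robot names are made up for illustration.

def find_broken_schedules(documented, current):
    """Compare a documented snapshot of Schedule -> Robots mappings
    against the current state and report Robots that became unchecked."""
    broken = {}
    for schedule, robots in documented.items():
        missing = sorted(set(robots) - set(current.get(schedule, [])))
        if missing:
            broken[schedule] = missing
    return broken

# Snapshot taken before the Environment edit (the "documented details").
documented = {
    "NightlyInvoices": ["Robot-01", "Robot-02"],
    "WeeklyReport": ["Robot-03"],
}

# State after Robot-02's Environment changed: it was silently unchecked.
current = {
    "NightlyInvoices": ["Robot-01"],
    "WeeklyReport": ["Robot-03"],
}

print(find_broken_schedules(documented, current))
# -> {'NightlyInvoices': ['Robot-02']}
```

The same comparison could be fed from periodic exports of the Schedule list, so the team knows exactly which Schedules to re-check instead of auditing all of them by hand.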
I think editing Environments might be a common thing to do, and there are benefits to it, such as temporarily blocking jobs from accessing a Robot or making occasional structural changes. Before long, a team might have 100 Schedules, so fixing them by hand becomes a very tedious task.
Possible solution? When a Robot's Environment changes, keep it checked in the Schedule but hide it from the Robot list while a job or Schedule is running, so it simply becomes visible again when the Environment is changed back.
I don’t know if this will make it onto the priority list, but thanks for reading.