When I start a job from the robot tray, it loads all of the spreadsheet's items into a queue in Orchestrator. The first time I started the job, it loaded my queue with 800 items. The system halted, so I had to start the job again, which loaded the same items onto the queue a second time. How can I make it pick up from where it left off?
The best practice is to clear the queue items manually when testing the data. Once the items are loaded for the final version of your project, you will not need to load the data again, but for now, it will likely be simpler to remove them manually.
If the dataset is so large that this takes too long, create a smaller file to use for testing in the meantime.
It is also good practice to mark the Reference as unique, so no matter how badly things go, you won't end up with duplicate items in the queue…
Will turning this on avoid dumping the same data items into the queue when the robot restarts? And will the bot pick up where it left off, i.e., start on the next new item in the queue?
This will prevent putting duplicate items in, but it will not result in the bot resuming where it left off.
However, I suppose you could enable ContinueOnError on the activity that loads each queue item. That way the dispatcher will move on to the next row to be loaded into the queue. Note that this will not work if you're using Bulk Add Queue Items.
You can use the Unique Reference option to avoid adding the same queue item twice.
You have to choose a reference value when using the Add Queue Item activity in Studio, and make sure that when creating the queue in Orchestrator you set the Unique Reference option to Yes.
If you are loading the queue in the Init state, it is better to wrap the Add Queue Item activity in a Try Catch, because with Unique Reference set to Yes, a duplicate reference throws a system error that will end your execution.
The Try Catch lets the automation proceed past duplicates, which is what you want when using the REFramework.
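The Try Catch pattern above can be sketched in plain Python. This is only a simulation of Orchestrator's Unique Reference behavior, not a UiPath API: `Queue`, `DuplicateReferenceError`, and the `InvoiceId` reference field are all illustrative stand-ins.

```python
# Simulated queue with Unique Reference = Yes: a repeated reference is
# rejected, mirroring the system error Orchestrator would throw.
class DuplicateReferenceError(Exception):
    """Raised when an item with the same Reference already exists."""

class Queue:
    def __init__(self):
        self._items = {}

    def add_item(self, reference, payload):
        if reference in self._items:
            raise DuplicateReferenceError(reference)
        self._items[reference] = payload

def load_rows(queue, rows):
    """Add each row; skip duplicates instead of ending the run."""
    added, skipped = 0, 0
    for row in rows:
        try:
            queue.add_item(row["InvoiceId"], row)
            added += 1
        except DuplicateReferenceError:
            skipped += 1  # already queued on a previous run - move on
    return added, skipped
```

On the first run everything is added; if the dispatcher is restarted over the same spreadsheet, every row errors as a duplicate and is skipped, so the queue ends up with no repeats.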
Hope this helps
Thanks
Picking up where it left off is more about YOU deleting rows from the Excel file after they are successfully added to the queue; the unique reference is just a way to guarantee that, even in the worst case, the queue won't contain duplicates… Put the Add Queue Item step inside a Try Catch, so if the item already exists it errors and the loop moves on to the next row…
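The "remove from the source after a successful add" idea can be sketched like this. `add_to_queue` is a hypothetical stand-in for the Add Queue Item step; the point is that a row leaves the working list only after the add succeeded, so whatever remains is exactly what the next run must still load.

```python
def dispatch(rows, add_to_queue):
    """Enqueue rows front-to-back; return the unprocessed tail on failure."""
    remaining = list(rows)
    while remaining:
        try:
            add_to_queue(remaining[0])
        except Exception:
            break              # halted mid-run: persist `remaining` back to the file
        remaining.pop(0)       # remove the row only after the add succeeded
    return remaining           # rows still to load on the next run
```

In a real workflow you would write `remaining` back to the spreadsheet (or mark those rows) so that a restarted dispatcher resumes from the first row that never made it into the queue.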
Maybe this activity helps you run a reconciliation before uploading the remaining items.
Kindly note the limit of 100 items per request, but there is page-skip functionality available.
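A reconciliation pass under that 100-item cap could look like the sketch below. `fetch_page(skip, top)` is a hypothetical wrapper around the queue-reading activity/API; the paging logic (cap of 100, advance by a skip offset) mirrors the limit noted above.

```python
PAGE_SIZE = 100  # per-request cap mentioned above

def fetch_all_references(fetch_page):
    """Page through existing queue items and collect their references."""
    refs, skip = set(), 0
    while True:
        page = fetch_page(skip=skip, top=PAGE_SIZE)
        refs.update(item["Reference"] for item in page)
        if len(page) < PAGE_SIZE:   # short page means we reached the end
            break
        skip += PAGE_SIZE
    return refs

def rows_still_to_upload(rows, queued_refs):
    """Keep only the source rows whose reference is not yet queued."""
    return [r for r in rows if r["Reference"] not in queued_refs]
```

After the crash, the dispatcher would fetch all queued references once, then upload only the rows that are missing.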
@michaelamay0
You should separate the robot into two robots:
- robot1: Load data to queue
- robot2: Process data in queue
This structure helps you scale when the queue data becomes large.
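The two-robot split above can be sketched as a minimal dispatcher/performer pair. The `deque` here stands in for an Orchestrator queue; the names are illustrative, not UiPath APIs.

```python
from collections import deque

def dispatcher(rows, queue):
    """Robot 1: load the source data into the queue."""
    for row in rows:
        queue.append(row)

def performer(queue, process):
    """Robot 2: drain the queue one transaction at a time."""
    done = 0
    while queue:
        process(queue.popleft())
        done += 1
    return done
```

Because the two halves only meet at the queue, the dispatcher can finish (or crash and rerun with unique references) independently of processing, and you can run several performers against the same queue as the data grows.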