How to avoid loading items onto the queue twice

When I start a job from the robot tray, it loads all of the spreadsheet's items onto a queue in Orchestrator. I initially started the job and it loaded my queue with 800 items. The system halted, so I had to start the job again, which led to the job loading the same items onto the queue a second time. How can I make it pick up from where it left off?

The best practice is to clear the queue items manually when testing the data. Once the items are loaded for the final version of your project, you will not need to load the data again, but for now, it will likely be simpler to remove them manually.

If the dataset is so large that this takes too long, create a smaller file to use for testing in the meantime.

It is also good practice to make the Reference unique, so no matter how badly things go, you won't have duplicated items in the queue…
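To make the effect of a unique reference concrete, here is a plain-Python sketch of what a unique-reference queue does (the function names are illustrative, not UiPath APIs): items whose reference already exists are rejected, so re-running the loader adds nothing new.

```python
def add_to_queue(queue, seen_references, reference, payload):
    """Add an item only if its reference has not been queued before."""
    if reference in seen_references:
        return False  # duplicate: Orchestrator would raise an error instead
    seen_references.add(reference)
    queue.append({"reference": reference, "payload": payload})
    return True

queue, seen = [], set()
rows = [("INV-001", "row 1"), ("INV-002", "row 2"), ("INV-001", "row 1 again")]
for ref, data in rows:
    add_to_queue(queue, seen, ref, data)

print(len(queue))  # 2: the second "INV-001" was skipped
```

Restarting the dispatcher is then harmless: every reference it tries to re-add is already in the queue and gets rejected.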


Will turning this on avoid dumping the same data items into the queue when the robot restarts? Will the bot pick up from where it left off, i.e. at the next new item in the queue?

This will prevent putting duplicate items in, but it will not result in the bot resuming where it left off.

However, I suppose you could enable ContinueOnError when you load each queue item. That way the dispatcher will move on to the next row to be loaded to the queue. This will not work if you’re using Bulk Add Queue Items.
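The ContinueOnError idea above can be sketched in Python as a try/except around each add (a hypothetical `DuplicateReferenceError` stands in for the system error Orchestrator raises on a duplicate reference): a failure on one row skips that row instead of ending the whole run.

```python
class DuplicateReferenceError(Exception):
    """Stand-in for the error a unique-reference queue raises on duplicates."""

def add_queue_item(existing, reference):
    if reference in existing:
        raise DuplicateReferenceError(reference)
    existing.add(reference)

existing = {"INV-001"}            # item loaded before the crash
rows = ["INV-001", "INV-002", "INV-003"]

loaded, skipped = 0, 0
for ref in rows:
    try:
        add_queue_item(existing, ref)
        loaded += 1
    except DuplicateReferenceError:
        skipped += 1              # already queued: move on to the next row

print(loaded, skipped)  # 2 1
```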

Hi @michaelamay0

You can use the Unique Reference option to avoid duplicating the same queue item.

You have to supply a reference value in the Add Queue Item activity in Studio, and when creating the queue in Orchestrator, make sure the Unique Reference option is set to Yes.

If you are adding queue items in the Init state, it is better to wrap the Add Queue Item activity in a Try Catch, because with Unique Reference set to Yes, a duplicate will throw a system error and end your execution.

The Try Catch avoids this, so the automation will proceed when you are using the REFramework.

Hope this helps



Picking up where it left off is more about YOU deleting rows from Excel after successfully adding them to the queue, but this is a way to make sure that even in the worst case, it won't duplicate… You would put the add-to-queue part in a Try/Catch, so if the item already exists it errors and moves on to the next one…
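The delete-after-add idea above can be sketched like this (a simplified model: a Python list stands in for the Excel file, and the crash is simulated; a real dispatcher would rewrite the spreadsheet after each successful add). On restart, the loader only sees rows that were never queued.

```python
pending = ["row1", "row2", "row3", "row4"]   # stand-in for the spreadsheet

def dispatch(pending, fail_after=None):
    """Add rows to the queue, dropping each from the source as it succeeds."""
    queued = []
    while pending:
        if fail_after is not None and len(queued) >= fail_after:
            raise RuntimeError("simulated crash")   # e.g. the system halted
        queued.append(pending.pop(0))               # add + remove from source
    return queued

try:
    dispatch(pending, fail_after=2)    # first run crashes after two rows
except RuntimeError:
    pass

resumed = dispatch(pending)            # second run picks up where it left off
print(resumed)  # ['row3', 'row4']
```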


Maybe this activity helps you run a reconciliation before uploading the remaining items.

Kindly note the limit of 100 items per request, but there is page-skip functionality available.
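A generic sketch of paging past that 100-item limit: fetch the existing queue references page by page using a skip offset, then upload only the rows not already present. `fetch_page` is a stand-in for the real activity, and the in-memory `EXISTING` list is fake data.

```python
EXISTING = [f"INV-{i:04d}" for i in range(250)]   # pretend queue contents

def fetch_page(skip, top=100):
    """Return at most `top` references, starting at offset `skip`."""
    return EXISTING[skip:skip + top]

def all_existing_references():
    refs, skip = set(), 0
    while True:
        page = fetch_page(skip)
        if not page:
            break
        refs.update(page)
        skip += len(page)          # skip past what we have already read
    return refs

existing = all_existing_references()
rows = [f"INV-{i:04d}" for i in range(240, 260)]
remaining = [r for r in rows if r not in existing]
print(remaining)  # only INV-0250 .. INV-0259 still need uploading
```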

You should separate the robot into two robots:

  • robot1: loads data onto the queue
  • robot2: processes data from the queue

This structure helps you scale the number of robots when the queue data grows large.

Can I use this with Bulk Add Queue Items?

Hi @Sira

You can use the Bulk Add Queue Items activity as well; for that you need a DataTable, and you need to create a column which will be your unique reference.
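The table-preparation step above can be sketched like this (plain Python dictionaries stand in for the DataTable; the column names are illustrative): build the reference from the data, then drop duplicate rows before uploading, since, depending on its commit settings, a bulk add may fail when it hits a duplicate reference.

```python
rows = [
    {"invoice": "INV-001", "amount": 100},
    {"invoice": "INV-002", "amount": 250},
    {"invoice": "INV-001", "amount": 100},   # duplicate of the first row
]

table, seen = [], set()
for row in rows:
    reference = row["invoice"]          # column used as the unique reference
    if reference in seen:
        continue                        # duplicates could make the bulk add fail
    seen.add(reference)
    table.append({**row, "Reference": reference})

print([r["Reference"] for r in table])  # ['INV-001', 'INV-002']
```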

Check below link for your reference

If this answers your question click on Mark as Solution


Using Build Data Table, I will create a reference column. Will that stop duplication of queue items when running on REFramework?

@Sira Yes, in Orchestrator you need to enable the Unique Reference option while creating the queue; that takes care of duplicates.



When I run it, it says a reference is required for unique-reference queues. I added a 'reference' column in my Build Data Table. I'm using Bulk Add Queue Items.
There's no option in the bulk activity to set that.

Do you have data under that 'reference' column? There is no option to set it in the activity itself.


Yes, I added data in the reference column of the Build Data Table.

This is how I have it set up; is this the right way?