How to update the same asset from multiple robots

I would like to update a single asset value in Orchestrator from multiple robots that are running in parallel. How do I avoid concurrency problems when the same asset is updated by multiple robots?

Same problem here!

Why are you trying to do this? What is the goal you are trying to accomplish? Assets may not be the best way.

I have a string dictionary with the values of the Excel files I need to create.

I serialized this dictionary and set an asset with all of these values.

Seven servers consume this dictionary, processing the values one by one.

I set an “active” flag to prevent two servers from creating the same document.

The problem is that two or more servers can set the asset flag at the same time and end up creating the same document. I need to avoid this, since it is a single process running on seven servers in one environment.
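To make the race concrete: with Get Asset followed by Set Asset, the read and the write are two separate calls, so two robots can both see the flag as free before either one writes it back. Here is a minimal Python stand-in for that sequence (the in-memory `asset` dictionary and the worker function are just an illustration, not UiPath APIs):

```python
import threading
import time

# Hypothetical in-memory stand-in for the Orchestrator asset; names are
# illustrative only, not UiPath APIs.
asset = {"doc_42": {"active": False}}
workers_that_claimed = []

def claim_and_process(worker_id, doc_key):
    # Step 1: read the flag (the "Get Asset" step)
    if not asset[doc_key]["active"]:
        # A second robot can reach this point before the first one writes,
        # because the read and the write are two separate, non-atomic calls.
        time.sleep(0.1)  # simulates network latency between Get Asset and Set Asset
        # Step 2: write the flag (the "Set Asset" step)
        asset[doc_key]["active"] = True
        workers_that_claimed.append(worker_id)

threads = [threading.Thread(target=claim_and_process, args=(i, "doc_42")) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(workers_that_claimed)  # usually prints [0, 1]: both robots "claimed" the same document
```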

I can’t use a queue because some files may fail due to connection problems, so the process has to re-create the failed file.

I need to consume this dictionary one item at a time, like a queue, but I can’t use “Get Transaction Item” because it only returns items in the “New” state, and I can’t use “Get Queue Items” because I need the seven servers to work through the files one by one in a “live” way, rather than each robot receiving the whole queue at the start.

This matters because I need to create 90 documents in less than 6 hours, and some Excel documents can take 40 minutes to refresh their data.

I don’t know if that makes sense!

You should be using a Queue, not Assets. Create a Queue Item for each Excel file you need to create. Then you can have as many automations as you want pulling from the same Queue, and they cannot step on each other.

Queues can have automatic retry, or your automation can add a new item to the queue if there is a failure so it gets reprocessed.
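In Studio this is just the Add Queue Item activity inside a loop, but as a rough sketch of the same dispatcher step against the Orchestrator OData API, it might look like the following. The URL, token, folder ID, queue name, and SpecificContent fields are placeholders, and the endpoint path and payload shape should be verified against your Orchestrator version:

```python
import requests

ORCH_URL = "https://your-orchestrator"        # placeholder
TOKEN = "your-bearer-token"                   # placeholder: obtain via your auth flow
HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "X-UIPATH-OrganizationUnitId": "12345",   # placeholder folder/organization unit id
    "Content-Type": "application/json",
}

def add_queue_item(queue_name, file_name):
    """Create one queue item per Excel file to build (the dispatcher side)."""
    payload = {
        "itemData": {
            "Name": queue_name,
            "Priority": "Normal",
            # Reference lets you look the item up later by file name
            "Reference": file_name,
            "SpecificContent": {"FileName": file_name},
        }
    }
    resp = requests.post(
        f"{ORCH_URL}/odata/Queues/UiPathODataSvc.AddQueueItem",
        json=payload,
        headers=HEADERS,
    )
    resp.raise_for_status()
    return resp.json()

for name in ["report_01.xlsx", "report_02.xlsx"]:
    add_queue_item("ExcelFilesQueue", name)
```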

You shouldn’t be getting anything but New anyway.

It sounds like you need a dispatcher automation that runs and creates the Queue Items for all the files that need to be created. Then you have a performer automation that runs and creates the files, pulling from the queue. If a file fails, mark the Transaction as Failed and have the Queue set for auto-retry, or just have your automation add the item back into the Queue as New.
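Here is a small Python sketch of that dispatcher/performer split with the “re-add on failure” option. The queue is a plain list standing in for an Orchestrator queue, so this is only an illustration of the flow, not UiPath code; in Studio the equivalent pieces are Add Queue Item, Get Transaction Item, and Set Transaction Status:

```python
import random

# A sketch of the dispatcher/performer pattern with re-queue on failure.
queue = []

def dispatcher(file_names):
    # One queue item per Excel file to create.
    for name in file_names:
        queue.append({"FileName": name, "Status": "New"})

def create_excel_file(name):
    # Placeholder for the real work (refreshing data can take a long time);
    # randomly fails to simulate connection problems.
    if random.random() < 0.2:
        raise ConnectionError(f"lost connection while building {name}")

def performer():
    while True:
        # Get the next New transaction, if any.
        item = next((i for i in queue if i["Status"] == "New"), None)
        if item is None:
            break
        item["Status"] = "InProgress"
        try:
            create_excel_file(item["FileName"])
            item["Status"] = "Successful"
        except ConnectionError:
            item["Status"] = "Failed"
            # Either rely on the queue's auto-retry, or re-add the item as New
            # so it gets picked up and processed again:
            queue.append({"FileName": item["FileName"], "Status": "New"})

dispatcher([f"report_{n:02d}.xlsx" for n in range(1, 91)])
performer()
print(sum(1 for i in queue if i["Status"] == "Successful"), "files created")
```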


That sounds great. Thanks for your time.

Is there any possibility that two or more servers start the same document (Queue Item) at the same time (same hour, minute, and second)?

I think it could cause redundant work.

No, that is impossible. This is part of the point of Queues: two Jobs cannot call Get Transaction and end up processing the same Queue Item. As soon as one automation gets a transaction from the queue, the item is marked In Progress and no other Job can grab it.
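The difference from the asset flag is that the claim is a single atomic step on the Orchestrator side: checking for a New item and marking it In Progress happen together, so no second Job can see it as New in between. A minimal Python illustration of that idea (a sketch of the concept, not how Orchestrator is actually implemented):

```python
import threading

lock = threading.Lock()
items = [{"id": i, "status": "New"} for i in range(5)]

def get_transaction():
    """Atomically claim the next New item, or return None when the queue is drained."""
    with lock:
        for item in items:
            if item["status"] == "New":
                item["status"] = "InProgress"  # claimed before any other worker can see it as New
                return item
    return None

claims = []

def worker():
    while (item := get_transaction()) is not None:
        claims.append(item["id"])

threads = [threading.Thread(target=worker) for _ in range(7)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(claims))  # each item id appears exactly once, no matter how many workers run
```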


Thanks !! :robot: :robot: :robot:

Also, is there a way to reset or remove all the items from UiPath Studio at the beginning of the process, regardless of the status of the items?

I’m not sure what you mean. Queue Items are not stored in Studio. If you want to remove Queue Items from the Queue, do that in Orchestrator. New items are the only ones to worry about, because New is the only status Get Transaction will retrieve.
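If you do want to clear leftover items from code at the start of a run rather than by hand in Orchestrator, a rough sketch against the Orchestrator OData API could look like the following. The endpoint paths, field names, filter syntax, and delete behaviour here are assumptions to check against your Orchestrator's API reference, and the URL, token, and IDs are placeholders:

```python
import requests

ORCH_URL = "https://your-orchestrator"          # placeholder
HEADERS = {
    "Authorization": "Bearer your-token",       # placeholder
    "X-UIPATH-OrganizationUnitId": "12345",     # placeholder folder id
}

QUEUE_DEFINITION_ID = 678  # placeholder: numeric id of the queue

# Fetch remaining New items for this queue; $filter is standard OData syntax,
# but the field names should be verified for your Orchestrator version.
resp = requests.get(
    f"{ORCH_URL}/odata/QueueItems",
    params={
        "$filter": f"QueueDefinitionId eq {QUEUE_DEFINITION_ID} and Status eq 'New'",
        "$top": 1000,
    },
    headers=HEADERS,
)
resp.raise_for_status()

# Delete each leftover New item so the next run starts from a clean queue.
for item in resp.json().get("value", []):
    requests.delete(f"{ORCH_URL}/odata/QueueItems({item['Id']})", headers=HEADERS).raise_for_status()
```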