One process should be executed by multiple robots

Hello,
today I would like to ask how you would solve the following problem.
Imagine this scenario: you have a finished process and you want multiple robots to run it at the same time. You can configure that in Orchestrator. The problem is filling the queue: how do you ensure that only one robot fills the queue, and not all of them at the same time?
We do have a solution for this, but it does not seem fully mature to me yet, which is why I am asking here.
Initially we were told that the robots would coordinate this automatically "among themselves". However, our tests have shown that this is not the case: each robot then reads the data into the queue again itself.

Hey @martinN,

Try implementing it as a multi-bot architecture.

One robot is the dispatcher and the others are performers:

Dispatcher: a single robot that uploads the data into the queue.

Performer: multiple robots that execute the process on the items from the queue.
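Conceptually it is just a producer/consumer split. Here is a minimal Python sketch of the idea only; the in-memory queue stands in for the Orchestrator queue, and all names are placeholders, not actual UiPath activities:

```python
import queue
import threading

# Stand-in for the Orchestrator queue; everything here is illustrative only.
work_queue = queue.Queue()

def dispatcher(items):
    """Exactly one robot plays this role: it is the only one that adds items."""
    for item in items:
        work_queue.put(item)

def performer(name):
    """Any number of robots play this role: each pulls one item at a time."""
    while True:
        try:
            item = work_queue.get(timeout=1)
        except queue.Empty:
            return  # queue drained, this performer stops
        print(f"{name} processing {item}")
        work_queue.task_done()

if __name__ == "__main__":
    dispatcher([f"item_{i}" for i in range(10)])  # one dispatcher run
    robots = [threading.Thread(target=performer, args=(f"robot{i}",))
              for i in range(3)]
    for r in robots:
        r.start()
    for r in robots:
        r.join()
```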

Thank you for the answer.
Yes, we tried to implement a “master” robot mechanism.
It works reasonably well, but other minor problems came up, so I looked for alternatives.
The dispatcher is also a problem: you have to commit to one specific robot, which is exactly what we want to avoid. Our goal is that any robot can be used for this process.
Our current solution looks like this: after starting, each robot creates an empty text file named after itself (e.g. "robot1.txt") in the directory that contains the data to be read in. It then checks whether it was the first to create such a file. If so, it becomes the master; all the others act only as performers.
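To make that concrete, the election step looks roughly like this. This is only a minimal Python sketch of the idea, not our actual UiPath workflow; the directory path, the grace period, and the timestamp comparison are assumptions for illustration:

```python
import time
from pathlib import Path

DATA_DIR = Path(r"\\share\process_data")  # placeholder for the shared data directory

def elect_master(robot_name: str) -> bool:
    """Drop a marker file and check whether ours was the first one created.

    Sketch only: it assumes all robots see the same file share, that the
    creation timestamps there are comparable, and that the marker files
    are the only *.txt files in the directory.
    """
    marker = DATA_DIR / f"{robot_name}.txt"
    marker.touch()

    # Grace period so slower robots have also dropped their marker files;
    # this wait is part of what makes the check slow in practice.
    time.sleep(2)

    markers = sorted(DATA_DIR.glob("*.txt"), key=lambda p: p.stat().st_ctime)
    return markers[0].name == marker.name

# Usage sketch (robot and function names hypothetical):
# if elect_master("robot1"):
#     run_dispatcher_then_performer()
# else:
#     run_performer_only()
```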
This actually works quite well; the minor problems we see seem to have nothing to do with this approach. However, the check takes a while, so I wanted to ask whether anyone has a faster or more elegant solution.