Parallelism with multiple robots

Hello experts,
I'm running a workflow with 2 robots and want to generate a log file at the end. I want the robot that processes the last transaction to be the one that handles this log file, so that the two robots never work on the same file at once.
I'm on the 2017 version, where the Get Queue Items activity is not available.
Thanks in advance for your feedback.

You could keep a flag of sorts stored outside the workflow (in a shared file, for instance) to indicate how many or which robots are currently working; if the current robot is the last/only one still running, it processes the log file.
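A minimal sketch of that flag idea, assuming the robots can reach a common folder (a network share, say) and each robot knows the total robot count. The paths, file names, and robot count below are hypothetical; you would translate the logic into UiPath activities or an Invoke Code block:

```python
import os
import time

SHARED_DIR = r"\\fileserver\rpa\shared"                  # hypothetical shared folder
COUNTER = os.path.join(SHARED_DIR, "robots_remaining.txt")
LOCK = os.path.join(SHARED_DIR, "robots_remaining.lock")
TOTAL_ROBOTS = 2

def with_lock(fn):
    """Serialize access to the counter via an atomically created lock file."""
    while True:
        try:
            fd = os.open(LOCK, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            break
        except FileExistsError:
            time.sleep(0.2)                              # another robot holds the lock; retry
    try:
        return fn()
    finally:
        os.close(fd)
        os.remove(LOCK)

def finish_and_maybe_process_log():
    def decrement():
        # The first robot to finish initializes the counter to TOTAL_ROBOTS.
        # NOTE: clear any stale counter file before a new run starts.
        if not os.path.exists(COUNTER):
            remaining = TOTAL_ROBOTS
        else:
            with open(COUNTER) as f:
                remaining = int(f.read().strip())
        remaining -= 1
        with open(COUNTER, "w") as f:
            f.write(str(remaining))
        return remaining

    if with_lock(decrement) == 0:
        process_log_file()                               # only the last robot gets here

def process_log_file():
    print("I am the last robot; consolidating the log file.")

if __name__ == "__main__":
    finish_and_maybe_process_log()
```

With 2 robots, the first finisher leaves the counter at 1 and exits; the second sees 0 and is therefore the one that processes the log file, which is exactly the "last transaction" behavior you described.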

Some other potential options:

  • Append to the log file instead of overwriting it. If order does not matter you are done; otherwise sort the file afterwards (see the first sketch after this list).

  • Make use of the Robot's logs for Transaction Started/Ended and for the Transaction Status.
    For the Transaction Status, you could use the Add Log Fields activity to add the queueItem, but before doing that you would assign a Dictionary<String,Object> of what would be your queueItem's output arguments to in_TransactionItem.Output. While you cannot set output in your Set Transaction Status for Failed items, this approach lets you write the queue item and any output values you want back to the logs for both successful and failed items. (Don't forget to Remove Log Fields afterwards.)
    From there it is a matter of querying your logs with whichever process you have in place to do that (see the second sketch after this list).
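First sketch, for the append option: each robot opens the shared log in append mode and writes whole timestamped lines, and a single sort pass at the end restores order. Per-line appends from separate processes generally interleave rather than overwrite on local disks, though that guarantee can be weaker on some network shares, so test on yours. The file name is hypothetical:

```python
from datetime import datetime

LOG_FILE = r"\\fileserver\rpa\shared\run.log"   # hypothetical shared log

def append_log(robot_name: str, message: str) -> None:
    """Append one timestamped, tab-separated line to the shared log."""
    line = f"{datetime.now().isoformat()}\t{robot_name}\t{message}\n"
    with open(LOG_FILE, "a", encoding="utf-8") as f:
        f.write(line)

def sort_log() -> None:
    """Run once at the end: reorder lines by the leading ISO timestamp."""
    with open(LOG_FILE, encoding="utf-8") as f:
        lines = f.readlines()
    lines.sort(key=lambda l: l.split("\t", 1)[0])
    with open(LOG_FILE, "w", encoding="utf-8") as f:
        f.writelines(lines)
```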
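Second sketch, for the log-querying part of the second option: scan the robot's log files for "Transaction Ended" entries and pick up the custom fields added via Add Log Fields. The log path, the exact message text, and the "queueItem" field name are all assumptions here; the sketch assumes each log line carries one JSON payload, so adapt the parsing to whatever your log sink actually writes:

```python
import glob
import json

# Assumed default location of the robot's execution logs; adjust to your setup.
LOG_GLOB = r"C:\Users\robot\AppData\Local\UiPath\Logs\*_Execution.log"

def transaction_ended_entries():
    """Yield parsed log entries whose message marks a transaction end.
    Each line is assumed to look like '<timestamp> <level> {json...}'."""
    for path in glob.glob(LOG_GLOB):
        with open(path, encoding="utf-8") as f:
            for line in f:
                start = line.find("{")
                if start == -1:
                    continue
                try:
                    entry = json.loads(line[start:])
                except json.JSONDecodeError:
                    continue
                if "Transaction Ended" in entry.get("message", ""):
                    yield entry          # Add Log Fields values appear as extra keys

for e in transaction_ended_entries():
    print(e.get("message"), e.get("queueItem"))   # 'queueItem' = assumed custom field name
```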

Thank you for your reply.
The first option seems to answer my problem, but I don't know how to retrieve this information. Do you have an example?

I want to update the input file with the logs, i.e. update it rather than create a new one, so option 2 does not meet my need.