I have 3 bots running in parallel, working on queue items. Each bot generates an Excel file when it finishes. How do I consolidate the Excel files generated by all 3 bots at the end? And what could my logic be to verify that all bots have completed processing their queue items? Please explain all possible solutions in detail.
I believe you could have a fourth process that checks how many files there are in a defined directory. If there are 3, you could process them.
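A minimal sketch of that fourth process, assuming the bots drop their reports into one shared folder. The function name and the use of CSV files are my own stand-ins: real .xlsx outputs would be read and merged with a library such as openpyxl or pandas instead of the stdlib csv module, but the count-then-merge logic is the same.

```python
import csv
from pathlib import Path

def consolidate_if_complete(report_dir, merged_path, expected=3):
    """Merge the per-bot reports once all of them have appeared.

    CSV files stand in for the Excel outputs here; swap the csv calls
    for openpyxl/pandas when working with real .xlsx files.
    """
    reports = sorted(Path(report_dir).glob("*.csv"))
    if len(reports) < expected:
        return None  # not all bots have finished yet

    rows = []
    for report in reports:
        with open(report, newline="") as f:
            rows.extend(csv.reader(f))

    with open(merged_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    return merged_path
```

The fourth process would simply call this on a schedule and stop once it returns a path instead of None.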
In future, the number of robots that can be triggered at the same time may vary depending on various criteria such as machine availability, workload, etc.
So instead of checking for a static number of files (i.e. 3), check the queue before you consolidate.
In brief: check that there are NO ‘New’ queue items and NO ‘In Progress’ items (if there are any In Progress items, a transaction is still being worked by a robot). Once both of those conditions hold true, you can make the robot consolidate the files from the folder.
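That check reduces to a small decision function. This is a sketch under the assumption that you can fetch per-status counts for the queue (e.g. from Orchestrator's queue monitoring) into a dict; the function name and the dict shape are mine, not part of any product API.

```python
def ready_to_consolidate(status_counts):
    """status_counts maps a queue-item status name to its current count,
    e.g. {"New": 0, "InProgress": 1, "Successful": 99}.

    Consolidation is safe only when nothing is waiting ('New') and
    nothing is being worked on ('InProgress')."""
    return (status_counts.get("New", 0) == 0
            and status_counts.get("InProgress", 0) == 0)
```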
Did you try this? I thought about the same problem: 2 robots may satisfy the same condition simultaneously and generate 2 reports. Maybe I’m not right.
Considering there might be a time difference in the execution of the robots (due to workload, trigger time, memory, etc.), we can make the robot lock the file(s) once there are no more new items in the queue. If there are any locked files, the other robots need not proceed with generating reports after processing their in-progress transactions.
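One way to implement that lock so only one robot wins the race is atomic lock-file creation: `os.O_CREAT | os.O_EXCL` makes the open fail if the file already exists, so exactly one robot acquires it. The function name and lock-file path are illustrative assumptions; the pattern works on a shared folder all robots can see.

```python
import os

def try_acquire_report_lock(lock_path):
    """Atomically create a lock file; exactly one caller can succeed.

    The robot that wins the lock generates the consolidated report;
    the others see False and skip report generation."""
    try:
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False
```

Note the lock file should be deleted (or the run folder rotated) at the start of the next run, otherwise no robot will ever acquire it again.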
But I think we would have to use a loop to check whether each transaction is New/In Progress. That iteration takes a lot of time, so it is not a feasible solution. Could you suggest any other alternative?
Accepted… but what should the time delay be before triggering this bot? How do I know that all the transactions have completed successfully? What is the logic for this?
You need not iterate too many times. When you know the max time a single transaction can take to complete, make the bot wait that long (approximately) and check the status twice. If an item is still In Progress after that, skip that particular transaction and consolidate. (The reason: when a robot is killed mid-transaction, the item is left with status ‘In Progress’, and it only changes to ‘Abandoned’ after 24 hrs. So rather than waiting on such transactions, wait only for the max possible time and then check the status.)
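The wait-then-check-twice idea above can be sketched as follows. `in_progress_count` is an assumed callback that returns the current In Progress count (however you fetch it); injecting `sleep` just keeps the sketch testable. Everything else follows the post: check, wait the max single-transaction time, check again, and treat whatever is still In Progress as a stuck item left behind by a killed robot.

```python
import time

def queue_drained_or_stuck(in_progress_count, max_txn_seconds, sleep=time.sleep):
    """Check the In Progress count twice, spaced by the max time one
    transaction can take.

    Returns 'drained' if nothing is in progress, or 'stuck' if items
    remain after both checks (a killed robot leaves items In Progress,
    and they are only marked Abandoned after 24 hrs), in which case
    those items can be skipped and consolidation can proceed."""
    for _ in range(2):
        if in_progress_count() == 0:
            return "drained"
        sleep(max_txn_seconds)
    return "drained" if in_progress_count() == 0 else "stuck"
```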
You suggested specifying a max wait time, but that is not a reliable solution because the number of transactions differs each time. What if the number doubles or triples at times? You can’t account for those situations this way. My code should work dynamically… I hope you understand my requirement.
Also, my code shouldn’t include checking the status, as that would mean iterating through all the transactions.
Sorry that I couldn’t make myself understood.
What I mean by wait time is -> the processing time of ONE transaction (end to end).
What I think you understood is -> the processing time of one transaction multiplied by the number of transactions.
I hope you are aware of the queue concept -> the workload gets shared by any number of robots.
- We have 100 transactions
- AHT is 1 min per transaction
10 robots triggered at the same time
- This means each robot might process 10 transactions (approx.). When all the bots have picked their 10th transaction, one robot will complete it before the others (perhaps with a time difference of seconds), so a bot only has to wait for the average handling time of one transaction (i.e. 1 min).
2 robots triggered at different times (let’s say one at 6.00 a.m. and the second at 6.30 a.m. on the same day)
- By the time the second robot is triggered, the first robot will have processed 30 transactions (AHT is 1 min), leaving 70 transactions in the queue (i.e. 35 each). When both robots pick their 35th transaction, one will complete it before the other (perhaps with a time difference of seconds), so a bot only has to wait for the average handling time of one transaction (i.e. 1 min).
The above scenarios work the same way even when the number of transactions differs each run. So do not worry about the number of transactions -> that is handled by the queue.
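The head-start arithmetic in the second scenario generalizes to a one-liner. This is just the worked example as a function (names are mine), useful for sanity-checking other trigger-time gaps:

```python
def remaining_split(total_txns, aht_min, head_start_min, robots=2):
    """How the queue divides when one robot gets a head start.

    Returns (processed_during_head_start, remaining, share_per_robot).
    Integer division keeps it a rough estimate, matching the
    'approx.' in the scenarios above."""
    processed = min(total_txns, head_start_min // aht_min)
    remaining = total_txns - processed
    return processed, remaining, remaining // robots

# The 6.00/6.30 scenario: 100 items, AHT 1 min, 30-minute head start
remaining_split(100, 1, 30)  # → (30, 70, 35)
```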