The requirement is to process records in an Excel file and notify business users (updates email) during the first run (morning hours).
In the second run (afternoon hours), the bot processes the same records again, imports the Excel file into the system, and notifies business users of successful execution.
Solution: In the first run, the Dispatcher processes the records and notifies users after processing.
In the second run, the Dispatcher processes them again and adds transactions to the queue for further processing.
The Performer picks up the transactions, imports the file into the system, and then notifies business users of successful execution.
How can we optimize the first run and make sure first-run transactions are also added to the queue, while the whole execution is only marked successful after the second run? ----> Do you think it is also a good idea to create two separate queues for this scope?
It seems the first run's performance is partially skipped due to business needs: the first run focuses only on sharing updates and catching early issues in the Excel data.
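For context, here is a rough Python sketch of the current flow. In the actual process these steps roughly map to UiPath activities (Read Range, Add Queue Item, Get Transaction Item, Set Transaction Status, Send SMTP Mail Message), so the function names, in-memory queue, and sample rows below are only illustrative stand-ins, not the real implementation.

```python
# Sketch of the current two-run design (illustrative stand-ins, not UiPath code).
EXCEL_ROWS = [
    {"Id": 1, "Customer": "ACME", "Amount": 120.0},
    {"Id": 2, "Customer": "", "Amount": 80.0},   # invalid example: missing Customer
]
QUEUE = []  # stand-in for an Orchestrator queue

def validate(row):
    """Return a list of issues found in one Excel row."""
    issues = []
    if not row.get("Customer"):
        issues.append("missing Customer")
    if row.get("Amount", 0) <= 0:
        issues.append("non-positive Amount")
    return issues

def first_run():
    """Morning run: validate records and email an update; no queue items yet."""
    issues = {row["Id"]: validate(row) for row in EXCEL_ROWS}
    bad = {rid: probs for rid, probs in issues.items() if probs}
    print(f"[updates email] {len(EXCEL_ROWS)} records checked, {len(bad)} with issues: {bad}")

def second_run_dispatcher():
    """Afternoon run: re-process records and add one queue item per record."""
    for row in EXCEL_ROWS:
        QUEUE.append({"SpecificContent": row, "Status": "New"})

def performer():
    """Pick transactions, import them into the target system, notify on success."""
    for item in QUEUE:
        # import_to_system(item["SpecificContent"])  # actual import step goes here
        item["Status"] = "Successful"
    print(f"[success email] {len(QUEUE)} transactions imported")

if __name__ == "__main__":
    first_run()
    second_run_dispatcher()
    performer()
```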
Hi @Sonalk, I hope you are doing well.
I do not think Add Queue Item is required in both runs (first and second), since, as you said, the first run is mostly there to check whether each Excel record is valid or not and to inform the stakeholders on that basis.
In the second run you can use the process as a dispatcher and add the queue items.
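So, roughly, the same dispatcher process can branch on which run it is (for example via an input argument or the time of day). A minimal sketch of that idea, where `run_type`, the helper logic, and the in-memory queue are all assumptions on my side:

```python
import datetime

QUEUE = []  # stand-in for the Orchestrator queue consumed by the Performer

def dispatcher(rows, run_type=None):
    """Single dispatcher process; only the afternoon run adds queue items."""
    if run_type is None:
        # Assumption: the run can also be inferred from the time of day.
        run_type = "MORNING" if datetime.datetime.now().hour < 12 else "AFTERNOON"

    if run_type == "MORNING":
        # First run: validation and updates email only.
        invalid = [r for r in rows if not r.get("Customer")]
        print(f"[updates email] {len(rows)} records, {len(invalid)} invalid")
    else:
        # Second run: dispatcher adds one queue item per record.
        for row in rows:
            QUEUE.append({"SpecificContent": row, "Status": "New"})
        print(f"{len(QUEUE)} queue items added for the Performer")

if __name__ == "__main__":
    rows = [{"Id": 1, "Customer": "ACME"}, {"Id": 2, "Customer": ""}]
    dispatcher(rows, run_type="MORNING")
    dispatcher(rows, run_type="AFTERNOON")
```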
The bot invests a few minutes every day to preprocess the data and perform data integrity checks before notifying the business users, but this effort is not measured under the current design.
I understand that the meaning of a successful transaction only applies to the second run; my only concern is about the first run.
Hi @Sonalk ,
If you want a queue to be involved in the first run as well, then you have to make use of two queues: in the first run, add the Excel data to the first queue.
In the second run, get those queue items, add them to the second queue, and change the first queue items' status to Successful.
And in the third run, i.e. the Performer, get the items from the second queue and do the operations.
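A minimal sketch of that two-queue layout, with plain Python lists standing in for the two Orchestrator queues and all names being illustrative only:

```python
# Two-queue variant: queue names and helpers are illustrative stand-ins.
QUEUE_A = []  # first run: raw Excel rows land here
QUEUE_B = []  # second run: items ready for the Performer land here

def run1_dispatcher(rows):
    """First run: add every Excel row to the first queue and send the updates email."""
    for row in rows:
        QUEUE_A.append({"SpecificContent": row, "Status": "New"})
    print(f"[updates email] {len(QUEUE_A)} items added to queue A")

def run2_bridge():
    """Second run: move items from queue A to queue B and close the A items."""
    for item in QUEUE_A:
        if item["Status"] == "New":
            QUEUE_B.append({"SpecificContent": item["SpecificContent"], "Status": "New"})
            item["Status"] = "Successful"  # first-run work is now tracked and closed

def run3_performer():
    """Third run: process queue B, import into the system, notify on success."""
    for item in QUEUE_B:
        # import_to_system(item["SpecificContent"])  # actual import step goes here
        item["Status"] = "Successful"
    print(f"[success email] {len(QUEUE_B)} transactions imported")

if __name__ == "__main__":
    run1_dispatcher([{"Id": 1}, {"Id": 2}])
    run2_bridge()
    run3_performer()
```

This way the first run's effort is also reflected as completed transactions in the first queue, while overall success is still only reported after the Performer finishes.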