Multiple runs in a day without queues using RE framework

Hello all,

I am trying to use the RE Framework for an attended bot without using queues. So I will be writing Excel data into a data table (using Read Range) in Init, then process each record in Process Transaction.

Question here is:

  1. If the user initiates the bot and stops it after processing, say, 100 out of 1000 records, and later the same day triggers the bot again to complete the remaining records, will all 1000 records get loaded into the data table again?
  2. If yes, will they get added as duplicates?
  3. Are there any best practices to avoid this duplicate loading in Init?


You could add a ‘Status’ column in the Excel file and write back the status of each transaction to that column. Then filter the data table on this column to process only the items you need to.
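UiPath expressions are normally VB.NET, but the Status-column idea can be sketched in plain Python, using a list of dicts as a stand-in for the data table. The column name "Status" and the value "Processed" are assumptions; use whatever your file actually contains:

```python
# Plain-Python stand-in for a DataTable read from Excel.
# "Status" / "Processed" are assumed names, not UiPath requirements.
records = [
    {"Id": 1, "Status": "Processed"},
    {"Id": 2, "Status": ""},
    {"Id": 3, "Status": "Processed"},
    {"Id": 4, "Status": ""},
]

# Equivalent of the Filter Data Table activity: keep only unprocessed rows.
pending = [row for row in records if row["Status"] != "Processed"]

for row in pending:
    # ... process the transaction here ...
    row["Status"] = "Processed"  # write back so a re-run skips this row
```

On a second run, the same filter would return an empty list, so the already-processed rows are never touched again.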

Thanks Kiran. This way (status field) I can avoid processing the same items again. But when we run the bot again, the 900 not-yet-processed items will still get added to the data table, right?

  1. How can I avoid this processing overhead? I am talking about 1k records here, but it can grow later.
  2. Will running Init add these 900 records as duplicates?

I’m assuming you are referring to when the user triggers the process again after a few items are processed.

When you use the Filter Data Table activity, it outputs a data table containing only the rows matching your criteria. When you filter, you’ll have to keep track of the row index or some sequence number to help you update the correct cell in the file when you process that row.
Reading the entire contents of the file and filtering the data table will probably be faster than using Excel range-related activities.
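The index bookkeeping can be sketched like this: carry the original row index through the filter so the write-back targets the right Excel cell. The mapping below assumes a header in row 1 and the Status column in column C; both are illustrative assumptions:

```python
rows = [
    {"Id": 1, "Status": "Processed"},
    {"Id": 2, "Status": ""},
    {"Id": 3, "Status": ""},
]

# Keep (original_index, row) pairs instead of bare rows, so filtering
# does not lose track of where each row lives in the source file.
pending = [(i, row) for i, row in enumerate(rows) if row["Status"] != "Processed"]

cells = []
for original_index, row in pending:
    excel_row = original_index + 2      # +1 for the header row, +1 for 1-based rows
    cells.append(f"C{excel_row}")       # Status assumed to live in column C
    # A Write Cell activity would target this cell after processing the row.
```

The same idea works with a plain sequence-number column in the file itself, which survives sorting and filtering.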

If you are referring to when the workflow goes back to Init after a system exception, you can read the file in the ‘First Run’ sequence, or read it only if the datatable variable is Nothing. This will ensure your datatable variable is not overwritten once the data has been read.
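The "read only once" guard amounts to a lazy-initialization check. A minimal Python sketch, with `None` standing in for VB.NET's `Nothing` and `read_file` standing in for the Read Range output:

```python
def read_file():
    # Stand-in for the real Read Range output.
    return [{"Id": 1, "Status": ""}]

def init(dt):
    if dt is None:            # equivalent of: If dt Is Nothing
        dt = read_file()      # read the file only on the very first run
    return dt

dt = None
dt = init(dt)                 # first run: file is read
dt2 = init(dt)                # re-entry after an exception: same object, no reload
```

Because Init returns the existing variable untouched when it is already populated, re-entering Init after a system exception does not duplicate the rows.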

Since you are not using queues, there will always be some overhead. Typically you shouldn’t run into memory or performance issues with 5K or 10K rows, but if memory and speed become constraints, you can test and tweak the workflow to your requirements.
