I am developing a robot with ReFramework. For each transaction, the robot has to work with multiple different applications. Some of the processes can be retried when an exception is thrown, like downloading a table from a website. However, other processes cannot be retried, like importing an invoice into the work system, which might cause a duplication error. So I have to set the maximum retry number to 0, which defeats the purpose of using ReFramework.
Has anyone encountered the same problem? Please let me know what you think about it.
Depending on the process, you may want or need to work with multiple queues. Let’s say that the first step involves downloading data from a web app, as well as some data manipulation - this could be your first queue, with up to three retries. Now let’s assume that the final step involves uploading an invoice to your ERP system - this could be done in a new queue without any retries being permitted.
So, you could split the load into two processes - both based on the Framework - and even run them separately with different schedules on multiple robots.
@redlynx82 Thank you for your answer. If I split the process across multiple robots, how could the robots share their data or tables? For example, the first bot downloads tables and manipulates them, and the second bot imports invoices into the ERP. How could the second bot get the tables that were manipulated by the first one?
If I use a folder on the local C drive as a hot folder, the second bot will not find it once it runs on a different machine.
Queues are perfectly suited for this job. Each item in a queue is handled by UiPath in a similar way to how databases handle transactions. So, it’s no longer up to you to set up a SQL database, make sure that data is inserted by one robot and read by another, and that a status per row is maintained - let alone retry handling, handling of orphaned entries, and much more. Read more about the whole concept here.
Your first workflow (or robot) would download and manipulate the table, storing individual entries in a queue. Depending on your requirements, this could be an individual row, or even a DataTable object per queue item. This robot would make use of the Add Queue Item activity.
Your other workflow and/or robot would then use the Get Transaction Item activity to retrieve the next item in queue.
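The producer/consumer pattern described above can be sketched in plain Python. The queue here is just an in-memory list standing in for an Orchestrator queue, and `add_queue_item`/`get_transaction_item` are hypothetical helpers mirroring the two UiPath activities, not real UiPath APIs - the point is only how one robot serializes a table row into an item and the other deserializes it.

```python
import json

# Stand-in for an Orchestrator queue (illustrative only, not a UiPath API):
# the producer serializes each manipulated table row to JSON as the queue
# item's content; the consumer deserializes it one transaction at a time.
queue = []

def add_queue_item(row: dict) -> None:
    """Producer robot: store one table row as a queue item payload."""
    queue.append(json.dumps(row))

def get_transaction_item():
    """Consumer robot: fetch the next pending item, or None when the queue is empty."""
    return json.loads(queue.pop(0)) if queue else None

# Producer robot stores two manipulated rows.
add_queue_item({"InvoiceNo": "INV-001", "Amount": 120.50})
add_queue_item({"InvoiceNo": "INV-002", "Amount": 89.99})

# Consumer robot processes them until the queue is drained.
while (item := get_transaction_item()) is not None:
    print(item["InvoiceNo"], item["Amount"])
```

Because each row travels as self-contained JSON, the two robots never need a shared file system - only access to the same queue.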
Since you mentioned “hot folders”, may I assume that you want to store or upload a binary file to your ERP? Well, nothing’s going to stop you from reading a binary file, converting it to base64, and then storing this string in a queue.
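The base64 round trip suggested above is a few lines of standard-library Python. The file name and content here are made up for illustration; in practice the encoded string would be stored as queue item content rather than printed.

```python
import base64
import tempfile
from pathlib import Path

# Create a small fake binary file to stand in for the real invoice.
src = Path(tempfile.gettempdir()) / "invoice.pdf"
src.write_bytes(b"%PDF-1.4 fake invoice bytes")

# Producer robot: binary file -> base64 text, safe to store in a queue item.
encoded = base64.b64encode(src.read_bytes()).decode("ascii")

# Consumer robot (possibly on another machine): base64 text -> binary file.
dst = Path(tempfile.gettempdir()) / "invoice_restored.pdf"
dst.write_bytes(base64.b64decode(encoded))

# The restored file is byte-for-byte identical to the original.
assert dst.read_bytes() == src.read_bytes()
```

Keep in mind that base64 inflates the payload by roughly a third, so this works best for small files; large attachments are better stored in shared storage with only a reference in the queue.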
I have encountered a project like this. What I did was create a flag for each process. Meaning, I would always check the flag before proceeding with a specific process. Retry should still work the same, but only on the unfinished processes.
Really awesome idea. I never thought about uploading a table as a queue item. I will try what you mentioned. Thank you @redlynx82
@Emman_Pelayo Thank you for your answer. Could you explain more about how to create the flags to avoid retrying part of the process? In addition, how could the robots detect that some processes have already been done and skip them? Thank you
In your main process, you will always check a flag before proceeding, something like:
If boolFirstProcess = True
    ' do process 1
If boolSecondProcess = True
    ' do process 2
You can save these flags/statuses in the main source file or, if that is not possible, in a temporary file.
@Emman_Pelayo Thank you. I will try it in my process.