ReFramework is intended to work with queues on the assumption that the data is known in advance, but how should the template be adapted when the process discovers the data “in-process”? It is not clear to me whether I have to add a step in the Initialization state that adds all the items found to an Orchestrator queue, or whether, in the “Get Transaction Item” state, I should add the items to the Orchestrator queue one by one as they appear. It is also possible to not use any queue and simply process them.
ReFramework will only pick up transaction items in the New state; In Progress queue items are ignored by the process. An In Progress transaction is set to Abandoned after 24 hours, and to reprocess it you will have to add it to the queue again.
So how do I proceed when a process discovers its elements dynamically? Suppose I have a page from which I obtain product data without knowing it beforehand. Is the template not prepared for this type of situation?
There is no out-of-the-box or ready-made solution for this, as it varies from process to process.
You will have to create a separate process/bot to navigate to the product data page, extract the data, and add it to a queue for further processing. This bot is called the Dispatcher bot.
To process the product data one by one, you will use ReFramework as a separate bot. We call it the Performer bot.
ReFramework can also work with items discovered while processing, and it can work without queues as well.
If you have dynamic transactions to add, there is the Add Transaction Item option: the item is added to the queue, but directly in the In Progress state, since it is being processed immediately. This can be designed.
Also, if you know dynamically that today you might process, say, 10 requests from a website, you can divide the process: the first part, which identifies the 10 requests, is called the dispatcher, and the second part, which processes them, runs separately and is called the performer.
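The dispatcher/performer split described above can be sketched in plain Python (this is a conceptual illustration, not UiPath code). The Orchestrator queue is simulated with an in-memory list of status/data records, and `scrape_product_page()` is a hypothetical stand-in for the extraction step:

```python
# Conceptual sketch of the Dispatcher/Performer pattern.
# The Orchestrator queue is simulated with an in-memory list;
# scrape_product_page() is a hypothetical stand-in for the real
# navigation/extraction logic.

queue = []  # stand-in for an Orchestrator queue


def scrape_product_page():
    # Hypothetical: a real dispatcher would navigate the site and
    # extract product rows; here we return fixed sample data.
    return [{"product": f"item-{i}"} for i in range(3)]


def dispatcher():
    # Dispatcher bot: discover the data and add each item as "New".
    for item in scrape_product_page():
        queue.append({"status": "New", "data": item})


def get_transaction_item():
    # Performer side ("Get Transaction Data"): only items in the New
    # state are picked up; In Progress items are ignored.
    for record in queue:
        if record["status"] == "New":
            record["status"] = "In Progress"
            return record
    return None  # no data left -> End Process


def performer():
    processed = []
    while (record := get_transaction_item()) is not None:
        # The "Process" state would run here.
        record["status"] = "Successful"
        processed.append(record["data"]["product"])
    return processed


dispatcher()
print(performer())  # -> ['item-0', 'item-1', 'item-2']
```

The key design point is the separation: the dispatcher only writes New items, and the performer only consumes New items, so the two bots can even run on different machines against the same shared queue.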
Thanks to both (@Anil_G and @ashokkarale ) for the answers.
I’m searching on Google for a video that shows an example of how to do it.
It would be great if you had a link to an example of how to do it.
The scenario you are describing would be a case of Dispatcher Integrated in Performer - not typical but still quite reliable if done properly.
To achieve that, I’d suggest adding this “Dispatcher” sequence in the Init stage, inside the Then branch of “If first run, read local configuration file”, at the end of that sequence, right after “Invoke KillAllProcesses workflow (first run)”.
In case of a System Exception, you don’t want the dispatcher part to run again during a retry of InitAllApplications, as it would load duplicate items if the queue/code does not account for that.
If there are no items to be uploaded when you get to the Get Transaction Data, it will follow the No Data process accordingly.
You can also run the process again to finish the remaining items on the queue if the dispatcher part adds no new items.
Long story short, drop your dispatcher logic there, and you have successfully integrated the dispatcher into the performer.
If instead you only want to keep adding items to the queue dynamically while you are already in the Process loop, then, when you identify the proper conditions, just use an Add Queue Item activity in the Process itself; it will add the item at the end of the current queue.
Disclaimer: Be careful as this can create infinite loops if not managed properly.
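The add-while-processing variant, including a simple guard against the infinite-loop risk in the disclaimer, can be sketched like this (plain Python, not UiPath; `discover_children()` and the `MAX_DEPTH` cap are hypothetical illustrations):

```python
# Conceptual sketch of adding items to the queue from inside the
# Process loop, with a depth cap as a guard against infinite loops.
# discover_children() is a hypothetical stand-in for logic that
# finds follow-up items while processing the current one.

MAX_DEPTH = 2  # guard: stop enqueuing beyond this depth

queue = [{"data": "root", "depth": 0}]  # in-memory queue stand-in
processed = []

while queue:
    item = queue.pop(0)             # Get Transaction Item
    processed.append(item["data"])  # Process state

    if item["depth"] < MAX_DEPTH:   # guard condition
        for child in [f"{item['data']}/child-{i}" for i in range(2)]:
            # Add Queue Item: appended to the end of the current queue
            queue.append({"data": child, "depth": item["depth"] + 1})

print(len(processed))  # 1 root + 2 children + 4 grandchildren = 7
```

Without a termination condition such as the depth cap (or a seen-before check on item references), each processed item could keep enqueuing new ones and the performer would never reach the No Data path.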
OK, I think I understand you. I just found a video that I think shows exactly that, so I’ll try to follow that approach. Thank you very much; your responses were illuminating. Greetings.
PS: I pasted the link to the video in case anyone is interested in seeing a sample