I have one query: suppose we have 35 lakh (3.5 million) records that need to be processed in one day. How can we achieve this, given that processing 25 lakh records currently takes 24 hours? Please explain how we can achieve this in both Community Edition and Enterprise Edition.
If that’s the case, then we can upload the data from Excel to an Orchestrator queue with one bot (a dispatcher) and process the queue items with multiple bots (performers).

When one bot picks a queue item, that item’s status changes from New to In Progress. Any other bot running at the same time will only look for the next New item, so every item gets processed exactly once across all the bots used for this process.
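The dispatcher/performer pattern described above can be sketched in plain Python. This is only an analogy of how Orchestrator queues behave, not UiPath code: `queue.Queue` plays the role of the Orchestrator queue, and claiming an item with `get_nowait()` stands in for the New → In Progress transition that stops two bots from picking the same item.

```python
import queue
import threading

def performer(name, work_queue, results):
    # Each performer keeps pulling the next "New" item until the queue is
    # empty, mirroring how a bot marks an item In Progress so others skip it.
    while True:
        try:
            item = work_queue.get_nowait()  # atomically claims the item
        except queue.Empty:
            return
        results.append((name, item))        # "process" the transaction
        work_queue.task_done()

# Dispatcher: one bot loads all transactions into the queue.
work_queue = queue.Queue()
for record in range(10):
    work_queue.put(record)

# Performers: several bots drain the same queue in parallel.
results = []
threads = [
    threading.Thread(target=performer, args=(f"bot{i}", work_queue, results))
    for i in range(3)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # → 10 (every item processed exactly once)
```

Because the queue hands each item to exactly one worker, adding more performers shortens the total run time without any item being processed twice.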
There are two approaches you can explore:
As @Palaniyappan explained, scale up the number of robots so that the roughly 34 hours (by cross multiplication, 24 × 35 ÷ 25) you would need to process 35 lakh transactions on a single robot is reduced.
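The cross multiplication above, and the number of robots it implies, works out like this (a back-of-the-envelope estimate that assumes throughput scales linearly with robot count):

```python
import math

records = 3_500_000                      # 35 lakh to process
rate_per_robot = 2_500_000 / 24          # 25 lakh in 24 hrs on one robot

hours_on_one_robot = records / rate_per_robot   # 24 * 35 / 25
robots_needed = math.ceil(hours_on_one_robot / 24)  # to fit in one day

print(round(hours_on_one_robot, 1))  # → 33.6
print(robots_needed)                 # → 2
```

So in principle two robots (an Enterprise licensing question; Community Edition limits how many unattended robots you can run) bring the run under 24 hours, with headroom shrinking as the data grows.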
Be strict with what you send to processing/dispatch. In short, pre-process your data. Sit with your end users and ask them: are there ways you can clean the 35 lakh data points and remove items which you know will result in either a business or application exception in the robot? The idea here is to only populate the queue with items which actually need to be processed. Once you pre-process, you can already send the exception report to the end user.

Example: if your end users say, “We avoid processing items when two of the columns have no values,” then you can pre-process (filter) the data to drop those data points. Maybe, just maybe, you will save a lot of processing time for the robot.
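That filtering step can be sketched as a few lines of Python. The column names (`account`, `amount`) and the rule (both must be non-empty) are hypothetical, standing in for whatever business rule your end users give you:

```python
# Hypothetical pre-processing step: drop rows where the two mandatory
# columns are empty, so only processable items reach the queue.
records = [
    {"id": 1, "account": "A100", "amount": "250"},
    {"id": 2, "account": "",     "amount": ""},    # would raise a business exception
    {"id": 3, "account": "A102", "amount": "75"},
]

def is_processable(row):
    # Rule from the end user: both columns must have values.
    return bool(row["account"].strip()) and bool(row["amount"].strip())

to_queue = [r for r in records if is_processable(r)]
exceptions = [r for r in records if not is_processable(r)]

print(len(to_queue), len(exceptions))  # → 2 1
```

Only `to_queue` gets uploaded by the dispatcher; `exceptions` can go straight into the report for the end user, without ever consuming robot time.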
Doing this will also affect approach one: after filtering, you may or may not need to scale up your robots (performers) at all.
Thank you @jeevith for the deep explanation.
Thank you @Palaniyappan, I will try this.
Cheers to you
This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.