For a process, we need to process 150-160K items, and each transaction item completes in under 10 seconds (only a few API calls).
For better reviewing and handling, we want to add this data to an Orchestrator queue, but we are checking whether adding bulk data to the queue causes any issues, and whether there is a limit on the number of items that can be added.
Orchestrator type: On-premise
If someone has experience with this, please share your thoughts.
Also, all of this data adding and processing completes within a week or less.
One thing is for sure: 150K items is not directly supported by bulk add. The limit is around 20K or so; in my experience it consistently failed with a Bad Request at around 40K.
Other options you can explore: use Add Transaction Item instead of Bulk Add Queue Items, or use a Dispatcher/Performer model where the Performer runs in parallel with the Dispatcher and processes the items in a loop.
I would say Add Transaction Item is better if you only need it for tracking and reporting.
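Since the bulk-add limit seems to sit somewhere around 20K, one workaround is to split the 150K items into smaller batches before uploading. Below is a minimal sketch of that idea against the Orchestrator OData API; the URL, folder ID, batch size, and payload field names are assumptions to be checked against your Orchestrator version's API docs, not a tested implementation.

```python
import json
import urllib.request

ORCH_URL = "https://orchestrator.example.com"  # hypothetical on-prem URL
BATCH_SIZE = 10_000  # stay well below the failures observed around 20K-40K

def chunk(items, size=BATCH_SIZE):
    """Yield successive slices of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def bulk_add(queue_name, items, token, folder_id):
    """POST each batch to the BulkAddQueueItems OData endpoint (assumed path)."""
    url = f"{ORCH_URL}/odata/Queues/UiPathODataSvc.BulkAddQueueItems"
    for batch in chunk(items):
        body = json.dumps({
            "queueName": queue_name,
            "commitType": "ProcessAllIndependently",
            "queueItems": [
                {"Name": queue_name, "Priority": "Normal",
                 "SpecificContent": item}
                for item in batch
            ],
        }).encode()
        req = urllib.request.Request(url, data=body, method="POST", headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
            "X-UIPATH-OrganizationUnitId": str(folder_id),
        })
        with urllib.request.urlopen(req) as resp:
            resp.read()  # raise on HTTP errors happens inside urlopen

# 150K items split at 10K per batch gives 15 requests
print(sum(1 for _ in chunk(list(range(150_000)))))  # → 15
```

The batch size is deliberately conservative; smaller batches also keep each request's payload well under any request-size limits on the server side.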
We are not concerned about the Dispatcher or Performer part; we are checking whether an Orchestrator queue has a limit on the maximum number of items it can hold, or whether bulk-adding this data to the queue causes any Orchestrator performance issues.
As long as your SQL server has enough storage, the number of queue items can grow without problems. But with Bulk Add Queue Items we cannot add all 150K+ items at once; we have to split them into batches, because otherwise it would fail. If your server has good capacity, there is no hard limit on the number of items a queue can hold.
Hope this gives some information
How about trying it and telling us if it worked?
Thank you for your time.
I think the Bulk Add Queue Items activity has a limit of 20K or less. In any case, we won't be using that activity, as we need to filter on multiple conditions before adding items to the queue.
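If you are filtering first and then adding the survivors individually, the per-item route looks roughly like this. The filter conditions, field names, and endpoint details here are hypothetical placeholders for your actual rules; the AddQueueItem call mirrors what the Add Queue Item activity does under the hood.

```python
import json
import urllib.request

ORCH_URL = "https://orchestrator.example.com"  # hypothetical on-prem URL

def passes_filters(item):
    """Example multi-condition filter; replace with your real rules."""
    return item.get("Status") == "New" and item.get("Amount", 0) > 0

def add_queue_item(queue_name, item, token, folder_id):
    """POST one item to the AddQueueItem OData endpoint (assumed path)."""
    url = f"{ORCH_URL}/odata/Queues/UiPathODataSvc.AddQueueItem"
    body = json.dumps({
        "itemData": {
            "Name": queue_name,
            "Priority": "Normal",
            "SpecificContent": item,
        }
    }).encode()
    req = urllib.request.Request(url, data=body, method="POST", headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
        "X-UIPATH-OrganizationUnitId": str(folder_id),
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Filter first, then add only the items that pass
items = [{"Status": "New", "Amount": 10}, {"Status": "Done", "Amount": 5}]
filtered = [i for i in items if passes_filters(i)]
print(len(filtered))  # → 1
```

At under 10 seconds per transaction, adding items one at a time in a loop is also how the Dispatcher side of a Dispatcher/Performer model typically works, so this fits the weekly volume comfortably.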
Sure, I will update the details once the process is completed.