Hello everyone. I'm trying to figure out the best approach to complete a workflow I need to build.
Basically, I need to read a CSV file and build 6 different data tables from it, each holding the data that goes into an API call. The problem is that one of those data tables has around 30k records, but the API can only take up to 4 records per call, and you can make at most 300 API calls per 15 seconds. I don't know how to loop the process so that it puts the first 4 records into the API call body, makes the call, then takes the next 4 records, and so on. If you have any ideas, please let me know. Any help would be highly appreciated.
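The loop you describe is a chunk-and-throttle pattern. Here is a minimal Python sketch of the logic (in UiPath you would mirror it with a Do While and counter variables); `call_api`, the row data, and the limits are placeholders taken from your description, not a real API:

```python
import time

def batches(rows, size=4):
    """Yield successive chunks of `size` rows from a list."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def call_api(batch):
    # Placeholder: replace with the real HTTP request carrying the 4 records.
    print(f"sending {len(batch)} records")

MAX_CALLS = 300   # rate limit: 300 calls...
WINDOW = 15.0     # ...per 15 seconds

rows = list(range(10))          # stand-in for the 30k-record data table
calls_in_window = 0
window_start = time.monotonic()

for batch in batches(rows, 4):
    if calls_in_window >= MAX_CALLS:
        elapsed = time.monotonic() - window_start
        if elapsed < WINDOW:
            time.sleep(WINDOW - elapsed)  # wait out the rest of the window
        calls_in_window = 0
        window_start = time.monotonic()
    call_api(batch)             # one call per group of up to 4 records
    calls_in_window += 1
```

The last batch may hold fewer than 4 records, so make sure the API call body handles a short batch.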
You can use Orchestrator queues for that:
- First, append your data table rows to a queue with the Bulk Add Queue Items activity.
- Then retrieve the items from the queue and send them to the API calls.
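To illustrate the consuming side, here is a small Python sketch of pulling queue items one at a time and grouping them into batches of 4, one API call per batch. The in-memory `deque` is just a stand-in for an Orchestrator queue, and `drain_in_batches` is a hypothetical helper, not a UiPath activity:

```python
from collections import deque

def drain_in_batches(queue, batch_size=4):
    """Pull items off the queue one by one (like Get Transaction Item
    in a loop) and group them into batches of `batch_size`."""
    all_batches = []
    batch = []
    while queue:
        batch.append(queue.popleft())   # next "transaction item"
        if len(batch) == batch_size:
            all_batches.append(batch)   # would trigger one API call
            batch = []
    if batch:                           # leftover partial batch
        all_batches.append(batch)
    return all_batches
```

Each batch would then be serialized into the API call body before marking the corresponding queue items as done.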
Thanks for the suggested solution, @Baskar_Gurumoorthy.
Could you provide some more detailed information about Orchestrator queues in general, or ideally a sample workflow? These activities are basically new to me, so any more detailed help would be highly appreciated. Thanks!
You can try the stages below:
- Read the CSV file and store all the transactions in a database, or pull transactions as needed for each API call; fetch the specific records and mark them Done.
OR
- Split the records into small files if it's a bulk call, then process them one by one.
- Parallel activity: using this activity you can make multiple requests at one time.
Note: please create one reference number for each request and store it in the database, so you can track which API calls passed and which failed.
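The splitting step above can be sketched in Python as follows. This is only an illustration, assuming a plain CSV with a header row; `split_csv` and its file naming are made up for the example:

```python
import csv
from pathlib import Path

def split_csv(src, out_dir, batch_size=4):
    """Split a CSV into small files of `batch_size` data rows each,
    repeating the header row in every chunk file."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with open(src, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)       # keep the header for every chunk
        rows = list(reader)
    paths = []
    for n, i in enumerate(range(0, len(rows), batch_size)):
        path = out / f"batch_{n:05d}.csv"
        with open(path, "w", newline="") as g:
            writer = csv.writer(g)
            writer.writerow(header)
            writer.writerows(rows[i:i + batch_size])
        paths.append(path)
    return paths
```

Each chunk file then maps to exactly one API call, which also makes it easy to log a reference number per file for the pass/fail tracking mentioned in the note.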
@Povilas_Jonikas I hope you got the idea of how it works; as you can see, there are multiple approaches to solve this issue.
This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.