I have a 300k-record transaction file (Excel). The processing time without improvements is about 6 seconds per record, so in a fault-free world with continuous running it would take roughly three weeks (300,000 × 6 s ≈ 500 hours). There are some obvious things to do, but I would be interested in anyone else's list of best-practice tips, or links to best practice, on handling large volumes.
If you have large amounts of data like this, the ReFramework with Queue Items in Orchestrator is perfect for you. One of the main points of the ReFramework is that if an error occurs on a transaction, the process stops, restarts, and resumes automatically.
Also, in Orchestrator you can monitor queue performance: how many transactions were processed, how many errors occurred, and other stats.
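For reference, queue items can also be populated outside Studio through the Orchestrator REST API. Below is a minimal Python sketch, not a definitive implementation: the Orchestrator URL, tenant, credentials, queue name, and `SpecificContent` fields are all hypothetical, and this uses the classic on-premises authentication endpoint (cloud Orchestrator uses OAuth instead).

```python
import requests

ORCH_URL = "https://orchestrator.example.com"   # hypothetical Orchestrator URL
TENANT = "Default"                              # assumed tenant name
QUEUE_NAME = "TransactionsQueue"                # assumed queue name

# Authenticate (classic on-prem endpoint; returns a bearer token)
auth = requests.post(
    f"{ORCH_URL}/api/Account/Authenticate",
    json={
        "tenancyName": TENANT,
        "usernameOrEmailAddress": "robot_user",  # placeholder credentials
        "password": "secret",
    },
)
token = auth.json()["result"]

# Add one transaction as a queue item via the AddQueueItem OData endpoint
item = {
    "itemData": {
        "Name": QUEUE_NAME,
        "Priority": "Normal",
        # Example fields only; your transaction columns will differ
        "SpecificContent": {"AccountId": "12345", "Amount": "99.50"},
    }
}
resp = requests.post(
    f"{ORCH_URL}/odata/Queues/UiPathODataSvc.AddQueueItem",
    json=item,
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()
```

In a typical dispatcher you would of course use the Add Queue Item (or Bulk Add Queue Items) activity inside the workflow itself; the REST call is just useful when the dispatcher lives outside UiPath.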
Thanks for the reply. A bit more info on the implementation: the workflow is built with the REFramework as a dispatcher and a performer, and the dispatcher does a bulk load. I am going to experiment with two robots.
@ChrisC, wow, that's a lot of records to process, and it's difficult to give advice with so little knowledge of the process; if I knew it, maybe I could suggest something. Just out of curiosity:
Why did you choose Excel rather than a database, like MS Access or MySQL?
And I am curious how you are manipulating this data: are you performing calculations, or what does the process briefly involve?
Lastly, why didn't you just go with old-fashioned VBA macros instead and write some code in that workbook to do whatever you need to happen? What is the significance of UiPath to your project?
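To illustrate the database suggestion: a one-time load of the workbook into a local database lets every subsequent lookup hit an index instead of re-reading Excel. A rough sketch, assuming pandas/SQLite and a hypothetical file `transactions.xlsx` with a hypothetical `AccountId` column:

```python
import sqlite3
import pandas as pd

# One-time load: read the whole workbook and store it in SQLite
df = pd.read_excel("transactions.xlsx")          # hypothetical input file
con = sqlite3.connect("transactions.db")
df.to_sql("transactions", con, if_exists="replace", index=False)

# Subsequent lookups hit an indexed table instead of the spreadsheet
con.execute("CREATE INDEX IF NOT EXISTS idx_account ON transactions (AccountId)")
rows = con.execute(
    "SELECT * FROM transactions WHERE AccountId = ?", ("12345",)
).fetchall()
con.close()
```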
@ChrisC - if you read 300k records, keep them all in the process buffer, and fetch each record from there to process, that can lead to high CPU/memory utilisation and low performance…
Maybe you can try two individual processes (a sketch of the first one follows below):
1 - one to push the bulk data to queues (a one-time activity, or run per data load, which releases the memory once the activity completes)
2 - one to process each queue item (if you have multiple robots free, you can dynamically allocate multiple robots)
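To sketch point 1: rather than loading all 300k rows into memory, the dispatcher can stream rows and push them to the queue in batches. A minimal Python sketch, assuming the same hypothetical Orchestrator setup and token as the earlier example, a hypothetical two-column sheet layout, openpyxl's read-only mode (which reads rows lazily), and the `BulkAddQueueItems` OData endpoint:

```python
import requests
from openpyxl import load_workbook

ORCH_URL = "https://orchestrator.example.com"   # hypothetical, as above
QUEUE_NAME = "TransactionsQueue"                # assumed queue name
BATCH_SIZE = 500                                # tune to taste
TOKEN = "..."  # bearer token obtained as in the earlier authentication sketch

def push_batch(batch):
    """Bulk-add a batch of queue items in a single Orchestrator call."""
    resp = requests.post(
        f"{ORCH_URL}/odata/Queues/UiPathODataSvc.BulkAddQueueItems",
        json={
            "queueName": QUEUE_NAME,
            "commitType": "StopOnFirstFailure",
            "queueItems": batch,
        },
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()

# read_only mode streams rows instead of loading the whole 300k-row sheet
wb = load_workbook("transactions.xlsx", read_only=True)
ws = wb.active

batch = []
for account_id, amount in ws.iter_rows(min_row=2, values_only=True):
    batch.append({
        "Name": QUEUE_NAME,
        "Priority": "Normal",
        # Hypothetical columns; replace with your real transaction fields
        "SpecificContent": {"AccountId": str(account_id), "Amount": str(amount)},
    })
    if len(batch) >= BATCH_SIZE:
        push_batch(batch)
        batch = []          # release the batch before reading more rows

if batch:                   # flush any remainder
    push_batch(batch)
wb.close()
```

Once the queue is populated, point 2 scales naturally: every performer robot just calls Get Transaction Item against the same queue, so a second robot roughly halves the wall-clock time without any changes to the workflow.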