My Challenge: An internal website becomes slow to load after processing ~100 items from an .xlsx document
Task:
Using the ReFramework
- Read the original input .xlsx document to get the total number of data rows to process (normally over 10,000 rows)
- Split the input .xlsx into batches of 100 rows
Loop:
- Log in to the site
- Process each item in the current batch
- Log out
- Increment to the next batch
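The login / process-batch / logout loop above can be sketched outside UiPath. Below is a minimal Python illustration of the intended control flow; the `login`, `process_item` and `logout` callables are hypothetical stand-ins for the corresponding UiPath workflows:

```python
import math

def batches(rows, batch_size=100):
    """Split rows into consecutive batches of batch_size; the last batch may be shorter."""
    num_batches = math.ceil(len(rows) / batch_size)
    return [rows[i * batch_size:(i + 1) * batch_size] for i in range(num_batches)]

def run(rows, login, process_item, logout, batch_size=100):
    """Mirror the outline: log in once per batch, work every item, then log out."""
    for chunk in batches(rows, batch_size):
        login()                     # stand-in for the 'Log in to site' workflow
        for item in chunk:
            process_item(item)      # stand-in for per-item processing
        logout()                    # stand-in for the 'Log out' workflow
```

In the ReFramework the same structure usually maps to Init (read dtInput, compute the batches), Get Transaction Data (hand out the next batch), and Process Transaction (login, work the batch, logout).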
What I have tried:
I currently have a basic workflow completing the above in a loop; however, the internal website I need to interact with throws so many exceptions that it would be more reliable to move this process into the ReFramework.
Initially the data to process is captured from an internal .xlsx document (dtInput), so I read this in an initial sequence and calculate the number of batches to process (with intBatchSize = 100).
I then assign intNumberOfBatches = Convert.ToInt32(Math.Ceiling(dtInput.Rows.Count / intBatchSize))
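A note on the division inside Math.Ceiling: in UiPath's VB.NET expressions, `/` on two integers produces a Double, so the ceiling works as intended; a C#-style truncating integer division would silently drop the partial last batch. The arithmetic can be sanity-checked with a quick Python sketch (the 10,050-row count is an assumed example):

```python
import math

total_rows = 10_050   # assumed example row count
batch_size = 100      # intBatchSize

# Ceiling division: 10,050 rows at 100 per batch -> 101 batches
num_batches = math.ceil(total_rows / batch_size)

# Pitfall: truncating integer division before taking the ceiling
# loses the 50-row partial batch (100 instead of 101).
truncated = total_rows // batch_size
```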
Within the next sequence I have:
Login workflow…
For Each batch In Enumerable.Range(0, intNumberOfBatches).ToList
RowSet = dtInput.AsEnumerable.Skip(batch * intBatchSize).Take(intBatchSize).ToList
This skips the rows that have already been 'worked' (nothing is actually removed from dtInput; Skip just jumps past them)
// Copy the contents of the next batch from the RowSet list into a DataTable:
dtRowSet = RowSet.CopyToDataTable
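For clarity on what Skip/Take does here: it selects the batch-th window of rows without modifying dtInput. A minimal Python equivalent, with a plain list standing in for the DataTable's rows:

```python
def row_window(rows, batch, batch_size=100):
    """Python equivalent of dtInput.AsEnumerable.Skip(batch * batch_size).Take(batch_size):
    return the batch-th window of up to batch_size rows; rows itself is left untouched."""
    start = batch * batch_size
    return rows[start:start + batch_size]
```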
Then:
For Each item In RowSet
// Process accordingly
Log out workflow
Apologies, I can't upload my existing workflow as it contains internal data, but can anyone help transform this into the ReFramework?
Thank you in advance