Hi, I have a very large Excel file with 350k rows. I know that large Excel files have to be read in chunks, but somehow I'm unable to put that logically into a workflow. How can this be achieved? Any pointers that help me get started would be appreciated.
One suggestion would be to add a variable called counter. Before your For Each activity, create the variable and set it to 1 (or 0). After you do everything inside the For Each, increment counter with an Assign activity: counter = counter + 1.
Put all the work that has to be done inside an If activity. In the If condition, write something like "counter >= 10000". So if counter has reached 10000, use a Break activity to exit the loop and hand off that chunk. If counter is still below 10000, continue with your work and, at the end of it, increment the counter again as described above.
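In case it helps to see the logic outside of the workflow designer, here is a minimal Python sketch of the same counter/break pattern. This is only an illustration of the idea, not UiPath code: `process_in_chunks`, `rows`, and `chunk_size` are hypothetical names, and the rows are simulated with a plain range rather than a real Excel read.

```python
def process_in_chunks(rows, chunk_size=10000):
    """Mimic the counter/break pattern described above: walk the rows,
    count each processed row, and 'break' (yield the batch) once the
    counter reaches chunk_size, then reset and continue."""
    counter = 0      # the counter variable created before the loop
    batch = []
    for row in rows:             # the For Each activity
        batch.append(row)        # "do everything" for this row
        counter = counter + 1    # the Assign activity
        if counter >= chunk_size:  # the If condition
            yield batch            # the Break: hand off this chunk
            batch = []
            counter = 0            # reset before the next pass
    if batch:                      # any leftover rows form the last chunk
        yield batch

# Example: 350,000 dummy rows split into chunks of 10,000
chunks = list(process_in_chunks(range(350_000), 10_000))
print(len(chunks))      # 35 chunks
print(len(chunks[0]))   # 10000 rows in each full chunk
```

Each yielded batch corresponds to one pass of the loop before the Break fires; in the actual workflow you would do your per-chunk processing at that point instead of yielding.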
Do you have a screenshot of an example of this?