This is the sample data in the input sheet:
Also, I have created an output sheet to upload to the DB; since the DB allows only 100 records per upload, I am batching the data this way.
After each successful upload I delete the output sheet's data so I can stage the next batch (the next 100 records) using the method below,
and I will repeat these steps over the full 4L (400,000) records.
Kindly let me know other possible ways to handle this.
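For reference, each batch I stage corresponds roughly to this expression (a sketch in UiPath VB.NET syntax; dtInput, dtBatch, and batchIndex are placeholder names, and the batch size of 100 comes from the DB limit):

' Slice the next 100 rows out of the full input table
dtBatch = dtInput.AsEnumerable() _
                 .Skip(batchIndex * 100) _
                 .Take(100) _
                 .CopyToDataTable()
' Note: CopyToDataTable throws on an empty slice, so the loop must stop
' once batchIndex * 100 >= dtInput.Rows.Count.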
ppr (Peter Preuss)
March 25, 2024, 3:27pm
We would assume, or ask, whether the datatable can be used directly for the DB update.
In general we would expect a flow like this:
Read In Excel - All Data - dtOrig
Assign: TableList = dtOrig.Chunk(… (a concrete expression is sketched after the two options)
For each | item in TableList
do the DB Update/Actions with item (item represents the segmented data)
OR
Read In Excel - All Data - dtOrig
Assign: TableList = dtOrig.Chunk(…
For each | item in TableList
Write Range - use item and write it to a new Excel / new sheet (this avoids having to clear the previous data)
Use the newly created Excel for the DB Update/Actions
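For reference, the Chunk assign above can be written like this in a modern (.NET 6+) UiPath project, where Enumerable.Chunk is available (TableList typed as List(Of DataTable); the batch size of 100 is taken from the post):

' Assign (VB): split dtOrig into DataTables of at most 100 rows each
TableList = dtOrig.AsEnumerable() _
                  .Chunk(100) _
                  .Select(Function(rows) rows.CopyToDataTable()) _
                  .ToList()
' Each item of the For Each is then a ready-to-use 100-row DataTable.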
The second option is what I am currently working with. If I create a new Excel file for each 100-record upload, I will end up with 4,000 new Excel files, and pointing the DB at the input is also risky unless I keep a constant file name, so I will reuse the same output file, cleaned before each upload.
ppr (Peter Preuss)
March 25, 2024, 3:40pm
The For Each activity has an out-of-the-box Index output property, and you can use it for the Excel filename / sheet name as well.
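A minimal sketch, assuming the Index output is stored in a variable idx (0-based) and each batch is written to its own sheet of one workbook (the file and sheet names here are illustrative):

' Assign (VB): build a unique sheet name from the loop index
SheetName = "Batch_" & (idx + 1).ToString("D4")   ' "Batch_0001", "Batch_0002", ...
' Write Range: workbook "Output.xlsx", sheet SheetName, data item
' Each batch lands on its own sheet, so nothing has to be cleaned up
' and the DB upload step can always target the sheet just written.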