How to upload scraped data to a queue one item at a time?

Hello All,

After scraping, the data table contains 5 rows.
When adding to the queue, the bot should add only the first row; after adding that item to the queue, it should process it.
Then it should add the second row to the queue, process that transaction, and so on.

Thanks in advance

@naveen.s

After extracting the data table, use a “For Each Row” activity to iterate through the rows. Inside the loop, use the “Add Transaction Item” activity to add an item to the queue. This activity also provides an output called “TransactionItem”, which you can use for further processing.

Process each transaction within that loop. After executing one transaction, the bot will upload the next queue item and continue the process in this manner.
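The add-one-then-process-one pattern above can be sketched in plain code. This is a hedged Python illustration, not UiPath activities: the list stands in for an Orchestrator queue, and `add_transaction_item` / `process_transaction` are hypothetical stand-ins for the Add Transaction Item activity and your Process state.

```python
# Sketch of "add one queue item, process it, then move to the next row".
# A plain list stands in for an Orchestrator queue; the two functions
# mirror the roles of Add Transaction Item and the Process state.

def add_transaction_item(queue, row):
    """Stand-in for Add Transaction Item: appends and returns the new item."""
    item = {"reference": row["id"], "data": row, "status": "New"}
    queue.append(item)
    return item  # UiPath exposes this as the TransactionItem output

def process_transaction(item):
    """Stand-in for the Process state: mark the item done."""
    item["status"] = "Successful"

def run(rows):
    queue = []
    for row in rows:                             # "For Each Row" over the table
        item = add_transaction_item(queue, row)  # add ONE row to the queue
        process_transaction(item)                # process it before adding the next
    return queue

rows = [{"id": i, "value": f"row-{i}"} for i in range(5)]
result = run(rows)
```

The key point the sketch shows: the add and the processing happen inside the same loop iteration, so at no point is more than one unprocessed item queued ahead.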

I hope this will resolve your query!

Best regards,
Ajay Mishra

@naveen.s,

You will have to make your bot iterative. Here is a sample REF (Robotic Enterprise Framework) modified to be iterative.

I just updated the logic in Get Transaction Data as shown in the attached sample to make it iterative.

Sample Code:
IterativeRoboticEnterpriseFramework.zip (1.5 MB)

You can refer to this video to understand the iterative type of process:

The number of rows in the data table is dynamic.
The bot triggers every 30 minutes, so it extracts the same table every 30 minutes.

My actual requirement: every 30 minutes, the bot first needs to extract the data from the table, add it to the queue, and then process all the queue items.
After the next 30 minutes, the bot should extract the same table again and check whether each row has already been added to the queue. If it is present, the bot should skip that row; if not, it should add it and then process each item, and so on.
The bot should repeat this cycle every 30 minutes.

@naveen.s Firstly, you asked a different question, and I believe the solution I provided above is the best fit for that scenario, no worries!

Now for this: what type of data are you extracting, and does it have a unique column? If yes, then while uploading queue items we will pass that value as the unique Reference. With that in place, when the next execution extracts the same data and uploads the queue item again, the Add Queue Item activity will throw a duplicate-reference error for rows that already exist. We handle that error with a Try Catch and move on to the next row.
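The unique-reference dedup logic can be sketched as follows. Again a hedged Python illustration, not the real Orchestrator API: `add_queue_item` and `DuplicateReferenceError` are hypothetical stand-ins for the Add Queue Item activity and the duplicate-reference exception it throws when the queue enforces unique references.

```python
# Sketch of deduplication via a unique Reference: adding an item whose
# reference already exists raises an error, which the dispatcher catches
# in order to skip rows already queued by a previous run.

class DuplicateReferenceError(Exception):
    """Stand-in for the error Add Queue Item throws on a duplicate reference."""

def add_queue_item(queue, reference, data):
    """Stand-in for Add Queue Item with unique references enforced."""
    if any(item["reference"] == reference for item in queue):
        raise DuplicateReferenceError(reference)
    queue.append({"reference": reference, "data": data})

def dispatch(queue, rows):
    """Add each row, skipping any whose reference is already queued."""
    added = 0
    for row in rows:
        try:
            add_queue_item(queue, row["id"], row)  # unique column as Reference
            added += 1
        except DuplicateReferenceError:
            continue  # already queued in a previous run: skip this row
    return added

queue = []
first_run = dispatch(queue, [{"id": 1}, {"id": 2}])            # initial extract
second_run = dispatch(queue, [{"id": 1}, {"id": 2}, {"id": 3}])  # 30 minutes later
```

In the second run only the new row (`id` 3) is added; the two previously queued rows are skipped by the Try Catch, which is exactly the behavior the 30-minute schedule needs.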

This will definitely resolve your query!

Regards,
Ajay Mishra