How to use ExtractDataTable two times in a workflow

Installer(.exe or .msi):

License type(Free, Trial/License code):

Studio/Robot version:

Current behavior:

Screenshot:

I am getting a list of products, then opening each product URL one by one and scraping its data, and I want to write the results to a CSV file.
I am not able to do this.
Please help me.

Hi @ayush.kumar

Create an Excel file, pass it through a For Each Row, and do the data scraping for each row accordingly.

Thanks
Ashwin S

@AshwinS2
I am not able to attach my workflow.
I have tried this strategy, but it is not working.
How can I share my workflow with you?

@ayush.kumar
Drag and drop your workflow into this box before you click Reply.

I am a new user, and new users cannot upload files.
This message is shown when I try to upload.

@ayush.kumar,
I understand. In that case, can you please provide more details so that I can visualize it?

I have scraped the product listing.
Then I call each product URL one by one, open the page, and scrape its data.
Now I do not understand how to write the data to Excel.

@ayush.kumar,

What is all this? Sorry, I still didn't understand.

I have scraped the product listing.
Then I call each product URL one by one, open the page, and scrape its data.
Now I do not understand how to write the data to Excel.

@ayush.kumar,

What I understand is that you need to perform data scraping on two different pages of a website.
You can do the data scraping on the first table into dt1,
and then perform the data scraping on the second table into dt2.
Note: use a different DataTable variable for each case.

Now you can use the Write Range activity, where you set the file name, sheet name, and range.
In the DataTable field, you pass the DataTable variable (dt1 or dt2).
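
Since your original goal was a CSV file, the same DataTable can equally be passed to a Write CSV activity (file path plus the DataTable variable). Purely as an illustration of what that write step amounts to, here is a minimal VB.NET sketch; the columns, the sample row, and the output path "products.csv" are placeholder assumptions, not taken from your workflow:

```vb
' Minimal sketch of turning a scraped DataTable into a CSV file.
' In Studio you would simply pass the DataTable variable to Write Range (Excel)
' or Write CSV; this only shows the equivalent logic.
Imports System.Collections.Generic
Imports System.Data
Imports System.IO
Imports System.Linq

Module WriteTableSketch
    Sub Main()
        ' Stand-in for the DataTable produced by the first data scraping (dt1).
        Dim dt1 As New DataTable()
        dt1.Columns.Add("product_name")
        dt1.Columns.Add("final_price")
        dt1.Rows.Add("Sample product", "Rp100.000")

        ' Header row first, then one comma-separated line per DataRow
        ' (naive: values containing commas are not quoted here).
        Dim lines As New List(Of String)
        lines.Add(String.Join(",", dt1.Columns.Cast(Of DataColumn)().Select(Function(c) c.ColumnName)))
        For Each row As DataRow In dt1.Rows
            lines.Add(String.Join(",", row.ItemArray.Select(Function(v) v.ToString())))
        Next
        File.WriteAllLines("products.csv", lines)
    End Sub
End Module
```

Note that this sketch joins values with plain commas and does not quote fields, so in practice it is better to let the Write CSV or Write Range activity handle the formatting for you.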

Can you send me a test email?
I will share the workflow with you.

I am new to RPA.
Please help.

@ayush.kumar

Done.

@hacky

Please check your email.

@hacky

Did you find the issue in my workflow?

@ayush.kumar,

As I see it, almost all the activities are showing the message (missing or invalid activity).
This happens because I don't have the packages that you used in your workflow.

All I understand is that you have created a flow which extracts a couple of tables from “https://www.lazada.co.id/”. Apart from that, there is no clear explanation of the steps, and the screenshots are also not visible.

If you could explain to me the EXACT STEPS that are happening, then I could create a fresh workflow for you following best practices.

Thanks and regards,
@hacky

@hacky
I want to search for any keyword on the website.
After that, I want to get the full product list with product_name, img_url, final_price, initial_price, discount, review, rating, seller_name, etc.

@ayush.kumar

[Screenshot: Capture]

For performing the data scraping on this page, you need to fix the display mode of the list, i.e.:

  1. Should it be the thumbnail-type view, or
  2. Should it be the list view?
    (Note: the options for this are shown in the image shared above.)

No need to change anything.
It's the default.

@ayush.kumar

I see, but when I open the URL, I get either of the two views intermittently.
Also, the data scraping can be checked by using a Message Box that prints the following: dt1.Rows.Count.ToString
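
For clarity, dt1.Rows.Count.ToString is just a VB.NET expression, where dt1 is whatever your scraped DataTable variable is called. Here is a tiny stand-alone sketch of what it evaluates to (the stand-in table below is an assumption, since dt1 only exists inside your workflow):

```vb
Imports System
Imports System.Data

Module RowCountCheck
    Sub Main()
        ' Stand-in for the scraped DataTable; in Studio, dt1 already exists as a variable.
        Dim dt1 As New DataTable()
        dt1.Columns.Add("product_name")
        dt1.Rows.Add("Sample product A")
        dt1.Rows.Add("Sample product B")

        ' Equivalent of putting dt1.Rows.Count.ToString in a Message Box or Log Message:
        Console.WriteLine("Rows extracted: " & dt1.Rows.Count.ToString) ' prints "Rows extracted: 2"
    End Sub
End Module
```

If the count is 0, the extraction itself is the problem; if it is greater than 0, the issue lies in the write step.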

Also, please get into the practice of checking the workflow in modules; this makes testing much easier.

The activities that you have used are showing error messages and cannot be recognized on my machine.

@hacky
Yes, I will improve.
I am new and still learning.
I will practice with your workflow.
When can I expect you to send me the workflow?