Data scraping with For Each Row: multiple output lines per input, with the original input repeated on each output row

Hi, I'm trying to scrape data from a website using For Each Row. My input is simply a part number; for each part number the website has 3-10 prices and I need all of them, but I would like the Excel output to have the original input part number as the first column for all the associated prices.

Example: input part number 3 has prices 7, 10, 11; the next part number 4 has prices 2, 8, 11. Desired output:
3 - 7
3 - 10
3 - 11
4 - 2
4 - 8
4 - 11

I'm using Write Range.

Hi @Mads_Hoxer_Larsen

Welcome to the UiPath Community.

What are you using to scrape the data from the website? Are you using the Get Text activity?

Hi @Mads_Hoxer_Larsen !
Welcome to UiPath Community :grinning_face_with_smiling_eyes:
If I understand correctly, the output you want is:
3 - 7,10,11
4 - 2,8,11
Is that it?
It would be easier to work on your .xaml; would you mind sharing it with us here?

@Mads_Hoxer_Larsen are you entering the part numbers manually, or are they present in Excel or in some collection?

1. Read the part numbers from Excel into a DataTable (DT).
2. For Each Row in DT:
PT_num = row.Item("Part_number").ToString — use an Assign activity.
3. Open the website and enter PT_num.
4. Use the Find Children activity to get all the prices.
5. Use For Each to loop through the output of Find Children:
for each item in str_collection
use two Write Cell activities: one for the part number (column A) and another for the price (column B).
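The steps above boil down to repeating the part number once per scraped price. Here is a minimal Python sketch of that row-repetition logic for illustration only — the `scraped` dictionary stands in for the Find Children output, and in UiPath the equivalent would be Write Cell or Add Data Row activities inside the nested loops:

```python
# Hypothetical scraped data: part number -> list of prices.
# The values are the example numbers from the question, not real data.
scraped = {
    "3": [7, 10, 11],
    "4": [2, 8, 11],
}

def flatten_prices(scraped):
    """Repeat each part number once per associated price,
    mirroring the two Write Cell calls per loop iteration."""
    rows = []
    for part_number, prices in scraped.items():
        for price in prices:
            rows.append((part_number, price))  # column A, column B
    return rows

for part, price in flatten_prices(scraped):
    print(f"{part} - {price}")
```

Running this prints the flattened pairs (3 - 7, 3 - 10, 3 - 11, 4 - 2, ...), which is exactly the layout requested in the original post.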

This is the solution as per my understanding from your post.

Please provide more input so that we can give you a more accurate solution.

Thank you

This is my current version, with an Excel input like this:

The output currently look like this:

Currently it creates a sheet for each original part number, and in that sheet lists the alternative part numbers and the related information we are screening.
My hope is it will look like this instead:

That is, with the original part number in column A for all the scraped information, so E149134 is repeated for every result of that search.

Thank you so much already for your effort.

I'm using the Data Scraping tool.