Duplicated row in Append item to collection

I used it, but it seems to duplicate only the last row? I don't understand why.

I use a For Each to loop through the list of DataTables, and inside it another For Each to loop through the rows of each DataTable. Inside that inner loop I add the Append Item to Collection activity, but it duplicates the last row as many times as the DataTable has rows.


@Hisham_Alshareef

First of all, your Append Item to Collection output is empty… how are you verifying that?

Cheers

@Anil_G
Yes, I forgot: the screenshot was taken before I assigned a variable to the output collection. I did assign a variable to it, and I get the same error.

@Anil_G

First we initialize a variable of type list of dictionaries:

arraylist1 = New List(Of Dictionary(Of String, String))
json_dict = New Dictionary(Of String, String)

Then, inside the For Each over the DataTable, we use the Append Item to Collection activity with argument type Dictionary(Of String, String),

and the collection is the list variable above: arraylist1.

Now, we want to add these dictionaries to the list.

Note that json_dict is a dictionary holding the row values, and it works fine…

@ppr

this is the post I meant…

@Hisham_Alshareef

Can you tell us what you are trying to achieve? First you have a list (or you create a list) of DataTables, then you loop through each one, and in each you loop through each row and add it to a collection?

Is this what you are trying to achieve?

Cheers

Yes …

We have a list of DataTables; this is the first loop.

Then inside this loop we have another loop, For Each Row in DataTable.

Then we need to add the dictionary json_dict into the list arraylist1, inside the For Each Row loop…

@Rawan.md
thanks for your PM

just share some sample data, so we can check for rewriting options

Set the Output property to arraylist1 or another variable, but don't leave it empty.

I will share with you a sample project

sampleData.xlsx (8.8 KB)

Sample.xaml (47.8 KB)

In each DataTable it just takes the last row and duplicates it as many times as the table has rows…

I can't understand why!

@Rawan.md

Here you are adding data with the same key string, so the dictionary item gets rewritten on every DataTable loop.

No DataTable is assigned.

cheers
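The symptom described above (the list filled with copies of the last row) is the classic same-reference pitfall: when json_dict is created once outside the loop, every Append Item to Collection adds a reference to the *same* object, and each iteration overwrites its values. A minimal Python sketch of the equivalent logic (hypothetical data, not the actual workflow):

```python
# Hypothetical rows standing in for the DataTable rows.
rows = [{"key1": "a"}, {"key1": "b"}, {"key1": "c"}]

# Buggy pattern: json_dict is created ONCE, outside the row loop.
arraylist1 = []
json_dict = {}
for row in rows:
    json_dict["key1"] = row["key1"]   # overwrites the same object each time
    arraylist1.append(json_dict)      # appends a reference, not a copy

print(arraylist1)  # [{'key1': 'c'}, {'key1': 'c'}, {'key1': 'c'}] - last row repeated

# Fix: create a fresh dictionary inside the loop.
fixed = []
for row in rows:
    d = {"key1": row["key1"]}         # new object per row
    fixed.append(d)

print(fixed)  # [{'key1': 'a'}, {'key1': 'b'}, {'key1': 'c'}]
```

In UiPath terms: move the Assign that does `json_dict = New Dictionary(Of String, String)` *inside* the inner For Each Row loop.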

This is the final JSON format I need for each DataTable:

```json
{
  "bulk_data": [
    {"key1": "value1", "key2": "value2"},
    {"key1": "value1", "key2": "value2"},
    {"key1": "value1", "key2": "value2"}
  ]
}
```
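Once the list of per-row dictionaries is built correctly, serializing it to the format above is a one-liner. A Python sketch (placeholder values, not the real data):

```python
import json

# Hypothetical list of row dictionaries, one per DataTable row.
rows = [
    {"key1": "value1", "key2": "value2"},
    {"key1": "value1", "key2": "value2"},
]

# Wrap the list under the single "bulk_data" key and serialize.
payload = {"bulk_data": rows}
body = json.dumps(payload)
print(body)
```

In UiPath the equivalent would be Newtonsoft's `JsonConvert.SerializeObject` on the bulk_data dictionary.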

No, there is an Assign for the output DataTable, called dt_table,

but it seems it was removed when I created the sample project…

@Anil_G

@Rawan.md

If you want each group to be a separate item in bulk_data, then each should have a different key, and you also have to re-initialize array_list1, because each group should hold new data under its own key. To change the key, you can append a count to the key string.

cheers

I'm sorry!
I didn't get you… can you explain in more detail?

Each DataTable will be converted to the JSON format I mentioned above.
bulk_data is an object of type Dictionary(Of String, List(Of Dictionary(Of String, String))).

@Rawan.md

So here you are creating DataTables that contain data grouped by the Color column.

Then you are looping through each table, and through each row, to add each row of the DataTable to a list…


Now you are appending the data to the Bulk_data dictionary under the key "bulk_data".

Here the grouped data is clubbed back together again: for each DataTable, array_list1 still holds the old data, and after each DataTable the newly built list is written to the same "bulk_data" key, overwriting the previous value in the Bulk_data dictionary…

Instead, if you need a bulk_data entry for each color combination, you can either use the color as the key or vary the key with a count, like bulk_data("Bulk_data" + Count.ToString).

Hope this is clear

cheers
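The two fixes suggested above can be sketched together: re-initialize the row list for every DataTable, and give each group its own key by appending a counter. A Python sketch with hypothetical color-grouped data:

```python
# Hypothetical tables, each a group of rows (e.g. grouped by Color).
tables = [
    [{"color": "red"}, {"color": "red"}],
    [{"color": "blue"}],
]

bulk = {}
for count, table in enumerate(tables):
    arraylist1 = []                     # fresh list per DataTable
    for row in table:
        arraylist1.append(dict(row))    # fresh dict per row
    bulk["bulk_data" + str(count)]= arraylist1  # distinct key per group

print(list(bulk.keys()))  # ['bulk_data0', 'bulk_data1']
```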


No, I can't add anything to the bulk_data :cry:

About the DataTables created by grouping by color: that grouping is not necessary. It could be by a specific number, say 2000 rows per DataTable, because I will handle a huge amount of rows, more than 20,000…

So for each iteration of the DataTable loop over datatableList,

I will have the JSON format I need (only one key, bulk_data),

then I will send it through an HTTP Request…

This is the whole scenario.
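The chunked scenario described above can be sketched without any pre-grouped DataTables: slice the full row set into batches of 2000 and build one `{"bulk_data": [...]}` payload per batch, each of which would be sent in its own HTTP request. A Python sketch with hypothetical rows (5000 here, standing in for 20,000+):

```python
import json

CHUNK = 2000
rows = [{"id": str(i)} for i in range(5000)]   # stand-in for the real rows

payloads = []
for start in range(0, len(rows), CHUNK):
    chunk = rows[start:start + CHUNK]          # at most 2000 rows per batch
    payloads.append(json.dumps({"bulk_data": chunk}))

print(len(payloads))  # 3
```

Crucially, the chunk list is rebuilt from scratch for each batch, so nothing accumulates across iterations.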

@Rawan.md

But it is not getting separated, because the array keeps getting appended to inside both loops. So even if you divide the data into chunks of 2000 rows, at the end the array still holds all 20,000 combined. If that is what you need, then you don't need all these loops…

Cheers