How to edit data: can I edit data extracted from an XML table page in the Data Definition from data scraping?


#1

Hello Everyone,

As I continue working on my idea in UiPath, I run into some difficulties, so I try to read the learning guide and watch videos, and whenever I'm stuck I post my question here.

So, after using the data scraping function on an XML web page that contains data from a query, I want to get specific rows and columns, not the whole table. However, because the query has sub-rows under the main rows, after I perform the action to show the sub-rows, I cannot use data scraping to get the main rows and sub-rows along with the columns together; I receive an error. So the only option I have is to take the whole table, but then some extra rows and columns come along with the table I actually want. For that, I see there is an option to edit the Data Definition, and when I click on it I can see the XML.

I was wondering if there is an option in the Data Definition to delete the extra columns or rows, and if so, what it would look like, or what language I should read up on.

Also, how do I write code to delete a column or a row from an extracted table?

I have attached a screenshot; hopefully it will clarify what I mean.

Thanks,
Regards


#2

You do not need to delete anything explicitly if you are doing data scraping. When you select the columns to scrape, you might have added an empty column, so before completing the extraction, have a look at the data preview; you can delete the column right there. Otherwise, loop through the data table: if all values in a column are null, delete that column, and similarly for rows, if all of a row's values are null, delete the row.
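In UiPath itself this loop would be built with DataTable activities or an Invoke Code step; purely as an illustration of the logic described above, here is a minimal Python sketch (the `drop_empty` helper and the sample `scraped` table are hypothetical, not part of any UiPath API):

```python
# Sketch of the cleanup described above: drop any column whose values are
# all empty, then drop any row whose values are all empty.
# The table is represented as a plain list of rows (an assumption).

def drop_empty(table):
    """Remove all-empty columns, then all-empty rows, from a list-of-rows table."""
    if not table:
        return table
    # Keep only column indices that contain at least one non-empty value.
    keep = [i for i in range(len(table[0]))
            if any(row[i] not in ("", None) for row in table)]
    trimmed = [[row[i] for i in keep] for row in table]
    # Keep only rows with at least one non-empty cell.
    return [row for row in trimmed if any(cell not in ("", None) for cell in row)]

scraped = [
    ["Name", "", "Qty"],
    ["Main", "", "10"],
    ["", "", ""],       # all-empty row
    ["Sub", "", "4"],
]
print(drop_empty(scraped))
# → [['Name', 'Qty'], ['Main', '10'], ['Sub', '4']]
```

The same two passes (columns first, then rows) map directly onto two For Each loops over the extracted DataTable.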


#4

I didn't add empty columns; the error came when I tried to select only the needed columns, so the other way was to take the whole table. Regarding looping through the data table, the workflow goes like this: the bot accesses the online query, gets the data into the table, and then saves it to a CSV file. So I'm at the step after saving to CSV, and I want to filter the data so that when the CSV opens, the data is clean and contains no empty columns. How can I loop over that? Does the data need to be filtered before it goes into the CSV, right after it's captured via data scraping, or after saving it to CSV? But then how, because when the workflow starts, the CSV doesn't exist yet; it only exists after the Write CSV step. Any ideas?
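On the ordering question: the cleaning has to happen on the in-memory table after scraping and before the Write CSV step, precisely because the file does not exist until that step runs. A hedged Python sketch of that ordering (the `scraped` table, the column-trimming logic, and the `query_output.csv` filename are all illustrative assumptions, not the actual workflow):

```python
import csv

# Step order described above: scrape -> clean in memory -> write CSV.
# Cleaning runs BEFORE the file exists, so there is no chicken-and-egg problem.
# `scraped` stands in for the table the scraper returns (an assumption).
scraped = [
    ["Name", "Qty", ""],
    ["Main", "10", ""],
    ["Sub", "4", ""],
]

# Find the last column that holds any data, and trim the empty tail columns.
width = max(
    (i + 1 for row in scraped for i, cell in enumerate(row) if cell != ""),
    default=0,
)
clean = [row[:width] for row in scraped]

# Only now does the CSV file come into existence, already clean.
with open("query_output.csv", "w", newline="") as f:
    csv.writer(f).writerows(clean)
```

In the UiPath workflow the equivalent is to place the filtering loop between the Extract Structured Data activity and the Write CSV activity, operating on the DataTable variable.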

Thanks


#5

Can you attach your workflow?