Data Scraping error - Column does not belong to table DataTable

Hi,

I am trying to use Data Scraping on a table on a webpage; it was working fine until yesterday.
Once I click on the row I want to select, the robot shows the following error message:
[screenshot of the error message]

I tried initializing the data table before scraping and setting ContinueOnError to both True and False, but that didn’t work either.
What could I try in order to fix this?

Has anyone had a similar problem?
Thanks in advance. :pray:

Hi

If it was working earlier, then it should still work.

Try restarting your machine and giving it a try.

And is it occurring during data extraction, or after that?

@bp777

Hi @Palaniyappan, what do you mean by “ma home”?
It occurs during data extraction.

It was a typo
Sorry for that

I meant to restart your machine

@bp777

@bp777

Restart the machine and launch Studio.

@Palaniyappan still facing the same issue after the restart

Can you check that the column names in the DataTable are correct? @bp777

@bp777

Check the column names in the project.

Can you share a screenshot of the preview in Data Scraping?
Let’s check the DataTable columns and their names.

@bp777
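The check being suggested above can be sketched roughly like this. This is a Python stand-in for the .NET DataTable, not UiPath code, and the column names are hypothetical; in Studio you would compare the columns shown in the Data Scraping preview against the columns your DataTable was initialized with.

```python
def missing_columns(table_columns, expected_columns):
    """Return the expected columns that are absent from the scraped table."""
    return [c for c in expected_columns if c not in table_columns]

# Hypothetical column names -- replace with the ones from your own preview.
expected = ["Name", "Price", "Quantity"]
scraped = ["Name", "Price", ""]  # an empty name hints the page layout changed

print(missing_columns(scraped, expected))  # -> ['Quantity']
```

If this list is non-empty (or the scraped names contain empty strings), writing rows into the initialized table will fail with exactly this kind of error.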

Hi, check the DataTable initialization and the column data types. They may not be correct.

@bp777

Check the initialization of the data table and the column names in the DataTable.

I have initialized the data table at the beginning of the sequence.

Column names were extracted automatically, since the robot could recognize the table format on the webpage.
Currently the robot is not able to see the table and outputs the wrong column names.

@bp777 It looks like something changed in the application. Hence the bot is not able to identify the tables in the application, and so it is showing an empty column name in the error message.
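That would explain the message: if the extraction now yields an empty or changed column name, writing the scraped row into the initialized table fails. A minimal sketch of this failure mode (a Python analogue of the .NET DataTable behavior, not UiPath code; the column names are made up):

```python
def add_row(table_columns, row):
    """Mimic a .NET DataTable rejecting a row keyed by an unknown column."""
    for col in row:
        if col not in table_columns:
            raise KeyError(f"Column '{col}' does not belong to table DataTable")
    return row

initialized = {"Name", "Price"}             # columns the DataTable was built with
broken_row = {"": "ACME", "Price": "9.99"}  # "" = empty name from the changed page

try:
    add_row(initialized, broken_row)
except KeyError as e:
    print(e)  # the same kind of error the robot reports
```

The fix is on the extraction side: re-run the Data Scraping wizard so the selectors and column names match the page as it looks now.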

@bp777

Can you share the screenshot?

Definitely, it’s very weird since the same thing was working a few days ago.

@Vaibhav_Rajpoot_17 I am not able to share a screenshot of the table since it contains business data, but you can find the attached error screenshot in the post.

Edit: I have also changed the project settings to modern, but the issue still exists.