Can't scrape a website with simple fields anymore

Hi,

I was scraping a website named Chorus Forms without any problem yesterday; the robot recognized all the fields and gave me a separate selector for each one.
But today, when I went back to the selectors for these fields, I noticed they were no longer valid.
When I clicked the Screen Scraping option, I saw that UiPath can no longer scrape the data separately: it takes the whole page.
I checked the browser I am using (Mozilla Firefox 60.9 ESR x64) and it still has the UiPath Web Automation add-on. I did not change anything in the code, I did not update any package, and I did not update the browser.
Studio Pro version: 2020.4.1
UiPath.UIAutomation.Activities version: 20.4.1

I don’t understand why the robot can no longer recognize the fields on the website. How can I get the robot to recognize the fields again?

Best regards,
HB

The browser extension may be disabled. Can you try reinstalling it and then trying again?
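
If you prefer to script the reinstall instead of doing it by hand each time, something like the sketch below could work. The installer path and the /Firefox switch are assumptions based on a typical Studio install, so adjust them for your machine.

```python
import subprocess
from pathlib import Path

# Assumed location of the UiPath extension installer -- adjust to your Studio install folder.
SETUP_EXTENSIONS = Path(r"C:\Program Files (x86)\UiPath\Studio\UiPath\SetupExtensions.exe")

def reinstall_firefox_extension() -> None:
    """Re-run the UiPath extension installer for Firefox (assumes a /Firefox switch)."""
    result = subprocess.run(
        [str(SETUP_EXTENSIONS), "/Firefox"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    result.check_returncode()  # raise CalledProcessError if the installer reported an error

if __name__ == "__main__":
    reinstall_firefox_extension()
```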

Thank you, it’s working again!
To prevent this situation in production, how can I tell whether the extension is disabled before running the robot?

EDIT: it has stopped working again :confused:
When a new Firefox page of Chorus Forms is opened, the robot once again can’t recognize the fields.
I can’t reinstall the add-on every time I want the robot to read fields on the website; that won’t work in production.
Is there any other option?
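In the meantime, the only workaround I can think of is to check before each run whether Firefox still reports the add-on as enabled. A rough sketch of that check is below; the profile location, the extensions.json keys, and the add-on name I match on are all assumptions, so treat it as a starting point only.

```python
import json
from pathlib import Path

# Assumed Firefox profile location on Windows and assumed add-on display name.
PROFILES_DIR = Path.home() / "AppData" / "Roaming" / "Mozilla" / "Firefox" / "Profiles"
ADDON_NAME = "UiPath Web Automation"  # hypothetical match on the add-on's display name

def uipath_addon_enabled() -> bool:
    """Return True if any Firefox profile lists the UiPath add-on as active."""
    for extensions_file in PROFILES_DIR.glob("*/extensions.json"):
        data = json.loads(extensions_file.read_text(encoding="utf-8"))
        for addon in data.get("addons", []):
            name = addon.get("defaultLocale", {}).get("name", "")
            if ADDON_NAME.lower() in name.lower() and addon.get("active", False):
                return True
    return False

if __name__ == "__main__":
    print("UiPath add-on enabled:", uipath_addon_enabled())
```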