I have developed a robot that enters the website of a public organization, the entity that supervises the state's tax payments. My problem is the following:
When I enter the page manually and upload a .txt file, the bills are paid correctly. However, when I perform the same steps with the robot, I get an error on the screen.
I have run many tests with the user: when I execute the process with the robot, the web application crashes, but when we ask the user to do the same process manually, it works.
I also tried running in incognito mode, but that didn't work either.
Any ideas about what to do in these cases? Is there any way the government site could detect that it is a robot rather than a person and block it for that reason? Can that happen?
Could you give us more information about the website?
If the website needs credentials to log in, could you maybe just share screenshots of the site crashing and the activities that you are using to upload the file?
Sometimes (although rarely) your bot interacts differently with an application than a human would. This is not apparent on the surface, and it’s probably due to the website being built with legacy technology.
Can you try a few variations of the optional input properties, such as ‘Simulate Click’, ‘Send Window Messages’, ‘Activate’ and ‘Click Before Typing’?
Sometimes that yields different results, and often you will find one combination that works consistently.
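To give a concrete idea of why the input method can matter: depending on those properties, the robot's clicks and keystrokes may be delivered as DOM events dispatched from script rather than as real hardware input, and a page can tell the difference through the standard `Event.isTrusted` flag. The sketch below is purely illustrative (the `#upload` selector and the behaviour are assumptions, not something we know about this particular site), but it shows how a legacy application could end up failing only when the robot runs:

```typescript
// Hypothetical page-side check on a legacy web app.
// Event.isTrusted is false for events dispatched from script, which is
// how some simulated input methods deliver clicks and keystrokes.
const uploadButton = document.querySelector<HTMLButtonElement>("#upload");

uploadButton?.addEventListener("click", (event: MouseEvent) => {
  if (!event.isTrusted) {
    // A scripted/synthetic click: the app might ignore it or raise an
    // error, which would look like a "crash" only during robot runs.
    console.warn("Untrusted click ignored");
    return;
  }
  // Normal handling for a genuine human click would continue here.
});
```

If something like this is going on, switching between ‘Simulate Click’, ‘Send Window Messages’ and the hardware default changes how the events reach the page, which is exactly why trying the different combinations is worth it.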