I am scraping a website for 200-300 records daily. The process involves entering a code, searching for that code, and capturing the corresponding data.
The website has now implemented a function that seems to detect "robotic behaviour", and it simply shows an error message. If I do the search manually, it works fine.
Is there a good way to add activity/code that delays the different steps of the process by a second or two, mimicking more human behaviour?
You'll be continuously fighting the website over what it considers robotic.
Things like Captcha don't just rely on the actions you take on a particular site, but also on your overall browser usage and more.
Things that can help include turning off instant mouse movement, using hardware events rather than simulated ones, and making actions more verbose (less efficient movement). Ultimately it's about figuring out what you're fighting - there isn't really a standard outside of bot-blocking services like Captcha.
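For the specific ask about delaying each step, here is a minimal sketch of randomized pauses and per-character typing. The question doesn't say which language or tool is being used, so Python with Selenium is an assumption, and the URL, element IDs, codes, and timing ranges below are all hypothetical placeholders:

```python
# A minimal sketch, assuming Python + Selenium (not stated in the question).
# All URLs, element IDs, codes, and timing ranges are hypothetical.
import random
import time

from selenium import webdriver
from selenium.webdriver.common.by import By


def human_pause(low=1.0, high=3.0):
    """Sleep for a random interval so steps aren't evenly spaced."""
    time.sleep(random.uniform(low, high))


def type_like_human(element, text):
    """Send keys one character at a time with small random gaps."""
    for ch in text:
        element.send_keys(ch)
        time.sleep(random.uniform(0.05, 0.25))


driver = webdriver.Chrome()
driver.get("https://example.com/search")  # hypothetical target page

for code in ["A123", "B456"]:  # hypothetical record codes
    box = driver.find_element(By.ID, "code-input")  # hypothetical element id
    box.clear()
    human_pause()
    type_like_human(box, code)
    human_pause()
    driver.find_element(By.ID, "search-button").click()  # hypothetical id
    human_pause(2.0, 5.0)  # longer pause while "reading" the results
    # ... extract the data for this code here ...

driver.quit()
```

Randomized intervals look more human than a fixed one-second sleep, but as noted above this may still not be enough if the site fingerprints the browser or automation tooling itself rather than just the timing of actions.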