Random delay in scraping loop

Hi,

I am scraping a website for 200-300 records daily. The process involves entering a code, searching for that code, and collecting the corresponding data.

The website has now implemented a function that seems to detect “robotic behaviour”, and it simply shows an error message. If I do the search manually, it works fine.

Is there a good way to add an activity or code that delays the different steps of the process for a second or two, mimicking more human behaviour?

You’ll be continuously fighting the website over what it considers robotic.

Services like CAPTCHA don’t just rely on the actions you take on a particular site, but also on your overall browser usage, etc.

Things that help include turning off instant mouse movement, using hardware events rather than simulated ones, and more verbose actions (less efficient movement). Ultimately it’s about figuring out what you’re fighting; there isn’t really a standard outside of bot-blocking services like CAPTCHA.

@wwls
For the Click activity, set the cursor motion type to Smooth and DelayBefore to:
New Random().Next(1000, 3000)

Do the same for the Type Into activity (its DelayBefore property).

Make sure the "Simulate click" / "Simulate type" property is unchecked.

In your scraping loop you can also add a Delay activity and set its Duration to:
TimeSpan.FromMilliseconds(New Random().Next(1000, 3000))
(roughly 1-3 seconds)
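
If it helps to see the whole idea in one place, here is a minimal, self-contained VB.NET sketch of a loop that pauses a random 1-3 seconds before each step. It is not a UiPath workflow; the HumanPause helper and the record codes are placeholders for illustration, and a single shared Random instance is reused so consecutive delays are not drawn from freshly seeded generators.

```vb
' Minimal sketch of randomized delays in a scraping loop (plain VB.NET, not a UiPath workflow).
Imports System
Imports System.Threading

Module RandomDelayDemo
    ' One shared Random instance; creating a new one per iteration can repeat values
    ' on older .NET Framework versions because it is seeded from the clock.
    Private ReadOnly Rng As New Random()

    ' Pause for a random 1-3 seconds, mirroring the DelayBefore / Delay activity idea.
    ' Note that Next's upper bound is exclusive, so Next(1000, 3000) yields 1000-2999 ms.
    Sub HumanPause()
        Thread.Sleep(TimeSpan.FromMilliseconds(Rng.Next(1000, 3000)))
    End Sub

    Sub Main()
        For Each code As String In New String() {"A100", "A101", "A102"} ' placeholder record codes
            HumanPause()                                                 ' random pause before each search
            Console.WriteLine("Searching for " & code)
        Next
    End Sub
End Module
```

In the workflow itself the equivalent is simply the DelayBefore expressions above plus a Delay activity inside the loop; the sketch just shows that the same Random/TimeSpan calls drive both.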
