I have a requirement to process all the records present in a web application. I would like your advice on the approach of using High Density Bots (HDB), since the number of records that need to be processed is huge.
I have the doubts below. It would be great if anyone could throw some light or guidance on this:
1) There is only a single user for the application, and all the automation steps need to be done within the web application. Can HDB interact with the web application and process the records without duplication?
2) There are no unique identifiers for the records, and the selectors are not stable. Records are removed from the application list once processed, so the selectors also change multiple times during a run.
3) Since there are no unique identifiers/selectors, what else can be pushed to the queue so that the bot can identify and pick the correct record?
Appreciate any help/guidance on this. Thank you in advance.
1) There is only a single user for the application, and all the automation steps need to be done within the web application. Can HDB interact with the web application and process the records without duplication?
Not really. With only a single application user, multiple HDB sessions working the same live list risk picking the same record twice. What you can do instead is extract all the records first and then work through them in chunks; I think that will solve the density problem.
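As a minimal sketch of the "extract everything first, then work in chunks" idea, here is one way it could look. The record values and the chunk size are placeholders, not anything from your application:

```python
# Hypothetical sketch: pull all records up front, then split them into
# fixed-size chunks so each chunk can be loaded/worked independently.
def chunk(records, size):
    """Yield successive chunks of at most `size` records."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Placeholder stand-in for the records scraped from the web application.
records = [f"record-{i}" for i in range(10)]

chunks = list(chunk(records, 4))  # three chunks: sizes 4, 4 and 2
```

Each chunk could then be pushed to the queue as one batch, so no two sessions ever work from the same slice of the list.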
2) There are no unique identifiers for the records, and the selectors are not stable. Records are removed from the application list once processed, so the selectors also change multiple times during a run.
OK, here you have two things. First, a unique identifier can often be constructed by concatenating several data items from the record.
The second thing, the selectors, can usually be worked out by inspecting them closely. I don't think they can be harder than the website of a famous car service I recently dealt with; oh gosh, that was a pretty annoying website.
3) Since there are no unique identifiers/selectors, what else can be pushed to the queue so that the bot can identify and pick the correct record?
Like I said, I would like to see the data to check whether you can concatenate some of the fields to create a unique identifier.
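To illustrate the concatenation idea, here is a small sketch that builds a composite key from a few fields and hashes it so the queue key stays short and uniform. The field names (`name`, `invoice_date`, `amount`) are made-up examples, not fields from your actual data:

```python
import hashlib

def composite_key(record, fields, sep="|"):
    """Build a deterministic identifier by concatenating the chosen fields.

    Hashing the concatenated string keeps the key a fixed length even
    when the underlying field values are long.
    """
    raw = sep.join(str(record[f]) for f in fields)
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]

# Hypothetical record; in practice these would be values scraped from the row.
record = {"name": "ACME Ltd", "invoice_date": "2024-03-01", "amount": "199.00"}
key = composite_key(record, ["name", "invoice_date", "amount"])
```

The same record always produces the same key, so the bot can push the key (plus the raw fields) to the queue and later match the queue item back to the correct row, even after the selectors have shifted.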