Currently, when a downstream system goes through an upgrade that may have changed selectors, the only option is to run the automation, check for failures, and then debug them to find out whether the cause is a selector issue or something else. We could use UiPath Test Automation to rig up a test suite, but that adds another product to the portfolio.
What would really help is a proactive Selector Validation Station (SVS) that leverages UiExplorer to run through the selector validations in a project and then provide a report of which selectors are valid and which might have changed. This would drastically reduce selector debugging time, which is still the biggest source of unhandled execution exceptions.
So, is there a plan to come up with a similar solution/approach towards Selector Validation?
There are definitely different approaches to this problem; here is my take:
The selectors should be fixed at the process level, as different processes might be impacted differently. Applying the solution in a controlled setup will be a lot easier than applying it to all processes at once and then trying to fix issues. Having the option to push fixes to all processes or only to selected ones will help in the longer run.
At regular intervals, whenever there is an update to the interacting system; most likely every month, every quarter, or every six months.
RPA Support Personnel or RPA Developer
This is a tricky one. I would like to see something similar to UiExplorer, which tells you whether a selector is valid once the new selector is applied. I wonder if the station could also have an option to run the check against connected robot/dev machines.
Thank you for your response. Please see my responses below -
The SVS should live outside of Studio: the user selects a package that needs testing and submits it to the SVS for validation. Since the target persona might not be an RPA Developer, the SVS needs to be usable without Studio.
This definitely needs to be on a per-process basis, as selector requirements and behavior cannot be generalized across processes, even when two or more processes use the exact same applications.
This would be very specific to the enterprise operating environment and how often downstream sub-systems are updated. For example, Microsoft releases security patches, which are applied to Robot machines as part of MMW (Monthly Maintenance Windows). Ideally, as part of the CoE and ROC strategy, automations should go through UAT before security patches are applied to Robot machines. During this window, the SVS would play a crucial role in ensuring that all selectors continue to function after the patch and that no selector construct is being blocked by the patching. This is just one use case, and there can be many. So it could be as frequent as monthly, but once every quarter is definitely not out of the question, going by Application Release Train cycles.
The SVS should ideally produce a report with the following fields:
Xaml Name and Path
Invalid Selector Details
Exception Message including which part of the Selector is failing
This would primarily be used by the Ops Lead and Support Manager to produce a report and hand it over to the Change Management team if changes are required.
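To make the report idea concrete, here is a minimal sketch of how some of those fields could be produced statically, outside of Studio. This is not a UiPath API: it only scans `.xaml` workflow files for `Selector` attributes and flags ones that are not well-formed XML fragments, which is a small subset of what a real SVS (validating against live applications) would need to do. All names here are hypothetical.

```python
import html
import re
import xml.etree.ElementTree as ET
from pathlib import Path

# In UiPath .xaml files, selectors are stored as XML-escaped attribute values,
# e.g. Selector="&lt;wnd app='notepad.exe' /&gt;&lt;ctrl name='OK' /&gt;"
SELECTOR_RE = re.compile(r'Selector="([^"]+)"')

def lint_selectors(xaml_text, xaml_name):
    """Statically check every selector found in one workflow's text.

    Returns report rows: (xaml name/path, selector, status, detail).
    Whether a selector still matches a live UI element can only be
    verified against the running application, not here.
    """
    rows = []
    for match in SELECTOR_RE.finditer(xaml_text):
        selector = html.unescape(match.group(1))  # &lt;wnd /&gt; -> <wnd />
        try:
            # Wrap the fragment so multiple top-level tags parse as one doc.
            ET.fromstring(f"<root>{selector}</root>")
            rows.append((xaml_name, selector, "well-formed", ""))
        except ET.ParseError as exc:
            rows.append((xaml_name, selector, "MALFORMED", str(exc)))
    return rows

def lint_project(project_dir):
    """Walk a project folder and lint every .xaml file in it."""
    rows = []
    for path in Path(project_dir).rglob("*.xaml"):
        rows.extend(lint_selectors(path.read_text(encoding="utf-8"), str(path)))
    return rows
```

A sweep like this could run as a pre-check before the heavier live validation, so obviously broken selectors are reported without even attaching to a robot machine.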
The report from the SVS should be a valid Change Management artifact, which can be fed into the Test Case Suite and validated against the changes made to the actual process, so that the Change Management Team has a Change Validation Checklist to validate the changes against.
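As a rough illustration of that hand-off, the SVS findings could be serialized into a simple checklist file that the Change Management team ticks off. This is only a sketch under assumed field names (matching the report fields proposed above); the actual artifact format would depend on the CoE's tooling.

```python
import csv
import io

# Hypothetical SVS findings: (xaml name/path, invalid selector, exception message)
ROWS = [
    ("Main.xaml", "<wnd app='legacy.exe' /><ctrl name='Submit' />",
     "Attribute 'name' no longer matches any element"),
]

def write_checklist(rows, stream):
    """Serialize SVS findings as a CSV change-validation checklist.

    The trailing 'Validated' column starts as NO; a reviewer flips it to YES
    once the corresponding change has been tested.
    """
    writer = csv.writer(stream)
    writer.writerow(["XamlPath", "InvalidSelector", "ExceptionMessage", "Validated"])
    for xaml, selector, message in rows:
        writer.writerow([xaml, selector, message, "NO"])

buf = io.StringIO()
write_checklist(ROWS, buf)
```

Because each row maps one failing selector to one required change, the same file can double as the checklist the Change Management Team validates against.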