"Automate" QA/UAT documents with consistency!


As a business begins ramping up its governance model for proving “accuracy” and “completeness” as part of production deployment…
…it is important that one maintains one's sanity!

Has anyone thought about creating a package to automate the collection of this visual evidence and documentation, while a process is running?

There are many components to governance documentation, so this is specific to “accuracy” and “completeness” for UAT (and QA too).

What ideas do you have to automate this, if any?

Should this be one workflow that triggers your process and in parallel can somehow detect when to provide the visual evidence, maybe through various arguments?

Or, should this be many small components which you place in key locations in your process… maybe this is a more tedious approach?

I’m thinking the more challenging part is collecting the visuals and proof. Then, once everything is collected, just generate the documents to comply with your governance model.
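To make the "one parallel workflow" idea a bit more concrete, here is a minimal sketch (plain Python for illustration; all names are hypothetical, and the actual screenshot call is stubbed out): the process under test posts capture requests to a queue, and a background collector thread records the evidence as the requests arrive.

```python
import queue
import threading
from datetime import datetime

class EvidenceCollector:
    """Background collector: the main process posts capture requests,
    and a worker thread records them as they arrive (screenshot stubbed)."""

    def __init__(self):
        self._requests = queue.Queue()
        self.evidence = []  # collected (label, timestamp) records
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def request_capture(self, label):
        # Called from key locations in the process under test.
        self._requests.put(label)

    def _run(self):
        while True:
            label = self._requests.get()
            if label is None:  # sentinel: stop collecting
                break
            # A real implementation would take a screenshot here.
            self.evidence.append((label, datetime.now().isoformat()))

    def close(self):
        self._requests.put(None)
        self._worker.join()

collector = EvidenceCollector()
collector.request_capture("login-screen")
collector.request_capture("transaction-submitted")
collector.close()
print([label for label, _ in collector.evidence])
# → ['login-screen', 'transaction-submitted']
```

The queue is what makes the "various arguments" part workable: the process only has to say *when* and *what*, and the parallel side handles the how.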

Other features:
— template or config of governance model to provide versatility
— store transactions which were tested
— get Robot name, Env name, and timestamps of job runs (for QA)
— trigger on additional robots which have not been tested on yet (for QA)
— show results of job runs either by screenshot or from logs somehow (for QA)
— get manual proof of the same transactions (can the robot screenshot itself, or does it have to be manual?)
— get results data
— organize manual proof with automated results so a comparison can be made easily
— generate email for approval from business user
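For the template/config item above, one possible shape (a hypothetical structure, sketched in Python for illustration) is a governance config that lists the evidence each section requires, plus a check for what is still missing:

```python
# Hypothetical governance config: which evidence each section requires.
GOVERNANCE_TEMPLATE = {
    "accuracy": ["manual_proof", "automated_result", "comparison"],
    "completeness": ["transaction_list", "job_log", "screenshot"],
}

def missing_evidence(template, collected):
    """Return the evidence items each section still lacks."""
    return {
        section: [item for item in required if item not in collected]
        for section, required in template.items()
    }

collected = {"manual_proof", "automated_result", "transaction_list", "job_log"}
print(missing_evidence(GOVERNANCE_TEMPLATE, collected))
# → {'accuracy': ['comparison'], 'completeness': ['screenshot']}
```

Keeping the model in a config like this means a different governance model is just a different template, not a different workflow.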

Testing Logs are something that needs constant attention when patches or process changes are made, and this can get repetitive. It’s also human nature to take the path of least resistance, so accuracy and compliance are not sustained as well as they should be.

So, please share any other features that could be useful for this idea… or if something like this has been created which could be referenced.

Thank you for taking the time to read and discuss. You know, at some point, we need to get some savings for ourselves, the developers and governance engineers…




I think having someone detect when to take the evidence while the process executes would work better, since we can enable that person to trigger a screenshot the moment he/she wants to save it: a key-press trigger followed by a Take Screenshot and save. That would help in these situations and is easier compared with a small component along the process.
Because at times, especially in UAT, we may face issues that we are not yet aware of.
So if any such issue appears, we won’t be able to capture it as evidence if the evidence collection is a small component developed and added along the sequence (not sure, because I haven’t tried taking screenshots via an automation process in parallel at different spots while the process is executing).

And once that evidence is saved, we would be able to attach it to a template if needed.
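Saving that evidence consistently could look something like this minimal sketch (Python for illustration; the screenshot bytes are stubbed, and the naming scheme is just an assumption): each capture gets a timestamped, sortable filename so it can later be attached to a template in order.

```python
import os
import tempfile
from datetime import datetime

def save_evidence(folder, label, data):
    """Save one piece of evidence with a timestamped, sortable filename."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S_%f")
    path = os.path.join(folder, f"{stamp}_{label}.png")
    with open(path, "wb") as f:
        f.write(data)  # real use: the captured screenshot bytes
    return path

folder = tempfile.mkdtemp()
p1 = save_evidence(folder, "unexpected_popup", b"\x89PNG-stub")
p2 = save_evidence(folder, "final_result", b"\x89PNG-stub")
print(sorted(os.listdir(folder)))
```

Because the filenames sort chronologically, dropping them into a document in order needs no extra bookkeeping.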

Cheers @ClaytonM


This is a really interesting topic!!

Regarding screenshots, what I did a few times was record the process using the Optage - Desktop Video Recorder Activities Go! component


I changed the category, although I felt it was somewhere in between Developer issues and Discovery :smiley:


I wasn’t sure too, that’s why I updated my reply a few times :sweat_smile:


The video solution is pretty cool, and I suppose I could always just record with a third-party tool like OBS. But video seems more difficult to use for proving “accuracy” and “completeness,” because you need to watch a long video of the process rather than just quickly looking at the manual results and automated results. Plus, the file size could be larger. But I think the main problem is you can’t easily compare results and confirm whether it was successful or not. It’s definitely something that would work well for creating demos. Or am I thinking about this wrong?


You’re right, but the video allows you to pause and take any screenshot you want without needing the process to be running live. It doesn’t help much, but it’s less tedious than debugging the process just to take screenshots.

It would be nice to have something like JUnit in Java


Hello @ClaytonM

Another interesting topic from you :slight_smile: I would also like to share some experience on this…

A few weeks back, I was consulting someone here in the forum on one of their projects, which really had a lot of logging. The process included a fair amount of error handling, and a considerable number of logs were also generated due to their standards and requirements. In the workflows, I noticed something interesting… They had included a set of tasks at particular key locations to extract logs and screenshots, combined with different conditions they wanted to check… business rules or logic rules, for example… All these screenshots and logs were then saved in a process execution document, which is an Excel file.

So, basically, if you look at that Excel file, it has all the information about the process that was executed, along with the screenshots… Very detailed and very nicely done. A person can go through it and understand what is really happening in the process. The only concern I had was whether this is additional overhead for the process, because it has a lot of other things to do while working on the key process it is supposed to work on. I was thinking about the efficiency and the execution period… However, if such a level of logs can be generated and documented like you mentioned, it would be really helpful for a person to understand. It would also save a lot of time for QA, as well as for a developer, because they don’t need to go through this burden manually…

Considering the workflow, they had included those tasks in the same process workflow. However, I really believe that if we can introduce a standardized workflow template to handle such logging, it could be invoked in key locations similar to how this person had done it. This would make the workflow more readable while getting the requirement done as well…
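A standardized logging template like that could be sketched roughly as follows (Python and CSV here purely for illustration, in place of the Excel file; all names are hypothetical): each invocation appends one row with a timestamp, step, status, and an optional screenshot path to the process execution document.

```python
import csv
import io
from datetime import datetime

def log_step(writer, step, status, detail="", screenshot_path=""):
    """Append one row to the process execution document."""
    writer.writerow([datetime.now().isoformat(), step, status,
                     detail, screenshot_path])

# In a real run this would be a file on disk; a buffer keeps the sketch
# self-contained.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["timestamp", "step", "status", "detail", "screenshot"])
log_step(writer, "Open application", "Success")
log_step(writer, "Submit transaction", "Success", "TX-1001",
         "evidence/tx1001.png")
rows = buffer.getvalue().splitlines()
print(len(rows))  # → 3 (header plus two logged steps)
```

Since the template only needs a writer and a handful of arguments, it can be invoked from key locations without cluttering the main workflow.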

What do you guys think about this…


Yep. There’s also such a thing as “too much logging”, which he should be careful of. Like, we don’t need to know that it navigated to each and every little thing the process does. But it is important to know in the Logs, for example, that the application was successfully loaded, and if not, why it failed to load and whether it is the type of exception that should be retried or should end the process due to an invalid password. Devs will suck at this starting out, but hopefully improve over time. The logging should be provided by the components (or Libraries, if you have advanced to that point) which perform each task of the process, and should not really be done by the individual project. In the end, you would be able to see that the task was successful, with any details that are helpful alongside it. You should “not” see a full page of Log Messages from a single task, because it fills up your database and most of the information would be useless.
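The "milestones visible, navigation details filtered out" idea maps directly onto standard log levels. A minimal sketch with Python's built-in logging module (names hypothetical; the stream stands in for the log database):

```python
import io
import logging

# Milestone events go at INFO; step-by-step navigation at DEBUG, so it
# never reaches the log store unless someone turns verbosity up.
stream = io.StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger("process")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.debug("Clicked 'Next' button")           # noise: filtered out
logger.info("Application loaded successfully")   # milestone: kept
logger.error("Invalid password - ending process, no retry")

print(stream.getvalue().splitlines())
# → ['Application loaded successfully', 'Invalid password - ending process, no retry']
```

The same code can emit the detail; what lands in the database is purely a level setting, so "too much logging" becomes a configuration problem rather than a rewrite.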

However, doing this well doesn’t exactly help prove to the business that it is completing the items successfully. They need to see the visual of the actual results. This is because the Log Message is whatever the developer told it to output, and doesn’t show that the preceding actions did what they were intended to do.

On the other hand, you do need to capture these actions in text, so you can see what the screenshot represents.

Speaking of components and Libraries, one challenge might be: how do you interact with a component which you may not have developer access to? Because (in ideal situations) it would be installed from a Library feed. You almost have to do something in parallel and feed it the information it needs to know when to screenshot. Unless you take the more manual approach, which would involve pressing a keystroke quickly during the process (which would be difficult, I think), or using a recording to snip the images.
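One possible way to "feed it the information it needs" without developer access to the component is a simple file-based handshake, sketched here in plain Python (all names and the JSON format are assumptions, not an existing API): the process side drops a small request file, and a parallel watcher consumes it.

```python
import json
import os
import tempfile

# Hypothetical handshake: the process (whose components you cannot edit)
# drops a small JSON "capture request" file; a parallel watcher picks it
# up and takes the screenshot, so no component needs modifying.
TRIGGER = os.path.join(tempfile.mkdtemp(), "capture_request.json")

def request_capture(label):
    """Process side: signal that evidence should be captured now."""
    with open(TRIGGER, "w") as f:
        json.dump({"label": label}, f)

def poll_for_request():
    """Watcher side: consume a pending request, or None if there is none."""
    if not os.path.exists(TRIGGER):
        return None
    with open(TRIGGER) as f:
        request = json.load(f)
    os.remove(TRIGGER)  # consume the request so it fires only once
    return request["label"]

request_capture("approval-screen")
print(poll_for_request())  # → approval-screen
print(poll_for_request())  # → None
```

A packaged component can usually still write a file, so this sidesteps the feed-access problem, at the cost of the watcher having to poll.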

Still brainstorming, or at least having fun thinking about ideas. Thanks guys.


Using the same. We have it integrated into our process framework.
