How does OffsetY work in the Find Relative Element Activity?

Hello Community,

I’ve been working with the Find Relative Element activity, trying to figure out how the OffsetY attribute works. I have tried different combinations of positive and negative values along with the Position setting (Center, TopLeft, etc.), but none of them work.

Below is a screenshot of the dialog I’m working with.

  • I have the element at position 1 (Update Firmware), relative to which I’m trying to get the value of the label at position 2
  • The value that needs to be extracted is “Monoprice Mini V1”

I pulled a clipping of the operating area; its size at the current screen resolution is 504px x 136px.

Problem Statement:

Based on the image size, an offset of 90px from the Center position of the parent element should allow the Robot to identify the label above it.
Assuming this size is correct, what direction should the OffsetY value have?
Is it positive 90 or negative 90?

The documentation doesn’t specify the +/- direction of the Offset values. Based on my understanding of the example application, a positive OffsetX points to the right. But it says nothing about relative elements above, below, or to the left of the parent element.

What am I missing?

RESOLVED! My reliance on the pixel measurements in the image was misplaced. With some experimentation, I was able to solve this. Posting the solution here so that the community may find it helpful.

  1. OffsetY is negative for relative elements above the parent
  2. If Position=Center for the parent element, values of -30px to -40px work reliably
  3. If Position=TopRight or TopLeft for the parent element, values of -15px to -30px work just as reliably

I could use the pixel range to parameterize the Find Relative Element activity so that I don’t have to change the code if the screen resolution changes in the future.
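The direction rule above follows from the usual screen coordinate convention. As a minimal sketch (the function name `resolve_offset` is illustrative, not a UiPath API): on Windows the origin is at the top-left of the screen, x grows to the right and y grows downward, so a point above the anchor needs a negative OffsetY.

```python
# Sketch of how the offsets resolve, assuming the standard top-left screen
# origin: x increases rightward, y increases DOWNWARD.
def resolve_offset(anchor_x, anchor_y, offset_x, offset_y):
    """Return the point probed relative to the anchor position."""
    return (anchor_x + offset_x, anchor_y + offset_y)

# Anchor = Center of the "Update Firmware" element, say at (250, 100).
# A label ABOVE the anchor needs a NEGATIVE OffsetY:
above = resolve_offset(250, 100, 0, -35)  # (250, 65): above the anchor
below = resolve_offset(250, 100, 0, 35)   # (250, 135): below the anchor
print(above, below)
```

This is why -30px to -40px (and not +90) found the label sitting above the parent element.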

Output of the Sequence: I have 4 other printers configured, and this solution works regardless of which printer is currently active.


Hi @AndyMenon - Since I haven’t used the Offset functionality here, I’m bringing my publishing experience to the board. When we develop code for an address position (so that it sits perfectly in the envelope window), our vendor gives us offsets from the left and from the top, as shown below. With those, we know where to place our text relative to the top and the left edges.

Sorry if this did not offer any assistance to you…

Hi @prasath17,

Thanks for posting this. It definitely reinforces solutions to the problems we are trying to resolve. If I don’t find it helpful, someone else will. :slight_smile:

The issue here is not the actual numeric value of OffsetY itself, but its direction. The upward direction is intuitive for us humans, but the Robot expects a negative value for it.

Here is a snippet of the documentation for the Find Relative Element activity. It doesn’t state anything about the +/- direction of the offsets. It would be nice if the documentation were updated to cover the +/- directions as well. I have left feedback on the documentation page linked to the image below.


Cheers! :+1:


This automation has proven to be a pleasant challenge to solve! Sharing purely from the point of view of learning!

If you share my interest in 3D printing, I recommend trying to automate the Cura slicer software. For the uninitiated, a slicer converts 3D model files into numeric coordinates for a 3D printer to consume and print as a physical model. I’m slicing a large number of models that are printed across 4 of my 3D printers. Slicing the models manually has become really painful, so I decided to have my Robot do it.

I was pleasantly surprised at the number of challenges this simple interface threw in my path. It kind of compels you to explore and use different techniques to arrive at the solutions! :slight_smile:

From a value engineering perspective, this is a great exercise in automation!

The interface has UI elements whose names are duplicated, like the “Prepare” elements in the interface below. UiPath Explorer cannot tell the difference between the Prepare tab on the top-left and the Prepare button on the bottom-right unless it uses the idx value! :confused:

Added to that, the name of a UiElement is its value, which doesn’t give us much to work with even in UiPath Explorer: if the values keep changing, the control names keep changing as well.
Similarly, there is no surefire way to confirm that a model is loaded into the interface unless the model dimensions are visible on the bottom right. Again, because these controls don’t have fixed names, techniques like Enhanced RegEx selectors come to mind to get at the model dimensions and confirm that the model has indeed been loaded! :slight_smile:
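The dimension-matching idea can be sketched outside UiPath too. The sample string below is an assumption about how Cura renders the label (model name followed by W x D x H in mm); the point is that a regex can recover the dimensions from scraped text even when the control’s name changes along with its value.

```python
import re

# Illustrative scraped text; the exact label format Cura renders is an
# assumption here, not taken from the Cura documentation.
scraped = "BenchyBoat 60.0 x 31.0 x 48.0 mm"

# Match three numbers separated by 'x', followed by the mm unit.
m = re.search(r"([\d.]+)\s*x\s*([\d.]+)\s*x\s*([\d.]+)\s*mm", scraped)
if m:
    width, depth, height = (float(g) for g in m.groups())
    print(width, depth, height)  # dimensions found => a model is loaded
else:
    print("No dimensions found - model probably not loaded yet")
```

If the regex matches, the model is loaded; if not, the Robot can wait or retry before moving on.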

The UI has duplicated hotkey combinations (which I guess is a bug), which makes it difficult to open child interfaces such as the Print dialog. Check this out below - ALT+s followed by p takes me to Profiles, never to Printer :smiley: !

Which brings us to the Printer Settings Dialog below. None of the UI Elements in the target Black Box have a fixed control name.

  • A couple of solutions come to mind, such as targeted screen scraping of the boxed area followed by Regular Expressions to extract the required information :thinking:

  • And there is Find Relative Element activity that helped me solve this problem :+1:

The Cura interface has multiple child windows. Extracting Machine settings for the currently active printer is a good exercise for using Anchor Base Activity.

And why do I need this?

  1. The Robot compares the machine settings with the model dimensions (from above); if the latter don’t fit on the machine, it concludes that the model is too large for the current printer!
  2. The Robot can then switch to another printer until it finds one suitable for the model
  3. This code becomes universal regardless of how many printers I add to my slicer! :slight_smile:
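The selection logic in the steps above can be sketched like this. The printer names and build volumes are made-up illustrative values, not real machine settings, and the per-axis fit check is a simplification (it ignores rotating the model on the bed).

```python
# Hypothetical printer build volumes in mm (W, D, H) - illustrative only.
PRINTERS = {
    "Monoprice Mini V1": (120, 120, 120),
    "Printer B": (220, 220, 250),
}

def fits(model_dims, build_volume):
    # Per-axis check: every model dimension must be within the build volume.
    return all(m <= b for m, b in zip(model_dims, build_volume))

def pick_printer(model_dims):
    """Return the first configured printer whose bed can hold the model."""
    for name, volume in PRINTERS.items():
        if fits(model_dims, volume):
            return name
    return None  # model is too large for every configured printer

print(pick_printer((60, 31, 48)))
```

Adding a printer is then just adding one entry to the table, which is what makes the workflow universal across machines.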

Finally, the Close Window selectors for the first and second child windows appear to validate during development. This is something of a tease, because they don’t work when the automation actually runs. Using UiPath Explorer helps resolve this issue.

I’m happy that this opportunity came my way. It’s a welcome change to automate something quirky once in a while!

:slight_smile: :+1:


And to wrap this up, here is a short video on how this solution works.