Studio to Orchestrator supporting templates and artifacts

Hello all,

I will try my best to ask this question, I may very well be overthinking this, but bear with me here…

To set the stage, let's say I have a local development machine, one server hosting Orchestrator, and three unattended bots on VMs.

When I create a new project in Studio, a REFramework project for example, there are multiple folders and “extras” created at the original local project path, such as a “Documentation” and a “Data” folder, as well as the Config.xlsx file.

When my project is ready to publish as a package to Orchestrator for the unattended bots, what happens to these extra folders and the config file as they exist on my local machine? Are snapshot copies of these folders bundled into the published package and pushed to Orchestrator? Or is the original local project path queried with each execution of the automation?

In other words, do my unattended bots need drive access to the original local path in which the project was developed and published to Orchestrator to access something like the config file?

Thank you!

Hey @sean.finn

Short answer: No, your unattended robots will NOT be looking for your local extra files/folders (which would of course fail if they did).

But also, you’re not overthinking it! It is a totally valid question and the good news is that the packaging mechanism takes care of it.
Here is a sample library package under a user’s ...\.nuget\packages folder:
[screenshot: library package contents]

Here is one of my deployed processes under the same packages folder:
[screenshot: deployed process package versions]

The 1.0.1-alpha.1 through alpha.4 folders are four versions I deployed to Orchestrator.
Each one stores the “snapshot copy,” as you put it, which is downloaded to the unattended robot machine, including copies of the config files and all other extra files.

Two more things about this:

  1. This does not protect you from local files that you explicitly reference in your code.
    For example: if you access a locally installed target application or a file by its absolute path, it will most likely fail unless the same path is accessible on the robot machine. Something like C:\Users\sean\ may not exist on your robot machine (or at least your robot user will be unable to access your user folder due to permissions).
    That will cause an issue. Make sure to reference your files with relative paths, not absolute paths (e.g. “Data\Config.xlsx” works across all bots, whereas “C:\MyProcess\Data\Config.xlsx” will create problems for you).
  2. Inside the Data folder, you will also find Input, Output and Temp folders. If you delete the placeholder.txt file in those folders and publish to Orchestrator, it is possible that the folder will not be included in the package (since it is an empty folder).
    NOTE: If your process downloads something to the Data\Temp folder, for instance, it may be worthwhile to add a Path Exists check and create any missing folders.
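Both points can be sketched in plain Python (a language-agnostic illustration, not UiPath activity code; the folder names are the REFramework defaults, everything else is illustrative):

```python
import os

# The folder the package is expanded into becomes the working directory.
project_root = os.getcwd()

# Point 1: build paths relative to the project root, never via an
# absolute developer-machine path such as C:\Users\sean\...
config_path = os.path.join(project_root, "Data", "Config.xlsx")

# Point 2: recreate folders that may have been dropped from the package
# because they were empty at publish time (the "Path Exists" idea).
for folder in ("Input", "Output", "Temp"):
    os.makedirs(os.path.join(project_root, "Data", folder), exist_ok=True)
```

Because the folders are recreated on every run, it no longer matters whether the empty placeholders survived packaging.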

Thank you for this information! Very helpful! I had a feeling that there was something here with relative vs absolute paths, so thank you for the clarification!

One follow up question I do have:

Say I have a project that is up and running in a PROD instance. I assume that if I wanted to make a change to the config file (or any other file, for that matter), I would then need to re-publish the updates from Studio back to Orchestrator. Is that correct?

Thank you for your expertise!

Yes, if you are including those files as part of the NuGet package.
We often keep a template of the configuration files in the project, but the real configuration files are stored in a shared location accessible by the Robots/Machines that need them. They are then referenced through input arguments, which can be given default values in the project and, if needed, overridden at the Process or Trigger/Job level.

This allows developers to default to their own little sandbox, and as we promote a process/project through environments we do not have to build multiple releases for simple configuration changes.
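The layered-defaults idea can be sketched like this (plain Python rather than UiPath arguments; the argument name and paths are hypothetical):

```python
from typing import Optional

# Sketch of layered configuration: the project ships a sandbox default,
# and the Process/Trigger/Job level may supply an override.
PROJECT_DEFAULT = r"Data\Config.xlsx"  # developer's sandbox default

def resolve_config_path(job_override: Optional[str]) -> str:
    """Use the value set at the Process/Trigger/Job level if present,
    otherwise fall back to the default baked into the project."""
    return job_override if job_override else PROJECT_DEFAULT
```

Promoting the process through environments then only means changing the override value, not rebuilding and re-publishing the package.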

@codemonkee That’s a very effective way to manage projects without multiple version changes on Orchestrator. Thanks for sharing!

@sean.finn
@codemonkee’s answer covers everything quite well.
As for managing the config files, I’ve recently learned another way to ensure minimal code changes for config related changes.
Using Storage Buckets for storing process-related config files.

You can upload a config file (you could rename the config files specific to process to remove ambiguity - e.g. for a dispatcher-performer pair of a process, Config_Performer.xlsx and Config_Dispatcher.xlsx files can be stored to the same storage bucket)

You can use the Download Storage File activity to retrieve these files.
You will need to change the config loading logic in your workflow (e.g. InitAllSettings.xaml in REFramework) so that instead of using “Data\Config.xlsx” it downloads the storage file to, say, the Data\Input folder and uses that copy instead.
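The reworked loading step might look like this in outline (a Python sketch, not workflow code; `download_storage_file` stands in for UiPath's Download Storage File activity and simply copies from a local folder acting as the bucket so the example is runnable):

```python
import os
import shutil

def download_storage_file(bucket_dir: str, name: str, dest: str) -> str:
    """Stand-in for the Download Storage File activity: here the
    "bucket" is just a local folder we copy from (an assumption made
    so the example is self-contained)."""
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.copyfile(os.path.join(bucket_dir, name), dest)
    return dest

def load_config(bucket_dir: str, name: str = "Config_Performer.xlsx") -> str:
    # Instead of reading the packaged Data\Config.xlsx directly, fetch
    # the bucket copy into Data\Input and use that file.
    return download_storage_file(bucket_dir, name,
                                 os.path.join("Data", "Input", name))
```

The package then only carries the loading logic; the config content lives in the bucket and can change without a new release.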

In short, there are multiple ways to achieve efficient config management, do what works for you! :slight_smile:

Yeah, Storage Buckets are good, as they give you a standard interface for the files, and the underlying storage technology can be changed without having to worry about how the process accesses the files.

We actually don’t leverage Storage Buckets, etc. We are still using an NFS, which allows us to manipulate the files in place, or, if they are too large to process over the network in a timely fashion, to copy them locally and push them back.

The main trade-off between Buckets and native file storage is that if you want to work with the files from third-party services, those services need to query Orchestrator for the metadata in order to fetch and process the file. But in the end this is no different from other storage buckets like AWS S3.

Another reason we use an NFS is for the exception screenshots and other data files: if you leave the default file location in REFramework, these are stored in the expanded NuGet destination and, if not kept in check, can fill up your Robot’s local disk space depending on how you have the Robot and NuGet configured, which requires additional maintenance and monitoring.
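One way to keep that in check is a periodic cleanup job. A minimal sketch (the retention period is an assumption you would tune, and the folder path would be whatever your process writes screenshots to):

```python
import os
import time

def purge_old_files(folder: str, max_age_days: int = 30) -> int:
    """Delete files older than max_age_days from a screenshots/output
    folder so they do not fill the robot's local disk.
    Returns the number of files removed."""
    if not os.path.isdir(folder):
        return 0
    cutoff = time.time() - max_age_days * 86400
    removed = 0
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed += 1
    return removed
```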

