Best practices for URL management


The application we are automating has different URLs for each environment.
What are the best practices for URL management across environments (Dev, QA, PRD)?
I was thinking of using the UiRobot.exe.config; what are your opinions?
As a note, my project only uses Robots; it does not have Orchestrator.


This is just my opinion, but I think it's best to manage your global parameters in a text file, such as a .config or .settings file. Also, if you use a direct path to that file, you don't need to republish the process (if you use Orchestrator) each time you make an adjustment (which may or may not be a good thing).

You could have the URLs for each environment set as parameters, plus a parameter for which environment to use.
Maybe like this (the URLs are just placeholders):

envir:=prdUrl

devUrl:=https://dev.example.com
qaUrl:=https://qa.example.com
prdUrl:=https://prd.example.com

And you can have other settings too. Your Robot would read in the "envir" parameter and use that to determine which URL to use.

For example,
Read Text File → config (String)
Assign envir = config.Split({"envir:="}, System.StringSplitOptions.None)(1).Split(CChar(vbLf))(0).Trim
Assign url = config.Split({envir + ":="}, System.StringSplitOptions.None)(1).Split(CChar(vbLf))(0).Trim

You can also do this differently using a json file but I’m not as familiar with its syntax.
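For reference, the same settings could look like this in JSON (key names and URLs here are just an example):

```json
{
    "envir": "PRD",
    "DEV": { "url": "https://dev.example.com" },
    "QA":  { "url": "https://qa.example.com" },
    "PRD": { "url": "https://prd.example.com" }
}
```

You could read the file with Read Text File, parse it with the Deserialize JSON activity (UiPath.WebAPI.Activities) into a JObject, and then get the active URL with something like jsonConfig(jsonConfig("envir").ToString)("url").ToString.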

In any case, I also think whatever practice you choose should be consistent across your organization; there's no single right answer here. You don't necessarily need to use a config file, but it does make things easier once you have tons of parameters to manage.


Just an opinion: why not use config .csv files for Dev, Pilot, and Prod from a shared location? Besides URLs, you could store other essentials that vary between environments. Since UiPath has a Read CSV activity, retrieving these values takes fewer lines of code than parsing a text file.

We have data-disclosure restrictions where the dev team is not given direct access to PROD data.
Thus, having one config file that holds all the environment values is not a possibility at all.
Most of the time, we use Assets.
This way the code in all environments uses the same Asset name, but the environments-management team controls what value the Asset holds in each environment.
The code stays intact, data is not disclosed, and URLs are easily managed.

Note: we have different Orchestrator instances for DEV, TEST and PROD.

If there is a case where a single Orchestrator instance is used across environments, then you can put a naming convention in place, like DEV_Assetname, TEST_Assetname, PROD_Assetname.
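As a rough sketch of that convention (the asset names below are hypothetical), the environment prefix can be built once and reused across Get Asset / Get Credential activities:

```
Assign envPrefix = in_Environment          ' e.g. "DEV", "TEST" or "PROD"

Get Asset
    AssetName: envPrefix + "_PortalUrl"
    Result:    portalUrl

Get Credential
    AssetName: envPrefix + "_AppLogin"
    Username:  appUser
    Password:  appPassword
```

That way, promoting a process to another environment only requires changing the in_Environment argument, not the workflow itself.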


True, but the User does not have an Orchestrator.

Also true in the case of one config file for all three environments, since the Prod and Pilot environments are isolated in most cases. Besides Orchestrator, which holds just the config URL, we use three different config files for the three environments, and Prod sharing is limited to a few people.

Currently we only keep the config location and credentials in Assets, with the remaining parameters in the config file. Not sure if it's a good practice, but I find creating multiple Assets in multiple Orchestrators tedious in 2016.2.

Missed that, my bad

True, Asset Management among multiple Orchestrator environments can be tedious.

Another method @agjj677 or his organization could use is to create a workflow that returns the URLs as arguments.

So he could use Invoke Workflow File and tell it whether it's Test or Prod, and it would return the correct URL for that process.
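That could look something like this (the file and argument names are illustrative): a small GetUrls.xaml with an in_Environment in-argument and an out_Url out-argument, invoked from the main process.

```
' GetUrls.xaml (arguments: in_Environment As String, out_Url As String)
Switch on in_Environment
    Case "TEST": Assign out_Url = "https://test.example.com"
    Case "PROD": Assign out_Url = "https://prod.example.com"

' Main.xaml
Invoke Workflow File "GetUrls.xaml"
    in_Environment: "PROD"
    out_Url:        url
```

The URL logic then lives in exactly one reusable file instead of being scattered across processes.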


Then would you put the Assets in "settings" in the config for the test and prod URLs, and have one Boolean asset in Orchestrator to switch between the two URLs (i.e., test vs. prod)?

Yeah, I guess you can do that. So basically, have both test and prod URLs in the config, then have another setting where you put "TEST" or "PROD". When you use that setting in your code, you can use the string as part of the key.

Let's say you have two values keyed by environment in your config, like this:

Job_Environment       PROD

SaveDir_TEST         \\server.domain\folderTest
SaveDir_PROD         \\server.domain\folderProd

Then, in your code you can use the dictionary like this:

config("SaveDir_" + config("Job_Environment").ToString).ToString

Essentially, combining the environment with the dictionary key string.

This is just an idea, though, and I realize my post in this topic is from like 1.5 years ago.


@ClaytonM How can we manage different test data per environment using this approach? Do we need to create the test data for different environments as Assets?

First, I would create a Boolean argument in your Main xaml called 'TestMode', so you can easily set it to True or False in Studio or Orchestrator (after it's deployed).

Then, use that Boolean in your code to set your input and output file-path locations, which are read in from your config xlsx. So, IF True, assign the test filepaths; else, assign the prod filepaths. You can also assign it in the dictionary.

IF activity condition: in_TestMode
    Then:
        Assign config("TestMode") = True
        Assign config("Input_FilePath") = config("Input_FilePath_Test").ToString
        Assign config("Output_FilePath") = config("Output_FilePath_Test").ToString
        Assign config("Portal_URL") = config("Portal_URL_Test").ToString
    Else:
        Assign config("TestMode") = False

(although, when using Library projects, the URL would likely be integrated into the Library package, and therefore a test argument would need to be implemented in the Library's own arguments)

You can also set other variables based on the Boolean argument if needed.

Finally, place copies of data if needed in the Test location.

You can also use Assets if it’s something shared between multiple projects.

This is just an idea, but hope it helps.

@ClaytonM Thanks for your response. I have more than two environments, in which case I think a Switch will work instead of If-Else, but after looking at different options, I think different folders for different environments will suit my needs best.

Yeah, a Switch would be a good option, and you can set the environment using an argument rather than the Boolean type I mentioned in my idea.


Assign config("Environment") = in_Environment

Switch activity condition: config("Environment").ToString

    Case "Env1":
        Assign config("Input_FilePath") = config("Input_FilePath_Env1").ToString
        Assign config("Output_FilePath") = config("Output_FilePath_Env1").ToString

    Case "Env2":
        Assign config("Input_FilePath") = config("Input_FilePath_Env2").ToString
        Assign config("Output_FilePath") = config("Output_FilePath_Env2").ToString

@ClaytonM Where do the Input_FilePath_Env1 and Output_FilePath_Env1 files exist? Do they live in Orchestrator queues or within Studio? In my case, I will be using the test data across different processes, so maintaining them in queues makes more sense, right?

In my example, I was storing them from the config xlsx in a dictionary so they can be easily passed to parts of the process. However, you are right, this isn't a good approach if they are used by many projects.

In that case, then using Assets would be an option to set the file locations for each Environment so they can be pulled in when a job starts.

If the locations are specifically related to the transactions being performed, then yes include it in the Queue item properties rather than Assets.

Or use a combination, where the Assets hold the directory and the Queue item holds the complete file path.

For example, you may have a check number or date that changes in the filename. So when you add the transaction, build the full path with Path.Combine(envDir, fileName).
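A minimal sketch of that split (directory and filename pattern here are made up): the environment-specific directory comes from an Asset, while the per-transaction filename is built and stored in the queue item.

```
Assign envDir   = saveDirAsset                                ' from Get Asset, e.g. "\\server.domain\folderProd"
Assign fileName = "Check_" + checkNumber + "_" + Now.ToString("yyyyMMdd") + ".pdf"
Assign fullPath = System.IO.Path.Combine(envDir, fileName)    ' goes into the queue item
```

Path.Combine handles the separator between directory and filename, so the Asset value works with or without a trailing backslash.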

But only use Assets if it doesn’t change between processes. Use the Queue item for things that change per process and transaction.


@ClaytonM Ok, I see what you are saying about setting the file-path variable and how to use Assets and Queues, but my doubt is more about where to put the data files. Do I need to include all these files as part of the project in Studio, or should I put them in Orchestrator Assets or Queues? I think it should be Queues, as that's what allows you to have a collection of data, and that data needs to be accessed across processes, right?

There are two ways to do it:

  1. The simple way is to include a link to the file in the queue item. The file should live in a shared network location that each environment and robot user can access.

  2. Store the actual data in the queue item. This requires it to be in JSON format, but there is a .NET solution to convert a DataTable to a JSON string and back.

From my experience, I have used both. In many projects I have completed, the queue item stores the data row in JSON format, containing the transaction data being processed. Then, when the queue item is pulled, the data is converted back to a DataRow, which can be appended to a results report. And a link to the output file will show up in the output section of the queue item.
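Assuming Newtonsoft.Json is available (it ships with the UiPath.WebAPI.Activities package), the round trip can be sketched like this; dtTransaction and jsonData are just illustrative names:

```
' DataTable -> JSON string, to store in the queue item
Assign jsonData = Newtonsoft.Json.JsonConvert.SerializeObject(dtTransaction)

' JSON string -> DataTable, after the queue item is retrieved
Assign dtTransaction = Newtonsoft.Json.JsonConvert.DeserializeObject(Of System.Data.DataTable)(jsonData)
```

Serializing a one-row DataTable per transaction keeps column names intact, so the restored row can be appended straight into a results report.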


This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.