We have a number of processes at work that use UiPath to automate various online tasks. Currently the bots running these processes are assigned to a series of VMs on the network, controlled by an Orchestrator server.
The processes have several elements in common for accessing resources and for assigning and getting activities. These have been extracted as workflows so they can be shared between several bots.
What I wanted to know was the accepted best practice for sharing code between several bots on different VMs. We could store all the common workflows on one of the VMs and have the bots do a remote call as needed. Another way would be to physically copy the common workflows into each project folder and call them locally.
What do you think? Are there other solutions?
I think when you have a dev machine with Studio and reuse workflows from a common folder like "shared workflows", you just write your process, and when you upload it to Orchestrator it packages everything into a .nupkg. When the robot is started, Orchestrator sends the package to the robot for execution. So you do not share workflows between robots; you share workflows during development.
What would happen if robots A, B, and C all reference a shared workflow in a common folder like you suggest… and I decide that the common workflow must be updated?
If I'm reading you correctly… this means I would need to open one of the robot projects (A, B, or C), modify the common workflow, and then open and re-publish all three robots, because each of them must bundle the new common workflow.
I was hoping to find a way that would allow me to simply change & publish the common workflow without having to re-publish robots A, B, and C.
You might want to experiment, but I am pretty sure that if the common folder is outside the nominated project folders for the bots, it doesn't actually get bundled into the Orchestrator package.
Instead it will get called live whenever it is invoked from one of the bots. This would mean you don't have to redeploy the bots when you make a change.
Don't take my word for it - set up an experiment and confirm the behavior. Even if it is accurate, that would open up its own can of worms: network connectivity, multiple bots accessing the file simultaneously, version control, etc.
If you use the full path to the shared workflows in your Invokes, your project will still point to that location whether you republish it or not. Publishing a project essentially just copies the project folder, so when you use a full path instead of a relative path, the Invoke still resolves to that network location. In other words, all you need to do is use the full path and place the .xaml files in a shared location on your network.
There are also disadvantages to this, though: when you update your shared workflow, it had better work with every project that uses it.
But even then, this is how we are doing it. We share the workflows and use the full path in the Invokes, so all you need to do is update one thing; republishing everything is pointless… unless the shared location changes, in which case you need to update the projects. EDIT: you can also use a full path to a settings file where you edit the file-path locations, to avoid this if you desire.
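As a rough sketch of what the two approaches look like inside a workflow's .xaml (the attribute and namespace names below are illustrative, and the server/file names are made up; in Studio you would simply type the path into the Invoke Workflow File activity):

```xml
<!-- Relative path: the file sits under the project folder,
     so it gets bundled into the published package -->
<ui:InvokeWorkflowFile DisplayName="Invoke LogIn (bundled)"
                       WorkflowFileName="SharedCode\LogIn.xaml" />

<!-- Full UNC path: resolved live at run time, so editing the
     shared copy takes effect without republishing any robot -->
<ui:InvokeWorkflowFile DisplayName="Invoke LogIn (shared)"
                       WorkflowFileName="\\fileserver\uipath\Shared\LogIn.xaml" />
```

The trade-off is exactly as described above: the full-path variant picks up changes instantly, but every process that invokes it is exposed to those changes at once.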
The hard part is getting your RPA team to develop with modularization in mind. Too many times you tell an associate that you have already developed a piece, only for them to make a copy of it rather than calling it with arguments or requesting updates so it works with their project.
There is also a built-in alternative, where you make an update and then each project that uses the component needs to pull the update and be republished. It's probably more stable that way, but not as efficient or flexible. I'm also not very familiar with it, mostly because it requires IT to set something up, and needing to republish tons of projects every time an update is needed… scares me.
Over time, I’m sure “best practice” on all this will evolve.
Thanks, that’s pretty cool.
I wasn't sure I could just drop a raw XAML workflow onto a file-share directory and have it be consumable by a published process. I was thinking maybe UiPath did some kind of special dance when publishing a process… like compiling it into a binary so it runs faster, and maybe the common workflows needed to be published and/or compiled too (like a class library or something).
Nice to know it’s that easy.
Sharing code in real time vs. sharing code in the package.
They both have pros and cons, but I have been working on getting the shared code into the package.
By having version control download the shared code outside of the projects, there is only one copy.
By using a Windows symbolic link, you can make the SharedCode look like a subdirectory in each project:
(from each App dir): mklink /D SharedCode ..\SharedCode
Then change workflows to reference the SharedCode in the link and not the physical directory so that the SharedCode gets included in the package.
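To make the layout concrete, here is a minimal sketch of that structure (shown with POSIX `ln -s` so it can be run anywhere; on Windows the equivalent is the `mklink /D` call above, and all directory and file names here are made up):

```shell
# One physical copy of the shared workflows, outside any project
mkdir -p repo/SharedCode repo/RobotA repo/RobotB
echo '<Activity/>' > repo/SharedCode/LogIn.xaml

# From each project directory, a symlink makes SharedCode look like
# a local subdirectory (Windows: mklink /D SharedCode ..\SharedCode)
ln -s ../SharedCode repo/RobotA/SharedCode
ln -s ../SharedCode repo/RobotB/SharedCode

# Each project now "sees" the shared workflow under its own root,
# so packaging the project folder includes it
cat repo/RobotA/SharedCode/LogIn.xaml
```

Because the link lives inside each project folder, publishing bundles the shared file into every package, while on disk there is still only one real copy to edit and version.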
This is next level stuff. Awesome.
I’m not certain I follow - I’m still new at most of this as I didn’t come from a coding background.
So you’re saying you have a file named shared.xaml that is being called by multiple other .xaml files. Rather than saving this in a shared directory such as //server/name/uipath/shared.xaml you are instead saving it to a version control system? Then a working copy is pulled down from the version control system somehow each time a project tries to call the shared.xaml?
No. The first question you have to answer is: do I have any shared code that needs to be versioned (e.g., Robot1 needs Shared.xaml v1.0 and Robot2 needs Shared.xaml v2.0)? If the answer is yes, Robot1 and Robot2 cannot both use the same Shared.xaml on a LAN drive. So you want to include the specific version of Shared.xaml with the robot when it is packaged. But only code under the project directory (where project.json is) will be included in the package. So you can physically copy the file there OR (as mentioned above) create a symbolic link that makes the file look like it is physically there when you make the package. Why use the symbolic link instead of copying the file? It has to do with ease of source code version control. But if you don't need that, then don't use a symbolic link.
I’ve had a crack at this shared Xaml pattern via
mklink /D Shared ..\..\..\Shared (in my case) and can confirm that the invoking (parent) workflows run as expected both in UiStudio and packaged to UiRobot. Nice!
Caveat, as we're using git/GitHub: the symlink gets converted to a plain file when the repo is git-cloned to a new folder, unless we add
core.symlinks = true to the git config.
And when we do that, Git for Windows doesn't restore it back to a proper
mklink /D directory link: UiStudio will run the parent workflow, but I can't publish it due to an exception:
System.Exception: Publishing the project has failed.
Error: Access to the path 'XXX' is denied. ---> System.UnauthorizedAccessException: Access to the path 'XXX' is denied.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy, Boolean useLongPath, Boolean checkHost)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share)
   at NuGet.PackageBuilder.WriteFiles(Package package)
   at NuGet.PackageBuilder.Save(Stream stream)
   at UiPath.Project.Deploy.ProjectPackageManager.Pack(WorkflowProject project)
   at UiPath.Project.Deploy.ProjectPackageManager.PackAndPublish(WorkflowProject project, Int32 timeoutMS)
   at UiPath.Project.ProjectManager.Publish(Int32 timeoutMS)
   --- End of inner exception stack trace ---
Have you or anyone seen this and found a workaround?
Found the workaround to the Git on Windows issue - nice and neat.
Recent versions of Git for Windows support handling of symbolic links; you just need to enable it during installation as described here:
With that enabled, Scott’s suggestion does work using git(4W) without config changes (see note):
- create the symlink via mklink to expose the shared Xaml folder in the UiStudio project root
- git commit/push the symlink to git/GitHub like all other files
- git clone/fetch via Git for Windows, and the symlink is restored properly on Windows
Result: shared composable Xaml files stored once in a common folder in the git repo & invoked from multiple UiPath projects that run successfully in both UiStudio and UiRobot. Sweet!
Note: I haven't yet tested symlink handling via git fetch/pull; only tested via a fresh repo clone.
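The whole round trip can be sketched like this (all names are illustrative; on Unix core.symlinks is effectively always on, while on Windows it needs Git for Windows installed with symlink support enabled plus permission to create symlinks):

```shell
set -e
git init -q demo && cd demo
git config core.symlinks true          # no-op on Unix, essential on Windows

mkdir Shared RobotA
echo '<Activity/>' > Shared/LogIn.xaml
ln -s ../Shared RobotA/Shared          # Windows: mklink /D Shared ..\Shared

git add -A
git -c user.email=demo@example.com -c user.name=demo commit -qm "share Xaml via symlink"
cd ..

# A fresh clone should restore RobotA/Shared as a real symlink,
# not a plain text file containing "../Shared"
git clone -q demo demo-clone
```

If symlink support is missing, the clone step is where things go wrong: the link comes back as an ordinary file holding the target path, which matches the publish failure described above.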
Hello everyone in this thread! So does this mean there is still no built-in functionality in UiPath for automatically managing code dependencies?
Currently I am also working out how best to implement code sharing between UiPath projects, but all I can see so far are manual hacks (folder symlinks, etc.).
Are there any official recommendations from the UiPath team?
@ovi I know this was in the product catalogue. Guessing it hasn’t been implemented yet…do you have an idea of when this will be available as it’s a pretty important feature?
Also, do you have information on how it will be implemented? Presumably the two packages will be published together rather than separately, but how would this work in the Invoke Workflow command? Would it be …/componentspackagename/… for example?
Last I heard Orchestrator 2018.3 is supposed to have new code sharing features but I don’t have any details…
Yep - we’ve seen a demo from @Sorin_Calin and it looks perfect
Uhh… can we get a link then to some docs for this magic new feature?
The only thing I see in the 2018.3 docs is:
Common (reusable) components (such as App Navigation, Log In, Initialization) are better stored and maintained separately, on network shared drives. From that drive, they can be invoked by different Robots, from different processes. The biggest advantage of this approach is that any change made in the master component is reflected instantly in all the processes that use it.
I read this thread and what you are trying to do is really simple, if I got it correctly.
You just need to keep your common workflows (CW) in a commonly accessible location. When you develop a robot, you invoke whatever CW you need, and then publish the robot.
Imagine you have deployed 15 robots that all use one CW. If that common workflow needs to be updated, you just need to update it and save it via UiPath Studio.
You won't need to republish all 15 robot packages, and when they are executed they will reflect the CW update.