Design Best Practice for interaction between Projects?

Does anybody have any design best practices for interactions between projects?

Within a project, design best practice for interaction between workflows is understood and documented.

Orchestrator provides the ability to “Schedule” processes but does not provide a “Dependency” between them.

The scenario: a “Project” in Studio creates a “Package”, which is then defined as a “Process” in Orchestrator. Depending on the success or otherwise of that “Process”, a different “Project” may need to execute in order to “Clean up” after the error situation.

The same “Clean-up” process might be needed after different “Process” executions, which suggests its code should not be duplicated into each “Process” as extra workflows.

My only thought is to use an Orchestrator queue and place information on it. Such an approach, however, would mean a lot of executions of the “Clean-up” process just to check the state of the information on the queue.
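
To make that concrete, here is a rough sketch of what I mean - pushing a clean-up request onto a queue through the Orchestrator REST API. The URL, token, queue name and content fields are all made up; from inside a workflow the Add Queue Item activity would do the same job:

```python
# Rough sketch only: add a "clean-up request" item to an Orchestrator queue
# so a separate clean-up process can pick it up later.
# ORCH_URL, TOKEN and the queue/field names below are placeholders.
import requests

ORCH_URL = "https://orchestrator.example.com"            # hypothetical Orchestrator URL
TOKEN = "<bearer token from /api/Account/Authenticate>"  # placeholder

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
    # Depending on your Orchestrator version you may also need a folder header,
    # e.g. "X-UIPATH-OrganizationUnitId": "<folder id>",
}

payload = {
    "itemData": {
        "Name": "CleanUpRequests",                   # hypothetical queue name
        "Priority": "High",
        "SpecificContent": {                         # data the clean-up process needs
            "FailedProcess": "Invoice_Processing",   # made-up example values
            "ErrorMessage": "Downstream system unavailable",
        },
    }
}

resp = requests.post(
    f"{ORCH_URL}/odata/Queues/UiPathODataSvc.AddQueueItem",
    json=payload,
    headers=headers,
)
resp.raise_for_status()
print("Queue item created with Id", resp.json()["Id"])
```

If your Orchestrator version supports queue triggers, the clean-up process could be started automatically whenever an item lands on that queue, which would avoid most of the polling executions I am worried about.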

Hi @DavidMartin

You may want to have a look at this post. It has a lot of information on running multiple separate projects through another controller workflow…

I have developed a component which is a much-improved version of what is shared in the above post. It is still in the review state and will be published soon on Connect.
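
For context, a controller like that essentially boils down to starting other deployed processes through Orchestrator’s Start Jobs API and reacting to the outcome. A rough sketch of just that underlying call (not my component; the URL, token, process name and arguments are placeholders):

```python
# Rough sketch only: a "controller" starting another deployed process
# (e.g. a clean-up process) through the Orchestrator REST API.
# ORCH_URL, TOKEN and the process name are placeholders.
import requests

ORCH_URL = "https://orchestrator.example.com"   # hypothetical Orchestrator URL
TOKEN = "<bearer token>"                        # placeholder

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}

# Look up the Release key of the deployed "Process" by its name.
releases = requests.get(
    f"{ORCH_URL}/odata/Releases",
    params={"$filter": "Name eq 'CleanUp_Process'"},   # hypothetical process name
    headers=headers,
).json()["value"]
release_key = releases[0]["Key"]

# Start one job for that process and pass it some input arguments.
start_info = {
    "startInfo": {
        "ReleaseKey": release_key,
        "Strategy": "JobsCount",
        "JobsCount": 1,
        # InputArguments is a serialized JSON string; the argument name is made up.
        "InputArguments": "{\"in_FailedProcess\": \"Invoice_Processing\"}",
    }
}
resp = requests.post(
    f"{ORCH_URL}/odata/Jobs/UiPath.Server.Configuration.OData.StartJobs",
    json=start_info,
    headers=headers,
)
resp.raise_for_status()
```

The controller can then watch the job’s state via /odata/Jobs and decide whether the clean-up process needs to run.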

Let me know whether this helps!!


Thanks. That post did not come up when I was searching.
It contains some good information.

Here is my view on this…

A “project” should invoke from a pool of workflows, and those workflows can either be specific only to that project or shared among other projects and future projects (in other words, “reusable”). In this case, there should never be a reason for one project to interact with another project, in my opinion. Doing that would mess up the infrastructure of your robot and project environment. - I cringe whenever I work on a project created by someone else that consists of multiple processes deployed in Orchestrator, because it makes maintenance annoying.

If you are talking about workflows like “clean-up” workflows or various other tasks that may be common between multiple projects, then there are a few good methods for this. One is to simply place these workflows in a network location accessible by the robots and invoke them by their full file path (for example, an Invoke Workflow File activity pointing at something like \\fileserver\RPA\Shared\CleanUp.xaml). A second way is to look into deploying Libraries, which the projects can then pick up from Orchestrator as dependencies. Another way, which I have never done myself so I am not sure how best to do it, is to publish the package and then invoke the workflow from the local location where the package is deployed on the robot machines.

I don’t know of any of this being documented, though. Before Libraries, we would simply call the full path to the workflow, storing the workflows in a network location. Now that Libraries are here, that might be a good solution, but it is something that takes time if you are like me and have a large number of workflows to redeploy as Libraries, plus all the projects to fix so they call these Libraries.

But, anyway, sorry if none of this is helpful :upside_down_face:

Regards.


Thanks for the input.
I think you have illustrated that there are different approaches and expectations about how functionality should be divided in order to get isolation and reuse.
Regards, Dave

Hi Clayton,

Do I understand you correctly that your clean-up workflows are then not deployed as part of the project but rather on a company-wide share? Interesting approach!

We are working on some fairly complex automations with multiple state machines that share code but use different invoke parameters, and I guess this could work with that. The issue we see is that the complexity of maintaining the solution rises, and some tend to move back to the old ways and keep a copy of the code in each state machine folder. However, this brings the other issue of remembering to change the code in both places :skull_and_crossbones:

Thanks for your insights, and @DavidMartin thanks for your question as well!

Sonny

If it is something that would need to be updated in every project whenever a change is made, then sharing it would definitely be beneficial. If it is something that just needs to keep working for that one project, then keeping it as part of the project would work. - I use a combination of both. I believe you will eventually want to migrate most things to a Library project, because that gives you versioning.

You just gotta be careful about sharing workflows when there is no versioning, because any change to a shared workflow can impact projects that are already deployed. That is why Libraries are something to move toward, but it is challenging for sure, since you might already have many, many projects and many, many reusable shared workflows that need to be migrated. - It requires an RPA infrastructure change.