Resolving dependency issues on robot systems without internet access?

Hello all! My very first post :wave: :slight_smile: !

Is there a better way to resolve NuGet dependencies on systems/VMs without internet access?

Current solution: I’ve resorted to identifying the differences in the ‘Users\username\.nuget’ folder and sub-folders between a working (internet-enabled) environment and the non-internet robot virtual machine, using PowerShell scripts and a comparison application, then resolving the dependencies by uploading each one to the tenant ‘Packages’/‘Libraries’ feed in Orchestrator.

After ensuring the dependencies on Orchestrator match the working environment, re-running the robot on systems without internet pulls the correct packages from Orchestrator. This is time-consuming because there’s no easy way to identify which dependencies a project actually depends on, since I’m comparing the two environments globally.

Ideal solution: run a script on an internet-enabled machine that I could point at a project.json file, which would recursively either collect (from the running system) or download (from the internet) all dependencies (and sub-dependencies) as *.nupkg files for that project; I could then upload them to Orchestrator manually.
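Something like this sketch is what I have in mind for the download half (sketch only; nuget.exe is assumed to be on the PATH, the project path and output folder are placeholders, and nuget.exe install resolves sub-dependencies on its own),

$project = Get-Content 'C:\Projects\MyProcess\project.json' -Raw | ConvertFrom-Json
$outDir = 'C:\temp\nupkgs'
New-Item -ItemType Directory -Path $outDir -Force | Out-Null
foreach ($dep in $project.dependencies.PSObject.Properties) {
    # UiPath pins versions as "[1.2.3]"; strip the brackets for nuget.exe
    $version = $dep.Value.Trim('[', ']')
    & nuget.exe install $dep.Name -Version $version -OutputDirectory $outDir
}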

To take it one step further, it would be a dream if I could then cross-reference the libraries on Orchestrator against the dependencies identified as missing, and highlight only the ones I need to upload.
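For that cross-reference, I imagine something like this sketch against the Orchestrator OData API (the URL, token and local folder below are placeholders; it assumes the tenant Libraries endpoint),

$headers = @{ Authorization = "Bearer $token" }
$onOrchestrator = (Invoke-RestMethod -Uri 'https://orchestrator.local/odata/Libraries' -Headers $headers).value |
    ForEach-Object { "$($_.Id).$($_.Version).nupkg" }
# List local packages not already present on the tenant feed
Get-ChildItem 'C:\temp\nupkgs' -Recurse -Filter *.nupkg |
    Where-Object { $onOrchestrator -notcontains $_.Name } |
    Select-Object -ExpandProperty Name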

Does anyone have any suggestions to improve on my current solution, or to get closer to my ideal solution?

Thank you kindly!

Not sure why you’re uploading the packages manually? Are they all custom packages, or official?

If the VM doesn’t have internet connectivity, how are the robots going to run?

You can set things up so that all the components/packages created by the team are published to the host library feed, and configure every tenant to use Host as its feed source. That way, when a robot is deployed on a VM, it will get the required packages from the host feed. To run a robot you need internet access, so that will go smoothly.
Hope this helps, let us know your thoughts

I’m uploading the packages manually because none of the systems have internet access (custom/3rd-party/official packages alike).

The robots don’t need internet access to run in an on-prem configuration (Orchestrator included).

Thank you, but we don’t/can’t have internet on the robot VMs.

For an update on my current solution: I’m now building projects pre-launch in a clean VM, then scraping/pushing all the missing packages between the VM and Orchestrator. I consider this solved with that adjustment, given the constraints of the environment.

I might suggest building one or more internal NuGet repositories that act as a mirror of the public repositories.

This way you control whether to mirror all public packages or a sub-set, and which versions. You would then configure Orchestrator to use this internal repository as a source, or configure your Library and Package feeds to use a repository external to Orchestrator.
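On the build/dev machines, pointing NuGet at such a mirror is a one-liner (the feed name and URL here are hypothetical),

& nuget.exe sources add -Name 'InternalMirror' -Source 'https://repo.internal.example/api/nuget/mirror'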

We use Artifactory from JFrog, as well as Nexus, as internal artifact/package repositories and for managing mirrors of external repositories, so that systems without direct internet access can still do their dependency management during the build process.

Another potential option is to allow the Orchestrator or robot VMs access to specific trusted repositories on the public internet via a whitelist/ACL.

In the same vein as Artifactory, there are plenty of other services and products out there, such as myget.org, that can be hosted publicly or privately (cloud or on-prem).

Long term I wouldn’t want to do all that manually… it’s one thing to take care of your Packages’ dependencies, but what about your dependencies’ dependencies’ dependencies?


Thank you very much for the insightful answer, Tim! I do believe this is the correct long-term approach and will mark it as the solution.

We’re currently quite a lean operation (I’m the sole RPA Dev), so it’s not too much of a setback at present. However, you’re absolutely right about the unsustainability of a manual approach long term. I’ll save your post for when I’m ready to tackle a more comprehensive, forward-thinking approach like the one you shared.


In case it helps anyone who’s still doing the manual approach,

Run this PowerShell command from the .nuget folder in both environments (adjusting the output filename per environment),
Get-ChildItem -Path .\ -Recurse -Include *.nupkg | Select-Object -ExpandProperty Name | Out-File c:\temp\packages-env.txt

Then run this against the two text files you’ve generated (or use a diff application),
Compare-Object (Get-Content .\packages-required-cleanvm.txt) (Get-Content .\packages-test-existing.txt) |
    Where-Object { $_.SideIndicator -eq '<=' } |
    Select-Object -ExpandProperty InputObject |
    Sort-Object | Out-File unique.txt

unique.txt will list the packages ‘unique’ to the clean environment compared to the one you intend to run the process on; simply upload those missing packages to Orchestrator and re-run the process.
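If you want to script the upload step as well, a sketch like this might work on PowerShell 7+ (for -Form), with a placeholder URL, token and package folder, using the Orchestrator OData UploadPackage action on the Libraries feed,

$headers = @{ Authorization = "Bearer $token" }
Get-Content .\unique.txt | ForEach-Object {
    # Find each missing .nupkg locally, then POST it to the tenant feed
    $file = Get-ChildItem 'C:\temp\nupkgs' -Recurse -Filter $_ | Select-Object -First 1
    Invoke-RestMethod -Method Post `
        -Uri 'https://orchestrator.local/odata/Libraries/UiPath.Server.Configuration.OData.UploadPackage' `
        -Headers $headers -Form @{ file = $file }
}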

