Using AI Fabric endpoints for projects on an internal Orchestrator

I am wondering how we can publish a project that uses an ML Skill endpoint for invoice processing to a process running on our own internal Orchestrator.

We have many jobs, environments, robots, etc. set up in an Orchestrator installed on our own infrastructure. We do not use the cloud.uipath.com Orchestrator. How can we set up AI Fabric endpoints and use them in projects deployed in our environment?

We are looking for help on the same issue. Did you solve this problem?

Not really. We installed AI Fabric on-prem on a local Ubuntu server and started hosting models there. It has been a tough transition, and I hope they release the ability to call cloud AI Fabric endpoints from on-prem, but this seems to be the only option for now.

Hi there,

You can now call cloud AI Center endpoints from on-prem for Document Understanding models. We are going to release the ability to do the same for all ML Skills in the coming weeks; the step after that will be the ability to interact with datasets, again through endpoints.
For Document Understanding models this already works from on-prem: first deploy the Skill, then open the Skill, modify the current deployment, and make it public. Authentication is done with the DU API key, so make sure you use the key associated with the same Cloud account.
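For anyone who wants to try this from a script rather than from the activities, here is a minimal sketch of what calling a public DU Skill endpoint with the DU API key could look like. The URL is a placeholder and the header name is my assumption; use the exact values shown on your Skill's deployment page once you make it public.

```python
# Minimal sketch (not an official example): call a public Document Understanding
# ML Skill endpoint from on-prem using the cloud DU API key.
# SKILL_URL and the header name are placeholders/assumptions - copy the exact
# values shown on the Skill's "Modify current deployment" page.
import requests

DU_API_KEY = "<your-DU-api-key>"  # key from the same Cloud account as the Skill
SKILL_URL = "https://<your-skill-public-endpoint>"  # placeholder

with open("invoice.pdf", "rb") as f:
    response = requests.post(
        SKILL_URL,
        headers={"X-UIPATH-License": DU_API_KEY},  # header name may differ per deployment
        files={"file": f},
        timeout=120,
    )

response.raise_for_status()
print(response.json())  # fields extracted by the model
```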

Jeremy

Jeremy,

Is it correct to assume we need an API key for a pure on-prem installation of Orchestrator and AI Fabric?

Yes, except for air-gapped installations.

Hello Jeremy,

I have a few doubts regarding the same topic. I have Data Manager and Orchestrator set up on-prem, and we are using AI Fabric cloud endpoints in our OCR projects. It works fine when we only use the endpoints, but I can't access the ML Skills that are on the AI Fabric cloud version.
If I can access endpoints through an API key, why can't I access ML Skills through an API key?