Can we integrate Orchestrator with Tableau Desktop or UiPath Studio to get live log data?

Hi, I am trying to create a dashboard in Tableau which will show a live visualization of the Orchestrator logs. I am new to UiPath and I am not able to get live data into Tableau. All I can do is export the data from Orchestrator, which is a manual task. I want this automated, either through a direct integration or by getting the data through Studio. I am using a virtual machine for this, so any mouse-click activity gives me an error, as it is unable to find the UI element when the VM session is not active.

I’m not familiar with Tableau, but you can connect to the Orchestrator API and request logs that way:
https://platform.uipath.com/xyz/xyz/odata/RobotLogs

Check the documentation page:
https://docs.uipath.com/orchestrator/reference/robots-requests#retrieving-robot-logs-according-to-the-robot-name
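To make that concrete, here is a minimal Python sketch of pulling logs through the API with only the standard library. The tenant URL, tenancy name, and credentials are placeholders, and the `api/Account/Authenticate` endpoint is the classic (on-premises style) Orchestrator login; verify both against the docs for your Orchestrator version before relying on this.

```python
import json
import urllib.request

BASE_URL = "https://platform.uipath.com/xyz/xyz"  # placeholder tenant URL


def logs_url(base_url, top=100):
    """Build the OData query for the most recent robot logs."""
    return f"{base_url}/odata/RobotLogs?$top={top}&$orderby=TimeStamp desc"


def authenticate(base_url, tenant, user, password):
    """POST to the classic Orchestrator auth endpoint; returns a bearer token."""
    body = json.dumps({
        "tenancyName": tenant,
        "usernameOrEmailAddress": user,
        "password": password,
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/api/Account/Authenticate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["result"]


def fetch_logs(base_url, token, top=100):
    """GET the newest robot log entries from odata/RobotLogs."""
    req = urllib.request.Request(
        logs_url(base_url, top),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]


# Usage (requires a reachable Orchestrator; credentials are placeholders):
# token = authenticate(BASE_URL, "my-tenant", "my-user", "my-password")
# for entry in fetch_logs(BASE_URL, token, top=10):
#     print(entry["TimeStamp"], entry["Level"], entry["Message"])
```

Once this works from a script, you can schedule it to dump the JSON somewhere Tableau can read, instead of exporting manually.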


According to their website, Tableau Desktop supports a number of data sources. A few that jump out at me are:

  • JSON files
  • Microsoft SQL Server
  • OData
  • Splunk

At its core, UiPath (Orchestrator, Studio, Robot, etc.) uses NLog to manage and write logs, so you have a lot of flexibility in how and where to send the logs you want, in the format that you need.

JSON files
You can simply introduce a new NLog target and layout to dump your robot logs into a JSON file, which NLog can also rotate for you.
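As a sketch only, such a target could look like the fragment below in the Robot's NLog configuration. The target name, file path, rotation settings, and the catch-all logger rule are assumptions for illustration; check your actual NLog.config for the logger names UiPath wires up before adding rules.

```xml
<!-- Sketch: goes inside <targets> in the Robot's NLog config -->
<target xsi:type="File" name="robotLogsJson"
        fileName="C:\RobotLogs\robot-logs.json"
        archiveEvery="Day" maxArchiveFiles="7">
  <layout xsi:type="JsonLayout">
    <attribute name="time" layout="${longdate}" />
    <attribute name="level" layout="${level:upperCase=true}" />
    <attribute name="message" layout="${message}" />
  </layout>
</target>

<!-- ...and a matching rule inside <rules> -->
<logger name="*" minlevel="Info" writeTo="robotLogsJson" />
```

Tableau's JSON file connector can then point at the rotated file.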

Microsoft SQL Server
UiPath Orchestrator uses SQL Server as its backend database. You’d have to familiarize yourself with the database schema, but as long as Tableau Desktop can reach the server, this would be a viable option. Alternatively, if you are concerned about performance, you could create a new NLog target to direct a copy of the logs to another database on the SQL Server, or even to a different database type altogether.
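As a starting point for exploring that schema, the query below could be used as Tableau custom SQL or sanity-checked from Python first. The `dbo.Logs` table and column names are assumptions about a typical Orchestrator database; confirm them against your own schema before building a dashboard on them.

```python
def recent_logs_query(top=500):
    """Build a SQL Server query for the most recent robot log entries.

    Assumes robot logs live in dbo.Logs with TimeStamp, Level, RobotName,
    ProcessName, and Message columns -- verify against your schema.
    """
    return (
        f"SELECT TOP {int(top)} TimeStamp, Level, RobotName, ProcessName, Message "
        "FROM dbo.Logs ORDER BY TimeStamp DESC"
    )


# Usage with pyodbc (connection string is a placeholder):
# import pyodbc
# conn = pyodbc.connect(
#     "DRIVER={ODBC Driver 17 for SQL Server};SERVER=...;DATABASE=...;Trusted_Connection=yes"
# )
# rows = conn.execute(recent_logs_query(100)).fetchall()
```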

OData REST API
UiPath Orchestrator uses OData for its REST API, as @krzysztof.szwed pointed out; one of those endpoints is odata/RobotLogs. There are others, so be aware, since you don’t mention which logs you are interested in. With any luck, you won’t need to know much beyond the endpoints you care about to make the connector work. Keep in mind that best practice is to perform periodic maintenance on your SQL Server database, so if you want more than just recent logs, you may need to look at sending your logs somewhere outside Orchestrator for historical reporting.

Take a look at the documentation for OData and the API in Orchestrator, as well as Swagger, for more details.

Splunk
I thought I would mention this: depending on your setup, resources, and constraints, Splunk has a free tier that allows you to configure and ingest up to 500 MB/day. It is easy enough to configure NLog to send log events to Splunk, and/or to ingest your other data sources there as well.
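If you go the Splunk route and ever want to push events yourself rather than through an NLog target, Splunk's HTTP Event Collector (HEC) accepts simple JSON posts. The host, port, and token below are placeholders, and HEC must be enabled on your Splunk instance; the envelope shape (`event` plus metadata like `source`) is the standard HEC format.

```python
import json
import urllib.request


def hec_payload(message, level="Info", source="uipath-robot"):
    """Wrap a log line in the envelope Splunk's HTTP Event Collector expects."""
    return {"event": {"message": message, "level": level}, "source": source}


def send_to_splunk(hec_url, token, payload):
    """POST one event to HEC, e.g. hec_url='https://splunk:8088/services/collector'."""
    req = urllib.request.Request(
        hec_url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Splunk {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Usage (placeholder host and token; requires a reachable Splunk instance):
# send_to_splunk("https://splunk.example:8088/services/collector",
#                "00000000-0000-0000-0000-000000000000",
#                hec_payload("Process started"))
```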
