Anyone out there doing Kibana insights?

Hi all,

I’ve just discovered the magnificence of Elasticsearch + Kibana, so I have some big new ideas.

I’m wondering how many of you are already using “Log Message” to write structured bits of real-time data into your logs from which you can later glean metrics and insights? Is there a better approach?

I’d be curious to hear about some use cases and/or successes…



I recommend you have a look at the “Add Log Fields” activity and use those custom log fields to make a clear segregation between your logs (e.g. transaction status, exception type). Using them will be slightly more robust than relying on messages alone.


Thanks for the affirmation. That’s definitely in my plan.

I was just curious to know whether anyone is already doing this kind of thing… i.e. emitting JSON-type entities into your Elasticsearch logs that can later be interrogated via Kibana queries & aggregations… or even interrogated in real time via API to make AI-type decisions during robot execution.


I’m currently using Kibana for just that. Within our automation processes we use the add log fields and log message activities to send specific bits of data to Kibana. From there we are creating dashboards and visualisations to send out to stakeholders.

Are you looking for anything specific?


Great to hear. Since I’m pretty new to ES & Kibana… I was hoping to identify some role models to spark ideas and help affirm my approach.

So… given that the most common Elasticsearch use case is to index NoSQL “documents” for the sake of providing search & insights in the context of an application… I’m wondering whether you are essentially indexing structured “documents” from your robots via Log Message + extra fields?

I know that log message will add a string into the log, and since I haven’t used the extra fields yet, I don’t know what kind of additional structure that introduces into the log message. I’m guessing the normal Log Message item is already JSON format behind the scenes. Do you have a sample of what that format looks like? I don’t have access to our ES instance yet to see it.

When we add new fields, are they just sibling elements next to the original log message field (within the same object), or are they child elements of the main log message element? I’ll be trying it myself soon… but if you had a quick sample of what a message with extra fields will look like, it would help me wrap my brain around it.

I’m trying to imagine whether I have any ability to create a hierarchical relationship (nested objects) using “add log fields”, or if I would need to create my own JSON objects in the standard “log message” call to accomplish that.

Before I learned about extra fields, I was imagining that I might create my own JSON string (containing multiple name/value pairs) that describes a searchable entity… and then simply log that.
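To sketch that idea (all of the field names here are made up purely for illustration, not any UiPath schema), the entity could be serialized to a single JSON string before logging, which also allows nesting:

```python
import json

# Hypothetical searchable entity; names are illustrative assumptions.
entity = {
    "transactionId": "TX-1042",
    "status": "Success",
    "timings": {"queueWaitSec": 3.2, "processingSec": 11.7},  # nested object
}

# Serialize to a single string that a Log Message call could emit;
# the nested structure survives inside the string.
message = json.dumps(entity)
print(message)
```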


So I’ve attached a preview of a test log message (some parts omitted) showing how it looks as a table and as JSON. On this you can see three fields, called argument1, argument2, and argument3. These three fields were added using the Add Log Fields activity.

When you run a Log Message after the Add Log Fields activity, this will push a new document to the Elasticsearch database. The message field is the specific log message, and any added log fields appear as extra fields in that document (the three arguments in this case).
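As a rough sketch of what that indexed document looks like (the exact field names depend on your logging target configuration, so treat these as assumptions), note that the custom fields sit next to the message as siblings, not nested under it:

```python
import json

# Approximate shape of one indexed log document; field names are assumptions.
log_document = {
    "message": "Transaction completed",  # the Log Message text
    "level": "Information",
    "argument1": "value1",  # added via Add Log Fields
    "argument2": "value2",  # sibling of "message", not a child
    "argument3": "value3",
}
print(json.dumps(log_document, indent=2))
```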

When we use this for data logging, we make sure the documents we want to count contain a specific log message, and then filter for that message.
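That filtering step could be sketched as an Elasticsearch query-DSL body like the one below (the field name "message" and the marker text are assumptions):

```python
import json

# Minimal query-DSL sketch: match only the documents carrying a specific
# marker message, e.g. for counting them in a visualisation.
count_query = {
    "query": {
        "match_phrase": {"message": "Transaction completed"}
    }
}
# This JSON body could be POSTed to an index's _count or _search endpoint.
print(json.dumps(count_query))
```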

Hope that makes sense.


Hey, I am working on a “best practices” document about logging and reporting with Kibana. It is following the same principles as what Jake describes above, and should contain all the information you need to know to squeeze the maximum out of logs/Kibana. Once we review it internally we want to make it public to all users.


@Sorin_Stania Looking forward to this document

Kibana RPA how-to version 1.pdf (1.2 MB)

This is the Kibana document. I have also finished the logging best practices document, but we are still validating it internally.


Thanks! Will check it out later today.

Thank you

Hi Sorin,

Have you finished the section regarding logging? I would love to give it a read.

Best regards,

Hello - Is this available anywhere?



We are currently working on integrating this into the product to make it easier to use, and also into the ReFramework. Once this is integrated, most of the things mentioned in this document will happen automatically, in the background. Are you using the ReFramework by any chance?



Please see the next reply. We have been working a lot on the methodology and integrating it into the product, so it’s still a moving target - basically I need to write a new version that is simplified in terms of implementation, though the principles are the same… so please stay tuned. You can use the ReFramework in the meantime; please ping me again in a few weeks, I might have some updates. Thanks for your patience!


Not yet, but we have plans once we upgrade to 2018.X. Currently we are working on enhancing our reports using Kibana/Elasticsearch. In that process, I’m trying to research the best ways the Robot can perform logging, so that the reporting can be easily accomplished.

That was my goal as well

Thanks @Sorin_Stania.
I am new to UiPath and struggling to get the RPA logs in “.json” format, which seems a basic prerequisite for this document.
Would you suggest any configuration settings to store log files in “.json” format, or point me towards a discussion/exercise where it’s explained? (Currently the log files are in .log/.txt format.)

Thanks in advance.

Hey there, sorry for the delay. I’m not sure whether we are discussing the format of the file or its extension. You can change the extension, but that won’t change the internal format of the file. What are you trying to accomplish?

Hi, I wanted to report on the UiPath logs. Currently we are using the Desktop version.
The log is created as .txt, whereas to write the logs to Elasticsearch and view them in Kibana we need a .json file.
That’s why I am searching for a way to store the log file as .json so I can feed it into Elasticsearch.
I found one way (changing the config file) to store the log file as .json, but I ran into an issue: since the .json file didn’t have an action header before each line, I couldn’t bulk-upload it to Elasticsearch.
So we need direction/help on how to go about this until Orchestrator is set up.
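For reference, the missing action header that the Elasticsearch `_bulk` endpoint expects before each document could be added with a small script like this sketch (it assumes the log file already holds one JSON object per line; the index name "uipath-logs" is an assumption):

```python
import json

# Interleave the action metadata line the _bulk API requires before each
# document. Input: an iterable of JSON-object lines from the log file.
def to_bulk(json_lines, index="uipath-logs"):
    out = []
    for line in json_lines:
        if not line.strip():
            continue  # skip blank lines
        out.append(json.dumps({"index": {"_index": index}}))  # action header
        out.append(json.dumps(json.loads(line)))              # the document
    return "\n".join(out) + "\n"  # a bulk body must end with a newline

sample = ['{"message": "Process started", "level": "Info"}']
print(to_bulk(sample))
```

The resulting string can then be POSTed to `_bulk` with a newline-delimited JSON content type.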