Pulling logs from DB

Hi all,
We want to pull the logs from the Orchestrator DB and build some reporting in Power BI. Has anyone done this exercise before?
Note:
We have an Elasticsearch server, but we are thinking of moving away from Elastic for various reasons, so we thought this kind of serial architecture would be good:
robot logs – Orchestrator – SQL DB – reporting DB
rather than robot logs – Orchestrator – Orchestrator DB and Elastic (via web.config, simultaneously)

So please let me know about your experience with this.

I would look at adding an additional target to your Orchestrator NLog configuration so it writes the logs both to Orchestrator’s SQL Server DB and to your reporting DB. If your reporting platform can keep track of ingesting the database table, that could be an option, but depending on how much history you want to keep in Orchestrator’s DB, I think NLog might be your best route.
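To make that concrete, here is a rough sketch of what an extra Database target could look like inside the existing `<nlog>` section of Orchestrator's web.config. The target name, connection string, table/column names, and the event properties used below are all assumptions you would need to adapt to your own reporting schema, and the existing target/rule names can differ between Orchestrator versions:

```xml
<!-- goes inside the existing <targets> element of the <nlog> section -->
<!-- hypothetical additional target that writes robot logs to a separate reporting DB -->
<target xsi:type="Database" name="reportingDatabase" keepConnection="true"
        connectionString="Server=REPORTING-SQL;Database=RobotLogsReporting;Integrated Security=True;">
  <commandText>
    INSERT INTO dbo.RobotLogs (LogTime, LogLevel, ProcessName, RobotName, Message)
    VALUES (@timeStamp, @level, @processName, @robotName, @message)
  </commandText>
  <parameter name="@timeStamp"   layout="${longdate}" />
  <parameter name="@level"       layout="${level}" />
  <!-- assumes the robot log event carries these properties; adjust to what your version emits -->
  <parameter name="@processName" layout="${event-properties:item=processName}" />
  <parameter name="@robotName"   layout="${event-properties:item=robotName}" />
  <parameter name="@message"     layout="${message}" />
</target>
```

```xml
<!-- goes inside the existing <rules> element: append the new target to the Robot.* rule
     so robot logs are written to both the Orchestrator DB and the reporting DB -->
<logger name="Robot.*" writeTo="database,reportingDatabase" final="true" />
```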

We do something similar, but for Orchestrator > CloudWatch > Splunk. A simple diagram can be seen here


Well, thanks for your answer first of all. We are looking for a serial setup because when we configure two targets, with one of them being a flat file, there are problems and we end up missing some logs. When we contacted UiPath support, they said that "writing to a flat file is not a good choice". Currently we have Beats harvesting the logs from the log file and sending them to ELK, but Beats also creates problems sometimes (we are using only the community edition of Beats and ELK) and fails to harvest the logs, so we are missing vital logs. So the architecture below is better

[image: proposed serial log flow]

than the current parallel setup.

In my example I use a flat file written to disk in JSON format. While I don’t disagree that a flat file is not the ideal choice, it does have its time and place (I selected it due to time constraints at the time). That said, in the last two years I haven’t had any issues with missing log messages (at least not at that stage of the pipeline); if you are in that scenario, I would dig into why.
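For context, the flat-file target I mentioned looks roughly like the sketch below (the file path, JSON attribute names, and settings here are simplified placeholders, not lifted from a real install); it sits in the `<targets>` element of the `<nlog>` section alongside the default database target:

```xml
<!-- flat-file target writing one JSON object per log event,
     so a harvester (Beats etc.) can tail the file -->
<target xsi:type="File" name="robotLogsFile"
        fileName="C:\Logs\Orchestrator\robot-logs-${shortdate}.json"
        keepFileOpen="true" concurrentWrites="false">
  <layout xsi:type="JsonLayout" includeAllProperties="true">
    <attribute name="time"    layout="${longdate}" />
    <attribute name="level"   layout="${level:upperCase=true}" />
    <attribute name="message" layout="${message}" />
  </layout>
</target>
```

The target then just needs to be added to the writeTo list of the Robot.* rule next to the existing database target.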

Targeting a flat file on disk is by no means your only choice; you could just as easily direct the logs to another database, or directly into an ELK stack, or what have you.

The concern that I would have with another system reading directly from the original database (I haven’t watched closely how Orchestrator interacts with SQL Server) is the additional load and locking on Orchestrator’s database, which could have a performance impact on your Orchestrator/Robot activities.

Are you looking at a replica using SQL Server functionality, or another solution for reading the data directly out of the original database? (e.g. Splunk’s DB Connect) - Did you have a target solution/platform in mind for your reporting piece?

DB – Splunk
DB – Power BI

This is what we are thinking of. Also, do you use a timeout and buffered writing when the target is a flat file? And is concurrent writing set to true? I am talking about the web.config file.
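Just to be clear about which settings I mean, something along these lines in web.config (the values here are only placeholders, not our actual configuration):

```xml
<!-- buffered writing: BufferingWrapper batches log events and flushes them
     when the buffer fills up or after the flush timeout (milliseconds) -->
<target xsi:type="BufferingWrapper" name="robotLogsBuffer"
        bufferSize="100" flushTimeout="5000">
  <!-- inner flat-file target; concurrentWrites / keepFileOpen are the
       attributes I am asking about -->
  <target xsi:type="File" name="robotLogsFile"
          fileName="C:\Logs\robot-logs-${shortdate}.log"
          concurrentWrites="true" keepFileOpen="false" />
</target>
```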