Writing output logs into SQL Server

Hi - We are interested in writing the output status into a SQL Server table. For example, say there are 20 bots performing an invoice-processing process. We need to capture information like:

  1. Transaction Start/End Time
  2. Completion Status
  3. Failure Message
  4. etc.
    for all 20 bots (maybe ~100 records each per day) into SQL Server.
    What is the best way to do this? Will it be a problem if 20 bots try to write at the same time, causing a deadlock and thus a delay in bot execution?

Is there another option, like having the bots update a text file of some sort and a separate bot dedicated to writing into the DB?

Please share your thoughts & best practices.

regards
Sri

I think you can do it inside the workflow itself. The output of the Log Message activity can be passed to the Execute Query activity in order to write it to the SQL database.
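For illustration, here is a minimal sketch of the kind of parameterized insert that activity would run. The table, columns, and connection details are assumptions; Python/pyodbc is used only to make the example self-contained, since inside a workflow you would run the same SQL from the database activities:

```python
# Minimal sketch of the logging insert; table, column, and connection details
# are assumptions. Inside a workflow you would run the same parameterized SQL
# from the Execute Query / Execute Non Query activity instead of pyodbc.
import pyodbc
from datetime import datetime

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=BotLogs;Trusted_Connection=yes;"  # assumed
)

SQL = """
INSERT INTO dbo.BotTransactionLog
    (BotName, TransactionStart, TransactionEnd, CompletionStatus, FailureMessage)
VALUES (?, ?, ?, ?, ?);
"""

with conn:  # pyodbc commits on a clean exit from the block
    conn.execute(
        SQL,
        ("InvoiceBot01", datetime(2020, 1, 6, 9, 0), datetime.now(), "Success", None),
    )
```

Keeping the query parameterized (the `?` placeholders) also protects you if a failure message happens to contain quotes or other SQL-sensitive characters.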

Thank you.
The question was more about whether, with 20 bots writing to the same table at almost the same time, there would be a performance bottleneck and hence slowness in bot execution.

Is this the best way to do it?

Regards

I would suggest writing a separate workflow which reads the log files and puts them into the database. Think of a workflow which pulls up all the log files from the 20 or so bots and, since the log files are in JSON format, parses them and loads them into the database. This way you would not encounter locks on the table, since only one workflow would perform the inserts, and your bot workflows would not get stuck (in error, etc.) due to database issues (connection problems, downtime, etc.). A rough sketch of that collector follows below.
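Assuming each robot log line contains a JSON object (adjust the file locations and field names to your actual log layout), the collector could look roughly like this:

```python
# Sketch of a single "collector" that parses JSON robot log lines and loads
# them into SQL Server, so only one writer ever touches the table. The log
# path, file pattern, and JSON field names are assumptions.
import glob
import json
import pyodbc

LOG_GLOB = r"\\fileshare\botlogs\*_Execution.log"  # assumed shared log drop

def parse_lines(path):
    """Yield the JSON payload of each log line; skip anything that isn't JSON."""
    with open(path, encoding="utf-8") as f:
        for line in f:
            start = line.find("{")  # some layouts prefix the JSON with a timestamp
            if start == -1:
                continue
            try:
                yield json.loads(line[start:])
            except json.JSONDecodeError:
                continue

rows = [
    (e.get("processName"), e.get("timeStamp"), e.get("level"), e.get("message"))
    for path in glob.glob(LOG_GLOB)
    for e in parse_lines(path)
]

if rows:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=BotLogs;Trusted_Connection=yes;"
    )
    with conn:
        cur = conn.cursor()
        cur.fast_executemany = True  # send the whole batch in one round trip
        cur.executemany(
            "INSERT INTO dbo.BotLog (ProcessName, LoggedAt, Level, Message) "
            "VALUES (?, ?, ?, ?);",
            rows,
        )
```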

But I would definitely recommend using the ELK stack for all your logging purposes; if that does not help out, or you want to achieve something that ELK cannot offer, then maybe I would look at other options.

Ideally I would log everything in ELK and use the ELK APIs to pull specific content for any downstream processes.
ref - ELK APIs
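For example, pulling recent results for one process back out of Elasticsearch via its `_search` API could look like this; the host, index name, and field names here are assumptions:

```python
# Sketch of querying Elasticsearch's _search API for a downstream process.
# The host, index name, and field names are assumptions.
import requests

resp = requests.get(
    "http://elk-host:9200/bot-logs/_search",  # assumed index
    json={
        "query": {"match": {"processName": "InvoiceProcessing"}},
        "sort": [{"@timestamp": {"order": "desc"}}],
        "size": 100,
    },
)
for hit in resp.json()["hits"]["hits"]:
    print(hit["_source"].get("message"))
```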

Use the Add Log Fields activity to add your custom fields (status, failure message, etc.) to the robot logs.

One option is to keep the bots' processing and the SQL updates separate.
Use the SQL package below; it contains functions to query Orchestrator via its API.

What you can do is schedule a SQL job hourly, or on whatever schedule you like. In the SQL job, call a query to read all recently completed jobs; it reads the status/logs from Orchestrator and updates the SQL table.
This way both functions can be independent of each other.
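The data flow would be roughly the following. This sketch uses Python against the classic on-premises Orchestrator endpoints rather than the SQL package itself, and the URL, credentials, filter window, and table names are all assumptions:

```python
# Sketch of the scheduled step: read recently completed jobs from
# Orchestrator's OData Jobs endpoint and write them to SQL Server.
# URL, credentials, filter window, and table names are assumptions.
import requests
import pyodbc

BASE = "https://orchestrator.mycompany.com"  # assumed Orchestrator URL

# Authenticate (classic on-prem endpoint) and grab a bearer token.
token = requests.post(
    f"{BASE}/api/Account/Authenticate",
    json={
        "tenancyName": "Default",
        "usernameOrEmailAddress": "svc_sqlsync",
        "password": "***",
    },
).json()["result"]

# Read jobs that finished after a given point in time.
jobs = requests.get(
    f"{BASE}/odata/Jobs",
    params={"$filter": "EndTime gt 2020-01-06T09:00:00Z"},  # substitute "last run"
    headers={"Authorization": f"Bearer {token}"},
).json()["value"]

if jobs:
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=BotLogs;Trusted_Connection=yes;"
    )
    with conn:
        conn.cursor().executemany(
            "INSERT INTO dbo.JobHistory (ReleaseName, StartTime, EndTime, State, Info) "
            "VALUES (?, ?, ?, ?, ?);",
            [
                (j["ReleaseName"], j["StartTime"], j["EndTime"], j["State"], j.get("Info"))
                for j in jobs
            ],
        )
```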