HTTP request to fetch job logs

Hi,
I have a Process Monitoring process where I use an HTTP request to fetch logs from a specific process folder. Currently, I’m able to export logs, but I want to modify the process so that I can retrieve logs for each job directly without relying on the export functionality.

From my investigation, I found that the log export limit is 100 per tenant per day. This makes it difficult to monitor jobs at scale.

Is there a way to use an HTTP request to get the job logs in JSON response (for each job) instead of exporting them?

Any guidance or sample API requests would be really helpful.

Thanks in advance!

@Shahazeer_U
You can obtain the job list by using the Get Jobs activity, then use the API to retrieve the logs for each job by iterating through that list.

https://cloud.uipath.com/{organizationName}/{tenantName}/orchestrator_/swagger/ui/index#/
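To make that iteration concrete, here is a minimal Python sketch of how the per-job log requests could be built against the OData endpoints visible in that Swagger UI. The endpoint paths (`/odata/Jobs`, `/odata/RobotLogs`), the `JobKey` filter syntax, and the auth header are assumptions based on the public Orchestrator API surface, not something verified against your tenant:

```python
from urllib.parse import urlencode

# Assumed base URL shape, matching the Swagger link above.
BASE = "https://cloud.uipath.com/{org}/{tenant}/orchestrator_"

def jobs_url(org: str, tenant: str, top: int = 100) -> str:
    """URL to list recent jobs (the Get Jobs activity wraps this same data)."""
    base = BASE.format(org=org, tenant=tenant)
    return base + "/odata/Jobs?" + urlencode({"$top": top, "$orderby": "StartTime desc"})

def job_logs_url(org: str, tenant: str, job_key: str,
                 top: int = 100, skip: int = 0) -> str:
    """URL for one job's logs as JSON, filtered by JobKey (filter syntax assumed)."""
    base = BASE.format(org=org, tenant=tenant)
    params = {"$filter": f"JobKey eq {job_key}", "$top": top, "$skip": skip}
    return base + "/odata/RobotLogs?" + urlencode(params)

# A real call would send these URLs with an "Authorization: Bearer <token>" header
# and loop over every job Key returned by the /odata/Jobs response.
```

In a workflow, the HTTP Request activity would issue one `job_logs_url`-style call per job key from the Get Jobs output.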

A solution can be found on the Forum.
How download Orchestrator Logs for specific job through API (UiPath Community Forum)

Where are you seeing the limit of 100 logs per tenant per day?

As far as I know it's throttled, but it's per minute, not per day.

Not 100 logs. The number of times you can export logs for any process/folder per day is 100. My initial process used to export logs for each job, and there were at least 100 jobs in each process folder. I hit the limit, so I had to modify the process to export logs from each process folder and then split the logs into the respective jobs using Excel activities.

Orchestrator - Rate limits and large data field usage optimization

@Shahazeer_Ummer_D I believe it’s possible to get more than 100 job logs, and there are no daily limits on GET request API calls.
I’m referring to the link you shared.

Hi,

I think we can achieve it using /odata/Jobs with a date filter, etc.
(I cannot share it, but I have written code using this approach.)

Please note that there is a limit on the number of logs per request, as well as a rate limit.
So it may be necessary to make multiple requests for smaller batches of logs so as not to exceed the per-request log limit, and to wait a few seconds between requests so as not to exceed the rate limit.
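One way to stay inside both limits is to page through the logs in small `$top`/`$skip` batches and pause between calls. Here is a minimal sketch with the HTTP call abstracted behind a `fetch_page` callable, so the paging logic itself is testable; the page size of 100 and the 6-second wait are assumptions taken from this thread, not documented values:

```python
import time

PAGE_SIZE = 100     # assumed per-request log limit (per the discussion above)
WAIT_SECONDS = 6    # assumed pause to stay under the per-minute rate limit

def fetch_all_logs(fetch_page, page_size=PAGE_SIZE, wait=0):
    """Pull logs in small pages until a short page signals the end.

    fetch_page(top, skip) -> list of log dicts; in a real run this would wrap
    an HTTP request to /odata/RobotLogs with $top/$skip query parameters.
    """
    logs, skip = [], 0
    while True:
        page = fetch_page(page_size, skip)
        logs.extend(page)
        if len(page) < page_size:   # short page -> no more logs
            return logs
        skip += page_size
        if wait:
            time.sleep(wait)        # stagger requests to respect the rate limit
```

In production `fetch_page` would issue the real request and `wait` would be set to `WAIT_SECONDS`.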

Regards,

Initially I was getting an error while fetching the logs, because I had made a small mistake in the filter for JobKey. I have fixed it now, and I'm getting the logs for each job via the API call you mentioned. Regarding the per-request log limit, can you advise on a solution? For some of our processes the logs can exceed 1,000 entries.

Ah sorry, I missed that nuance between the export limit and the job logs limit.

Just FYI on why these limits exist: it's precisely to discourage the type of thing you are doing. Doing a lot of data dumps via the API is costly for UiPath, and they want to push people towards Insights instead (not saying you shouldn't continue, btw, just adding context for why these rate limits are there).

Regarding the 100-logs-per-request limit, you have to use pagination and stagger the requests. If you have 1,000 logs, you need to get them in 10 batches spread across 10 minutes.

Not sure if that will be possible given your limitations.

I do however have another solution to propose.
Would you consider catching the logs at the source, as in when they are generated, so you can store them in whatever downstream location you are using? This way you skip the API and any rate limits altogether?

Hi,

If the number of job logs for a certain period exceeds the limit, split it into two periods and make a request for each.
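That bisection idea can be sketched as a small recursive helper. Everything here is illustrative: `count_in` and `fetch_in` stand for hypothetical calls (e.g. an OData `$count` query and a time-filtered log request), and the limit of 100 per request is an assumption from the thread:

```python
LIMIT = 100  # assumed max logs returned per request

def fetch_by_window(count_in, fetch_in, start, end):
    """If a window holds more logs than one request allows, split it in
    half and recurse; otherwise fetch it in a single request.

    count_in(start, end) -> int            (how many logs fall in [start, end))
    fetch_in(start, end) -> list of logs   (the actual request for that window)

    Caveat: if more than LIMIT logs share one identical timestamp, the
    window can never shrink below the limit, so a real implementation
    should add a minimum window size as a recursion guard.
    """
    if count_in(start, end) <= LIMIT:
        return fetch_in(start, end)
    mid = start + (end - start) / 2
    return (fetch_by_window(count_in, fetch_in, start, mid)
            + fetch_by_window(count_in, fetch_in, mid, end))
```

Using half-open windows `[start, end)` ensures a log on the split point is fetched exactly once.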

Regards,

Yes, I understand the API approach is discouraged, but we need more information, so I need the logs.

Catching the logs at the source is a good idea and it might save time, but the implementation could be complex and take a while. I might try it as future scope.

Thank you for the suggestion about splitting the time periods. I'm thinking it might be a bit tricky for some of our processes that generate a lot of logs in quick succession, even within one minute if we take one minute as the period to split on. That got me thinking about a similar approach.

I'm thinking I could instead take the timestamp of the last log from one request, use it to fetch the next set of logs, and keep iterating until I've retrieved all of them.
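That keyset-style approach could look like the sketch below. The HTTP side is abstracted into a `fetch_after` callable; the field name `TimeStamp` and a strict greater-than filter are assumptions, and the comments note the caveat about logs that share an identical timestamp (which connects to the quick-succession concern above):

```python
def fetch_with_cursor(fetch_after, page_size=100):
    """Keyset pagination: request the next page of logs strictly after the
    last timestamp seen, instead of using $skip.

    fetch_after(ts, top) -> list of log dicts sorted ascending by "TimeStamp";
    in a real run it would issue something like (hypothetical syntax):
        $filter=TimeStamp gt <ts>&$orderby=TimeStamp&$top=<top>

    Caveat: with a strict "gt" filter, logs sharing the exact timestamp of
    the page boundary can be skipped, so a real implementation may need a
    "ge" filter plus de-duplication on the log Id.
    """
    logs, cursor = [], None          # cursor None -> start from the beginning
    while True:
        page = fetch_after(cursor, page_size)
        if not page:
            return logs
        logs.extend(page)
        cursor = page[-1]["TimeStamp"]   # advance the cursor to the last log
        if len(page) < page_size:        # short page -> no more logs
            return logs
```

Compared to `$skip` paging, the cursor stays correct even if new logs arrive between requests.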

My Logging scope can help you do it easily

If you wrap this around your process, it will output a list of logs, which you can write to a text file or do whatever you want with.
I think that's easier than the pagination and trying to grab the logs via the API, given all the hurdles.

