Increase max page size through query to Orchestrator API

Hello guys,

I want to retrieve the queues and jobs from the Orchestrator cloud API, but I can only retrieve 1000 jobs and queues because of the 1000-entry limit on the $top parameter.

How can I increase the value of this parameter? I found something related to odata.maxpagesize, but I don’t know how to pass this through the query to the API.

This limitation does not exist in on-premises Orchestrator. Sorry, I have no way of validating this.

Since the limit is specified in the documentation, I'd assume an odata.maxpagesize value above 1000 would be ignored.

Edit: odata.maxpagesize seems to be ignored even when $top is not used.
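For reference, in OData v4 the odata.maxpagesize hint is sent as a Prefer request header, not a query parameter. A minimal sketch of what that request would look like (the URL segments and token are placeholders, and as noted above, cloud Orchestrator appears to ignore values above its 1000-entry cap):

```python
import urllib.request

# The page-size hint goes in a "Prefer: odata.maxpagesize=<n>" header.
# Org/tenant path and bearer token below are placeholders.
req = urllib.request.Request(
    "https://cloud.uipath.com/{org}/{tenant}/odata/Jobs",
    headers={
        "Authorization": "Bearer <access-token>",
        "Prefer": "odata.maxpagesize=5000",
    },
)
# urllib.request.urlopen(req) would send the request; per this thread,
# the service still caps each page at 1000 entries.
```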

You can loop over the request until @odata.count from the response is less than or equal to the sum of the $skip and $top parameters, incrementing $skip manually on each iteration.
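The loop above can be sketched roughly like this. The paging logic is separated from the HTTP call so it's easy to test; the endpoint URL, org/tenant path, and token in the helper are placeholders you'd replace with your own, and $count=true is included to make sure @odata.count comes back in each response:

```python
import json
import urllib.parse
import urllib.request

def fetch_all(fetch_page, page_size=1000):
    """Collect every entry from a paged OData endpoint via $top/$skip.

    fetch_page(top, skip) must return the parsed response dict,
    e.g. {"@odata.count": 2500, "value": [...]}.
    """
    items, skip = [], 0
    while True:
        body = fetch_page(page_size, skip)
        items.extend(body["value"])
        total = body.get("@odata.count", skip + len(body["value"]))
        skip += page_size
        # Stop once we've paged past the reported total, or the page is empty.
        if skip >= total or not body["value"]:
            break
    return items

def make_fetch_page(base_url, token):
    """Build a fetch_page callable for an Orchestrator OData endpoint.

    base_url (e.g. "https://cloud.uipath.com/{org}/{tenant}/odata/Jobs")
    and token are placeholders; fill in your own tenant URL and bearer token.
    """
    def fetch_page(top, skip):
        query = urllib.parse.urlencode(
            {"$top": top, "$skip": skip, "$count": "true"})
        req = urllib.request.Request(
            f"{base_url}?{query}",
            headers={"Authorization": f"Bearer {token}"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    return fetch_page
```

The empty-response case mentioned later in the thread is covered by the `not body["value"]` check, so the loop also terminates cleanly when a filter matches nothing.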


I believe this paging behavior is part of the OData specification, with UiPath limiting results to 1000 entries per page.

https://docs.uipath.com/orchestrator/reference/building-api-requests

Rate Limit for Results Displayed for Automation Cloud Orchestrator API Request

Note that, for Automation Cloud Orchestrator services, the results displayed by the API requests are limited to 1000 entries for each page.
You can use $top and $skip parameters in your requests to retrieve subsequent pages. For example, use the GET https://cloud.uipath.com/odata/RobotLogs?$top=1000&$skip=2000 request to retrieve the robot log entries between 2001 and 3000.


Thank you, guys. I applied the solution using the $top parameter and incrementing the $skip parameter. I also tested a case where the response would be empty, and noticed I could add a condition to check whether the response count is zero, and it worked. But now I have this issue:

Depth of pagination is limited in Elasticsearch by the max_result_window index settings. Make sure skip+ take is lower than 10000. Error 1015.
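One common way to stay under Elasticsearch's 10,000-entry pagination window is to narrow each query with a time filter, so no single query ever needs to page that deep. A sketch under stated assumptions: the log-time field name ("TimeStamp") is an assumption, so check your tenant's $metadata before using the filter string:

```python
from datetime import datetime, timedelta, timezone

def day_windows(start, end):
    """Yield (from, to) day-sized slices; query each slice separately
    so skip + top stays well below the 10,000 limit per slice."""
    cur = start
    while cur < end:
        nxt = min(cur + timedelta(days=1), end)
        yield cur, nxt
        cur = nxt

def odata_time_filter(frm, to, field="TimeStamp"):
    """Build an OData $filter expression for one time slice.

    The field name is a hypothetical default; verify it against the
    entity's $metadata document.
    """
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return f"{field} ge {frm.strftime(fmt)} and {field} lt {to.strftime(fmt)}"
```

Each generated filter string would then go into the $filter query parameter of the paged request, with $skip reset to 0 for every slice. For bulk archiving, though, the log-export approach recommended below is the supported route.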


Please see:

The API is for querying logs, meaning it’s optimized to help you find a specific set of logs based on some criteria. If you want to get a large amount of robot logs from cloud (for example for archiving purposes), you should use log exporting: Exporting logs

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.