Queue Improvements (Delete, Query)

i_completed
r_2017_1
orchestrator

#1

@sajal.chakraborty @richarddenton @nicolas.roussel @Trevor @AlexF @andraciorici @andrzej.kniola

Based on the feedback received from several customers, I've gathered some queue-related requests here.

  1. UiPath functionality to query the queue
    There is no way to interrogate queues in UiPath, e.g. get exception items, get duplicate items [this could be customized based on a date range: SQL queries to fetch queue items from the DB], etc.

  2. How do you get a queue item that is not in the “New” status?
    How do you delete a queue item from the queue?

  3. Business is asking for the ability to delete queue items based on certain criteria, for example queue items that end up Abandoned because of network issues.

  4. Ability to search by specific data

  5. A progress or custom field on the queue item that can be updated at any time during processing

etc

Based on this, I would centralize the requests like this:

  • have an identifier on queue item and to retrieve that item based on the identifier
  • search and retrieve an item by specific data
  • delete the item from the queue
  • bulk change the status of a queue item based on specific criteria

Here is how I think we should proceed with these.

  1. Progress is relatively easy to implement and will soon be under development.
    An activity (Set Transaction Item Progress; any ideas about the name?) will be made available and it will let you set a string item property. A nice-to-have would be an option to append (the default) or overwrite the current value. Set Transaction Status will also let you change the Progress field.

  2. In order to retrieve an item from the queue or to delete an item from the queue:
    2.1. The first step would be to let users do this manually from the Orchestrator web interface. A new queue item status (Deleted) is necessary. Business users will be able to filter and delete queue items in bulk. A nice-to-have would be the ability to change the status back to New from the web interface. Changing the priority and postponing an item should also be made available from the web.
    2.2. In order to retrieve an item that doesn’t have the New status, an identifier is needed. Taking into account that we also have requests to query an item based on specific data or to check for duplicates, I’m thinking this identifier should be a hash generated from that specific data. The identifier can be generated client side too.

  3. Query the queue client side based on specific criteria (filters). This will return an array of items that can then be processed one by one (by calling Delete Queue Item, Postpone Queue Item, Set Transaction Status, etc.).
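To make step 3 concrete, here is a minimal in-memory sketch of the proposed client-side flow. The names (`query_items`, `delete_item`, `set_status`) and the Deleted soft-delete status are assumptions for illustration, not the real activities:

```python
from dataclasses import dataclass, field

# Hypothetical in-memory model of the proposed client-side query API.

@dataclass
class QueueItem:
    id: int
    status: str                      # "New", "Failed", "Abandoned", "Deleted", ...
    specific_data: dict = field(default_factory=dict)

queue = [
    QueueItem(1, "New", {"invoice": "A-100"}),
    QueueItem(2, "Failed", {"invoice": "A-101"}),
    QueueItem(3, "Abandoned", {"invoice": "A-102"}),
]

def query_items(predicate):
    """Client-side filter: returns the array of matching items."""
    return [item for item in queue if predicate(item)]

def delete_item(item):
    item.status = "Deleted"          # soft delete via the proposed status

def set_status(item, status):
    item.status = status

# Fetch every Abandoned item and put it back for retry, one by one.
for item in query_items(lambda i: i.status == "Abandoned"):
    set_status(item, "New")

# Delete every Failed item.
for item in query_items(lambda i: i.status == "Failed"):
    delete_item(item)
```

The point of the sketch is only the shape of the flow: one query call returning an array, then per-item calls for delete/postpone/status changes.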

Waiting for feedback…


#2

Feedback:

2.1. Deleting queue items will be done manually from the web interface.
2.2. The item identifier will be handled by the customer via a CustomId field. There’s no need for us to control this via a hashing mechanism; the customer will control it by setting that field, for example to an ID from an external application.
The question is… shall we allow duplicates?
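To frame the duplicates question, a hedged sketch of the two possible policies for a customer-controlled CustomId; the `Queue` class, the `allow_duplicates` flag, and the error name are all hypothetical:

```python
# Illustrative sketch: a customer-supplied CustomId with a per-queue
# "allow duplicates" policy. All names here are invented for the example.

class DuplicateCustomIdError(Exception):
    pass

class Queue:
    def __init__(self, allow_duplicates=True):
        self.allow_duplicates = allow_duplicates
        self.items = []              # list of (custom_id, payload)

    def add_item(self, custom_id, payload):
        if not self.allow_duplicates and any(cid == custom_id for cid, _ in self.items):
            raise DuplicateCustomIdError(custom_id)
        self.items.append((custom_id, payload))

# Duplicates allowed: the same external id can repeat.
q1 = Queue(allow_duplicates=True)
q1.add_item("INV-42", {"amount": 10})
q1.add_item("INV-42", {"amount": 10})        # accepted

# Duplicates rejected: the second insert with the same CustomId fails.
q2 = Queue(allow_duplicates=False)
q2.add_item("INV-42", {"amount": 10})
rejected = False
try:
    q2.add_item("INV-42", {"amount": 10})
except DuplicateCustomIdError:
    rejected = True
```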


#3

Some additional feedback:

1 Progress field:
Definitely needed; it would elevate the ability to reprocess a transaction from a specific step without splitting processing across multiple queues.
In conjunction with the later points, it would be nice to also be able to select items (#2/#3) based on this content.

2.1 agreed.

2.2. CustomId is the common-sense approach we’d agree with. Duplicates should be allowed, or possibly made a per-queue setting, similar to auto-retry (from a functional perspective; it’s understandable that implementation-wise it’s a different beast).

3 How will the statuses of such a bulk query be handled?
In particular, in a multi-robot environment, how do we prevent overlap while making sure there isn’t a surge of abandoned items if a robot crashes during processing?
We were thinking of something functionally similar to how a service bus handles reserved messages: if an item ends up in the fetch query results, change its status to Reserved for X time, after which it will be reverted. There should also be a possibility of extending that time.
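A rough model of the reservation idea, using a manual clock instead of real timers; `fetch`, `renew`, and the Reserved status are illustrative names, not an existing API:

```python
# Sketch of a service-bus-style reservation ("peek-lock") for fetched items.
# A fetch marks the item Reserved until a deadline; if the robot crashes and
# never completes or renews, the item reverts to New instead of Abandoned.

class LeasedItem:
    def __init__(self, item_id):
        self.id = item_id
        self.status = "New"
        self.lease_expires = None

    def fetch(self, now, lease_seconds):
        self.status = "Reserved"
        self.lease_expires = now + lease_seconds

    def renew(self, now, lease_seconds):
        self.lease_expires = now + lease_seconds   # extend the reservation

    def expire_if_due(self, now):
        if self.status == "Reserved" and now >= self.lease_expires:
            self.status = "New"                    # revert instead of Abandoned

item = LeasedItem(1)
item.fetch(now=0, lease_seconds=30)    # lease until t=30
item.expire_if_due(now=10)             # still within the lease: stays Reserved
still_reserved = item.status
item.renew(now=25, lease_seconds=30)   # extend to t=55
item.expire_if_due(now=40)             # still held by the robot
item.expire_if_due(now=60)             # lease lapsed: revert to New
reverted = item.status
```

The key property is that a crashed robot simply stops renewing, so its items become available again without ending up Abandoned in bulk.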

We were also wondering: although this might not be the intended initial use case for bulk querying/cherry-picking transactions, what would you think about using it to optimize processing time for branching execution processes (where the systems used are known based on specific transaction content)?


#4

I think this can be handled at the workflow/process level right now: if there are n errors in a row, stop the execution.


#5

Yes, but the remaining items will need to be returned to the queue.
In case there are execution errors, that’s fine to handle, just as it would normally be for single items, but I’m thinking about a situation where either the robot crashes completely (unhandled exception, OOM, service crash, etc.) or the network connection goes down.
With the current rules, if the status of the items is not changed on the fetch query, they can be taken by multiple robots. If it is changed on the fetch query, a crash will leave all the unprocessed ones in WIP -> Abandoned status.


#6

If we are going to implement a CustomId field, what data type should it be in the database?

  • varchar(32)
  • int

?


#7

I’d vote for varchar; it should be as flexible as possible.
If filenames or combined indexes are used, 32 would not be enough. In some processes we use filenames ~40 characters long (the timestamp alone is 16).

64 maybe?


#8

Hi,

It would be ideal if there were an option to populate the queue via a bulk upload rather than one Add Queue Item activity per item…
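A small sketch of what a bulk-load helper could look like, assuming CSV input; the column names, the `build_bulk_payload` helper, and the payload shape are invented for the example:

```python
import csv
import io

# Hypothetical bulk upload: build all queue items from a CSV in one pass,
# so they can be submitted as a single batch instead of one add per row.

csv_text = """CustomId,Invoice,Amount
INV-1,A-100,10.50
INV-2,A-101,99.00
INV-3,A-102,7.25
"""

def build_bulk_payload(text):
    reader = csv.DictReader(io.StringIO(text))
    return [
        {
            "CustomId": row["CustomId"],
            "SpecificContent": {
                "Invoice": row["Invoice"],
                "Amount": float(row["Amount"]),
            },
        }
        for row in reader
    ]

payload = build_bulk_payload(csv_text)   # one batch for a single submit call
```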


#9

We will do a setting per queue (allow duplicates) in the 2017.1 SP.


#10

Hi,

Can you please tell me: if a transaction in the queue has failed, can we try that transaction again?


#11

Just added an updated comment on Set Transaction Progress