Thanks for getting back so fast.
First, I want to update my observations.
When a queue has “Unique Reference” enabled, “Bulk Add Queue Items” simply won’t work, even with a single-row DataTable as input. It fails with “Bad Request” (commit type AllOrNothing) or “Unable to Add” (commit type ProcessAllIndependently). A similar response can be observed when uploading manually through the Orchestrator website:
Accepting an index column (or a formula built from a column of the DataTable) as the reference input for “Bulk Add Queue Items” is first and foremost a way to make these two features compatible. Ideally, there should also be an option to change the behavior of “Unique Reference” so that a new item overwrites (or auto-deletes) the old one instead of being rejected.
Consider a routine case with the following sequence:
Dispatcher A adds queue items requesting updates to certain attributes of some entities; each entity has its own unique ID.
Performer B processes the queue at a fixed time of day and sends the relevant users (linked to each entity) confirmation emails.
Note that A can be triggered multiple times throughout the day and may therefore target the same entity more than once, with different request content each time. In case of duplication, the earlier request should be voided.
Also note that B needs to be scalable, so the queue items created in A must be decentralized.
Due to the limitation mentioned earlier, my current solution requires a combination of “Get Queue Items” (reference = entity ID) and “Delete Queue Items” before “Add Queue Item” in Dispatcher A, and I must loop through the input DataTable to create each queue item one by one.
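To make the workaround concrete, here is a minimal Python sketch of the per-row loop. The queue is just an in-memory list standing in for the Orchestrator queue, and the three functions are schematic stand-ins for the “Get Queue Items”, “Delete Queue Items”, and “Add Queue Item” activities; none of this is real Orchestrator API code.

```python
# Schematic model of the current Dispatcher A workaround.
# The queue is an in-memory list; each item has a reference (entity ID)
# and a content payload. These functions only mimic the activities named
# in the post; they are illustrative, not real UiPath calls.

queue = []  # each item: {"reference": str, "content": dict}

def get_queue_items(reference):
    """Mimics 'Get Queue Items' filtered by Reference."""
    return [item for item in queue if item["reference"] == reference]

def delete_queue_items(items):
    """Mimics 'Delete Queue Items'."""
    for item in items:
        queue.remove(item)

def add_queue_item(reference, content):
    """Mimics 'Add Queue Item' on a queue with Unique Reference enabled:
    a duplicate reference is rejected."""
    if get_queue_items(reference):
        raise ValueError(f"duplicate reference: {reference}")
    queue.append({"reference": reference, "content": content})

def dispatch(rows):
    """The per-row loop forced by the current limitation: for each input
    row, void any earlier request for the same entity, then add the new
    one."""
    for row in rows:
        stale = get_queue_items(row["entity_id"])
        delete_queue_items(stale)  # void the earlier request
        add_queue_item(row["entity_id"], {"attrs": row["attrs"]})

# Dispatcher A fires twice; the second run targets entity E1 again.
dispatch([{"entity_id": "E1", "attrs": "v1"}])
dispatch([{"entity_id": "E1", "attrs": "v2"},
          {"entity_id": "E2", "attrs": "v1"}])
```

After both runs the queue holds one item per entity, with E1 carrying the later request, which is exactly the voiding behavior described above, at the cost of three activity calls per row.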
If the suggested updates were implemented, all I would need is a single “Bulk Add Queue Items” with “Unique Reference” set to overwrite.
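For contrast, the proposed behavior collapses the whole loop into one bulk operation. The sketch below is purely hypothetical (this overwrite mode does not exist today); it models the queue as a dict keyed by reference, so a later row with the same entity ID simply replaces the earlier item instead of failing.

```python
# Hypothetical sketch of "Bulk Add Queue Items" with Unique Reference
# set to overwrite. All names here are illustrative assumptions, not a
# real UiPath API.

def bulk_add_overwrite(queue, rows):
    """queue: dict mapping reference -> content (keying by reference
    models the Unique Reference constraint).
    rows: DataTable-like list of dicts, with the entity ID used as the
    reference for each item."""
    for row in rows:
        # Overwrite semantics: the earlier request for this entity,
        # if any, is voided by the new one.
        queue[row["entity_id"]] = {"attrs": row["attrs"]}
    return queue

q = {}
bulk_add_overwrite(q, [{"entity_id": "E1", "attrs": "v1"}])
bulk_add_overwrite(q, [{"entity_id": "E1", "attrs": "v2"},
                       {"entity_id": "E2", "attrs": "v1"}])
```

The end state is the same as the three-activity workaround produces, but Dispatcher A makes a single call per batch instead of three calls per row.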