I’m building several processes that all do different things to the same case.
All in all, six different processes, each with its own queue.
They are too big and too segmented to all be a part of a single queue or process.
The important part is that they have to be performed sequentially and must not handle the same case at the same time.
I’m considering letting the processes feed each other: load the specific content for all of the processes at the beginning, then hand the payload off to the next queue at each step. The data handled in Process 1 dictates whether the “optional” Processes 4, 5 and 6 are executed.
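To make the hand-off idea concrete, here is a minimal Python sketch (illustrative only, not UiPath’s API — the process names and payload fields are my own assumptions) of Process 1 loading everything up front, flagging which optional processes should run, and forwarding the whole payload to the next queue:

```python
import json

def process_1(case_id):
    # Load everything the downstream processes will need (assumed fields).
    payload = {
        "case_id": case_id,
        "p2_data": {"amount": 100},
        "p3_data": {"status": "open"},
        "p4_data": {"rows": [[1, 2], [3, 4]]},
        # Process 1 decides which optional processes (4, 5, 6) run.
        "run_optional": {"p4": True, "p5": False, "p6": True},
    }
    return payload

def hand_off(payload, next_queue):
    # In UiPath terms this would be an Add Queue Item, with the payload
    # serialized into the new item's specific content.
    next_queue.append(json.dumps(payload))

# Chain: each process pops its item, does its work, and feeds the next queue.
queues = {name: [] for name in ["P2", "P3", "P4", "P5", "P6"]}
payload = process_1("CASE-001")
hand_off(payload, queues["P2"])
```

Because each case only ever lives in one queue at a time, this also gives you the sequential, no-overlap guarantee for free, as long as each process only ever pushes to the next queue after finishing its own work.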
Have you encountered any issues with having large specific contents in your queue items?
The information extracted in Process 1 will have to carry through the other processes until the end.
The specific content for Process 4 will therefore have to be part of the specific content for Processes 2 and 3 as well.
The specific content for each process will be a few tables (hopefully no larger than a few columns by 10 rows), plus a couple of strings, ints and datetimes.
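For a rough sense of scale (illustrative numbers only, assuming the tables are serialized to JSON inside the specific content): a 10-row table with a handful of columns plus a few scalar fields only comes to a couple of kilobytes, which is small compared to the payload sizes that usually cause queue trouble:

```python
import json

# A hypothetical per-process payload: one 10-row, 5-column table
# plus a couple of scalar fields.
table = [{f"col{c}": f"value_{r}_{c}" for c in range(5)} for r in range(10)]
payload = {
    "table": table,
    "reference": "CASE-001",
    "count": 42,
    "created": "2024-01-15T10:30:00",
}
size_bytes = len(json.dumps(payload).encode("utf-8"))
print(size_bytes)  # prints the serialized size in bytes
```

Even with six such payloads stacked into one item, the total would stay well within what a queue item can comfortably hold, so the bigger risk is probably readability and versioning of the payload schema rather than raw size.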