Hello, I have a case where I need to run two queries to pull customer info (the volume is 200,000 records and grows by roughly 100 per day), and I need to find matches for all of those customer records against a separate database based on certain match conditions. My question is: how do I do this in the most efficient way possible? Currently I was doing the intake process to load the customer items into a queue, then running the matches against the 2nd DB and making the matches the output for each transaction, but this will not work at such a large volume, as it takes 2-5 minutes per transaction.
Pull all the records from DB1 and DB2 and store them in DataTables, say dt1 and dt2.
Then use another DataTable variable, say dt_CommonRows.
Use an Assign activity and write:
dt_CommonRows = dt1.AsEnumerable().Intersect(dt2.AsEnumerable(), System.Data.DataRowComparer.Default).CopyToDataTable()
For the uncommon rows:
dt_UnCommonRows = dt1.AsEnumerable().Except(dt2.AsEnumerable(), System.Data.DataRowComparer.Default).CopyToDataTable()
Note that DataRowComparer.Default only treats rows as a match when every column is equal, and CopyToDataTable() throws an InvalidOperationException when the query returns no rows, so guard it with a .Any() check first.
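To make the semantics of those two Assign expressions concrete, here is a small Python sketch (not UiPath code) that mimics what Intersect and Except with DataRowComparer.Default do: a row only counts as common when every column value matches. The table contents and IDs are made up for illustration.

```python
# Each tuple stands in for a DataRow; comparing tuples compares all "columns".
dt1 = [("C001", "Alice"), ("C002", "Bob"), ("C003", "Carol")]
dt2 = [("C002", "Bob"), ("C003", "Caroline"), ("C004", "Dan")]

dt2_set = set(dt2)  # hash the second table once for fast membership tests

# Intersect: rows present in both tables with every column equal.
dt_common_rows = [row for row in dt1 if row in dt2_set]

# Except: rows of dt1 that have no identical row in dt2.
# ("C003", "Carol") is kept because the name column differs in dt2.
dt_uncommon_rows = [row for row in dt1 if row not in dt2_set]
```

Note that building the set once and probing it per row is also why the LINQ approach is fast: it is a hash-based comparison, not a nested loop over both tables.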
Hope this helps!
When dealing with large volumes of data, it's important to optimize the process for performance. If the customer data and the separate database are both SQL databases, you can use SQL queries to filter and join the data before bringing it into UiPath. This reduces the amount of data UiPath has to process and speeds up the matching.
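As a rough illustration of pushing the match into SQL, the sketch below uses an in-memory SQLite database as a stand-in for the two sources. The table and column names (customers, reference_db, customer_id) are assumptions; in practice the two tables would live in your actual server, and if they sit on different servers you would typically stage one table next to the other (or use a linked-server/ETL step) before joining.

```python
import sqlite3

# Stand-in database with two toy tables; real data would already be in SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers   (customer_id TEXT, name   TEXT);
    CREATE TABLE reference_db(customer_id TEXT, status TEXT);
    INSERT INTO customers    VALUES ('C001', 'Alice'), ('C002', 'Bob');
    INSERT INTO reference_db VALUES ('C002', 'active'), ('C003', 'closed');
""")

# One set-based JOIN returns every match at once, instead of one
# 2-5 minute lookup per queue transaction. An index on customer_id
# lets the engine do this efficiently even at 200k+ rows.
matches = conn.execute("""
    SELECT c.customer_id, c.name, r.status
    FROM customers AS c
    JOIN reference_db AS r ON r.customer_id = c.customer_id
""").fetchall()
```

The robot then only has to page through the already-matched result set, rather than running the match condition row by row.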
But what if the other data table has millions of rows?