Hi all,
I need some help working with a table of several thousand rows.
I have to read an Excel file with several thousand lines: a list of SAP IDs in a single column.
The number of lines in the file may change over time.
I can't do a single SAP search for the whole list; such a query could freeze and slow down SAP.
So I have to divide the master file into smaller tables (e.g. 1,000 rows at a time) and then run multiple extractions until the end of the list.
The list of IDs is in an Excel sheet, but it would not be a problem to import it into a DB if that simplifies the work.
I don’t know how to create this flow.
Can someone help me?
Can you please clarify your requirements?
If I understand correctly, you have a long list of SAP IDs in an Excel sheet and you want to search for all of them in SAP, i.e. a single search query containing several thousand search values.
Thanks and best regards
I have this ID list.
The number is almost always the same; it may grow by a few entries.
There aren't that many lines, just under 10k, but if I submit the whole list at once, SAP could stay saturated for a long time.
Perhaps because it is searching for activities from 2018 to today.
The idea was to create segmented searches.
e.g. 1,000 lines at a time, until the end of the list,
and of course, save each chained extraction into a single file.
financial_control.xaml (113,5 KB)
I'm attaching my current WF, but as it stands it doesn't perform well.
The problem is that SAP gets stuck, slows down on the network, and randomly goes into error.
Can anyone help me if it’s not too complex?
Thanks a Lot.
It is a transaction not enabled in the profile I am using.
I don't believe it is possible to request it, because it would involve costs that the company certainly does not intend to bear for this use case.
So we'd need to stick to the segmentation. Are you able to split the table into separate files and then run a search for each one in SAP? Or maybe read the input file, search 1,000 records at a time, and repeat the process?
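The second option can be sketched as a short script. This is a minimal sketch, not the workflow itself: it assumes the Excel column has been exported to a plain list of IDs, and `search_sap` is a hypothetical placeholder for the actual SAP extraction step (in UiPath this would be the SAP automation sequence run once per batch).

```python
def chunks(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def search_sap(batch):
    # Placeholder: run the SAP query for this batch of IDs
    # and return the extracted rows.
    return [(sap_id, "result") for sap_id in batch]

def process(ids, batch_size=1000):
    """Search the full ID list in batches and collect all results."""
    results = []
    for batch in chunks(ids, batch_size):
        results.extend(search_sap(batch))  # one SAP search per batch
    return results

if __name__ == "__main__":
    # Stand-in for the ~10k-row ID column read from the Excel file.
    ids = [f"ID{i:05d}" for i in range(9500)]
    all_results = process(ids)
    print(len(all_results))
```

Since the number of lines may change over time, `chunks` handles a final partial batch automatically, so the list length never has to be a multiple of 1,000. Appending every batch's output to one `results` collection matches the idea of saving the chained extractions into a single file.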