I need to use the import txt file option in SAP to upload data from a .txt file.
But here the file has around 2,500 rows of data, and that seems to be too much for SAP to process: after about an hour it throws an ABAP error. How do I make sure SAP works fine and does not get overloaded?
If splitting the file is one solution, how can I do that? Can anyone please explain in detail?
You can split the file into smaller chunks (200–500 rows each) and upload them one by one, so SAP processes each chunk separately.
This keeps each job small, avoids overload, and prevents ABAP dumps.
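The splitting itself can be done outside SAP before any upload. Here is a minimal Python sketch of the idea; the file name `input.txt`, the chunk size of 300, and the `partN.txt` naming are all assumptions for illustration, and the same logic maps onto UiPath's Read Text File / Write Text File activities:

```python
from pathlib import Path

def split_txt(source: str, chunk_size: int = 300, has_header: bool = True) -> list[str]:
    """Split a .txt export into part files of at most chunk_size data rows.

    Returns the list of part-file paths that were written.
    """
    lines = Path(source).read_text(encoding="utf-8").splitlines()
    # If the first line is a header, repeat it in every part file
    header, rows = (lines[0], lines[1:]) if has_header else ("", lines)

    parts = []
    for i in range(0, len(rows), chunk_size):
        part = Path(source).with_name(f"part{i // chunk_size + 1}.txt")
        chunk = ([header] if has_header else []) + rows[i:i + chunk_size]
        part.write_text("\n".join(chunk) + "\n", encoding="utf-8")
        parts.append(str(part))
    return parts
```

With 2,500 data rows and a chunk size of 300, this produces nine part files (eight full ones and a last one with 100 rows), each small enough for SAP to import without timing out.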
If you can change the approach, use a BAPI or RFC to send batches programmatically (more reliable and faster than the GUI import). Also check the SAP logs and ABAP limits (runtime, buffer sizes), and request a background job or increased timeouts if allowed.
@Chetan_Wagh
You can split the file like this:
Read Text File – get all the lines into an array
Then use For Each – so your file is split into batches
Inside the loop, use the Write Text File activity to save each batch
Then upload each file to SAP one by one
Use the Generate Data Table From Text activity to load the text into a DataTable and do the row splitting there.
Since it's a text file, generate a text file with the desired number of rows for each batch and then pass the split files to SAP.
Split the file into smaller batches and upload each batch separately, or avoid the GUI import and send the data via a BAPI/RFC or IDoc, which is more reliable for large volumes.

To split and process in UiPath: read the .txt into a string or an array of lines, choose a chunk size (for example 300–500 rows), create part files (part1.txt, part2.txt, …) by writing each chunk with Write Text File, then loop through the part files and call the SAP import for each file.

After each import, check the result/RETURN table, add a small delay (5–15 seconds) between uploads, and retry or log errors if a part fails. If possible, run the import as a background job in SAP or use a BAPI to push the data directly; this avoids GUI timeouts and ABAP buffer limits.
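The upload loop with delay and retry can be sketched like this. Note that `import_into_sap` is a hypothetical placeholder, not a real API: in practice it would be your UiPath sequence, a GUI-scripting call, or a BAPI call via an RFC library, returning success or failure per part file:

```python
import time

def import_into_sap(part_file: str) -> bool:
    """Hypothetical placeholder for the actual SAP import
    (GUI automation, BAPI/RFC call, etc.). Returns True on success."""
    return True

def upload_parts(part_files: list[str], delay: float = 10.0, max_retries: int = 3) -> list[str]:
    """Upload each part file, pausing between uploads and retrying failures.

    Returns the list of part files that still failed after all retries.
    """
    failed = []
    for part in part_files:
        for attempt in range(1, max_retries + 1):
            if import_into_sap(part):
                break
            time.sleep(delay)  # back off before retrying this part
        else:
            failed.append(part)  # all retries exhausted: log and move on
        time.sleep(delay)  # small pause so SAP finishes each job
    return failed
```

The pause between uploads is what keeps SAP from being hit with the next batch while it is still busy with the previous one; 5–15 seconds is a reasonable starting point, tuned to how long each import actually takes.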