Job stopped with an unexpected exit code 0xE0434352 (Windows suspend)

Hi all,

I get the "Job stopped with an unexpected exit code 0xE0434352" error when reading a huge text file, something like 2 million rows. It does not happen on small text files. Then I realized it's because Windows suspends the application: a green leaf icon appears next to the Robot process after some point, when the Robot occupies around 2.8 GB of RAM (my computer has 8 GB). I tried on different computers but still get the same error. I could not find a solution anywhere and this is crucial for my project, so I really need your help. I have attached example screenshots for better understanding.

Thanks in advance :slight_smile:

[screenshot of the error]

Hi
welcome to the UiPath community
I suspect this error is coming up while using an Excel file
– kindly use the EXCEL APPLICATION SCOPE activity when processing 2 million rows
– if you are already using it, fine
– but after that scope activity, use a KILL PROCESS activity with the ProcessName property set to “EXCEL”

Cheers @Dogu_Ciloglu_Alumni

Thanks for reply,

But no, it's coming after using the Read Text File and Generate Data Table activities.

[screenshot of the workflow]

Hey @Dogu_Ciloglu_Alumni

Can you share the workflow so that we can run it and see what is causing it?


I cannot share the data because it's confidential, but the workflow is just like the picture I shared in my last post, nothing else. I also tried CSV format and it gives the same error. I tried it on many different computers, including a server, and it's the same everywhere.

The text file is around 1 GB, so could I even upload the data here?

I'm also getting the same error when trying to read a text file of around 700 MB.

I am also facing the same issue with a text file of 522 MB. I don't know whether there is any restriction on the size of the text file when we read it via the "Read Text File" activity.

Please help us!

I am also facing this error with a file of only 250 KB.

Hi All,

I had a similar problem. This is a system issue and there is probably no UiPath configuration that sorts it out, but I worked around it using the following approach.

I had a CSV file with more than 40,000 rows and 15 columns. I broke the file into many sub files (taking the first 1,000 lines at a time and storing them in a data table), and then the problem was solved.
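
For what it's worth, the same chunking idea can be sketched outside UiPath. Here is a rough Python illustration (the file name, chunk size, and the process_chunk placeholder are assumptions for the example, not part of the original workflow) of reading a large CSV 1,000 lines at a time so the whole file is never held in memory at once:

```python
import csv

CHUNK_SIZE = 1000  # assumed chunk size, mirroring the 1,000-line sub files described above

def process_chunk(header, rows):
    """Placeholder for whatever the workflow does with each sub file / data table."""
    print(f"Processing {len(rows)} rows with columns: {header}")

# "big_input.csv" is a hypothetical file name, not from this thread
with open("big_input.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)          # read the column names once
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) >= CHUNK_SIZE:
            process_chunk(header, chunk)  # handle 1,000 rows, then release them
            chunk = []
    if chunk:                      # handle the last, partial chunk
        process_chunk(header, chunk)
```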

The best way to overcome this issue is to take a look at your process and think about minimizing the overhead involved from the system's perspective.

Hi,

What type of file is this? Roughly how many records or lines are there?

I am having this error too… with a large text file.

The same error occurs when reading data from a database (a table with no more than 50 rows).