PySpark in UiPath: ValueError: signal only works in main thread

Hi,

Does UiPath allow running PySpark scripts? I have a simple script that I am trying to invoke, but it fails with
“signal only works in main thread” … doesn’t UiPath allow multithreading?

Here is the error:

RemoteException wrapping System.InvalidOperationException: Error invoking Python method ---> RemoteException wrapping System.ServiceModel.FaultException`1[[System.ServiceModel.ExceptionDetail, System.ServiceModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]: ValueError: signal only works in main thread

Server stack trace:
at System.ServiceModel.Channels.ServiceChannel.ThrowIfFaultUnderstood(Message reply, MessageFault fault, String action, MessageVersion version, FaultConverter faultConverter)
at System.ServiceModel.Channels.ServiceChannel.HandleReply(ProxyOperationRuntime operation, ProxyRpc& rpc)
at System.ServiceModel.Channels.ServiceChannel.Call(String action, Boolean oneway, ProxyOperationRuntime operation, Object ins, Object outs, TimeSpan timeout)
at System.ServiceModel.Channels.ServiceChannelProxy.InvokeService(IMethodCallMessage methodCall, ProxyOperationRuntime operation)
at System.ServiceModel.Channels.ServiceChannelProxy.Invoke(IMessage message)

Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at UiPath.Python.Service.IPythonService.InvokeMethod(Guid instance, String method, IEnumerable`1 args)
at UiPath.Python.Impl.OutOfProcessEngine.<>c__DisplayClass12_0.<InvokeMethod>b__0()
at System.Threading.Tasks.Task`1.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at UiPath.Python.Activities.InvokeMethod.<InvokeMethod>d__16.MoveNext()
--- End of inner exception stack trace ---
at UiPath.Python.Activities.InvokeMethod.<InvokeMethod>d__16.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at UiPath.Shared.Activities.AsyncTaskCodeActivity.EndExecute(AsyncCodeActivityContext context, IAsyncResult result)
at System.Activities.AsyncCodeActivity.System.Activities.IAsyncCodeActivity.FinishExecution(AsyncCodeActivityContext context, IAsyncResult result)
at System.Activities.AsyncCodeActivity.CompleteAsyncCodeActivityData.CompleteAsyncCodeActivityWorkItem.Execute(ActivityExecutor executor, BookmarkManager bookmarkManager)
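
For context, here is a minimal, Spark-free reproduction of the same ValueError (assuming, as the trace suggests, that the UiPath activity invokes the Python method from a worker thread):

import signal
import threading

def install_handler():
    try:
        # Registering a signal handler is only allowed on the main thread;
        # on a worker thread CPython raises ValueError.
        signal.signal(signal.SIGINT, signal.SIG_DFL)
    except ValueError as e:
        # Prints the "signal only works in main thread" message
        # (exact wording varies by Python version).
        print(e)

t = threading.Thread(target=install_handler)
t.start()
t.join()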

I hope that’s not the case :slight_smile:

Can you try running the entire script in the same file once, @Veysel_Kocaman?

But the multithreading is not something I do intentionally… it’s part of PySpark… so, does UiPath not support Spark jobs?
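
In the meantime, one workaround I’m considering (just an untested sketch; spark_job.py is a hypothetical file that would hold the actual Spark code): launch the job in its own process, so pyspark installs its signal handler on that process’s main thread.

import subprocess
import sys

def main():
    # Run the PySpark job in a separate process; there the code executes
    # on the main thread, so signal.signal() is allowed again.
    # "spark_job.py" is a placeholder for the real script.
    result = subprocess.run(
        [sys.executable, "spark_job.py"],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

print(main())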

Hi @Veysel_Kocaman

Welcome to our UiPath Forum! :slight_smile:

Any chance you could share with us a sample project that reproduces the issue?
It would make it easier to debug.

Moreover, feel free to report the bug directly on our Community GitHub here:

The Python package is one of those that were ‘outsourced’ to our Community. A quick clarification: we still handle the QA and manage the changes, but you are free to look directly into the code of the Python package to see what could be wrong, if you choose to do so :slight_smile:


Hi @loginerror … does UiPath support PySpark or not? Here is the sample code… I hope you can help with that…

def main():
    from pyspark.sql import SparkSession
    import sparknlp

    # Starts (or reuses) a SparkSession configured for Spark NLP
    spark = sparknlp.start()

    clinical_notes = [
        ('cute chest pain, there are three',),
        ('This clinical case reports on a 66-year-old hypertensive patient',),
    ]

    # Build a single-column DataFrame named "text"
    data_original = spark.createDataFrame(clinical_notes).toDF("text")

    return data_original.take(1)

print(main())
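
A possible workaround to try (a hedged sketch, not officially supported): stub out signal.signal before the Spark session is created, since it is the SIGINT handler registration inside pyspark that trips over the worker thread. The trade-off is that Ctrl+C handling for the job is lost.

import signal

# Assumption: the ValueError comes from pyspark registering a SIGINT
# handler during startup. Replacing signal.signal with a no-op skips
# that call on the non-main thread (at the cost of Ctrl+C handling).
signal.signal = lambda signum, handler: None

import sparknlp

spark = sparknlp.start()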