UiPath POST request error after making ML Skill public

Hi,

We have built a custom ML model which works perfectly when the connection mode is Robot, but it gives a POST request error after making the skill public. We are using ML Services activities version 1.1.7.

Error extract:

MLSkill: Post request with accessUri https://ai-uipath.deskover.com/public/mlskills/… failed, error: Unexpected character encountered while parsing value: <. Path '', line 0, position 0.
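For context, a JSON parse failure on a `<` character at position 0 usually means the caller received an HTML body (for example, a server error page) where it expected JSON. A minimal Python sketch of the same failure mode (the HTML body below is made up for illustration):

```python
import json

# An HTML error page instead of a JSON payload: deserialization
# fails at the very first character, just like in the error above.
html_body = "<html><body>Bad Gateway</body></html>"

try:
    json.loads(html_body)
except json.JSONDecodeError as exc:
    print(exc)  # Expecting value: line 1 column 1 (char 0)
```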

Any help will be much appreciated.

Hi Madhavan,
What is the input you are sending? Also, are you selecting the correct type? Is it text?


Hi Jeremy,

We are sending JSON as a string.

Could you send us the workflow, by any chance?
Also, with the robot connection, are you trying it in the test window or running the workflow?

Hi,
Details on the workflow are attached.
The custom ML algorithm performs a fuzzy match of the name and address extracted from OCR against a supplier master dump. We send the JSON of both the OCR-extracted data and the supplier dump with a | separator, which is split apart inside the ML Skill. The output is a confidence score and the supplier ID.
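For illustration only, a rough sketch of this kind of fuzzy match in Python; the function name, sample supplier data, and the use of `difflib` are assumptions, not the actual ML Package code:

```python
import difflib

def best_supplier_match(ocr_text, suppliers):
    """Return (supplier_id, confidence) for the supplier whose
    name/address string is most similar to the OCR-extracted text."""
    best_id, best_score = None, 0.0
    for supplier_id, name_address in suppliers.items():
        score = difflib.SequenceMatcher(
            None, ocr_text.lower(), name_address.lower()
        ).ratio()
        if score > best_score:
            best_id, best_score = supplier_id, score
    return best_id, best_score

# Mirroring the workflow described above: OCR data and supplier dump
# arrive as one string joined with "|" and are split inside the skill.
suppliers = {1314: "Acme Supplies, 12 High Street"}
supplier_id, confidence = best_supplier_match(
    "ACME Suplies 12 High St", suppliers
)
print(supplier_id, confidence)
```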

We are able to get the expected results via this workflow from the same ML Skill when it is not public.

We have run it in both test and run mode; the results are the same in both.
fuzzymatch.xaml (23.6 KB)

I tried on my side, and it looks like I can reach the model without issue (by the way, I would recommend you change the API key once it's working). I don't have a correct input since I don't have the CSV file, but it looks more like an issue with the input itself. I'm surprised that the exact same input works with the robot connection.

Yes Jeremy, the same input works with the robot connection. I tried changing the API key, but it still failed. I also tried using another AI Center account and the results were the same; I am getting the same error.

Can you send me the input that works with the robot connection?

Hi Jeremy,

Please find attached the mocked-up data. This mocked-up data gives the expected results via the robot connection.
Test Data.zip (1.2 KB)

Hi Jeremy,
Just curious to know whether you were able to replicate the issue.

Hi Madhavan

Yes, I can replicate the issue; we are looking into it. I'll update the thread once we have identified the bug.

Hi Jeremy,
Do you have any update for us, or any tentative timelines, please?

Hi @madhavan
Still looking into it. Could you share the output returned by the model when you use the robot connection?

Hi @Jeremy_Tederry
"Supplier ID = 1314 Confidence = 0.279875515317555" — this is the response that we get while using the robot connection.

OK, we have identified the issue on our side and we will fix it. For public endpoints we handle only JSON strings, not arbitrary strings, so we apply json.loads to your output, and of course this fails on a plain string.
We'll fix that bug, but if you need it working now you have two workarounds:

  1. Use the robot connection
  2. Modify the ML Package to return a JSON string. In your Python code, something like:

import json

# inside the ML Package's predict method
result = dict()
result["Supplier ID"] = <supplier_id_value>
result["Confidence"] = <confidence_value>
return json.dumps(result)

In your workflow you will then get a JSON string as well, looking like:
{"Supplier ID": 1314, "Confidence": 0.279875515317555}
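To illustrate the difference, a small Python sketch (the sample values are taken from this thread; the endpoint's internal handling is paraphrased): json.loads fails on the plain robot-connection output, but succeeds once the package returns json.dumps output:

```python
import json

# Plain string returned by the original ML Package (robot-connection output)
plain_output = "Supplier ID = 1314 Confidence = 0.279875515317555"

# The public endpoint applies json.loads to the output,
# which fails on a plain (non-JSON) string
try:
    json.loads(plain_output)
except json.JSONDecodeError as exc:
    print(f"public endpoint fails: {exc}")

# Workaround 2: have the ML Package return a JSON string instead
result = {"Supplier ID": 1314, "Confidence": 0.279875515317555}
json_output = json.dumps(result)

# json.loads now succeeds, so the public endpoint can handle the output
parsed = json.loads(json_output)
print(parsed["Supplier ID"], parsed["Confidence"])
```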

I'll also update the thread once the fix is deployed.



Hi @Jeremy_Tederry
Thanks for the workaround. It worked.
