400 LLM output does not match the required output schema

I have an agent that, when published and run from Orchestrator, won’t return an array; instead I get a string. I have specified in both the system and user prompts that it must return a JSON array, but that isn’t helping.

It wasn’t always behaving this way, and I’ve looked at previous versions but can’t tell what changed. If anyone has experience with this error or a deeper understanding of the agent, please help me solve my issue.

Hi @Mina_Sanford

This behavior usually happens when the Agent response does not strictly follow the JSON format you defined, especially when running through Orchestrator. The test in Studio might work fine, but once deployed, the model can output additional text, explanations, or Markdown, which causes UiPath to receive it as a string instead of a JSON array.

Here are a few things to check:

  1. Even if you ask for JSON, the LLM may still add natural language. Add a strict system prompt like:

‘You must respond with ONLY a valid JSON array. No explanations, no markdown, no text outside the array. If you cannot generate data, return an empty array [].’
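The strict-prompt-plus-validation idea can be sketched like this (a minimal Python illustration, not the UiPath implementation; `call_agent` is a hypothetical stand-in for however you invoke the agent):

```python
import json

# Hypothetical sketch: pair a strict system prompt with a validation retry,
# so a non-JSON reply is rejected instead of silently treated as a string.
STRICT_SYSTEM_PROMPT = (
    "You must respond with ONLY a valid JSON array. No explanations, "
    "no markdown, no text outside the array. If you cannot generate data, "
    "return []."
)

def get_json_array(call_agent, user_prompt, max_retries=2):
    """Call the agent and retry until the reply parses as a JSON array."""
    for _ in range(max_retries + 1):
        raw = call_agent(STRICT_SYSTEM_PROMPT, user_prompt)
        try:
            parsed = json.loads(raw)
            if isinstance(parsed, list):
                return parsed
        except json.JSONDecodeError:
            pass  # invalid JSON, fall through and retry
    raise ValueError("Agent never returned a valid JSON array")
```

The same retry-on-invalid-output pattern can be rebuilt in the workflow itself with a Retry Scope around the agent call.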

2. When the job runs in Orchestrator, add a Log Message activity after the agent call and log the raw model output. Check for:

  • Extra text before or after the JSON
  • Missing closing brackets

If the JSON is invalid, UiPath treats it as a simple string.
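If the raw log shows extra prose or Markdown fences around the JSON, you can often salvage the array before parsing. A hedged Python sketch of that cleanup (the same logic could be done in the workflow with string manipulation before a Deserialize JSON activity):

```python
import json
import re

def extract_json_array(raw: str):
    """Try to recover a JSON array from model output that may be wrapped in
    markdown fences or surrounded by extra text. Returns None if nothing parses."""
    # Strip markdown code fences such as ```json ... ```
    cleaned = re.sub(r"```(?:json)?", "", raw).strip()
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError:
        pass
    # Fall back to the outermost [...] span in the text
    start, end = cleaned.find("["), cleaned.rfind("]")
    if start != -1 and end > start:
        try:
            return json.loads(cleaned[start:end + 1])
        except json.JSONDecodeError:
            return None
    return None
```

Note this cannot repair truly invalid JSON (e.g. a missing closing bracket from token truncation); it only strips the decoration around otherwise valid output.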

  3. In Orchestrator, the model may be hitting a shorter token limit or a different environment.

Try:

  • Temperature: 0 or 0.1
  • Max tokens: Increase by 30–50%

This makes JSON much more stable.
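As a concrete illustration of those settings (the field names below are hypothetical; the actual names depend on your model provider and the agent configuration UI):

```python
# Hypothetical generation settings for a structured-output agent call.
generation_config = {
    "temperature": 0.1,  # low temperature keeps the JSON structure deterministic
    "max_tokens": 2048,  # raised ~30-50% so long arrays are not cut off mid-bracket
}
```

A truncated response is a common cause of the "missing closing brackets" symptom above, which is why raising the token limit helps.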

  4. If your workflow returns a value back to Orchestrator/Trigger:
  • Make sure your out argument is a String (holding serialized JSON).
  • If you try to use array/object types directly, Orchestrator may not deserialize them correctly.

It’s safer to return a stringified JSON array, then deserialize later if needed.
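The round trip looks like this (a language-agnostic sketch in Python; in a UiPath workflow the equivalent would be serializing to a String out argument and using Deserialize JSON on the consumer side):

```python
import json

# The structured data the agent produced (sample values for illustration).
items = [{"id": 1, "name": "invoice"}, {"id": 2, "name": "receipt"}]

# Serialize to a plain string for the out argument...
out_argument = json.dumps(items)
assert isinstance(out_argument, str)

# ...and deserialize it later, wherever the array is actually needed.
restored = json.loads(out_argument)
assert restored == items
```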

Happy Automation

