Current Scenario (What works): I have successfully set up the connection in the Integration Service (Status is “Connected”). To verify the credentials and permissions, I performed a test in UiPath Studio (Desktop):
I used the activity “Generate Text Completion using Gemini”.
I selected the model gemini-2.0-flash.
Result: The activity executes successfully and generates a response. This confirms that the Service Account has the necessary Vertex AI User role and access to the GCP project.
Problem (What fails): I have generated a new OpenAI V1 Compliant LLM connector and tried to use it with a “Custom model” configuration within an Agent in Studio Web. However, I am now encountering the following error:

With the current OpenAI V1 Compliant configuration, the “Model” section now shows many “pre-loaded” models (which did not happen before), but even so, the Agent still fails.

Has anyone experienced this discrepancy where the connection works in standard activities but fails in Agent Builder? Does the “Custom Model” field require a specific syntax (e.g., the full path publishers/google/models/...), or are there stricter IAM requirements (such as Vertex AI Viewer) for the Agent validation process compared to standard runtime execution?
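For anyone testing the syntax question above: one plausible thing to try in the “Custom Model” field is the fully qualified Vertex AI publisher-model resource path rather than the bare model ID. This is only a hypothesis to verify, not a confirmed requirement of Agent Builder; the project ID and location below are placeholders. A minimal sketch of the two forms:

```python
# Sketch of the two model-identifier forms to try in the "Custom Model"
# field (assumption: Agent Builder may expect the fully qualified path).
# project_id / location / model_id values are hypothetical placeholders.

def vertex_model_path(project_id: str, location: str, model_id: str) -> str:
    """Build the fully qualified Vertex AI resource path for a Google
    publisher model, e.g. projects/.../publishers/google/models/..."""
    return (
        f"projects/{project_id}/locations/{location}/"
        f"publishers/google/models/{model_id}"
    )

# Short form -- what worked in the Studio "Generate Text Completion" activity:
short_form = "gemini-2.0-flash"

# Full form -- the publishers/google/models/... syntax asked about above:
full_form = vertex_model_path("my-gcp-project", "us-central1", "gemini-2.0-flash")
print(full_form)
# → projects/my-gcp-project/locations/us-central1/publishers/google/models/gemini-2.0-flash
```

If the full path also fails, that would point toward an IAM or validation-time issue rather than a syntax one.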