OpenRouter provides a unified API that is supposed to be OpenAI-compatible, so I am using the spring-ai-openai-spring-boot-starter to get tools and MCP integrations working against it.
I am testing with the Filesystem MCP Server. The integration works great with the openai/gpt-4o-mini model, but with the google/gemini-2.5-pro-exp-03-25:free model, which should also support tools (assuming OpenRouter's API really is unified and the model supports tool calling), I get this warning:
2025-03-26T22:04:13.586+01:00 WARN 39241 --- [ai-demo] [ main] o.s.ai.openai.OpenAiChatModel : No choices returned for prompt: Prompt{messages=[UserMessage{content='What folders does the current active user have on the desktop?
Your response should be in JSON format.
Do not include any explanations, only provide a RFC8259 compliant JSON response following this format without deviation.
Do not include markdown code blocks in your response.
Remove the ```json markdown from the output.
Here is the JSON Schema instance your output must adhere to:
```{
"$schema" : ";,
"type" : "object",
"properties" : {
"desktopLocation" : {
"type" : "string"
},
"files" : {
"type" : "array",
"items" : {
"type" : "string"
}
}
},
"additionalProperties" : false
}```
', properties={messageType=USER}, messageType=USER}], modelOptions=OpenAiChatOptions: {"streamUsage":false,"model":"google/gemini-2.5-pro-exp-03-25:free","temperature":0.7}}
And the result is null. Here is how I am calling the API:
@Configuration
class Testing {

    @Bean
    fun demo(chat: ChatClient.Builder, provider: ToolCallbackProvider): ApplicationRunner {
        return ApplicationRunner {
            println(
                chat
                    .build()
                    .prompt("What folders does the current active user have on the desktop?")
                    .tools(provider)
                    .call()
                    .entity(Files::class.java)
            )
        }
    }

    data class Files(
        val desktopLocation: String,
        val files: List<String>
    )
}
And my application.properties:
spring.ai.openai.chat.options.model=google/gemini-2.5-pro-exp-03-25:free
spring.ai.openai.base-url=
spring.ai.openai.api-key=MY_TOKEN
spring.ai.mcp.client.stdio.servers-configuration=classpath:fs-config.json
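For reference, my fs-config.json follows the standard stdio server format that the Spring AI MCP client consumes (the path below is a placeholder, not my actual directory):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/me/Desktop"
      ]
    }
  }
}
```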
If I comment out the last line of the properties file, the call completes normally, but without the MCP tools the results are hallucinated. I expected this to work the same way as it does with the openai/gpt-4o-mini model, since both models should support tools.
Is there a way to configure Spring AI to use tools with a model other than openai/gpt-4o-mini?
-- EDIT --
After some more digging, I managed to surface this error:
raw -> {
"error": {
"code": 400,
"message": "* GenerateContentRequest.tools[0].function_declarations[10].parameters.properties: should be non-empty for OBJECT type\n",
"status": "INVALID_ARGUMENT"
}
}
Is this a spring-ai issue or an issue on the OpenRouter side?
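For what it's worth, here is a minimal, self-contained sketch (plain Kotlin, no Spring) of the workaround check I am considering: detecting tool input schemas that declare an object type with no properties, which is what the INVALID_ARGUMENT message points at. The schema strings below are made-up examples, not actual output from the MCP server.

```kotlin
// Sketch of a predicate for spotting the schemas Gemini rejects:
// an OBJECT type whose "properties" map is empty (or missing entirely).
fun hasEmptyObjectProperties(schema: String): Boolean {
    // Normalize away whitespace so the check does not depend on formatting.
    val compact = schema.replace(Regex("\\s"), "")
    return compact.contains("\"type\":\"object\"") &&
            (compact.contains("\"properties\":{}") || !compact.contains("\"properties\""))
}

fun main() {
    // Hypothetical parameterless tool, e.g. a "list allowed directories" call:
    val noParams = """{ "type" : "object", "properties" : { } }"""
    // Hypothetical tool with a real parameter:
    val withParams = """{ "type" : "object", "properties" : { "path" : { "type" : "string" } } }"""

    println(hasEmptyObjectProperties(noParams))   // true  -> would trigger the 400
    println(hasEmptyObjectProperties(withParams)) // false -> accepted
}
```

This only identifies which tool declarations to exclude per request; the real fix would presumably have to land either on the schema-generation side (Spring AI or the MCP server emitting an empty properties object for parameterless tools) or in OpenRouter's translation layer to Gemini.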