I am using LangGraph to build an agent that extracts the content of a PDF file. I have a Together AI API key and want to use an LLM from there.
However, when using create_react_agent(llm, tools) from langgraph, the model (mistralai/Mistral-Small-24B-Instruct-2501) never actually calls any tools, even though the LangSmith trace shows the model's output specifying a tool call.
Here is the code that creates the LLM and builds the agent. It is worth noting that the same setup works fine with OpenRouter models and ChatOpenAI. How can I solve this? Thanks.
from langchain_together import ChatTogether
from langgraph.prebuilt import create_react_agent
llm = ChatTogether(
    model=model,  # mistralai/Mistral-Small-24B-Instruct-2501
    temperature=0,
    api_key=api_key,
)
inputs_tools = [extract_pdf, update_inputs]
agent = create_react_agent(
    llm, tools=inputs_tools, prompt=inputs_retriever_prompt(), state_schema=State
)