
python - Issue with SchemaLLMPathExtractor and Ollama Models for Relationship Extraction in LangChain - Stack Overflow


I am working on relationship extraction using the PropertyGraphIndex and Neo4jPropertyGraphStore classes from LlamaIndex (the code below uses llama_index, not LangChain), following the approach outlined in the official property graph guide. My goal is to restrict the nodes and relationships being extracted by using SchemaLLMPathExtractor.

Here's the issue:

  • When I use local models like Llama 3.1 or Mistral through Ollama, no relationships get extracted when SchemaLLMPathExtractor is applied.

  • If I remove the SchemaLLMPathExtractor, it extracts a large number of relationships, but they are not constrained as needed.

  • Interestingly, when I switch to OpenAI's API instead of Ollama, everything works as expected even with SchemaLLMPathExtractor.

I’ve followed the instructions from the LlamaIndex documentation and ensured the local models are set up correctly. Here are the steps I’ve tried:

  1. Verified the compatibility of Ollama with the LangChain version I’m using.

  2. Double-checked the schema and input configurations to align with the examples provided in the guide.

  3. Tested with different prompts to ensure that the issue is not with the input format.

  4. Ensured the local Ollama models are set up correctly, as they produce outputs without SchemaLLMPathExtractor.

  5. Checked the schema definitions, ensuring they match the example in the documentation.

Still, I’m facing the same issue with Ollama. There are no errors; the output simply doesn't include any relationships when the SchemaLLMPathExtractor is used.
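To make the failure mode concrete, my working theory is that with strict=True, any extracted triple whose entity or relation labels don't exactly match the schema is validated away, so a local model that emits free-form labels would end up with zero relationships. Here is a toy sketch of that kind of filtering (my own illustration, not the library's actual code):

```python
# Toy illustration of schema-based triple validation (NOT LlamaIndex
# internals): a triple survives only if both entity types and the
# relation type are in the allowed sets.
def filter_triples(triples, allowed_entities, allowed_relations):
    kept = []
    for head_type, relation, tail_type in triples:
        if (head_type in allowed_entities
                and tail_type in allowed_entities
                and relation in allowed_relations):
            kept.append((head_type, relation, tail_type))
    return kept

extracted = [
    ("PERSON", "WORKS_AT", "ORGANIZATION"),  # matches the schema -> kept
    ("person", "works at", "company"),       # free-form labels -> dropped
]
valid = filter_triples(extracted, {"PERSON", "ORGANIZATION"}, {"WORKS_AT"})
```

If something like this is happening, the local models may simply not be reproducing the schema's labels exactly, while gpt-4o-mini does.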

Key Details:

  • LangChain Version: 0.2.14

  • Ollama Version: 0.3.9

  • Local Models Tried: Llama 3.1, Mistral

  • OpenAI Model Used: gpt-4o-mini
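For context, the schema variables and constants referenced in the snippets below are defined along these lines; the labels and values here are simplified placeholders for this post, not my actual domain schema:

```python
from typing import Literal

# Allowed node and relation types (simplified placeholders; the real
# schema has more labels). SchemaLLMPathExtractor takes Literal types here.
entities = Literal["PERSON", "ORGANIZATION", "LOCATION"]
relations = Literal["WORKS_AT", "LOCATED_IN"]

# Which relation types each entity type may participate in.
validation_schema = {
    "PERSON": ["WORKS_AT"],
    "ORGANIZATION": ["LOCATED_IN"],
    "LOCATION": [],
}

# Model constants used in the OpenAI variant (placeholder values).
LLM_MODEL = "gpt-4o-mini"
EMBEDDING_MODEL = "text-embedding-3-small"
TEMPERATURE = 0.0
```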

Code for OpenAI:

# Required imports (these classes come from the llama_index packages)
from llama_index.core import PropertyGraphIndex
from llama_index.core.indices.property_graph import SchemaLLMPathExtractor
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.graph_stores.neo4j import Neo4jPropertyGraphStore
from llama_index.llms.openai import OpenAI

# Creating an instance of Neo4jPropertyGraphStore with environment configuration
graph_store = Neo4jPropertyGraphStore(
    username=NEO4J_USER,
    password=NEO4J_PASS,
    url=NEO4J_URL,
    database=NEO4J_DB_NAME
)
vec_store = None

# Creating an instance of SchemaLLMPathExtractor with OpenAI model and schema configuration
kg_extractor = SchemaLLMPathExtractor(
    llm=OpenAI(model=LLM_MODEL, temperature=TEMPERATURE),
    possible_entities=entities,
    possible_relations=relations,
    kg_validation_schema=validation_schema,
    strict=True,
)

# Creating an instance of PropertyGraphIndex with documents and environment configuration
index = PropertyGraphIndex.from_documents(
    documents,
    embed_model=OpenAIEmbedding(model_name=EMBEDDING_MODEL),
    show_progress=True,
    kg_extractors=[kg_extractor],
    property_graph_store=graph_store,
    vector_store=vec_store,
)

Code for Ollama:

# Required imports (same llama_index modules, plus the Ollama integrations)
from llama_index.core import PropertyGraphIndex
from llama_index.core.indices.property_graph import SchemaLLMPathExtractor
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.graph_stores.neo4j import Neo4jPropertyGraphStore
from llama_index.llms.ollama import Ollama

# Creating an instance of Neo4jPropertyGraphStore with environment configuration
graph_store = Neo4jPropertyGraphStore(
    username=NEO4J_USER,
    password=NEO4J_PASS,
    url=NEO4J_URL,
    database=NEO4J_DB_NAME
)
vec_store = None

# Creating an instance of SchemaLLMPathExtractor with Ollama model and schema configuration
kg_extractor = SchemaLLMPathExtractor(
    llm=Ollama(model="mistral:latest"),
    possible_entities=entities,
    possible_relations=relations,
    kg_validation_schema=validation_schema,
    strict=True,
)

# Creating an instance of PropertyGraphIndex with documents and environment configuration
index = PropertyGraphIndex.from_documents(
    documents,
    embed_model=OllamaEmbedding(model_name="mistral:latest", base_url="http://localhost:11434"),
    show_progress=True,
    kg_extractors=[kg_extractor],
    property_graph_store=graph_store,
    vector_store=vec_store,
)
  • Has anyone encountered a similar issue with Ollama and SchemaLLMPathExtractor?

  • Is there any specific configuration or adjustment required to make this setup work?

Any insights or suggestions would be highly appreciated!
