
docker - python script causing error - NameError: name 'Python' is not defined - Stack Overflow


I'm running a simple Streamlit + Ollama + PandasAI app as an AI chatbot.

After submitting a query in the chat, it fails with this error:

[ERROR] Failed with error: Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/pandasai/pipelines/chat/code_execution.py", line 87, in execute
    result = self.execute_code(input, code_context)
  File "/usr/local/lib/python3.10/site-packages/pandasai/pipelines/chat/code_execution.py", line 172, in execute_code
    exec(code, environment)
  File "<string>", line 1, in <module>
NameError: name 'Python' is not defined
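From the traceback, the failure happens inside pandasai's `exec(code, environment)` call, on line 1 of the generated code, which suggests the very first line the model returned is the bare word `Python` (e.g. a leaked language tag) rather than valid code. A minimal sketch of the same failure mode (the `generated` string is a hypothetical model reply, not captured from my app):

```python
# Reproduce the failure: exec-ing code whose first line is the bare
# word "Python", as when an LLM leaks a language tag into its reply.
generated = "Python\nresult = 1 + 1"  # hypothetical model output
try:
    exec(generated, {})
except NameError as e:
    print(e)  # name 'Python' is not defined
```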

I'm not sure what's happening here, so here are my scripts.

docker-compose.yml

version: '3.8'

services:
  app:
    build: .
    container_name: streamlit_app
    ports:
      - "8501:8501"
    volumes:
      - .:/app
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama
    entrypoint: ["/bin/sh", "-c", "ollama serve"]

volumes:
  ollama_data:

Dockerfile

FROM python:3.10

WORKDIR /app

COPY requirements.txt .
COPY app.py .

RUN pip install --no-cache-dir -r requirements.txt

CMD ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0"]

app.py

    from pandasai.llm.local_llm import LocalLLM  # LocalLLM wrapper for the local Llama 3 model served by Ollama
    import streamlit as st
    import pandas as pd  # Pandas for data manipulation
    from pandasai import SmartDataframe  # SmartDataframe for querying data with an LLM

    # Function to chat with CSV data
    def chat_with_csv(df, query):
        llm = LocalLLM(
            api_base="http://ollama:11434/v1",
            model="llama3",
        )
        pandas_ai = SmartDataframe(df, config={"llm": llm})
        result = pandas_ai.chat(query)
        return result

    st.set_page_config(layout='wide')
    st.title("Multiple-CSV ChatApp powered by LLM")

    # Upload multiple CSV files
    input_csvs = st.sidebar.file_uploader("Upload your CSV files", type=['csv'], accept_multiple_files=True)

    # Check if CSV files are uploaded
    if input_csvs:
        # Select one of the uploaded CSV files from a dropdown menu
        selected_file = st.selectbox("Select a CSV file", [file.name for file in input_csvs])
        selected_index = [file.name for file in input_csvs].index(selected_file)

        # Load and display the selected CSV file
        st.info("CSV uploaded successfully")
        data = pd.read_csv(input_csvs[selected_index])
        st.dataframe(data.head(3), use_container_width=True)

        # Enter the query for analysis
        st.info("Chat Below")
        input_text = st.text_area("Enter the query")

        # Perform analysis
        if input_text:
            if st.button("Chat with csv"):
                st.info("Your Query: " + input_text)
                result = chat_with_csv(data, input_text)
                st.success(result)
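One thing I considered (I'm not sure it's the right fix) is that pandasai executes whatever code string the LLM returns, so if llama3 prefixes its reply with a language tag or a markdown fence, `exec` trips on that first token. A generic sanitizer sketch for such replies — `strip_code_fences` is a hypothetical helper I wrote for illustration, not part of the pandasai API:

```python
import re

def strip_code_fences(reply: str) -> str:
    """Remove a leading language tag or markdown fence that some
    local models prepend to generated code (hypothetical helper)."""
    reply = reply.strip()
    # Drop markdown fences like ```python ... ```
    reply = re.sub(r"^```[a-zA-Z]*\n", "", reply)
    reply = re.sub(r"\n```$", "", reply)
    # Drop a bare leading "Python" / "python" line
    lines = reply.splitlines()
    if lines and lines[0].strip().lower() in ("python", "python:"):
        lines = lines[1:]
    return "\n".join(lines)

print(strip_code_fences("Python\nresult = 1 + 1"))   # → result = 1 + 1
print(strip_code_fences("```python\nx = 1\n```"))    # → x = 1
```

I don't know where in the pandasai pipeline such a hook would go (or whether a config option already covers this), which is part of what I'm asking.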

Can anyone help?
