
Deploying Dify Locally on Windows


I. Install Docker Desktop

Download Docker Desktop and install it (no special configuration is needed, and you do not have to log in).

II. Download the Dify source with Git

git clone https://github.com/langgenius/dify.git

III. Deploy PostgreSQL / Redis / Weaviate

  1. Start the Docker Desktop application you installed earlier.

  2. Open the dify source folder, hold Shift and right-click inside it, open a Linux-style shell (e.g. Git Bash), then run:

    cd docker
    cp middleware.env.example middleware.env
    docker compose -f docker-compose.middleware.yaml up -d
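
Once the command finishes, you can confirm the middleware containers are actually running with Compose's status command; a minimal optional check:

    # List the containers started from the middleware compose file;
    # their STATUS column should read "Up" / "running".
    docker compose -f docker-compose.middleware.yaml ps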
    

If the third command fails (the images cannot be pulled):

Workarounds:

  1. Use a proxy/VPN on your own machine and pull the images again.

  2. Use a company server that already has Docker set up (if not, pull the images on that server first), then point your local Docker at the company server and pull from there. The steps are below; a consolidated sketch follows this list.

    1. On the company server, pull the required images first (the images referenced in docker-compose.middleware.yaml):

      postgres:15-alpine
      redis:6-alpine
      langgenius/dify-sandbox:0.2.10
      ubuntu/squid:latest
      semitechnologies/weaviate:1.19.0
      
    2. Open Docker Desktop, go to Settings, and add the private server address (here 192.168.0.110:88, the company server) to the daemon configuration:

      "insecure-registries": [
        "192.168.0.110:88"
      ],
      
    3. Pull the five images that are now hosted on the company server:

      docker pull 192.168.0.110:88/library/postgres
      # ...and so on for the remaining four images
      
    4. Re-tag the five images (the compose file looks them up by their original names):

      docker tag <source-image>[:<tag>] <target-name>:<tag>
      docker tag 192.168.0.110:88/library/postgres postgres:15-alpine
      
    5. Start the Docker containers again:

      docker compose -f docker-compose.middleware.yaml up -d
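
Putting the workaround together, here is a minimal sketch of the full round trip through the private registry. It assumes the company server exposes a Docker registry at 192.168.0.110:88 and that images are pushed under the library/ path with their original tags; adjust the paths and tags to match your registry layout.

    # --- On the company server: pull the upstream images, then push them into the private registry ---
    for img in postgres:15-alpine redis:6-alpine langgenius/dify-sandbox:0.2.10 ubuntu/squid:latest semitechnologies/weaviate:1.19.0; do
        docker pull "$img"
        docker tag  "$img" "192.168.0.110:88/library/${img##*/}"
        docker push "192.168.0.110:88/library/${img##*/}"
    done

    # --- On your own machine (after adding 192.168.0.110:88 to insecure-registries): pull, then restore the original names ---
    for img in postgres:15-alpine redis:6-alpine langgenius/dify-sandbox:0.2.10 ubuntu/squid:latest semitechnologies/weaviate:1.19.0; do
        docker pull "192.168.0.110:88/library/${img##*/}"
        docker tag  "192.168.0.110:88/library/${img##*/}" "$img"
    done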
      

IV. Deploy the backend API and Worker services

  1. Download and install Python 3.11 or 3.12 (3.13 is not supported).

  2. Download and install Anaconda.

  3. Open the dify source folder, hold Shift and right-click inside it, and open a Linux-style shell (e.g. Git Bash).

  4. Change into the api folder:

    cd api
    
  5. Create the .env file from the template:

    cp .env.example .env
    
  6. Generate a new secret key in the .env file:

    awk -v key="$(openssl rand -base64 42)" '/^SECRET_KEY=/ {sub(/=.*/, "=" key)} 1' .env > temp_env && mv temp_env .env
    
  7. Install the dependencies (this step often hits network timeouts; rerun it as many times as needed until everything installs without errors before moving on):

    poetry shell
    poetry install
    
  8. Open a cmd window in the current directory and create a virtual environment:

    conda create --name dify_env python=3.11
    
  9. Activate the virtual environment:

    conda activate dify_env
    
  10. Run the database migration (if it fails, go back and keep installing the dependencies from step 7, or install any missing packages individually):

    poetry shell
    flask db upgrade

  11. Start the API service (a couple of quick checks are sketched at the end of this section):

    flask run --host 0.0.0.0 --port=5001 --debug

    If you see output like the following, the API service started successfully:

    * Debug mode: on
    INFO:werkzeug:WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
    * Running on all addresses (0.0.0.0)
    * Running on http://127.0.0.1:5001
    INFO:werkzeug:Press CTRL+C to quit
    INFO:werkzeug: * Restarting with stat
    WARNING:werkzeug: * Debugger is active!
    INFO:werkzeug: * Debugger PIN: 695-801-919
  12. Open another cmd window in the current directory and activate the virtual environment:

    conda activate dify_env

  13. Start the Worker service:

    celery -A app.celery worker -P solo --without-gossip --without-mingle -Q dataset,generation,mail,ops_trace --loglevel INFO

    If you see output like the following, the Worker started successfully:

    -------------- celery@TAKATOST.lan v5.2.7 (dawn-chorus)
    --- ***** -----
    -- ******* ---- macOS-10.16-x86_64-i386-64bit 2023-07-31 12:58:08
    - *** --- * ---
    - ** ---------- [config]
    - ** ---------- .> app:         app:0x7fb568572a10
    - ** ---------- .> transport:   redis://:**@localhost:6379/1
    - ** ---------- .> results:     postgresql://postgres:**@localhost:5432/dify
    - *** --- * --- .> concurrency: 1 (gevent)
    -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
    --- ***** -----
    -------------- [queues]
    .> dataset          exchange=dataset(direct) key=dataset
    .> generation       exchange=generation(direct) key=generation
    .> mail             exchange=mail(direct) key=mail

    [tasks]
    . tasks.add_document_to_index_task.add_document_to_index_task
    . tasks.clean_dataset_task.clean_dataset_task
    . tasks.clean_document_task.clean_document_task
    . tasks.clean_notion_document_task.clean_notion_document_task
    . tasks.create_segment_to_index_task.create_segment_to_index_task
    . tasks.deal_dataset_vector_index_task.deal_dataset_vector_index_task
    . tasks.document_indexing_sync_task.document_indexing_sync_task
    . tasks.document_indexing_task.document_indexing_task
    . tasks.document_indexing_update_task.document_indexing_update_task
    . tasks.enable_segment_to_index_task.enable_segment_to_index_task
    . tasks.generate_conversation_summary_task.generate_conversation_summary_task
    . tasks.mail_invite_member_task.send_invite_member_mail_task
    . tasks.remove_document_from_index_task.remove_document_from_index_task
    . tasks.remove_segment_from_index_task.remove_segment_from_index_task
    . tasks.update_segment_index_task.update_segment_index_task
    . tasks.update_segment_keyword_index_task.update_segment_keyword_index_task

    [2023-07-31 12:58:08,831: INFO/MainProcess] Connected to redis://:**@localhost:6379/1
    [2023-07-31 12:58:08,840: INFO/MainProcess] mingle: searching for neighbors
    [2023-07-31 12:58:09,873: INFO/MainProcess] mingle: all alone
    [2023-07-31 12:58:09,886: INFO/MainProcess] pidbox: Connected to redis://:**@localhost:6379/1.
    [2023-07-31 12:58:09,890: INFO/MainProcess] celery@TAKATOST.lan ready.
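
With the API and Worker both running, a couple of quick sanity checks can be run from another Git Bash window in the api folder; these are optional spot checks, not part of the official setup:

    # Confirm step 6 actually wrote a SECRET_KEY into .env
    grep "^SECRET_KEY=" .env

    # Confirm the Flask API is listening on port 5001
    # (any HTTP status code printed here means the server is up).
    curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:5001/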

V. Deploy the frontend

  1. Open the dify source folder, hold Shift and right-click inside it, and open a Linux-style shell (e.g. Git Bash).

  2. Change into the web directory:

    cd web
    
  3. Install the dependencies:

    npm install
    
  4. Create the .env.local file from the template:

    cp .env.example .env.local
    
  5. Build the frontend:

    npm run build
    
  6. Copy the build artifacts (this must be repeated after every build):

    cp -r .next/static .next/standalone/.next/static && cp -r public .next/standalone/public
    
  7. In package.json, add a new script entry:

    "s": "cross-env PORT=$npm_config_port HOSTNAME=$npm_config_host node .next/standalone/server.js",
    
  8. Start the frontend service (an alternative invocation is sketched at the end of this section):

    npm run s
    

    If you see output like the following, the frontend started successfully:

    ready - started server on 0.0.0.0:3000, url: http://localhost:3000
    warn  - You have enabled experimental feature (appDir) in next.config.js.
    warn  - Experimental features are not covered by semver, and may cause unexpected or broken application behavior. Use at your own risk.
    info  - Thank you for testing `appDir` please leave your feedback at https://nextjs.link/app-feedback
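
If npm run s does not pick up the port and host on your machine (whether unknown --port/--host flags are exposed as npm_config_* variables depends on the npm version), you can start the standalone server directly; a minimal alternative, assuming the Git Bash shell used above and relying on the standalone Next.js server reading PORT and HOSTNAME from the environment:

    # Launch the standalone server built by `npm run build` on port 3000, listening on all interfaces.
    PORT=3000 HOSTNAME=0.0.0.0 node .next/standalone/server.js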
    

VI. Access Dify

http://127.0.0.1:3000

VII. Modifying the Dify frontend

  1. There is no need to build; just run the development server:

    npm run dev