
Use multiple API model providers with aider


Is it possible to use aider with the weak model from OpenAI and the default model from glhf.chat (and possibly the code model from another provider)? How?

My current configuration is as follows:

~/.aider.conf.yml

openai-api-base: https://glhf.chat/api/openai/v1
openai-api-key: glhf_MY_SECRET_API_KEY
model-settings-file: ~/.aider.model.settings.yml
model: deepseek-ai/DeepSeek-R1
weak-model: deepseek-ai/DeepSeek-V3
editor-model: Qwen/Qwen2.5-Coder-32B-Instruct

~/.aider.model.settings.yml


- name: deepseek-ai/DeepSeek-V3
  edit_format: diff
  use_repo_map: true
  reminder: sys
  examples_as_sys_msg: true
  extra_params:
    max_tokens: 8192
  caches_by_default: true

- name: deepseek-ai/DeepSeek-R1
  edit_format: diff
  weak_model_name: deepseek-ai/DeepSeek-V3
  use_repo_map: true
  examples_as_sys_msg: true
  extra_params:
    max_tokens: 8192
    include_reasoning: true
  caches_by_default: true
  editor_model_name: deepseek-ai/DeepSeek-V3
  editor_edit_format: editor-diff

- name: Qwen/Qwen2.5-Coder-32B-Instruct
  edit_format: diff
  weak_model_name: Qwen/Qwen2.5-Coder-32B-Instruct
  use_repo_map: true
  editor_model_name: Qwen/Qwen2.5-Coder-32B-Instruct
  editor_edit_format: editor-diff

But I would like the weak model to be gpt-4o-mini (hosted by OpenAI) and the default model to be deepseek-ai/DeepSeek-R1 (hosted by glhf.chat).

asked Mar 18 at 13:30 by arthur.sw

1 Answer


Billy (glhf.chat co-founder) here!

Unfortunately, because aider overrides the OpenAI provider settings when you point them at a custom OpenAI-compatible provider, we can't also use OpenAI's own API directly in the same aider session.
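
Concretely, with the configuration from the question, aider resolves any plain OpenAI model name against the custom openai-api-base, so a sketch like the following would not work, since glhf.chat does not serve OpenAI's models:

openai-api-base: https://glhf.chat/api/openai/v1
weak-model: gpt-4o-mini   # resolved against glhf.chat, not api.openai.com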

One workaround is to route gpt-4o-mini through OpenRouter's proxy, which lets you keep using glhf.chat as the OpenAI-compatible provider. :)

In your shell:

export OPENROUTER_API_KEY=<key>

and in ~/.aider.conf.yml:

weak-model: openrouter/openai/gpt-4o-mini
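
Putting it together, the relevant parts of ~/.aider.conf.yml would look something like this (a sketch based on the question's configuration; the main and editor models keep coming from glhf.chat, while the weak model is proxied through OpenRouter):

openai-api-base: https://glhf.chat/api/openai/v1
openai-api-key: glhf_MY_SECRET_API_KEY
model-settings-file: ~/.aider.model.settings.yml
model: deepseek-ai/DeepSeek-R1                 # served by glhf.chat
weak-model: openrouter/openai/gpt-4o-mini      # proxied via OpenRouter
editor-model: Qwen/Qwen2.5-Coder-32B-Instruct  # served by glhf.chat

The openrouter/ prefix makes aider authenticate that one model with OPENROUTER_API_KEY instead of the OpenAI-compatible base URL, so the two providers can coexist.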

Hope that helps!
