
azure - Connecting AzureOpenAI via LiteLLM - returning Authentication Error 401 - Stack Overflow


I am using DSPy framework (v2.6.4) which uses liteLLM (v1.63.7) to connect to LLMs.

While connecting Azure OpenAI via liteLLM (v1.63.7) using the below method (Azure AD Token Refresh - DefaultAzureCredential),

from litellm import completion
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")


response = completion(
    model="azure/<your deployment name>",       # azure/<deployment name>
    api_base="<api-url>",                       # Azure API base URL
    api_version="<api-version>",                # Azure API version
    azure_ad_token_provider=token_provider,
    messages=[{"role": "user", "content": "good morning"}],
)

I am getting the below error,

litellm\litellm_core_utils\exception_mapping_utils.py", line 2001, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AzureException AuthenticationError - Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource.

Connecting directly via openai.AzureOpenAI works fine, but the same credentials produce an authentication error when used through liteLLM.

Code that works

import openai
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")

openai_client = openai.AzureOpenAI(
    api_version="<---version--->",
    azure_endpoint="<---endpoint--->",
    azure_deployment="<---deployment_name--->",
    azure_ad_token_provider=token_provider
)

def interact_with_model():
    try:
        response = openai_client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "You are a helpful assistant that helps me with my math homework!"},
                {"role": "user", "content": "Hello! Could you solve 20 x 5?"},
            ],
            max_tokens=100
        )
        print(response)
        return response.choices[0].message.content
    except Exception as e:
        return f"Error: {e}"

if __name__ == "__main__":
    response = interact_with_model()
    print(f"Response from the model: {response}")

Anyone faced similar issues? Am I missing something here?


Asked Mar 27 at 8:08 by 4run4; edited Mar 28 at 5:34. 5 comments:
  • Check that the token is correct: credential = DefaultAzureCredential(); token = credential.get_token("https://cognitiveservices.azure.com/.default").token – Venkatesan Commented Mar 27 at 9:29
  • Thanks. I made the suggested changes but am still getting the same error. What else could be wrong? – 4run4 Commented Mar 27 at 12:49
  • Make sure you have proper RBAC role to access the resource (Azure OpenAI). – Venkatesan Commented Mar 27 at 12:55
  • but the code using openai.AzureOpenAI works well. I have added the code in the question. – 4run4 Commented Mar 28 at 5:32
  • It worked after upgrading the version of LiteLLM to 1.65.1 – 4run4 Commented Apr 1 at 10:31

1 Answer


litellm.AuthenticationError: AzureException AuthenticationError - Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource.

This error usually means the wrong access token was passed, or the token was passed to the endpoint in the wrong way.
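One quick sanity check is to decode the token's payload and confirm its `aud` claim matches the Cognitive Services resource. A minimal sketch, assuming nothing about your environment (the sample token below is a hypothetical, unsigned JWT built purely for illustration; in practice you would pass the token returned by `DefaultAzureCredential().get_token("https://cognitiveservices.azure.com/.default").token`):

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT without verifying the signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore base64url padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Hypothetical unsigned token for illustration only.
header = base64.urlsafe_b64encode(json.dumps({"alg": "none"}).encode()).decode().rstrip("=")
body = base64.urlsafe_b64encode(
    json.dumps({"aud": "https://cognitiveservices.azure.com"}).encode()
).decode().rstrip("=")
sample_token = f"{header}.{body}."

print(jwt_claims(sample_token)["aud"])  # https://cognitiveservices.azure.com
```

If `aud` points at a different resource (for example Azure Resource Manager), the token was acquired with the wrong scope and Azure OpenAI will reject it with a 401.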

You can use the code below to fetch a response through liteLLM in Python.

Code:

from litellm import completion
from azure.identity import DefaultAzureCredential
import json

# Get Azure AD Token
credential = DefaultAzureCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default").token

# Call liteLLM with the token
response = completion(
    model="azure/<deployment name>",
    api_base="<Resource endpoint>",
    api_version="2023-05-15",
    azure_ad_token=token,  # Use azure_ad_token, not azure_ad_token_provider
    messages=[{"role": "user", "content": "good morning"}],
)

print(json.dumps(response.model_dump(), indent=4))

Output:

{
    "id": "cxxxcmpl-xxxxx",
    "created": 1743074722,
    "model": "xxxxxx",
    "object": "chat.completion",
    "system_fingerprint": "xxx",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "Good morning! How can I assist you today?",
                "role": "assistant",
                "tool_calls": null,
                "function_call": null
            }
        }
    ],
    "usage": {
        "completion_tokens": 11,
        "prompt_tokens": 9,
        "total_tokens": 20,
        "completion_tokens_details": {
            "accepted_prediction_tokens": 0,
            "audio_tokens": 0,
            "reasoning_tokens": 0,
            "rejected_prediction_tokens": 0
        },
        "prompt_tokens_details": {
            "audio_tokens": 0,
            "cached_tokens": 0
        }
    },
    "service_tier": null
}
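Note that `azure_ad_token` is a static string: Entra ID access tokens expire (typically after about an hour), so a long-running process should refresh the token rather than cache one forever. A minimal refresh-wrapper sketch; `fetch` is a stand-in for a real call like `credential.get_token(...)`, and the 300-second refresh margin is an assumption, not something the answer above prescribes:

```python
import time
from typing import Callable, Tuple

class RefreshingToken:
    """Caches a bearer token and re-fetches it shortly before expiry.

    `fetch` must return (token_string, expires_on_unix_timestamp) --
    the same shape as azure.identity's AccessToken(token, expires_on).
    """

    def __init__(self, fetch: Callable[[], Tuple[str, float]], skew: float = 300.0):
        self._fetch = fetch
        self._skew = skew          # refresh this many seconds before expiry
        self._token = None
        self._expires_on = 0.0

    def __call__(self) -> str:
        # Re-fetch when no token is cached yet, or expiry is close.
        if self._token is None or time.time() >= self._expires_on - self._skew:
            self._token, self._expires_on = self._fetch()
        return self._token

# Hypothetical wiring (not from the answer above):
# from azure.identity import DefaultAzureCredential
# cred = DefaultAzureCredential()
# def fetch():
#     t = cred.get_token("https://cognitiveservices.azure.com/.default")
#     return t.token, t.expires_on
# provider = RefreshingToken(fetch)
# completion(..., azure_ad_token=provider(), ...)
```

Calling `provider()` immediately before each `completion(...)` call keeps the token fresh without re-authenticating on every request.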

Reference: Azure OpenAI | liteLLM
