### What happened?

Some OpenAI-compatible providers (observed with Apertis) return an "empty" error object in the response body even when the request succeeds. LiteLLM's response-conversion logic in `convert_to_model_response_object` unconditionally raises an `APIError` whenever the `"error"` key is present and not `null`.
### Raw Response Example

```json
{
  "model": "minimax-m2.1",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hey! I'm doing well, thanks for asking! \ud83d\ude0a\n\nI'm ready to help you with whatever you need\u2014whether that's coding, writing, analyzing data, brainstorming ideas, or anything else you can think of.\n\nWhat's on your mind today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 49,
    "completion_tokens": 87,
    "total_tokens": 136
  },
  "error": {
    "message": "",
    "type": "",
    "param": "",
    "code": null
  }
}
```

### Traceback
```text
litellm.APIError: APIError: OpenAIException -
stack trace: Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/llms/openai/openai.py", line 855, in acompletion
    final_response_obj = convert_to_model_response_object(
        response_object=stringified_response,
    ...
  File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/llm_response_utils/convert_dict_to_response.py", line 466, in convert_to_model_response_object
    raise raised_exception
```

### Root Cause
In `litellm/litellm_core_utils/llm_response_utils/convert_dict_to_response.py`, the following check is too broad:

```python
### CHECK IF ERROR IN RESPONSE ### - openrouter returns these in the dictionary
if (
    response_object is not None
    and "error" in response_object
    and response_object["error"] is not None
):
    # ...
    raise raised_exception
```

Since `response_object["error"]` is a present, non-`None` dict, the condition matches and the exception is raised even though every field in the error object is empty.
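The faulty condition can be reproduced standalone (a simplified sketch of the check above, not LiteLLM's full code path):

```python
# Response shaped like the Apertis example above: a successful completion
# that nevertheless carries an "empty" error object.
response_object = {
    "choices": [{"message": {"role": "assistant", "content": "Hey!"}}],
    "error": {"message": "", "type": "", "param": "", "code": None},
}

# The current check only tests presence and non-None-ness of "error",
# so a blank error dict still trips it.
triggers = (
    response_object is not None
    and "error" in response_object
    and response_object["error"] is not None
)
print(triggers)  # True, even though no real error occurred
```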
### Suggested Fix

The check should only raise when the error object actually contains a non-empty `message` or a non-null `code`.
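One possible shape for the stricter check (a sketch only; `is_meaningful_error` is a hypothetical helper name, not an existing LiteLLM function):

```python
def is_meaningful_error(error_obj) -> bool:
    """Treat an error object as a real error only if it carries a
    non-empty message or a non-null code (hypothetical helper)."""
    if not isinstance(error_obj, dict):
        # Non-dict, non-None values (e.g. a bare string) still count as errors.
        return error_obj is not None
    return bool(error_obj.get("message")) or error_obj.get("code") is not None

# Empty error object from the report above: should NOT raise.
print(is_meaningful_error({"message": "", "type": "", "param": "", "code": None}))  # False
# Genuine provider error: should still raise.
print(is_meaningful_error({"message": "rate limited", "code": 429}))  # True
```

The existing `response_object["error"] is not None` clause would then become `is_meaningful_error(response_object.get("error"))`, preserving the OpenRouter behavior the comment mentions while ignoring blank error stubs.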
### LiteLLM version

v1.80.8 (and likely earlier)