LiteLLM Proxy Issue: Bedrock Tool Calling + Structured Output

Date: December 23, 2025
Severity: High - Blocking production feature
Status: ✅ ROOT CAUSE CONFIRMED - LiteLLM Translation Layer Issue


🔴 The Problem

When using OpenAI Agents SDK with Bedrock via LiteLLM, combining tools + structured outputs causes an immediate crash:

Error: Tool json_tool_call not found in agent Weather Assistant

What Happens

  1. Your code: Sends tools=['get_weather_forecast'] + response_format={...}
  2. LiteLLM: Converts response_format → fake tool json_tool_call (automatic, cannot be disabled)
  3. Bedrock: Calls BOTH tools: json_tool_call AND get_weather_forecast
  4. OpenAI SDK: Crashes trying to invoke the unknown tool json_tool_call (see the reproduction sketch below)
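
The sketch below reproduces this flow end to end. It is a minimal sketch, not this repo's exact test code: it assumes the LiteLLM proxy started by make run-litellm-proxy listens on its default http://localhost:4000, and the WeatherReport schema is illustrative.

# Minimal reproduction sketch. Assumes the LiteLLM proxy is running on
# http://localhost:4000 (its default port); API names follow
# openai-agents v0.0.19.
from pydantic import BaseModel
from openai import AsyncOpenAI
from agents import (
    Agent,
    OpenAIChatCompletionsModel,
    Runner,
    function_tool,
    set_tracing_disabled,
)

set_tracing_disabled(True)  # tracing would otherwise want a real OpenAI key


class WeatherReport(BaseModel):
    """Illustrative structured-output schema (sent as response_format)."""
    message: str
    location_ids: list[str]


@function_tool
def get_weather_forecast(location: str) -> str:
    """Return a canned forecast for the given location."""
    return f"Sunny in {location}"


proxy_client = AsyncOpenAI(base_url="http://localhost:4000", api_key="dummy")

agent = Agent(
    name="Weather Assistant",
    instructions="Answer weather questions using the forecast tool.",
    tools=[get_weather_forecast],  # the real tool
    output_type=WeatherReport,     # translated by the SDK into response_format
    model=OpenAIChatCompletionsModel(
        model="bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0",
        openai_client=proxy_client,
    ),
)

# Crashes with: Tool json_tool_call not found in agent Weather Assistant
result = Runner.run_sync(agent, "What's the weather in New York?")
print(result.final_output)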

Test Results

Scenario                 Bedrock     OpenAI
Tools only               ✅ Works    ✅ Works
Structured output only   ✅ Works    ✅ Works
Both together            ❌ Fails    ✅ Works

🔬 Root Cause

Bedrock doesn't natively support OpenAI's response_format parameter.

LiteLLM emulates it by converting the JSON schema into a fake tool called json_tool_call. This emulation works when response_format is used alone, but conflicts with real tools.
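
For illustration, the injected tool looks roughly like the following. The shape is reconstructed from the observed behavior and the logs below; LiteLLM's exact internal field values may differ.

# Rough shape of the tool LiteLLM injects to emulate response_format.
# Reconstructed from the logs below; exact internals may differ.
injected_tool = {
    "type": "function",
    "function": {
        "name": "json_tool_call",  # the fake tool that later crashes the SDK
        "description": "Respond with a JSON object matching the schema.",
        "parameters": {
            # your response_format JSON schema goes here, e.g.:
            "type": "object",
            "properties": {
                "message": {"type": "string"},
                "location_ids": {"type": "array", "items": {"type": "string"}},
            },
            "required": ["message", "location_ids"],
        },
    },
}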

Request Flow

Your Code: tools=[get_weather] + response_format
    ↓
LiteLLM: Converts response_format → json_tool_call
    ↓
Bedrock: Receives tools=[get_weather, json_tool_call]
    ↓
Bedrock: Calls BOTH tools
    ↓
OpenAI SDK: "Unknown tool: json_tool_call" 💥

Evidence from Logs

"tool_calls": [
  {
    "function": {
      "name": "json_tool_call",        // ❌ FAKE (added by LiteLLM)
      "arguments": "{\"message\": \"...\", \"location_ids\": [...]}"
    }
  },
  {
    "function": {
      "name": "get_weather_forecast",  // ✅ REAL
      "arguments": "{\"location\": \"New York\"}"
    }
  }
]

Why OpenAI works: the OpenAI API supports tools and response_format natively and independently, so no conversion is needed.
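
Continuing the reproduction sketch above, the same agent succeeds when pointed directly at OpenAI; gpt-4o is an illustrative model choice, and Agent.clone is the SDK's helper for copying an agent with overrides.

# Same agent as in the reproduction sketch, pointed at OpenAI directly.
# Both features coexist because no tool injection takes place.
agent_openai = agent.clone(model="gpt-4o")  # illustrative model choice

result = Runner.run_sync(agent_openai, "What's the weather in New York?")
print(result.final_output)  # a parsed WeatherReport instance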


✅ Conclusion

The issue is NOT with:

  • The code
  • AWS Bedrock (works fine with either feature alone)
  • OpenAI Agents SDK (works fine with OpenAI models)

The issue IS with:

  • LiteLLM's translation layer converting response_format to a fake tool
  • No configuration option to disable this behavior
  • Fundamental incompatibility when using both features together (a possible client-side workaround is sketched below)
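
Until the translation layer changes, one possible workaround (a sketch, not part of this repo) is to drop output_type so that no response_format is ever sent, and validate the final text client-side:

# Workaround sketch: without output_type the SDK sends no response_format,
# so LiteLLM has nothing to emulate and json_tool_call is never injected.
# Validation moves to the client.
from pydantic import ValidationError

agent_plain = agent.clone(
    output_type=None,  # plain-text output: nothing for LiteLLM to convert
    instructions=(
        "Answer weather questions using the forecast tool. "
        'Reply ONLY with JSON: {"message": str, "location_ids": [str]}'
    ),
)

result = Runner.run_sync(agent_plain, "What's the weather in New York?")
try:
    report = WeatherReport.model_validate_json(result.final_output)
except ValidationError:
    report = None  # model ignored the format; retry or fall back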

🚀 Quick Start

1. Install Dependencies

# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dependencies
make install

2. Set Environment Variables

export AWS_BEDROCK_ACCESS_KEY="your-access-key"
export AWS_BEDROCK_SECRET_KEY="your-secret-key"
export OPENAI_API_KEY="your-openai-api-key"
export AWS_REGION="us-east-1"

3. Run Demonstration

# Terminal 1: Start LiteLLM proxy
make run-litellm-proxy

# Terminal 2: Run tests
make run      # See the issue
make rca      # Detailed root cause analysis

Expected Output

✅ OPENAI-ASSISTANT TEST PASSED:
   - Tool called: ✅
   - Structured output: ✅

❌ BEDROCK-ASSISTANT TEST FAILED:
   - Error: Tool json_tool_call not found in agent Weather Assistant

📚 Environment

  • Model: bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0
  • LiteLLM: v1.80.11
  • OpenAI Agents SDK: v0.0.19
  • Python: 3.12

❓ Questions for LiteLLM Maintainers

  1. Can we disable the response_format → json_tool_call conversion?
  2. Is there a better way to map OpenAI's response_format to Bedrock without injecting a fake tool?
  3. How should users combine Tools + Structured Output with Bedrock via the proxy?
