Date: December 23, 2025
Severity: High - Blocking production feature
Status: ✅ ROOT CAUSE CONFIRMED - LiteLLM Translation Layer Issue
When using the OpenAI Agents SDK with Bedrock via LiteLLM, combining tools with structured outputs causes an immediate crash (a minimal reproduction sketch follows the list below):

```
Error: Tool json_tool_call not found in agent Weather Assistant
```
- Your code: Sends `tools=['get_weather_forecast']` + `response_format={...}`
- LiteLLM: Converts `response_format` → fake tool `json_tool_call` (automatic, cannot be disabled)
- Bedrock: Calls BOTH tools: `json_tool_call` AND `get_weather_forecast`
- OpenAI SDK: Crashes trying to invoke the unknown tool `json_tool_call`
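A minimal reproduction sketch, assuming the LiteLLM proxy from this repo is listening on `http://localhost:4000`; the `WeatherReport` model, the tool body, the prompt, and the dummy API key are illustrative stand-ins, not verbatim from this repo:

```python
# Minimal reproduction sketch. Assumes the LiteLLM proxy is running on
# http://localhost:4000; output model, tool body, and prompt are illustrative.
from openai import AsyncOpenAI
from pydantic import BaseModel
from agents import (
    Agent,
    OpenAIChatCompletionsModel,
    Runner,
    function_tool,
    set_tracing_disabled,
)

set_tracing_disabled(True)  # tracing would otherwise call out to OpenAI


class WeatherReport(BaseModel):
    message: str
    location_ids: list[str]


@function_tool
def get_weather_forecast(location: str) -> str:
    """Return a (stubbed) forecast for the given location."""
    return f"Sunny in {location}"


# Point the SDK at the LiteLLM proxy instead of api.openai.com.
client = AsyncOpenAI(base_url="http://localhost:4000", api_key="sk-anything")

agent = Agent(
    name="Weather Assistant",
    instructions="Answer weather questions using the forecast tool.",
    tools=[get_weather_forecast],  # the real tool
    output_type=WeatherReport,     # sent as response_format; LiteLLM injects json_tool_call
    model=OpenAIChatCompletionsModel(
        model="bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0",
        openai_client=client,
    ),
)

# Crashes with: Tool json_tool_call not found in agent Weather Assistant
result = Runner.run_sync(agent, "What's the weather in New York?")
print(result.final_output)
```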
| Scenario | Bedrock | OpenAI |
|---|---|---|
| Tools only | ✅ Works | ✅ Works |
| Structured output only | ✅ Works | ✅ Works |
| Both together | ❌ Fails | ✅ Works |
Bedrock doesn't natively support OpenAI's `response_format` parameter. LiteLLM emulates it by converting the JSON schema into a fake tool called `json_tool_call`. This works when `response_format` is used alone, but conflicts with real tools.
```
Your Code:   tools=[get_weather] + response_format
      ↓
LiteLLM:     Converts response_format → json_tool_call
      ↓
Bedrock:     Receives tools=[get_weather, json_tool_call]
      ↓
Bedrock:     Calls BOTH tools
      ↓
OpenAI SDK:  "Unknown tool: json_tool_call" 💥
```
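A rough sketch of the tool list Bedrock ends up receiving after LiteLLM's translation; the field names below are abridged for illustration and are not a verbatim payload dump:

```python
# Abridged illustration of the two tools Bedrock sees; not an exact
# reproduction of LiteLLM's internal request payload.
tools_bedrock_receives = [
    {
        # The real tool declared in your code.
        "name": "get_weather_forecast",
        "input_schema": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
        },
    },
    {
        # The fake tool LiteLLM injects to emulate response_format.
        # Its input schema is your response_format JSON schema, so the
        # model "calls" it to produce the structured output.
        "name": "json_tool_call",
        "input_schema": {
            "type": "object",
            "properties": {
                "message": {"type": "string"},
                "location_ids": {"type": "array", "items": {"type": "string"}},
            },
        },
    },
]
```

Because the model treats both entries as ordinary tools, nothing stops it from calling both in the same turn.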
"tool_calls": [
{
"function": {
"name": "json_tool_call", // ❌ FAKE (added by LiteLLM)
"arguments": "{\"message\": \"...\", \"location_ids\": [...]}"
}
},
{
"function": {
"name": "get_weather_forecast", // ✅ REAL
"arguments": "{\"location\": \"New York\"}"
}
}
]Why OpenAI works: Native support for both features independently, no conversion needed.
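For contrast, a sketch of the same combination sent straight to OpenAI's Chat Completions API, where `tools` and a `json_schema` response format are accepted side by side; the model name and schema here are illustrative:

```python
# Sketch: tools + structured output against OpenAI directly. Both are
# native API features, so no fake tool is injected.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set

resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What's the weather in New York?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather_forecast",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "weather_report",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {
                    "message": {"type": "string"},
                    "location_ids": {"type": "array", "items": {"type": "string"}},
                },
                "required": ["message", "location_ids"],
                "additionalProperties": False,
            },
        },
    },
)
print(resp.choices[0].message)
```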
The issue is NOT with:
- The code
- AWS Bedrock (works fine with either feature alone)
- The OpenAI Agents SDK (works fine with OpenAI models)

The issue IS with:
- LiteLLM's translation layer converting `response_format` to a fake tool
- No configuration option to disable this behavior
- Fundamental incompatibility when using both features together
To reproduce:

```bash
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dependencies
make install
```

```bash
export AWS_BEDROCK_ACCESS_KEY="your-access-key"
export AWS_BEDROCK_SECRET_KEY="your-secret-key"
export OPENAI_API_KEY="your-openai-api-key"
export AWS_REGION="us-east-1"
```

```bash
# Terminal 1: Start LiteLLM proxy
make run-litellm-proxy

# Terminal 2: Run tests
make run  # See the issue
make rca  # Detailed root cause analysis
```

Expected output:

```
✅ OPENAI-ASSISTANT TEST PASSED:
- Tool called: ✅
- Structured output: ✅

❌ BEDROCK-ASSISTANT TEST FAILED:
- Error: Tool json_tool_call not found in agent Weather Assistant
```
- Model: bedrock/us.anthropic.claude-sonnet-4-5-20250929-v1:0
- LiteLLM: v1.80.11
- OpenAI Agents SDK: v0.0.19
- Python: 3.12
- Can we disable the `response_format` → `json_tool_call` conversion?
- Is there a better way to map OpenAI's `response_format` to Bedrock without injecting a fake tool?
- How should users combine Tools + Structured Output with Bedrock via the proxy?
- Similar Issue: openai/openai-agents-python#1778
- OpenAI Agents SDK: https://github.com/openai/openai-agents-python
- LiteLLM Docs: https://docs.litellm.ai/