What component(s) are affected?
- Opik Python SDK
- Opik Typescript SDK
- Opik Agent Optimizer SDK
- Opik UI
- Opik Server
- Documentation
Opik version
- Opik version: 1.9.51
Describe the problem
My LangGraph agent uses OpenAI streaming, and when Opik tries to extract token usage for the run it raises a Pydantic validation error:
OPIK: Failed to extract token usage from presumably OpenAI LLM langchain run. The run dictionary...
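A minimal sketch of the kind of setup described above (an assumption, since the exact agent code is not included: a LangGraph graph with a streaming `ChatOpenAI` node traced via `OpikTracer`; the model name and graph structure are placeholders):

```python
# Assumed minimal setup: a LangGraph graph calling a streaming ChatOpenAI
# model, traced with Opik's LangChain integration. Names are placeholders.
from langchain_openai import ChatOpenAI
from langgraph.graph import END, START, MessagesState, StateGraph
from opik.integrations.langchain import OpikTracer

# Streaming is enabled; stream_usage asks OpenAI to include token usage
# in the streamed response.
llm = ChatOpenAI(model="gpt-4o-mini", streaming=True, stream_usage=True)


def call_model(state: MessagesState) -> dict:
    # Single node that forwards the conversation to the model.
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(MessagesState)
builder.add_node("model", call_model)
builder.add_edge(START, "model")
builder.add_edge("model", END)
app = builder.compile()

# Opik tracer attached as a LangChain callback; usage extraction fails here.
tracer = OpikTracer(graph=app.get_graph(xray=True))
app.invoke(
    {"messages": [("user", "Hello")]},
    config={"callbacks": [tracer]},
)
```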
Reproduction steps and code snippets
No response
Error logs or stack trace
Traceback (most recent call last):
  File "/.venv/lib/python3.14/site-packages/opik/integrations/langchain/provider_usage_extractors/openai_usage_extractor.py", line 74, in _try_get_token_usage
    return llm_usage.OpikUsage.from_openai_completions_dict(token_usage)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^
  File "/.venv/lib/python3.14/site-packages/opik/llm_usage/opik_usage.py", line 74, in from_openai_completions_dict
    provider_usage = openai_chat_completions_usage.OpenAICompletionsUsage.from_original_usage_dict(
        usage
    )
  File ".venv/lib/python3.14/site-packages/opik/llm_usage/openai_chat_completions_usage.py", line 61, in from_original_usage_dict
    return cls(
        **usage_dict,
        completion_tokens_details=completion_tokens_details,
        prompt_tokens_details=prompt_tokens_details,
    )
  File ".../.venv/lib/python3.14/site-packages/pydantic/main.py", line 250, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 3 validation errors for OpenAICompletionsUsage
completion_tokens
  Field required [type=missing, input_value={'model_name': {'type': '...t_tokens_details': None}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.12/v/missing
prompt_tokens
  Field required [type=missing, input_value={'model_name': {'type': '...t_tokens_details': None}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.12/v/missing
total_tokens
  Input should be a valid integer [type=int_type, input_value={'default': None, 'type': 'integer'}, input_type=dict]
    For further information visit https://errors.pydantic.dev/2.12/v/int_type
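For reference, the three failures are standard Pydantic errors for required integer fields that are either missing or receive a dict instead of an int. A minimal sketch that reproduces the same error classes with a stand-in model (hypothetical, not Opik's actual `OpenAICompletionsUsage` definition):

```python
# Hypothetical stand-in model with required integer token fields,
# used only to illustrate the error classes seen in the trace above.
import pydantic


class UsageSketch(pydantic.BaseModel):
    completion_tokens: int
    prompt_tokens: int
    total_tokens: int


try:
    # Omitting the required fields and passing a dict where an int is
    # expected triggers the same "Field required" and "Input should be
    # a valid integer" errors.
    UsageSketch(total_tokens={"default": None, "type": "integer"})
except pydantic.ValidationError as exc:
    print(exc)
```

Judging by the `input_value` shown in the trace, the dict passed to `OpenAICompletionsUsage` appears to hold field metadata rather than integer token counts, which would account for all three errors.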
Healthcheck results
No response