openai.ChatCompletionRequest serialization drops the zero-value Stream parameter, making it impossible to enable non-streaming mode #764

@hfutkang

Description

The backend LLM defaults to streaming mode. Because the JSON tags on openai.ChatCompletionRequest use omitempty, zero values are dropped during serialization: if you set Stream=false, the field is omitted when the request is marshaled, so the body sent to the model API contains no stream parameter and non-streaming mode cannot be enabled.
```go
type ChatCompletionRequest struct {
	Model    string                  `json:"model"`
	Messages []ChatCompletionMessage `json:"messages"`
	// MaxTokens The maximum number of tokens that can be generated in the chat completion.
	// This value can be used to control costs for text generated via API.
	// Deprecated: use MaxCompletionTokens. Not compatible with o1-series models.
	// refs: https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_tokens
	MaxTokens int `json:"max_tokens,omitempty"`
	// MaxCompletionTokens An upper bound for the number of tokens that can be generated for a completion,
	// including visible output tokens and reasoning tokens https://platform.openai.com/docs/guides/reasoning
	MaxCompletionTokens int      `json:"max_completion_tokens,omitempty"`
	Temperature         *float32 `json:"temperature,omitempty"`
	TopP                float32  `json:"top_p,omitempty"`
	N                   int      `json:"n,omitempty"`
	Stream              bool     `json:"stream,omitempty"`
	Stop                []string `json:"stop,omitempty"`
}
```
