
Conversation

michaelnchin (Collaborator):

Updating ChatBedrock to correctly format prompts for Llama 4 models, following the official documentation:

https://www.llama.com/docs/model-cards-and-prompt-formats/llama4/

    def _convert_one_message_to_text_llama4(message: BaseMessage) -> str:
        if isinstance(message, ChatMessage):
            message_text = (
                f"<|header_start|>{message.role}<|header_end|>{message.content}<|eot|>"
            )
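To illustrate how per-message formatting like the snippet above composes into a full Llama 4 prompt, here is a minimal self-contained sketch. The `Message` dataclass is a simplified stand-in for LangChain's message classes, and the `format_llama4_prompt` helper and the `<|begin_of_text|>` prefix are assumptions based on the linked prompt-format docs, not the actual ChatBedrock implementation.

```python
from dataclasses import dataclass


@dataclass
class Message:
    # Simplified stand-in for LangChain's BaseMessage/ChatMessage.
    role: str      # e.g. "system", "user", "assistant"
    content: str


def format_llama4_prompt(messages: list[Message]) -> str:
    """Sketch: wrap each message in Llama 4 header/end-of-turn tokens,
    then open an assistant header so the model generates the reply."""
    parts = ["<|begin_of_text|>"]  # assumed prompt prefix per the Llama docs
    for m in messages:
        parts.append(f"<|header_start|>{m.role}<|header_end|>{m.content}<|eot|>")
    parts.append("<|header_start|>assistant<|header_end|>")
    return "".join(parts)
```

Each conversation turn gets its own `<|header_start|>role<|header_end|>...<|eot|>` segment, and the trailing unclosed assistant header is what prompts the model to continue as the assistant.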

@michaelnchin (Collaborator, Author), May 30, 2025:

That's a good point; I'm honestly not too sure.

This matches the Llama 3 ChatMessage translator implementation, where the model also does not explicitly support arbitrary roles. In practice, at worst, Llama 3 and Llama 4 appear to simply ignore the custom role.
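To make the pass-through behavior concrete, here is a minimal stand-in (the function name is hypothetical) showing that an arbitrary ChatMessage role is emitted verbatim inside the Llama 4 header tokens, as the snippet in this PR does:

```python
def one_message_to_text(role: str, content: str) -> str:
    # A custom role passes straight through into the header; the model
    # is expected to ignore roles it does not recognize.
    return f"<|header_start|>{role}<|header_end|>{content}<|eot|>"


print(one_message_to_text("reviewer", "Looks good."))
# <|header_start|>reviewer<|header_end|>Looks good.<|eot|>
```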

@michaelnchin michaelnchin merged commit 8183452 into langchain-ai:main May 30, 2025
12 checks passed
@michaelnchin michaelnchin deleted the llama4-prompt branch May 30, 2025 17:02