partial_variables not working with ChatPromptTemplate (langchain v0.1.9) #17560

@hasansustcse13

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.

Example Code

    from langchain_core.output_parsers import StrOutputParser
    from langchain_core.prompts import ChatPromptTemplate, HumanMessagePromptTemplate

    async def partial_test(self):
        prompt_template = "Tell me details about {name}. The output should be in {lang}"
        llm = LLMSelector(self.model).get_language_model()
        prompt = ChatPromptTemplate.from_messages([
            # "lang" is pre-filled here, so only "name" should be required at invoke time
            HumanMessagePromptTemplate.from_template(prompt_template, partial_variables={"lang": "Spanish"}),
        ])
        chain = prompt | llm | StrOutputParser()
        result = await chain.ainvoke({'name': 'Open AI'})
        print(result)

Error Message and Stack Trace (if applicable)

Error: Input to ChatPromptTemplate is missing variables {'lang'}. Expected: ['lang', 'name'] Received: ['name']

Description

This works with PromptTemplate, and it also worked with ChatPromptTemplate in previous versions.
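For clarity, here is a minimal sketch of the behavior partial_variables is expected to provide, illustrated with plain string formatting (no langchain dependency; `format_prompt` is a hypothetical helper, not a library function): variables supplied as partials are pre-filled, so only the remaining variables should be required at invoke time.

```python
# The template from the reproduction above
template = "Tell me details about {name}. The output should be in {lang}"

def format_prompt(template: str, partials: dict, **kwargs) -> str:
    """Merge pre-filled (partial) variables with invoke-time variables,
    then format the template. Invoke-time values take precedence."""
    merged = {**partials, **kwargs}
    return template.format(**merged)

# "lang" is pre-filled; only "name" is supplied at call time
print(format_prompt(template, {"lang": "Spanish"}, name="Open AI"))
# → Tell me details about Open AI. The output should be in Spanish
```

With langchain v0.1.9, the equivalent ChatPromptTemplate call instead raises the missing-variables error shown above, because the partial value for `lang` is not recognized when the template's input variables are validated.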

System Info

langchain v0.1.7

Metadata

Labels: bug (Related to a bug, vulnerability, unexpected error with an existing feature)