Closed
Labels
bug (Related to a bug, vulnerability, unexpected error with an existing feature)
Description
System Info
macOS Ventura 13.3.1 (a)
Python 3.10.8
LangChain 0.0.224
Who can help?
Information
- The official example notebooks/scripts
- My own modified scripts
Related Components
- LLMs/Chat Models
- Embedding Models
- Prompts / Prompt Templates / Prompt Selectors
- Output Parsers
- Document Loaders
- Vector Stores / Retrievers
- Memory
- Agents / Agent Executors
- Tools / Toolkits
- Chains
- Callbacks/Tracing
- Async
Reproduction
- Run the following:
```python
from langchain.llms import OpenAI
from langchain.indexes import GraphIndexCreator
from langchain.chains import GraphQAChain

text = "Apple announced the Vision Pro in 2023."
index_creator = GraphIndexCreator(llm=OpenAI(openai_api_key="{OPEN_AI_KEY_HERE}", temperature=0))
graph = index_creator.from_text(text)
chain = GraphQAChain.from_llm(
    OpenAI(temperature=0, openai_api_key="{OPEN_AI_KEY_HERE}"),
    graph=graph,
    verbose=True,
)
chain.run("When did Apple announce the Vision Pro?")
```
- Observe the "Full Context" output in the terminal: the two triplets are concatenated onto a single line with no spacing between them.

I believe the issue is in the code here. When only one triplet is found in an iteration, `"\n".join` adds no `\n` characters, so the resulting context string has no separation between triplets.
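The failure mode can be reproduced without LangChain or an API key. This is a minimal sketch of the suspected pattern, not the library's actual code (the `build_context_*` helpers are hypothetical): if each iteration's triplets are joined with `"\n"` separately and the results concatenated, a single-element list contributes no newline at all.

```python
def build_context_buggy(triplets_per_entity):
    """Suspected pattern: join per iteration, then concatenate."""
    context = ""
    for triplets in triplets_per_entity:
        # "\n".join on a one-element list inserts no separator,
        # so consecutive iterations run together on one line.
        context += "\n".join(triplets)
    return context

def build_context_fixed(triplets_per_entity):
    """Possible fix: flatten first, then join exactly once."""
    all_triplets = []
    for triplets in triplets_per_entity:
        all_triplets.extend(triplets)
    # A single join over the flattened list puts every triplet on its own line.
    return "\n".join(all_triplets)

# Two entities, one triplet found per iteration (as in the repro above).
per_entity = [["Apple announced Vision Pro"], ["Vision Pro was announced in 2023"]]
print(build_context_buggy(per_entity))  # triplets fused, no newline between them
print(build_context_fixed(per_entity))  # one triplet per line
```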
Expected behavior
Expected: a multi-line string with each triplet on its own line (delimited by `\n`).
For the repro steps above, I would expect:

```
Full Context:
Apple announced Vision Pro
Vision Pro was announced in 2023
```