RunnablePassthrough plays the role of passing data through. Its invoke() method returns the input it receives unchanged.
This makes it useful for passing data to the next stage of a pipeline without modifying it.
RunnablePassthrough is useful in scenarios such as:
When the data does not need to be transformed or modified
When certain steps in the pipeline need to be skipped
When the data flow needs to be monitored for debugging or testing
Because this class implements the Runnable interface, it can be combined with other Runnable objects in a pipeline.
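The pass-through behavior can be illustrated with a minimal pure-Python sketch. This is a conceptual stand-in for the idea only, not the actual LangChain implementation:

```python
# Conceptual sketch of RunnablePassthrough: invoke() returns its input unchanged.
class PassthroughSketch:
    def invoke(self, data):
        # Return the input exactly as received.
        return data


passthrough = PassthroughSketch()
print(passthrough.invoke({"num": 1}))  # the dict comes back unchanged
```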
# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()
True
# Set up LangSmith tracking. https://smith.langchain.com
# !pip install langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("LCEL-Advanced")
Passing data
RunnablePassthrough can pass the input through unchanged, or pass it along with additional keys.
It is generally used together with RunnableParallel to assign data to a new key in the map.
When RunnablePassthrough() is called on its own, it simply takes the input and passes it through as-is.
When called with assign (RunnablePassthrough.assign(...)), it takes the input and adds the extra arguments passed to the assign function.
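Conceptually, assign keeps the original input mapping and merges in the newly computed keys. A rough pure-Python sketch of that behavior (the function name here is made up for illustration; it is not the library code):

```python
# Sketch of RunnablePassthrough.assign: keep the input and add computed keys.
def assign_sketch(data, **fns):
    # Each keyword argument is a function of the whole input dict.
    extra = {key: fn(data) for key, fn in fns.items()}
    # Original keys plus the newly assigned ones.
    return {**data, **extra}


result = assign_sketch({"num": 1}, mult=lambda x: x["num"] * 3)
print(result)  # {'num': 1, 'mult': 3}
```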
The RunnableParallel class is used to define operations that run in parallel.
The passed property is assigned a RunnablePassthrough instance, which returns the input as-is.
The extra property uses RunnablePassthrough.assign() to define an operation that assigns the input's "num" value multiplied by 3 to the "mult" key.
The modified property defines an operation that adds 1 to the input's "num" value using a lambda function.
Calling runnable.invoke() with the input {"num": 1} runs the parallel operations.
In this example, passed was called with RunnablePassthrough(), so it simply passes {'num': 1} through.
The second line uses RunnablePassthrough.assign together with a lambda function that multiplies the numeric value by 3. In this case, extra is set to {'num': 1, 'mult': 3}, the original value with the mult key added.
Finally, the modified key, the third key in the map, uses a lambda function to add 1 to the num value, so the value of the modified key becomes 2.
Retriever example
The examples below show practical uses of RunnablePassthrough.
from langchain_core.runnables import RunnableParallel, RunnablePassthrough
runnable = RunnableParallel(
# Sets a Runnable that returns the input passed to it.
passed=RunnablePassthrough(),
# Set up a Runnable that returns the result of multiplying the input's "num" value by 3.
extra=RunnablePassthrough.assign(mult=lambda x: x["num"] * 3),
# Sets a Runnable that returns the result of adding 1 to the "num" value of the input.
modified=lambda x: x["num"] + 1,
)
# Execute the Runnable with the input {"num": 1}.
runnable.invoke({"num": 1})
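Assuming the cell above runs as written, the three branches are computed on the same input and collected into one dict. A pure-Python sketch of what this parallel map produces (not the actual RunnableParallel machinery):

```python
# Emulate the three parallel branches applied to the input {"num": 1}.
def parallel_sketch(data):
    return {
        "passed": data,                              # passthrough: input as-is
        "extra": {**data, "mult": data["num"] * 3},  # assign: add the "mult" key
        "modified": data["num"] + 1,                 # lambda: num + 1
    }


print(parallel_sketch({"num": 1}))
# {'passed': {'num': 1}, 'extra': {'num': 1, 'mult': 3}, 'modified': 2}
```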
r = RunnablePassthrough.assign(mult=lambda x: x["num"] * 3)
r.invoke({"num": 1})
{'num': 1, 'mult': 3}
from langchain_community.vectorstores import FAISS
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI, OpenAIEmbeddings
# Generate FAISS vector storage from text.
vectorstore = FAISS.from_texts(
[
"Teddy worked at Langchain Inc.",
"Shirley worked for the same company as Teddy.",
"Teddy's job is a developer.",
"Shirley's job is a designer.",
],
embedding=OpenAIEmbeddings(),
)
# Use vector storage as a search engine.
retriever = vectorstore.as_retriever()
# Define a template.
template = """Answer the question based only on the following context:
{context}
Question: {question}
"""
# Generate chat prompts from templates.
prompt = ChatPromptTemplate.from_template(template)
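The template has two placeholders, {context} and {question}. Roughly speaking, the prompt step fills them in the way plain str.format would; the real ChatPromptTemplate builds chat messages rather than a bare string, so this is only a sketch of the substitution:

```python
# Rough sketch of the prompt-filling step using plain str.format.
template = """Answer the question based only on the following context:
{context}
Question: {question}
"""

filled = template.format(
    context="Teddy's job is a developer.",
    question="What is Teddy's job?",
)
print(filled)
```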
# Initialize the ChatOpenAI model.
model = ChatOpenAI(model_name="gpt-4o-mini")
# Function for formatting documents
def format_docs(docs):
return "\n".join([doc.page_content for doc in docs])
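format_docs simply concatenates the page_content of each retrieved document into one string. A quick check with stand-in document objects (the real retriever returns LangChain Document instances; FakeDoc here is a made-up stub):

```python
# Minimal stand-in for a retrieved document.
class FakeDoc:
    def __init__(self, page_content):
        self.page_content = page_content


def format_docs(docs):
    # Same joining logic as the chain's helper above.
    return "\n".join(doc.page_content for doc in docs)


docs = [
    FakeDoc("Teddy worked at Langchain Inc."),
    FakeDoc("Teddy's job is a developer."),
]
print(format_docs(docs))  # the two contents joined by a newline
```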
# Constructs a search chain.
retrieval_chain = (
{"context": retriever | format_docs, "question": RunnablePassthrough()}
| prompt
| model
| StrOutputParser()
)
# Run a search chain to get answers to your questions.
retrieval_chain.invoke("What is Teddy's job?")
"Teddy's job is a developer."
# Run a search chain to get answers to your questions.
retrieval_chain.invoke("What is Shirley's occupation?")
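In this chain, the raw question string fans out to both branches of the map: the retriever branch turns it into formatted context, while RunnablePassthrough() forwards it unchanged so the prompt still sees the original question. A pure-Python sketch of that data flow, using a stub in place of the FAISS retriever:

```python
# Sketch of the retrieval chain's first step: one string input, two keys out.
def fake_retrieve(question):
    # Stub standing in for `retriever | format_docs` (the real branch
    # would search the vector store with this question).
    return "Teddy's job is a developer."


def chain_input_map(question):
    return {
        "context": fake_retrieve(question),  # retriever branch
        "question": question,                # RunnablePassthrough branch
    }


print(chain_input_map("What is Teddy's job?"))
```

The resulting dict is exactly the shape the prompt template expects, which is why RunnablePassthrough is the idiomatic way to thread the original input through a map.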