08. Add a node to ask a human
So far, we have relied on the `messages` state.
You can do a lot by manipulating this state value, but if you want to define complex behavior without relying solely on the message list, you can add extra fields to the state. This tutorial explains how to extend the chatbot by adding new nodes.
In the previous example, human-in-the-loop was implemented by interrupting the graph every time a tool was called.
This time, suppose you want to let the chatbot itself decide when to rely on a human.
One way to do this is to create a dedicated "human" node that the graph always stops before. This node runs only when the LLM calls the "human" tool. For convenience, we include an `ask_human` flag in the graph state, which the LLM switches on whenever it calls this tool.
```python
# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()
```

```python
# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH17-LangGraph-Modules")
```

Setting up a node to ask a person for input
This time, we add a field ( `ask_human` ) that indicates whether to ask a person in the middle of a run.
First, define the schema used for the `human` request.
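A minimal sketch of such a schema as a Pydantic model; the docstring wording and the `request` field name are assumptions:

```python
from pydantic import BaseModel


class RequestAssistance(BaseModel):
    """Escalate the conversation to an expert.

    Use this when you cannot assist directly, or when the user requires
    support beyond your capabilities. Include the user's request so the
    expert has full context.
    """

    request: str
```

The docstring matters here: it is what the LLM reads when deciding whether to call this tool.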
Next, define the chatbot node.
The main change here is that the chatbot switches on the `ask_human` flag whenever the LLM calls the `RequestAssistance` tool.
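A sketch of the extended state and chatbot node. The name `llm_with_tools` is an assumption for an LLM already bound with the search tool and the `RequestAssistance` schema:

```python
from typing import Annotated

from typing_extensions import TypedDict

from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]
    # Flag that tells the graph to route to the "human" node
    ask_human: bool


# Assumed earlier: llm_with_tools = llm.bind_tools(tools + [RequestAssistance])
def chatbot(state: State):
    response = llm_with_tools.invoke(state["messages"])
    ask_human = False
    # Switch the flag on when the LLM calls the RequestAssistance tool
    if response.tool_calls and response.tool_calls[0]["name"] == RequestAssistance.__name__:
        ask_human = True
    return {"messages": [response], "ask_human": ask_human}
```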
Next, create a graph builder and, as before, add the `chatbot` and `tools` nodes to the graph.
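Graph construction might look like the following, assuming the `State` and `chatbot` defined above; the choice of `TavilySearchResults` as the search tool is an assumption:

```python
from langchain_community.tools.tavily_search import TavilySearchResults
from langgraph.graph import StateGraph
from langgraph.prebuilt import ToolNode

# Assumed search tool from the earlier chapters
tool = TavilySearchResults(max_results=2)

graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_node("tools", ToolNode(tools=[tool]))
```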
Human node setting
Next, create the `human` node.
This node acts mainly as a placeholder that triggers an interrupt in the graph. If the user does not manually update the state during the interrupt, the node inserts a tool message so the LLM knows the person was asked but did not respond.
This node also releases the `ask_human` flag so that the graph does not visit the node again unless another request is made.
Reference image

Next, define conditional logic.
`select_next_node` routes to the `human` node when the `ask_human` flag is set. Otherwise, it lets the prebuilt `tools_condition` function pick the next node.
The `tools_condition` function simply checks whether the `chatbot` used `tool_calls` in its response message.
If it did, the path goes to the tool node. Otherwise, the graph exits.
Finally, connect the edges and compile the graph.
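Wiring and compilation might look like this, assuming the `graph_builder` and `select_next_node` from above; the `MemorySaver` checkpointer follows the earlier chapters of this series:

```python
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, START

# Route from the chatbot via select_next_node
graph_builder.add_conditional_edges(
    "chatbot",
    select_next_node,
    {"human": "human", "tools": "tools", END: END},
)

# After a tool call or a human response, return to the chatbot
graph_builder.add_edge("tools", "chatbot")
graph_builder.add_edge("human", "chatbot")
graph_builder.add_edge(START, "chatbot")

memory = MemorySaver()
graph = graph_builder.compile(
    checkpointer=memory,
    # Always stop before running the human node
    interrupt_before=["human"],
)
```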
Visualize the graph.
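One common way to visualize a compiled LangGraph graph (the original may use a helper from `langchain_teddynote` instead):

```python
from IPython.display import Image, display

# Render the compiled graph as a Mermaid diagram
display(Image(graph.get_graph().draw_mermaid_png()))
```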

The `chatbot` node takes one of the following actions:
- Ask a human for help (chatbot -> select -> human)
- Call the search engine tool (chatbot -> select -> tools)
- Respond directly (chatbot -> select -> end)
Once an action or request has been made, the graph transitions back to the `chatbot` node to continue.
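To exercise the human path, one might run something like the following; the prompt text and thread id are illustrative assumptions:

```python
from langchain_core.messages import HumanMessage

user_input = "I need expert guidance for building this AI agent. Can you request assistance for me?"
config = {"configurable": {"thread_id": "1"}}

# Stream the graph until it hits the interrupt before the human node
events = graph.stream(
    {"messages": [HumanMessage(content=user_input)]},
    config,
    stream_mode="values",
)
for event in events:
    if "messages" in event:
        event["messages"][-1].pretty_print()
```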
Notice that the LLM called the provided "HumanRequest" tool and that the interrupt was set. Let's check the graph state.
The graph state is, in fact, stopped just before the `human` node.
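This can be confirmed from the state snapshot, assuming the `config` used for the run above:

```python
snapshot = graph.get_state(config)
# ('human',) if the LLM requested assistance and the interrupt fired
print(snapshot.next)
```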
In this scenario, act as the "expert" and manually update the state by adding a new `ToolMessage` with your input.
Next, respond to the chatbot's request by:
1. Creating a `ToolMessage` that contains the response. This is passed back to the `chatbot`.
2. Calling `update_state` to manually update the graph state.
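The two steps above can be sketched as follows, assuming the `snapshot`, `config`, and `create_response` helper from earlier; the expert's reply text is illustrative:

```python
ai_message = snapshot.values["messages"][-1]

# Respond as the "expert" (wording is an example)
human_response = (
    "We, the experts, are here to help! "
    "We recommend checking out LangGraph's documentation to build a reliable agent."
)
tool_message = create_response(human_response, ai_message)

# Manually append the response to the graph state
graph.update_state(config, {"messages": [tool_message]})
```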
You can inspect the state to confirm that the response has been added.
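For example, assuming the same `config`:

```python
# The expert's ToolMessage should now appear at the end of the history
graph.get_state(config).values["messages"][-3:]
```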
Next, resume the graph by invoking it with `None` as the input.
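Passing `None` resumes execution from the checkpoint without adding new input:

```python
# Resume from the interrupt; the human node now sees the expert's reply
events = graph.stream(None, config, stream_mode="values")
for event in events:
    if "messages" in event:
        event["messages"][-1].pretty_print()
```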
Check the final result.
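For instance, by printing the full message history from the final state:

```python
state = graph.get_state(config)
for message in state.values["messages"]:
    message.pretty_print()
```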