06. Human-in-the-loop (human intervention)

Agents can be unreliable and may need human input to complete tasks successfully.

Similarly, for some tasks you may want to require human intervention and approval before execution, to ensure everything runs as intended.

LangGraph supports human-in-the-loop workflows in several ways.

We'll start this tutorial by using LangGraph's interrupt_before option to always interrupt before the tool node.

# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()
# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH17-LangGraph-Modules")
from typing import Annotated, List, Dict
from typing_extensions import TypedDict

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from langchain_teddynote.graphs import visualize_graph
from langchain_teddynote.tools import GoogleNews


########## 1. state definition ##########
# state definition
class State(TypedDict):
    # list of messages; add_messages appends new messages instead of overwriting
    messages: Annotated[list, add_messages]


########## 2. Tool definition and binding ##########
# Initialize tool
# Create a tool to search news by keyword
news_tool = GoogleNews()


@tool
def search_keyword(query: str) -> List[Dict[str, str]]:
    """Look up news by keyword"""
    news_tool = GoogleNews()
    return news_tool.search_by_keyword(query, k=5)


tools = [search_keyword]

# LLM Initialization
llm = ChatOpenAI(model="gpt-4o-mini")

# Combining tools and LLM
llm_with_tools = llm.bind_tools(tools)


########## 3. Add node ##########
# Defining a chatbot function
def chatbot(state: State):
    # Invoke the LLM with the conversation and return its reply
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


# create a state graph
graph_builder = StateGraph(State)

# Add a chatbot node
graph_builder.add_node("chatbot", chatbot)


# Creating and adding tool nodes
tool_node = ToolNode(tools=tools)

# Add tool node
graph_builder.add_node("tools", tool_node)

# Conditional Edge
graph_builder.add_conditional_edges(
    "chatbot",
    tools_condition,
)

########## 4. add edge ##########

# tools > chatbot
graph_builder.add_edge("tools", "chatbot")

# START > chatbot
graph_builder.add_edge(START, "chatbot")

# chatbot > END
graph_builder.add_edge("chatbot", END)

########## 5. Add MemorySaver ##########

# Initialize memory storage
memory = MemorySaver()

Now compile the graph, specifying interrupt_before on the tools node.
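A minimal sketch of that compile step, continuing from the builder and memory defined above:

```python
# Compile the graph with the checkpointer, interrupting before the "tools" node runs
graph = graph_builder.compile(
    checkpointer=memory,
    interrupt_before=["tools"],
    # interrupt_after=["tools"]  # also possible: interrupt after the node runs
)
```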

Let's check the graph status to make sure it's working properly.

In the previous tutorial, .next was empty because the graph had reached END.

Now, however, .next is set to tools.
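For example (assuming the compiled graph above is bound to graph; the question text and thread_id are arbitrary):

```python
# Run the graph; it will pause before the "tools" node
config = {"configurable": {"thread_id": "1"}}
question = "Search for news about AI"

for event in graph.stream({"messages": [("user", question)]}, config=config):
    for value in event.values():
        value["messages"][-1].pretty_print()

# The graph is paused, so .next reports the pending node
snapshot = graph.get_state(config)
print(snapshot.next)  # ('tools',)
```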

Now let's check the tool call.
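Assuming snapshot was retrieved with graph.get_state as above, the pending tool call sits on the last message:

```python
# The last message holds the tool call the LLM requested
existing_message = snapshot.values["messages"][-1]
print(existing_message.tool_calls)
```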

Next, we will continue the graph from where it left off.

LangGraph makes it easy to continue traversing a graph.

  • Just pass None as input.
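A sketch of resuming, assuming the same config (thread_id) used for the interrupted run:

```python
# Passing None as input resumes execution from the saved checkpoint
events = graph.stream(None, config, stream_mode="values")
for event in events:
    if "messages" in event:
        event["messages"][-1].pretty_print()
```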

We have now added interrupt-based human intervention to our chatbot, allowing a human to supervise and step in when needed. Later, this could serve as the foundation for a UI on top of the system.

Since we have already added a checkpointer, the graph can stay paused indefinitely and be resumed at any time. Below is how to retrieve the state history using the get_state_history method.
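A sketch of iterating the history and picking a checkpoint to replay from (the choice of 3 messages as the selection point is arbitrary):

```python
to_replay = None

# Iterate over all saved checkpoints for this thread, newest first
for state in graph.get_state_history(config):
    print(len(state.values["messages"]), "messages, next:", state.next)
    if len(state.values["messages"]) == 3:
        # Pick an earlier checkpoint to replay from
        to_replay = state
```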

You can specify a desired state through the state history and start over from that point.

It is important to note that a checkpoint is saved for every step of the graph.

The desired point is stored in the to_replay variable, which can then be used to specify where to restart.

to_replay.config contains checkpoint_id.

Providing this checkpoint_id value allows LangGraph's checkpointer to load the state at that point in time.

  • However, in this case, the input value must be passed as None.

Let's check it out with the example below.
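A sketch of replaying from the selected checkpoint, assuming to_replay was set from the state history above:

```python
# The config of the selected checkpoint includes its checkpoint_id
print(to_replay.config)
# e.g. {'configurable': {'thread_id': '1', 'checkpoint_ns': '', 'checkpoint_id': '...'}}

# Passing that config (with input None) resumes from that exact point in time
for event in graph.stream(None, to_replay.config, stream_mode="values"):
    if "messages" in event:
        event["messages"][-1].pretty_print()
```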
