# 04. Add memory to Agent

## Add memory to Agent <a href="#agent-memory" id="agent-memory"></a>

Currently, the chatbot cannot remember its past interactions on its own, so there are limits to conducting consistent multi-turn conversations.

To solve this, in this tutorial we add **memory**.

**Reference**

This time we use the pre-built `ToolNode` and `tools_condition`.

1. [ToolNode](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.tool_node.ToolNode): a node for executing tool calls
2. [tools\_condition](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.tool_node.tools_condition): a conditional branch based on whether a tool is called
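As a mental model, `tools_condition` routes from the chatbot node to the tool node when the last message contains tool calls, and to `END` otherwise. Below is a minimal pure-Python sketch of that routing decision — an illustration only, not the actual library source:

```python
from types import SimpleNamespace


def route_tools(state: dict) -> str:
    """Toy sketch of the decision tools_condition makes (illustration only)."""
    last_message = state["messages"][-1]
    if getattr(last_message, "tool_calls", None):
        return "tools"  # the LLM requested a tool call -> go to the tool node
    return "__end__"  # no tool call -> finish the turn


# An AI message that requests a tool call vs. one that does not
with_call = SimpleNamespace(tool_calls=[{"name": "tavily_search"}])
plain = SimpleNamespace(tool_calls=[])

print(route_tools({"messages": [with_call]}))  # tools
print(route_tools({"messages": [plain]}))  # __end__
```

The real `tools_condition` performs this check on the graph state, which is why it can be passed directly to `add_conditional_edges` on the chatbot node.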

Our chatbot can now use tools to answer user questions, but it does not remember the **context** of previous interactions. This limits its ability to conduct multi-turn conversations.

`LangGraph` solves this problem with **persistent checkpointing**.

If you provide a `checkpointer` when compiling the graph and a `thread_id` when calling it, `LangGraph` **automatically saves the state** after each step. When you call the graph again with the same `thread_id`, the graph loads the saved state, so the chatbot can continue the conversation from where it previously left off.

**Checkpointing** is much more powerful than LangChain's memory feature. (You will likely see this for yourself as you work through this tutorial.)
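Conceptually, a checkpointer is just a store keyed by `thread_id` that records the state after every step. The toy sketch below (plain Python, deliberately not the actual LangGraph API) illustrates why the same `thread_id` resumes a conversation while a new one starts fresh:

```python
# Toy illustration (plain Python, not the actual LangGraph API): a checkpointer
# is essentially a store keyed by thread_id that saves the state after each step.
class ToyCheckpointer:
    def __init__(self):
        self._store = {}  # thread_id -> list of saved states (one per step)

    def save(self, thread_id, state):
        self._store.setdefault(thread_id, []).append(
            {"messages": list(state["messages"])}
        )

    def load(self, thread_id):
        history = self._store.get(thread_id)
        if history:
            return {"messages": list(history[-1]["messages"])}
        return {"messages": []}  # a new thread starts with an empty state


saver = ToyCheckpointer()

# First call on thread "1": the state is saved after the step completes.
state = saver.load("1")
state["messages"].append(("user", "My name is Teddy."))
saver.save("1", state)

# A later call with the same thread_id resumes from the saved state...
resumed = saver.load("1")
print(len(resumed["messages"]))  # 1

# ...while a different thread_id starts from scratch.
fresh = saver.load("2")
print(len(fresh["messages"]))  # 0
```

The real checkpointer saves much richer information (config, pending nodes, per-step history), but the thread-keyed save/load cycle is the core idea.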

```python
# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()
```

```
 True 
```

```python
# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH17-LangGraph-Modules")
```

```
 LangSmith tracking started. 
[Project name] 
CH17-LangGraph-Modules 
```

But before going further, let's add **checkpointing** to enable multi-turn conversations.

Create a checkpointer with `MemorySaver`.

```python
from langgraph.checkpoint.memory import MemorySaver

# Create a memory store
memory = MemorySaver()
```

**Reference**

In this tutorial we use an `in-memory checkpointer`.

In production, however, you can switch this to `SqliteSaver` or `PostgresSaver` and connect it to your own database.
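The toy sketch below (standard-library `sqlite3` and `json` only, deliberately **not** the real `SqliteSaver` API) illustrates why a database-backed saver matters: each thread's state is written to durable storage, so checkpoints survive a process restart, unlike the in-memory saver.

```python
import json
import sqlite3


# Toy sketch only -- NOT the real SqliteSaver API. It shows the core idea:
# persist each thread's latest state in a database keyed by thread_id.
class ToySqliteSaver:
    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS checkpoints "
            "(thread_id TEXT PRIMARY KEY, state TEXT)"
        )

    def save(self, thread_id, state):
        self.conn.execute(
            "INSERT OR REPLACE INTO checkpoints VALUES (?, ?)",
            (thread_id, json.dumps(state)),
        )
        self.conn.commit()

    def load(self, thread_id):
        row = self.conn.execute(
            "SELECT state FROM checkpoints WHERE thread_id = ?", (thread_id,)
        ).fetchone()
        return json.loads(row[0]) if row else {"messages": []}


saver = ToySqliteSaver()
saver.save("1", {"messages": [["user", "hello"]]})
print(saver.load("1"))  # {'messages': [['user', 'hello']]}
```

With a file path instead of `":memory:"`, the saved state would persist across runs; the real `SqliteSaver`/`PostgresSaver` classes handle this (plus full checkpoint history) for you.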

```python
from typing import Annotated
from typing_extensions import TypedDict
from langchain_openai import ChatOpenAI
from langchain_teddynote.tools.tavily import TavilySearch
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition


########## 1. State Definition ##########
# State Definition
class State(TypedDict):
    # Message list, accumulated by the add_messages reducer
    messages: Annotated[list, add_messages]


########## 2. Tool definition and binding ##########
# Initialize tool
tool = TavilySearch(max_results=3)
tools = [tool]

# LLM Initialization
llm = ChatOpenAI(model="gpt-4o-mini")

# Combining tools and LLM
llm_with_tools = llm.bind_tools(tools)


########## 3. Add nodes ##########
# Defining a chatbot function
def chatbot(state: State):
    # Invoke the LLM and return the response message
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


# Create a state graph
graph_builder = StateGraph(State)

# Add a chatbot node
graph_builder.add_node("chatbot", chatbot)

# Create a tool node
tool_node = ToolNode(tools=[tool])

# Add tool node
graph_builder.add_node("tools", tool_node)

# Conditional Edge
graph_builder.add_conditional_edges(
    "chatbot",
    tools_condition,
)

########## 4. add edge ##########

# tools > chatbot
graph_builder.add_edge("tools", "chatbot")

# START > chatbot
graph_builder.add_edge(START, "chatbot")

# chatbot > END
graph_builder.add_edge("chatbot", END)
```

Finally, compile the graph with the `checkpointer` we created.

```python
# Compile the graph builder
graph = graph_builder.compile(checkpointer=memory)
```

The graph connectivity is the same as in the previous `LangGraph-Agent` tutorial.

The only addition is that the graph checkpoints the `State` as it processes each node.

```python
from langchain_teddynote.graphs import visualize_graph

# Graph visualization
visualize_graph(graph)
```

![](https://wikidocs.net/images/page/265658/langgraph-01.jpeg)

### RunnableConfig Settings <a href="#runnableconfig" id="runnableconfig"></a>

Define a `RunnableConfig` and set `recursion_limit` and `thread_id`.

* `recursion_limit`: the maximum number of nodes to visit; exceeding it raises a `RecursionError`
* `thread_id`: the thread ID for the conversation session

The `thread_id` is used to separate conversation sessions. In other words, memory is stored individually for each `thread_id`.

```python
from langchain_core.runnables import RunnableConfig

config = RunnableConfig(
    recursion_limit=10,  # Visit up to 10 nodes, more than that will result in RecursionError
    configurable={"thread_id": "1"},  # Thread ID setting
)
```

```python
# First question
question = (
    "My name is `Teddy Note`. I run a YouTube channel. Nice to meet you."
)

for event in graph.stream({"messages": [("user", question)]}, config=config):
    for value in event.values():
        value["messages"][-1].pretty_print()
```

```
 ================================== Ai Message ================================== 

Hello, Teddy Note! Nice to meet you. It's great that you run a YouTube channel! What kind of content do you mainly cover? 
```

```python
# Follow-up question
question = "What did you say my name was?"

for event in graph.stream({"messages": [("user", question)]}, config=config):
    for value in event.values():
        value["messages"][-1].pretty_print()
```

```
 ================================== Ai Message ================================== 

Your name is Teddy Note. 
```

This time, let's change the `thread_id` in the `RunnableConfig` and then ask whether the chatbot remembers the previous conversation.

```python
from langchain_core.runnables import RunnableConfig

question = "What did you say my name was?"

config = RunnableConfig(
    recursion_limit=10,  # Visit up to 10 nodes, more than that will result in RecursionError
    configurable={"thread_id": "2"},  # Thread ID setting
)

for event in graph.stream({"messages": [("user", question)]}, config=config):
    for value in event.values():
        value["messages"][-1].pretty_print()
```

```
 ================================== Ai Message ================================== 

Sorry, I don't have the ability to remember your name. If you haven't mentioned it earlier in this conversation, I have no way of knowing it. Please tell me your name! 
```

### Snapshot: Check the saved State <a href="#state" id="state"></a>

So far, we have created several checkpoints across two different threads.

A `checkpoint` contains the current state values, the corresponding config, and the `next` node to be processed.

To inspect the graph `state` for a given config at any time, call `get_state(config)`.

```python
from langchain_core.runnables import RunnableConfig

config = RunnableConfig(
    configurable={"thread_id": "1"},  # Thread ID setting
)
# Create a graph state snapshot
snapshot = graph.get_state(config)
snapshot.values["messages"]
```

```
 [HumanMessage(content='My name is `Teddy Note`. I run a YouTube channel. Nice to meet you.', additional_kwargs={}, response_metadata={}, id='0b4fb756-225d-49a0-a90b-acb39cb42d48'),
 AIMessage(content="Hello, Teddy Note! Nice to meet you. It's great that you run a YouTube channel! What kind of content do you mainly cover?", additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 45, 'prompt_tokens': 105, 'total_tokens': 150}, ...}, id='run-c9143f45-843c-478c-9360-47d2bce2207e-0', usage_metadata={'input_tokens': 105, 'output_tokens': 45, 'total_tokens': 150}),
 HumanMessage(content='What did you say my name was?', additional_kwargs={}, response_metadata={}, id='c0c1ad0a-7c51-49ba-9aa8-fa125b501ab2'),
 AIMessage(content='Your name is Teddy Note.', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 12, ...}, 'finish_reason': 'stop', 'logprobs': None}, id='run-e65d423b-fb03-4237-8f51-15d8b56df860-0', usage_metadata={'input_tokens': 192, 'output_tokens': 12, 'total_tokens': 204})]
```

By printing `snapshot.config`, you can check the config that was set.

```python
# Set config information
snapshot.config
```

```
 {'configurable': {'thread_id': '1','checkpoint_ns':'','checkpoint_id': '1ef99493-62b0-6020-8007-4fb8c65ebabc'}} 
```

By printing `snapshot.values`, you can check the state values stored so far.

```python
# stored values
snapshot.values
```

```
 {'messages': [HumanMessage(content='My name is `Teddy Note`. I run a YouTube channel. Nice to meet you.', additional_kwargs={}, response_metadata={}, id='0b4fb756-225d-49a0-a90b-acb39cb42d48'),
 AIMessage(content="Hello, Teddy Note! Nice to meet you. It's great that you run a YouTube channel! What kind of content do you mainly cover?", additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 45, 'prompt_tokens': 105, 'total_tokens': 150}, ...}, id='run-c9143f45-843c-478c-9360-47d2bce2207e-0', usage_metadata={'input_tokens': 105, 'output_tokens': 45, 'total_tokens': 150}),
 HumanMessage(content='What did you say my name was?', additional_kwargs={}, response_metadata={}, id='c0c1ad0a-7c51-49ba-9aa8-fa125b501ab2'),
 AIMessage(content='Your name is Teddy Note.', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 12, ...}, 'finish_reason': 'stop', 'logprobs': None}, id='run-e65d423b-fb03-4237-8f51-15d8b56df860-0', usage_metadata={'input_tokens': 192, 'output_tokens': 12, 'total_tokens': 204})]}
```

By printing `snapshot.next`, you can **check which node will be processed next** from this point.

Since **END** has been reached, the next node is output as an empty value.

```python
# next node
snapshot.next
```

```
 () 
```

```python
snapshot.metadata["writes"]["chatbot"]["messages"][0]
```

```
 AIMessage(content='Your name is Teddy Note.', additional_kwargs={'refusal': None}, response_metadata={'token_usage': {'completion_tokens': 12, ...}, 'finish_reason': 'stop', 'logprobs': None}, id='run-e65d423b-fb03-4237-8f51-15d8b56df860-0', usage_metadata={'input_tokens': 192, 'output_tokens': 12, 'total_tokens': 204})
```

To visualize metadata with a complex structure, use the `display_message_tree` function.

```python
from langchain_teddynote.messages import display_message_tree

# Metadata (output in tree format)
display_message_tree(snapshot.metadata)
```

![](https://wikidocs.net/images/page/265658/langgraph-02.png)

<br>
