# 10. How to call a tool using ToolNode

## How to call a tool using ToolNode <a href="#toolnode" id="toolnode"></a>

This tutorial covers how to call tools using LangGraph's prebuilt `ToolNode`.

`ToolNode` is a LangChain Runnable that takes the graph state (containing a list of messages) as input and outputs a state update with the results of the tool calls.

It is designed to work out of the box with LangGraph's prebuilt agent, and it can work with any `StateGraph` whose state includes a `messages` key with an appropriate reducer.
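
Conceptually, the input and output look like this. This is a plain-dict sketch for illustration only; in real LangGraph code the entries are `AIMessage`/`ToolMessage` objects and the update is merged into the state by the `messages` reducer:

```python
# Sketch of the state contract (plain dicts instead of real message objects).
# The last input message must carry a tool_calls list.
state_in = {
    "messages": [
        {"type": "ai", "content": "", "tool_calls": [
            {"name": "search_news", "args": {"query": "AI"}, "id": "tool_call_id"}
        ]}
    ]
}

# ToolNode runs each requested tool and returns one tool message per call,
# matched back to the request via tool_call_id.
state_out = {
    "messages": [
        {"type": "tool", "name": "search_news",
         "content": "[...search results...]", "tool_call_id": "tool_call_id"}
    ]
}

print(state_out["messages"][0]["tool_call_id"])
```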

```python
# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()
```

```
 True 
```

```python
# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH17-LangGraph-Modules")
```

```
 Start tracking LangSmith. 
[Project name] 
CH17-LangGraph-Modules 
```

### Tool definition <a href="#id-1" id="id-1"></a>

First, let's define the tools.

```python
from langchain_core.messages import AIMessage
from langchain_core.tools import tool
from langchain_experimental.tools.python.tool import PythonAstREPLTool
from langchain_teddynote.tools import GoogleNews
from typing import List, Dict


# Create a tool
@tool
def search_news(query: str) -> List[Dict[str, str]]:
    """Search Google News by input keyword"""
    news_tool = GoogleNews()
    return news_tool.search_by_keyword(query, k=5)


@tool
def python_code_interpreter(code: str):
    """Call to execute python code."""
    return PythonAstREPLTool().invoke(code)
```

Next, let's look at how to call the tools using `ToolNode`.

```python
from langgraph.prebuilt import ToolNode, tools_condition

# Create a tool list
tools = [search_news, python_code_interpreter]

# Initializing ToolNode
tool_node = ToolNode(tools)
```

### Calling `ToolNode` manually <a href="#toolnode_1" id="toolnode_1"></a>

`ToolNode` operates on graph state that contains a list of messages.

* **Important**: the last message in the list must be an `AIMessage` whose `tool_calls` attribute is populated.

First, let's look at how to call the tool node manually.

```python
# Create an AI message object containing a single tool call.
message_with_single_tool_call = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "search_news",  # tool name
            "args": {"query": "AI"},  # tool argument
            "id": "tool_call_id",  # Tool Call ID
            "type": "tool_call",  # Tool call type
        }
    ],
)

# Invoke the tool node with the message to execute the tool call
tool_node.invoke({"messages": [message_with_single_tool_call]})
```

```
 {'messages': [ToolMessage(content='[{"url": "https://news.google.com/rss/articles/CBMiVkFV...", "content": "中, Meta AI \'Llama\' used to develop military chatbot - ZDNet Korea"}, ...]', name='search_news', tool_call_id='tool_call_id')]} 
```

In general, you do not need to create `AIMessage` objects manually; they are generated automatically by any LangChain chat model that supports tool calling.

Also, if you pass multiple tool calls in the `tool_calls` attribute of an `AIMessage`, `ToolNode` executes them as parallel tool calls.

```python
# Create and initialize an AI message object that includes multiple tool calls.
message_with_multiple_tool_calls = AIMessage(
    content="",
    tool_calls=[
        {
            "name": "search_news",
            "args": {"query": "AI"},
            "id": "tool_call_id_1",  # Tool call IDs must be unique per call
            "type": "tool_call",
        },
        {
            "name": "python_code_interpreter",
            "args": {"code": "print(1+2+3+4)"},
            "id": "tool_call_id_2",
            "type": "tool_call",
        },
    ],
)

# Execute multiple tool calls by passing the message to the tool node
tool_node.invoke({"messages": [message_with_multiple_tool_calls]})
```

```
 {'messages': [ToolMessage(content='[{"url": "https://news.google.com/rss/articles/CBMiVkFV...", "content": "中, Meta AI \'Llama\' used to develop military chatbot - ZDNet Korea"}, ...]', name='search_news', tool_call_id='tool_call_id_1'), ToolMessage(content='10\n', name='python_code_interpreter', tool_call_id='tool_call_id_2')]} 
```

### Using with an LLM <a href="#llm" id="llm"></a>

To use a chat model with tool calling, you must first make the model aware of the available tools.

For a `ChatOpenAI` model, this is done by calling the `.bind_tools` method.

```python
from langchain_openai import ChatOpenAI

# LLM model initialization and tool binding
model_with_tools = ChatOpenAI(model="gpt-4o-mini", temperature=0).bind_tools(tools)
```

```python
# Tool call confirmation
model_with_tools.invoke("Write a python code that prints the first 5 prime numbers.").tool_calls
```

```
 [{'name': 'python_code_interpreter', 'args': {'code': 'def first_n_primes(n):\n    primes = []\n    num = 2  # Starting from the first prime number\n    while len(primes) < n:\n        is_prime = True\n        for i in range(2, int(num**0.5) + 1):\n            if num % i == 0:\n                is_prime = False\n                break\n        if is_prime:\n            primes.append(num)\n        num += 1\n    return primes\n\n# Get the first 5 prime numbers\nfirst_n_primes(5)'}, 'id': 'call_I4g0TPHwqHNQJt5 
```

As you can see, the AI message generated by the chat model already has `tool_calls` populated, so you can pass it directly to `ToolNode`.

```python
# Pass the chat model's response directly to the tool node to execute the requested tool
tool_node.invoke(
    {
        "messages": [
            model_with_tools.invoke(
                "Write a python code that prints the first 5 prime numbers."
            )
        ]
    }
)
```

```
 {'messages': [ToolMessage (content='[2, 3, 5, 7, 11]', name='python_code_interpreter', tool_call_id='call_STkrvwhghY2TvFHE4ih 
```

### Use with Agent <a href="#agent" id="agent"></a>

Next, let's look at how to use `ToolNode` inside a LangGraph graph.

Let's set up the graph implementation for an agent. This **agent** receives a query as input and repeatedly calls tools until it has gathered enough information to answer the query.

We will use `ToolNode` with the tools we just defined, together with an OpenAI model.

```python
# Importing types for LangGraph workflow state and message processing
from langgraph.graph import StateGraph, MessagesState, START, END


# Process messages and generate responses using the LLM model, returning responses that include tool calls.
def call_model(state: MessagesState):
    messages = state["messages"]
    response = model_with_tools.invoke(messages)
    return {"messages": [response]}


# Initialize a message state-based workflow graph
workflow = StateGraph(MessagesState)

# Define agent and tool nodes and add them to the workflow graph.
workflow.add_node("agent", call_model)
workflow.add_node("tools", tool_node)

# Connecting from a workflow start point to an agent node
workflow.add_edge(START, "agent")

# Set a conditional branch from the agent node: route to the tool node,
# or to END when the last message contains no tool calls
workflow.add_conditional_edges("agent", tools_condition)

# Circular connection from the tool node back to the agent node
workflow.add_edge("tools", "agent")


# Compile defined workflow graph and generate executable application
app = workflow.compile()
```
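
The prebuilt `tools_condition` used above decides the branch by inspecting the last message: if it requests tool calls, route to the `"tools"` node; otherwise, end the graph. Below is a rough, self-contained sketch of that logic (`my_tools_condition` and `FakeMessage` are hypothetical stand-ins; the real implementation lives in `langgraph.prebuilt`):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class FakeMessage:
    """Minimal stand-in for an AIMessage, just for this sketch."""
    content: str
    tool_calls: List[dict] = field(default_factory=list)


def my_tools_condition(state: dict) -> str:
    """Route to 'tools' if the last message requests tool calls, else end."""
    last_message = state["messages"][-1]
    if getattr(last_message, "tool_calls", None):
        return "tools"
    return "__end__"  # LangGraph's END sentinel is the string "__end__"


# An AI message that asks for a tool -> routed to the tool node
print(my_tools_condition({"messages": [FakeMessage("", [{"name": "search_news"}])]}))
# An AI message with no tool calls -> the graph finishes
print(my_tools_condition({"messages": [FakeMessage("All done!")]}))
```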

```python
from langchain_teddynote.graphs import visualize_graph

visualize_graph(app)
```

![](https://wikidocs.net/images/page/265763/langgraph-13.jpeg)

I'll run and check the results.

```python
# Run and check the results
for chunk in app.stream(
    {"messages": [("human", "Write a python code that prints the first 5 prime numbers.")]},
    stream_mode="values",
):
    # Print last message
    chunk["messages"][-1].pretty_print()
```

```
 ================================ Human Message ================================= 
Write a python code that prints the first 5 prime numbers. 
================================== Ai Message ================================== 
Tool Calls: 
  python_code_interpreter (call_dIBd3Y7ibDapRosEbMAunfdl) 
 Call ID: call_dIBd3Y7ibDapRosEbMAunfdl 
  Args: 
    code: def first_n_primes(n): 
        primes = [] 
        num = 2  # Starting from the first prime number 
        while len(primes) < n: 
            is_prime = True 
            for i in range(2, int(num**0.5) + 1): 
                if num % i == 0: 
                    is_prime = False 
                    break 
            if is_prime: 
                primes.append(num) 
            num += 1 
        return primes 

    # Get the first 5 prime numbers 
    first_n_primes(5) 
================================= Tool Message ================================= 
Name: python_code_interpreter 

[2, 3, 5, 7, 11] 
================================== Ai Message ================================== 

The first five prime numbers are: [2, 3, 5, 7, 11] 
```

```python
# Perform a search query
for chunk in app.stream(
    {"messages": [("human", "search google news about AI")]},
    stream_mode="values",
):
    chunk["messages"][-1].pretty_print()
```

```
 ================================ Human Message ================================= 
search google news about AI 
================================== Ai Message ================================== 
Tool Calls: 
  search_news (call_prnhiLquU2lQmS6kmmP1UCJh) 
 Call ID: call_prnhiLquU2lQmS6kmmP1UCJh 
  Args: 
    query: AI 
================================= Tool Message ================================= 
Name: search_news 

[{"url": "https://news.google.com/rss/articles/CBMiVkFVX3", "content": "中, Meta AI 'Llama' used to develop military chatbot - ZDNet Korea"}, {"url": "https://news.google.com/rss/articles/CBMiW0FV", "content": "Big Tech Big 4 in an 'AI infrastructure war'... 288 trillion won in investment expected this year - Yonhap News"}, {"url": "https://news.google.com/rss/articles/CBMiekFVX3", "content": "AI decided the fate... NVIDIA pushes out Intel and joins the Dow - Trends"}] 
================================== Ai Message ================================== 
Here are some recent news articles about AI: 

1. **[中, Meta AI 'Llama' used to develop military chatbot - ZDNet Korea](https://news.google.com/rss/articles/CBMiVkFVX)**  
   Meta's AI model 'Llama' is being used to develop a military chatbot. 

2. **[Big Tech Big 4 in an 'AI infrastructure war'... 288 trillion won in investment expected this year - Yonhap News](https://news.google.com/rss/articles/CBMiW0FVX)**  
   The big four tech companies are projected to invest 288 trillion won in AI this year. 

3. **[Sciences favor 'AI and semiconductors'... humanities favor 'business administration' - KBS News](https://news.google.com/rss/articles/CBMiW0FVX)**  
   In the job market, there is strong demand for AI and semiconductor skills, while humanities graduates are leaning toward business studies. 

4. **[Polish station fires staff and introduces AI hosts, closes channel after a week of backlash - AI Times](https://news.google.com/rss/articles/CBMiakFVX3)**  
   A Polish broadcaster that replaced human workers with AI closed its channel after a week due to backlash. 

5. **[AI decided the fate... NVIDIA pushes out Intel and joins the Dow - Trends](https://news.google.com/rss/articles/CBMiekFVX3l)**  
   AI played a crucial role in NVIDIA surpassing Intel to be included in the Dow Jones index. 

Feel free to click on the links for more details! 
```

```python
# Performing queries without requiring tool calls
for chunk in app.stream(
    {"messages": [("human", "Hello? Nice to meet you.")]},
    stream_mode="values",
):
    chunk["messages"][-1].pretty_print()
```

```
 ================================ Human Message ================================= 
Hello? Nice to meet you 
================================== Ai Message ================================== 
Hello! Nice to meet you. How can I help you? 
```

`ToolNode` can also handle errors that occur during tool execution.

You can enable or disable this behavior with the `handle_tool_errors` setting (it is enabled by default, i.e. `handle_tool_errors=True`).
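
Conceptually, `handle_tool_errors=True` wraps each tool invocation in a try/except and returns the error text as the tool result instead of raising, so the model can see the failure and correct itself. Below is a minimal pure-Python sketch of that behavior (`run_tool_safely` and `divide` are illustrative helpers, not LangGraph's actual implementation):

```python
def run_tool_safely(tool_fn, args: dict, handle_tool_errors: bool = True) -> str:
    """Run a tool, optionally converting exceptions into an error string
    (mimicking ToolNode's handle_tool_errors behavior)."""
    try:
        return str(tool_fn(**args))
    except Exception as e:
        if not handle_tool_errors:
            raise
        # With error handling on, the failure becomes the tool's result,
        # so the model can read it and retry with fixed arguments
        return f"Error: {e!r}\nPlease fix your mistakes."


def divide(a: float, b: float) -> float:
    """A toy tool that fails when b == 0."""
    return a / b


print(run_tool_safely(divide, {"a": 1, "b": 2}))  # prints "0.5"
print(run_tool_safely(divide, {"a": 1, "b": 0}))  # prints the error text instead of raising
```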

<br>
