# 07. Multi-Agent Supervisor

## Multi-Agent Supervisor <a href="#multi-agent-supervisor" id="multi-agent-supervisor"></a>

In this tutorial, let's take a look at how to leverage **LangGraph** to build a multi-agent system, efficiently coordinate inter-agent operations, and manage them through a Supervisor.\
We run multiple agents at the same time, have each agent perform its own role, and properly wrap up the flow once the work is complete.

***

**Summary**

Previous tutorials showed how to automatically route messages based on the output of the initial Researcher agent.\
However, as the number of agents grows and they need to be coordinated, simple branch logic alone reaches its limits.\
Here we introduce how to manage agents through an [LLM-based Supervisor](https://langchain-ai.github.io/langgraph/concepts/multi_agent/#supervisor) and coordinate the entire team based on the results of each agent node.

**Focus** :

* The Supervisor brings together various specialized agents and organizes them as a team.
* The Supervisor agent observes the team's progress and performs logic such as calling the appropriate agent at each step or ending the task.

![](https://wikidocs.net/images/page/270690/langgraph-multi-agent-supervisor.png)

***

**What to cover in this tutorial**

* **Setup** : how to install the required packages and set the API keys
* **Tool Creation (Creating Tools)** : define the tools for agents to use, such as web search and plot creation
* **Helper Utilities** : define the utility functions required to generate agent nodes
* **Agent Supervisor Creation (Creating the Supervisor)** : select the worker nodes and create a Supervisor with logic for handling completed work
* **Graph Configuration (Constructing the Graph)** : define the state and worker nodes to configure the entire graph
* **Team Invocation (Invoking the Team)** : call the graph to see how the multi-agent system actually works

In the process, we use LangGraph's prebuilt [create\_react\_agent](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent) function to simplify each agent node.

This "supervisor" approach demonstrates a specific design pattern in LangGraph, and you can combine it with other basic patterns as needed to achieve optimal results.

***

**Reference**

* [LangGraph official document](https://langchain-ai.github.io/langgraph/)
* [Multi-agent Supervisor concept](https://langchain-ai.github.io/langgraph/concepts/multi_agent/#supervisor)
* [documentation of create\_react\_agent function](https://langchain-ai.github.io/langgraph/reference/prebuilt/#langgraph.prebuilt.chat_agent_executor.create_react_agent)

### Environment Setup <a href="#id-1" id="id-1"></a>

```python
# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()
```

```
 True 
```

```python
# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH17-LangGraph-Use-Cases")
```

```
 Start tracking LangSmith. 
[Project name] 
CH17-LangGraph-Use-Cases 
```

Set the model name to use for this tutorial.

```python
from langchain_teddynote.models import get_model_name, LLMs

# Get the latest version of the model name.
MODEL_NAME = get_model_name(LLMs.GPT4)
print(MODEL_NAME)
```

```
 gpt-4o-mini
```

### State Definition <a href="#id-2" id="id-2"></a>

Define the state used by the multi-agent system.

```python
import operator
from typing import Sequence, Annotated
from typing_extensions import TypedDict

from langchain_core.messages import BaseMessage


# state definition
class AgentState(TypedDict):
    messages: Annotated[Sequence[BaseMessage], operator.add]  # messages (concatenated via operator.add)
    next: str  # next agent to route to
```
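Because `messages` is annotated with the `operator.add` reducer, each node returns only its new messages and LangGraph concatenates them onto the existing list. Below is a minimal stdlib-only sketch of that merge behavior, with plain strings standing in for `BaseMessage` objects:

```python
import operator

existing = ["msg-1", "msg-2"]  # messages already stored in the state
update = ["msg-3"]             # partial update returned by a node

# LangGraph applies the reducer declared in Annotated[..., operator.add]
merged = operator.add(existing, update)
print(merged)  # ['msg-1', 'msg-2', 'msg-3']
```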

### Agent creation <a href="#id-3" id="id-3"></a>

#### Creating Tools <a href="#tool" id="tool"></a>

In this example, we create an agent that performs web research using a search engine and an agent that generates plots.

Define the tools to use below.

* **Research** : performs web research using the `TavilySearch` tool.
* **Coder** : runs code using the `PythonREPLTool` tool.

```python
from langchain_teddynote.tools.tavily import TavilySearch
from langchain_experimental.tools import PythonREPLTool

# Initialize Tavily search tool that returns up to 5 search results
tavily_tool = TavilySearch(max_results=5)

# Initialize a Python REPL tool to run code locally (may be unsafe)
python_repl_tool = PythonREPLTool()
```

### Implementing an Agent Creation Utility <a href="#agent-utility" id="agent-utility"></a>

When building a multi-agent system with LangGraph, **helper functions** play an important role in creating and managing agent nodes. These functions increase code reusability and simplify interaction between agents.

* **Agent node generation** : define functions that generate a node for each agent's role
* **Workflow management** : provide utilities to coordinate and optimize the workflow between agents
* **Error handling** : include mechanisms to efficiently handle errors that may occur during agent execution

Below, the `agent_node` function is defined as an example. This function creates an agent node from the given state and agent. Later, we will call this function using `functools.partial`.

```python
from langchain_core.messages import HumanMessage


# Create an agent node using the specified agent and name.
def agent_node(state, agent, name):
    # call agent
    agent_response = agent.invoke(state)
    # Return the agent's last message as a HumanMessage tagged with the worker's name.
    return {
        "messages": [
            HumanMessage(content=agent_response["messages"][-1].content, name=name)
        ]
    }
```
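To see the contract of `agent_node` without calling a real LLM, here is a hedged, stdlib-only sketch: `StubAgent` and `agent_node_sketch` are hypothetical stand-ins that mimic the `{"messages": [...]}` response shape of an agent built with `create_react_agent`.

```python
from functools import partial
from types import SimpleNamespace

class StubAgent:
    """Hypothetical stand-in for an agent built with create_react_agent."""
    def invoke(self, state):
        # Real agents return a dict with a "messages" list of message objects
        return {"messages": [SimpleNamespace(content="stub answer")]}

def agent_node_sketch(state, agent, name):
    # Same contract as agent_node: wrap the agent's last message, tagged with its name
    response = agent.invoke(state)
    return {"messages": [{"name": name, "content": response["messages"][-1].content}]}

# Pre-bind agent and name, just as the tutorial does with functools.partial
stub_node = partial(agent_node_sketch, agent=StubAgent(), name="Researcher")
result = stub_node({"messages": []})
print(result)  # {'messages': [{'name': 'Researcher', 'content': 'stub answer'}]}
```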

**Reference**

The role of `functools.partial`

`functools.partial` is used to create a new function by pre-fixing some of an existing function's positional or keyword arguments. In other words, it helps simplify frequently used function-call patterns.

**Roles**

1. **Generate new functions with predefined values** : returns a new function with some arguments of the existing function specified in advance.
2. **Code brevity** : simplifies frequently used function-call patterns to reduce code redundancy.
3. **Improved readability** : customizes a function's behavior for a specific task, making it more intuitive to use.

> Example code

```python
research_node = functools.partial(agent_node, agent=research_agent, name="Researcher")
```

1. Assume `agent_node` is an existing function.
2. This function can take multiple positional and keyword arguments.
3. `functools.partial` fixes the values `agent=research_agent` and `name="Researcher"` for this function.
4. In other words, when calling `research_node`, you no longer need to specify the `agent` and `name` values separately.
5. For example, instead of `agent_node(state, agent=research_agent, name="Researcher")`, you can simply call `research_node(state)`. Below is an example that creates `research_node` using `functools.partial`.

```python
import functools
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI


# Create a Research Agent
research_agent = create_react_agent(ChatOpenAI(model="gpt-4o"), tools=[tavily_tool])

# Create research node
research_node = functools.partial(agent_node, agent=research_agent, name="Researcher")
```

Run the code to confirm the result.

```python
research_node(
    {
        "messages": [
            HumanMessage(content="Code hello world and print it to the terminal")
        ]
    }
)
```

````
 {'messages': [HumanMessage(content='To print "Hello, World!" to the terminal in various programming languages, you can use the following code snippets: ... (long multi-language answer truncated) ... Choose the language you are working with and run the respective code snippet in the appropriate environment or terminal.', additional_kwargs={}, response_metadata={}, name='Researcher')]}
````

#### Agent Supervisor creation <a href="#agent-supervisor" id="agent-supervisor"></a>

Create a supervisor agent that oversees the worker agents.

```python
from pydantic import BaseModel
from typing import Literal

# Define the member Agent list
members = ["Researcher", "Coder"]

# Define the list of options for selecting the next worker
options_for_next = ["FINISH"] + members


# Define a worker selection response model: Select the next worker or indicate task completion.
class RouteResponse(BaseModel):
    next: Literal[*options_for_next]
```
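The `Literal[*options_for_next]` annotation (unpacking inside a subscript requires Python 3.11+) constrains `next` to exactly the three route names. A static equivalent, written out by hand here for illustration, can be inspected with `typing.get_args`:

```python
from typing import Literal, get_args

# Static equivalent of Literal[*options_for_next] for ["FINISH", "Researcher", "Coder"]
NextRoute = Literal["FINISH", "Researcher", "Coder"]

print(get_args(NextRoute))  # ('FINISH', 'Researcher', 'Coder')
```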

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_openai import ChatOpenAI


# Define the system prompt: a supervisor role that manages the conversation between workers.
system_prompt = (
    "You are a supervisor tasked with managing a conversation between the"
    " following workers:  {members}. Given the following user request,"
    " respond with the worker to act next. Each worker will perform a"
    " task and respond with their results and status. When finished,"
    " respond with FINISH."
)

# ChatPromptTemplate generation
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_prompt),
        MessagesPlaceholder(variable_name="messages"),
        (
            "system",
            "Given the conversation above, who should act next? "
            "Or should we FINISH? Select one of: {options}",
        ),
    ]
).partial(options=str(options_for_next), members=", ".join(members))


# LLM Initialization
llm = ChatOpenAI(model=MODEL_NAME, temperature=0)


# Create a Supervisor Agent
def supervisor_agent(state):
    # Combine prompts and LLM to build a chain
    supervisor_chain = prompt | llm.with_structured_output(RouteResponse)
    # call agent
    return supervisor_chain.invoke(state)
```
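To illustrate the routing decision the supervisor makes, here is a hedged stub that replaces the LLM with a simple heuristic (route to the first member that has not yet reported, then FINISH). This is illustrative only, not the tutorial's actual LLM-based logic:

```python
members = ["Researcher", "Coder"]

def stub_supervisor(state):
    # Which workers have already contributed a message?
    seen = {message.get("name") for message in state["messages"]}
    for member in members:
        if member not in seen:
            return {"next": member}
    return {"next": "FINISH"}

first = stub_supervisor({"messages": []})
second = stub_supervisor({"messages": [{"name": "Researcher"}]})
done = stub_supervisor({"messages": [{"name": "Researcher"}, {"name": "Coder"}]})
print(first, second, done)  # {'next': 'Researcher'} {'next': 'Coder'} {'next': 'FINISH'}
```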

### Graph configuration <a href="#id-4" id="id-4"></a>

Visualize the graph. (Note: the `graph` object is compiled in the cells below, so run those cells first.)

```python
from langchain_teddynote.graphs import visualize_graph

visualize_graph(graph)
```

```python
from langgraph.graph import END, StateGraph, START
from langgraph.checkpoint.memory import MemorySaver

# create a graph
workflow = StateGraph(AgentState)

# add a node to the graph
workflow.add_node("Researcher", research_node)
workflow.add_node("Coder", coder_node)
workflow.add_node("Supervisor", supervisor_agent)


# Add Edge from Member Node > Supervisor Node
for member in members:
    workflow.add_edge(member, "Supervisor")

# Build the conditional routing map
conditional_map = {k: k for k in members}
conditional_map["FINISH"] = END


def get_next(state):
    return state["next"]


# Add conditional edge on supervisor node
workflow.add_conditional_edges("Supervisor", get_next, conditional_map)

# starting point
workflow.add_edge(START, "Supervisor")

# compile the graph
graph = workflow.compile(checkpointer=MemorySaver())
```
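The conditional edge above is just a dictionary lookup on the supervisor's `next` field. A minimal sketch of that dispatch, with a plain string standing in for LangGraph's `END` sentinel:

```python
END = "__end__"  # stand-in for langgraph.graph.END
members = ["Researcher", "Coder"]

# Same construction as in the graph code above
conditional_map = {k: k for k in members}
conditional_map["FINISH"] = END

def get_next(state):
    return state["next"]

print(conditional_map[get_next({"next": "Coder"})])   # Coder
print(conditional_map[get_next({"next": "FINISH"})])  # __end__
```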

```python
import functools
from langgraph.prebuilt import create_react_agent


# Research Agent generation
research_agent = create_react_agent(llm, tools=[tavily_tool])
research_node = functools.partial(agent_node, agent=research_agent, name="Researcher")

code_system_prompt = """
Be sure to use the following font settings in your code for visualization.

##### font settings #####
import platform
import matplotlib.pyplot as plt
from matplotlib import font_manager as fm

# Determine the OS
current_os = platform.system()

if current_os == "Windows":
    # Windows font settings
    font_path = "C:/Windows/Fonts/malgun.ttf"  # Malgun Gothic font path
    fontprop = fm.FontProperties(fname=font_path, size=12)
    plt.rc("font", family=fontprop.get_name())
elif current_os == "Darwin":  # macOS
    # macOS font settings
    plt.rcParams["font.family"] = "AppleGothic"
else:  # Other OS such as Linux
    # Try setting a default Korean font
    try:
        plt.rcParams["font.family"] = "NanumGothic"
    except:
        print("Korean font not found; the system default font will be used.")

##### Prevent broken minus signs #####
plt.rcParams["axes.unicode_minus"] = False  # keep minus signs rendering correctly

"""


# Create a Coder Agent
coder_agent = create_react_agent(
    llm,
    tools=[python_repl_tool],
    state_modifier=code_system_prompt,
)
coder_node = functools.partial(agent_node, agent=coder_agent, name="Coder")
```

This defines the `Researcher` and `Coder` worker nodes that the graph above routes to.

**Invoking the Team**

You can now run the generated graph and confirm how it performs.

```python
from langchain_core.runnables import RunnableConfig
from langchain_teddynote.messages import random_uuid, invoke_graph

# config settings (max recursion count, thread_id)
config = RunnableConfig(recursion_limit=10, configurable={"thread_id": random_uuid()})

# enter your question
inputs = {
    "messages": [
        HumanMessage(
            content="Please visualize the trend of GDP per capita in South Korea from 2010 to 2024 in a graph."
        )
    ],
}

# run the graph
invoke_graph(graph, inputs, config)
```

```
 ================================================== 
🔄 Node: Supervisor 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
next: 
Researcher 
================================================== 

================================================== 
🔄 Node: agent in [Researcher] 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================== Ai Message ================================== 
Tool Calls: 
  tavily_web_search (call_4Q2Se5fqEU7Dj6FzuvSrMmpW) 
 Call ID: call_4Q2Se5fqEU7Dj6FzuvSrMmpW 
  Args: 
    query: South Korea GDP per capita 2010 to 2024 
================================================== 

================================================== 
🔄 Node: tools in [Researcher] 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================= Tool Message ================================= 
Name: tavily_web_search 

[{"title": "South Korea GDP per capita 2024 - StatisticsTimes.com", "url": "https://statisticstimes.com/economy/country/south-korea-gdp-per-capita.php", "content": "GDP per capita of South Korea According to the IMF World Economic Outlook (October 2024), South Korea's GDP per capita in 2024 based on PPP is forecast at 62,960 international dollars. ..."}] 
(...output omitted...) 
================================================== 

================================================== 
🔄 Node: agent in [Researcher] 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================== Ai Message ================================== 

Korea's GDP per capita from 2010 to 2024 is as follows: 

- **2010**: $23,079 
- **2011**: $25,098 
- **2012**: $25,459 
- **2013**: $27,180 
- **2014**: $29,253 
- **2015**: $28,737 
- **2016**: $29,280 
- **2017**: $31,601 
- **2018**: $33,447 
- **2019**: $31,902 
- **2020**: $31,721 
- **2021**: $35,126 
- **2022**: $32,395 
- **2023**: $33,121 
- **2024 (forecast)**: $36,132 

Based on this data, I will visualize the graph.  


import matplotlib.pyplot as plt

years = [2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024]
gdp_per_capita = [23079, 25098, 25459, 27180, 29253, 28737, 29280, 31601, 33447, 31902, 31721, 35126, 32395, 33121, 36132]
plt.figure(figsize=(10, 5))
plt.plot(years, gdp_per_capita, marker='o')
plt.title('South Korea GDP per Capita (2010-2024)')
plt.xlabel('Year')
plt.ylabel('GDP per Capita (USD)')
plt.xticks(years)
plt.grid()
plt.show()


By running the above code, you can see a graph visualizing the GDP trend per capita in Korea. 
================================================== 

================================================== 
🔄 Node: Researcher 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================ Human Message ================================= 
Name: Researcher 

Korea's GDP per capita from 2010 to 2024 is as follows: 

- **2010**: $23,079 
- **2011**: $25,098 
- **2012**: $25,459 
- **2013**: $27,180 
- **2014**: $29,253 
- **2015**: $28,737 
- **2016**: $29,280 
- **2017**: $31,601 
- **2018**: $33,447 
- **2019**: $31,902 
- **2020**: $31,721 
- **2021**: $35,126 
- **2022**: $32,395 
- **2023**: $33,121 
- **2024 (forecast)**: $36,132 

Based on this data, I will visualize the graph.  


import matplotlib.pyplot as plt

years = [2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024]
gdp_per_capita = [23079, 25098, 25459, 27180, 29253, 28737, 29280, 31601, 33447, 31902, 31721, 35126, 32395, 33121, 36132]
plt.figure(figsize=(10, 5))
plt.plot(years, gdp_per_capita, marker='o')
plt.title('South Korea GDP per Capita (2010-2024)')
plt.xlabel('Year')
plt.ylabel('GDP per Capita (USD)')
plt.xticks(years)
plt.grid()
plt.show()


By running the above code, you can see a graph visualizing the GDP trend per capita in Korea. 
================================================== 

================================================== 
🔄 Node: Supervisor 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
next: 
Coder 
================================================== 
```

```
================================================== 
🔄 Node: agent in [Coder] 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================== Ai Message ================================== 
Tool Calls: 
  Python_REPL (call_1XGmCbnhcrVJctuWVgEIwjoI) 
 Call ID: call_1XGmCbnhcrVJctuWVgEIwjoI 
  Args: 
    query: import matplotlib.pyplot as plt 

# data 
years = [2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024] 
gdp_per_capita = [23079, 25098, 25459, 27180, 29253, 28737, 29280, 31601, 33447, 31902, 31721, 35126, 32395, 33121, 36132] 

# Graph settings 
plt.figure(figsize=(10, 5)) 
plt.plot(years, gdp_per_capita, marker='o') 
plt.title('South Korea GDP per Capita (2010-2024)') 
plt.xlabel('Year') 
plt.ylabel('GDP per Capita (USD)') 
plt.xticks(years) 
plt.grid() 

# Graph output 
plt.show() 
================================================== 

================================================== 
🔄 Node: tools in [Coder] 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================= Tool Message ================================= 
Name: Python_REPL 


================================================== 

================================================== 
🔄 Node: agent in [Coder] 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================== Ai Message ================================== 

Graphs were successfully visualized. Below is a graph showing GDP trends per capita in Korea from 2010 to 2024. 

![South Korea GDP per Capita (2010-2024)](attachment://South_Korea_GDP_per_Capita_2010-2024.png) 

This graph shows the overall increase in GDP per capita in Korea. GDP in 2024 is included as an estimate. 
================================================== 

================================================== 
🔄 Node: Coder 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
================================ Human Message ================================= 
Name: Coder 

Graphs were successfully visualized. Below is a graph showing GDP trends per capita in Korea from 2010 to 2024. 

![South Korea GDP per Capita (2010-2024)](attachment://South_Korea_GDP_per_Capita_2010-2024.png) 

This graph shows the overall increase in GDP per capita in Korea. GDP in 2024 is included as an estimate. 
================================================== 

================================================== 
🔄 Node: Supervisor 🔄 
- - - - - - - - - - - - - - - - - - - - - - - - - - - -  
next: 
FINISH 
================================================== 
```

