# 15. Everything about LangGraph streaming modes

## Everything about LangGraph streaming modes <a href="#langgraph" id="langgraph"></a>

This guide covers how to **stream** the entire state of a `graph`. LangGraph supports multiple streaming modes.

The main modes are:

* `values` : Streams the values of the graph. This is the **full state of the graph** after each node is called.
* `updates` : Streams the graph's updates. This is the **update to the graph state** after each node is called.
* `messages` : Streams the messages from each node. In this mode, **token-by-token streaming of LLM output** is also possible.
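To make the difference concrete, here is a minimal plain-Python sketch (no LangGraph required; the node names, messages, and state shape are illustrative only) that derives `updates`-style and `values`-style chunks from the same hypothetical two-node run:

```python
# Hypothetical per-node updates from a two-node run (illustrative only).
node_updates = [
    ("chatbot", {"messages": ["ai: calling tool"]}),
    ("tools", {"messages": ["tool: search results"]}),
]

# "updates" mode: one chunk per node, holding only what that node returned.
updates_chunks = [{node: update} for node, update in node_updates]

# "values" mode: one chunk per step, holding the full accumulated state
# (the initial input state is emitted first).
state = {"messages": ["human: question"]}
values_chunks = [dict(state)]
for _, update in node_updates:
    state = {"messages": state["messages"] + update["messages"]}
    values_chunks.append(dict(state))

print(updates_chunks[0])   # only the chatbot node's update
print(values_chunks[-1])   # the full state after every node ran
```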

### Setup <a href="#id-1" id="id-1"></a>

```python
# Configuration file for managing the API key as an environment variable
from dotenv import load_dotenv

# Load the API key information
load_dotenv()
```

```
 True 
```

```python
# Set up LangSmith tracing. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter the project name.
logging.langsmith("CH17-LangGraph")
```

```
 Start tracking LangSmith. 
[Project name] 
CH17-LangGraph 
```

### Define graph <a href="#id-2" id="id-2"></a>

I will use a simple agent in this guide.

```python
from typing import Annotated
from typing_extensions import TypedDict
from langchain.tools import tool
from langchain_teddynote.tools import GoogleNews
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition


########## 1. State definition ##########
class State(TypedDict):
    # Message list, annotated with the add_messages reducer
    messages: Annotated[list, add_messages]


########## 2. Tool definition and binding ##########
# Create a tool that searches news by keyword
@tool
def search_keyword(query: str) -> str:
    """Look up news by keyword"""
    news_tool = GoogleNews()
    return "\n".join(
        f'- {news["content"]}' for news in news_tool.search_by_keyword(query, k=5)
    )


# Tool list
tools = [search_keyword]

# Initialize the LLM
llm = ChatOpenAI(model="gpt-4o-mini")

# Bind the tools to the LLM
llm_with_tools = llm.bind_tools(tools)


########## 3. Nodes ##########
# Chatbot node function
def chatbot(state: State):
    # Invoke the LLM and return the new message
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


# Create the state graph
graph_builder = StateGraph(State)

# Add the chatbot node
graph_builder.add_node("chatbot", chatbot)

# Create the tool node
tool_node = ToolNode(tools=[search_keyword])

# Add the tool node
graph_builder.add_node("tools", tool_node)

# Conditional edge: route to tools when the LLM makes a tool call
graph_builder.add_conditional_edges(
    "chatbot",
    tools_condition,
)

########## 4. Edges ##########

# tools > chatbot
graph_builder.add_edge("tools", "chatbot")

# START > chatbot
graph_builder.add_edge(START, "chatbot")

# chatbot > END
graph_builder.add_edge("chatbot", END)

# Compile the graph
graph = graph_builder.compile()
```

```python
from langchain_teddynote.graphs import visualize_graph

visualize_graph(graph)
```

![](https://wikidocs.net/images/page/265770/langgraph-23.jpeg)

### Step-by-step output of the node <a href="#id-3" id="id-3"></a>

**Streaming modes**

* `values` : outputs the current state value at each step
* `updates` : outputs only the state updates at each step (default)
* `messages` : outputs the messages at each step

Streaming here does not mean token-by-token streaming of LLM output; it means emitting output step by step as the graph runs.

#### `stream_mode = "values"` <a href="#stream_mode-values" id="stream_mode-values"></a>

`values` mode outputs the current state value at each step.

**Reference**

`chunk.items()`

* `key` : the state key
* `value` : the value stored under that state key

**Synchronous streaming**

* `chunk` is a dictionary (key: state key, value: state value)

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Synchronous stream processing (stream_mode="values")
for chunk in graph.stream(inputs, stream_mode="values"):
    # chunk is a dictionary (key: state key, value: state value)
    for state_key, state_value in chunk.items():
        if state_key == "messages":
            state_value[-1].pretty_print()
```

```
 ================================ Human Message ================================= 

Search for the latest news related to AI 
================================== Ai Message ================================== 
Tool Calls: 
  search_keyword (call_UtfLmsQ5mOYxjIfK1zTVm7D4) 
 Call ID: call_UtfLmsQ5mOYxjIfK1zTVm7D4 
  Args: 
    query: AI 
================================= Tool Message ================================= 
Name: search_keyword 

-中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-President of Taekwon SK Group “The AI market expansion before and after 2027...O/I Hurry up ” -Electronic 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel·Nvidia fate... Deodoru cataclysm "historical moment"-Unified News 
================================== Ai Message ================================== 

The latest AI-related news is: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

If you have any further questions, please tell me! 
```

**Asynchronous streaming**

**Reference**

* The `astream()` method runs the graph as an asynchronous stream and yields responses chunk by chunk in `values` mode.
* Use an `async for` loop to process the asynchronous stream.

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Asynchronous stream processing (stream_mode="values")
async for chunk in graph.astream(inputs, stream_mode="values"):
    # chunk is a dictionary (key: state key, value: state value)
    for state_key, state_value in chunk.items():
        if state_key == "messages":
            state_value[-1].pretty_print()
```

```
 ================================ Human Message ================================= 

Search for the latest news related to AI 
================================== Ai Message ================================== 
Tool Calls: 
  search_keyword (call_fcwR1gNrgl8htN52ZfqtvZRS) 
 Call ID: call_fcwR1gNrgl8htN52ZfqtvZRS 
  Args: 
    query: AI 
================================= Tool Message ================================= 
Name: search_keyword 

-中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-President of Taekwon SK Group “The AI market expansion before and after 2027...O/I Hurry up ” -Electronic 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel·Nvidia fate... Deodoru cataclysm "historical moment"-Unified News 
================================== Ai Message ================================== 

Here are the latest news related to AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

These news covers various applications of AI and changes within the industry. 
```

If you only want to check the final result, process it like this:

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

final_result = None

# Asynchronous stream processing (stream_mode="values")
async for chunk in graph.astream(inputs, stream_mode="values"):
    final_result = chunk

# Print the final result
print(final_result["messages"][-1].content)
```

```
 The latest AI-related news is: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Tell us if you need more details! 
```

#### `stream_mode = "updates"` <a href="#stream_mode-updates" id="stream_mode-updates"></a>

`updates` mode outputs only the updated state at each step.

* The output is a `dictionary` with the node name as the key and the updated values as the value.

**Reference**

`chunk.items()`

* `key` : the name of the node
* `value` : the output at that node's step, i.e. a dictionary that may contain multiple key-value pairs.

**Synchronous streaming**

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Synchronous stream processing (stream_mode="updates")
for chunk in graph.stream(inputs, stream_mode="updates"):
    # chunk is a dictionary (key: node name, value: that node's state update)
    for node, value in chunk.items():
        if node:
            print(f"\n[Node: {node}]\n")
        if "messages" in value:
            value["messages"][-1].pretty_print()
```

```
 [Node: chatbot] 

================================== Ai Message ================================== 
Tool Calls: 
  search_keyword (call_NVvgX5iVK44aePNoO40Kulvj) 
 Call ID: call_NVvgX5iVK44aePNoO40Kulvj 
  Args: 
    query: AI 

[Node: tools] 

================================= Tool Message ================================= 
Name: search_keyword 

-中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-President of Taekwon SK Group “The AI market expansion before and after 2027...O/I Hurry up ” -Electronic 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel·Nvidia fate... Deodoru cataclysm "historical moment"-Unified News 

[Node: chatbot] 

================================== Ai Message ================================== 

Here are the latest news related to AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumbing... 'Romance' chat popular [AIbriefing]** -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... 子 Semiconductor Industry, Modular Speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

For more information, please select a specific article! 
```

**Asynchronous streaming**

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Asynchronous stream processing (stream_mode="updates")
async for chunk in graph.astream(inputs, stream_mode="updates"):
    # chunk is a dictionary (key: node name, value: that node's state update)
    for node, value in chunk.items():
        if node:
            print(f"\n[Node: {node}]\n")
        if "messages" in value:
            value["messages"][-1].pretty_print()
```

```
 [Node: chatbot] 

================================== Ai Message ================================== 
Tool Calls: 
  search_keyword (call_rOEV3zUW5PsgGh1gQXJfMmlD) 
 Call ID: call_rOEV3zUW5PsgGh1gQXJfMmlD 
  Args: 
    query: AI 

[Node: tools] 

================================= Tool Message ================================= 
Name: search_keyword 

-中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-President of Taekwon SK Group “The AI market expansion before and after 2027...O/I Hurry up ” -Electronic 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel·Nvidia fate... Deodoru cataclysm "historical moment"-Unified News 

[Node: chatbot] 

================================== Ai Message ================================== 

Here is the latest news on AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

These news cover developments in various fields of AI and the resulting industrial changes. 
```

#### `stream_mode = "messages"` <a href="#stream_mode-messages" id="stream_mode-messages"></a>

`messages` mode streams the messages at each step.

**Reference**

* `chunk` is a tuple with two elements.
* `chunk_msg` : the message streamed in real time
* `metadata` : node information
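Conceptually, the stream can be modeled as a sequence of `(chunk_msg, metadata)` tuples. The sketch below uses stand-in objects (not real LangChain message types; the contents are made up) to show how chatbot tokens are joined back into one answer:

```python
from types import SimpleNamespace

# Stand-in chunks; in the real stream these are AIMessageChunk objects.
stream = [
    (SimpleNamespace(content="The latest"), {"langgraph_node": "chatbot"}),
    (SimpleNamespace(content=" AI news"), {"langgraph_node": "chatbot"}),
    (SimpleNamespace(content="- headline 1"), {"langgraph_node": "tools"}),
]

# Keep only tokens emitted by the chatbot node and join them.
answer = "".join(
    chunk_msg.content
    for chunk_msg, metadata in stream
    if metadata["langgraph_node"] == "chatbot"
)
print(answer)
```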

**Synchronous streaming**

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Synchronous stream processing (stream_mode="messages")
# chunk_msg: streamed message, metadata: node information
for chunk_msg, metadata in graph.stream(inputs, stream_mode="messages"):
    # Print only messages emitted by the chatbot node
    if metadata["langgraph_node"] == "chatbot":
        if chunk_msg.content:
            print(chunk_msg.content, end="", flush=True)

    else:
        print(chunk_msg.content)
        print(f"\n\nmetadata: \n{metadata}\n\n")
```

```
 -中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-President of Taekwon SK Group “The AI market expansion before and after 2027...O/I Hurry up ” -Electronic 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel·Nvidia fate... Deodoru cataclysm "historical moment"-Unified News 


metadata:  
{'langgraph_step': 2, 'langgraph_node': 'tools', 'langgraph_triggers': ['branch:chatbot:tools_condition:tools'], 'langgraph_path': ('__pregel_pull', 'tools')} 


The latest AI-related news is: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Please tell me if you have any further questions! 
```

**Asynchronous streaming**

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Asynchronous stream processing (stream_mode="messages")
# chunk_msg: streamed message, metadata: node information
async for chunk_msg, metadata in graph.astream(inputs, stream_mode="messages"):
    # Print only messages emitted by the chatbot node
    if metadata["langgraph_node"] == "chatbot":
        if chunk_msg.content:
            print(chunk_msg.content, end="", flush=True)
    else:
        print(chunk_msg.content)
```

```
 -中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-President of Taekwon SK Group “The AI market expansion before and after 2027...O/I Hurry up ” -Electronic 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel·Nvidia fate... Deodoru cataclysm "historical moment"-Unified News 
The latest AI-related news is: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumbing... 'Romance' chat popular** [AIbriefing] -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... 子 Semiconductor Industry, Modifier Speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Tell us if you need more details! 
```

### Streaming output to specific nodes <a href="#id-4" id="id-4"></a>

**Reference**

* Filtering on `metadata["langgraph_node"]` lets you print only the messages emitted by a specific node.

```python
from typing import Annotated
from typing_extensions import TypedDict
from langchain.tools import tool
from langchain_teddynote.tools import GoogleNews
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition


########## 1. State definition ##########
class State(TypedDict):
    # Message list, annotated with the add_messages reducer
    messages: Annotated[list, add_messages]


########## 2. Tool definition and binding ##########
# Create a tool that searches news by keyword
@tool
def search_keyword(query: str) -> str:
    """Look up news by keyword"""
    news_tool = GoogleNews()
    return "\n".join(
        f'- {news["content"]}' for news in news_tool.search_by_keyword(query, k=5)
    )


# Tool list
tools = [search_keyword]

# Initialize the LLM
llm = ChatOpenAI(model="gpt-4o-mini")

# Bind the tools to the LLM (with tags added)
llm_with_tools = llm.bind_tools(tools).with_config(tags=["WANT_TO_STREAM"])

########## 3. Nodes ##########
# Chatbot node function
def chatbot(state: State):
    # Invoke the LLM and return the new message
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


# Create the state graph
graph_builder = StateGraph(State)

# Add the chatbot node
graph_builder.add_node("chatbot", chatbot)

# Create the tool node
tool_node = ToolNode(tools=[search_keyword])

# Add the tool node
graph_builder.add_node("tools", tool_node)

# Conditional edge: route to tools when the LLM makes a tool call
graph_builder.add_conditional_edges(
    "chatbot",
    tools_condition,
)

########## 4. Edges ##########

# tools > chatbot
graph_builder.add_edge("tools", "chatbot")

# START > chatbot
graph_builder.add_edge(START, "chatbot")

# chatbot > END
graph_builder.add_edge("chatbot", END)

# Compile the graph
graph = graph_builder.compile()
```

```python
visualize_graph(graph, xray=True)
```

![](https://wikidocs.net/images/page/265770/langgraph-24.jpeg)

If you want to stream output from a specific node, set `stream_mode="messages"`.

With `stream_mode="messages"`, each item arrives as a (`chunk_msg`, `metadata`) tuple, where `chunk_msg` is the message streamed in real time and `metadata` is the node information.

Filtering on `metadata["langgraph_node"]` lets you print only the messages emitted by a specific node.

(Example) To keep only the messages emitted by the chatbot node:

`metadata["langgraph_node"] == "chatbot"`
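As a quick sanity check of that filter, here is a plain-Python sketch with stand-in message classes (not the real LangChain types; the contents are made up). Only non-empty chatbot content that is not a human message survives:

```python
class FakeHumanMessage:
    """Stand-in for langchain_core's HumanMessage."""
    def __init__(self, content):
        self.content = content


class FakeAIChunk:
    """Stand-in for a streamed AI message chunk."""
    def __init__(self, content):
        self.content = content


stream = [
    (FakeHumanMessage("question"), {"langgraph_node": "chatbot"}),
    (FakeAIChunk(""), {"langgraph_node": "chatbot"}),       # empty tool-call chunk
    (FakeAIChunk("answer"), {"langgraph_node": "chatbot"}),
    (FakeAIChunk("tool output"), {"langgraph_node": "tools"}),
]

kept = [
    chunk_msg.content
    for chunk_msg, metadata in stream
    if (
        chunk_msg.content
        and not isinstance(chunk_msg, FakeHumanMessage)
        and metadata["langgraph_node"] == "chatbot"
    )
]
print(kept)
```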

```python
from langchain_core.messages import HumanMessage

# Build the input data with the user's message
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Streaming via stream_mode="messages"
for chunk_msg, metadata in graph.stream(inputs, stream_mode="messages"):
    # Print only valid content from the final node, excluding HumanMessage
    if (
        chunk_msg.content
        and not isinstance(chunk_msg, HumanMessage)
        and metadata["langgraph_node"] == "chatbot"
    ):
        print(chunk_msg.content, end="", flush=True)
```

```
 Here is the latest news on AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Extreme SK Group Chairman “AI market expansion before and after 2027...O/I Hurry up ”** - Electronic newspaper 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Please tell me if you have any further questions! 
```

You can check the node information by outputting metadata.

```python
metadata
```

```
{'langgraph_step': 3, 'langgraph_node': 'chatbot', 'langgraph_triggers': ['tools'], 'langgraph_path': ('__pregel_pull', 'chatbot')} 
```

### Streaming filtered by custom `tags` <a href="#tag" id="tag"></a>

If LLM output is produced in multiple places, you may want to print only the messages emitted by a specific node.

In that case, you can add `tags` to select only the nodes whose output you want to stream.

Tags can be added to an llm as a list:

`llm.with_config(tags=["WANT_TO_STREAM"])`

This lets you filter events more precisely, keeping only the events from that model. The example below prints tokens only when the `WANT_TO_STREAM` tag is present.
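The filtering logic itself is plain Python. The sketch below applies it to hypothetical event records shaped like `astream_events` output (the contents are made up):

```python
# Hypothetical event records shaped like astream_events output.
events = [
    {"event": "on_chat_model_stream", "tags": ["WANT_TO_STREAM"],
     "data": {"chunk": {"content": "Hello"}}},
    {"event": "on_chat_model_stream", "tags": [],
     "data": {"chunk": {"content": "ignored"}}},
    {"event": "on_tool_start", "tags": ["WANT_TO_STREAM"], "data": {}},
]

# Keep only chat-model stream tokens carrying the WANT_TO_STREAM tag.
tokens = [
    event["data"]["chunk"]["content"]
    for event in events
    if event["event"] == "on_chat_model_stream"
    and "WANT_TO_STREAM" in event.get("tags", [])
]
print(tokens)
```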

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Asynchronous event stream processing (astream_events)
async for event in graph.astream_events(inputs, version="v2"):
    # Extract the event kind and tag information
    kind = event["event"]
    tags = event.get("tags", [])

    # Filter chat model stream events carrying the target tag
    if kind == "on_chat_model_stream" and "WANT_TO_STREAM" in tags:
        # Extract the event data
        data = event["data"]

        # Print the streamed token
        if data["chunk"].content:
            print(data["chunk"].content, end="", flush=True)
```

```
 The latest AI news is: 

1. **中, using the meta AI'Lama' to develop a military chatbot** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Taiwon “2027 AI market expansion... To get a chance to grow, you need to complete the operating line ”** -World Ilbo 
4. **Reduce universal Dram and increase AI chips... 子 Semiconductor Industry, Modifier Speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

For more information, please select a specific article! 
```

### Streaming output for tool calls <a href="#id-5" id="id-5"></a>

* `AIMessageChunk` : a message streamed token by token in real time
* `tool_call_chunks` : tool-call chunks. If `tool_call_chunks` is present, the tool-call chunks are accumulated and printed. (This attribute is inspected to decide whether a token belongs to a tool call.)
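This cumulative behavior explains the `{}` → `{'query': 'AI'}` progression in the output below: the args arrive as partial JSON fragments that only parse once complete. A rough plain-Python sketch (the fragments are made up; in the real stream the accumulation is done by adding `AIMessageChunk` objects together):

```python
import json

# Made-up partial JSON fragments of tool-call args, streamed one token at a time.
arg_fragments = ["", "{\"", "query", "\": \"", "AI", "\"}"]

accumulated = ""
snapshots = []
for fragment in arg_fragments:
    accumulated += fragment
    try:
        snapshots.append(json.loads(accumulated))
    except json.JSONDecodeError:
        # Incomplete JSON so far: report an empty dict for now.
        snapshots.append({})

print(snapshots)
```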

```python
from langchain_core.messages import AIMessageChunk, HumanMessage

# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Flag marking whether we are on the first message chunk
first = True

# Process messages and metadata sequentially from the stream
for msg, metadata in graph.stream(inputs, stream_mode="messages"):
    # Print the content of non-user messages
    if msg.content and not isinstance(msg, HumanMessage):
        print(msg.content, end="", flush=True)

    # Accumulate AI message chunks
    if isinstance(msg, AIMessageChunk):
        if first:
            gathered = msg
            first = False
        else:
            gathered = gathered + msg

        # When tool-call chunks are present, print the accumulated tool-call args
        if msg.tool_call_chunks:
            print(gathered.tool_calls[0]["args"])
```

```
 {} 
{} 
{} 
{'query':''} 
{'query':'AI'} 
{'query':'AI'} 
-中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-Taiwon Choi “2027 AI market expansion... To take growth opportunities, you need to complete the operational improvement ” -World Ilbo 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel and NVIDIA fate... Deodoru Cataclysm "Historical Moments"-The following news is the latest news about AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Taiwon “2027 AI market expansion... To get a chance to grow, you need to complete the operating line ”** -World Ilbo 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Tell me if you want to know more! 
```

### Subgraphs streaming output <a href="#subgraphs" id="subgraphs"></a>

This time, we'll look at streaming output through subgraphs.

A subgraph lets you define part of a graph as its own graph.

**Flow**

* The subgraph reuses the existing latest-news search capability.
* The parent graph adds the ability to generate an SNS post based on the news that was found.

```python
from typing import Annotated
from typing_extensions import TypedDict
from langchain.tools import tool
from langchain_teddynote.tools import GoogleNews
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition


########## 1. State definition ##########
class State(TypedDict):
    # Message list, annotated with the add_messages reducer
    messages: Annotated[list, add_messages]


########## 2. Tool definition and binding ##########
# Create a tool that searches news by keyword
@tool
def search_keyword(query: str) -> str:
    """Look up news by keyword"""
    news_tool = GoogleNews()
    return "\n".join(
        f'- {news["content"]}' for news in news_tool.search_by_keyword(query, k=5)
    )


# Tool list
tools = [search_keyword]

# Initialize the LLM
llm = ChatOpenAI(model="gpt-4o-mini")

# Bind the tools to the LLM (with tags added)
llm_with_tools = llm.bind_tools(tools).with_config(tags=["WANT_TO_STREAM"])

########## 3. Nodes ##########
# Chatbot node function
def chatbot(state: State):
    # Invoke the LLM and return the new message
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


# SNS post generation node function
def create_sns_post(state: State):
    # Prompt for generating the SNS post
    sns_prompt = """
    Based on the previous conversation, rewrite the content as an SNS post.
    Follow this format:
    - Include hashtags
    - Use emojis
    - Use a concise, engaging style
    - Keep it under 200 characters
    """
    messages = state["messages"] + [("human", sns_prompt)]
    sns_llm = ChatOpenAI(model="gpt-4o-mini").with_config(tags=["WANT_TO_STREAM2"])
    return {"messages": [sns_llm.invoke(messages)]}


# Subgraph factory
def create_subgraph():
    # Create the state graph for the subgraph
    subgraph = StateGraph(State)

    # Add the chatbot node
    subgraph.add_node("chatbot", chatbot)

    # Create and add the tool node
    tool_node = ToolNode(tools=[search_keyword])
    subgraph.add_node("tools", tool_node)

    # Conditional edge: route to tools when the LLM makes a tool call
    subgraph.add_conditional_edges(
        "chatbot",
        tools_condition,
    )

    # tools > chatbot
    subgraph.add_edge("tools", "chatbot")

    # START > chatbot
    subgraph.add_edge(START, "chatbot")

    # chatbot > END
    subgraph.add_edge("chatbot", END)

    return subgraph.compile()


# Create the main (parent) graph
graph_builder = StateGraph(State)

# Add the subgraph as a node
subgraph = create_subgraph()
graph_builder.add_node("news_subgraph", subgraph)

# Add the SNS post generation node
graph_builder.add_node("sns_post", create_sns_post)

# START > news_subgraph
graph_builder.add_edge(START, "news_subgraph")

# news_subgraph > sns_post
graph_builder.add_edge("news_subgraph", "sns_post")

# sns_post > END
graph_builder.add_edge("sns_post", END)

# Compile the graph
graph = graph_builder.compile()
```

Visualize the graph.

```python
# 그래프 시각화
visualize_graph(graph, xray=True)
```

![](https://wikidocs.net/images/page/265770/langgraph-25.jpeg)

#### Subgraph output 'not included'

```python
# Input question
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Process and print node update information sequentially
for chunk in graph.stream(inputs, stream_mode="updates"):
    # node_name: node currently being processed, node_chunk: that node's chunk data
    for node_name, node_chunk in chunk.items():
        # Print a separator for the current node
        print(f"\n========= Update from node {node_name} =========\n")
        # Print the node's updated data
        if "messages" in node_chunk:
            node_chunk["messages"][-1].pretty_print()
        else:
            print(node_chunk)
```

```
 ========= Update from node news_subgraph ========= 

================================== Ai Message ================================== 

Here are the latest news related to AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Taiwon “2027 AI market expansion... To get a chance to grow, you need to complete the operating line ”** -World Ilbo 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Tell us if you need more details! 

========= Update from node sns_post ========= 

================================== Ai Message ================================== 

🌐 The era of AI is opening! 💡 China develops a military chatbot with Meta AI'Rama', and President Choi Tae-won foreshadows the expansion of the AI market in 2027! 🚀 The semiconductor industry is also improving constitution with AI chips! #AI #tech news #future technology #innovation 
```

#### Subgraph output 'included' <a href="#subgraphs_1" id="subgraphs_1"></a>

**Reference**

* With `subgraphs=True`, the output of subgraphs is also included.
* Items arrive in `(namespace, chunk)` form.
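The namespace is a tuple of `"node:task_id"` strings, and it is empty for chunks from the parent graph. Here is a small sketch of the formatting helper used below, applied to hypothetical namespaces (the task id is made up):

```python
def format_namespace(namespace):
    # Keep only the node name before the ":" of the innermost entry.
    return namespace[-1].split(":")[0] if len(namespace) > 0 else "parent graph"


print(format_namespace(()))                           # chunk came from the parent graph
print(format_namespace(("news_subgraph:3b5fc7a1",)))  # chunk came from inside the subgraph
```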

```python
# Build the input data with the user's message
inputs = {"messages": [("human", "Search for the latest news related to AI")]}

# Formatting helper that turns the namespace tuple into a readable label
def format_namespace(namespace):
    return (
        namespace[-1].split(":")[0]
        if len(namespace) > 0
        else "parent graph"
    )

# With subgraphs=True, subgraph output is included and items arrive as (namespace, chunk)
for namespace, chunk in graph.stream(inputs, stream_mode="updates", subgraphs=True):
    # node_name: node currently being processed, node_chunk: that node's chunk data
    for node_name, node_chunk in chunk.items():
        print(f"\n========= Update from node [{node_name}] in [{format_namespace(namespace)}] =========\n")

        # Print the node's chunk data
        if "messages" in node_chunk:
            node_chunk["messages"][-1].pretty_print()
        else:
            print(node_chunk)
```

```
 ========= Update from node [chatbot] in [news_subgraph] ========= 

================================== Ai Message ================================== 
Tool Calls: 
  search_keyword (call_7PzXjDPDdpWe0v29ZQ7pKXrm) 
 Call ID: call_7PzXjDPDdpWe0v29ZQ7pKXrm 
  Args: 
    query: AI 

========= Update from node [tools] in [news_subgraph] ========= 

================================= Tool Message ================================= 
Name: search_keyword 

-中, use meta AI'Rama' to develop military chatbots-ZDnet Korea 
-AI and Sumta... 'Romance' chat popular [AIbriefing] -INews24 
-Taiwon Choi “2027 AI market expansion... To take growth opportunities, you need to complete the operational improvement ” -World Ilbo 
-Reduce general purpose Dram and increase AI chip... 子 Semiconductor industry, somatic line speed-Dong-bo 
-AI has gone to Intel·Nvidia fate... Deodoru cataclysm "historical moment"-Unified News 

========= Update from node [chatbot] in [news_subgraph] ========= 

================================== Ai Message ================================== 

Here is the latest news on AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Taiwon “2027 AI market expansion... To get a chance to grow, you need to complete the operating line ”** -World Ilbo 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Please tell me if you have any further questions! 

========= Update from node [news_subgraph] in [parent graph] ========= 

================================== Ai Message ================================== 

Here is the latest news on AI: 

1. **China uses meta AI'Rama' to develop military chatbots** -ZDnet Korea 
2. **AI and Thumba...'Romance' chat popular** -INews24 
3. **Taiwon “2027 AI market expansion... To get a chance to grow, you need to complete the operating line ”** -World Ilbo 
4. **Reduce universal Dram and increase AI chips... Korean semiconductor industry, somatic line speed** -Dong-bo 
5. **AI has gone to Intel and NVIDIA fate...Daidsu cataclysm "historical moment"** -Unified News 

Please tell me if you have any further questions! 

========= Update from node [sns_post] in [parent graph] ========= 

================================== Ai Message ================================== 

🚀 The future of AI is bright! 🌟 China is developing a military chatbot with Meta's 'Llama', and the Korean semiconductor industry is shifting toward AI chips! 📈 Chey Tae-won foresees AI market expansion in 2027! #AI #tech #future #innovation #semiconductor 
```

**Streaming LLM output token by token inside subgraphs**

**Reference**

* `kind` indicates the type of event.
* For the available event types, see the [StreamEvent type reference](https://wikidocs.net/265576).

```python
# Function to parse namespace information
def parse_namespace_info(info: tuple) -> tuple[str, str]:
    if len(info) > 1:
        namespace, node_name = info
        return node_name.split(":")[0], namespace.split(":")[0]
    return info[0].split(":")[0], "parent graph"

kind = None

async for event in graph.astream_events(inputs, version="v2", subgraphs=True):
    kind = event["event"]

    # Extract the event kind
    if kind == "on_chat_model_start":
        print(f"\n========= on_chat_model_start =========\n")

    # Chat model stream events: token-by-token output
    elif kind == "on_chat_model_stream":
        # Extract the event data
        data = event["data"]

        # Stream the output token by token
        if data["chunk"].content:
            print(data["chunk"].content, end="", flush=True)

    elif kind == "on_tool_start":
        print(f"\n========= tool_start =========\n")
        data = event["data"]
        if "input" in data:
            tool_msg = data["input"]
            print(tool_msg)

    elif kind == "on_tool_end":
        print(f"\n========= tool_end =========\n")
        data = event["data"]
        if "output" in data:
            tool_msg = data["output"]
            print(tool_msg.content)

```

```
 ========= on_chat_model_start ========= 


========= tool_start ========= 

{'query': 'AI'} 

========= tool_end ========= 

- China develops military chatbots with Meta AI 'Llama' - ZDNet Korea 
- Flirting with AI... 'romance' chat gains popularity [AI Briefing] - iNews24 
- Chey Tae-won: "AI market to expand in 2027... must complete restructuring to seize growth opportunities" - Segye Ilbo 
- Less general-purpose DRAM, more AI chips... Korean semiconductor industry speeds up restructuring - Dong-A Ilbo 
- AI decided the fates of Intel and Nvidia... an upheaval, a "historic moment" - Yonhap News 

========= on_chat_model_start ========= 

Here is the latest news on AI: 

1. **China develops military chatbots with Meta AI 'Llama'** - ZDNet Korea 
2. **Flirting with AI... 'romance' chat gains popularity** - iNews24 
3. **Chey Tae-won: "AI market to expand in 2027... must complete restructuring to seize growth opportunities"** - Segye Ilbo 
4. **Less general-purpose DRAM, more AI chips... Korean semiconductor industry speeds up restructuring** - Dong-A Ilbo 
5. **AI decided the fates of Intel and Nvidia... an upheaval, a "historic moment"** - Yonhap News 

Please let me know if you have any further questions! 
========= on_chat_model_start ========= 

🚀 The era of AI is opening! 🌐 China is developing a military chatbot with Meta's 'Llama', and Chey Tae-won foresees AI market expansion in 2027! 🔍 The semiconductor industry is also restructuring around AI chips! 📈 #AI #chatbot #semiconductor #innovation #futuretech 
```
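Note that the `parse_namespace_info` helper defined above is not actually invoked in this `astream_events` loop; it is carried over from the earlier `stream(..., subgraphs=True)` examples, where each chunk arrives as a `(namespace, update)` pair of `"<name>:<run-id>"` strings. A minimal illustration of what it returns (the run IDs below are hypothetical):

```python
# Same helper as in the block above: splits "<name>:<run-id>" parts
def parse_namespace_info(info: tuple) -> tuple[str, str]:
    if len(info) > 1:
        namespace, node_name = info
        return node_name.split(":")[0], namespace.split(":")[0]
    return info[0].split(":")[0], "parent graph"

# A node running inside a subgraph → (node name, subgraph name)
print(parse_namespace_info(("news_subgraph:abc123", "chatbot:def456")))  # → ('chatbot', 'news_subgraph')

# A node running at the top level → (node name, "parent graph")
print(parse_namespace_info(("sns_post:ghi789",)))  # → ('sns_post', 'parent graph')
```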

**Streaming only specific tags**

* With `ONLY_STREAM_TAGS`, you can restrict streaming to only the tags you want.
* Here we confirm that `WANT_TO_STREAM2` is excluded from the output and only `WANT_TO_STREAM` is streamed.

```python
# Function to parse namespace information
def parse_namespace_info(info: tuple) -> tuple[str, str]:
    if len(info) > 1:
        namespace, node_name = info
        return node_name.split(":")[0], namespace.split(":")[0]
    return info[0].split(":")[0], "parent graph"

# Set only the tags you want to stream (here, "WANT_TO_STREAM2" is excluded from the output)
ONLY_STREAM_TAGS = ["WANT_TO_STREAM"]

kind = None
tags = None

async for event in graph.astream_events(inputs, version="v2", subgraphs=True):
    kind = event["event"]
    tags = event.get("tags", [])

    # Extract the event kind and tag information
    if kind == "on_chat_model_start":
        print(f"\n========= tags: {tags} =========\n")

    # Chat model stream events, filtered by the allowed tags
    elif kind == "on_chat_model_stream":
        for tag in tags:
            if tag in ONLY_STREAM_TAGS:
                # Extract the event data
                data = event["data"]

                # Print the streamed message
                if data["chunk"].content:
                    print(data["chunk"].content, end="", flush=True)
    elif kind == "on_tool_start":
        print(f"\n========= tool_start =========\n")
        data = event["data"]
        if "input" in data:
            tool_msg = data["input"]
            print(tool_msg)

    elif kind == "on_tool_end":
        print(f"\n========= tool_end =========\n")
        data = event["data"]
        if "output" in data:
            tool_msg = data["output"]
            print(tool_msg.content)

```

```
 ========= tags: ['seq:step:1', 'WANT_TO_STREAM'] ========= 


========= tool_start ========= 

{'query': 'AI'} 

========= tool_end ========= 

- China develops military chatbots with Meta AI 'Llama' - ZDNet Korea 
- Flirting with AI... 'romance' chat gains popularity [AI Briefing] - iNews24 
- Chey Tae-won: "AI market to expand in 2027... must complete restructuring to seize growth opportunities" - Segye Ilbo 
- Less general-purpose DRAM, more AI chips... Korean semiconductor industry speeds up restructuring - Dong-A Ilbo 
- AI decided the fates of Intel and Nvidia... an upheaval, a "historic moment" - Yonhap News 

========= tags: ['seq:step:1', 'WANT_TO_STREAM'] ========= 

Here is the latest news on AI: 

1. **China develops military chatbots with Meta AI 'Llama'** - ZDNet Korea 
2. **Flirting with AI... 'romance' chat gains popularity** - iNews24 
3. **Chey Tae-won: "AI market to expand in 2027... must complete restructuring to seize growth opportunities"** - Segye Ilbo 
4. **Less general-purpose DRAM, more AI chips... Korean semiconductor industry speeds up restructuring** - Dong-A Ilbo 
5. **AI decided the fates of Intel and Nvidia... an upheaval, a "historic moment"** - Yonhap News 

Please let me know if you have any further questions! 
========= tags: ['seq:step:1', 'WANT_TO_STREAM2'] ========= 
```
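The last log line shows the filter working: the run tagged `WANT_TO_STREAM2` still emits its `on_chat_model_start` event, but none of its tokens are printed. (The tags themselves are presumably attached to each model earlier in the chapter, e.g. via `with_config(tags=["WANT_TO_STREAM"])`.) The tag check inside the loop can be factored out as a small predicate; the sketch below uses simplified event dicts, while real `astream_events` events carry more fields:

```python
ONLY_STREAM_TAGS = ["WANT_TO_STREAM"]


def should_stream(event: dict) -> bool:
    """Return True only for chat-model token chunks carrying an allowed tag."""
    if event.get("event") != "on_chat_model_stream":
        return False
    return any(tag in ONLY_STREAM_TAGS for tag in event.get("tags", []))


print(should_stream({"event": "on_chat_model_stream",
                     "tags": ["seq:step:1", "WANT_TO_STREAM"]}))   # → True
print(should_stream({"event": "on_chat_model_stream",
                     "tags": ["seq:step:1", "WANT_TO_STREAM2"]}))  # → False
```

Because the predicate checks exact membership in `ONLY_STREAM_TAGS`, the similarly named `WANT_TO_STREAM2` tag never matches.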
