15. Everything about LangGraph streaming modes

This guide covers how to stream the entire state of a graph. LangGraph supports multiple streaming modes.

The main modes are:

  • values : This mode streams the values of the graph, i.e. the full state of the graph after each node is called.

  • updates : This mode streams the graph's updates, i.e. the updates to the graph state after each node is called.

  • messages : This mode streams the messages from each node. In this mode, token-by-token streaming of LLM output is also possible.

Environment setup


# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load the API key information
load_dotenv()


 True 


# Set up LangSmith tracing. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter the project name.
logging.langsmith("CH17-LangGraph")


Define the graph

In this guide, we will use a simple agent.


Step-by-step node output

Streaming modes:

  • values : Outputs the current state value at each step

  • updates : Outputs only the state updates at each step (default)

  • messages : Outputs messages at each step

Here, "streaming" does not mean token-by-token streaming of LLM output; it means step-by-step output of the graph's execution.

stream_mode = "values"

The values mode outputs the current state value at each step.

Reference

chunk.items()

  • key : the State key

  • value : the value for that State key

Synchronous streaming

  • chunk is a dictionary (key: State key, value: State value)


Asynchronous streaming

Reference

  • The astream() method runs the graph with asynchronous stream processing and yields chunk-by-chunk responses in values mode.

  • Use the async for statement to consume the asynchronous stream.


If you only want to check the final result, you can process it like this:


stream_mode = "updates"

The updates mode outputs only the updated State at each step.

  • The output is a dictionary whose key is the node name and whose value is the updated state values.

Reference

chunk.items()

  • key : the name of the node

  • value : the output value (a dictionary) at that node's step, i.e. a dictionary with one or more key-value pairs.

Synchronous streaming


Asynchronous streaming


stream_mode = "messages"

The messages mode streams messages at each step.

Reference

  • chunk is a tuple with two elements:

  • chunk_msg : the message output in real time

  • metadata : information about the node

Synchronous streaming


Asynchronous streaming


Streaming output from specific nodes

Reference

  • Using metadata["langgraph_node"], you can output only the messages emitted by a specific node.


If you want to stream output from a specific node, set stream_mode="messages".

With stream_mode="messages", messages are received as (chunk_msg, metadata) tuples: chunk_msg is the message output in real time, and metadata is information about the node.

By checking metadata["langgraph_node"], you can output only the messages from a specific node.

(Example) To output only the messages from the chatbot node:

metadata["langgraph_node"] == "chatbot"


You can check the node information by printing the metadata.


Filtered streaming with custom tags

If LLM output occurs in multiple places, you may want to stream only the messages produced by a specific model.

In this case, you can select only the outputs you want to stream by adding tags.

Tags can be added to an llm as a list, like this:

llm.with_config(tags=["WANT_TO_STREAM"])

This allows you to filter events more precisely and keep only the events from that model. The example below prints output only for chunks carrying the WANT_TO_STREAM tag.


Streaming output for tool calls

  • AIMessageChunk : a message output token by token in real time

  • tool_call_chunks : tool-call chunks. If tool_call_chunks is present, the tool-call chunks are accumulated as they arrive. (Tool tokens are detected by inspecting this attribute.)


Subgraphs streaming output

This time, we'll look at how to check streaming output through Subgraphs.

Subgraphs let you define part of a graph as a graph of its own.

Flow

  • The subgraph reuses the existing ability to search for the latest news.

  • The parent graph adds the ability to generate SNS posts based on the news found.


Visualize the graph.


Subgraphs output 'excluded'


Subgraphs output 'included'

Reference

  • With subgraphs=True, you can include the output of Subgraphs as well.

  • The output comes in (namespace, chunk) form.


Streaming LLM output token by token inside Subgraphs



When streaming only certain tags

  • ONLY_STREAM_TAGS lets you specify which tags you want to stream.

  • Here we confirm that "WANT_TO_STREAM2" is excluded from the output and only "WANT_TO_STREAM" is output.
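ONLY_STREAM_TAGS is a user-defined allowlist from the omitted code; the filtering itself reduces to a simple predicate over each chunk's metadata tags, which can be sketched as:

```python
# User-defined allowlist of tags to stream (names are illustrative)
ONLY_STREAM_TAGS = ["WANT_TO_STREAM"]


def should_stream(metadata: dict) -> bool:
    # Keep a chunk only if at least one of its tags is in the allowlist
    tags = metadata.get("tags", [])
    return any(tag in ONLY_STREAM_TAGS for tag in tags)


# Example metadata as it might appear in stream_mode="messages" chunks
print(should_stream({"langgraph_node": "chatbot", "tags": ["WANT_TO_STREAM"]}))   # True
print(should_stream({"langgraph_node": "summary", "tags": ["WANT_TO_STREAM2"]}))  # False
```

Note that membership in the allowlist is exact, so "WANT_TO_STREAM2" does not match "WANT_TO_STREAM".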

