08. Add a node that asks a human

So far, we have relied on a simple state consisting of a list of messages.

You can get quite far with just this state, but if you want to define complex behavior without relying solely on the message list, you can add extra fields to the state. This tutorial explains how to extend the chatbot by adding a new node.

In the previous example, human-in-the-loop was implemented by interrupting the graph every time a tool was called.

This time, let's say you want to let the chatbot choose whether to rely on a human.

One way to do this is to create a dedicated "human" node that the graph always stops before. This node runs only when the LLM calls the "human" tool. For convenience, we include an ask_human flag in the graph state, which the LLM switches on when it calls this tool.


# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()


 True 


# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH17-LangGraph-Modules")


Setting up a node that asks for human input


This time, we add an ask_human flag to the state that indicates whether to ask a human in the middle of the conversation.


Next, define the schema that is used when the chatbot requests help from a human.

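A sketch of that request schema. The class name HumanRequest matches the tool name mentioned later in this tutorial; the docstring wording is an assumption, but note that the LLM reads the docstring to decide when to call the tool.

```python
from pydantic import BaseModel


class HumanRequest(BaseModel):
    """Escalate the conversation to an expert.

    Use this if you cannot assist directly or if the user
    requires support beyond your permissions.
    """

    request: str
```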

Next, define the chatbot node.

The main change here is that the chatbot switches the ask_human flag on when it calls the HumanRequest tool.

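A sketch of the chatbot node, assuming an OpenAI chat model and the Tavily search tool from the earlier lessons; the model name is an assumption, and State and HumanRequest come from the previous steps:

```python
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_openai import ChatOpenAI

# The search tool used in the earlier lessons
tool = TavilySearchResults(max_results=2)
tools = [tool]

llm = ChatOpenAI(model="gpt-4o-mini")
# Bind the search tool and the HumanRequest schema so the LLM can call either
llm_with_tools = llm.bind_tools(tools + [HumanRequest])


def chatbot(state: State):
    response = llm_with_tools.invoke(state["messages"])
    ask_human = False
    # Switch the flag when the LLM asked for human assistance
    if response.tool_calls and response.tool_calls[0]["name"] == HumanRequest.__name__:
        ask_human = True
    return {"messages": [response], "ask_human": ask_human}
```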

Next, create a graph builder and, as before, add the chatbot and tool nodes to the graph.

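A sketch of the builder setup, continuing from the State, chatbot, and tools defined in the previous steps; the node name "action" for the tool node is an assumption that matches the path names used below:

```python
from langgraph.graph import StateGraph
from langgraph.prebuilt import ToolNode

graph_builder = StateGraph(State)

# Same two nodes as in the previous tutorial
graph_builder.add_node("chatbot", chatbot)
graph_builder.add_node("action", ToolNode(tools=tools))
```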

Setting up the human node

Next, create the human node.

This node acts mainly as a placeholder that triggers an interrupt in the graph. If the user does not manually update the state during the interrupt, the node inserts a tool message so the LLM knows that the human was asked but did not respond.

The node also clears the ask_human flag, so the graph will not visit this node again unless another request is made.

Reference image


Next, define conditional logic.

select_next_node routes to the human node when the ask_human flag is set. Otherwise, it lets the prebuilt tools_condition function choose the next node.

The tools_condition function simply checks whether the chatbot included tool_calls in its response message.

If it did, the path goes to the action node; otherwise, the graph exits.


Finally, connect the edges and compile the graph.

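A sketch of the wiring and compilation, continuing from the builder above. The path map translates tools_condition's "tools" output to the "action" node, and interrupt_before pauses the graph before the human node runs; using MemorySaver as the checkpointer is carried over from the earlier lessons:

```python
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, START

graph_builder.add_node("human", human_node)

graph_builder.add_conditional_edges(
    "chatbot",
    select_next_node,
    # Translate the condition's outputs into node names
    {"human": "human", "tools": "action", END: END},
)

# After a tool call or a human response, return to the chatbot
graph_builder.add_edge("action", "chatbot")
graph_builder.add_edge("human", "chatbot")
graph_builder.add_edge(START, "chatbot")

memory = MemorySaver()
graph = graph_builder.compile(
    checkpointer=memory,
    # Always stop before the human node executes
    interrupt_before=["human"],
)
```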

Visualize the graph.

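One common way to render the compiled graph (assumes the optional drawing dependencies are installed):

```python
from IPython.display import Image, display

try:
    # Render the graph structure as a Mermaid PNG
    display(Image(graph.get_graph().draw_mermaid_png()))
except Exception:
    # Rendering requires extra dependencies; skip if they are missing
    pass
```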

The chatbot node can take one of the following actions:

  • Ask a human for help (chatbot->select->human)

  • Call the search engine tool (chatbot->select->action)

  • Respond directly (chatbot->select->end)

Once an action or a request has been made, control switches back to the chatbot node so the graph can continue working.

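A sketch of running the graph up to the interrupt; the user prompt and thread_id are assumptions:

```python
user_input = "I need expert advice on building this AI agent. Please request help for me."
config = {"configurable": {"thread_id": "1"}}

# Stream until the graph stops at the interrupt before the human node
events = graph.stream(
    {"messages": [("user", user_input)], "ask_human": False},
    config,
    stream_mode="values",
)
for event in events:
    if "messages" in event:
        event["messages"][-1].pretty_print()
```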

Notice: the LLM called the provided "HumanRequest" tool, and the interrupt was set. Let's check the graph state.

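Checking where the graph stopped:

```python
snapshot = graph.get_state(config)
# The next node to run should be the human node, confirming the interrupt
snapshot.next
```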

The graph state actually stops right before the 'human' node.

In this scenario, you act as the "expert": manually update the state by adding a new ToolMessage that contains your input.

Next, respond to the chatbot's request as follows:

  1. Create a ToolMessage containing the response. It will be passed back to the chatbot.

  2. Manually update the graph state by calling update_state.

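A sketch of the manual update; the expert's reply text is an assumption, and create_response is the helper from the human-node step:

```python
ai_message = snapshot.values["messages"][-1]

# 1. Wrap the expert's reply in a ToolMessage addressed to the pending tool call
human_response = "We, the experts, are here to help! Try building your agent with LangGraph."
tool_message = create_response(human_response, ai_message)

# 2. Write the message into the graph state at the interrupted checkpoint
graph.update_state(config, {"messages": [tool_message]})
```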

You can check the state to confirm that the response has been added.

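Inspecting the last two messages, i.e. the tool call and the manually added response:

```python
graph.get_state(config).values["messages"][-2:]
```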

Next, resume the graph by passing None as the input.

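Resuming with None tells the graph to continue from the saved checkpoint without adding new input:

```python
events = graph.stream(None, config, stream_mode="values")
for event in events:
    if "messages" in event:
        event["messages"][-1].pretty_print()
```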

Check the final result.

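Printing the full conversation from the final state:

```python
state = graph.get_state(config)
for message in state.values["messages"]:
    message.pretty_print()
```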
