06. LCEL interface

To make it as easy as possible to create custom chains, the Runnable protocol has been implemented.

The Runnable protocol is implemented in most LangChain components.

It is a standard interface that makes it easy to define custom chains and to invoke them in a standard way. The standard interface includes:

  • stream : Stream back chunks of the response.

  • invoke : Call the chain on an input.

  • batch : Call the chain on a list of inputs.

There are also asynchronous methods:

  • astream : Asynchronously stream back chunks of the response.

  • ainvoke : Asynchronously call the chain on an input.

  • abatch : Asynchronously call the chain on a list of inputs.

  • astream_log : Stream intermediate steps as they occur, in addition to the final response.

# Manage the API KEY as an environment variable via a .env configuration file
from dotenv import load_dotenv

# Load the API KEY information
load_dotenv()

Create a chain using LCEL syntax.

stream: real-time output

This uses the chain.stream method to create a data stream for the given topic and iterates over that stream, immediately printing the content ( content ) of each chunk. The end="" argument suppresses the newline after each output, and flush=True immediately flushes the output buffer.

invoke: call

The invoke method of the chain object takes the topic as an argument and performs processing on that topic.

batch: batch execution

The chain.batch function takes a list containing multiple dictionaries as an argument and performs batch processing using the value of the topic key in each dictionary.

You can set the number of concurrent requests using the max_concurrency parameter.

The max_concurrency key of the config dictionary sets the maximum number of jobs that can be processed at the same time. Here it is set to process up to three tasks simultaneously.

async stream: asynchronous stream

The chain.astream function creates an asynchronous stream and processes messages for the given topic asynchronously.

An asynchronous for loop ( async for ) receives messages from the stream sequentially, and the print function immediately outputs the content of each message ( s.content ). Setting end="" suppresses the newline after each output, and flush=True forces the output buffer to be flushed right away.

async invoke: asynchronous call

The ainvoke method of the chain object performs its task asynchronously with the given arguments. Here, a dictionary with the value NVDA (NVIDIA's stock ticker) for the topic key is passed as an argument. This method can be used to asynchronously request processing of a specific topic.

async batch: asynchronous batch

The abatch function batch-processes a series of operations asynchronously.

In this example, the abatch method of the chain object is used to process operations on topic asynchronously.

The await keyword is used to wait for the asynchronous operations to complete.

Parallel: parallel execution

Let's take a look at how the LangChain Expression Language supports parallel requests. When you use RunnableParallel (often written in dictionary form), each element is executed in parallel.

Here is an example of running two tasks in parallel using the RunnableParallel class from the langchain_core.runnables module.

Using the ChatPromptTemplate.from_template method, create two chains ( chain1 , chain2 ) that ask for the capital and the area of a given country.

Each of these chains is connected to a model through the pipe ( | ) operator. Finally, the RunnableParallel class is used to combine these two chains under the keys capital and area, creating a combined object that runs both at once.

chain1.invoke() calls the invoke method of the chain1 object.

Here, a dictionary with the value Korea for the country key is passed.

This time, call chain2.invoke(), passing a different country, USA, for the country key.

The invoke method of the combined object performs processing on the given country.

In this example, the topic Korea is passed to the invoke method for execution.

Parallel processing in batches

Parallel processing can be combined with other runnables. Let's use batching together with parallel processing.

The chain1.batch function takes a list containing multiple dictionaries as an argument and processes the value corresponding to the "country" key in each dictionary. In this example, we batch two countries, "Korea" and "USA".

The chain2.batch function receives multiple dictionaries in a list and performs batch processing.

In this example, processing is requested for two countries, Korea and the USA.

The combined.batch function is used to process the given data in batches. In this example, it takes a list containing two dictionary objects as an argument and batch-processes data for both countries, Korea and the USA.
