A PromptTemplate is used to create a complete prompt string from the user's input variables.
Usage
template : the template string. Braces {} within this string indicate variables.
input_variables : a list defining the names of the variables enclosed in the braces.
input_variables
input_variables is a list that defines the names of the variables used in the PromptTemplate.
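As a minimal sketch (the country value below is just an illustration), a PromptTemplate can also be constructed directly by passing the template string and the input_variables list explicitly:
from langchain_core.prompts import PromptTemplate
# Construct a PromptTemplate directly, listing the variable names explicitly.
prompt = PromptTemplate(
    template="What is the capital of {country}?",
    input_variables=["country"],
)
# Fill in the variable to produce the completed prompt string.
prompt.format(country="France")  # -> 'What is the capital of France?'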
from langchain_teddynote.messages import stream_response # Streaming Output
from langchain_core.prompts import PromptTemplate
Create a PromptTemplate object using the from_template() method.
# template definition
template = "{country}What is the capital of?"
# from_template Using the method PromptTemplate Object creation
prompt_template = PromptTemplate.from_template(template)
prompt_template
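To check the completed prompt, the variable can be filled in with the format() method (the country value is just an illustration):
# Fill in the {country} variable to produce the completed prompt string.
prompt_template.format(country="France")  # -> 'What is the capital of France?'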
Here we use LCEL (LangChain Expression Language) to combine various components into a single chain.
The | symbol is similar to the Unix pipe operator: it connects components so that the output of one component is passed as the input to the next.
In this chain, the user input is passed to the prompt template, and the prompt template's output is then passed to the model. Looking at each component individually makes it clear what is going on (a step-by-step sketch follows the invoke() description below).
invoke() call
Pass the input in the form of a Python dictionary (key: value pairs).
When invoke() is called, this dictionary is passed to the chain as its input.
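As a rough sketch of what the chain does internally (the topic value is just an illustration), each component can also be invoked on its own, passing the output of one step as the input of the next:
from langchain_core.prompts import PromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI
prompt = PromptTemplate.from_template("Please explain {topic} in simple terms.")
model = ChatOpenAI(model="gpt-3.5-turbo")
output_parser = StrOutputParser()
# Step 1: the prompt template turns the input dictionary into a PromptValue.
prompt_value = prompt.invoke({"topic": "artificial intelligence"})
# Step 2: the model takes the PromptValue and returns an AIMessage.
ai_message = model.invoke(prompt_value)
# Step 3: the output parser extracts the plain string content.
answer = output_parser.invoke(ai_message)
This is exactly what prompt | model | output_parser does in a single expression.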
from langchain_openai import ChatOpenAI
model = ChatOpenAI(
model="gpt-3.5-turbo",
max_tokens=2048,
temperature=0.1,
)
# A chain is composed by connecting components with the | operator, for example:
chain = prompt | model | output_parser
# Create the prompt as a PromptTemplate object.
prompt = PromptTemplate.from_template("Please explain {topic} in simple terms.")
model = ChatOpenAI()
chain = prompt | model
# Set the topic in the input dictionary to 'Learning principles of artificial intelligence models'.
input = {"topic": "Learning principles of artificial intelligence models"}
# Connect the prompt and model objects with the pipe (|) operator, then pass the input using the invoke method.
# This returns the message generated by the AI model.
chain.invoke(input)
AIMessage(content='The learning principle of an artificial intelligence model is to learn patterns using data. The model accepts input data and internally adjusts its weights to output the desired result. During training, the model uses the input and answer data to calculate the error and updates the weights in the direction that minimizes this error. Through this repeated learning, the model learns the patterns in the input data and predicts accurate results.', response_metadata={'token_usage': {'completion_tokens': 214, 'prompt_tokens': 33, 'total_tokens': 247}, ...})
The learning principle of an artificial intelligence model is the process of accepting data as input, learning patterns from it, and making predictions or classifications based on them.
The learning process typically uses an artificial neural network consisting of an input layer, hidden layers, and an output layer: data enters at the input layer, passes through the hidden layers, and the result comes out at the output layer.
Here, the model learns by adjusting its weights to minimize the error on the given data. To do this, it makes predictions on the data, calculates the error against the actual values, and then updates the weights to reduce this error.
Repeating this process allows the model to learn the patterns in the data and make accurate predictions on new data. A model trained in this way can also make generalized predictions on data it has not seen before.
from langchain_core.output_parsers import StrOutputParser
output_parser = StrOutputParser()
# Construct a processing chain by connecting the prompt, model, and output parser.
chain = prompt | model | output_parser
# Pass the input to the chain object using the invoke method.
input = {"topic": "Learning principles of artificial intelligence models"}
chain.invoke(input)
'The learning principle of an artificial intelligence model is to receive data as input and learn patterns from it. The model is trained to take input data and, while internally adjusting its weights, output the desired result. In doing so, the model learns the relationship between input and output data so that it can predict the output for new inputs. This process is repeated, and the model gradually improves its accuracy as it learns. In this way, an AI model improves its ability to judge and predict based on the given data.'
The learning principle of an artificial intelligence model is the process of learning patterns from data. First, the model accepts and processes the input data, comparing it with the correct answer data to calculate the error. To minimize this error, the model gradually learns the correct patterns while adjusting its weights and biases. Repeating this process until the model can make accurate predictions on the data is the key principle of an artificial intelligence model.
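To make the effect of the output parser concrete (reusing the prompt, model, and output_parser objects defined above), the chain without StrOutputParser returns an AIMessage object, while the chain with it returns a plain string:
# Without the parser: the result is an AIMessage object.
(prompt | model).invoke({"topic": "Learning principles of artificial intelligence models"})
# With the parser: the result is a plain Python string.
(prompt | model | output_parser).invoke({"topic": "Learning principles of artificial intelligence models"})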
template = """
You are an English teacher with 10 years of experience teaching English. Please write an English conversation for the given situation, following the [FORMAT] below.
situation:
{question}
FORMAT:
- English conversation:
- Korean interpretation:
"""
# Create a prompt using a prompt template.
prompt = PromptTemplate.from_template(template)
# Initialize the ChatOpenAI chat model.
model = ChatOpenAI(model_name="gpt-4-turbo")
# Initializes the string output parser.
output_parser = StrOutputParser()
# Construct the chain.
chain = prompt | model | output_parser
# Run the completed chain to get the answer.
# Request for streaming output
answer = chain.stream({"question": "I want to go to a restaurant and order food"})
# Streaming Output
stream_response(answer)
English conversation:
-Hello, could I see the menu, please?
-I'd like to order the grilled salmon and a side of mashed potatoes.
- Could I have a glass of water as well?
-Thank you!
Korean interpretation:
- Hello, may I see the menu?
- I'd like to order the grilled salmon and mashed potatoes.
- Could I have a glass of water as well?
- Thank you!
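stream_response is a convenience helper from the langchain_teddynote package; if it is not available, the same stream can be consumed with a plain loop (a minimal sketch):
# Consume the stream token by token and print it as it arrives.
answer = chain.stream({"question": "I want to go to a restaurant and order food"})
for token in answer:
    print(token, end="", flush=True)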
# This time, run it with the question set to 'Ordering Pizza in the USA'.
# Request for streaming output
answer = chain.stream({"question": "Ordering Pizza in the USA"})
# Streaming Output
stream_response(answer)
English conversation:
-Employee: "Hello, Tony's Pizza. How can I help you?"
-Customer: "Hi, I'd like to place an order for delivery, please."
-Employee: "Sure thing! What would you like to order?"
-Customer: "I'll have a large pepperoni pizza with extra cheese and a side of garlic bread."
-Employee: "Anything to drink?"
-Customer: "Yes, a 2-liter bottle of Coke, please."
-Employee: "Alright, your total comes to $22.50. Can I have your delivery address?"
- Customer: "It's 742 Evergreen Terrace."
-Employee: "Thank you. Your order will be be there in about 30-45 minutes. Is there anything else I can help you with?"
-Customer: "No, that's everything. Thank you!"
-Employee: "Thank you for choosing Tony's Pizza. Have a great day!"
Korean interpretation:
-Employee: "Hello, this is Tony's pizza. How can I help you?"
-Customer: "Hello, I want to order delivery."
-Employee: "Yes, what would you order?"
-Customer: "Add cheese to a large size pepperoni pizza and give me a garlic bread."
-Employee: "Would you like a drink?"
-Customer: "Yes, give me a bottle of coke 2 liters."
-Employee: "Okay, the sum is $22.50. Could you please provide the delivery address?"
-Customer: "742 Evergreen Terrace."
-Employee: "Thank you. The food you ordered will arrive in approximately 30-45 minutes. Need other help?"
-Customer: "No, this is it. Thank you!"
-Employee: "Thank you for choosing Tony's pizza. Have a nice day!"