06. Embedding-based evaluation (embedding_distance)

Create an evaluator that measures the embedding distance between the model's answer and the reference (correct) answer.


# installation
# !pip install -U langsmith langchain-teddynote


# Configuration file for managing API KEY as environment variable
from dotenv import load_dotenv

# Load API KEY information
load_dotenv()


 True 


# Set up LangSmith tracing. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH16-Evaluations")


Define functions for RAG performance testing

We will create a RAG system to use for testing.
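The RAG system itself is built in the notebook (its code is omitted here). As a rough, self-contained sketch of the same shape, assuming a toy keyword-overlap retriever and a stub generator in place of a real vector store and LLM:

```python
# Hypothetical sketch of a RAG pipeline: a toy retriever plus a stub generator.
# The actual tutorial builds this with a vector store, a retriever, and an LLM.
DOCS = [
    "LangSmith is a platform for tracing and evaluating LLM applications.",
    "Embedding distance compares answers in vector space.",
]

def retrieve(question: str, k: int = 1) -> list[str]:
    # Toy relevance score: number of lowercase words shared with the question.
    words = set(question.lower().split())
    return sorted(DOCS, key=lambda d: -len(words & set(d.lower().split())))[:k]

def generate(question: str, context: list[str]) -> str:
    # Stub for the LLM call: a real system would prompt a model with the context.
    return f"Based on: {context[0]}"

def rag(question: str) -> str:
    return generate(question, retrieve(question))
```

The structure (retrieve, then generate from the retrieved context) is the part that carries over; every component here is a placeholder.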


Create a function named ask_question. It receives a dictionary named inputs and returns a dictionary named answer.
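A minimal sketch of such a wrapper, with a stub rag_answer standing in for the real chain's invoke call (both names here are illustrative, not the tutorial's actual chain):

```python
# Hypothetical stub for the RAG chain; a real chain would retrieve context
# and call an LLM, e.g. chain.invoke(question).
def rag_answer(question: str) -> str:
    return f"(answer to: {question})"

def ask_question(inputs: dict) -> dict:
    # Receives a dictionary with the key "question" and returns a
    # dictionary with the key "answer", the shape evaluate() expects.
    return {"answer": rag_answer(inputs["question"])}
```

Wrapping the chain this way lets the evaluation harness pass each dataset example's inputs in and collect the answer out, without knowing anything about the chain's internals.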


Embedding distance-based Evaluator


If multiple embedding models are assigned to a single distance metric, the results are averaged across those models.

(Example)
- cosine: BGE-m3
- euclidean: OpenAI, Upstage

For euclidean, the scores from the OpenAI and Upstage models are averaged.
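To make the metrics concrete, here is a small sketch that computes cosine and euclidean distances and averages the euclidean scores over two hypothetical per-model embeddings (the vectors are made up for illustration; real embeddings come from the configured models):

```python
import math

def cosine_distance(a: list[float], b: list[float]) -> float:
    # Cosine distance = 1 - cosine similarity; 0 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

def euclidean_distance(a: list[float], b: list[float]) -> float:
    # Straight-line distance between the two vectors; 0 means identical.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical (answer, reference) embedding pairs from two models.
model_vectors = {
    "model_a": ([1.0, 0.0, 1.0], [1.0, 0.2, 0.9]),
    "model_b": ([0.5, 0.5, 0.0], [0.4, 0.6, 0.1]),
}

# One score per model, then the average, mirroring how a metric with
# multiple embedding models would be aggregated.
scores = [euclidean_distance(ans, ref) for ans, ref in model_vectors.values()]
avg_euclidean = sum(scores) / len(scores)
```

Lower values are better for both metrics, since they measure distance rather than similarity.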

