04. Agent utilizing Claude, Gemini, Ollama, Together.ai

Tool calling agents with LLMs other than OpenAI

In addition to OpenAI, LangChain supports tool calling with a wider range of providers, such as Anthropic, Google Gemini, Together.ai, Ollama, and Mistral.

In this chapter, we will look at how to create and run tool calling agents using various LLMs.



# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()


True


# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH15-Agents")


Create a prompt for the agent

The prompt uses the following variables (a sketch of the prompt definition follows this list):

  • chat_history : stores the previous conversation turns (can be omitted if you do not need multi-turn support)

  • agent_scratchpad : scratch space where the agent records its intermediate tool calls and observations

  • input : the user's input

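The prompt construction code is not shown on this page, so the following is a minimal sketch. The search_web tool is a hypothetical placeholder for whatever tools you actually give the agent; the placeholder variables match the list above.

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.tools import tool

# Hypothetical example tool; replace it with the tools your agent should call.
@tool
def search_web(query: str) -> str:
    """Search the web for the given query and return a short result string."""
    return f"Search results for: {query}"

tools = [search_web]

# Prompt containing the three variables described above
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant. Use the provided tools when they help."),
        MessagesPlaceholder(variable_name="chat_history", optional=True),
        ("human", "{input}"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)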

LLMs that support tool calling

To follow along with this lab, set up each of the providers below (their API keys go in the .env file loaded at the top of this chapter).

  • Anthropic

  • Gemini

  • Together AI

  • Ollama

langchain-ollama installation

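The install cell itself is missing here; a typical command, mirroring the commented-out install earlier in this chapter, would be:

# Install the Ollama integration package for LangChain
# !pip install -qU langchain-ollama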

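The model setup code is also not included on this page. The sketch below shows one way to instantiate a tool-calling-capable chat model for each provider; the model names and the TOGETHER_API_KEY environment variable are illustrative assumptions, not values taken from the original.

import os

from langchain_anthropic import ChatAnthropic
from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI

# Anthropic Claude (reads ANTHROPIC_API_KEY from the environment)
claude = ChatAnthropic(model="claude-3-5-sonnet-20240620", temperature=0)

# Google Gemini (reads GOOGLE_API_KEY from the environment)
gemini = ChatGoogleGenerativeAI(model="gemini-1.5-pro", temperature=0)

# Together AI exposes an OpenAI-compatible endpoint, so ChatOpenAI can point at it
together = ChatOpenAI(
    base_url="https://api.together.xyz/v1",
    api_key=os.environ["TOGETHER_API_KEY"],
    model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",  # assumed model name
)

# Ollama runs locally; pull a tool-calling-capable model first (e.g. `ollama pull llama3.1`)
ollama = ChatOllama(model="llama3.1", temperature=0)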

Create the agent from the LLM.

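A minimal sketch, assuming the prompt, tools, and models defined above. create_tool_calling_agent works with any chat model that supports tool calling, so the same call can be repeated with gemini, together, or ollama.

from langchain.agents import create_tool_calling_agent

# Build a tool calling agent from an LLM, the tool list, and the prompt
agent = create_tool_calling_agent(claude, tools, prompt)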

Create an AgentExecutor, run it, and check the results.

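A sketch of wrapping the agent in an AgentExecutor and invoking it; the question is made up for illustration.

from langchain.agents import AgentExecutor

# The executor runs the agent loop: call the LLM, execute the chosen tools, repeat until done
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

result = agent_executor.invoke({"input": "What is the latest news about AI agents?"})
print(result["output"])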


Run the agent with various LLMs.

Below is a helper function that builds an agent for the given LLM, runs it, and prints the result.

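The helper itself is not reproduced here; the sketch below follows the description above, with an assumed name and signature, and reuses the prompt defined earlier.

from langchain.agents import AgentExecutor, create_tool_calling_agent

def execute_agent(llm, tools, input_text, label=""):
    # Build an agent and executor for the given LLM, run the query, and print the output
    agent = create_tool_calling_agent(llm, tools, prompt)
    executor = AgentExecutor(agent=agent, tools=tools, verbose=False)
    result = executor.invoke({"input": input_text})
    print(f"Result from [{label}]")
    print(result["output"])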

Create and run an agent for each LLM and print the results.

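For example, with the models instantiated earlier and a made-up query:

query = "Summarize one recent piece of AI news."  # hypothetical query

execute_agent(claude, tools, query, "Claude")
execute_agent(gemini, tools, query, "Gemini")
execute_agent(together, tools, query, "Together AI")
execute_agent(ollama, tools, query, "Ollama")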
