04. Agent utilizing Claude, Gemini, Ollama, Together.ai

Tool calling agents with LLMs other than OpenAI

In addition to OpenAI, tool calling is supported by a wide range of providers, such as Anthropic, Google Gemini, Together.ai, Ollama, and Mistral.

In this chapter, we will look at how to create and run tool calling agents using various LLMs.

Reference link

# Configuration file for managing API keys as environment variables
from dotenv import load_dotenv

# Load API key information
load_dotenv()
 True 
# Set up LangSmith tracking. https://smith.langchain.com
# !pip install -qU langchain-teddynote
from langchain_teddynote import logging

# Enter a project name.
logging.langsmith("CH15-Agents")
 Start tracking LangSmith. 
[Project name] 
CH15-Agents 

Create a prompt for the agent

  • chat_history : stores the previous conversation turns (can be omitted if you do not need multi-turn support)

  • agent_scratchpad : temporary storage for the agent's intermediate steps (tool calls and their results)

  • input : the user's input
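A minimal prompt sketch that wires these variables together might look like the following (the system message wording is an assumption, not taken from the original):

from langchain_core.prompts import ChatPromptTemplate

# Prompt containing the three variables the agent expects
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant. Use the available tools when needed."),
        ("placeholder", "{chat_history}"),      # previous turns; left empty for single-turn use
        ("human", "{input}"),                   # user input
        ("placeholder", "{agent_scratchpad}"),  # the agent's intermediate tool calls and results
    ]
)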

Various LLMs that support Tool Calling

To proceed with the practice, you need to set up the providers below.
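For example, the .env file loaded at the top of the chapter could contain entries like the following (which keys you actually need depends on the providers you use; Ollama runs locally and needs no key):

# .env (example placeholders, not real keys)
ANTHROPIC_API_KEY=...
GOOGLE_API_KEY=...
TOGETHER_API_KEY=...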

Anthropic
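A minimal sketch for creating a Claude model with tool calling support (the model name is an example and may need to be updated):

from langchain_anthropic import ChatAnthropic

# Reads ANTHROPIC_API_KEY from the environment
claude = ChatAnthropic(model="claude-3-5-sonnet-20240620", temperature=0)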

Gemini
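A minimal sketch for creating a Gemini model (the model name is an example):

from langchain_google_genai import ChatGoogleGenerativeAI

# Reads GOOGLE_API_KEY from the environment
gemini = ChatGoogleGenerativeAI(model="gemini-1.5-pro", temperature=0)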

Together AI
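Together AI exposes an OpenAI-compatible endpoint, so one way to use it is to point ChatOpenAI at that endpoint (the model name below is an example; choose any tool-calling-capable model hosted on Together):

import os
from langchain_openai import ChatOpenAI

# OpenAI-compatible client pointed at the Together AI endpoint
together = ChatOpenAI(
    base_url="https://api.together.xyz/v1",
    api_key=os.environ["TOGETHER_API_KEY"],
    model="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
)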

Ollama

langchain-ollama installation
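A sketch assuming a locally running Ollama server with a tool-calling-capable model already pulled (for example, llama3.1):

# !pip install -qU langchain-ollama
from langchain_ollama import ChatOllama

# Local model served by Ollama; pull it first with `ollama pull llama3.1`
ollama = ChatOllama(model="llama3.1", temperature=0)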

Create an agent based on the LLM.
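A sketch of building the agent with create_tool_calling_agent, reusing the prompt defined earlier. The search_news tool is a hypothetical placeholder standing in for whatever tools the chapter actually uses:

from langchain.agents import create_tool_calling_agent
from langchain_core.tools import tool

@tool
def search_news(query: str) -> str:
    """Search for news articles about the query."""
    # Hypothetical placeholder; a real implementation would call a search API
    return f"(placeholder) search results for '{query}'"

tools = [search_news]

# Any of the models created above can be plugged in here; claude is used as an example
agent = create_tool_calling_agent(claude, tools, prompt)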

Create an AgentExecutor, run the agent, and check the results.
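A sketch of wrapping the agent in an AgentExecutor and invoking it (the query is an example):

from langchain.agents import AgentExecutor

# The executor actually runs the tool calls that the agent decides to make
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

result = agent_executor.invoke({"input": "Search for the latest AI news."})
print(result["output"])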

Run the agent with various LLMs.

Below is a function that creates an agent from the given LLM, runs it, and prints the result.
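One possible shape for such a helper (the function name and signature are assumptions, not the original code):

from langchain.agents import AgentExecutor, create_tool_calling_agent

def execute_agent(llm, tools, input_text, label=""):
    """Create a tool calling agent for the given llm, run it on input_text, and print the output."""
    agent = create_tool_calling_agent(llm, tools, prompt)  # prompt defined earlier
    executor = AgentExecutor(agent=agent, tools=tools, verbose=False)
    result = executor.invoke({"input": input_text})
    print(f"[{label}] {result['output']}")
    return result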

Create and run an agent for each LLM and print the results.
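For example, looping over the models created in the sketches above (variable names as defined there):

query = "Search for the latest AI news and summarize it in one sentence."

for name, llm in {
    "claude": claude,
    "gemini": gemini,
    "together": together,
    "ollama": ollama,
}.items():
    execute_agent(llm, tools, query, label=name)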
