AG2 example connecting to SAP Generative AI Hub via LiteLLM Proxy

How AG2 works

LLM access via LiteLLM Proxy

AG2 supports the LiteLLM Proxy for access via OpenAI API calls.

Follow the setup instructions in LiteLLM Proxy setup for SAP Generative AI Hub.

Installation

%pip install "ag2[openai]"

Set env variables

Add the following variables from the service key to a file called ".env" and place it in the same folder where you run the notebook:

LITELLM_PROXY_API_KEY=sk-1234
PROXY_BASE_URL=http://localhost:4000

Run AG2 with LiteLLM and the SAP LLMs

[ ]:
import os
from typing import Any

import litellm
from autogen import ConversableAgent, LLMConfig
from autogen.agentchat import initiate_group_chat
from autogen.agentchat.group.patterns import AutoPattern
from dotenv import load_dotenv

Load your credentials as environment variables.

[ ]:
litellm.use_litellm_proxy = True
load_dotenv()
api_base = os.getenv("PROXY_BASE_URL")
api_key = os.getenv("LITELLM_PROXY_API_KEY")
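
Both lookups return None if the .env file is missing or incomplete, which only surfaces later as a confusing proxy error. A small guard (a hypothetical helper, not part of AG2) fails fast instead:

```python
import os

def require_env(name: str) -> str:
    """Return the named environment variable, failing fast if it is unset or empty."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing environment variable: {name} (check your .env file)")
    return value
```

For example, `api_base = require_env("PROXY_BASE_URL")` raises immediately with a clear message instead of silently passing None into the LLM config.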

Set up the model with your proxy parameters.

[ ]:
llm_config = LLMConfig(config_list=[{"model": "sap/gpt-4o", "base_url": api_base, "api_key": api_key}])

Define the agent tools.

[ ]:
def get_weather(city: str) -> str:
    """Moke function"""
    city_normalized = city.lower().replace(" ", "")

    mock_weather_db = {
        "newyork": "The weather in New York is sunny with a temperature of 25°C.",
        "london": "It's cloudy in London with a temperature of 15°C.",
        "tokyo": "Tokyo is experiencing light rain and a temperature of 18°C.",
    }

    if city_normalized in mock_weather_db:
        return mock_weather_db[city_normalized]
    else:
        return f"The weather in {city} is sunny with a temperature of 20°C."
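
The lookup above works because city names are normalized first; the snippet below isolates that step (restated so it runs standalone) to show which spellings map to the same key:

```python
def normalize_city(city: str) -> str:
    # Same normalization as get_weather: lowercase, then strip all spaces.
    return city.lower().replace(" ", "")

print(normalize_city("New York"))  # newyork
print(normalize_city("LONDON"))    # london
```

Cities not in the mock database, such as Tbilisi in the final query, fall through to the generic "sunny, 20°C" response.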

Define a function that checks whether the agent conversation is finished.

[ ]:
def is_termination_msg(msg: dict[str, Any]) -> bool:
    content = msg.get("content", "")
    return (content is not None) and "==== REPORT GENERATED ====" in content
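
A quick sanity check of the predicate (restated so it runs standalone): it fires only once the marker string appears, and it tolerates messages whose content is None, such as pure tool-call messages.

```python
from typing import Any

def is_termination_msg(msg: dict[str, Any]) -> bool:
    content = msg.get("content", "")
    return (content is not None) and "==== REPORT GENERATED ====" in content

print(is_termination_msg({"content": "Sunny skies ahead. ==== REPORT GENERATED ===="}))  # True
print(is_termination_msg({"content": "Still gathering data..."}))                        # False
print(is_termination_msg({"content": None}))                                             # False
```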

Define the Agent with the SAP LLM and the tool.

[ ]:
assistant = ConversableAgent(name="assistant",
                             llm_config=llm_config,
                             system_message="""
                                        You are a helpful weather assistant.
                                        When the user asks for the weather in a specific city, use the 'get_weather' tool to find the information.
                                        If the tool returns an error, inform the user politely.
                                        If the tool is successful, write a couple of sentences for a TV weather report in the given city including a small joke.
                                        Once you've generated the report append this to the summary:
                                        ==== REPORT GENERATED ====
                                        """,
                             functions=[get_weather])

Set up the conversation pattern with the agent, the model, and the termination function.

[ ]:
pattern = AutoPattern(initial_agent=assistant,
                      agents=[assistant],
                      group_manager_args={
                          "llm_config": llm_config,
                          "is_termination_msg": is_termination_msg
                      },
                      )

Run the conversation with a message.

[ ]:
result, _, _ = initiate_group_chat(pattern=pattern,
                                   messages="What is the weather like in Tbilisi?",
                                   )