Microsoft Agent Framework example connecting to SAP Generative AI Hub via LiteLLM

How Microsoft Agent Framework works

LLM access via LiteLLM Proxy

Microsoft Agent Framework supports the LiteLLM Proxy for LLM access via OpenAI-compatible API calls.

Follow the details in the LiteLLM Proxy setup for SAP Generative AI Hub.

Installation

%pip install agent-framework

Set environment variables

Add the following LiteLLM Proxy variables to a file called ".env" and put it in the same folder where you run the notebook:

LITELLM_PROXY_API_KEY=sk-1234
PROXY_BASE_URL=http://localhost:4000
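
Optionally, verify that the proxy is reachable with a plain OpenAI client call before wiring it into the agent. This is a minimal sketch, assuming the proxy runs at http://localhost:4000 with the key from the .env file and exposes a model alias sap/gpt-4o (adjust these values to match your proxy configuration):

[ ]:
import os

from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()

# Plain OpenAI client pointed at the LiteLLM Proxy (values from the .env file above).
client = OpenAI(
    api_key=os.getenv("LITELLM_PROXY_API_KEY"),
    base_url=os.getenv("PROXY_BASE_URL"),
)
response = client.chat.completions.create(
    model="sap/gpt-4o",  # model alias served by the proxy (assumption; same as the agent below)
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)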

Run Microsoft Agent Framework with LiteLLM and SAP LLMs

[ ]:
import os
from typing import Annotated

import litellm
from agent_framework import ChatAgent
from agent_framework.openai import OpenAIChatClient
from dotenv import load_dotenv
from pydantic import Field

Load your credentials as environment variables.

[ ]:
litellm.use_litellm_proxy = True  # route any direct LiteLLM calls through the proxy
load_dotenv()  # reads LITELLM_PROXY_API_KEY and PROXY_BASE_URL from the .env file
api_base = os.getenv("PROXY_BASE_URL")
api_key = os.getenv("LITELLM_PROXY_API_KEY")

Define the agent tool.

[ ]:
def get_weather(
    city: Annotated[str, Field(description="The location to get weather for")]
) -> str:
    """Return a mock weather report for the given city."""
    city_normalized = city.lower().replace(" ", "")

    mock_weather_db = {
        "newyork": "The weather in New York is sunny with a temperature of 25°C.",
        "london": "It's cloudy in London with a temperature of 15°C.",
        "tokyo": "Tokyo is experiencing light rain and a temperature of 18°C.",
    }

    if city_normalized in mock_weather_db:
        return mock_weather_db[city_normalized]
    else:
        # Fallback for cities that are not in the mock database.
        return f"The weather in {city} is sunny with a temperature of 20°C."

Create the agent using OpenAIChatClient with the proxy credentials and the tool. The model_id refers to the model name exposed by the LiteLLM Proxy (here sap/gpt-4o).

[ ]:
agent = ChatAgent(
    chat_client=OpenAIChatClient(
        model_id="sap/gpt-4o",  # model alias served by the LiteLLM Proxy
        api_key=api_key,
        base_url=api_base,
    ),
    instructions="""
        You are a helpful weather assistant.
        When the user asks for the weather in a specific city, use the 'get_weather' tool to find the information.
        If the tool returns an error, inform the user politely.
        If the tool is successful, write a couple of sentences for a TV weather report in the given city including a small joke.
        """,
    name="litellm_agent",
    tools=[get_weather],
)

Define the async function that runs the agent.

[ ]:
async def tools_example():
    result = await agent.run("What's the weather like in Tokyo?")
    print(result.text)

Run the function

[ ]:
await tools_example()
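
Top-level await works directly in a notebook. If you move this example into a plain Python script, wrap the call with asyncio.run instead:

import asyncio

if __name__ == "__main__":
    asyncio.run(tools_example())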