AWS Strands Agents example connecting to SAP Generative AI Hub via LiteLLM

How AWS Strands Agents work
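Strands Agents is an open-source SDK from AWS for building AI agents. An agent combines a model, a system prompt, and a set of tools: when you invoke the agent with a prompt, the model decides whether to answer directly or to call one of the tools, the tool results are fed back to the model, and the agent returns the final response. In this example, LiteLLM is used as the model provider so that the agent's requests go to an LLM deployed in SAP Generative AI Hub.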

Installation

%pip install strands-agents litellm
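The notebook also reads the credentials from the “.env” file with python-dotenv. If that package is not already available in your environment (it may be pulled in by other dependencies), install it as well:

[ ]:
%pip install python-dotenv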

Credentials for SAP Gen AI Hub

Get the service key from your SAP BTP tenant with an AI subscription.

Add the following variables from the service key to a file called “.env” and place it in the same folder where you run the notebook:

AICORE_AUTH_URL="https://***.authentication.sap.hana.ondemand.com/oauth/token"
AICORE_CLIENT_ID="***"
AICORE_CLIENT_SECRET="***"
AICORE_RESOURCE_GROUP="***"
AICORE_BASE_URL="https://api.ai.***.cfapps.sap.hana.ondemand.com/"
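These values come from the service key JSON: AICORE_AUTH_URL is typically the url field plus /oauth/token, AICORE_CLIENT_ID and AICORE_CLIENT_SECRET map to clientid and clientsecret, and AICORE_BASE_URL corresponds to the AI API URL under serviceurls. AICORE_RESOURCE_GROUP is the resource group configured in your SAP AI Core tenant, often “default”.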

Run a Strands Agent with LiteLLM and SAP LLMs

[ ]:
from dotenv import load_dotenv
from strands import Agent, tool
from strands.models.litellm import LiteLLMModel

Load your credentials as environment variables that LiteLLM can pick up automatically.

[ ]:
load_dotenv()
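Optionally, check that all required variables are present before creating the agent (a minimal sanity check, assuming the variable names from the “.env” file above):

[ ]:
import os

# The AICORE_* variables LiteLLM needs to reach SAP Generative AI Hub.
required = [
    "AICORE_AUTH_URL",
    "AICORE_CLIENT_ID",
    "AICORE_CLIENT_SECRET",
    "AICORE_RESOURCE_GROUP",
    "AICORE_BASE_URL",
]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise RuntimeError(f"Missing environment variables: {missing}")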

Define the agent tools. The @tool decorator turns a plain Python function into a tool the model can call; the function's docstring and type hints describe the tool to the model.

[ ]:
@tool
def get_weather(city: str) -> str:
    """Return a short weather report for the given city."""
    # Normalize the city name so that e.g. "New York" matches "newyork".
    city_normalized = city.lower().replace(" ", "")

    # Mock weather data instead of a real weather API call.
    mock_weather_db = {
        "newyork": "The weather in New York is sunny with a temperature of 25°C.",
        "london": "It's cloudy in London with a temperature of 15°C.",
        "tokyo": "Tokyo is experiencing light rain and a temperature of 18°C.",
    }

    if city_normalized in mock_weather_db:
        return mock_weather_db[city_normalized]
    else:
        # Fallback for cities not in the mock database.
        return f"The weather in {city} is sunny with a temperature of 20°C."

Create the agent with a LiteLLMModel object as its model.

[ ]:
agent = Agent(
    system_prompt=(
        "You are a helpful weather assistant. "
        "When the user asks about a specific city, "
        "use the 'get_weather' tool to find the weather information. "
        "Provide the TV weather report in two sentences including a small joke."
    ),
    model=LiteLLMModel(model_id="sap/gpt-5"),
    tools=[get_weather],
)
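The model_id uses LiteLLM's sap/ provider prefix, so the part after sap/ has to match a model deployed in your SAP Generative AI Hub resource group. If you want to pass generation parameters, the Strands LiteLLM provider also accepts a params dictionary that is forwarded to the underlying completion call. The sketch below assumes the current strands-agents LiteLLMModel signature; the values are illustrative, and not every parameter is supported by every deployed model:

[ ]:
# Illustrative: limit the response length via LiteLLM parameters.
model = LiteLLMModel(
    model_id="sap/gpt-5",
    params={"max_tokens": 512},
)
# The model object can then be passed to Agent(..., model=model) as above.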

Run the agent with your prompt.

[ ]:
response = agent("london")
print(response)
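Printing the returned result renders the agent's final text. The agent also handles full natural-language questions, for example (the prompt below is just an illustration):

[ ]:
response = agent("What's the weather like in Tokyo today?")
print(response)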