LlamaIndex example connecting to SAP Generative AI Hub via LiteLLM
How LlamaIndex works
LlamaIndex provides the agent, tool, and LLM abstractions used in this example; LiteLLM acts as the model client underneath, routing the actual calls to SAP Generative AI Hub.
Installation
%pip install llama-cloud-services llama-index llama-index-llms-litellm python-dotenv
Credentials for SAP Gen AI Hub
Get the service key from your SAP BTP tenant with an SAP AI Core subscription.
Add the following variables from the service key to a file called “.env” and place it in the same folder where you run the notebook:
AICORE_AUTH_URL="https://***.authentication.sap.hana.ondemand.com/oauth/token"
AICORE_CLIENT_ID=" *** "
AICORE_CLIENT_SECRET=" *** "
AICORE_RESOURCE_GROUP=" *** "
AICORE_BASE_URL="https://api.ai.***.cfapps.sap.hana.ondemand.com/"
Run LlamaIndex with LiteLLM and SAP LLMs
[ ]:
from dotenv import load_dotenv
from llama_index.core.agent.workflow import ReActAgent
from llama_index.core.llms import ChatMessage
from llama_index.core.tools import FunctionTool
from llama_index.llms.litellm import LiteLLM
Load your credentials as environment variables that LiteLLM can pick up automatically.
[ ]:
load_dotenv()
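Optionally, verify that the credentials were actually picked up before continuing. This is a minimal sanity check using the variable names from the .env file above:
[ ]:
import os

# Quick sanity check: confirm the credentials were loaded from .env
for var in ("AICORE_AUTH_URL", "AICORE_CLIENT_ID", "AICORE_BASE_URL"):
    print(var, "set" if os.getenv(var) else "MISSING")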
Define the model with the SAP LLM.
[ ]:
llm = LiteLLM(
    "sap/gpt-5",
    temperature=1,
)
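Before wiring the model into an agent, you can send a quick test prompt through LlamaIndex's standard complete interface. A minimal smoke test (the prompt text is illustrative):
[ ]:
# Smoke test: a single completion call routed through LiteLLM to the SAP-hosted model
print(llm.complete("Say hello in one short sentence."))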
Define the agent tool
[ ]:
def get_weather(city: str) -> dict:
    """Retrieves the current weather report for a specified city."""
    city_normalized = city.lower().replace(" ", "")
    mock_weather_db = {
        "newyork": {"status": "success", "report": "The weather in New York is sunny with a temperature of 25°C."},
        "london": {"status": "success", "report": "It's cloudy in London with a temperature of 15°C."},
        "tokyo": {"status": "success", "report": "Tokyo is experiencing light rain and a temperature of 18°C."},
    }
    if city_normalized in mock_weather_db:
        return mock_weather_db[city_normalized]
    else:
        return {"status": "error", "error_message": f"Sorry, I don't have weather information for '{city}'."}
Register the tool
[ ]:
tool = FunctionTool.from_defaults(get_weather)
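FunctionTool.from_defaults derives the tool's name and description from the function signature and docstring. You can inspect the metadata the agent will reason over:
[ ]:
# The agent uses this metadata to decide when to call the tool
print(tool.metadata.name)
print(tool.metadata.description)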
Create the message
[ ]:
message = ChatMessage(role="user", content="What is the weather like in London?")
Define the agent with the model and the tool
[ ]:
agent = ReActAgent(llm=llm, tools=[tool])
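ReActAgent also accepts an optional system prompt to steer how the agent uses its tools. A sketch, assuming the system_prompt parameter of the workflow-based agent; the prompt text is illustrative:
[ ]:
# Optional variant: the same agent with a system prompt guiding tool use
agent = ReActAgent(
    llm=llm,
    tools=[tool],
    system_prompt="You are a weather assistant. Use the get_weather tool to answer weather questions.",
)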
Define the async function that runs the agent
[ ]:
async def main():
    response = await agent.run(user_msg=message)
    print(response)
Run the function
[ ]:
await main()
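Top-level await works here because Jupyter runs its own event loop. In a plain Python script you would start the event loop yourself:
[ ]:
import asyncio

# In a standalone script (outside a notebook), run the coroutine like this:
asyncio.run(main())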