Commit 690e0e1c authored by Robert Haase's avatar Robert Haase
Update 2 files

- /agents/requirements.txt
- /agents/react_agent_llama_index.ipynb
parent f88dccee
Merge request !5: Adds example for running agentic workflows using llama-index and the ScaDS.AI LLM server
%% Cell type:markdown id:4be6be18-242e-46e3-a63f-d104d08ab2de tags:
# Agentic workflows using Llama-Index
The [Llama-Index](https://github.com/run-llama/llama_index) library and its submodules, such as those for [Ollama](https://docs.llamaindex.ai/en/stable/api_reference/llms/ollama/) and [OpenAI-like](https://docs.llamaindex.ai/en/stable/api_reference/llms/openai_like/) endpoints, allow building agentic workflows using open-weight models. In such workflows, the LLM decides which functions to call, and it can call multiple functions in a row to answer a question.
See also
* [Agents in the ScaDS.AI Generative AI Notebooks](https://scads.github.io/generative-ai-notebooks/35_agents/readme.html)
%% Cell type:code id:729e51e6-a8fd-4c5e-8b25-220e5b4b5421 tags:
``` python
from llama_index.core.agent import ReActAgent
from llama_index.llms.ollama import Ollama
from llama_index.llms.openai_like import OpenAILike
from llama_index.core.tools import FunctionTool
import os
```
%% Cell type:markdown id:1c0851d5-81d6-4dfb-b2cd-4b89bbaf23ef tags:
In the following example we use the OpenAI-like API to access our institutional LLM server. Alternatively, one can set up a local installation of [Ollama](https://ollama.com), which works as well.
%% Cell type:code id:3afefc4d-918a-4367-9e6b-f1ce1c328452 tags:
``` python
## Use this to play with a local model via ollama
#llm = Ollama(model="llama3.2", request_timeout=120.0)
llm = OpenAILike(model="meta-llama/Llama-3.3-70B-Instruct", request_timeout=120.0, api_base="https://llm.scads.ai/v1", api_key=os.environ.get('SCADSAI_API_KEY'))
```
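%% Cell type:markdown tags:
If the `SCADSAI_API_KEY` environment variable is not set, `OpenAILike` receives `None` as API key and requests only fail later with an unhelpful error. A small helper that fails early instead; `require_env` is a hypothetical convenience function, not part of llama-index:
%% Cell type:code tags:
``` python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise a clear error if it is missing."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set.")
    return value

# Usage: api_key = require_env('SCADSAI_API_KEY')
```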
%% Cell type:markdown id:b24d4b77-5b26-4966-a6f2-0ed24c0892ab tags:
We define a couple of tools: functions that answer specific questions. In the case shown here, we would like to deal with customer orders and estimate their delivery dates when customers ask about them.
%% Cell type:code id:2115f4a5-8850-4dd3-b6a9-a34d957c5334 tags:
``` python
tools = []

@tools.append
def estimate_delivery_date_of_order(order_id: str) -> str:
    """Estimate the delivery date of a package identified by its order id."""
    if order_id == "292123":
        return "Friday"
    if order_id == "292456":
        return "Thursday"
    if order_id == "292789":
        return "Saturday"
    return "Unknown"

@tools.append
def get_recent_order_id(customer_name: str) -> str:
    """Get the most recent order id for a given customer."""
    if customer_name.lower() == "robert":
        return "292123"
    if customer_name.lower() == "alice":
        return "292456"
    if customer_name.lower() == "ivy":
        return "292789"
    if customer_name.lower() == "dennis":
        return "2921010"
    return "unknown"
```
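%% Cell type:markdown tags:
Since these tools are plain Python functions, they can be sanity-checked directly before handing them to an agent. Note that the `@tools.append` idiom rebinds each function name to `None` (because `list.append` returns `None`), so the functions are reached via the `tools` list. A self-contained sketch mirroring the definitions above:
%% Cell type:code tags:
``` python
tools = []

@tools.append
def estimate_delivery_date_of_order(order_id: str) -> str:
    """Estimate the delivery date of a package identified by its order id."""
    dates = {"292123": "Friday", "292456": "Thursday", "292789": "Saturday"}
    return dates.get(order_id, "Unknown")

@tools.append
def get_recent_order_id(customer_name: str) -> str:
    """Get the most recent order id for a given customer."""
    orders = {"robert": "292123", "alice": "292456", "ivy": "292789", "dennis": "2921010"}
    return orders.get(customer_name.lower(), "unknown")

# The decorated names are now bound to None; use the list entries instead.
estimate_delivery, recent_order = tools
print(estimate_delivery(recent_order("Robert")))   # Friday
print(estimate_delivery(recent_order("Dennis")))   # Unknown: no estimate exists for this order id
```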
%% Cell type:markdown id:3fe99085-155e-4962-a838-fd017a11d88b tags:
We also add an unrelated function, which should never be called when dealing with customers and deliveries.
%% Cell type:code id:f2f50948-1703-4dca-bf3b-9d46b762fcee tags:
``` python
@tools.append
def get_weather(city: str) -> str:
    """Returns the current weather in a given city."""
    return "sunny"
```
%% Cell type:markdown id:f467dd3b-1909-49e2-a48f-50c15d272f08 tags:
Next, we set up a [ReActAgent](https://docs.llamaindex.ai/en/stable/api_reference/agent/react/) to deal with the given functions and prompts.
%% Cell type:code id:180e8f3a-2b0f-49eb-a002-ede430b932b4 tags:
``` python
agent = ReActAgent.from_tools([FunctionTool.from_defaults(fn=t) for t in tools], llm=llm, verbose=True)
```
%% Cell type:markdown id:7f8cadf1-f15c-4b10-9481-3bcc08ab7a69 tags:
We can now ask the agent questions. In verbose mode, you can see which functions it calls to answer our request.
%% Cell type:code id:dcd2f1e2-fd13-4116-b8a3-510c298279dd tags:
``` python
response = agent.chat("My Name is Robert. I ordered something. And I would like to know when it will arrive.")
response.response
```
%% Output
> Running step 89187412-52e5-43fb-8b04-1b17a14c4477. Step input: My Name is Robert. I ordered something. And I would like to know when it will arrive.
Thought: The current language of the user is: English. I need to use a tool to help me answer the question.
Action: get_recent_order_id
Action Input: {'customer_name': 'Robert'}
Observation: 292123
> Running step 490db98a-c140-46a6-a139-145656631264. Step input: None
Thought: I have the order id. Now I can use another tool to estimate the delivery date.
Action: estimate_delivery_date_of_order
Action Input: {'order_id': '292123'}
Observation: Friday
> Running step 20836298-e613-43e9-a86b-b3f4d80d2e51. Step input: None
Thought: I can answer without using any more tools. I'll use the user's language to answer
Answer: Your order will arrive on Friday.

'Your order will arrive on Friday.'
%% Cell type:markdown id:2f2262cb-12ab-42df-a6e0-c1a741e356d7 tags:
For demonstration purposes, we run this again with a different customer and verbose mode switched off.
%% Cell type:code id:41e0da5d-a362-4042-b61a-9ad47b11fd09 tags:
``` python
agent = ReActAgent.from_tools([FunctionTool.from_defaults(fn=t) for t in tools], llm=llm)
response = agent.chat("My Name is Alice. I ordered something. And I would like to know when it will arrive.")
response.response
```
%% Output
'Your order will arrive on Thursday.'
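%% Cell type:markdown tags:
The verbose trace above shows the agent chaining two tool calls: customer name, then order id, then delivery date. The same chain can be sketched deterministically, without an LLM. This is only an illustration of the tool-chaining pattern; it is not how `ReActAgent` works internally:
%% Cell type:code tags:
``` python
def answer_delivery_question(customer_name: str) -> str:
    """Mimic the agent's two-step tool chain: name -> order id -> delivery date."""
    recent_orders = {"robert": "292123", "alice": "292456", "ivy": "292789", "dennis": "2921010"}
    delivery_days = {"292123": "Friday", "292456": "Thursday", "292789": "Saturday"}

    order_id = recent_orders.get(customer_name.lower(), "unknown")  # first tool call
    day = delivery_days.get(order_id, "Unknown")                    # second tool call
    if day == "Unknown":
        return "Sorry, I cannot estimate when your order will arrive."
    return f"Your order will arrive on {day}."

print(answer_delivery_question("Alice"))   # Your order will arrive on Thursday.
```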
%% Cell type:code id:eaaa7621-27ed-4477-bccf-d7cb6d786fb2 tags:
``` python
```
/agents/requirements.txt:
openai
llama-index==0.12.6
llama-index-llms-ollama==0.5.0
llama-index-llms-openai-like==0.3.3
tokenizers==0.21.0