Hi Community,
In this article, I will introduce my application, iris-AgenticAI.
The rise of agentic AI marks a transformative leap in how artificial intelligence interacts with the world, moving beyond static responses to dynamic, goal-driven problem-solving. The application is powered by the OpenAI Agents SDK, which enables you to build agentic AI apps in a lightweight, easy-to-use package with very few abstractions. It is a production-ready upgrade of OpenAI's previous experiment for agents, Swarm.
This application showcases the next generation of autonomous AI systems capable of reasoning, collaborating, and executing complex tasks with human-like adaptability.
Application Features
- Agent Loop 🔄 A built-in loop that autonomously manages tool execution, sends results back to the LLM, and iterates until task completion.
- Python-First 🐍 Leverage native Python syntax (decorators, generators, etc.) to orchestrate and chain agents without external DSLs.
- Handoffs 🤝 Seamlessly coordinate multi-agent workflows by delegating tasks between specialized agents.
- Function Tools ⚒️ Decorate any Python function with @function_tool to instantly integrate it into the agent's toolkit.
- Vector Search (RAG) 🧠 Native integration with the InterSystems IRIS vector store for RAG retrieval.
- Tracing 🔍 Built-in tracing to visualize, debug, and monitor agent workflows in real time (think LangSmith alternatives).
- MCP Servers 🌐 Support for Model Context Protocol (MCP) via stdio and HTTP, enabling cross-process agent communication.
- Chainlit UI 🖥️ Integrated Chainlit framework for building interactive chat interfaces with minimal code.
- Stateful Memory 🧠 Preserve chat history, context, and agent state across sessions for continuity and long-running tasks.
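To illustrate the Chainlit UI and stateful memory features above, here is a minimal sketch of a chat handler. The triage_agent below is a placeholder for the application's real triage agent, and the history handling simply follows the SDK's result.to_input_list() pattern.
import chainlit as cl

from agents import Agent, Runner

# Placeholder agent; in the application this is the triage agent with its handoffs.
triage_agent = Agent(name="Triage agent", instructions="Answer the user's question.")

@cl.on_chat_start
async def on_chat_start():
    # Stateful memory: keep the conversation history for this session.
    cl.user_session.set("history", [])

@cl.on_message
async def on_message(message: cl.Message):
    history = cl.user_session.get("history")
    history.append({"role": "user", "content": message.content})
    result = await Runner.run(triage_agent, history)
    # Persist the full turn list so the next message continues the conversation.
    cl.user_session.set("history", result.to_input_list())
    await cl.Message(content=result.final_output).send()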
Agent
Agents are the core building block in your apps. An agent is a large language model (LLM) configured with instructions and tools.
Basic configuration
The most common properties of an agent you'll configure are:
- instructions: also known as a developer message or system prompt.
- model: which LLM to use, and optional model_settings to configure model tuning parameters like temperature, top_p, etc.
- tools: tools that the agent can use to achieve its tasks.
from agents import Agent, ModelSettings, function_tool

@function_tool
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny"

agent = Agent(
    name="Haiku agent",
    instructions="Always respond in haiku form",
    model="o3-mini",
    tools=[get_weather],
)
Running agents
You can run agents via the Runner class. You have 3 options:
- Runner.run(), which runs async and returns a RunResult.
- Runner.run_sync(), which is a sync method and just runs .run() under the hood.
- Runner.run_streamed(), which runs async and returns a RunResultStreaming. It calls the LLM in streaming mode, and streams those events to you as they are received.
import asyncio

from agents import Agent, Runner

async def main():
    agent = Agent(name="Assistant", instructions="You are a helpful assistant")
    result = await Runner.run(agent, "Write a haiku about recursion in programming.")
    print(result.final_output)
    # Code within the code,
    # Functions calling themselves,
    # Infinite loop's dance.

if __name__ == "__main__":
    asyncio.run(main())
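For the streaming variant, a minimal sketch based on the SDK's streaming pattern might iterate the events returned by Runner.run_streamed() and print raw token deltas as they arrive:
import asyncio

from agents import Agent, Runner
from openai.types.responses import ResponseTextDeltaEvent

async def main():
    agent = Agent(name="Assistant", instructions="You are a helpful assistant")
    result = Runner.run_streamed(agent, "Write a haiku about recursion in programming.")
    # Stream events as they are received; print only the raw text deltas from the LLM.
    async for event in result.stream_events():
        if event.type == "raw_response_event" and isinstance(event.data, ResponseTextDeltaEvent):
            print(event.data.delta, end="", flush=True)

asyncio.run(main())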
Agent Architecture
The application comprises 7 specialized agents:
- Triage Agent 🤖
- Role: Primary router that receives user input and delegates tasks via handoffs
- Example: Routes "Show production errors" → IRIS Production Agent
- Vector Search Agent 🤖
- Role: Provides IRIS 2025.1 Release Notes details (RAG functionality)
- Example: "Provide me a summary of the Release Notes" is routed here by the Triage Agent
- IRIS Dashboard Agent 🤖
- Function: Provides real-time management portal metrics, including:
ApplicationErrors, CSPSessions, CacheEfficiency, DatabaseSpace, DiskReads, DiskWrites, ECPAppServer, ECPDataServer, GloRefs, JournalStatus, LicenseCurrent, LockTable, Processes, SystemUpTime, WriteDaemon, [...]
- IRIS Running Process Agent 🤖
- Function: Monitors active processes with details: Process ID | Namespace | Routine | State | PidExternal
- IRIS Production Agent 🤖
- Role: Provides production details, along with the functionality to start and stop the production.
- WebSearch Agent 🤖
- Capability: Performs contextual web searches via API integrations
- Order Agent 🤖
- Function: Retrieves order status using an order ID
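As an illustration of how one of these specialized agents can be wired up, here is a minimal sketch of an Order Agent. The in-memory order lookup below is a hypothetical stand-in for the application's real data source.
from agents import Agent, function_tool

# Hypothetical in-memory order data, used only for illustration.
_ORDERS = {"1001": "Shipped", "1002": "Processing"}

@function_tool
def get_order_status(order_id: str) -> str:
    """Return the status of an order by its ID."""
    return _ORDERS.get(order_id, "Order not found")

order_agent = Agent(
    name="Order agent",
    instructions="Answer questions about order status using the get_order_status tool.",
    tools=[get_order_status],
)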
Handoffs
Handoffs allow an agent to delegate tasks to another agent. This is particularly useful in scenarios where different agents specialize in distinct areas. For example, a customer support app might have agents that each specifically handle tasks like order status, refunds, FAQs, etc.
The Triage Agent is our main agent; it delegates tasks to the other agents based on user input.
# TRIAGE AGENT: the main agent receives user input and delegates to other agents by using handoffs
triage_agent = Agent(
    name="Triage agent",
    instructions=(
        "Handoff to the appropriate agent based on the user query."
        "If they ask about Release Notes, handoff to the vector_search_agent."
        "If they ask about production, handoff to the production agent."
        "If they ask about dashboard, handoff to the dashboard agent."
        "If they ask about process, handoff to the processes agent."
        "Use the WebSearchAgent tool to find information related to the user's query, and do not use this agent if the query is about Release Notes."
        "If they ask about order, handoff to the order_agent."
    ),
    handoffs=[vector_search_agent, production_agent, dashboard_agent, processes_agent, order_agent, web_search_agent]
)
Tracing
The Agents SDK includes built-in tracing, collecting a comprehensive record of events during an agent run: LLM generations, tool calls, handoffs, guardrails, and even custom events that occur. Using the Traces dashboard, you can debug, visualize, and monitor your workflows during development and in production.
https://platform.openai.com/logs
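Beyond the automatic per-run traces, a minimal sketch (assuming the SDK's trace() context manager, with an illustrative workflow name) shows how several runs can be grouped under a single workflow trace:
import asyncio

from agents import Agent, Runner, trace

async def main():
    agent = Agent(name="Assistant", instructions="You are a helpful assistant")
    # Group two related runs under one workflow trace in the Traces dashboard.
    with trace("iris-AgenticAI demo workflow"):
        first = await Runner.run(agent, "Summarize what agentic AI means in one sentence.")
        second = await Runner.run(agent, f"Rewrite this as a haiku: {first.final_output}")
        print(second.final_output)

asyncio.run(main())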
Application Interface
Application Workflow Process
Vector Search Agent
The Vector Search Agent automatically ingests the "New in InterSystems IRIS 2025.1" release notes text into the IRIS vector store, and it performs the ingestion only once, if the data doesn't already exist.
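A minimal sketch of that ingest-once guard might look like the following. It assumes the intersystems-irispython DB-API driver, placeholder connection settings, and a hypothetical ingest_release_notes() helper that performs the actual embedding and insert.
import iris  # intersystems-irispython DB-API driver (assumed)

def ingest_release_notes(conn) -> None:
    """Hypothetical helper: embed the release notes text and insert it into SQLUser.AgenticAIRAG."""
    ...

def ensure_release_notes_ingested(conn) -> None:
    # Ingest only once: skip when the table already holds documents.
    cursor = conn.cursor()
    cursor.execute("SELECT COUNT(*) FROM SQLUser.AgenticAIRAG")
    (count,) = cursor.fetchone()
    if count == 0:
        ingest_release_notes(conn)

conn = iris.connect("localhost", 1972, "USER", "_SYSTEM", "SYS")  # placeholder connection settings
ensure_release_notes_ingested(conn)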
Use the query below to retrieve the data
SELECT
id, embedding, document, metadata
FROM SQLUser.AgenticAIRAG
The Triage Agent receives user input, routing the question to the Vector Search Agent.
IRIS Dashboard Agent
The Triage Agent receives user input, routing the question to the IRIS Dashboard Agent.
IRIS Processes Agent
The Triage Agent receives user input, routing the question to the IRIS Processes Agent.
IRIS Production Agent
Start and stop the production by using the Production Agent.
Get production details by using the Production Agent.
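As a hedged sketch of how start/stop tools might be exposed to this agent, the following assumes the intersystems-irispython Native API and the standard Ens.Director class; the connection settings and production name are placeholders, and the real application may call the production differently.
import iris  # intersystems-irispython (assumed)
from agents import function_tool

# Placeholder connection; in the application this would come from configuration.
conn = iris.connect("localhost", 1972, "USER", "_SYSTEM", "SYS")
irispy = iris.createIRIS(conn)  # Native API handle for calling ObjectScript classes

@function_tool
def start_production(production_name: str) -> str:
    """Start the named interoperability production via Ens.Director."""
    status = irispy.classMethodValue("Ens.Director", "StartProduction", production_name)
    return f"StartProduction returned: {status}"

@function_tool
def stop_production(production_name: str) -> str:
    """Stop the named interoperability production via Ens.Director."""
    status = irispy.classMethodValue("Ens.Director", "StopProduction", production_name)
    return f"StopProduction returned: {status}"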
Local Agent
The Triage Agent receives user input, routing the question to the Local Order Agent.
WebSearch Agent
Here, the Triage Agent receives two questions, routing both to the WebSearch Agent.
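A minimal sketch of such a web-search agent, assuming the SDK's hosted WebSearchTool (the application may use a different search integration):
from agents import Agent, WebSearchTool

web_search_agent = Agent(
    name="WebSearch agent",
    instructions="Search the web to answer the user's question and cite the sources you used.",
    tools=[WebSearchTool()],  # hosted web search tool provided by the SDK
)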
MCP Server application
The MCP server is running at http://localhost:8000/sse
Below is the code to start the MCP server:
import os
import shutil
import subprocess
import time
from typing import Any

from dotenv import load_dotenv

load_dotenv()

# Get the OPENAI key; if not found in .env, fall back to the GEMINI API key.
# If both are defined, the OPENAI key takes precedence.
openai_api_key = os.getenv("OPENAI_API_KEY")
if not openai_api_key:
    raise ValueError("OPENAI_API_KEY is not set. Please ensure it is defined in the .env file.")

if __name__ == "__main__":
    # Let's make sure the user has uv installed
    if not shutil.which("uv"):
        raise RuntimeError(
            "uv is not installed. Please install it: https://docs.astral.sh/uv/getting-started/installation/"
        )

    # We'll run the SSE server in a subprocess. Usually this would be a remote server, but for this
    # demo, we'll run it locally at http://localhost:8000/sse
    process: subprocess.Popen[Any] | None = None
    try:
        this_dir = os.path.dirname(os.path.abspath(__file__))
        server_file = os.path.join(this_dir, "MCPserver.py")

        print("Starting SSE server at http://localhost:8000/sse ...")

        # Run `uv run MCPserver.py` to start the SSE server
        process = subprocess.Popen(["uv", "run", server_file])
        # Give it 3 seconds to start
        time.sleep(3)

        print("SSE server started. Running example...\n\n")
    except Exception as e:
        print(f"Error starting SSE server: {e}")
        exit(1)
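On the agent side, a minimal sketch of connecting to that SSE endpoint might use the SDK's MCPServerSse helper and attach it to an agent; the server name and sample question below are illustrative.
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerSse

async def main():
    # Connect to the locally running MCP server over SSE.
    async with MCPServerSse(
        name="IRIS MCP server",
        params={"url": "http://localhost:8000/sse"},
    ) as server:
        agent = Agent(
            name="MCP Assistant",
            instructions="Use the MCP server tools to answer the user's question.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "What does the IRIS Info tool report?")
        print(result.final_output)

asyncio.run(main())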
The MCP Server is equipped with the following tools:
- Provide IRIS 2025.1 Release notes details (Vector Search)
- IRIS Info tool
- Check Weather tool
- Find secret word tool (Local function)
- Addition Tool (Local function)
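A hedged sketch of how the local tools in this list can be registered on the server side, assuming the MCP Python SDK's FastMCP helper; the tool bodies are illustrative, not the application's actual implementations.
import random

from mcp.server.fastmcp import FastMCP

# Create the MCP server; tools registered below are exposed to connected agents.
mcp = FastMCP("IRIS Agentic AI MCP Server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers (local function)."""
    return a + b

@mcp.tool()
def get_secret_word() -> str:
    """Return a secret word (local function)."""
    return random.choice(["apple", "banana", "cherry"])

if __name__ == "__main__":
    # Serve over SSE at http://localhost:8000/sse
    mcp.run(transport="sse")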
The MCP application is running at http://localhost:8001
MCP Server Vector Search (RAG) functionality
The MCP Server is equipped with InterSystems IRIS vector search ingestion capabilities and Retrieval-Augmented Generation (RAG) functionality.
Other MCP Server functionality
The MCP Server dynamically delegates tasks to the appropriate tool based on user input.
For more details, please visit the iris-AgenticAI Open Exchange application page.
Thanks