
Commit a0436a2

committed
Added a2a agent code.
1 parent 122b12e commit a0436a2

20 files changed: +4302 −0 lines changed

Lines changed: 200 additions & 0 deletions
@@ -0,0 +1,200 @@
# Agent-to-Agent (A2A) Communication on OCI Model Deployment

This project demonstrates a sophisticated agent-to-agent communication system deployed on Oracle Cloud Infrastructure (OCI) Model Deployment service. The system consists of two specialized agents that work together to provide comprehensive OCI realm status information through collaborative AI interactions.

## Architecture Overview

```
┌─────────────────────────────────────────────────────────────────┐
│ Client Application │
│ │ │
│ ▼ │
┌─────────────────────────────────────────────────────────────────┐
│ OCI Model Deployment Service │
│ │
│ ┌─────────────────┐ ┌─────────────────┐ │
│ │ Agent A │ │ Agent B │ │
│ │ (Primary Agent) │ │ (Specialized) │ │
│ │ │ │ │ │
│ │ • Handles OC4-6 │◄─── A2A Protocol ─►│ • Handles OC1-3 │ │
│ │ • Orchestrates │ │ • Status │ │
│ │ Communication │ │ Reporter │ │
│ │ • Aggregates │ │ • Data │ │
│ │ Results │ │ Processing │ │
│ └─────────────────┘ └─────────────────┘ │
└─────────────────────────────────────────────────────────────────┘
```

## System Capabilities

### Agent A (Primary Agent)
- **Role**: Orchestrator and aggregator
- **Responsibilities**:
  - Receives client requests for OCI realm status (OC1-OC6)
  - Manages its own status data for OC4-OC6
  - Communicates with Agent B to retrieve status for OC1-OC3
  - Aggregates and returns comprehensive status information
- **Port**: 9999
- **Skills**: OCI realm status aggregation and inter-agent communication

### Agent B (Specialized Agent)
- **Role**: Specialized status provider
- **Responsibilities**:
  - Provides status information for OC1-OC3 realms
  - Responds to A2A protocol requests from Agent A
  - Maintains focused expertise on specific realm data
- **Port**: 9998
- **Skills**: OCI realm status reporting for older realms
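
To make the division of labor concrete, here is a minimal sketch of the aggregation flow described above. It is illustrative only: `query_agent_b` and the hard-coded status strings are hypothetical stand-ins for the real executor and A2A call, not the code shipped in `agent_executor.py`.

```python
import asyncio

# Hypothetical data owned directly by Agent A (OC4-OC6).
NEW_REALMS = {"OC4": "✅", "OC5": "✅", "OC6": "✅"}


async def query_agent_b() -> str:
    """Stand-in for an A2A request to Agent B, which reports OC1-OC3."""
    return "🟨 Old Realms status 🟨: OC1 ✅, OC2 ✅, OC3 ✅"


async def realm_status() -> dict:
    # Agent A formats its own realms, asks Agent B for the rest, then aggregates.
    own = "🟩 New Realms Status 🟩: " + ", ".join(f"{k} {v}" for k, v in NEW_REALMS.items())
    other = await query_agent_b()
    return {"this_agent_result": own, "other_agent_result": other}


if __name__ == "__main__":
    print(asyncio.run(realm_status()))
```
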
## Quick Start

### Prerequisites

- Oracle Cloud Infrastructure (OCI) account
- OCI CLI installed locally
- Python uv package manager: https://docs.astral.sh/uv/guides/install-python/
- Docker installed locally (for testing)
- Access to OCI Model Deployment service

### Local Development Setup

1. **Clone and navigate to the project**:
   ```bash
   cd model-deployment/A2A_agents_on_MD/
   ```

2. **Set up Agent A**:
   ```bash
   cd agent_a
   uv sync
   uv run .
   ```

3. **Set up Agent B** (in a separate terminal):
   ```bash
   cd agent_b
   uv sync
   uv run .
   ```

## Docker Deployment

### Local Docker Testing

**Agent A**:
```bash
cd agent_a
docker build -t agent-a .
docker run -p 9999:9999 \
  -e AGENT_A_URL="http://localhost:9999/a2a" \
  -e AGENT_B_URL="http://localhost:9998/a2a" \
  agent-a
```

**Agent B**:
```bash
cd agent_b
docker build -t agent-b .
docker run -p 9998:9998 \
  -e AGENT_A_URL="http://localhost:9999/a2a" \
  -e AGENT_B_URL="http://localhost:9998/a2a" \
  agent-b
```

## OCI Model Deployment

### Step 1: Build and Push Docker Images

**For Agent A**:
```bash
# Login to OCIR
docker login <region>.ocir.io

# Build and tag
docker build -t <region>.ocir.io/<tenancy>/<repo>/agent-a:latest ./agent_a

# Push to OCIR
docker push <region>.ocir.io/<tenancy>/<repo>/agent-a:latest
```

**For Agent B**:
```bash
# Build and tag
docker build -t <region>.ocir.io/<tenancy>/<repo>/agent-b:latest ./agent_b

# Push to OCIR
docker push <region>.ocir.io/<tenancy>/<repo>/agent-b:latest
```

### Step 2: Deploy on OCI Model Deployment Service

1. **Deploy Agent B First**:
   - Navigate to OCI Data Science → Model Deployments
   - Create new deployment using the Agent B Docker image
   - Configure environment variables:
     ```
     AGENT_A_URL=https://<agent-a-deployment-url>/predict/a2a
     AGENT_B_URL=https://<agent-b-deployment-url>/predict/a2a
     CUSTOM_PREDICT_URL_ID=agent-b
     MODEL_DEPLOY_CUSTOM_ENDPOINTS=[{"endpointURI": "/a2a/", "httpMethods": ["POST"]},{"endpointURI": "/a2a", "httpMethods": ["POST"]},{"endpointURI": "/.well-known/agent.json", "httpMethods": ["GET"]},{"endpointURI": "/a2a/.well-known/agent.json", "httpMethods": ["GET"]},{"endpointURI": "/health", "httpMethods": ["GET"]}]
     WEB_CONCURRENCY=1
     ```
   - Set port to 9998
   - Deploy and note the deployment URL

2. **Deploy Agent A**:
   - Create new deployment using the Agent A Docker image
   - Configure environment variables:
     ```
     AGENT_A_URL=https://<agent-a-deployment-url>/predict/a2a
     AGENT_B_URL=https://<agent-b-deployment-url>/predict/a2a
     CUSTOM_PREDICT_URL_ID=agent-a
     MODEL_DEPLOY_CUSTOM_ENDPOINTS=[{"endpointURI": "/a2a/", "httpMethods": ["POST"]},{"endpointURI": "/a2a", "httpMethods": ["POST"]},{"endpointURI": "/.well-known/agent.json", "httpMethods": ["GET"]},{"endpointURI": "/a2a/.well-known/agent.json", "httpMethods": ["GET"]},{"endpointURI": "/health", "httpMethods": ["GET"]}]
     WEB_CONCURRENCY=1
     ```
   - Set port to 9999
   - Deploy
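
The `MODEL_DEPLOY_CUSTOM_ENDPOINTS` value used in both deployments above is a single-line JSON array that is easy to mistype. As an optional sanity check, the same value can be generated with a short Python snippet (shown here purely for readability; the environment variable itself must stay on one line):

```python
import json

# Custom endpoints Model Deployment must expose for the A2A protocol,
# matching the MODEL_DEPLOY_CUSTOM_ENDPOINTS value shown above.
custom_endpoints = [
    {"endpointURI": "/a2a/", "httpMethods": ["POST"]},
    {"endpointURI": "/a2a", "httpMethods": ["POST"]},
    {"endpointURI": "/.well-known/agent.json", "httpMethods": ["GET"]},
    {"endpointURI": "/a2a/.well-known/agent.json", "httpMethods": ["GET"]},
    {"endpointURI": "/health", "httpMethods": ["GET"]},
]

# Compact, single-line form suitable for the environment variable.
print(json.dumps(custom_endpoints, separators=(",", ":")))
```
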
### Step 3: Configure Authentication

Both agents use OCI Resource Principal Signer (RPS) for authentication when deployed on OCI. The authentication is handled automatically by the A2A SDK in the Agent A code.
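
For reference, this is roughly what resource-principal signing looks like with the OCI Python SDK. It is a generic sketch rather than the exact code path the agents or the A2A SDK use internally, and the target URL is a placeholder:

```python
import oci
import requests

# Inside an OCI Model Deployment container a resource principal is available,
# so no API keys or local config files are needed.
signer = oci.auth.signers.get_resource_principals_signer()

# Placeholder URL -- substitute the other agent's deployment endpoint.
agent_card_url = "https://<agent-b-deployment-url>/predict/a2a/.well-known/agent.json"

# Requests signed with this signer are authenticated as the deployment's
# resource principal; IAM policies decide whether the call is authorized.
response = requests.get(agent_card_url, auth=signer)
print(response.status_code)
```
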
## Configuration

### Environment Variables

| Variable | Description |
|----------|-------------|
| `AGENT_A_URL` | Agent A's deployment URL |
| `AGENT_B_URL` | Agent B's deployment URL |
| `CUSTOM_PREDICT_URL_ID` | Custom URL identifier for Model Deployment |
| `MODEL_DEPLOY_CUSTOM_ENDPOINTS` | Custom endpoints configuration for the A2A protocol |
| `WEB_CONCURRENCY` | Number of worker processes for the web server |

### Port Configuration (in the BYOC panel)

- **Agent A**: Port 9999
- **Agent B**: Port 9998

## Usage Examples

### Using the Test Client (Recommended)

The `agent_a/test_client.py` file provides a complete example of how to interact with the A2A agents using OCI authentication.

```bash
# Navigate to agent_a directory
cd agent_a

# Run the test client
uv run python test_client.py
```
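
The snippet below is a minimal sketch of what such a client involves; it is not the contents of `test_client.py`. It assumes a local `~/.oci/config` API-key profile and an illustrative A2A JSON-RPC `message/send` payload (the exact payload shape depends on the a2a-sdk version in use).

```python
import json
import uuid

import oci
import requests

# Sign requests with the API key from the default ~/.oci/config profile.
config = oci.config.from_file()
signer = oci.signer.Signer(
    tenancy=config["tenancy"],
    user=config["user"],
    fingerprint=config["fingerprint"],
    private_key_file_location=config["key_file"],
)

# Placeholder deployment URL and an illustrative JSON-RPC payload.
agent_a_url = "https://<agent-a-deployment-url>/predict/a2a"
payload = {
    "jsonrpc": "2.0",
    "id": str(uuid.uuid4()),
    "method": "message/send",
    "params": {
        "message": {
            "role": "user",
            "messageId": str(uuid.uuid4()),
            "parts": [{"kind": "text", "text": "what are the functioning realms and their status?"}],
        }
    },
}

response = requests.post(agent_a_url, json=payload, auth=signer)
print(json.dumps(response.json(), indent=2))
```
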
### Expected Response

```json
{
    "this_agent_result": "🟩 New Realms Status 🟩: OC4 ✅, OC5 ✅, OC6 ✅",
    "other_agent_result": "🟨 Old Realms status 🟨: OC1 ✅, OC2 ✅, OC3 ✅"
}
```

---
Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
# Python-generated files
__pycache__/
*.py[oc]
build/
dist/
wheels/
*.egg-info

# Virtual environments
.venv
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
3.12
Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
# syntax=docker/dockerfile:1
FROM ghcr.io/astral-sh/uv:debian

# Set work directory
WORKDIR /app

# Copy project files
COPY . .

# Install dependencies using uv (preferred for pyproject.toml)
RUN uv sync

# Expose the port the app runs on
EXPOSE 9999

# Run the app
# CMD ["python", "-m", "__main__"]
CMD ["uv", "run", "."]
Lines changed: 53 additions & 0 deletions
@@ -0,0 +1,53 @@
# Agent A - OCI Realm Finder Agent

An agent-to-agent communication system built with the A2A SDK that enables seamless interaction between AI agents. This project demonstrates how to create an intelligent agent that can communicate with other agents to gather and process information collaboratively.

## Architecture

```
┌─────────────────┐ HTTP/A2A Protocol ┌─────────────────┐
│ Agent A │ ◄─────────────────────► │ Agent B │
│ (This Project) │ │ (External) │
│ │ │ │
│ • OCI Realm │ │ • Status │
│ Finder │ │ Reporter │
│ • Authentication│ │ • Data │
│ • Communication │ │ Processing │
└─────────────────┘ └─────────────────┘
```

## Quick Start

```bash
# Install dependencies
uv sync

# Run the agent
uv run .
```

## Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `AGENT_A_URL` | URL for this agent's deployment | Required |
| `AGENT_B_URL` | URL for the other agent to communicate with | Required |

## Exposed Endpoints

- `GET /health` - Health check endpoint
- `GET /a2a/.well-known/agent.json` - Agent card information
- `POST /a2a/messages` - Send messages to the agent
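
A quick way to probe these endpoints once the agent is running (a small sketch using `requests`; the localhost base URL assumes a local `uv run .` on the default port):

```python
import requests

BASE = "http://localhost:9999"  # assumed local run on the default port

# Health check endpoint.
print(requests.get(f"{BASE}/health").json())

# Agent card published by the A2A server under the /a2a prefix.
card = requests.get(f"{BASE}/a2a/.well-known/agent.json").json()
print(card.get("name"), card.get("version"))
```
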
### Docker Deployment

```bash
# Build and run
docker build -t agent-a .
docker run -p 9999:9999 \
  -e AGENT_A_URL="https://your-deployment-url.com/predict/a2a" \
  -e AGENT_B_URL="https://your-deployment-url.com/predict/a2a" \
  agent-a
```

You can push this Docker image to OCIR and then run it with the Model Deployment service. Add the environment variables in the Model Deployment environment variables section.
Lines changed: 81 additions & 0 deletions
@@ -0,0 +1,81 @@
import uvicorn
import os
from a2a.server.apps import A2AStarletteApplication
from a2a.server.request_handlers import DefaultRequestHandler
from a2a.server.tasks import InMemoryTaskStore
from a2a.types import (
    AgentCapabilities,
    AgentCard,
    AgentSkill,
)
from agent_executor import (
    OCIAllRealmFinderAgentExecutor,
)
from starlette.responses import JSONResponse
from starlette.applications import Starlette
from starlette.requests import Request


class PrefixDispatcher:
    """ASGI wrapper that strips a URL prefix (default /a2a) before delegating to the wrapped app."""

    def __init__(self, app, prefix="/a2a"):
        self.app = app
        self.prefix = prefix

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http" and scope["path"].startswith(self.prefix + "/"):
            scope = dict(scope)
            scope["path"] = scope["path"][len(self.prefix):]
            if not scope["path"]:
                scope["path"] = "/"
        await self.app(scope, receive, send)


if __name__ == '__main__':

    agent_a_url = os.getenv('AGENT_A_URL')

    skill = AgentSkill(
        id='oci_realm_finder',
        name='Returns OCI functioning realms and their status',
        description='just returns OCI functioning realms and their status',
        tags=['oci', 'realm', 'finder'],
        examples=['what are the functioning realms and their status?', 'what is the status of the OCI-1 realm?'],
    )

    public_agent_card = AgentCard(
        name='OCI Realm Finder Agent',
        description='Just a OCI realm finder agent',
        # url='http://localhost:9999/',  # TODO: change to the actual url of MD
        # url='https://modeldeployment.us-ashburn-1.oci.customer-oci.com/ocid1.datasciencemodeldeployment.oc1.iad.amaaaaaay75uckqavsz3dipblcb6ckgwljls5qosxramv4osvt77tr5nnrra/predict/a2a/',
        url=agent_a_url,
        version='1.0.0',
        defaultInputModes=['text'],
        defaultOutputModes=['text'],
        capabilities=AgentCapabilities(streaming=True),
        skills=[skill],
        supportsAuthenticatedExtendedCard=False,
    )

    request_handler = DefaultRequestHandler(
        agent_executor=OCIAllRealmFinderAgentExecutor(),
        task_store=InMemoryTaskStore(),
    )

    server = A2AStarletteApplication(
        agent_card=public_agent_card,
        http_handler=request_handler,
    )

    # Serve the A2A application under the /a2a prefix expected by Model Deployment's custom endpoints.
    app = server.build()
    prefix_app = PrefixDispatcher(app, prefix="/a2a")
    starlette_app = Starlette()

    @starlette_app.route("/health")
    async def health(request: Request):
        return JSONResponse({"status": "ok"})

    # Route /health to the small Starlette app and everything else to the prefixed A2A app.
    async def main_app(scope, receive, send):
        if scope["type"] == "http" and scope["path"].startswith("/health"):
            await starlette_app(scope, receive, send)
        else:
            await prefix_app(scope, receive, send)

    uvicorn.run(main_app, host='0.0.0.0', port=9999)
