@@ -0,0 +1,2 @@
OPENAI_CHAT_MODEL_ID="gpt-4o-2024-08-06"
OPENAI_API_KEY="your-openai-api-key"
@@ -0,0 +1,16 @@
FROM python:3.12-slim

WORKDIR /app

COPY . user_agent/
WORKDIR /app/user_agent

RUN if [ -f requirements.txt ]; then \
        pip install -r requirements.txt; \
    else \
        echo "No requirements.txt found"; \
    fi

EXPOSE 8088

CMD ["python", "main.py"]
47 changes: 47 additions & 0 deletions python/samples/demos/hosted_agents/agent_with_hosted_mcp/README.md
@@ -0,0 +1,47 @@
# Hosted Agents with Hosted MCP Demo

This demo showcases an agent with an MCP tool that can query the Microsoft Learn documentation service, hosted as an agent endpoint running locally in a Docker container.

## What the Project Does

This project demonstrates how to:

- Create an agent with a hosted MCP tool using the Agent Framework
- Host the agent as an agent endpoint running in a Docker container

## Prerequisites

- OpenAI API access and credentials
- Required environment variables (see Configuration section)

## Configuration

Follow the `.env.example` file to set up the necessary environment variables for OpenAI.

## Docker Deployment

Build and run using Docker:

```bash
# Build the Docker image
docker build -t hosted-agent-mcp .

# Run the container
docker run -p 8088:8088 hosted-agent-mcp
```
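
The Dockerfile copies the whole project directory into the image, so any `.env` file is baked in at build time. If you would rather keep credentials out of the image, one alternative sketch is to pass them at run time with Docker's `--env-file` flag (this assumes the agent reads its configuration from environment variables at startup):

```shell
# Hypothetical alternative: inject variables at run time instead of baking them into the image
docker run --env-file .env -p 8088:8088 hosted-agent-mcp
```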

> If you update the environment variables in the `.env` file, or change the code or the `Dockerfile`, make sure to rebuild the Docker image to apply the changes.

## Testing the Agent

Once the agent is running, you can test it by sending a documentation question. For example:

```bash
curl -sS -H "Content-Type: application/json" -X POST http://localhost:8088/responses -d '{"input": "How to create an Azure storage account using az cli?","stream":false}'
```
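
The same endpoint can also be called from Python. Below is a minimal sketch using only the standard library; the `/responses` path and payload shape mirror the curl example above, and the function and parameter names are illustrative (a container listening on port 8088 is assumed):

```python
import json
import urllib.request


def build_request(input_text: str, stream: bool = False) -> dict:
    """Build the JSON payload expected by the /responses endpoint."""
    return {"input": input_text, "stream": stream}


def ask_agent(input_text: str, base_url: str = "http://localhost:8088") -> dict:
    """POST a query to the hosted agent and return the parsed JSON response."""
    payload = json.dumps(build_request(input_text)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/responses",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    response = ask_agent("How to create an Azure storage account using az cli?")
    # In the sample response above, the assistant text lives at output[0].content[0].text
    print(response["output"][0]["content"][0]["text"])
```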

Expected response:

```json
{"object":"response","metadata":{},"agent":null,"conversation":{"id":"conv_6Y7osWAQ1ASyUZ7Ze0LL6dgPubmQv52jHb7G9QDqpV5yakc3ay"},"type":"message","role":"assistant","temperature":1.0,"top_p":1.0,"user":"","id":"resp_Vfd6mdmnmTZ2RNirwfldfqldWLhaxD6fO2UkXsVUg1jYJgftL9","created_at":1763075575,"output":[{"id":"msg_6Y7osWAQ1ASyUZ7Ze0PwiK2V4Bb7NOPaaEpQoBvFRZ5h6OfW4u","type":"message","status":"completed","role":"assistant","content":[{"type":"output_text","text":"To create an Azure Storage account using the Azure CLI, you'll need to follow these steps:\n\n1. **Install Azure CLI**: Make sure the Azure CLI is installed on your machine. You can download it from [here](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli).\n\n2. **Log in to Azure**: Open your terminal or command prompt and use the following command to log in to your Azure account:\n\n ```bash\n az login\n ```\n\n This command will open a web browser where you can log in with your Azure account credentials. If you're using a service principal, you would use `az login --service-principal ...` with the appropriate parameters.\n\n3. **Select the Subscription**: If you have multiple Azure subscriptions, set the default subscription that you want to use:\n\n ```bash\n az account set --subscription \"Your Subscription Name\"\n ```\n\n4. **Create a Resource Group**: If you don’t already have a resource group, create one using:\n\n ```bash\n az group create --name myResourceGroup --location eastus\n ```\n\n Replace `myResourceGroup` and `eastus` with your desired resource group name and location.\n\n5. **Create the Storage Account**: Use the following command to create the storage account:\n\n ```bash\n az storage account create --name mystorageaccount --resource-group myResourceGroup --location eastus --sku Standard_LRS\n ```\n\n Replace `mystorageaccount` with a unique name for your storage account. The storage account name must be between 3 and 24 characters in length, and may contain numbers and lowercase letters only. You can also choose other `--sku` options like `Standard_GRS`, `Standard_RAGRS`, `Standard_ZRS`, `Premium_LRS`, based on your redundancy and performance needs.\n\nBy following these steps, you'll create a new Azure Storage account in the specified resource group and location with the specified SKU.","annotations":[],"logprobs":[]}]}],"parallel_tool_calls":true,"status":"completed"}
```
25 changes: 25 additions & 0 deletions python/samples/demos/hosted_agents/agent_with_hosted_mcp/main.py
@@ -0,0 +1,25 @@
# Copyright (c) Microsoft. All rights reserved.


from agent_framework import HostedMCPTool
from agent_framework.openai import OpenAIChatClient
from azure.ai.agentserver.agentframework import from_agent_framework # pyright: ignore[reportUnknownVariableType]


def main():
    # Create an agent using the OpenAI chat client with an MCP tool that connects to the Microsoft Learn MCP server
    agent = OpenAIChatClient().create_agent(
        name="DocsAgent",
        instructions="You are a helpful assistant that can help with Microsoft documentation questions.",
        tools=HostedMCPTool(
            name="Microsoft Learn MCP",
            url="https://learn.microsoft.com/api/mcp",
        ),
    )

    # Run the agent as a hosted agent
    from_agent_framework(agent).run()


if __name__ == "__main__":
    main()
@@ -0,0 +1,2 @@
azure-ai-agentserver-agentframework==1.0.0b3
agent-framework
@@ -0,0 +1,2 @@
OPENAI_CHAT_MODEL_ID="gpt-4o-2024-08-06"
OPENAI_API_KEY="your-openai-api-key"
@@ -0,0 +1,16 @@
FROM python:3.12-slim

WORKDIR /app

COPY . user_agent/
WORKDIR /app/user_agent

RUN if [ -f requirements.txt ]; then \
        pip install -r requirements.txt; \
    else \
        echo "No requirements.txt found"; \
    fi

EXPOSE 8088

CMD ["python", "main.py"]
@@ -0,0 +1,54 @@
# Hosted Agents with Text Search RAG Demo

This demo showcases an agent that uses Retrieval-Augmented Generation (RAG) with text search capabilities, hosted as an agent endpoint running locally in a Docker container.

## What the Project Does

This project demonstrates how to:

- Build a customer support agent using the Agent Framework
- Implement a custom `TextSearchContextProvider` that simulates document retrieval
- Host the agent as an agent endpoint running in a Docker container

The agent responds to customer inquiries about:

- **Return & Refund Policies** - Triggered by keywords: "return", "refund"
- **Shipping Information** - Triggered by keyword: "shipping"
- **Product Care Instructions** - Triggered by keywords: "tent", "fabric"
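
The keyword-to-document routing implemented by the context provider (shown in full in `main.py` below) boils down to a small lookup. A minimal sketch of the idea, using the document names from this demo:

```python
# Map trigger keywords to the simulated source documents (names match this demo)
ROUTES = {
    ("return", "refund"): "Contoso Outdoors Return Policy",
    ("shipping",): "Contoso Outdoors Shipping Guide",
    ("tent", "fabric"): "TrailRunner Tent Care Instructions",
}


def match_sources(query: str) -> list[str]:
    """Return the source documents whose trigger keywords appear in the query."""
    q = query.lower()
    return [doc for keywords, doc in ROUTES.items() if any(k in q for k in keywords)]


print(match_sources("What is the return policy"))  # → ['Contoso Outdoors Return Policy']
```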

## Prerequisites

- OpenAI API access and credentials
- Required environment variables (see Configuration section)

## Configuration

Follow the `.env.example` file to set up the necessary environment variables for OpenAI.

## Docker Deployment

Build and run using Docker:

```bash
# Build the Docker image
docker build -t hosted-agent-rag .

# Run the container
docker run -p 8088:8088 hosted-agent-rag
```

> If you update the environment variables in the `.env` file, or change the code or the `Dockerfile`, make sure to rebuild the Docker image to apply the changes.

## Testing the Agent

Once the agent is running, you can test it by sending queries that contain the trigger keywords. For example:

```bash
curl -sS -H "Content-Type: application/json" -X POST http://localhost:8088/responses -d '{"input": "What is the return policy","stream":false}'
```

Expected response:

```json
{"object":"response","metadata":{},"agent":null,"conversation":{"id":"conv_2GbSxDpJJ89B6N4FQkKhrHaz78Hjtxy9b30JEPuY9YFjJM0uw3"},"type":"message","role":"assistant","temperature":1.0,"top_p":1.0,"user":"","id":"resp_Bvffxq0iIzlVkx2I8x7hV4fglm9RBPWfMCpNtEpDT6ciV2IG6z","created_at":1763071467,"output":[{"id":"msg_2GbSxDpJJ89B6N4FQknLsnxkwwFS2FULJqRV9jMey2BOXljqUz","type":"message","status":"completed","role":"assistant","content":[{"type":"output_text","text":"As of the most recent update, Contoso Outdoors' return policy allows customers to return products within 30 days of purchase for a full refund or exchange, provided the items are in their original condition and packaging. However, make sure to check your purchase receipt or the company's website for the most updated and specific details, as policies can vary by location and may change over time.","annotations":[],"logprobs":[]}]}],"parallel_tool_calls":true,"status":"completed"}
```
109 changes: 109 additions & 0 deletions python/samples/demos/hosted_agents/agent_with_text_search_rag/main.py
@@ -0,0 +1,109 @@
# Copyright (c) Microsoft. All rights reserved.

import json
import sys
from collections.abc import MutableSequence
from dataclasses import dataclass
from typing import Any

from agent_framework import ChatMessage, Context, ContextProvider, Role
from agent_framework.openai import OpenAIChatClient
from azure.ai.agentserver.agentframework import from_agent_framework # pyright: ignore[reportUnknownVariableType]

if sys.version_info >= (3, 12):
    from typing import override
else:
    from typing_extensions import override


@dataclass
class TextSearchResult:
    source_name: str
    source_link: str
    text: str


class TextSearchContextProvider(ContextProvider):
    """A simple context provider that simulates text search results based on keywords in the user's message."""

    def _get_most_recent_message(self, messages: ChatMessage | MutableSequence[ChatMessage]) -> ChatMessage:
        """Helper method to extract the most recent message from the input."""
        if isinstance(messages, ChatMessage):
            return messages
        if messages:
            return messages[-1]
        raise ValueError("No messages provided")

    @override
    async def invoking(self, messages: ChatMessage | MutableSequence[ChatMessage], **kwargs: Any) -> Context:
        message = self._get_most_recent_message(messages)
        query = message.text.lower()

        results: list[TextSearchResult] = []
        # Either keyword triggers the return-policy document, matching the README trigger list
        if "return" in query or "refund" in query:
            results.append(
                TextSearchResult(
                    source_name="Contoso Outdoors Return Policy",
                    source_link="https://contoso.com/policies/returns",
                    text=(
                        "Customers may return any item within 30 days of delivery. "
                        "Items should be unused and include original packaging. "
                        "Refunds are issued to the original payment method within 5 business days of inspection."
                    ),
                )
            )

        if "shipping" in query:
            results.append(
                TextSearchResult(
                    source_name="Contoso Outdoors Shipping Guide",
                    source_link="https://contoso.com/help/shipping",
                    text=(
                        "Standard shipping is free on orders over $50 and typically arrives in 3-5 business days "
                        "within the continental United States. Expedited options are available at checkout."
                    ),
                )
            )

        if "tent" in query or "fabric" in query:
            results.append(
                TextSearchResult(
                    source_name="TrailRunner Tent Care Instructions",
                    source_link="https://contoso.com/manuals/trailrunner-tent",
                    text=(
                        "Clean the tent fabric with lukewarm water and a non-detergent soap. "
                        "Allow it to air dry completely before storage and avoid prolonged UV "
                        "exposure to extend the lifespan of the waterproof coating."
                    ),
                )
            )

        if not results:
            return Context()

        return Context(
            messages=[
                ChatMessage(
                    role=Role.USER, text="\n\n".join(json.dumps(result.__dict__, indent=2) for result in results)
                )
            ]
        )


def main():
    # Create an agent using the OpenAI chat client
    agent = OpenAIChatClient().create_agent(
        name="SupportSpecialist",
        instructions=(
            "You are a helpful support specialist for Contoso Outdoors. "
            "Answer questions using the provided context and cite the source document when available."
        ),
        context_providers=TextSearchContextProvider(),
    )

    # Run the agent as a hosted agent
    from_agent_framework(agent).run()


if __name__ == "__main__":
    main()
@@ -0,0 +1,2 @@
azure-ai-agentserver-agentframework==1.0.0b3
agent-framework
@@ -0,0 +1,2 @@
OPENAI_CHAT_MODEL_ID="gpt-4o-2024-08-06"
OPENAI_API_KEY="your-openai-api-key"
16 changes: 16 additions & 0 deletions python/samples/demos/hosted_agents/agents_in_workflow/Dockerfile
@@ -0,0 +1,16 @@
FROM python:3.12-slim

WORKDIR /app

COPY . user_agent/
WORKDIR /app/user_agent

RUN if [ -f requirements.txt ]; then \
        pip install -r requirements.txt; \
    else \
        echo "No requirements.txt found"; \
    fi

EXPOSE 8088

CMD ["python", "main.py"]
49 changes: 49 additions & 0 deletions python/samples/demos/hosted_agents/agents_in_workflow/README.md
@@ -0,0 +1,49 @@
# Hosted Workflow Agents Demo

This demo showcases an agent that is backed by a workflow of multiple agents running concurrently, hosted as an agent endpoint in a Docker container.

## What the Project Does

This project demonstrates how to:

- Build a workflow of agents using the Agent Framework
- Host the workflow agent as an agent endpoint running in a Docker container

The agent responds to product launch strategy inquiries by concurrently leveraging insights from three specialized agents:

- **Researcher Agent** - Provides market research insights
- **Marketer Agent** - Crafts marketing value propositions and messaging
- **Legal Agent** - Reviews for compliance and legal considerations
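
Conceptually, the workflow fans the same prompt out to all three specialists at once and gathers their answers. A minimal sketch of that fan-out pattern with `asyncio` (the agent calls are stubbed here; the actual demo uses the Agent Framework's workflow constructs rather than raw `asyncio`):

```python
import asyncio


async def run_agent(name: str, prompt: str) -> str:
    """Stand-in for a real agent call; each specialist returns its own perspective."""
    await asyncio.sleep(0)  # placeholder for network/model latency
    return f"[{name}] analysis of: {prompt}"


async def run_workflow(prompt: str) -> list[str]:
    # Fan the prompt out to the three specialists concurrently; gather preserves order
    return await asyncio.gather(
        run_agent("Researcher", prompt),
        run_agent("Marketer", prompt),
        run_agent("Legal", prompt),
    )


if __name__ == "__main__":
    for answer in asyncio.run(run_workflow("Launch a budget-friendly e-bike")):
        print(answer)
```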

## Prerequisites

- OpenAI API access and credentials
- Required environment variables (see Configuration section)

## Configuration

Follow the `.env.example` file to set up the necessary environment variables for OpenAI.

## Docker Deployment

Build and run using Docker:

```bash
# Build the Docker image
docker build -t hosted-agent-workflow .

# Run the container
docker run -p 8088:8088 hosted-agent-workflow
```

> If you update the environment variables in the `.env` file, or change the code or the `Dockerfile`, make sure to rebuild the Docker image to apply the changes.

## Testing the Agent

Once the agent is running, you can test it by sending a product launch prompt. For example:

```bash
curl -sS -H "Content-Type: application/json" -X POST http://localhost:8088/responses -d '{"input": "We are launching a new budget-friendly electric bike for urban commuters.","stream":false}'
```

> Expected response is not shown here for brevity. The response will include insights from the researcher, marketer, and legal agents based on the input prompt.