feat: Add Azure OpenAI integration support #48

Open · wants to merge 1 commit into base: main
17 changes: 13 additions & 4 deletions .env.example
@@ -1,11 +1,20 @@
# Path to Chrome/Chromium executable; leave blank to use Playwright's default Chromium
CHROME_PATH=

# OpenAI API key for OpenAI model access
OPENAI_API_KEY=your-api-key-here

# Set to true if you want API calls to wait for tasks to complete (default is false)
PATIENT=false

# Set to true if you want to disable anonymous telemetry (default is false)
ANONYMIZED_TELEMETRY=false

# Select your OpenAI service provider: either 'openai' or 'azure'
OPENAI_PROVIDER=azure

# if OPENAI_PROVIDER=azure
AZURE_OPENAI_KEY=your-azure-openai-api-key-here
AZURE_OPENAI_ENDPOINT=https://user-ai-foundry.cognitiveservices.azure.com/
AZURE_OPENAI_DEPLOYMENT=your-deployment-name  # e.g. gpt-4o
AZURE_OPENAI_API_VERSION=your-api-version  # e.g. 2024-11-20

# if OPENAI_PROVIDER=openai
OPENAI_API_KEY=your-openai-platform-api-key-here
54 changes: 54 additions & 0 deletions README.md
@@ -45,6 +45,60 @@ uv pip install playwright
uv run playwright install --with-deps --no-shell chromium
```

## LLM Providers

The server can use **either** the public OpenAI platform **or** your own
Azure OpenAI resource.

| Provider | How to enable | Required variables |
|----------|---------------|--------------------|
| OpenAI (default) | set `OPENAI_PROVIDER=openai` | `OPENAI_API_KEY` |
| Azure OpenAI | set `OPENAI_PROVIDER=azure` | `AZURE_OPENAI_KEY`, `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_DEPLOYMENT` |

### 1 · Environment variables (`.env`)

```bash
# --- choose ONE of the following ---

# A. Public OpenAI
OPENAI_PROVIDER=openai
OPENAI_API_KEY=your-openai-api-key

# B. Azure OpenAI
OPENAI_PROVIDER=azure
AZURE_OPENAI_KEY=your-azure-openai-api-key
AZURE_OPENAI_ENDPOINT=https://my-resource.openai.azure.com
AZURE_OPENAI_DEPLOYMENT=your-azure-deployment-name  # e.g. gpt-4o
AZURE_OPENAI_API_VERSION=2024-02-01  # optional, default shown
```

> The server first looks for `AZURE_OPENAI_KEY` and falls back to
> `OPENAI_API_KEY`, so you may keep one variable if you use the same key for
> both resources.
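That fallback can be sketched in Python (the helper name `resolve_api_key` is illustrative; the server inlines this lookup rather than exposing a function):

```python
import os
from typing import Optional

def resolve_api_key() -> Optional[str]:
    # Prefer the Azure-specific key, then fall back to the generic one,
    # so a single shared key can live in either variable.
    return os.getenv("AZURE_OPENAI_KEY") or os.getenv("OPENAI_API_KEY")
```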

### 2 · CLI overrides

```bash
uv run python server/server.py \
--azure-endpoint https://my-resource.openai.azure.com \
--azure-deployment gpt-4o \
--azure-api-version 2024-02-01
```

Flags override `.env` values, enabling one-off tests without editing files.
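The override order amounts to a three-level precedence, sketched here with an illustrative helper (`effective_setting` is not part of the server's API):

```python
from typing import Optional

def effective_setting(cli_value: Optional[str],
                      env_value: Optional[str],
                      default: Optional[str] = None) -> Optional[str]:
    # A CLI flag beats the .env value, which beats the built-in default.
    if cli_value is not None:
        return cli_value
    if env_value is not None:
        return env_value
    return default
```

For example, `--azure-api-version 2024-02-01` wins even when `AZURE_OPENAI_API_VERSION` is set in `.env`.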

---

## Dependencies

Make sure you are on *at least* these versions:

```
langchain>=0.2.0
langchain-openai>=0.1.6
openai>=1.23.0
```
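A quick way to check the installed versions is to compare them numerically; this sketch assumes plain dotted release versions (pre-release suffixes such as `rc1` would need extra parsing):

```python
from importlib import metadata

def meets_minimum(installed: str, required: str) -> bool:
    # Compare dotted versions component by component, e.g. "1.23.0" >= "1.9.0".
    def as_tuple(version: str) -> tuple:
        return tuple(int(part) for part in version.split("."))
    return as_tuple(installed) >= as_tuple(required)

for package, minimum in [("langchain", "0.2.0"),
                         ("langchain-openai", "0.1.6"),
                         ("openai", "1.23.0")]:
    try:
        installed = metadata.version(package)
        status = "ok" if meets_minimum(installed, minimum) else "too old"
        print(f"{package} {installed}: {status}")
    except metadata.PackageNotFoundError:
        print(f"{package}: not installed")
```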

## Usage

### SSE Mode
60 changes: 59 additions & 1 deletion server/server.py
@@ -126,6 +126,10 @@ def init_configuration() -> Dict[str, Any]:
],
# Patient mode - if true, functions wait for task completion before returning
"PATIENT_MODE": parse_bool_env("PATIENT", False),
"OPENAI_PROVIDER": os.environ.get("OPENAI_PROVIDER", "openai").lower(),
"AZURE_OPENAI_ENDPOINT": os.environ.get("AZURE_OPENAI_ENDPOINT"),
"AZURE_OPENAI_DEPLOYMENT": os.environ.get("AZURE_OPENAI_DEPLOYMENT"),
"AZURE_OPENAI_API_VERSION": os.environ.get("AZURE_OPENAI_API_VERSION", "2024-02-01"),
}

return config
@@ -758,6 +762,21 @@ async def read_resource(uri: str) -> list[types.ResourceContents]:
default=False,
help="Enable stdio mode. If specified, enables proxy mode.",
)
@click.option(
"--azure-endpoint",
default=None,
help="Azure OpenAI resource endpoint, e.g. https://myres.openai.azure.com",
)
@click.option(
"--azure-deployment",
default=None,
help="Deployed model name inside the Azure resource",
)
@click.option(
"--azure-api-version",
default=None,
help="Azure OpenAI API version (defaults to 2024-02-01)",
)
def main(
port: int,
proxy_port: Optional[int],
@@ -767,6 +786,9 @@ def main(
locale: str,
task_expiry_minutes: int,
stdio: bool,
azure_endpoint: Optional[str] = None,
azure_deployment: Optional[str] = None,
azure_api_version: Optional[str] = None,
) -> int:
"""
Run the browser-use MCP server.
@@ -801,7 +823,43 @@
)

# Initialize LLM
llm = ChatOpenAI(model="gpt-4o", temperature=0.0)
if CONFIG["OPENAI_PROVIDER"] == "azure":
# Allow CLI to override .env
if azure_endpoint:
CONFIG["AZURE_OPENAI_ENDPOINT"] = azure_endpoint
if azure_deployment:
CONFIG["AZURE_OPENAI_DEPLOYMENT"] = azure_deployment
if azure_api_version:
CONFIG["AZURE_OPENAI_API_VERSION"] = azure_api_version

required = [
CONFIG["AZURE_OPENAI_ENDPOINT"],
CONFIG["AZURE_OPENAI_DEPLOYMENT"],
os.getenv("AZURE_OPENAI_KEY") or os.getenv("OPENAI_API_KEY"),
]
if not all(required):
raise RuntimeError(
"OPENAI_PROVIDER=azure but a required setting is missing: "
"AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_DEPLOYMENT, and an API key "
"(AZURE_OPENAI_KEY or OPENAI_API_KEY) must all be set."
)

# Azure endpoints require AzureChatOpenAI (from langchain_openai);
# ChatOpenAI does not route requests through an Azure resource.
llm = AzureChatOpenAI(
temperature=0.0,
azure_endpoint=CONFIG["AZURE_OPENAI_ENDPOINT"],
azure_deployment=CONFIG["AZURE_OPENAI_DEPLOYMENT"],
api_version=CONFIG["AZURE_OPENAI_API_VERSION"],
api_key=os.getenv("AZURE_OPENAI_KEY") or os.getenv("OPENAI_API_KEY"),
)
logger.info(
"Using Azure OpenAI provider "
f"({CONFIG['AZURE_OPENAI_ENDPOINT']} | deployment={CONFIG['AZURE_OPENAI_DEPLOYMENT']})"
)
else:
# Default public OpenAI
llm = ChatOpenAI(model="gpt-4o", temperature=0.0)
logger.info("Using public OpenAI provider")

# Create MCP server
app = create_mcp_server(