A Model Context Protocol (MCP) server that provides access to JIRA issue data stored in Snowflake. This server enables AI assistants to query, filter, and analyze JIRA issues through a standardized interface.
This MCP server connects to Snowflake to query JIRA data and provides five main tools for interacting with the data:
- `list_jira_issues` - Query and filter JIRA issues with various criteria
- `get_jira_issue_details` - Get detailed information for multiple issues by their keys
- `get_jira_project_summary` - Get statistics and summaries for all projects
- `get_jira_issue_links` - Get issue links for a specific JIRA issue by its key
- `get_jira_issues_by_sprint` - Get all JIRA issues in a specific sprint by sprint name
The server connects to Snowflake and queries the following tables:
- `JIRA_ISSUE_NON_PII` - Main issue data (non-personally identifiable information)
- `JIRA_LABEL_RHAI` - Issue labels and tags
- `JIRA_COMMENT_NON_PII` - Issue comments (non-personally identifiable information)
- `JIRA_COMPONENT_RHAI` - JIRA project components and their metadata
- `JIRA_NODEASSOCIATION_RHAI` - Associations between JIRA entities (issues, components, versions)
- `JIRA_PROJECTVERSION_NON_PII` - Project versions (fix versions and affected versions)
- `JIRA_ISSUELINK_RHAI` - Links between JIRA issues
- `JIRA_ISSUELINKTYPE_RHAI` - Types of issue links
- `JIRA_CUSTOMFIELDVALUE_NON_PII` - Custom field values (e.g., sprint information)
- `JIRA_SPRINT_RHAI` - Sprint data
- `JIRA_CHANGEGROUP_RHAI` - Change history groups
- `JIRA_CHANGEITEM_RHAI` - Individual change items (e.g., status changes)
Note: Table names are expected to exist in your configured Snowflake database and schema.
Query JIRA issues with optional filtering:
- Project filtering - Filter by project key (e.g., 'SMQE', 'OSIM')
- Issue keys filtering - Filter by specific issue keys (e.g., ['SMQE-1280', 'SMQE-1281'])
- Issue type filtering - Filter by issue type ID
- Status filtering - Filter by issue status ID
- Priority filtering - Filter by priority ID
- Text search - Search in summary and description fields
- Component filtering - Filter by component names (comma-separated, matches any)
- Version filtering - Filter by fixed version or affected version name
- Date filtering - Filter by creation, update, or resolution date within last N days
- Timeframe filtering - Filter issues where any date (created, updated, or resolved) is within last N days
- Result limiting - Control number of results returned (default: 50)
Returns issue information including:
- Basic issue information (summary, description, status, priority)
- Timestamps (created, updated, due date, resolution date)
- Metadata (votes, watches, environment, components)
- Associated labels and links
- Fixed and affected versions
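As an illustration of the date and timeframe filters above, here is a minimal, hypothetical sketch of how a "last N days" criterion could be translated into a SQL predicate. The column name, date format, and helper name are assumptions for illustration, not the server's actual query builder.

```python
from datetime import datetime, timedelta, timezone

def days_filter(column: str, days: int) -> str:
    """Build a SQL predicate matching rows whose timestamp column
    falls within the last `days` days (illustrative only)."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=days)).strftime("%Y-%m-%d")
    return f"{column} >= '{cutoff}'"

# e.g., a filter for issues created in the last 7 days
predicate = days_filter("CREATED", 7)
```

The real server combines predicates like this with the other filters and applies input sanitization before query execution.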
Retrieve comprehensive information for multiple JIRA issues by their keys (e.g., ['SMQE-1280', 'SMQE-1281']), including:
- Basic issue information (summary, description, status, priority)
- Timestamps (created, updated, due date, resolution date)
- Time tracking (original estimate, current estimate, time spent)
- Metadata (votes, watches, environment, components, workflow ID, security, archived status)
- Associated labels
- Comments (with comment body, creation/update timestamps, and role level)
- Issue links (inward and outward)
- Status change history
- Fixed and affected versions
Returns a dictionary with:
- `found_issues` - Dictionary of found issues keyed by issue key
- `not_found` - List of issue keys that were not found
- `total_found` - Number of issues found
- `total_requested` - Number of issues requested
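The shape of this return value can be sketched in plain Python. `summarize_lookup` is a hypothetical helper, not part of the server, but it produces exactly the keys documented above.

```python
def summarize_lookup(requested, found_issues):
    """Assemble the documented result shape from a list of requested
    keys and a dict of issues actually found (illustrative only)."""
    not_found = [key for key in requested if key not in found_issues]
    return {
        "found_issues": found_issues,
        "not_found": not_found,
        "total_found": len(found_issues),
        "total_requested": len(requested),
    }

result = summarize_lookup(
    ["SMQE-1280", "SMQE-1281", "SMQE-9999"],
    {"SMQE-1280": {"summary": "..."}, "SMQE-1281": {"summary": "..."}},
)
```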
Generate statistics across all projects:
- Total issue counts per project
- Status distribution per project
- Priority distribution per project
- Overall statistics
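As a sketch of the kind of aggregation this tool performs, the following assumes rows of `(project, status, priority)` tuples; it is illustrative only and not the server's actual query logic, which runs the aggregation in Snowflake.

```python
from collections import Counter

def project_summary(rows):
    """Aggregate per-project totals plus status and priority
    distributions from (project, status, priority) rows."""
    summary = {}
    for project, status, priority in rows:
        entry = summary.setdefault(
            project, {"total": 0, "statuses": Counter(), "priorities": Counter()}
        )
        entry["total"] += 1
        entry["statuses"][status] += 1
        entry["priorities"][priority] += 1
    return summary

rows = [("SMQE", "Open", "High"), ("SMQE", "Closed", "Low"), ("OSIM", "Open", "High")]
summary = project_summary(rows)
```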
Get issue links for a specific JIRA issue by its key (e.g., 'SMQE-1280'):
- Issue links - Relationships to other issues (blocks, is blocked by, relates to, etc.)
- Link direction - Indicates if the link is inward or outward
- Linked issue details - Information about the linked issue
Returns information including:
- Issue key and ID
- List of all issue links with link type and direction
- Total count of links
Get all JIRA issues in a specific sprint by sprint name:
- Sprint filtering - Filter by sprint name (e.g., 'Sprint 256')
- Project filtering - Optional filter by project key (e.g., 'SMQE', 'OSIM')
- Result limiting - Control number of results returned (default: 50)
Returns issue information including:
- All standard issue fields (same as `list_jira_issues`)
- Sprint ID and sprint name
- Associated labels and links
- Fixed and affected versions
The server includes optional Prometheus metrics support for monitoring:
- Tool usage tracking - Track calls to each MCP tool with success/error rates and duration
- Snowflake query monitoring - Monitor database query performance and success rates
- Connection tracking - Track active MCP connections
- HTTP endpoints - `/metrics` for Prometheus scraping and `/health` for health checks
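The tool usage tracking above can be sketched with a decorator. The real server records these metrics with `prometheus_client`; the plain-Python counters below are illustrative stand-ins so the pattern is visible in isolation.

```python
import functools
import time
from collections import defaultdict

# Illustrative stand-ins for the Prometheus counter and histogram
TOOL_CALLS = defaultdict(lambda: {"success": 0, "error": 0})
TOOL_DURATIONS = defaultdict(list)

def track_tool_usage(tool_name):
    """Record per-tool call counts and durations, in the spirit of
    mcp_tool_calls_total and mcp_tool_call_duration_seconds."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = func(*args, **kwargs)
                TOOL_CALLS[tool_name]["success"] += 1
                return result
            except Exception:
                TOOL_CALLS[tool_name]["error"] += 1
                raise
            finally:
                TOOL_DURATIONS[tool_name].append(time.perf_counter() - start)
        return wrapper
    return decorator

@track_tool_usage("list_jira_issues")
def list_jira_issues_stub():
    return []

list_jira_issues_stub()
```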
- Python 3.10+
- UV (Python package manager)
- Podman or Docker
- Access to Snowflake with appropriate credentials
The codebase is organized into modular components in the `src/` directory:

- `src/mcp_server.py` - Main server entry point and MCP initialization
- `src/config.py` - Configuration management and environment variable handling
- `src/database.py` - Snowflake database connection and query execution
- `src/tools.py` - MCP tool implementations and business logic
- `src/metrics.py` - Optional Prometheus metrics collection and HTTP server
The following environment variables are used to configure the Snowflake connection:
- `SNOWFLAKE_CONNECTION_METHOD` - Connection method to use
  - Values: `api` (REST API) or `connector` (snowflake-connector-python)
  - Default: `api`
When using `SNOWFLAKE_CONNECTION_METHOD=api`:

- `SNOWFLAKE_TOKEN` - Your Snowflake authentication token (Bearer token)
- `SNOWFLAKE_BASE_URL` - Snowflake API base URL (e.g., `https://your-account.snowflakecomputing.com/api/v2`)
- `SNOWFLAKE_DATABASE` - Snowflake database name containing your JIRA data
- `SNOWFLAKE_SCHEMA` - Snowflake schema name containing your JIRA tables
When using `SNOWFLAKE_CONNECTION_METHOD=connector`:

- `SNOWFLAKE_ACCOUNT` - Snowflake account identifier (e.g., `your-account.snowflakecomputing.com`)
- `SNOWFLAKE_DATABASE` - Snowflake database name containing your JIRA data
- `SNOWFLAKE_SCHEMA` - Snowflake schema name containing your JIRA tables
- `SNOWFLAKE_WAREHOUSE` - Snowflake warehouse name
Private Key Authentication (Recommended for Service Accounts)
- `SNOWFLAKE_AUTHENTICATOR` - Set to `snowflake_jwt`
- `SNOWFLAKE_USER` - Snowflake username that has the public key registered
- `SNOWFLAKE_PRIVATE_KEY_FILE` - Path to private key file (PKCS#8 format)
- `SNOWFLAKE_PRIVATE_KEY_FILE_PWD` - Private key password (optional, if key is encrypted)
Username/Password Authentication
- `SNOWFLAKE_AUTHENTICATOR` - Set to `snowflake` (default)
- `SNOWFLAKE_USER` - Snowflake username
- `SNOWFLAKE_PASSWORD` - Snowflake password
OAuth Client Credentials
- `SNOWFLAKE_AUTHENTICATOR` - Set to `oauth_client_credentials`
- `SNOWFLAKE_OAUTH_CLIENT_ID` - OAuth client ID
- `SNOWFLAKE_OAUTH_CLIENT_SECRET` - OAuth client secret
- `SNOWFLAKE_OAUTH_TOKEN_URL` - OAuth token URL (optional)
OAuth Token
- `SNOWFLAKE_AUTHENTICATOR` - Set to `oauth`
- `SNOWFLAKE_TOKEN` - OAuth access token
- `SNOWFLAKE_ROLE` - Snowflake role to use (optional)
- `MCP_TRANSPORT` - Transport protocol for MCP communication
  - Default: `stdio`
- `ENABLE_METRICS` - Enable Prometheus metrics collection
  - Default: `false`
- `METRICS_PORT` - Port for metrics HTTP server
  - Default: `8000`
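These defaults might be read as follows. This is a hedged sketch of the pattern, not the actual contents of `src/config.py`; the function name is a placeholder.

```python
import os

def load_server_config():
    """Read the documented environment variables with their
    documented defaults (illustrative sketch only)."""
    return {
        "connection_method": os.getenv("SNOWFLAKE_CONNECTION_METHOD", "api"),
        "transport": os.getenv("MCP_TRANSPORT", "stdio"),
        "enable_metrics": os.getenv("ENABLE_METRICS", "false").lower() == "true",
        "metrics_port": int(os.getenv("METRICS_PORT", "8000")),
    }

config = load_server_config()
```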
To set up private key authentication:
1. Generate RSA key pair:

   ```shell
   # Generate private key
   openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8

   # Generate public key
   openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
   ```

2. Register public key with Snowflake user:

   ```sql
   ALTER USER your_service_account SET RSA_PUBLIC_KEY='MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA...';
   ```

3. Set environment variables:

   ```shell
   export SNOWFLAKE_CONNECTION_METHOD=connector
   export SNOWFLAKE_AUTHENTICATOR=snowflake_jwt
   export SNOWFLAKE_ACCOUNT=your-account.snowflakecomputing.com
   export SNOWFLAKE_USER=your_service_account
   export SNOWFLAKE_PRIVATE_KEY_FILE=/path/to/rsa_key.p8
   export SNOWFLAKE_DATABASE=your_database
   export SNOWFLAKE_SCHEMA=your_schema
   export SNOWFLAKE_WAREHOUSE=your_warehouse
   export SNOWFLAKE_ROLE=your_role
   ```
This project has been updated to use UV for dependency management. If you have an existing setup:
1. Remove your old virtual environment:

   ```shell
   rm -rf venv/
   ```

2. Install UV if you haven't already (see Local Development section below)

3. Install dependencies with UV:

   ```shell
   uv sync
   ```
1. Clone the repository:

   ```shell
   git clone <repository-url>
   cd jira-mcp-snowflake
   ```

2. Install UV if you haven't already:

   ```shell
   # On macOS/Linux
   curl -LsSf https://astral.sh/uv/install.sh | sh

   # On Windows
   powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

   # Or via pip
   pip install uv
   ```

3. Install dependencies:

   ```shell
   uv sync
   ```

4. Set up environment variables (see Environment Variables section above)

5. Run the server:

   ```shell
   uv run python src/mcp_server.py
   ```

For convenience, several Makefile targets are available to streamline development tasks:
```shell
# Install dependencies including dev packages
make uv_sync_dev

# Run linting (flake8)
make lint

# Run tests with coverage
make pytest

# Run both linting and tests
make test

# Build container image with Podman
make build
```

Note: On macOS, you may need to install a newer version of make via Homebrew:

```shell
brew install make
```

To build the container image locally using Podman, run:

```shell
podman build -t localhost/jira-mcp-snowflake:latest .
```

This will create a local image named `jira-mcp-snowflake:latest` that you can use to run the server. The container now uses UV for fast dependency management.
Example 1: REST API with Token
```json
{
  "mcpServers": {
    "jira-mcp-snowflake": {
      "command": "podman",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e", "SNOWFLAKE_CONNECTION_METHOD=api",
        "-e", "SNOWFLAKE_TOKEN=your_token_here",
        "-e", "SNOWFLAKE_BASE_URL=https://your-account.snowflakecomputing.com/api/v2",
        "-e", "SNOWFLAKE_DATABASE=your_database_name",
        "-e", "SNOWFLAKE_SCHEMA=your_schema_name",
        "-e", "MCP_TRANSPORT=stdio",
        "-e", "ENABLE_METRICS=true",
        "-e", "METRICS_PORT=8000",
        "localhost/jira-mcp-snowflake:latest"
      ]
    }
  }
}
```

Example 2: Private Key Authentication (Service Account)
```json
{
  "mcpServers": {
    "jira-mcp-snowflake": {
      "command": "podman",
      "args": [
        "run",
        "-i",
        "--rm",
        "-v", "/path/to/your/rsa_key.p8:/app/rsa_key.p8:ro",
        "-e", "SNOWFLAKE_CONNECTION_METHOD=connector",
        "-e", "SNOWFLAKE_AUTHENTICATOR=snowflake_jwt",
        "-e", "SNOWFLAKE_ACCOUNT=your-account.snowflakecomputing.com",
        "-e", "SNOWFLAKE_USER=your_service_account",
        "-e", "SNOWFLAKE_PRIVATE_KEY_FILE=/app/rsa_key.p8",
        "-e", "SNOWFLAKE_DATABASE=your_database_name",
        "-e", "SNOWFLAKE_SCHEMA=your_schema_name",
        "-e", "SNOWFLAKE_WAREHOUSE=your_warehouse_name",
        "-e", "SNOWFLAKE_ROLE=your_role_name",
        "-e", "MCP_TRANSPORT=stdio",
        "-e", "ENABLE_METRICS=true",
        "-e", "METRICS_PORT=8000",
        "localhost/jira-mcp-snowflake:latest"
      ]
    }
  }
}
```

Then access metrics at: `http://localhost:8000/metrics`
Example configuration for connecting to a remote instance:

```json
{
  "mcpServers": {
    "jira-mcp-snowflake": {
      "url": "https://jira-mcp-snowflake.example.com/sse",
      "headers": {
        "X-Snowflake-Token": "your_token_here"
      }
    }
  }
}
```

Example configuration to add to VS Code Continue:
```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "name": "jira-mcp-snowflake",
        "transport": {
          "type": "stdio",
          "command": "podman",
          "args": [
            "run",
            "-i",
            "--rm",
            "-e", "SNOWFLAKE_TOKEN=your_token_here",
            "-e", "SNOWFLAKE_BASE_URL=https://your-account.snowflakecomputing.com/api/v2",
            "-e", "SNOWFLAKE_DATABASE=your_database_name",
            "-e", "SNOWFLAKE_SCHEMA=your_schema_name",
            "-e", "MCP_TRANSPORT=stdio",
            "-e", "ENABLE_METRICS=true",
            "-e", "METRICS_PORT=8000",
            "localhost/jira-mcp-snowflake:latest"
          ]
        }
      }
    ]
  }
}
```

```python
# List all issues from the SMQE project
result = await list_jira_issues(project="SMQE", limit=10)

# Search for issues containing "authentication" in summary or description
result = await list_jira_issues(search_text="authentication", limit=20)

# Find issues in specific components
result = await list_jira_issues(components="Security,Authentication", limit=20)

# Find issues with a specific fixed version
result = await list_jira_issues(fixed_version="2.5.0", limit=20)

# Find issues created in the last 7 days
result = await list_jira_issues(created_days=7, limit=20)

# Find issues updated in the last 30 days
result = await list_jira_issues(updated_days=30, limit=50)
```

```python
# Get detailed information for multiple issues
result = await get_jira_issue_details(issue_keys=["SMQE-1280", "SMQE-1281"])

# Access the results
for issue_key, issue_data in result["found_issues"].items():
    print(f"Issue: {issue_key}")
    print(f"Summary: {issue_data['summary']}")
    print(f"Status: {issue_data['status']}")
    print(f"Labels: {issue_data['labels']}")
    print(f"Comments: {len(issue_data['comments'])}")
```

```python
# Get all issue links for a specific issue
result = await get_jira_issue_links(issue_key="SMQE-1280")

# Access the links
print(f"Total links: {result['total_links']}")
for link in result['links']:
    print(f"Link type: {link['link_type']}")
    print(f"Direction: {link['direction']}")
    print(f"Linked issue: {link['linked_issue_key']}")
```

```python
# Get all issues in a specific sprint
result = await get_jira_issues_by_sprint(sprint_name="Sprint 256", limit=50)

# Get issues in a sprint for a specific project
result = await get_jira_issues_by_sprint(
    sprint_name="Sprint 256",
    project="SMQE",
    limit=50
)

# Access the results
print(f"Sprint: {result['sprint_name']}")
print(f"Total issues: {result['total_returned']}")
for issue in result['issues']:
    print(f"Issue: {issue['key']} - {issue['summary']}")
    print(f"Status: {issue['status']}")
```

```python
# Get statistics for all projects
result = await get_jira_project_summary()
```

When metrics are enabled, the server provides the following monitoring endpoints:
- `/metrics` - Prometheus metrics endpoint for scraping
- `/health` - Health check endpoint returning JSON status
- `mcp_tool_calls_total` - Counter of tool calls by tool name and status
- `mcp_tool_call_duration_seconds` - Histogram of tool call durations
- `mcp_active_connections` - Gauge of active MCP connections
- `mcp_snowflake_queries_total` - Counter of Snowflake queries by status
- `mcp_snowflake_query_duration_seconds` - Histogram of Snowflake query durations
This server is designed to work with non-personally identifiable information (non-PII) data only. The Snowflake tables should contain sanitized data with any sensitive personal information removed.
- Environment Variables: Store sensitive information like `SNOWFLAKE_TOKEN` in environment variables, never in code
- Token Security: Ensure your Snowflake token is kept secure and rotated regularly
- Network Security: Use HTTPS endpoints and secure network connections
- Access Control: Follow principle of least privilege for Snowflake database access
- SQL Injection Prevention: The server includes input sanitization to prevent SQL injection attacks
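To illustrate the kind of sanitization involved, here is a minimal, hypothetical validator for issue keys; the pattern and function name are assumptions for illustration, not the server's actual implementation.

```python
import re

# Accept only PROJECT-NUMBER style keys (e.g., SMQE-1280) before a
# value is ever interpolated into a query (illustrative pattern).
ISSUE_KEY_RE = re.compile(r"^[A-Z][A-Z0-9]+-\d+$")

def validate_issue_key(key: str) -> str:
    if not ISSUE_KEY_RE.fullmatch(key):
        raise ValueError(f"Invalid JIRA issue key: {key!r}")
    return key

validate_issue_key("SMQE-1280")              # passes
# validate_issue_key("SMQE-1; DROP TABLE")   # would raise ValueError
```

Rejecting malformed input outright, rather than trying to escape it, keeps the attack surface small.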
- `httpx` - HTTP client library for Snowflake API communication
- `fastmcp` - Fast MCP server framework
- `prometheus_client` - Prometheus metrics client (optional, for monitoring)
The project follows a modular architecture:
```
jira-mcp-snowflake/
├── src/
│   ├── mcp_server.py     # Main entry point
│   ├── config.py         # Configuration and environment variables
│   ├── database.py       # Snowflake database operations
│   ├── tools.py          # MCP tool implementations
│   └── metrics.py        # Prometheus metrics (optional)
├── requirements.txt      # Python dependencies
└── README.md             # This file
```
To add new MCP tools:
1. Add the tool function to `src/tools.py`
2. Decorate with `@mcp.tool()` and `@track_tool_usage("tool_name")`
3. Follow the existing patterns for error handling and logging
4. Update this README with documentation for the new tool
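Putting those steps together, a hedged skeleton for a hypothetical new tool might look like the following. The tool name `get_jira_issue_count` and the query helper are invented for illustration; in `src/tools.py` the function would carry the `@mcp.tool()` and `@track_tool_usage(...)` decorators and call the real Snowflake query layer.

```python
import asyncio

async def execute_query_stub(sql: str) -> list:
    # Stand-in for the real Snowflake query helper in src/database.py;
    # returns a canned row so this sketch is self-contained.
    return [{"COUNT": 42}]

# In src/tools.py this would be decorated with:
#   @mcp.tool()
#   @track_tool_usage("get_jira_issue_count")
async def get_jira_issue_count(project: str) -> dict:
    rows = await execute_query_stub(
        "SELECT COUNT(*) AS COUNT FROM JIRA_ISSUE_NON_PII"
    )
    return {"project": project, "issue_count": rows[0]["COUNT"]}

result = asyncio.run(get_jira_issue_count("SMQE"))
```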