https://github.com/yangkyeongmo/mcp-server-apache-airflow
- Host: GitHub
- URL: https://github.com/yangkyeongmo/mcp-server-apache-airflow
- Owner: yangkyeongmo
- License: mit
- Created: 2025-02-13T04:09:06.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2026-03-03T13:24:59.000Z (about 2 months ago)
- Last Synced: 2026-03-03T15:50:31.542Z (about 2 months ago)
- Language: Python
- Homepage: https://pypi.org/project/mcp-server-apache-airflow/
- Size: 182 KB
- Stars: 142
- Watchers: 7
- Forks: 43
- Open Issues: 11
Metadata Files:
- Readme: README.md
- License: LICENSE
- Codeowners: .github/CODEOWNERS
Awesome Lists containing this project
- awesome-mcp - yangkyeongmo/mcp-server-apache-airflow
- awesome-mcp-servers - mcp-server-apache-airflow - An MCP server that connects to Apache Airflow using the official Airflow client, enabling LLM agents to inspect and interact with Airflow DAGs and workflows via the Model Context Protocol. ([Read more](/details/mcp-server-apache-airflow.md)) `apache-airflow` `workflow` `orchestration` (Workflow & Automation Mcp Servers)
- awesome-mcp-servers - Airflow - MCP server that connects to [Apache Airflow](https://airflow.apache.org/) via its official Python client. (Community Servers)
- awesome-mcp-servers - **mcp-server-apache-airflow** - Python-based `python` `http` `ai` `git` `github` `pip install git+https://github.com/yangkyeongmo/mcp-server-apache-airflow` (🤖 AI/ML)
- awesome-mcp-servers - yangkyeongmo@/mcp-server-apache-airflow - MCP server that connects to [Apache Airflow](https://airflow.apache.org/) using the official client. (Server Implementations / 💻 <a name="developer-tools"></a>Developer Tools)
- Awesome-MCP-Servers-directory - Airflow - A MCP Server that connects to Apache Airflow using official python client (Workflow Automation)
- metorial-index - Apache Airflow MCP Server - Integrates Apache Airflow's REST API with the Model Context Protocol, enabling MCP clients to interact with Airflow tasks and workflows effectively. Facilitates standardized communication between AI models and Apache Airflow for job orchestration. (Task and Project Management)
- awesome-mcp-servers - @yangkyeongmo@/mcp-server-apache-airflow - MCP server that connects to [Apache Airflow](https://airflow.apache.org/) using official client. (Legend / 💻 <a name="developer-tools"></a>Developer Tools)
- awesome-mcp - @yangkyeongmo@/mcp-server-apache-airflow - MCP server that connects to [Apache Airflow](https://airflow.apache.org/) using official client. (MCP Servers / 💻 Developer Tools)
- toolsdk-mcp-registry - mcp-server-apache-airflow
- best-of-mcp-servers - GitHub (25% open · ⏱️ 03.03.2026) (Developer Tools)
- awesome-devops-mcp - yangkyeongmo/mcp-server-apache-airflow - MCP server connecting to Apache Airflow using official client (🐳 Kubernetes & Containers)
- awesome-mcp-list - [@yangkyeongmo/mcp-server-apache-airflow](https://github.com/yangkyeongmo/mcp-server-apache-airflow): Connects to Apache Airflow using the official client. (Uncategorized / Uncategorized)
- awesome-mcp-servers - Airflow MCP Server - Provides a Model Context Protocol (MCP) server for standardized Apache Airflow interaction (Table of Contents / System Automation)
- awesome-claude-dxt - yangkyeongmo/mcp-server-apache-airflow - A MCP Server that connects to [Apache Airflow](https://airflow.apache.org/) using official python client. (Extensions by Category / Development Tools)
README
[MseeP](https://mseep.ai/app/yangkyeongmo-mcp-server-apache-airflow)
# mcp-server-apache-airflow
[Smithery](https://smithery.ai/server/@yangkyeongmo/mcp-server-apache-airflow)

A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.
## About
This project implements a [Model Context Protocol](https://modelcontextprotocol.io/introduction) server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
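As a rough picture of that dependency, the sketch below calls the official `apache-airflow-client` library directly against the same `/api/v1` endpoints that the server exposes as tools. It is illustrative only — the host and credentials are placeholders, and this is not the server's internal code.
```python
# Illustrative only: direct use of the official apache-airflow-client library
# that this server builds on. Host and credentials are placeholders.
import airflow_client.client
from airflow_client.client.api import dag_api

configuration = airflow_client.client.Configuration(
    host="http://localhost:8080/api/v1",  # Airflow REST API base URL
    username="admin",                      # placeholder credentials
    password="admin",
)

with airflow_client.client.ApiClient(configuration) as api_client:
    # Equivalent to GET /api/v1/dags (the "List DAGs" entry in the table below)
    dag_collection = dag_api.DAGApi(api_client).get_dags()
    for dag in dag_collection.dags:
        print(dag.dag_id, "is_paused:", dag.is_paused)
```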
## Feature Implementation Status
| Feature | API Path | Status |
| -------------------------------- | --------------------------------------------------------------------------------------------- | ------ |
| **DAG Management** | | |
| List DAGs | `/api/v1/dags` | ✅ |
| Get DAG Details | `/api/v1/dags/{dag_id}` | ✅ |
| Pause DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Unpause DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Update DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Delete DAG | `/api/v1/dags/{dag_id}` | ✅ |
| Get DAG Source | `/api/v1/dagSources/{file_token}` | ✅ |
| Patch Multiple DAGs | `/api/v1/dags` | ✅ |
| Reparse DAG File | `/api/v1/dagSources/{file_token}/reparse` | ✅ |
| **DAG Runs** | | |
| List DAG Runs | `/api/v1/dags/{dag_id}/dagRuns` | ✅ |
| Create DAG Run | `/api/v1/dags/{dag_id}/dagRuns` | ✅ |
| Get DAG Run Details | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Update DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Delete DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` | ✅ |
| Get DAG Runs Batch | `/api/v1/dags/~/dagRuns/list` | ✅ |
| Clear DAG Run | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/clear` | ✅ |
| Set DAG Run Note | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/setNote` | ✅ |
| Get Upstream Dataset Events | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents` | ✅ |
| **Tasks** | | |
| List DAG Tasks | `/api/v1/dags/{dag_id}/tasks` | ✅ |
| Get Task Details | `/api/v1/dags/{dag_id}/tasks/{task_id}` | ✅ |
| Get Task Instance | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}` | ✅ |
| List Task Instances | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances` | ✅ |
| Update Task Instance | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}` | ✅ |
| Get Task Instance Log | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/logs/{task_try_number}` | ✅ |
| Clear Task Instances | `/api/v1/dags/{dag_id}/clearTaskInstances` | ✅ |
| Set Task Instances State | `/api/v1/dags/{dag_id}/updateTaskInstancesState` | ✅ |
| List Task Instance Tries | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/tries` | ✅ |
| **Variables** | | |
| List Variables | `/api/v1/variables` | ✅ |
| Create Variable | `/api/v1/variables` | ✅ |
| Get Variable | `/api/v1/variables/{variable_key}` | ✅ |
| Update Variable | `/api/v1/variables/{variable_key}` | ✅ |
| Delete Variable | `/api/v1/variables/{variable_key}` | ✅ |
| **Connections** | | |
| List Connections | `/api/v1/connections` | ✅ |
| Create Connection | `/api/v1/connections` | ✅ |
| Get Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Update Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Delete Connection | `/api/v1/connections/{connection_id}` | ✅ |
| Test Connection | `/api/v1/connections/test` | ✅ |
| **Pools** | | |
| List Pools | `/api/v1/pools` | ✅ |
| Create Pool | `/api/v1/pools` | ✅ |
| Get Pool | `/api/v1/pools/{pool_name}` | ✅ |
| Update Pool | `/api/v1/pools/{pool_name}` | ✅ |
| Delete Pool | `/api/v1/pools/{pool_name}` | ✅ |
| **XComs** | | |
| List XComs | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries` | ✅ |
| Get XCom Entry | `/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}` | ✅ |
| **Datasets** | | |
| List Datasets | `/api/v1/datasets` | ✅ |
| Get Dataset | `/api/v1/datasets/{uri}` | ✅ |
| Get Dataset Events | `/api/v1/datasetEvents` | ✅ |
| Create Dataset Event | `/api/v1/datasetEvents` | ✅ |
| Get DAG Dataset Queued Event | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}` | ✅ |
| Get DAG Dataset Queued Events | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents` | ✅ |
| Delete DAG Dataset Queued Event | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri}` | ✅ |
| Delete DAG Dataset Queued Events | `/api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents` | ✅ |
| Get Dataset Queued Events | `/api/v1/datasets/{uri}/dagRuns/queued/datasetEvents` | ✅ |
| Delete Dataset Queued Events | `/api/v1/datasets/{uri}/dagRuns/queued/datasetEvents` | ✅ |
| **Monitoring** | | |
| Get Health | `/api/v1/health` | ✅ |
| **DAG Stats** | | |
| Get DAG Stats | `/api/v1/dags/statistics` | ✅ |
| **Config** | | |
| Get Config | `/api/v1/config` | ✅ |
| **Plugins** | | |
| Get Plugins | `/api/v1/plugins` | ✅ |
| **Providers** | | |
| List Providers | `/api/v1/providers` | ✅ |
| **Event Logs** | | |
| List Event Logs | `/api/v1/eventLogs` | ✅ |
| Get Event Log | `/api/v1/eventLogs/{event_log_id}` | ✅ |
| **System** | | |
| Get Import Errors | `/api/v1/importErrors` | ✅ |
| Get Import Error Details | `/api/v1/importErrors/{import_error_id}` | ✅ |
| Get Health Status | `/api/v1/health` | ✅ |
| Get Version | `/api/v1/version` | ✅ |
## Setup
### Dependencies
This project depends on the official Apache Airflow client library (`apache-airflow-client`). It will be automatically installed when you install this package.
### Environment Variables
Set the following environment variables:
```
AIRFLOW_HOST= # Optional, defaults to http://localhost:8080
AIRFLOW_API_VERSION=v1 # Optional, defaults to v1
READ_ONLY=true # Optional, enables read-only mode (true/false, defaults to false)
```
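For reference, here is a minimal sketch of reading these variables with the defaults documented above. It is illustrative only and not the server's own configuration code.
```python
# Sketch of how these variables could be read with their documented defaults.
# Variable names and defaults come from the section above; illustrative only.
import os

airflow_host = os.getenv("AIRFLOW_HOST", "http://localhost:8080")
api_version = os.getenv("AIRFLOW_API_VERSION", "v1")
read_only = os.getenv("READ_ONLY", "false").lower() == "true"

base_url = f"{airflow_host.rstrip('/')}/api/{api_version}"
print(base_url, "read-only:", read_only)
```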
#### Authentication
Choose one of the following authentication methods:
**Basic Authentication (default):**
```
AIRFLOW_USERNAME=
AIRFLOW_PASSWORD=
```
**JWT Token Authentication:**
```
AIRFLOW_JWT_TOKEN=
```
To obtain a JWT token, you can use Airflow's authentication endpoint:
```bash
ENDPOINT_URL="http://localhost:8080" # Replace with your Airflow endpoint
curl -X 'POST' \
"${ENDPOINT_URL}/auth/token" \
-H 'Content-Type: application/json' \
-d '{ "username": "", "password": "" }'
```
> **Note**: If both JWT token and basic authentication credentials are provided, JWT token takes precedence.
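The sketch below shows the same token request in Python (mirroring the curl above) together with the precedence rule from the note. It is illustrative only; the `access_token` response field name is an assumption, and this is not the server's own code.
```python
# Sketch only: request a JWT from Airflow's /auth/token endpoint (as in the
# curl above) and apply the documented precedence (JWT over basic auth).
# The "access_token" response field name is an assumption.
import base64
import json
import os
import urllib.request

endpoint_url = os.getenv("AIRFLOW_HOST", "http://localhost:8080")

def fetch_jwt_token(username: str, password: str) -> str:
    payload = json.dumps({"username": username, "password": password}).encode()
    request = urllib.request.Request(
        f"{endpoint_url}/auth/token",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["access_token"]

# JWT token takes precedence when both auth methods are configured.
if os.getenv("AIRFLOW_JWT_TOKEN"):
    auth_header = {"Authorization": f"Bearer {os.environ['AIRFLOW_JWT_TOKEN']}"}
else:
    credentials = f"{os.getenv('AIRFLOW_USERNAME', '')}:{os.getenv('AIRFLOW_PASSWORD', '')}"
    auth_header = {"Authorization": "Basic " + base64.b64encode(credentials.encode()).decode()}
```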
### Usage with Claude Desktop
Add to your `claude_desktop_config.json`:
**Basic Authentication:**
```json
{
"mcpServers": {
"mcp-server-apache-airflow": {
"command": "uvx",
"args": ["mcp-server-apache-airflow"],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_USERNAME": "your-username",
"AIRFLOW_PASSWORD": "your-password"
}
}
}
}
```
**JWT Token Authentication:**
```json
{
"mcpServers": {
"mcp-server-apache-airflow": {
"command": "uvx",
"args": ["mcp-server-apache-airflow"],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_JWT_TOKEN": "your-jwt-token"
}
}
}
}
```
For read-only mode (recommended for safety):
**Basic Authentication:**
```json
{
"mcpServers": {
"mcp-server-apache-airflow": {
"command": "uvx",
"args": ["mcp-server-apache-airflow"],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_USERNAME": "your-username",
"AIRFLOW_PASSWORD": "your-password",
"READ_ONLY": "true"
}
}
}
}
```
**JWT Token Authentication:**
```json
{
"mcpServers": {
"mcp-server-apache-airflow": {
"command": "uvx",
"args": ["mcp-server-apache-airflow", "--read-only"],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_JWT_TOKEN": "your-jwt-token"
}
}
}
}
```
Alternative configuration using `uv`:
**Basic Authentication:**
```json
{
"mcpServers": {
"mcp-server-apache-airflow": {
"command": "uv",
"args": [
"--directory",
"/path/to/mcp-server-apache-airflow",
"run",
"mcp-server-apache-airflow"
],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_USERNAME": "your-username",
"AIRFLOW_PASSWORD": "your-password"
}
}
}
}
```
**JWT Token Authentication:**
```json
{
"mcpServers": {
"mcp-server-apache-airflow": {
"command": "uv",
"args": [
"--directory",
"/path/to/mcp-server-apache-airflow",
"run",
"mcp-server-apache-airflow"
],
"env": {
"AIRFLOW_HOST": "https://your-airflow-host",
"AIRFLOW_JWT_TOKEN": "your-jwt-token"
}
}
}
}
```
Replace `/path/to/mcp-server-apache-airflow` with the actual path where you've cloned the repository.
### Selecting the API groups
You can select the API groups you want to use by setting the `--apis` flag.
```bash
uv run mcp-server-apache-airflow --apis dag --apis dagrun
```
The default is to use all APIs.
Allowed values are:
- config
- connections
- dag
- dagrun
- dagstats
- dataset
- eventlog
- importerror
- monitoring
- plugin
- pool
- provider
- taskinstance
- variable
- xcom
### Read-Only Mode
You can run the server in read-only mode by using the `--read-only` flag or by setting the `READ_ONLY=true` environment variable. This will only expose tools that perform read operations (GET requests) and exclude any tools that create, update, or delete resources.
Using the command-line flag:
```bash
uv run mcp-server-apache-airflow --read-only
```
Using the environment variable:
```bash
READ_ONLY=true uv run mcp-server-apache-airflow
```
In read-only mode, the server will only expose tools like:
- Listing DAGs, DAG runs, tasks, variables, connections, etc.
- Getting details of specific resources
- Reading configurations and monitoring information
- Testing connections (non-destructive)
Write operations like creating, updating, deleting DAGs, variables, connections, triggering DAG runs, etc. will not be available in read-only mode.
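As a conceptual illustration only, read-only filtering amounts to keeping just the tools backed by GET requests. The tool names and method mapping below are hypothetical, not this server's actual registry.
```python
# Conceptual sketch of read-only filtering: keep only tools backed by GET
# requests. Tool names and the method mapping are hypothetical; this is not
# this server's actual implementation.
TOOLS = {
    "list_dags": "GET",
    "get_dag": "GET",
    "trigger_dag_run": "POST",
    "delete_variable": "DELETE",
}

def select_tools(read_only: bool) -> list[str]:
    """Return tool names allowed under the current mode."""
    if not read_only:
        return list(TOOLS)
    return [name for name, method in TOOLS.items() if method == "GET"]

print(select_tools(read_only=True))   # ['list_dags', 'get_dag']
print(select_tools(read_only=False))  # all four tools
```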
You can combine read-only mode with API group selection:
```bash
uv run mcp-server-apache-airflow --read-only --apis dag --apis variable
```
### Manual Execution
You can also run the server manually:
```bash
make run
```
`make run` accepts the following options:
- `--port`: Port to listen on for SSE (default: 8000)
- `--transport`: Transport type (stdio/sse/http, default: stdio)
Alternatively, you can run the SSE server directly, which accepts the same parameters:
```bash
make run-sse
```
You can also start the service directly using `uv`:
```bash
uv run src --transport http --port 8080
```
### Installing via Smithery
To install Apache Airflow MCP Server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@yangkyeongmo/mcp-server-apache-airflow):
```bash
npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude
```
## Development
### Setting up Development Environment
1. Clone the repository:
```bash
git clone https://github.com/yangkyeongmo/mcp-server-apache-airflow.git
cd mcp-server-apache-airflow
```
2. Install development dependencies:
```bash
uv sync --dev
```
3. Create a `.env` file for environment variables (optional for development):
```bash
touch .env
```
> **Note**: No environment variables are required for running tests. The `AIRFLOW_HOST` defaults to `http://localhost:8080` for development and testing purposes.
### Running Tests
The project uses pytest for testing. Run the test suite with:
```bash
# Run all tests
make test
```
### Code Quality
```bash
# Run linting
make lint
# Run code formatting
make format
```
### Continuous Integration
The project includes a GitHub Actions workflow (`.github/workflows/test.yml`) that automatically:
- Runs tests on Python 3.10, 3.11, and 3.12
- Executes linting checks using ruff
- Runs on every push and pull request to the `main` branch
The CI pipeline ensures code quality and compatibility across supported Python versions before any changes are merged.
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
The package is deployed to PyPI automatically when `project.version` is updated in `pyproject.toml`.
Follow semver for versioning.
Please include a version bump in your PR so that changes to the core logic are released.
## License
[MIT License](LICENSE)