# mcp-server-apache-airflow

A Model Context Protocol (MCP) server implementation for [Apache Airflow](https://airflow.apache.org/), enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol, using the official Python client.
<a href="https://glama.ai/mcp/servers/e99b6vx9lw">
<img width="380" height="200" src="https://glama.ai/mcp/servers/e99b6vx9lw/badge" alt="Server for Apache Airflow MCP server" />
</a>
## About
This project implements a Model Context Protocol server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
## Feature Implementation Status
| Feature | API Path | Status |
|---|---|---|
| **DAG Management** | | |
| List DAGs | | ✅ |
| Get DAG Details | | ✅ |
| Pause DAG | | ✅ |
| Unpause DAG | | ✅ |
| Update DAG | | ✅ |
| Delete DAG | | ✅ |
| Get DAG Source | | ✅ |
| Patch Multiple DAGs | | ✅ |
| Reparse DAG File | | ✅ |
| **DAG Runs** | | |
| List DAG Runs | | ✅ |
| Create DAG Run | | ✅ |
| Get DAG Run Details | | ✅ |
| Update DAG Run | | ✅ |
| Delete DAG Run | | ✅ |
| Get DAG Runs Batch | | ✅ |
| Clear DAG Run | | ✅ |
| Set DAG Run Note | | ✅ |
| Get Upstream Dataset Events | | ✅ |
| **Tasks** | | |
| List DAG Tasks | | ✅ |
| Get Task Details | | ✅ |
| Get Task Instance | | ✅ |
| List Task Instances | | ✅ |
| Update Task Instance | | ✅ |
| Clear Task Instances | | ✅ |
| Set Task Instances State | | ✅ |
| **Variables** | | |
| List Variables | | ✅ |
| Create Variable | | ✅ |
| Get Variable | | ✅ |
| Update Variable | | ✅ |
| Delete Variable | | ✅ |
| **Connections** | | |
| List Connections | | ✅ |
| Create Connection | | ✅ |
| Get Connection | | ✅ |
| Update Connection | | ✅ |
| Delete Connection | | ✅ |
| Test Connection | | ✅ |
| **Pools** | | |
| List Pools | | ✅ |
| Create Pool | | ✅ |
| Get Pool | | ✅ |
| Update Pool | | ✅ |
| Delete Pool | | ✅ |
| **XComs** | | |
| List XComs | | ✅ |
| Get XCom Entry | | ✅ |
| **Datasets** | | |
| List Datasets | | ✅ |
| Get Dataset | | ✅ |
| Get Dataset Events | | ✅ |
| Create Dataset Event | | ✅ |
| Get DAG Dataset Queued Event | | ✅ |
| Get DAG Dataset Queued Events | | ✅ |
| Delete DAG Dataset Queued Event | | ✅ |
| Delete DAG Dataset Queued Events | | ✅ |
| Get Dataset Queued Events | | ✅ |
| Delete Dataset Queued Events | | ✅ |
| **Monitoring** | | |
| Get Health | | ✅ |
| **DAG Stats** | | |
| Get DAG Stats | | ✅ |
| **Config** | | |
| Get Config | | ✅ |
| **Plugins** | | |
| Get Plugins | | ✅ |
| **Providers** | | |
| List Providers | | ✅ |
| **Event Logs** | | |
| List Event Logs | | ✅ |
| Get Event Log | | ✅ |
| **System** | | |
| Get Import Errors | | ✅ |
| Get Import Error Details | | ✅ |
| Get Health Status | | ✅ |
| Get Version | | ✅ |
## Setup
### Dependencies
This project depends on the official Apache Airflow client library (`apache-airflow-client`). It is installed automatically when you install this package.

### Environment Variables
Set the following environment variables:

```
AIRFLOW_HOST=<your-airflow-host>
AIRFLOW_USERNAME=<your-airflow-username>
AIRFLOW_PASSWORD=<your-airflow-password>
```
## Usage with Claude Desktop
Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```
Alternative configuration using `uv`:

```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```
Replace `/path/to/mcp-server-apache-airflow` with the actual path where you've cloned the repository.

## Selecting the API groups
You can select the API groups you want to use by setting the `--apis` flag:

```shell
uv run mcp-server-apache-airflow --apis "dag,dagrun"
```
The default is to use all APIs.
Allowed values are:
- config
- connections
- dag
- dagrun
- dagstats
- dataset
- eventlog
- importerror
- monitoring
- plugin
- pool
- provider
- taskinstance
- variable
- xcom
## Manual Execution
You can also run the server manually:

```shell
make run
```
`make run` accepts the following options:

- `--port`: Port to listen on for SSE (default: 8000)
- `--transport`: Transport type (`stdio`/`sse`, default: `stdio`)
Alternatively, you can run the SSE server directly, which accepts the same options:

```shell
make run-sse
```
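The two options above can be sketched as a minimal argument parser. This is an assumption-laden illustration of the documented CLI surface (`--port`, `--transport`, `--apis`), not the project's actual entry point:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical parser mirroring the documented command-line options."""
    parser = argparse.ArgumentParser(prog="mcp-server-apache-airflow")
    parser.add_argument("--port", type=int, default=8000,
                        help="Port to listen on for SSE (default: 8000)")
    parser.add_argument("--transport", choices=("stdio", "sse"), default="stdio",
                        help="Transport type (default: stdio)")
    parser.add_argument("--apis", default=None,
                        help="Comma-separated API groups to expose (default: all)")
    return parser


# Example: run over SSE on a non-default port.
args = build_parser().parse_args(["--transport", "sse", "--port", "8080"])
```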
## Installing via Smithery
To install Apache Airflow MCP Server for Claude Desktop automatically via Smithery:

```shell
npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude
```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License