https://github.com/alvarocavalcante/airflow-parse-bench
Stop creating bad DAGs! Use this tool to measure and compare the parse time of your DAGs, identify bottlenecks, and optimize your Airflow environment for better performance.
Science Score: 13.0%
This score indicates how likely this project is to be science-related based on various indicators:
- ○ CITATION.cff file
- ✓ codemeta.json file (codemeta.json file found)
- ○ .zenodo.json file
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity (low similarity, 14.5%, to scientific vocabulary)
Keywords
Repository
Basic Info
- Host: GitHub
- Owner: AlvaroCavalcante
- License: apache-2.0
- Language: Python
- Default Branch: main
- Homepage: https://medium.com/p/146fcf4d27f7
- Size: 192 KB
Statistics
- Stars: 19
- Watchers: 2
- Forks: 0
- Open Issues: 6
- Releases: 1
Topics
Metadata Files
README.md
Airflow Dag Parse Benchmarking
Stop creating bad DAGs!
Use this tool to measure and compare the parse time of your DAGs, identify bottlenecks, and optimize your Airflow environment for better performance.
Contents
How It Works
Retrieving parse metrics from an Airflow cluster is straightforward, but measuring the effectiveness of code optimizations can be tedious. Each code change requires redeploying the Python file to your cloud provider, waiting for the DAG to be parsed, and then extracting a new report: a slow, repetitive cycle.
This tool simplifies the process of measuring and comparing DAG parse times. It uses the same parse method as Airflow (from the Airflow repository) to measure the time taken to parse your DAGs locally, storing results for future comparisons.
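Since Airflow executes the top level of each DAG file during parsing, parse time is essentially the time it takes to compile and run the module. The sketch below is a rough, hypothetical illustration of that idea (the tool itself reuses Airflow's own parse method, not plain `compile`/`exec`):

```python
import pathlib
import tempfile
import time

# Hypothetical stand-in for a DAG file: top-level code that runs at parse time.
dag_source = "x = 1\nfor i in range(1000):\n    x += i\n"

with tempfile.TemporaryDirectory() as tmp:
    dag_file = pathlib.Path(tmp) / "dag_test.py"
    dag_file.write_text(dag_source)

    # Time how long it takes to compile and execute the file's top level,
    # which is roughly what "parsing" a DAG file means.
    start = time.perf_counter()
    code = compile(dag_file.read_text(), str(dag_file), "exec")
    exec(code, {})
    elapsed = time.perf_counter() - start

print(f"parse time: {elapsed:.6f}s")
```

This is why expensive top-level code (API calls, heavy imports, large loops) directly inflates a DAG's parse time.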
To learn more about how the tool works, check out the Medium article.
Installation
It's recommended to use a virtualenv to avoid library conflicts. Once set up, you can install the package by running the following command:
```bash
pip install airflow-parse-bench
```
Install your Airflow dependencies
The command above installs only the essential library dependencies (Airflow and Airflow providers). You’ll need to manually install any additional libraries that your DAGs depend on.
For example, if a DAG uses boto3 to interact with AWS, ensure that boto3 is installed in your environment. Otherwise, you'll encounter parse errors.
Init the Airflow database
Before parsing your DAGs, you also need to initialize an Airflow metadata database on your local machine. To do this, run the command below:
```bash
airflow db init
```
Configure your Airflow Variables
If your DAGs use Airflow Variables, you must define them locally as well. Use placeholder values, as the actual values aren't required for parsing purposes.
To set up Airflow Variables locally, you can use the following command:
```bash
airflow variables set MY_VARIABLE 'ANY TEST VALUE'
```
Without this, you'll encounter an error like:
```bash
error: 'Variable MY_VARIABLE does not exist'
```
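Besides `airflow variables set`, Airflow also reads Variables from environment variables named `AIRFLOW_VAR_<KEY>`, which can be convenient for local parsing. A minimal sketch of the naming convention (no Airflow import needed):

```python
import os

# Airflow resolves a Variable named MY_VARIABLE from the environment
# variable AIRFLOW_VAR_MY_VARIABLE if it is set. Exporting a placeholder
# this way also avoids the "Variable ... does not exist" parse error.
os.environ["AIRFLOW_VAR_MY_VARIABLE"] = "ANY TEST VALUE"

key = "MY_VARIABLE"
value = os.environ[f"AIRFLOW_VAR_{key}"]
print(value)  # ANY TEST VALUE
```

Either approach works; the environment-variable route needs no database write.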
Usage
To measure the parse time of a single Python file, just run:
```bash
airflow-parse-bench --path your_path/dag_test.py
```
The output will look like this:

*(screenshot of the result table)*
The result table includes the following columns:
- Filename: The name of the Python module containing the DAG. This unique name is the key used to store DAG information.
- Current Parse Time: The time (in seconds) taken to parse the DAG.
- Previous Parse Time: The parse time from the previous run.
- Difference: The difference between the current and previous parse times.
- Best Parse Time: The best parse time recorded for the DAG.
You can also measure the parse time for all Python files in a directory by running:
```bash
airflow-parse-bench --path your_path/your_dag_folder
```
This time, the output table will display parse times for all Python files in the folder:

*(screenshot of the result table for a folder)*
Additional Options
The library supports some additional arguments to customize the results. To see all available options, run:
```bash
airflow-parse-bench --help
```
It will display the following options:
- --path: The path to the Python file or directory containing the DAGs.
- --order: The order in which the results are displayed. You can choose between 'asc' (ascending) or 'desc' (descending).
- --num-iterations: The number of times to parse each DAG. The parse time will be averaged across iterations.
- --skip-unchanged: Skip DAGs that haven't changed since the last run.
- --reset-db: Clear all stored data in the local database, starting a fresh execution.
Note: If a Python file has parsing errors or contains no valid DAGs, it will be excluded from the results table, and an error message will be displayed.
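The `--num-iterations` option averages repeated measurements to smooth out one-off variance. A rough sketch of that idea, assuming plain `compile`/`exec` timing rather than Airflow's real parser:

```python
import statistics
import time

# Hypothetical DAG-like source whose top level runs at parse time.
source = "total = sum(range(10_000))"
num_iterations = 5  # mirrors the --num-iterations flag

# Parse the same source several times and report the mean.
timings = []
for _ in range(num_iterations):
    start = time.perf_counter()
    exec(compile(source, "<dag>", "exec"), {})
    timings.append(time.perf_counter() - start)

mean_parse_time = statistics.mean(timings)
print(f"mean over {num_iterations} runs: {mean_parse_time:.6f}s")
```

Averaging matters because a single measurement can be skewed by filesystem caching or background load on the machine.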
Roadmap
This project is still in its early stages, and there are many improvements planned for the future. Some of the features we're considering include:
- Cloud DAG Parsing: Automatically download and parse DAGs from cloud providers like AWS S3 or Google Cloud Storage.
- CI/CD Integration: Adapt the tool to work with CI/CD pipelines.
- Parallel Parsing: Speed up processing by parsing multiple DAGs simultaneously.
- Support .airflowignore: Ignore files and directories specified in the `.airflowignore` file.
If you’d like to suggest a feature or report a bug, please open a new issue!
Contributing
This project is open to contributions! If you want to collaborate to improve the tool, please follow these steps:
- Open a new issue to discuss the feature or bug you want to address.
- Once approved, fork the repository and create a new branch.
- Implement the changes.
- Create a pull request with a detailed description of the changes.
Owner
- Name: Alvaro Leandro Cavalcante Carneiro
- Login: AlvaroCavalcante
- Kind: user
- Location: São Paulo, SP
- Company: Bank Master
- Website: https://alvarocavalcante.github.io
- Repositories: 6
- Profile: https://github.com/AlvaroCavalcante
Master's degree student and Data Engineer at Bank Master.
GitHub Events
Total
- Issues event: 3
- Watch event: 17
- Issue comment event: 3
- Push event: 2
- Public event: 1
Last Year
- Issues event: 3
- Watch event: 17
- Issue comment event: 3
- Push event: 2
- Public event: 1
Packages
- Total packages: 1
- Total downloads: 118 last month (PyPI)
- Total dependent packages: 0
- Total dependent repositories: 0
- Total versions: 8
- Total maintainers: 1
pypi.org: airflow-parse-bench
Easily measure and compare your Airflow DAGs' parse time.
- Homepage: https://github.com/AlvaroCavalcante/airflow-parse-bench
- Documentation: https://airflow-parse-bench.readthedocs.io/
- License: Apache License 2.0
- Latest release: 1.0.1 (published about 1 year ago)
Rankings
Maintainers (1)
Dependencies
- actions/first-interaction v1 composite
- apache-airflow ==2.10.4
- apache-airflow-providers-apache-beam ==5.9.1
- apache-airflow-providers-common-compat ==1.2.2
- apache-airflow-providers-common-io ==1.4.2
- apache-airflow-providers-common-sql ==1.20.0
- apache-airflow-providers-google ==10.26.0
- apache-airflow-providers-sqlite ==3.9.0
- apache-airflow-providers-standard ==0.0.2
- colorama ==0.4.6
- tqdm ==4.67.1