This is an experimental library as of June 2021! The Great Expectations core team maintains this provider in an experimental state and does not guarantee ongoing support yet.
An Airflow operator for Great Expectations, a Python library for testing and validating data.
- This operator has been updated to use Great Expectations Checkpoints instead of the former ValidationOperators. It therefore requires Great Expectations >=v0.13.9, which is pinned in the requirements.txt starting with release 0.0.5.
- Great Expectations version 0.13.8 contained a bug that prevents this operator from working.
- Great Expectations version 0.13.7 and below will work with version 0.0.4 of this operator and below.
This package has been most recently tested with Airflow 2.0 and Great Expectations v0.13.9.
Pre-requisites: an environment running `great-expectations` and `apache-airflow` - these are requirements of this package that will be installed as dependencies.

```bash
pip install airflow-provider-great-expectations
```
In order to run the `BigQueryOperator`, you will also need to install the relevant dependencies: `pybigquery` and `apache-airflow-providers-google`.
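Both can be installed with pip:

```bash
pip install pybigquery apache-airflow-providers-google
```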
Depending on your use case, you might need to add `ENV AIRFLOW__CORE__ENABLE_XCOM_PICKLING=true` to your Dockerfile to enable XCom to pass data between tasks.
Great Expectations Operator: A base operator for Great Expectations. Import into your DAG via:
```python
from great_expectations_provider.operators.great_expectations import GreatExpectationsOperator
```
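For instance, a task that validates data by running an existing Checkpoint might look like the sketch below; the Checkpoint name and project path are placeholders, and the parameter names should be checked against the operator's docstring for your installed version:

```python
from great_expectations_provider.operators.great_expectations import GreatExpectationsOperator

# Run a Checkpoint defined in your Great Expectations project; both values
# below are placeholders for your own Checkpoint and project directory.
ge_checkpoint_task = GreatExpectationsOperator(
    task_id='validate_with_checkpoint',
    checkpoint_name='my.checkpoint',
    data_context_root_dir='/usr/local/airflow/include/great_expectations',
)
```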
Great Expectations BigQuery Operator: An operator for Great Expectations that provides some pre-set parameters for a BigQuery Datasource and Expectation, Validation, and Data Docs stores in Google Cloud Storage. The operator can also be configured to send email on validation failure. See the docstrings in the class for more configuration options. Import into your DAG via:
```python
from great_expectations_provider.operators.great_expectations_bigquery import GreatExpectationsBigQueryOperator
```
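As a rough sketch only - every value below is a placeholder, and the parameter names are assumptions that should be verified against the class docstring:

```python
from great_expectations_provider.operators.great_expectations_bigquery import GreatExpectationsBigQueryOperator

# Validate BigQuery data against an Expectation Suite, with the Expectation,
# Validation, and Data Docs stores kept in a GCS bucket. All values are placeholders.
ge_bigquery_task = GreatExpectationsBigQueryOperator(
    task_id='validate_bigquery_data',
    gcp_project='my-project',
    gcs_bucket='my-bucket',
    gcs_expectations_prefix='expectations',
    gcs_validations_prefix='validations',
    gcs_datadocs_prefix='data_docs',
    query='SELECT * FROM `my-project.my_dataset.my_table`',
    expectation_suite_name='my_suite',
    bq_dataset_name='my_dataset',
)
```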
See the example_dags directory for an example DAG with some sample tasks that demonstrate operator functionality. The example DAG file contains a comment with instructions on how to run the examples.
Note that to make these operators work, you will need to change the value of `enable_xcom_pickling` to `true` in your `airflow.cfg`.
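That is, in `airflow.cfg`:

```ini
[core]
enable_xcom_pickling = true
```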
These examples can be tested in one of two ways:
With the open-source Astro CLI (a condensed shell sketch of these steps follows the list):
- Initialize a project with the Astro CLI
- Copy the example DAG into the `dags/` folder of your Astro project
- Add the following env var to your `Dockerfile` to enable XCom pickling: `ENV AIRFLOW__CORE__ENABLE_XCOM_PICKLING=True`
- Copy the directories in the `include` folder of this repository into the `include` directory of your Astro project
- Add `airflow-provider-great-expectations` to your `requirements.txt`
- Run `astro dev start` to view the DAG on a local Airflow instance (you will need Docker running)
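Condensed as a shell sketch; `<this-repo>` stands for wherever you cloned this repository, and the `example_dags` path is a placeholder:

```bash
astro dev init                                                     # initialize a new Astro project
cp <this-repo>/example_dags/*.py dags/                             # copy the example DAG
echo 'ENV AIRFLOW__CORE__ENABLE_XCOM_PICKLING=True' >> Dockerfile  # enable XCom pickling
cp -r <this-repo>/include/* include/                               # copy the GE project and sample data
echo 'airflow-provider-great-expectations' >> requirements.txt
astro dev start                                                    # requires Docker
```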
With a vanilla Airflow installation (a sketch of the relevant path changes follows the list):
- Add the example DAG to your `dags/` folder
- Make the `great_expectations` and `data` directories in `include/` available in your environment
- Change the `data_file` and `ge_root_dir` paths in your DAG file to point to the appropriate places
- Change the paths in `great_expectations/checkpoints/*.yml` to point to the absolute path of your data files
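The `data_file` and `ge_root_dir` variable names come from the example DAG; the absolute paths below are hypothetical and should point at wherever you placed the `include/` contents:

```python
import os

# Hypothetical locations of the copied include/ contents - adjust for your environment.
base_path = '/usr/local/airflow/include'
data_file = os.path.join(base_path, 'data', 'your_sample_data.csv')  # placeholder file name
ge_root_dir = os.path.join(base_path, 'great_expectations')
```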
This operator is in early stages of development! Feel free to submit issues, PRs, or join the #integration-airflow channel in the Great Expectations Slack for feedback. Thanks to Pete DeJoy and the Astronomer.io team for the support.