Before you can use Cloud Composer (Airflow) to schedule jobs, you need access rights. You can obtain them by adding yourself to the user list in the users_composer.yaml file.
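The exact schema of users_composer.yaml depends on the repository, so mirror the existing entries when adding yourself. As a purely hypothetical sketch (the key names and the e-mail address below are assumptions, not the real format), an entry might look like:

    # users_composer.yaml -- hypothetical entry; copy the format of the existing entries
    composer_users:
      - name: Jane Doe                # your name
        email: jane.doe@example.com   # your corporate Google account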

Once your change is merged, you will (after some time) gain access to the safi-env-dev-cloudcomposer and safi-env-stage-cloudcomposer projects on Google Cloud Platform. This allows you to configure Cloud Composer:

Use the Airflow webserver to access the Airflow UI and verify that your DAG has been uploaded successfully and runs as expected.

Use the DAGs folder to upload new DAGs. These should be committed to your repository (e.g., statement manager/dags), but they also have to be uploaded manually to the environment's bucket. Once the new Python script is in the bucket, Airflow picks it up automatically; see the sketch below.
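As a minimal sketch of what such a DAG file might contain, assuming an Airflow 2.x environment (the dag_id, schedule, and task below are placeholders, not the actual statement manager job):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator  # Airflow 2.x import path

    # Hypothetical example DAG; dag_id, schedule and task are placeholders.
    with DAG(
        dag_id="statement_manager_example",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        hello = BashOperator(
            task_id="say_hello",
            bash_command="echo 'hello from Cloud Composer'",
        )

For the manual upload you can drop the file into the bucket via the Cloud Console, or use the gcloud CLI; the environment name, region, and file path below are placeholders for illustration:

    # Upload a DAG file to the environment's DAGs folder
    gcloud composer environments storage dags import \
        --environment <your-composer-environment> \
        --location <region> \
        --source dags/statement_manager_example.py

After the upload, the DAG should appear in the Airflow UI once the scheduler has parsed it, which can take a few minutes.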
