fix(airflow): set DAGS_FOLDER in image env and reserialize on init
All checks were successful
Build and Push Docker Images / Build Backend (FastAPI) (push) Successful in 32s
Build and Push Docker Images / Build Frontend (Next.js) (push) Successful in 1m5s
Build and Push Docker Images / Build Integrator (push) Successful in 57s
Build and Push Docker Images / Build Kestra Init (push) Successful in 32s
Build and Push Docker Images / Build Pipeline (Meltano + dbt + Airflow) (push) Successful in 32s
Build and Push Docker Images / Trigger Portainer Update (push) Successful in 0s

- Add AIRFLOW__CORE__DAGS_FOLDER env var in Dockerfile so it's always set
- Run `airflow dags reserialize` after `db migrate` in the init container so
  DAGs appear immediately instead of waiting for the scheduler's scan interval
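Taken together, the two changes can be sketched as a compose fragment (a minimal sketch assembled from the diffs in this commit; the image and env values are this project's own, not general defaults):

```yaml
# Sketch: the init container applies DB migrations, then forces immediate
# DAG serialization so the UI shows DAGs without waiting for the scheduler.
airflow-init:
  image: privaterepo.sitaru.org/tudor/school_compare-pipeline:latest
  command: bash -c "airflow db migrate && airflow dags reserialize"
  environment:
    # Also baked into the image via ENV, so it is set even outside compose
    AIRFLOW__CORE__DAGS_FOLDER: /opt/pipeline/dags
```

Setting `AIRFLOW__CORE__DAGS_FOLDER` in the image itself means every entrypoint (api-server, scheduler, one-off CLI runs) agrees on the DAGs path, rather than relying on each compose file to pass it in.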

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-03-26 11:05:41 +00:00
parent e815f597ab
commit b6a487776b
3 changed files with 3 additions and 2 deletions


@@ -250,7 +250,7 @@ services:
   airflow-init:
     image: privaterepo.sitaru.org/tudor/school_compare-pipeline:latest
     container_name: schoolcompare_airflow_init
-    command: airflow db migrate
+    command: bash -c "airflow db migrate && airflow dags reserialize"
     environment:
       AIRFLOW__CORE__EXECUTOR: LocalExecutor
       AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://${DB_USERNAME}:${DB_PASSWORD}@sc_database:5432/${DB_DATABASE_NAME}


@@ -149,7 +149,7 @@ services:
   airflow-init:
     image: privaterepo.sitaru.org/tudor/school_compare-pipeline:latest
     container_name: schoolcompare_airflow_init
-    command: airflow db migrate
+    command: bash -c "airflow db migrate && airflow dags reserialize"
     environment: *airflow-env
     depends_on:
       db:


@@ -33,6 +33,7 @@ COPY dags/ dags/
 RUN cd transform && dbt deps --profiles-dir . 2>/dev/null || true
 ENV AIRFLOW_HOME=/opt/airflow
+ENV AIRFLOW__CORE__DAGS_FOLDER=/opt/pipeline/dags
 ENV PYTHONPATH=/opt/pipeline
 CMD ["airflow", "api-server"]