
python - DAGs from the DAG folder are not visible on the Airflow home page (localhost:8080) - Stack Overflow


I have three DAG files (example1.py, example2.py, and example3.py) in the DAG folder at airflow/airflow-docker/dags (a Docker container opened in VS Code), and they're not showing up on the Airflow web home page; the page just says 'no results'.

My setup: I'm running Airflow inside a Docker container and using the VS Code terminal for CLI commands.

I tried setting the environment variable in the docker-compose.yaml file to the absolute path below:

AIRFLOW__CORE__DAGS_FOLDER: '/workspaces/my_dir/airflow/airflow-docker/dags'

which didn't work.

I don't have any config file; I'm just trying to make this work by editing the docker-compose.yaml generated by this command:

curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.10.4/docker-compose.yaml'

I've also tried airflow dags list, which shows me all the example DAGs but not the DAGs I've put in my dags folder.
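
One thing worth checking at this point: a DAG file that fails to import is silently hidden from the UI. A quick way to surface that is the import-errors subcommand run inside a container; a minimal sketch, assuming the airflow-scheduler service name from the official docker-compose.yaml (adjust if yours differs):

# Lists DAG files that failed to parse, with their tracebacks:
docker compose exec airflow-scheduler airflow dags list-import-errors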

I also want to mention the volumes section in my docker-compose.yaml file:

volumes:
  - ${AIRFLOW_PROJ_DIR:-.}/dags:/opt/airflow/dags
  - ${AIRFLOW_PROJ_DIR:-.}/logs:/opt/airflow/logs
  - ${AIRFLOW_PROJ_DIR:-.}/plugins:/opt/airflow/plugins
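
To confirm what the container actually sees through that bind mount, you can list the in-container DAG directory; a quick check, again assuming the airflow-scheduler service name from the official compose file:

# If your example1.py/example2.py/example3.py appear here, the mount is
# correct and the scheduler will pick them up:
docker compose exec airflow-scheduler ls /opt/airflow/dags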

Attaching one of my example DAGs below:

from datetime import datetime, timedelta

from airflow.decorators import dag, task


default_args = {
    'owner': 'abc',
    'depends_on_past': False,
    'start_date': datetime(2023, 1, 1),
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

@dag(
    'my_example',
    default_args=default_args,
    description='A simple hello DAG',
    schedule_interval=None,  # disable scheduling; manual trigger only
    catchup=False,
)
def taskflow():
    @task
    def say_hello(name):
        return f'Yo, {name}!'

    # Two instances of the same task, chained A >> Z
    (
        say_hello.override(task_id='A')('A')
        >> say_hello.override(task_id='Z')('Z')
    )

taskflow()

I'm looking for guidance on which directory I should put my DAGs in so they load automatically on the Airflow home page.

Thanks!

1 Answer

Airflow is looking for DAG files inside the container. In the docker-compose file you’re using, you have these volume mappings:

volumes:
  - ${AIRFLOW_PROJ_DIR:-.}/dags:/opt/airflow/dags
  - ${AIRFLOW_PROJ_DIR:-.}/logs:/opt/airflow/logs 
  - ${AIRFLOW_PROJ_DIR:-.}/plugins:/opt/airflow/plugins

This means that the folder on your host at ${AIRFLOW_PROJ_DIR:-.}/dags is mounted into the container at /opt/airflow/dags. (Note: AIRFLOW_PROJ_DIR defaults to the directory where the docker-compose file is located.)
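
If your project lives somewhere other than the compose file's directory, you can set AIRFLOW_PROJ_DIR explicitly in a .env file next to docker-compose.yaml, which Docker Compose loads automatically. A sketch, using the (hypothetical, taken from the question) path as the value:

# .env, in the same directory as docker-compose.yaml
# Adjust to wherever your dags/, logs/, and plugins/ folders live:
AIRFLOW_PROJ_DIR=/workspaces/my_dir/airflow/airflow-docker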

Airflow will look for DAGs in /opt/airflow/dags in the container's file system.

The AIRFLOW__CORE__DAGS_FOLDER environment variable is unnecessary here. From what you describe, you were pointing Airflow at a directory on your host rather than at the path inside the container.

Because of this, Airflow sees only the example DAGs, which ship inside the Airflow package itself and are enabled by AIRFLOW__CORE__LOAD_EXAMPLES: 'true' in the official compose file, and not your custom DAGs, which sit in a location the container never reads.
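
As a side note, if you don't want the example DAGs cluttering the list while you debug, you can turn them off in the environment block under the x-airflow-common anchor of the official compose file (optional, not required for the fix):

# docker-compose.yaml, under x-airflow-common -> environment:
AIRFLOW__CORE__LOAD_EXAMPLES: 'false'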

  1. To fix this, put your custom DAG files (e.g. example1.py, example2.py, and example3.py) into the dags folder that is part of your Airflow project directory (i.e. the same directory where your docker-compose.yaml is located). This folder is being mounted to /opt/airflow/dags inside the container.

  2. Remove the AIRFLOW__CORE__DAGS_FOLDER environment variable.

  3. Restart your containers:

docker-compose down
docker-compose up -d
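
After the restart, you can confirm the fix from inside a container before reloading the web UI; a quick check, assuming the airflow-scheduler service name from the official compose file:

# my_example (and your example1/2/3 DAGs) should now appear in the output:
docker compose exec airflow-scheduler airflow dags list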