Airflow scheduler --help
{{lc}}
Revision as of 08:49, 11 November 2022
<pre>
usage: airflow scheduler [-h] [-D] [-p] [-l LOG_FILE] [-n NUM_RUNS]
                         [--pid [PID]] [-s] [--stderr STDERR]
                         [--stdout STDOUT] [-S SUBDIR]

Start a scheduler instance

options:
  -h, --help            show this help message and exit
  -D, --daemon          Daemonize instead of running in the foreground
  -p, --do-pickle       Attempt to pickle the DAG object to send over to the
                        workers, instead of letting workers run their version
                        of the code
  -l LOG_FILE, --log-file LOG_FILE
                        Location of the log file
  -n NUM_RUNS, --num-runs NUM_RUNS
                        Set the number of runs to execute before exiting
  --pid [PID]           PID file location
  -s, --skip-serve-logs
                        Don't start the serve logs process along with the
                        workers
  --stderr STDERR       Redirect stderr to this file
  --stdout STDOUT       Redirect stdout to this file
  -S SUBDIR, --subdir SUBDIR
                        File location or directory from which to look for the
                        dag. Defaults to '[AIRFLOW_HOME]/dags' where
                        [AIRFLOW_HOME] is the value you set for 'AIRFLOW_HOME'
                        config you set in 'airflow.cfg'

Signals:

  - SIGUSR2: Dump a snapshot of task state being tracked by the executor.
    Example: pkill -f -USR2 "airflow scheduler"
</pre>
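The SIGUSR2 behavior described at the end of the help text can be sketched in Python (Airflow itself is Python-based). The handler and the <code>tracked_tasks</code> state below are illustrative stand-ins, not Airflow's actual scheduler implementation:

```python
import os
import signal

# Hypothetical stand-in for the task state the executor tracks
tracked_tasks = {"task_a": "running", "task_b": "queued"}
snapshots = []

def dump_state(signum, frame):
    # Airflow's scheduler logs a dump of executor task state here;
    # this sketch just records a copy of the current state.
    snapshots.append(dict(tracked_tasks))

# Install the handler for SIGUSR2
signal.signal(signal.SIGUSR2, dump_state)

# Deliver SIGUSR2 to this process, as
# `pkill -f -USR2 "airflow scheduler"` would to the scheduler
os.kill(os.getpid(), signal.SIGUSR2)

print(snapshots)  # the handler ran once and captured the state
```

Because the signal only triggers a dump and does not terminate the process, it is safe to send to a running scheduler for debugging.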
= See also =