Advanced logging configuration

Not all configuration options are available from the airflow.cfg file. The config file describes how to configure logging for tasks because the logs generated by tasks are not only written to separate files by default, they must also be accessible via the webserver.

By default, the logs of standard Airflow components are written to the $AIRFLOW_HOME/logs directory, but you can customize this behaviour by overriding the Python logger configuration with a custom logging configuration object. Some configuration options require that the logging config class be overwritten. You can do this by copying Airflow's default configuration and modifying it to suit your needs. The default configuration can be seen in the airflow_local_settings.py template, which shows the loggers and handlers used. Apart from the custom loggers and handlers configurable there via airflow.cfg, logging in Airflow follows the usual Python convention: Python objects log to loggers whose names follow the <package>.<module_name> pattern.
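
For example, a minimal sketch of that convention looks like the following (the module name used here is only an illustration):

    import logging

    # Objects defined in airflow.models.dag log to the logger named
    # "airflow.models.dag", so that particular logger can be adjusted
    # without touching anything else:
    logging.getLogger("airflow.models.dag").setLevel(logging.DEBUG)

    # Your own modules usually follow the same convention:
    logger = logging.getLogger(__name__)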

You can read more about the standard Python logging classes (Loggers, Handlers, Formatters) in the Python logging documentation.

Configuring your logging classes can be done via the logging_config_class option in the airflow.cfg file. This configuration should specify the import path to a configuration compatible with logging.config.dictConfig(). If the file is not in a standard import location, you should set the PYTHONPATH environment variable so that it can be imported.

Follow the steps below to enable custom logging config class:

  1. Start by setting the PYTHONPATH environment variable to a known directory, e.g. ~/airflow/

    export PYTHONPATH=~/airflow/
    
  2. Create a directory to store the config file e.g. ~/airflow/config

  3. Create a file called ~/airflow/config/log_config.py with the following contents:

    from copy import deepcopy
    from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG
    
    LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
    
  4. At the end of the file, add code to modify the default dictionary configuration; a sketch of such a modification is shown after this list.

  5. Update $AIRFLOW_HOME/airflow.cfg to contain:

    [logging]
    logging_config_class = log_config.LOGGING_CONFIG
    

You can also use logging_config_class together with remote logging if you just want to extend or update the configuration with remote logging enabled. In that case the deep-copied dictionary will already contain the remote logging configuration generated for you, and your modifications will be applied after the remote logging configuration has been added:

    [logging]
    remote_logging = True
    logging_config_class = log_config.LOGGING_CONFIG

  6. Restart the application.
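
As a sketch of step 4, the snippet below extends the deep-copied dictionary. The key names used here ("airflow.task", the "airflow" formatter) come from the default template and may differ between Airflow versions, and "my_file_handler" with its filename is purely hypothetical, so adapt everything to the airflow_local_settings.py shipped with your version:

    # ~/airflow/config/log_config.py (continued) -- illustrative sketch only.
    # Raise the verbosity of the dedicated task logger:
    LOGGING_CONFIG["loggers"]["airflow.task"]["level"] = "DEBUG"

    # Add an extra handler and attach it to the task logger:
    LOGGING_CONFIG["handlers"]["my_file_handler"] = {
        "class": "logging.FileHandler",
        "formatter": "airflow",
        "filename": "/tmp/airflow_extra_task.log",
    }
    LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"].append("my_file_handler")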

See Modules Management for details on how Python and Airflow manage modules.

Note

You can override the way both standard logs of the components and “task” logs are handled.
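
For example, continuing with the deep-copied dictionary above, standard component logs are routed through the root logger while task logs go through the dedicated "airflow.task" logger, so each can be adjusted independently (again, the exact key names come from the default template and may vary between versions):

    # Standard component logs (scheduler, webserver, ...) use the root logger:
    LOGGING_CONFIG["root"]["level"] = "WARNING"

    # Task logs use the dedicated "airflow.task" logger:
    LOGGING_CONFIG["loggers"]["airflow.task"]["level"] = "INFO"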
