Airflow logs are not getting generated in AKS PVC/PV (azureblob-fuse-premium). All pods are failing. #45406
Unanswered
abhijit-sarkar-infocepts asked this question in General
Replies: 2 comments
- Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise a PR to address this issue, please do so; no need to wait for approval.
- Converted to a discussion: this is not an Airflow issue, it's a deployment configuration question. Hopefully someone will be able to help with it.
Apache Airflow version
Other Airflow 2 version (please specify below)
If "Other Airflow 2 version" selected, which one?
2.9.3
What happened?
I am deploying Airflow 2.9.3 in AKS with the official Helm chart 1.15.0, using the CeleryExecutor. For logs I have created a PV and PVC with the azureblob-fuse-premium storage class, but I found that after deployment all pods fail when they try to create files in the PV.
Note: From inside the pods I tried to create a file (touch test.txt) under /opt/airflow/logs, and it was created in the PV (Azure storage account).
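For context, the logs volume is wired into the chart through its logs persistence settings, roughly like this (a minimal values.yaml sketch only; the claim name airflow-logs-pvc is a hypothetical placeholder, not the exact value used):

# values.yaml excerpt for the official Airflow Helm chart (1.15.0)
# NOTE: illustrative sketch; the PVC name below is a placeholder assumption
logs:
  persistence:
    enabled: true
    # pre-created PVC backed by the azureblob-fuse-premium storage class
    existingClaim: airflow-logs-pvc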
What you think should happen instead?
I first tried a PVC backed by the Azure file-share driver: the logs were generated, but the Airflow pods were unable to fetch them from the PV, because Azure file shares do not allow the os.chmod operation. I then tried an azureblob-fuse-premium PVC after following the Azure Airflow document. The logs should be generated in the PV, and the Airflow pods should be able to read those logs.
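For reference, the blob-fuse claim looks roughly like this (a minimal sketch assuming dynamic provisioning via the built-in azureblob-fuse-premium storage class; the name and size are illustrative, and the namespace is taken from the worker logs below):

# Illustrative PVC sketch; name and size are placeholder assumptions
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: airflow-logs-pvc
  namespace: airflow293
spec:
  accessModes:
    - ReadWriteMany   # the Azure Blob CSI (blobfuse) driver supports ReadWriteMany
  storageClassName: azureblob-fuse-premium
  resources:
    requests:
      storage: 100Gi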
Logs
Usage: python -m celery [OPTIONS] COMMAND [ARGS]...
Try 'python -m celery --help' for help.
Error: Invalid value for '-A' / '--app':
Unable to load celery application.
While trying to load the module airflow.providers.celery.executors.celery_executor.app the following error occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
self._accessor.mkdir(self, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler/2025-01-05'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
self._accessor.mkdir(self, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/logging/config.py", line 563, in configure
handler = self.configure_handler(handlers[name])
File "/usr/local/lib/python3.8/logging/config.py", line 744, in configure_handler
result = factory(**kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/file_processor_handler.py", line 53, in init
Path(self._get_log_directory()).mkdir(parents=True, exist_ok=True)
File "/usr/local/lib/python3.8/pathlib.py", line 1292, in mkdir
self.parent.mkdir(parents=True, exist_ok=True)
File "/usr/local/lib/python3.8/pathlib.py", line 1293, in mkdir
self.mkdir(mode, parents=False, exist_ok=exist_ok)
File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
self._accessor.mkdir(self, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/imports.py", line 59, in symbol_by_name
module = imp(module_name, package=package, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/utils/imports.py", line 109, in import_from_cwd
return imp(module, package=package)
File "/usr/local/lib/python3.8/importlib/init.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 975, in _find_and_load_unlocked
File "", line 671, in _load_unlocked
File "", line 843, in exec_module
File "", line 219, in _call_with_frames_removed
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/init.py", line 74, in
settings.initialize()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py", line 531, in initialize
LOGGING_CLASS_PATH = configure_logging()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", line 74, in configure_logging
raise e
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", line 69, in configure_logging
dictConfig(logging_config)
File "/usr/local/lib/python3.8/logging/config.py", line 808, in dictConfig
dictConfigClass(config).configure()
File "/usr/local/lib/python3.8/logging/config.py", line 570, in configure
raise ValueError('Unable to configure handler '
ValueError: Unable to configure handler 'processor'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/celery/bin/celery.py", line 58, in convert
return find_app(value)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/app/utils.py", line 383, in find_app
sym = symbol_by_name(app, imp=imp)
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/imports.py", line 61, in symbol_by_name
reraise(ValueError,
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/exceptions.py", line 34, in reraise
raise value.with_traceback(tb)
File "/home/airflow/.local/lib/python3.8/site-packages/kombu/utils/imports.py", line 59, in symbol_by_name
module = imp(module_name, package=package, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/celery/utils/imports.py", line 109, in import_from_cwd
return imp(module, package=package)
File "/usr/local/lib/python3.8/importlib/init.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 961, in _find_and_load_unlocked
File "", line 219, in _call_with_frames_removed
File "", line 1014, in _gcd_import
File "", line 991, in _find_and_load
File "", line 975, in _find_and_load_unlocked
File "", line 671, in _load_unlocked
File "", line 843, in exec_module
File "", line 219, in _call_with_frames_removed
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/init.py", line 74, in
settings.initialize()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py", line 531, in initialize
LOGGING_CLASS_PATH = configure_logging()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", line 74, in configure_logging
raise e
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", line 69, in configure_logging
dictConfig(logging_config)
File "/usr/local/lib/python3.8/logging/config.py", line 808, in dictConfig
dictConfigClass(config).configure()
File "/usr/local/lib/python3.8/logging/config.py", line 570, in configure
raise ValueError('Unable to configure handler '
ValueError: Couldn't import 'airflow.providers.celery.executors.celery_executor.app': Unable to configure handler 'processor'
PS C:\AirflowSetup\Prod> kubectl logs airflow-worker-6f9f456bbd-bdfnk -n airflow293
Defaulted container "worker" out of: worker, wait-for-airflow-migrations (init)
....................
ERROR! Maximum number of retries (20) reached.
Last check result:
$ airflow db check
Unable to load the config, contains a configuration error.
Traceback (most recent call last):
File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
self._accessor.mkdir(self, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler/2025-01-05'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
self._accessor.mkdir(self, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.8/logging/config.py", line 563, in configure
handler = self.configure_handler(handlers[name])
File "/usr/local/lib/python3.8/logging/config.py", line 744, in configure_handler
result = factory(**kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/utils/log/file_processor_handler.py", line 53, in init
Path(self._get_log_directory()).mkdir(parents=True, exist_ok=True)
File "/usr/local/lib/python3.8/pathlib.py", line 1292, in mkdir
self.parent.mkdir(parents=True, exist_ok=True)
File "/usr/local/lib/python3.8/pathlib.py", line 1293, in mkdir
self.mkdir(mode, parents=False, exist_ok=exist_ok)
File "/usr/local/lib/python3.8/pathlib.py", line 1288, in mkdir
self._accessor.mkdir(self, mode)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/airflow/logs/scheduler'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 5, in
from airflow.main import main
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/init.py", line 74, in
settings.initialize()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/settings.py", line 531, in initialize
LOGGING_CLASS_PATH = configure_logging()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", line 74, in configure_logging
raise e
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/logging_config.py", line 69, in configure_logging
dictConfig(logging_config)
File "/usr/local/lib/python3.8/logging/config.py", line 808, in dictConfig
dictConfigClass(config).configure()
File "/usr/local/lib/python3.8/logging/config.py", line 570, in configure
raise ValueError('Unable to configure handler '
ValueError: Unable to configure handler 'processor'
How to reproduce
I am setting up an air-gapped Airflow deployment with the configuration described above.
Operating System
Linux (Ubuntu)
Versions of Apache Airflow Providers
apache-airflow-providers-amazon==8.25.0
apache-airflow-providers-celery==3.7.2
apache-airflow-providers-cncf-kubernetes==8.4.2
apache-airflow-providers-common-io==1.3.2
apache-airflow-providers-common-sql==1.14.2
apache-airflow-providers-docker==3.12.2
apache-airflow-providers-elasticsearch==5.4.1
apache-airflow-providers-fab==1.2.2
apache-airflow-providers-ftp==3.10.0
apache-airflow-providers-google==10.21.0
apache-airflow-providers-grpc==3.5.2
apache-airflow-providers-hashicorp==3.7.1
apache-airflow-providers-http==4.12.0
apache-airflow-providers-imap==3.6.1
apache-airflow-providers-microsoft-azure==10.0.0
apache-airflow-providers-microsoft-winrm==3.4.0
apache-airflow-providers-mysql==5.6.2
apache-airflow-providers-odbc==4.6.2
apache-airflow-providers-openlineage==1.9.1
apache-airflow-providers-postgres==5.11.2
apache-airflow-providers-redis==3.7.1
apache-airflow-providers-sendgrid==3.5.1
apache-airflow-providers-sftp==4.10.2
apache-airflow-providers-slack==8.7.1
apache-airflow-providers-smtp==1.7.1
apache-airflow-providers-snowflake==4.1.0
apache-airflow-providers-sqlite==3.8.1
apache-airflow-providers-ssh==3.11.2
Deployment
Official Apache Airflow Helm Chart
Deployment details
Airflow - 2.9.3
Python - 3.8
Helm chart - 1.15.0
Kubernetes - 1.29.9
Anything else?
No response
Are you willing to submit PR?
Code of Conduct