Airflow 1.10.1rc2 logs not getting uploaded to S3 with an error
I am getting an error when Airflow tries to write logs to S3. In fact, the log file never actually appears in S3.
I created a connection of type S3 with the key and secret in the extra field.
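(For reference, the extra field is the usual JSON blob; if I understand Airflow's AwsHook correctly, these are the key names it looks for. Values redacted here:)
{"aws_access_key_id": "...", "aws_secret_access_key": "..."}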
I also added the settings below to airflow.cfg:
remote_logging = True
remote_log_conn_id = MyS3Conn
remote_base_log_folder = s3://org-airflow-logs/logs
encrypt_s3_logs = False
I am getting the error below:
worker_1 | [2018-11-21 18:20:38,211] {{connectionpool.py:735}} INFO - Starting new HTTPS connection (4): org-airflow-logs.s3.amazonaws.com
worker_1 | [2018-11-21 18:20:39,845] {{connectionpool.py:735}} INFO - Starting new HTTPS connection (3): org-airflow-logs.s3.amazonaws.com
worker_1 | [2018-11-21 18:20:43,177] {{connectionpool.py:735}} INFO - Starting new HTTPS connection (4): org-airflow-logs.s3.amazonaws.com
worker_1 | [2018-11-21 18:20:44,478] {{connectionpool.py:735}} INFO - Starting new HTTPS connection (5): org-airflow-logs.s3.amazonaws.com
[2018-11-21 15:34:08,456] {{s3_task_handler.py:173}} ERROR - Could not write logs to s3://org-airflow-logs/logs/tutorial/templated/2015-06-27T00:00:00+00:00/1.log
worker_1 | Traceback (most recent call last):
worker_1 | File "/usr/local/lib/python3.6/site-packages/airflow/utils/log/s3_task_handler.py", line 170, in s3_write
worker_1 | encrypt=configuration.conf.getboolean('core', 'ENCRYPT_S3_LOGS'),
worker_1 | File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", line 359, in load_string
worker_1 | encrypt=encrypt)
worker_1 | File "/usr/local/lib/python3.6/site-packages/airflow/hooks/S3_hook.py", line 399, in load_bytes
worker_1 | client.upload_fileobj(filelike_buffer, bucket_name, key, ExtraArgs=extra_args)
worker_1 | File "/usr/local/lib/python3.6/site-packages/boto3/s3/inject.py", line 539, in upload_fileobj
worker_1 | return future.result()
worker_1 | File "/usr/local/lib/python3.6/site-packages/s3transfer/futures.py", line 73, in result
worker_1 | return self._coordinator.result()
worker_1 | File "/usr/local/lib/python3.6/site-packages/s3transfer/futures.py", line 233, in result
worker_1 | raise self._exception
worker_1 | File "/usr/local/lib/python3.6/site-packages/s3transfer/tasks.py", line 126, in __call__
worker_1 | return self._execute_main(kwargs)
worker_1 | File "/usr/local/lib/python3.6/site-packages/s3transfer/tasks.py", line 150, in _execute_main
worker_1 | return_value = self._main(**kwargs)
worker_1 | File "/usr/local/lib/python3.6/site-packages/s3transfer/upload.py", line 692, in _main
worker_1 | client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/client.py", line 314, in _api_call
worker_1 | return self._make_api_call(operation_name, kwargs)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/client.py", line 599, in _make_api_call
worker_1 | operation_model, request_dict)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/endpoint.py", line 148, in make_request
worker_1 | return self._send_request(request_dict, operation_model)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/endpoint.py", line 177, in _send_request
worker_1 | success_response, exception):
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/endpoint.py", line 273, in _needs_retry
worker_1 | caught_exception=caught_exception, request_dict=request_dict)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/hooks.py", line 227, in emit
worker_1 | return self._emit(event_name, kwargs)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/hooks.py", line 360, in _emit
worker_1 | aliased_event_name, kwargs, stop_on_response
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/hooks.py", line 210, in _emit
worker_1 | response = handler(**kwargs)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/retryhandler.py", line 183, in __call__
worker_1 | if self._checker(attempts, response, caught_exception):
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/retryhandler.py", line 251, in __call__
worker_1 | caught_exception)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/retryhandler.py", line 277, in _should_retry
worker_1 | return self._checker(attempt_number, response, caught_exception)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/retryhandler.py", line 317, in __call__
worker_1 | caught_exception)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/retryhandler.py", line 223, in __call__
worker_1 | attempt_number, caught_exception)
worker_1 | File "/usr/local/lib/python3.6/site-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
worker_1 | raise caught_exception
worker_1 | botocore.exceptions.EndpointConnectionError: Could not connect to the endpoint URL: "https://org-airflow-logs.s3.amazonaws.com/logs/tutorial/templated/2015-06-27T00%3A00%3A00%2B00%3A00/1.log"
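A minimal sanity check that can be run inside the worker container to see whether the endpoint is reachable at all (a sketch, assuming boto3 is importable there, which Airflow's S3 hook already requires; the credential placeholders stand in for the values in the connection):

import socket
import boto3

host = "org-airflow-logs.s3.amazonaws.com"

# First confirm the container can even resolve the S3 endpoint's hostname.
try:
    print(socket.getaddrinfo(host, 443))
except socket.gaierror as exc:
    print("DNS resolution failed:", exc)

# Then try a lightweight S3 call with the same credentials the connection uses.
client = boto3.client(
    "s3",
    aws_access_key_id="...",      # same key as in the connection's extra field
    aws_secret_access_key="...",  # same secret as in the connection's extra field
)
print(client.head_bucket(Bucket="org-airflow-logs"))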
Any help is appreciated.
amazon-s3 airflow
I figured it out: this was a problem with the DNS configuration in the Docker containers, which caused intermittent connection failures.
– user3483651
Nov 21 '18 at 19:08
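A sketch of that kind of fix, assuming a docker-compose setup like the one the worker_1 log prefix implies (the service name and DNS server addresses are illustrative placeholders, not the asker's actual values):

worker:
  dns:
    - 8.8.8.8
    - 8.8.4.4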