Description
After a question on the forum about how to collect Docker container logs with Elastic Agent, I stumbled across a potential issue.
As we do not have a specific integration to collect Docker logs, we rely on custom-logs, which uses the `logfile` input type.
Leveraging the docker dynamic provider variables and setting the log path to `/var/lib/docker/containers/${docker.container.id}/*-json.log`, multiple inputs (one per container) are created in the agent as expected.
Although the log path on those inputs is correct, agent/filebeat does not collect the logs at all.
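For context, this is roughly how the docker dynamic provider is enabled in a standalone agent configuration. This is a minimal sketch; the exact `providers` layout and the `host` setting shown here are assumptions based on the standalone agent config format, not copied from the manifest used below:

```yaml
# Sketch: enabling the docker dynamic provider in a standalone agent.yml
# (assumed layout and socket path; adjust to the actual manifest).
providers:
  docker:
    enabled: true
    host: "unix:///var/run/docker.sock"  # assumed default Docker socket
```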
Steps to reproduce:
- Deploy the Elastic stack on Elastic Cloud with version 8.0.0-alpha.
- Set up a Kubernetes environment with the Docker runtime:
  `minikube start --kubernetes-version=v1.21.3 --container-runtime=docker`
- Configure `agent.yml` inside `elastic-agent-standalone-manifest.yaml` to collect the Docker container logs:
```yaml
- name: docker-log
  type: logfile
  use_output: default
  meta:
    package:
      name: log
      version: 0.4.6
  data_stream:
    namespace: default
  streams:
    - data_stream:
        dataset: generic
      paths:
        - /var/lib/docker/containers/${docker.container.id}/*-json.log
```
- Deploy the standalone agent with the image `docker.elastic.co/beats/elastic-agent:8.0.0-SNAPSHOT`. Filtering the logs in Kibana with `event.dataset:generic`, no logs are shown.
- Exec into the agent container and run `./elastic-agent -c /etc/agent.yml inspect output -o default | grep docker`; we can see one input per Docker container. This means that the `docker.container.id` variable is populated by the docker dynamic provider:
```
name: docker-log
- /var/lib/docker/containers/06fd6e8770a51b700f2abaa584479ab22e4733953edf30be919a8f00da0c6814/*-json.log
io_kubernetes_docker_type: container
maintainer: NGINX Docker Maintainers <[email protected]>
name: docker-log
- /var/lib/docker/containers/0a07c2e613a301b2669e3f90dde2dd454000ea3829644748c8a0a279114dc9b4/*-json.log
io_kubernetes_docker_type: container
name: docker-log
- /var/lib/docker/containers/0e30f4a7cc42ef16219a108911079fbe1734ffc2c7a19c06c995d173808dc936/*-json.log
io_kubernetes_docker_type: podsandbox
name: docker-log
- /var/lib/docker/containers/1f58c8b2eefe5889716786910544223a3072493e2eac7676b2aa2460eb199d63/*-json.log
io_kubernetes_docker_type: container
```
- Although those log paths are correct and full of logs, filebeat does not collect them or log anything suspicious.
- If we update the `agent.yml` configmap, removing the dynamic variable, and restart the agent, then logs are collected:
```yaml
- name: docker-log
  type: logfile
  use_output: default
  meta:
    package:
      name: log
      version: 0.4.6
  data_stream:
    namespace: default
  streams:
    - data_stream:
        dataset: generic
      paths:
        - /var/lib/docker/containers/*/*-json.log
```
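The only difference between the failing and the working configuration is the path. A minimal shell sketch of the concrete per-container path the variable is expected to expand to (container id copied from the inspect output above; this illustrates the expected expansion, it is not the agent's actual substitution code):

```shell
# The template used in agent.yml, single-quoted so the shell does not
# try to expand the ${...} placeholder itself:
template='/var/lib/docker/containers/${docker.container.id}/*-json.log'
echo "template: ${template}"

# One container id taken from the inspect output above:
container_id="06fd6e8770a51b700f2abaa584479ab22e4733953edf30be919a8f00da0c6814"

# The per-container path the provider should produce after substitution:
expanded="/var/lib/docker/containers/${container_id}/*-json.log"
echo "expanded: ${expanded}"
```

Since the inspect output shows exactly these expanded paths, the substitution itself clearly works; the failure happens somewhere after the inputs are generated.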
This is very strange, and it seems that either the `logfile` input is not working as expected (although I tested the same with the `filestream` input and it did not fix the problem), or the agent-filebeat communication may have a problem.
Or something else entirely.
Also, the documentation of the docker dynamic provider is outdated. Variables like `docker.id` do not exist; instead, the provider populates `docker.container.id`.
We can see the mapping it creates per event here.
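For reference, the per-container mapping the provider exposes looks roughly like the sketch below. Only `docker.container.id` and the `io_kubernetes_docker_type` label are confirmed by the output above; the other field names and the example values (`nginx`, the image tag) are assumptions and should be checked against the provider source linked here:

```yaml
# Sketch of the variables the docker dynamic provider exposes per container
# (field names beyond docker.container.id are assumed, values are examples).
docker:
  container:
    id: 06fd6e8770a51b700f2abaa584479ab22e4733953edf30be919a8f00da0c6814
    name: nginx          # assumed field
    image: nginx:latest  # assumed field
    labels:              # assumed field; label seen in the inspect output
      io_kubernetes_docker_type: container
```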