How to ship logs from pods on Kubernetes running on GKE

Published: 2020-07-11 06:51

Question:

I run new modules of my system on Google Container Engine, and I would like to ship their stdout and stderr (they run in pods) to my centralized Logstash. Is there an easy way to forward logs from pods to an external logging service such as Logstash or Elasticsearch?

Answer 1:

I decided to log directly to Elasticsearch, an external virtual machine that can be accessed at elasticsearch.c.my-project.internal (I am on Google Cloud Platform). It is quite easy:

  1. Set up a Service of type ExternalName named elasticsearch-logging that points to the Elasticsearch instance:

    apiVersion: v1
    kind: Service
    metadata:
      name: elasticsearch-logging
      namespace: kube-system
      labels:
        k8s-app: elasticsearch
        kubernetes.io/name: "elasticsearch"
    spec:
      type: ExternalName
      externalName: elasticsearch.c.my-project.internal
      ports:
        - port: 9200
          targetPort: 9200

  2. Deploy fluentd-elasticsearch as a DaemonSet. fluentd-elasticsearch will automatically connect to the service named elasticsearch-logging (based on the fluentd-elasticsearch deployment definition):

    apiVersion: extensions/v1beta1
    kind: DaemonSet
    metadata:
      name: fluentd-elasticsearch
      namespace: kube-system
      labels:
        tier: monitoring
        app: fluentd-logging
        k8s-app: fluentd-logging
    spec:
      template:
        metadata:
          labels:
            name: fluentd-elasticsearch
        spec:
          containers:
            - name: fluentd-elasticsearch
              image: gcr.io/google_containers/fluentd-elasticsearch:1.19
              volumeMounts:
                - name: varlog
                  mountPath: /var/log
                - name: varlibdockercontainers
                  mountPath: /var/lib/docker/containers
                  readOnly: true
          terminationGracePeriodSeconds: 30
          volumes:
            - name: varlog
              hostPath:
                path: /var/log
            - name: varlibdockercontainers
              hostPath:
                path: /var/lib/docker/containers

    Use kubectl logs fluentd-elasticsearch-... to check whether you were able to connect to the Elasticsearch instance.

  3. Now you can access Kibana and see the logs.
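
If the fluentd pods cannot reach Elasticsearch, it can also help to confirm that the ExternalName service resolves from inside the cluster. A minimal sketch (the throwaway pod name and busybox image are my own choices, not part of the original setup):

kubectl run -it --rm dns-test --image=busybox --restart=Never -- \
  wget -qO- http://elasticsearch-logging.kube-system.svc.cluster.local:9200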



Answer 2:

You can create a sink from the logs in Stackdriver to Pub/Sub and then use the logstash-input-google_pubsub plugin, which ships all the logs to Elasticsearch via Logstash using the logstash-input-google_pubsub image (see the source code).

Export logs to Pub/Sub

  1. Create a topic and a subscription in Pub/Sub following the instructions here (see the CLI sketch after this list).

  2. In the Log Viewer page, click Create Export. Make sure the view is filtered to your app's logs (GKE Container -> cluster-name, app-name), enter a sink name, choose Cloud Pub/Sub as the Sink Service, and choose your topic as the Sink Destination.

Logs are exported to Pub/Sub from now on.
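
For reference, both steps can also be done from the gcloud CLI. A sketch using the names from the Logstash config below (my-gke-sink is a hypothetical sink name):

# Create the topic and a pull subscription for Logstash.
gcloud pubsub topics create elastic-pubsub-test
gcloud pubsub subscriptions create elastic-pubsub-test --topic=elastic-pubsub-test

# The export sink can be created here instead of in the Log Viewer UI.
# Adjust the filter to your cluster and app, and remember to grant the
# sink's writer identity publish permission on the topic.
gcloud logging sinks create my-gke-sink \
  pubsub.googleapis.com/projects/my-gcloud-project-id/topics/elastic-pubsub-test \
  --log-filter='resource.type="container"'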

Configure the Logstash pipeline

Here is the pubsub-elastic.conf file:

input {
    google_pubsub {
        project_id => "my-gcloud-project-id"
        topic => "elastic-pubsub-test"
        subscription => "elastic-pubsub-test"
        json_key_file => "/etc/logstash/gcloud-service-account-key.json"
    }
}


output {
    elasticsearch {
        hosts => "https://example.us-east-1.aws.found.io:9243"
        user => "elastic"
        password => "mypassword"
    }
}

Here is my Dockerfile:

FROM sphereio/logstash-input-google_pubsub


# Logstash config
COPY gcloud-service-account-key.json /etc/logstash/gcloud-service-account-key.json
COPY config /etc/logstash/conf.d
COPY logstash.yml /etc/logstash/logstash.yml
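
The logstash.yml referenced in the Dockerfile is not shown in the original; a minimal sketch that keeps the pipeline configs in the directory copied above would be:

http.host: "0.0.0.0"
path.config: /etc/logstash/conf.d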

Now you should build the image and run it.

If you are running on Kubernetes, use the following.

Here is deployment.yaml:

apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: logstash-input-google-pubsub
spec:
  replicas: 1
  strategy:
    type: RollingUpdate
  template:
    metadata:
      labels:
        app: logstash-input-google-pubsub
    spec:
      containers:
        - name: logstash-input-google-pubsub
          image: us.gcr.io/my-gcloud-project-id/logstash-input-google_pubsub:1.0.0

Build your image and push it to the registry:

docker build --rm -t us.gcr.io/my-gcloud-project-id/logstash-input-google_pubsub:1.0.0 . 
gcloud docker -- push us.gcr.io/my-gcloud-project-id/logstash-input-google_pubsub:1.0.0

Now create the Deployment: kubectl create -f deployment.yaml
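
To check that the pipeline came up, you can inspect the pods and their logs (a quick sanity check, assuming the labels from deployment.yaml above):

kubectl get pods -l app=logstash-input-google-pubsub
kubectl logs deploy/logstash-input-google-pubsub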

Done!



Answer 3:

Since Elasticsearch 6.0 you can use Filebeat.

See the blog post.

Download the Filebeat DaemonSet manifest:

curl -L -O https://raw.githubusercontent.com/elastic/beats/6.0/deploy/kubernetes/filebeat-kubernetes.yaml

Update the Elasticsearch connection details:

- name: ELASTICSEARCH_HOST
  value: elasticsearch
- name: ELASTICSEARCH_PORT
  value: "9200"
- name: ELASTICSEARCH_USERNAME
  value: elastic
- name: ELASTICSEARCH_PASSWORD
  value: changeme

Deploy it to Kubernetes:

kubectl create -f filebeat-kubernetes.yaml
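
To verify the rollout, something like the following should show the DaemonSet and its logs (the kube-system namespace and the k8s-app label are assumptions based on the manifest linked above):

kubectl --namespace=kube-system get daemonset filebeat
kubectl --namespace=kube-system logs -l k8s-app=filebeat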


Answer 4:

You could try installing the following Kubernetes addon: https://github.com/kubernetes/kubernetes/tree/master/cluster/addons/fluentd-elasticsearch
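
A sketch of installing it from a checkout of the kubernetes repo (the path mirrors the URL above; the manifests may change between versions):

git clone --depth 1 https://github.com/kubernetes/kubernetes.git
kubectl apply -f kubernetes/cluster/addons/fluentd-elasticsearch/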

I haven't tried it myself, but I'm also looking for proper logging. The GCE logging is somewhat limited, in my opinion.