In kube-up there's an addon for using elasticsearch + fluentd for logging.
In line with the new addons documentation I've "converted" the vendor addon to one that _should_ work with kops and I'm ready to make a pull request (after I clean my code and fork).
However, it's unclear to me how I need to configure Kubernetes to make my fluentd image work. I'm assuming it needs to run my altered fluentd-elasticsearch Docker image during kops' node-creation process (and can't just be run as a pod inside Kubernetes), but I haven't been able to figure out how to do this properly in kops.
Would love some pointers, because I'd really like to have proper logging working ASAP :-)
@RXminuS What ES instance does the addon target? Does it run its own ES pod?
We just set up fluentd + ES logging, with our ES endpoint being an AWS ES hosted install. We opted for the hosted AWS ES because it's logically separated from Kubernetes, so if our Kubernetes setup goes _boom_, we can still get to our logs.
We modified this repository a bit:
https://github.com/Nordstrom/docker-fluentd-aws-elasticsearch
The tricky part was figuring out how to mount the volumes correctly. Here is our DaemonSet definition:
```yaml
apiVersion: extensions/v1beta1
kind: DaemonSet
metadata:
  name: docker-fluentd-aws-elasticsearch
spec:
  template:
    metadata:
      labels:
        app: docker-fluentd-aws-elasticsearch
    spec:
      containers:
      - name: docker-fluentd-aws-elasticsearch
        image: {{ image_repo_host }}shotwell_fluentd-to-aws-es:{{ git_sha }}
        volumeMounts:
        - mountPath: /var/log/
          name: var-log
        - mountPath: /var/lib/docker/
          name: var-lib-docker
        env:
        - name: AWS_ELASTICSEARCH_ENDPOINT
          value: {{ aws_elasticsearch_endpoint }}
        - name: AWS_REGION
          value: us-west-2
      volumes:
      - name: var-log
        hostPath:
          path: /var/log/
      - name: var-lib-docker
        hostPath:
          path: /var/lib/docker/
```
Note that the fluentd app is a pod like any other, but it's installed as a DaemonSet so one copy runs on every node.
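Both hostPath mounts matter because, with Docker's default `json-file` log driver, container logs live under `/var/lib/docker/containers/` and the kubelet typically exposes them via symlinks under `/var/log/containers/`, so fluentd needs both paths visible. As a minimal sketch (in Python, not fluentd itself; the function name is just for illustration), this is what one json-file record looks like and how it parses:

```python
import json

# Docker's json-file driver writes one JSON object per line, with
# "log" (the raw message), "stream" (stdout/stderr), and "time" keys.
sample = '{"log":"GET /healthz 200\\n","stream":"stdout","time":"2016-12-01T12:00:00.000000000Z"}'

def parse_docker_log_line(line):
    """Parse one json-file record into (timestamp, stream, message)."""
    record = json.loads(line)
    return record["time"], record["stream"], record["log"].rstrip("\n")

ts, stream, msg = parse_docker_log_line(sample)
print(ts, stream, msg)  # 2016-12-01T12:00:00.000000000Z stdout GET /healthz 200
```

A log collector like fluentd is essentially doing this tailing and parsing for every container on the node, then tagging each record with pod metadata before shipping it to Elasticsearch.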
@philipn I've made the Elasticsearch endpoint configurable (which works great if you create a "best-practice-kubernetes-elasticsearch-cluster"). In our case we use Elasticsearch for other things as well, so it made sense to point at the same cluster.
We were actually fine without a hosted ES instance, because we use EBS anyway so the data isn't lost.
Ah ok, I didn't know about DaemonSets! That solves the final puzzle piece; I'll add a pull request soon then :-)
I am going to close this, as it is a K8s issue and not a kops one.
I'd like to have a configuration available for the stuff that ships with kube-up, at least. Otherwise we're just swimming upstream.
Also see #1305
We have an example in addons - closing