{"id":146,"date":"2018-10-26T16:05:47","date_gmt":"2018-10-26T16:05:47","guid":{"rendered":"http:\/\/tonysbit.blog\/?p=146"},"modified":"2018-10-26T16:05:47","modified_gmt":"2018-10-26T16:05:47","slug":"the-l-in-elkdocker-scale-out-logging","status":"publish","type":"post","link":"https:\/\/tonysbit.blog\/?p=146","title":{"rendered":"The L in ELK+Docker Scale-out Logging"},"content":{"rendered":"
Warning:\u00a0<\/strong>This article assumes a basic understanding of<\/p>\n Elasticsearch is a fantastic tool for logging, as it allows logs to be treated as just another piece of time-series data. This is important for any organization’s journey through the evolution of data.<\/p>\n This evolution can be outlined as the following:<\/p>\n Data that is not purposely collected for this journey will simply be bits wandering through the abyss of computing purgatory without a meaningful destiny! In this article we will discuss using Docker to scale out your Logstash deployment.<\/p>\n If you have ever used Logstash (LS) to push logs to Elasticsearch (ES), here are some of the challenges you may encounter:<\/p>\n When looking at solutions, the approach I take is:<\/p>\n Using Docker, a generic infrastructure can be deployed thanks to the abstraction between containers and the underlying OS (aside from the differences between Windows and Linux hosts).<\/p>\n Docker solves the challenges inherent in the LS deployment:<\/p>\n E.g.<\/strong> Let’s say you need to handle 1M logs per day, and you require 3 virtual machines so that you can tolerate the loss of at most 1 virtual machine.<\/p>\n Why not deploy 3 Logstash instances, each sized at 4 CPU and 8 GB RAM, straight onto the OS?<\/p>\n Let’s take a look at how this architecture looks:<\/p>\n When a node goes down, the resulting environment looks like this:<\/p>\n <\/p>\n An added bonus to this deployment: if you want to ship logs from Logstash to Elasticsearch for central, real-time monitoring of the logs, it’s as simple as adding Filebeat in the docker-compose file.<\/p>\n <\/p>\n What does the docker-compose file look like?<\/p>\n <\/code><\/p>\n As with most good things, there is a caveat. 
With Docker you do add another layer of complexity; however, I would argue that because the Docker images for Logstash are managed and maintained by Elastic, it actually reduces implementation headaches.<\/p>\n That said, I did find one big issue: routing UDP traffic within Docker.<\/p>\n\n
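One workaround for UDP trouble in Swarm-mode deployments (a sketch based on my reading of the issue, not a configuration stated in this article): publish the UDP port in host mode using the long port syntax (compose file format 3.2+), so traffic reaches the container directly instead of going through the ingress routing mesh. The port number and image tag below are illustrative assumptions.

```yaml
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:6.4.2  # tag is an assumption
    ports:
      # Long syntax, available from compose file format 3.2 onward.
      # mode: host binds the port on each node running a task,
      # bypassing the ingress routing mesh entirely.
      - target: 5000      # UDP input port inside the container (illustrative)
        published: 5000
        protocol: udp
        mode: host
```

The trade-off: with host-mode publishing you lose the mesh's any-node routing, so clients must target a node that is actually running a Logstash task.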
Why Log to Elasticsearch?<\/h1>\n
\n
The Challenges<\/h1>\n
\n
\n
The Solution<\/h1>\n
\n
\n
\n
\n
Architecture<\/h1>\n
version: '3.3'<\/p>\n
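Only the first line of the original compose file survives here, so below is a minimal sketch of what such a file might look like. The service name, image tag, mounted paths, port, and resource figures are illustrative assumptions, not the article's original values.

```yaml
version: '3.3'

services:
  logstash:
    # Official Logstash image, maintained by Elastic; the tag is an assumption.
    image: docker.elastic.co/logstash/logstash:6.4.2
    volumes:
      # Pipeline configuration mounted from the host (hypothetical path).
      - ./pipeline:/usr/share/logstash/pipeline
    ports:
      - "5044:5044"     # Beats input (illustrative)
    deploy:
      mode: replicated
      replicas: 3       # Spread across nodes to survive the loss of one
      resources:
        limits:
          cpus: '2'     # Smaller slices than the bare-OS 4 CPU / 8 GB sizing
          memory: 4G
```

Deployed with `docker stack deploy` against a Swarm, the scheduler restarts or reschedules failed replicas automatically, which is what gives this setup its resilience over three bare-OS Logstash installs.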
\n
The BUT!<\/h1>\n