{"id":146,"date":"2018-10-26T16:05:47","date_gmt":"2018-10-26T16:05:47","guid":{"rendered":"http:\/\/tonysbit.blog\/?p=146"},"modified":"2018-10-26T16:05:47","modified_gmt":"2018-10-26T16:05:47","slug":"the-l-in-elkdocker-scale-out-logging","status":"publish","type":"post","link":"https:\/\/tonysbit.blog\/?p=146","title":{"rendered":"The L in ELK+Docker Scale-out Logging"},"content":{"rendered":"

Warning:\u00a0<\/strong>This article assumes a basic understanding of<\/p>\n

    \n
  1. Docker<\/a><\/li>\n
  2. Elasticsearch<\/a><\/li>\n
  3. Logstash<\/a><\/li>\n<\/ol>\n

    Why Log to Elasticsearch?<\/h1>\n

    Elasticsearch is a fantastic tool for logging because it lets logs be treated as just another piece of time-series data. This matters for any organization’s journey through the evolution of data.<\/p>\n
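    To make this concrete, a log line stored in Elasticsearch is just a timestamped JSON document (the index fields below are hypothetical, for illustration only):<\/p>\n
    <pre>{\n  "@timestamp": "2018-10-26T16:05:47Z",\n  "level": "ERROR",\n  "container": "web-frontend",\n  "message": "connection refused"\n}<\/pre>\n
    Because every document carries a timestamp, the same queries and aggregations used for metrics apply directly to logs.<\/p>\n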

    This evolution can be outlined as the following:<\/p>\n

      \n
    1. Collection<\/strong>: Central collection of logs with the required indexing<\/li>\n
    2. Shallow Analysis<\/strong>: Real-time detection of specific data for event-based actions<\/li>\n
    3. Deep Analysis:<\/strong>\u00a0The study of trends using ML\/AI for pattern-driven actions<\/li>\n<\/ol>\n

      Data that is not purposely collected for this journey will simply be bits wandering through the abyss of computing purgatory without a meaningful destiny! In this article, we will discuss using Docker to scale out your Logstash deployment.<\/p>\n
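      Before scaling out, it helps to see what a single Logstash pipeline looks like. Below is a minimal sketch of a pipeline configuration; the <code>elasticsearch<\/code> hostname, the Beats port, and the index pattern are assumptions for illustration, not part of any particular deployment:<\/p>\n
      <pre>input {\n  beats {\n    port => 5044\n  }\n}\n\noutput {\n  elasticsearch {\n    hosts => ["http:\/\/elasticsearch:9200"]\n    index => "logs-%{+YYYY.MM.dd}"\n  }\n}<\/pre>\n
      Scaling out then amounts to running several identical containers from this one configuration, each pulling from the same input source and writing to the same cluster.<\/p>\n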

      The Challenges<\/h1>\n

      If you have ever used Logstash (LS) to push logs to Elasticsearch (ES), here are some of the challenges you may encounter:<\/p>\n