Now when it comes to best practices, I was debating whether or not to set up a dedicated Logstash server. I feel as though the CPU load will creep up, and doing triple duty of Elasticsearch, Kibana, and Logstash could cause some serious load. If I need to change that later, I will. Logstash is pretty simple in that it uses config files to enable the ingest, so it's easy enough to move them later.
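As a sketch of what one of those config files might look like (the filename, the UDP port, and the grok pattern here are my assumptions, not anything final), a minimal pipeline that listens for syslog and ships events to the local Elasticsearch node:

```shell
# Minimal Logstash pipeline sketch (assumed filename, port, and pattern).
# Written locally here for illustration; on the server it would go in
# /etc/logstash/conf.d/, where Logstash picks up every config file.
cat > 10-syslog.conf <<'EOF'
input {
  udp {
    port => 5514        # point syslog senders at this port
    type => "syslog"
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]   # the Elasticsearch node from part one
  }
}
EOF
```

Because Logstash reads everything in conf.d, moving ingest to a dedicated server later really is just a matter of copying these files over.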
I am also installing this as a service, as it will be much easier to manage. To keep everything consistent, I plan on using the repo method of install as well.
The repo install guide can be found here.
The first step is importing the public signing key.
rpm --import https://packages.elastic.co/GPG-KEY-elasticsearch
Create a repo file in the /etc/yum.repos.d folder.
Add the following (the full repo definition from the Elastic repo install guide):
[logstash-2.3]
name=Logstash repository for 2.3.x packages
baseurl=https://packages.elastic.co/logstash/2.3/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1
Once that file is saved, we can go ahead and run the yum installer.
yum install logstash
The final two steps are starting the service and configuring it to start on reboot.
service logstash start
chkconfig logstash on
That's pretty much it from this end. In the next post we will walk through actually getting the data and starting to get it into an index. Once the data is in an index, we can start carving it up.
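One note ahead of that: by default Logstash writes each day's events to its own date-stamped index (logstash-%{+YYYY.MM.dd} in Logstash's own syntax), which is part of what makes carving the data up by time range cheap later. Today's index name, computed in shell terms:

```shell
# Today's default Logstash index name (one index per day).
date +"logstash-%Y.%m.%d"
```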
Kibana is the data visualization layer for Elasticsearch, allowing you to search particular data sets and build dashboards.
I have been pretty impressed with the Kibana UI; it makes it easy to carve up just about any data. Most of the visualizations are pretty self-explanatory once you get into it a little bit. We will go over those in more detail later as we start building the actual dashboard. For now, let's get Kibana installed:
Just like the initial Elasticsearch install, I will leverage the yum repos and disable them after setup is complete.
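Disabling a repo afterwards is just flipping the `enabled` flag in its repo file. A sketch on a local stand-in copy (on the server the real file lives in /etc/yum.repos.d/):

```shell
# Local stand-in for /etc/yum.repos.d/logstash.repo so this runs anywhere.
printf '[logstash-2.3]\nenabled=1\n' > logstash.repo

# Flip the repo off so a routine 'yum update' won't pull new packages.
sed -i 's/^enabled=1/enabled=0/' logstash.repo

grep '^enabled' logstash.repo   # → enabled=0
```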
The repo install guide for CentOS can be found here.
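For reference, the repo file follows the same shape as the Logstash one. The version here is my assumption (Kibana 4.5 is the release that pairs with Elasticsearch 2.3); written to a local file for illustration, on the server it belongs in /etc/yum.repos.d/:

```shell
# Kibana yum repo definition (Kibana 4.5 assumed to match ES 2.3).
# On the server this would be /etc/yum.repos.d/kibana.repo.
cat > kibana.repo <<'EOF'
[kibana-4.5]
name=Kibana repository for 4.5.x packages
baseurl=https://packages.elastic.co/kibana/4.5/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1
EOF
```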
The first part was setting up an Elasticsearch server. I thought about creating a “cluster”; however, most home labs won't need that level of redundancy. So the first thing to do is set up Elasticsearch itself and pick the components I will use alongside it.
The components we plan on using:
- Elasticsearch – an open source search and analytics engine. Designed to scale horizontally, be reliable, and be easy to manage.
- Kibana – the data visualization component that will help build the dashboard; it is easy to add on top of the search engine.
- Logstash – used to collect logs from multiple inputs, add them to the engine, and organize them as you see fit.
There are some other components, like Beats, that I will add later; as of right now this is pretty much the list.
I have been building out my “home lab” to become more robust, allowing me to try out pretty much any new technology. With that being said, I need to understand what's going on in the environment. I looked at many open source monitoring applications, and recently started looking at Elasticsearch and some of its capabilities. Turns out it gives me quite the ability to build my own dashboard with the information I want.
After googling and looking around on Reddit, I found a number of users putting together their own home lab dashboards. Many liked the view and information that Elasticsearch provides, but they were looking for how-tos on carving up the data.
So this will be a multi-part series that walks through how to get the data from Elasticsearch into a usable dashboard. The first blog post is an ambitious one – but it should be good.