Now, when it comes to best practices, I debated whether or not to set up a dedicated Logstash server. I suspect the CPU load will creep up, and having one box pull triple duty running Elasticsearch, Kibana, and Logstash could cause some serious load. If I need to change that later, I will. Logstash itself is pretty simple: it uses config files to define the ingest pipelines, so moving them to another server later is easy enough.
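To give a sense of what those config files look like, here is a minimal pipeline sketch. The file name, port, grok pattern, and Elasticsearch host here are illustrative assumptions, not settings from this setup:

```conf
# /etc/logstash/conf.d/syslog.conf -- hypothetical example pipeline
input {
  # Listen for syslog traffic on an assumed TCP port
  tcp {
    port => 5514
    type => "syslog"
  }
}

filter {
  # Parse each raw syslog line into structured fields
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
  }
}

output {
  # Ship events to a local Elasticsearch instance (assumed host/port)
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

Because each pipeline lives in its own file under conf.d, relocating the ingest to a dedicated server later is mostly a matter of copying these files over.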
I am also installing it as a service, since that is much easier to manage. To keep everything consistent, I plan on using the repo method of install as well.
The repo install guide can be found here.
The first step is downloading the public signing key.
rpm --import https://packages.elastic.co/GPG-KEY-elasticsearch
Create a repo file (for example, logstash.repo) in the /etc/yum.repos.d folder.
Add the following:
[logstash-2.3]
name=Logstash repository for 2.3.x packages
baseurl=https://packages.elastic.co/logstash/2.3/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1
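If you would rather script the repo setup, the file can be written in one shot with a heredoc. This sketch writes to ./logstash.repo as a stand-in; on a real host you would point REPO_FILE at /etc/yum.repos.d/logstash.repo instead:

```shell
#!/bin/sh
# Sketch: generate the Logstash 2.3.x yum repo file.
# REPO_FILE defaults to the current directory for illustration;
# set it to /etc/yum.repos.d/logstash.repo on an actual server.
REPO_FILE="${REPO_FILE:-./logstash.repo}"

cat > "$REPO_FILE" <<'EOF'
[logstash-2.3]
name=Logstash repository for 2.3.x packages
baseurl=https://packages.elastic.co/logstash/2.3/centos
gpgcheck=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
enabled=1
EOF

echo "wrote $REPO_FILE"
```

With gpgcheck=1, yum will verify packages against the signing key imported earlier.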
Once that file is saved, we can go ahead and run the yum installer.
yum install logstash
The final two steps are starting the service and configuring it to start on reboot.
service logstash start
chkconfig logstash on
That's pretty much it from this end. In the next post we will walk through actually getting the data and loading it into an index. Once the data is in an index, we can start carving it up.