Support #416

Updated by Daniel Curtis over 8 years ago

This is a guide for installing Logstash with Kibana, Elasticsearch, and Nginx on Debian 8.

 h2. Prepare the Environment 

 * Make sure the system is up to date; 
 <pre> 
 sudo apt-get update && sudo apt-get upgrade 
 </pre> 

 * Install openjdk: 
 <pre> 
 sudo apt-get install openjdk-7-jdk 
 </pre> 

 * Run the following command to import the Elasticsearch public GPG key into apt: 
 <pre> 
 wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add - 
 </pre> 

 h2. Install Elasticsearch 

 * Create the Elasticsearch source list: 
 <pre> 
 echo "deb http://packages.elastic.co/elasticsearch/2.x/debian stable main" | sudo tee -a /etc/apt/sources.list.d/elasticsearch-2.x.list 
 </pre> 
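Note that @tee -a@ appends unconditionally, so re-running the command above would add a duplicate repository line. A guarded variant, sketched here against a scratch file (on the server the target is /etc/apt/sources.list.d/elasticsearch-2.x.list, written with sudo tee), only appends when the line is missing:

```shell
# Guarded append: only add the repo line if it is not already present,
# so the command is safe to re-run. Scratch file shown for illustration.
list=$(mktemp)
repo='deb http://packages.elastic.co/elasticsearch/2.x/debian stable main'
grep -qxF "$repo" "$list" || echo "$repo" >> "$list"
grep -qxF "$repo" "$list" || echo "$repo" >> "$list"   # second run is a no-op
cat "$list"
```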

 * Update your apt package database: 
 <pre> 
 sudo apt-get update 
 </pre> 

 * Install Elasticsearch: 
 <pre> 
 sudo apt-get install elasticsearch 
 </pre> 

 * Elasticsearch is now installed. Let's edit the configuration: 
 <pre> 
 sudo vi /etc/elasticsearch/elasticsearch.yml 
 </pre> 
 #* Add the following line somewhere in the file, to disable dynamic scripts: 
 <pre> 
 script.disable_dynamic: true 
 </pre> 
 #* You will also want to restrict outside access to your Elasticsearch instance (port 9200) so that outsiders can't read your data or shut down your Elasticsearch cluster through the HTTP API. Find the line that specifies network.bind_host and uncomment it so it looks like this: 
 <pre> 
 network.bind_host: localhost 
 </pre> 
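The two edits above can also be scripted. This is a sketch against a scratch copy whose starting content is an assumption standing in for the commented-out defaults in the shipped file; on the server you would point the same @sed@ and @grep@ commands at /etc/elasticsearch/elasticsearch.yml with sudo:

```shell
# Scratch copy; the starting lines are assumed stand-ins for the defaults.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
# network.bind_host: 192.168.0.1
# http.port: 9200
EOF

# Uncomment network.bind_host and bind it to localhost only
sed -i 's|^# *network\.bind_host:.*|network.bind_host: localhost|' "$cfg"
# Append the dynamic-scripting setting only if it is not already there
grep -q '^script\.disable_dynamic:' "$cfg" || echo 'script.disable_dynamic: true' >> "$cfg"

cat "$cfg"
```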

 * Now start Elasticsearch: 
 <pre> 
 sudo service elasticsearch restart 
 </pre> 

 * Then run the following commands to start Elasticsearch on boot: 
 <pre> 
 sudo systemctl daemon-reload 
 sudo systemctl enable elasticsearch.service 
 </pre> 

 h2. Install Logstash 

 * The Logstash package is available from the same repository as Elasticsearch, and we already installed that public key, so let's create the Logstash source list: 
 <pre> 
 echo "deb http://packages.elastic.co/logstash/2.0/debian stable main" | sudo tee -a /etc/apt/sources.list.d/logstash-2.0.list 
 </pre> 

 * Update your apt package database: 
 <pre> 
 sudo apt-get update 
 </pre> 

 * Install Logstash: 
 <pre> 
 sudo apt-get install logstash 
 </pre> 

 h3. Configure Logstash 

 * Now let's create a configuration file called 10-syslog.conf, where we will add a filter for syslog messages: 
 <pre> 
 sudo vi /etc/logstash/conf.d/10-syslog.conf 
 </pre> 
 *# Insert the following syslog filter configuration: 
 <pre> 
 filter { 
   if [type] == "syslog" { 
     grok { 
       match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" } 
       add_field => [ "received_at", "%{@timestamp}" ] 
       add_field => [ "received_from", "%{host}" ] 
     } 
     syslog_pri { } 
     date { 
       match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ] 
     } 
   } 
 } 
 </pre> 

 Save and quit. This filter looks for logs that are labeled as "syslog" type (by a Logstash Forwarder) and tries to use "grok" to parse incoming syslog logs to make them structured and queryable. 
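To see roughly which pieces of a raw syslog line those grok patterns capture, here is an illustration only: grok does the real parsing inside Logstash, and the sample line and @sed@ expression below are just a rough shell mimic of the field split.

```shell
# A representative syslog line (hypothetical sample data)
line='Dec  1 10:15:32 webserver sshd[1234]: Failed password for root'
# Rough shell equivalent of the grok split: timestamp, host, program, pid, message
echo "$line" | sed -E \
  's/^([A-Z][a-z]{2} +[0-9]+ [0-9:]+) ([^ ]+) ([^ []+)\[([0-9]+)\]: (.*)$/timestamp=\1 | host=\2 | program=\3 | pid=\4 | message=\5/'
```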

 * Restart Logstash to put our configuration changes into effect: 
 <pre> 
 sudo service logstash restart 
 </pre> 

 h2. Install Kibana 

 * Download Kibana to your home directory with the following command: 
 <pre> 
 cd ~; wget http://download.elasticsearch.org/kibana/kibana/kibana-latest.zip 
 </pre> 

 * Install unzip so you can extract the kibana archive: 
 <pre> 
 sudo apt-get install unzip 
 </pre> 

 * Extract Kibana archive with unzip: 
 <pre> 
 unzip kibana-latest.zip 
 </pre> 

 * Open the Kibana configuration file for editing: 
 <pre> 
 sudo vi ~/kibana-latest/config.js 
 </pre> 
 #* In the Kibana configuration file, find the line that specifies the elasticsearch server, and replace the port number (9200 by default) with 80: 
 <pre> 
 elasticsearch: "http://"+window.location.hostname+":80", 
 </pre> 

 This is necessary because we are planning on accessing Kibana on port 80. 
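The same change can be made non-interactively with @sed@. The sketch below runs against a scratch file containing the assumed default line; on the server, the target is ~/kibana-latest/config.js:

```shell
# Scratch copy with the assumed default elasticsearch line from config.js
cfg=$(mktemp)
echo 'elasticsearch: "http://"+window.location.hostname+":9200",' > "$cfg"
# Swap the Elasticsearch port for 80, since Kibana will be served behind Nginx
sed -i 's/:9200",/:80",/' "$cfg"
cat "$cfg"
```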

 * Create a directory with the following command: 
 <pre> 
 sudo mkdir -p /var/www/kibana 
 </pre> 

 * Now copy the Kibana files into your newly-created directory: 
 <pre> 
 sudo cp -R ~/kibana-latest/* /var/www/kibana/ 
 </pre> 

 Before we can use the Kibana web interface, we have to install Nginx.  

 h3. Install Nginx 

 * Use apt to install Nginx: 
 <pre> 
 sudo apt-get install nginx 
 </pre> 

 * Download the sample Nginx configuration from Kibana's github repository to your home directory: 
 <pre> 
 cd ~; wget https://github.com/elasticsearch/kibana/raw/master/sample/nginx.conf 
 </pre> 

 * Open the sample configuration file for editing: 
 <pre> 
 vi nginx.conf 
 </pre> 
 #* Find and change the values of server_name (to your FQDN, or localhost if you aren't using a domain name) and root (to where we installed Kibana) so they look like the following entries: 
 <pre> 
 server_name logstash.example.com; 
 root /var/www/kibana; 
 </pre> 
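If you prefer to script this edit too, a @sed@ sketch follows; the two starting values in the scratch file are assumptions for illustration, so adjust the patterns to match what you actually see in the downloaded nginx.conf:

```shell
# Scratch copy standing in for the downloaded sample nginx.conf; the two
# starting values are assumed for illustration.
conf=$(mktemp)
printf 'server_name kibana;\nroot /usr/share/kibana3;\n' > "$conf"
# Point server_name at our FQDN and root at the Kibana install directory
sed -i -e 's|^server_name .*|server_name logstash.example.com;|' \
       -e 's|^root .*|root /var/www/kibana;|' "$conf"
cat "$conf"
```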

 * Save and exit. Now copy it over your Nginx default server block with the following command: 
 <pre> 
 sudo cp nginx.conf /etc/nginx/sites-available/default 
 </pre> 

 * Now restart Nginx to put our changes into effect: 
 <pre> 
 sudo service nginx restart 
 </pre> 

 Kibana is now accessible via your FQDN or the public IP address of your Logstash Server, e.g. http://logstash.example.com/. If you go there in a web browser, you should see a Kibana welcome page that will allow you to view dashboards, but there will be no logs to view because Logstash has not been set up yet. Let's do that now. 

 h2. Generate SSL Certificates 

 Since we are going to use Logstash Forwarder to ship logs from our servers to our Logstash Server, we need to create an SSL certificate and key pair. The certificate is used by the Logstash Forwarder to verify the identity of the Logstash Server. 

 * Create the directories that will store the certificate and private key with the following commands: 
 <pre> 
 sudo mkdir -p /etc/pki/tls/certs 
 sudo mkdir /etc/pki/tls/private 
 </pre> 

 * Now generate the SSL certificate and private key, in the appropriate locations (/etc/pki/tls/...), with the following command: 
 <pre> 
 cd /etc/pki/tls; sudo openssl req -x509 -batch -nodes -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt 
 </pre> 
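It is worth sanity-checking the result before copying it anywhere. The sketch below generates a throwaway pair in a temp directory so it can run without root (the @-subj@ value is a stand-in), then prints the certificate's subject and validity window; run the same @openssl x509@ inspection against certs/logstash-forwarder.crt on the server:

```shell
# Generate a throwaway self-signed pair in a temp dir (stand-in for the
# real /etc/pki/tls files) and inspect the certificate.
tmp=$(mktemp -d)
openssl req -x509 -batch -nodes -newkey rsa:2048 \
  -keyout "$tmp/logstash-forwarder.key" \
  -out "$tmp/logstash-forwarder.crt" \
  -subj "/CN=logstash.example.com" 2>/dev/null
# Confirm the certificate parses; show who it identifies and for how long
openssl x509 -in "$tmp/logstash-forwarder.crt" -noout -subject -dates
```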

 The logstash-forwarder.crt file will be copied to all of the servers that will send logs to Logstash, but we will do that a little later. 


 h2. Connect to Kibana 

 When you are finished setting up Logstash Forwarder on all of the servers that you want to gather logs for, let's look at Kibana, the web interface that we installed earlier. 

 In a web browser, go to the FQDN or public IP address of your Logstash Server. You should see a Kibana welcome page. 

 Click on Logstash Dashboard to go to the premade dashboard. You should see a histogram with log events, with log messages below (if you don't see any events or messages, one of your four Logstash components is not configured properly). 

 Here, you can search and browse through your logs. You can also customize your dashboard. 

 Try the following things: 
 * Search for "root" to see if anyone is trying to log into your servers as root 
 * Search for a particular hostname 
 * Change the time frame by selecting an area on the histogram or from the menu above 
 * Click on messages below the histogram to see how the data is being filtered 

 Kibana has many other features, such as graphing and filtering, so feel free to poke around! 

 h2. Conclusion 

 Now that your syslogs are centralized via Logstash, and you are able to visualize them with Kibana, you should be off to a good start with centralizing all of your important logs. Remember that you can send pretty much any type of log to Logstash, but the data becomes even more useful if it is parsed and structured with grok. 

 Note that your Kibana dashboard is accessible to anyone who can access your server, so you will want to secure it with something like htaccess.

 h2. Resources 

 * https://www.elastic.co/guide/en/kibana/current/setup.html 
 * https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-repositories.html 
 * https://www.elastic.co/guide/en/logstash/current/package-repositories.html 
 * https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04 
