Setting Up Elasticsearch + Logstash + Kibana

Elasticsearch:

Elasticsearch is a search engine that can index new documents in near real-time and make them immediately available for querying. Elasticsearch is based on Apache Lucene and allows for setting up clusters of nodes that store any number of indices in a distributed, fault-tolerant way. If a node disappears, the cluster will rebalance the (shards of) indices over the remaining nodes. You can configure how many shards make up each index and how many replicas of these shards there should be. If a primary shard goes offline, one of the replicas is promoted to primary and used to repopulate another node.
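
For example, both values can be set through the REST API when an index is created. A minimal sketch (the index name "myindex" here is just an illustration, not something used later in this guide):

Command: curl -XPUT 'http://localhost:9200/myindex' -d '{ "settings" : { "number_of_shards" : 3, "number_of_replicas" : 1 } }'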

Kibana:

Kibana is an open source analytics and visualization platform designed to work with Elasticsearch. You use Kibana to search, view, and interact with data stored in Elasticsearch indices. You can easily perform advanced data analysis and visualize your data in a variety of charts, tables, and maps.

Kibana makes it easy to understand large volumes of data. Its simple, browser-based interface enables you to quickly create and share dynamic dashboards that display changes to Elasticsearch queries in real time.

Logstash:

Logstash is a flexible, open source data collection, enrichment, and transport pipeline designed to efficiently process a growing list of log, event, and unstructured data sources for distribution into a variety of outputs. Kibana is a web interface that can be used to search and view the logs that Logstash has indexed. Both of these tools are based on Elasticsearch.
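
As a minimal sketch of how a Logstash pipeline is wired together (the filename hello.conf is just an example), the following configuration reads lines typed into the terminal and prints them back as structured events:

# hello.conf -- a minimal Logstash pipeline: stdin in, pretty-printed events out
input {
  stdin { }
}
output {
  stdout { codec => rubydebug }
}

Save it as hello.conf and run it with bin/logstash -f hello.conf; each line you type comes back annotated with a timestamp and host field.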

Step 1: Installing Elasticsearch.

1.A: Open the Linux terminal and switch to the root user.

Command: sudo su

1.B: Change to the home directory.

Command: cd

1.C: Download Elasticsearch using the following command.

Command: wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.6.0.noarch.rpm

1.D: Install Elasticsearch using the command below.

Command: rpm -ivh elasticsearch-1.6.0.noarch.rpm

1.E: Start Elasticsearch.

Command: service elasticsearch start

1.F: Check that Elasticsearch is responding on port 9200.

Command: curl localhost:9200
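
If Elasticsearch is running, this returns a small JSON status document along these lines (the node name and build details will differ on your machine):

{
  "status" : 200,
  "name" : "<node name>",
  "version" : {
    "number" : "1.6.0",
    ...
  },
  "tagline" : "You Know, for Search"
}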

1.G: Open the browser to access Elasticsearch.

URL: localhost:9200

1.H: Stop Elasticsearch.

Command: service elasticsearch stop

1.I: Change the directory to /usr/share/elasticsearch/.

Command: cd /usr/share/elasticsearch/

1.J: Install the elasticsearch-head plugin.

elasticsearch-head:

elasticsearch-head is a web front end for browsing and interacting with an Elasticsearch cluster.

General utility:

es-head has four major features:

  • A cluster overview, which shows the topology of your cluster and allows you to perform index- and node-level operations
  • A couple of search interfaces that allow you to query the cluster and retrieve results in raw JSON or tabular format
  • Several quick-access tabs that show the status of the cluster
  • An input section that allows arbitrary calls to the RESTful API to be made (see the example after this list). This interface includes several options that can be combined to produce interesting results:
    • Select request method (GET, PUT, POST, DELETE), JSON query data, node, and path
    • JSON validator
    • Ability to repeat requests on a timer
    • Ability to transform the result using JavaScript expressions
    • Ability to collect results over time (using the timer), or compare results
    • Ability to chart the transformed results in a simple bar graph (including time series)
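
The calls made from the input section are plain REST requests, so the same information is available with curl; for example, the cluster status shown in the quick-access tabs comes from the _cluster/health endpoint:

Command: curl 'http://localhost:9200/_cluster/health?pretty=true'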

1.J.1: Install the elasticsearch-head plugin using the following command.

Command: ./bin/plugin -install mobz/elasticsearch-head

1.K: Install the BigDesk plugin.

BigDesk:

BigDesk is a wonderful web app developed by Lukáš Vlček, installable as an Elasticsearch plugin, which allows monitoring and analyzing cluster status in real time.

With this application, it's possible to monitor both the cluster and the nodes on which Elasticsearch is running.

It's a modern HTML5 application, which requires only a modern browser.

1.K.1: Install the BigDesk plugin with the following command.

Command: ./bin/plugin -install lukas-vlcek/bigdesk

1.L: Start Elasticsearch.

Command: service elasticsearch start

1.M: Check the elasticsearch-head plugin in the browser.

URL: localhost:9200/_plugin/head/

1.N: Check the BigDesk plugin in the browser.

URL: localhost:9200/_plugin/bigdesk/

1.O: Insert documents into Elasticsearch.

1.O.1: Insert the first document.

Command: curl -XPUT 'http://localhost:9200/twitter/user/kiran' -d '{ "name" : "kiranmai" }'

1.O.2: Insert the second document.

Command: curl -XPUT 'http://localhost:9200/twitter/tweet/1' -d '{
  "user" : "kimchy",
  "postdate" : "2009-11-15T13:12:00",
  "message" : "Trying out Elasticsearch, so far so good?"
}'

1.O.3: Insert the third document.

Command: curl -XPUT 'http://localhost:9200/twitter/tweet/2' -d '{
  "user" : "kiran",
  "postdate" : "2015-06-12T12:32:00",
  "message" : "Another tweet, will it be indexed?"
}'

1.P: Get a document from Elasticsearch on the command line.

Command: curl -XGET 'http://localhost:9200/twitter/user/kiran?pretty=true'
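
The response wraps the stored document in Elasticsearch metadata, roughly as follows:

{
  "_index" : "twitter",
  "_type" : "user",
  "_id" : "kiran",
  "_version" : 1,
  "found" : true,
  "_source" : { "name" : "kiranmai" }
}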

1.Q: Open the elasticsearch-head plugin in the browser to see the indices created in the previous steps.

URL: localhost:9200/_plugin/head/

Click on the "Indices" tab to check the indices.

Click on the "Browser" tab to check all the information related to the indices.
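
The same information is available from the command line; a _search request with no query returns every document in the index:

Command: curl -XGET 'http://localhost:9200/twitter/_search?pretty=true'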

Step 2: Installing Logstash.

2.A: Download Logstash using the command below.

Command: wget http://download.elastic.co/logstash/logstash/logstash-1.5.1.tar.gz

2.B: Extract the downloaded package.

Command: tar xzf logstash-1.5.1.tar.gz

2.C: Change the directory to "logstash-1.5.1".

Command: cd logstash-1.5.1

2.D: Create a configuration file to pull the Twitter data.

Command: vi twitter.conf

Note: Here "twitter.conf" is the filename; you can use any name you like.

Paste the following configuration into the file.

input {
  twitter {
    # add your Twitter API credentials here
    consumer_key => "<your Twitter consumer key>"
    consumer_secret => "<your Twitter consumer secret>"
    oauth_token => "<your Twitter OAuth token>"
    oauth_token_secret => "<your Twitter OAuth token secret>"
    # specify any keywords you want to match against the Twitter stream
    keywords => ["elasticsearch"]
    full_tweet => true
  }
}

output {
  elasticsearch_http {
    host => "localhost"
    # you can specify any index name and type you want
    index => "conf"
    index_type => "tweet"
  }
}
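
While testing, it can help to see each event in the terminal as well. As a sketch, Logstash's stdout output with the rubydebug codec can be added alongside elasticsearch_http; this is optional and purely for debugging:

output {
  elasticsearch_http {
    host => "localhost"
    index => "conf"
    index_type => "tweet"
  }
  # also print every event to the terminal while debugging
  stdout { codec => rubydebug }
}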

2.E: Run Logstash using the following command.

Command: bin/logstash agent -f twitter.conf

Note: "twitter.conf" is the name of the configuration file you created in the previous step; substitute whatever name you chose.
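
Once tweets start flowing, you can confirm that documents are arriving without leaving the terminal (replace "conf" with your index name if you changed it):

Command: curl 'http://localhost:9200/conf/_count?pretty=true'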

2.F: Check the Twitter data in Elasticsearch using the browser.

URL: localhost:9200/_plugin/head/

The index "conf" is created with the data from Twitter. If you specified a different index name in the configuration file, that name will appear instead of "conf".

Click on the “Browser” tab.

Click on the "Structured Query" tab.

Step 3: Installing Kibana.

3.A: Download Kibana using the command below.

Command: wget https://download.elasticsearch.org/kibana/kibana/kibana-4.1.0-linux-x64.tar.gz

3.B: Extract the downloaded package.

Command: tar xzf kibana-4.1.0-linux-x64.tar.gz

3.C: Change the directory to "kibana-4.1.0-linux-x64".

Command: cd kibana-4.1.0-linux-x64

3.D: Modify the kibana.yml file.

Command: vi config/kibana.yml

Change the value of host: "0.0.0.0" to host: "localhost".
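
The relevant line in config/kibana.yml should end up looking like this:

# bind Kibana to the loopback interface only
host: "localhost"

If Elasticsearch is running on a different machine, the elasticsearch_url setting in the same file is where Kibana is pointed at the cluster.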

Save the file and exit.

3.E: Run Kibana.

Command: nohup ./bin/kibana &
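
Because Kibana was started with nohup, its output is appended to nohup.out in the current directory; tailing that file is a quick way to confirm it started cleanly:

Command: tail -f nohup.out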

3.F: Check Kibana in the web browser.

Open the browser and go to the following URL.

URL: localhost:5601

On the settings screen, replace the default "logstash-*" pattern with the name of the index you created in Elasticsearch.

Click on "Create".

Click on the "Discover" tab.

Select the index that we created in the previous step from the left side of the screen.

Click on the time picker at the top right of the screen to set the time range for the records.

It will display a chart of the existing records in the index.

Save it.

Enter a name for the saved search.

Click on the "Save" button.

Then click on the "Visualize" tab.

Select any option from the list.

Save the visualization with any name.

Click on the "Dashboard" tab.

Click on the "+" icon on the right side of the screen.

Select the saved visualizations from the drop-down list to add them to the dashboard.
