ElasticSearch Tutorial: Creating an Index, Updating an Index, Getting a Value by Index Search, and Deleting an Index
What is ElasticSearch??
Why do we need it??
Where is ElasticSearch used??
A lot of questions..... huh??
So let's start with minimal answers, say "short answers". I also assume that you have heard of ElasticSearch but don't know how to code against it in Java.
ElasticSearch is nothing but a full-text search engine with an HTTP web interface. ElasticSearch is developed in Java, and it supports distributed operation as well.
We need it because its search is NRT (Near Real Time), which means searching is significantly faster than querying a database. It is schema-free, and data is stored and retrieved in JSON format. It also offers high availability and is easy to scale.
Large systems like GitHub, Facebook, StackOverflow, etc. use ElasticSearch.
The architecture is like this: there are clusters; under a cluster there are nodes, and under the nodes there are documents (grouped under an index). To insert data we need to specify the cluster name as well as the node name, then create an index, an index type, and last of all an id. Under that id we can store data. Feeling dizzy, huh?? :D
Let's see the picture:
Fig 1: Elastic Search Architecture
Well, ElasticSearch has two clients:
- Transport Client.
- Node Client.
Now let's go straight to the installation and coding part.
First you need to install Java. For this example I used Java 1.7. To check that Java is properly installed, type java -version in your command window.
Now download ElasticSearch from here.
After unzipping the package you will find a folder "elasticsearch-XXXX" (XXXX is the version).
Create a project in NetBeans and include all the jars located in "elasticsearch/lib" in the classpath.
Now say we have a User object with fields "id", "userName", and "email". We will insert that object into ElasticSearch, then search, update, and delete it.
Here I use ElasticSearch version 2.1.0.
My project structure looks like this:
Fig 2: project structure
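The User bean itself is never shown in this article; a minimal sketch of what it presumably looks like (field names "id", "userName", and "email" and the accessors are inferred from the calls made in the service and main classes) is:

```java
// Sketch of the User bean assumed by the rest of the code. In the real
// project it would live in its own file under com.elasticsearch.beans.
public class User {

    private String id;
    private String userName;
    private String email;

    public String getId() { return id; }

    public void setId(String id) { this.id = id; }

    public String getUserName() { return userName; }

    public void setUserName(String userName) { this.userName = userName; }

    public String getEmail() { return email; }

    public void setEmail(String email) { this.email = email; }
}
```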
First, connect to ElasticSearch using your host IP and port with the following code:
public void connect(String host, int port) {
    try {
        final Settings settings = Settings.settingsBuilder()
                .put("path.home", "F:\\elasticsearch-2.1.0")
                .put("node.name", "node-1")
                .put("cluster.name", "elasticsearch")
                .build();
        if (client == null) {
            // the Java API talks to port 9300, not the HTTP port 9200
            client = TransportClient.builder().settings(settings).build()
                    .addTransportAddress(new InetSocketTransportAddress(
                            InetAddress.getByName(host), port));
        }
    } catch (Exception e) {
        Logger.getLogger(ServiceManagerImpl.class.getName()).log(Level.SEVERE,
                "Can not connect to " + host + ":" + port, e);
    }
}
Here "client" is an object of TransportClient. When we connect to ElasticSearch from a program we should use port 9300, and when we access it from a browser we should use port 9200. Since data is stored and retrieved in JSON format, before creating an index we need to convert our object to JSON. Here I used the Jackson ObjectMapper; the object-to-JSON conversion code is given below:
private String getJSON(Object o) {
    ObjectMapper mapper = new ObjectMapper();
    // serialize all fields, even private ones without getters
    mapper.setVisibility(JsonMethod.FIELD, JsonAutoDetect.Visibility.ANY);
    try {
        return mapper.writeValueAsString(o);
    } catch (IOException ex) {
        Logger.getLogger(ServiceManagerImpl.class.getName()).log(Level.SEVERE, null, ex);
        return null;
    }
}
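To see what actually ends up in ElasticSearch, it helps to look at the JSON that getJSON produces. The tiny stand-alone sketch below (no Jackson involved, just plain string formatting; the field values are made up) shows the document shape that gets stored as the source under INDEX/TYPE/ID:

```java
// Illustration only: builds by hand the same JSON document shape that the
// Jackson ObjectMapper produces for a User with fields id, userName, email.
class JsonShapeDemo {

    static String toJson(String id, String userName, String email) {
        return String.format("{\"id\":\"%s\",\"userName\":\"%s\",\"email\":\"%s\"}",
                id, userName, email);
    }

    public static void main(String[] args) {
        // prints {"id":"5","userName":"munna5","email":"munna@munna.com5"}
        System.out.println(toJson("5", "munna5", "munna@munna.com5"));
    }
}
```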
All the methods of the service implementation class are given below:
package com.elasticsearch.dao.impl;

import com.elasticsearch.beans.User;
import com.elasticsearch.dao.ServiceManagerInterface;
import java.io.IOException;
import java.net.InetAddress;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.codehaus.jackson.annotate.JsonAutoDetect;
import org.codehaus.jackson.annotate.JsonMethod;
import org.codehaus.jackson.map.ObjectMapper;
import org.elasticsearch.action.bulk.BulkRequestBuilder;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.get.GetResponse;
import org.elasticsearch.action.update.UpdateRequest;
import org.elasticsearch.action.update.UpdateResponse;
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

/**
 * @author Ataur Rahman
 */
public class ServiceManagerImpl implements ServiceManagerInterface {

    private static final String INDEX = "tigerit";
    private static final String TYPE = "user";

    Client client;

    @Override
    public BulkResponse addUser(List<User> userList) {
        BulkRequestBuilder bulkRequest = client.prepareBulk();
        if (userList != null && !userList.isEmpty()) {
            for (User user : userList) {
                // index each user as a JSON document under INDEX/TYPE/id
                String data = getJSON(user);
                bulkRequest.add(client.prepareIndex(INDEX, TYPE, user.getId()).setSource(data));
            }
            BulkResponse bulkResponse = bulkRequest.execute().actionGet();
            if (!bulkResponse.hasFailures()) {
                return bulkResponse;
            }
        }
        return null;
    }

    @Override
    public GetResponse getUser(String id) {
        GetResponse response = client.prepareGet(INDEX, TYPE, id).get();
        if (response != null) {
            System.out.println("ID " + response.getId());
            System.out.println("INDEX " + response.getIndex());
            System.out.println("TYPE " + response.getType());
            System.out.println(getJSON(response.getSource()));
        }
        return response;
    }

    @Override
    public UpdateResponse updateUser(User user) {
        UpdateRequest updateRequest = new UpdateRequest();
        updateRequest.index(INDEX);
        updateRequest.type(TYPE);
        updateRequest.id(user.getId());
        updateRequest.doc(getJSON(user));
        try {
            return client.update(updateRequest).get();
        } catch (InterruptedException ex) {
            Logger.getLogger(ServiceManagerImpl.class.getName()).log(Level.SEVERE, null, ex);
        } catch (ExecutionException ex) {
            Logger.getLogger(ServiceManagerImpl.class.getName()).log(Level.SEVERE, null, ex);
        }
        return null;
    }

    @Override
    public void connect(String host, int port) {
        try {
            final Settings settings = Settings.settingsBuilder()
                    .put("path.home", "F:\\elasticsearch-2.1.0")
                    .put("node.name", "node-1")
                    .put("cluster.name", "elasticsearch")
                    .build();
            if (client == null) {
                // the Java API talks to port 9300, not the HTTP port 9200
                client = TransportClient.builder().settings(settings).build()
                        .addTransportAddress(new InetSocketTransportAddress(
                                InetAddress.getByName(host), port));
            }
        } catch (Exception e) {
            Logger.getLogger(ServiceManagerImpl.class.getName()).log(Level.SEVERE,
                    "Can not connect to " + host + ":" + port, e);
        }
    }

    private String getJSON(Object o) {
        ObjectMapper mapper = new ObjectMapper();
        // serialize all fields, even private ones without getters
        mapper.setVisibility(JsonMethod.FIELD, JsonAutoDetect.Visibility.ANY);
        try {
            return mapper.writeValueAsString(o);
        } catch (IOException ex) {
            Logger.getLogger(ServiceManagerImpl.class.getName()).log(Level.SEVERE, null, ex);
            return null;
        }
    }

    @Override
    public BulkResponse deleteUser(List<User> userList) {
        BulkRequestBuilder requestBuilder = client.prepareBulk();
        // note: checking size() > 1 here would silently skip single-element
        // lists; any non-empty list should be processed
        if (userList != null && !userList.isEmpty()) {
            for (User user : userList) {
                requestBuilder.add(client.prepareDelete(INDEX, TYPE, user.getId()));
            }
            BulkResponse bulkResponse = requestBuilder.execute().actionGet();
            if (!bulkResponse.hasFailures()) {
                System.out.println("Index deleted");
            }
            return bulkResponse;
        }
        return null;
    }
}
The service interface is given below:
package com.elasticsearch.dao;

import com.elasticsearch.beans.User;
import java.util.List;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.get.GetResponse;
import org.elasticsearch.action.update.UpdateResponse;

/**
 * @author Ataur Rahman
 */
public interface ServiceManagerInterface {

    BulkResponse addUser(List<User> userList);

    GetResponse getUser(String id);

    BulkResponse deleteUser(List<User> userList);

    UpdateResponse updateUser(User user);

    void connect(String host, int port);
}
And last of all, the main method:
package com.elasticsearch.client;

import com.elasticsearch.beans.User;
import com.elasticsearch.dao.ServiceManagerInterface;
import com.elasticsearch.dao.impl.ServiceManagerImpl;
import java.util.ArrayList;
import java.util.List;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.get.GetResponse;
import org.elasticsearch.action.update.UpdateResponse;

/**
 * @author Ataur Rahman
 */
public class ElasticSearchClient {

    public static void main(String[] args) {
        ServiceManagerInterface service = new ServiceManagerImpl();
        service.connect("192.168.103.11", 9300);

        // insert 100 users in one bulk request
        List<User> userList = prepareUserForInsertion();
        BulkResponse response = service.addUser(userList);
        if (response != null) {
            if (!response.hasFailures()) {
                System.out.println("All data inserted");
            } else {
                System.out.println("Error, all data not inserted");
            }
        }

        // fetch one of them back by id
        User user = new User();
        user.setId("5");
        GetResponse getResponse = service.getUser(user.getId());
        if (getResponse != null) {
            System.out.println("Getting the user with id : " + getResponse.getId());
        }

        // update that user's email
        user.setEmail("email@email.com");
        UpdateResponse updateResponse = service.updateUser(user);
        if (updateResponse != null) {
            System.out.println("User updated with id " + updateResponse.getId());
        }

        // delete everything again
        BulkResponse deleteResponse = service.deleteUser(prepareUserForInsertion());
        if (deleteResponse != null && !deleteResponse.hasFailures()) {
            System.out.println("All items deleted");
        }
    }

    public static List<User> prepareUserForInsertion() {
        List<User> userList = new ArrayList<User>();
        for (int i = 1; i <= 100; i++) {
            User user = new User();
            user.setId(String.valueOf(i));
            user.setUserName("munna" + i);
            user.setEmail("munna@munna.com" + i);
            userList.add(user);
        }
        return userList;
    }
}
Before you test it, comment out your delete block, because it deletes all the inserted data. Now we are going to test it. From your browser type
http://<your_ip_address>:9200/INDEX/TYPE/ID?pretty=true
(for this tutorial that would be http://<your_ip_address>:9200/tigerit/user/5?pretty=true) and you will see the output in JSON format.
Cheers.... :)