Elasticsearch

Free and Open Source, Distributed, RESTful Search Engine
Elasticsearch is a distributed search and analytics engine, scalable data store and vector database optimized for speed and relevance on production-scale workloads. Elasticsearch is the foundation of Elastic’s open Stack platform. Search in near real-time over massive datasets, perform vector searches, integrate with generative AI applications, and much more.
Use cases enabled by Elasticsearch include:
Full-text search
Logs
Metrics
Application performance monitoring (APM)
Security logs
... and more!
To learn more about Elasticsearch’s features and capabilities, see our product page.
For the latest machine learning innovations and Lucene contributions from Elastic, see Search Labs.
The simplest way to set up Elasticsearch is to create a managed deployment with Elasticsearch Service on Elastic Cloud.
If you prefer to install and manage Elasticsearch yourself, you can download the latest version from elastic.co/downloads/elasticsearch.
Warning: DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS. This setup is intended for local development and testing only.
Quickly set up Elasticsearch and Kibana in Docker for local development or testing, using the start-local script.
ℹ️ For more detailed information about the start-local setup, refer to the README on GitHub.
If you don’t have Docker installed,download and install Docker Desktop for your operating system.
If you’re using Microsoft Windows, then installWindows Subsystem for Linux (WSL).
This setup comes with a one-month trial license that includes all Elastic features.
After the trial period, the license reverts to Free and open - Basic. Refer to Elastic subscriptions for more information.
To set up Elasticsearch and Kibana locally, run the start-local script:

```sh
curl -fsSL https://elastic.co/start-local | sh
```
This script creates an elastic-start-local folder containing configuration files and starts both Elasticsearch and Kibana using Docker.
After running the script, you can access Elastic services at the following endpoints:
Elasticsearch: http://localhost:9200
Kibana: http://localhost:5601
The script generates a random password for the elastic user, which is displayed at the end of the installation and stored in the .env file.
Caution: This setup is for local testing only. HTTPS is disabled, and Basic authentication is used for Elasticsearch. For security, Elasticsearch and Kibana are accessible only through localhost.
An API key for Elasticsearch is generated and stored in the .env file as ES_LOCAL_API_KEY. Use this key to connect to Elasticsearch with a programming language client or the REST API.
From the elastic-start-local folder, check the connection to Elasticsearch using curl:

```sh
source .env
curl $ES_LOCAL_URL -H "Authorization: ApiKey ${ES_LOCAL_API_KEY}"
```
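If you want to read those generated credentials from Python rather than sourcing the file in a shell, a minimal sketch might look like the following. The parse_env helper and the sample file content are illustrative only, not part of the start-local tooling:

```python
# Minimal .env parser: reads KEY=VALUE lines, skipping blanks and comments.
# parse_env and the sample content below are illustrative, not part of
# the start-local tooling.
def parse_env(text):
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

# Sample content standing in for the generated .env file.
sample = 'ES_LOCAL_URL=http://localhost:9200\nES_LOCAL_API_KEY="abc123"\n'
env = parse_env(sample)

# The same header curl sends with -H "Authorization: ApiKey ..."
auth_header = {"Authorization": f"ApiKey {env['ES_LOCAL_API_KEY']}"}
print(auth_header["Authorization"])
```

You could pass a header built this way to any HTTP library when calling the REST API.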
You send data and other requests to Elasticsearch through REST APIs. You can interact with Elasticsearch using any client that sends HTTP requests, such as the Elasticsearch language clients and curl.
Here’s an example curl command to create a new Elasticsearch index, using basic auth:
```sh
curl -u elastic:$ELASTIC_PASSWORD \
  -X PUT \
  http://localhost:9200/my-new-index \
  -H 'Content-Type: application/json'
```
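Under the hood, curl’s -u flag simply base64-encodes user:password into an HTTP Basic Authorization header. A small sketch of that encoding, using a made-up placeholder password for illustration:

```python
import base64

# curl -u elastic:$ELASTIC_PASSWORD sends this header; "changeme" is a
# made-up placeholder, not a real generated password.
username = "elastic"
password = "changeme"

token = base64.b64encode(f"{username}:{password}".encode()).decode()
auth_header = f"Basic {token}"
print(auth_header)  # → Basic ZWxhc3RpYzpjaGFuZ2VtZQ==
```

This is why Basic auth is only appropriate here with the localhost-only, testing-focused setup: the credentials are trivially decodable without HTTPS.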
To connect to your local dev Elasticsearch cluster with a language client, you can use basic authentication with the elastic username and the password you set in the environment variable.
You’ll use the following connection details:
Elasticsearch endpoint: http://localhost:9200
Username: elastic
Password: $ELASTIC_PASSWORD (value you set in the environment variable)
For example, to connect with the Python elasticsearch client:

```python
import os
from elasticsearch import Elasticsearch

username = 'elastic'
password = os.getenv('ELASTIC_PASSWORD')  # Value you set in the environment variable

client = Elasticsearch(
    "http://localhost:9200",
    basic_auth=(username, password)
)

print(client.info())
```
Kibana’s developer console provides an easy way to experiment and test requests. To access the console, open Kibana, then go to Management > Dev Tools.
Add data
You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs. Whether you have structured or unstructured text, numerical data, or geospatial data, Elasticsearch efficiently stores and indexes it in a way that supports fast searches.
For timestamped data such as logs and metrics, you typically add documents to a data stream made up of multiple auto-generated backing indices.
To add a single document to an index, submit an HTTP POST request that targets the index.

```
POST /customer/_doc/1
{
  "firstname": "Jennifer",
  "lastname": "Walters"
}
```
This request automatically creates the customer index if it doesn’t exist, adds a new document that has an ID of 1, and stores and indexes the firstname and lastname fields.
The new document is available immediately from any node in the cluster. You can retrieve it with a GET request that specifies its document ID:

```
GET /customer/_doc/1
```
To add multiple documents in one request, use the _bulk API. Bulk data must be newline-delimited JSON (NDJSON). Each line must end in a newline character (\n), including the last line.

```
PUT customer/_bulk
{ "create": { } }
{ "firstname": "Monica", "lastname": "Rambeau" }
{ "create": { } }
{ "firstname": "Carol", "lastname": "Danvers" }
{ "create": { } }
{ "firstname": "Wanda", "lastname": "Maximoff" }
{ "create": { } }
{ "firstname": "Jennifer", "lastname": "Takeda" }
```
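An NDJSON bulk body like the one above can also be built programmatically. A minimal sketch, where build_bulk_body is an illustrative helper of ours (not part of any Elasticsearch client), pairing each document with a create action line and keeping the required trailing newline:

```python
import json

# Illustrative helper (not an Elasticsearch client API): builds an
# NDJSON _bulk body from a list of documents.
def build_bulk_body(docs):
    lines = []
    for doc in docs:
        lines.append(json.dumps({"create": {}}))  # action line
        lines.append(json.dumps(doc))             # source line
    # NDJSON requires every line, including the last, to end in \n.
    return "\n".join(lines) + "\n"

docs = [
    {"firstname": "Monica", "lastname": "Rambeau"},
    {"firstname": "Carol", "lastname": "Danvers"},
]
body = build_bulk_body(docs)
print(body)
```

A body built this way would be sent to the _bulk endpoint with a Content-Type of application/x-ndjson.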
Search
Indexed documents are available for search in near real-time. The following search matches all customers with a first name of Jennifer in the customer index.

```
GET customer/_search
{
  "query": {
    "match": { "firstname": "Jennifer" }
  }
}
```
Explore
You can use Discover in Kibana to interactively search and filter your data. From there, you can start creating visualizations and building and sharing dashboards.
To get started, create a data view that connects to one or more Elasticsearch indices, data streams, or index aliases.
Go to Management > Stack Management > Kibana > Data Views.
Select Create data view.
Enter a name for the data view and a pattern that matches one or more indices, such as customer.
Select Save data view to Kibana.
To start exploring, go to Analytics > Discover.
To upgrade from an earlier version of Elasticsearch, see the Elasticsearch upgrade documentation.
Elasticsearch uses Gradle for its build system.
To build a distribution for your local OS and print its output location upon completion, run:

```sh
./gradlew localDistro
```

To build a distribution for another platform, run the related command:

```sh
./gradlew :distribution:archives:linux-tar:assemble
./gradlew :distribution:archives:darwin-tar:assemble
./gradlew :distribution:archives:windows-zip:assemble
```

Distributions are output to distribution/archives.
To run the test suite, see TESTING.
For the complete Elasticsearch documentation visit elastic.co.
For information about our documentation processes, see the docs README.
The elasticsearch-labs repo contains executable Python notebooks, sample apps, and resources to test out Elasticsearch for vector search, hybrid search, and generative AI use cases.
For contribution guidelines, see CONTRIBUTING.
To report a bug or request a feature, create a GitHub Issue. Please ensure someone else hasn’t created an issue for the same topic.
Need help using Elasticsearch? Reach out on the Elastic Forum or Slack. A fellow community member or Elastic engineer will be happy to help you out.