Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/elastic/elasticsearch
Free and Open Source, Distributed, RESTful Search Engine
elasticsearch java search-engine
Last synced: about 11 hours ago
- Host: GitHub
- URL: https://github.com/elastic/elasticsearch
- Owner: elastic
- License: other
- Created: 2010-02-08T13:20:56.000Z (almost 15 years ago)
- Default Branch: main
- Last Pushed: 2024-10-29T09:26:19.000Z (20 days ago)
- Last Synced: 2024-10-29T10:02:06.539Z (20 days ago)
- Topics: elasticsearch, java, search-engine
- Language: Java
- Homepage: https://www.elastic.co/products/elasticsearch
- Size: 1.19 GB
- Stars: 70,011
- Watchers: 2,678
- Forks: 24,758
- Open Issues: 4,754
Metadata Files:
- Readme: README.asciidoc
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.txt
- Codeowners: .github/CODEOWNERS
Awesome Lists containing this project
- Awesome-LLM-Productization - ElasticSearch - a distributed, RESTful search engine optimized for speed and relevance on production-scale workloads (Java based) (Models and Tools / Vector Store)
- awesome-distributed-system-projects - ElasticSearch - distributed, RESTful search and analytics engine
- awesome-open-source-alternatives - Elastic Search
- awesome-java - elasticsearch
- netlas-cookbook - Elasticsearch-based databases. (Search Query Syntax / Looking for Websites That Contain a Certain Word in Their Title)
- awesome-ccamel - elastic/elasticsearch - Free and Open Source, Distributed, RESTful Search Engine (Java)
- awesome-starred-test - elastic/elasticsearch - Free and Open Source, Distributed, RESTful Search Engine (Java)
- awesome - elastic/elasticsearch - Free and Open Source, Distributed, RESTful Search Engine (Java)
- awesome-github-repos - elastic/elasticsearch - Free and Open Source, Distributed, RESTful Search Engine (Java)
- awesome-repositories - elastic/elasticsearch - Free and Open Source, Distributed, RESTful Search Engine (Java)
- awesome-starred - elasticsearch - Open Source, Distributed, RESTful Search Engine (Java)
- awesome-elastic-resources - Link
- awesome-starts - elastic/elasticsearch - Free and Open, Distributed, RESTful Search Engine (Java)
- awesome-for-beginners - elasticsearch
- awesome-failure-diagnosis - Elasticsearch
- awesome-list - Elasticsearch - Free and Open, Distributed, RESTful Search Engine. (Data Management & Processing / Database & Cloud Management)
- awesome-list - elasticsearch
- awesome-cloud-native - elasticsearch - Open Source, Distributed, RESTful Search Engine. (Logging)
- awesome-tools - elasticsearch - Free and Open, Distributed, RESTful Search Engine (Uncategorized / Uncategorized)
- awesome-dataops - Elasticsearch - A distributed document oriented database with a RESTful search engine. (Database / Document-Oriented Database)
- StarryDivineSky - elastic/elasticsearch
- AiTreasureBox - elastic/elasticsearch - Free and Open, Distributed, RESTful Search Engine (Repos)
- fucking-awesome-for-beginners - elasticsearch
- awesome-homelab - Elasticsearch
README
= Elasticsearch
Elasticsearch is a distributed search and analytics engine, scalable data store and vector database optimized for speed and relevance on production-scale workloads. Elasticsearch is the foundation of Elastic's open Stack platform. Search in near real-time over massive datasets, perform vector searches, integrate with generative AI applications, and much more.
Use cases enabled by Elasticsearch include:
* https://www.elastic.co/search-labs/blog/articles/retrieval-augmented-generation-rag[Retrieval Augmented Generation (RAG)]
* https://www.elastic.co/search-labs/blog/categories/vector-search[Vector search]
* Full-text search
* Logs
* Metrics
* Application performance monitoring (APM)
* Security logs

\... and more!
To learn more about Elasticsearch's features and capabilities, see our
https://www.elastic.co/products/elasticsearch[product page].

For information on https://www.elastic.co/search-labs/blog/categories/ml-research[machine learning innovations] and the latest https://www.elastic.co/search-labs/blog/categories/lucene[Lucene contributions from Elastic], visit https://www.elastic.co/search-labs[Search Labs].
[[get-started]]
== Get started

The simplest way to set up Elasticsearch is to create a managed deployment with
https://www.elastic.co/cloud/as-a-service[Elasticsearch Service on Elastic
Cloud].

If you prefer to install and manage Elasticsearch yourself, you can download
the latest version from
https://www.elastic.co/downloads/elasticsearch[elastic.co/downloads/elasticsearch].

=== Run Elasticsearch locally
////
IMPORTANT: This content is replicated in the Elasticsearch guide. See `run-elasticsearch-locally.asciidoc`.
Both will soon be replaced by a quickstart script.
////

[WARNING]
====
DO NOT USE THESE INSTRUCTIONS FOR PRODUCTION DEPLOYMENTS.

This setup is intended for local development and testing only.
====

The following commands help you very quickly spin up a single-node Elasticsearch cluster, together with Kibana in Docker.
Use this setup for local development or testing.

==== Prerequisites
If you don't have Docker installed, https://www.docker.com/products/docker-desktop[download and install Docker Desktop] for your operating system.
==== Set environment variables
Configure the following environment variables.
[source,sh]
----
export ELASTIC_PASSWORD="" # password for "elastic" username
export KIBANA_PASSWORD="" # Used internally by Kibana, must be at least 6 characters long
----

==== Create a Docker network
To run both Elasticsearch and Kibana, you'll need to create a Docker network:
[source,sh]
----
docker network create elastic-net
----

==== Run Elasticsearch
Start the Elasticsearch container with the following command:
[source,sh]
----
docker run -p 127.0.0.1:9200:9200 -d --name elasticsearch --network elastic-net \
-e ELASTIC_PASSWORD=$ELASTIC_PASSWORD \
-e "discovery.type=single-node" \
-e "xpack.security.http.ssl.enabled=false" \
-e "xpack.license.self_generated.type=trial" \
docker.elastic.co/elasticsearch/elasticsearch:{version}
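# Illustrative verification step (an assumption, not part of the original README):
# once the container has finished starting, the cluster should answer on
# localhost:9200 using the password set above.
curl -u elastic:$ELASTIC_PASSWORD http://localhost:9200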
----

==== Run Kibana (optional)
To run Kibana, you must first set the `kibana_system` password in the Elasticsearch container.
[source,sh]
----
# configure the Kibana password in the ES container
curl -u elastic:$ELASTIC_PASSWORD \
-X POST \
http://localhost:9200/_security/user/kibana_system/_password \
-d '{"password":"'"$KIBANA_PASSWORD"'"}' \
-H 'Content-Type: application/json'
----
// NOTCONSOLE

Start the Kibana container with the following command:
[source,sh]
----
docker run -p 127.0.0.1:5601:5601 -d --name kibana --network elastic-net \
-e ELASTICSEARCH_URL=http://elasticsearch:9200 \
-e ELASTICSEARCH_HOSTS=http://elasticsearch:9200 \
-e ELASTICSEARCH_USERNAME=kibana_system \
-e ELASTICSEARCH_PASSWORD=$KIBANA_PASSWORD \
-e "xpack.security.enabled=false" \
-e "xpack.license.self_generated.type=trial" \
docker.elastic.co/kibana/kibana:{version}
----

.Trial license
[%collapsible]
====
The service is started with a trial license. The trial license enables all features of Elasticsearch for a trial period of 30 days. After the trial period expires, the license is downgraded to a basic license, which is free forever. If you prefer to skip the trial and use the basic license, set the value of the `xpack.license.self_generated.type` variable to basic instead. For a detailed feature comparison between the different licenses, refer to our https://www.elastic.co/subscriptions[subscriptions page].
====

==== Send requests to Elasticsearch
You send data and other requests to Elasticsearch through REST APIs.
You can interact with Elasticsearch using any client that sends HTTP requests,
such as the https://www.elastic.co/guide/en/elasticsearch/client/index.html[Elasticsearch
language clients] and https://curl.se[curl].

===== Using curl
Here's an example curl command to create a new Elasticsearch index, using basic auth:
[source,sh]
----
curl -u elastic:$ELASTIC_PASSWORD \
-X PUT \
http://localhost:9200/my-new-index \
-H 'Content-Type: application/json'
----
// NOTCONSOLE

===== Using a language client
To connect to your local dev Elasticsearch cluster with a language client, you can use basic authentication with the `elastic` username and the password you set in the environment variable.
You'll use the following connection details:
* **Elasticsearch endpoint**: `http://localhost:9200`
* **Username**: `elastic`
* **Password**: `$ELASTIC_PASSWORD` (Value you set in the environment variable)

For example, to connect with the Python `elasticsearch` client:
[source,python]
----
import os
from elasticsearch import Elasticsearch

username = 'elastic'
password = os.getenv('ELASTIC_PASSWORD') # Value you set in the environment variable

client = Elasticsearch(
    "http://localhost:9200",
    basic_auth=(username, password)
)

print(client.info())
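
# Illustrative follow-up (an assumption, not part of the original README):
# index a sample document with the same client and read it back by ID.
client.index(index="my-new-index", id="1", document={"firstname": "Jennifer"})
print(client.get(index="my-new-index", id="1"))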
----

===== Using the Dev Tools Console
Kibana's developer console provides an easy way to experiment and test requests.
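For example, a quick request you might type into the console (an illustrative request, not taken from the original README) checks cluster health:

----
GET _cluster/health
----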
To access the console, open Kibana, then go to **Management** > **Dev Tools**.

**Add data**
You index data into Elasticsearch by sending JSON objects (documents) through the REST APIs.
Whether you have structured or unstructured text, numerical data, or geospatial data,
Elasticsearch efficiently stores and indexes it in a way that supports fast searches.

For timestamped data such as logs and metrics, you typically add documents to a
data stream made up of multiple auto-generated backing indices.

To add a single document to an index, submit an HTTP POST request that targets the index.
----
POST /customer/_doc/1
{
"firstname": "Jennifer",
"lastname": "Walters"
}
----

This request automatically creates the `customer` index if it doesn't exist,
adds a new document that has an ID of 1, and
stores and indexes the `firstname` and `lastname` fields.

The new document is available immediately from any node in the cluster.
You can retrieve it with a GET request that specifies its document ID:

----
GET /customer/_doc/1
----

To add multiple documents in one request, use the `_bulk` API.
Bulk data must be newline-delimited JSON (NDJSON).
Each line must end in a newline character (`\n`), including the last line.

----
PUT customer/_bulk
{ "create": { } }
{ "firstname": "Monica","lastname":"Rambeau"}
{ "create": { } }
{ "firstname": "Carol","lastname":"Danvers"}
{ "create": { } }
{ "firstname": "Wanda","lastname":"Maximoff"}
{ "create": { } }
{ "firstname": "Jennifer","lastname":"Takeda"}
----

**Search**
Indexed documents are available for search in near real-time.
The following search matches all customers with a first name of _Jennifer_
in the `customer` index.

----
GET customer/_search
{
"query" : {
"match" : { "firstname": "Jennifer" }
}
}
----

**Explore**
You can use Discover in Kibana to interactively search and filter your data.
From there, you can start creating visualizations and building and sharing dashboards.

To get started, create a _data view_ that connects to one or more Elasticsearch indices,
data streams, or index aliases.

. Go to **Management > Stack Management > Kibana > Data Views**.
. Select **Create data view**.
. Enter a name for the data view and a pattern that matches one or more indices,
such as _customer_.
. Select **Save data view to Kibana**.

To start exploring, go to **Analytics > Discover**.
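If you prefer to script this step, Kibana also exposes a data views REST API. The following is a minimal sketch, assuming Kibana 8.x and the local setup above (an illustrative addition, not part of the original README):

[source,sh]
----
# Create a "customer" data view through the Kibana data views API (hypothetical example)
curl -u elastic:$ELASTIC_PASSWORD \
  -X POST \
  http://localhost:5601/api/data_views/data_view \
  -H 'Content-Type: application/json' \
  -H 'kbn-xsrf: true' \
  -d '{"data_view": {"title": "customer", "name": "customer"}}'
----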
[[upgrade]]
== Upgrade

To upgrade from an earlier version of Elasticsearch, see the
https://www.elastic.co/guide/en/elasticsearch/reference/current/setup-upgrade.html[Elasticsearch upgrade
documentation].

[[build-source]]
== Build from source

Elasticsearch uses https://gradle.org[Gradle] for its build system.
To build a distribution for your local OS and print its output location upon
completion, run:
----
./gradlew localDistro
----

To build a distribution for another platform, run the related command:
----
./gradlew :distribution:archives:linux-tar:assemble
./gradlew :distribution:archives:darwin-tar:assemble
./gradlew :distribution:archives:windows-zip:assemble
----

To build distributions for all supported platforms, run:
----
./gradlew assemble
----

Distributions are output to `distribution/archives`.
To run the test suite, see xref:TESTING.asciidoc[TESTING].
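As a quick illustration (a hedged example; TESTING is the authoritative reference), the tests for a single module can typically be run through Gradle:

----
./gradlew :server:test
----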
[[docs]]
== Documentation

For the complete Elasticsearch documentation visit
https://www.elastic.co/guide/en/elasticsearch/reference/current/index.html[elastic.co].

For information about our documentation processes, see the
xref:docs/README.asciidoc[docs README].

[[examples]]
== Examples and guides

The https://github.com/elastic/elasticsearch-labs[`elasticsearch-labs`] repo contains executable Python notebooks, sample apps, and resources to test out Elasticsearch for vector search, hybrid search and generative AI use cases.
[[contribute]]
== Contribute

For contribution guidelines, see xref:CONTRIBUTING.md[CONTRIBUTING].
[[questions]]
== Questions? Problems? Suggestions?

* To report a bug or request a feature, create a
https://github.com/elastic/elasticsearch/issues/new/choose[GitHub Issue]. Please
ensure someone else hasn't created an issue for the same topic.

* Need help using Elasticsearch? Reach out on the
https://discuss.elastic.co[Elastic Forum] or https://ela.st/slack[Slack]. A
fellow community member or Elastic engineer will be happy to help you out.