In this article, we will discuss how to parse JSON fields in Elasticsearch, which is a common requirement when dealing with log data or other structured data formats. We will cover the following topics:
- Ingesting JSON data into Elasticsearch
- Using an Ingest Pipeline to parse JSON fields
- Querying and aggregating JSON fields
1. Ingesting JSON data into Elasticsearch
When ingesting JSON data into Elasticsearch, it is essential to ensure that the data is properly formatted and structured. Elasticsearch can automatically detect and map JSON fields, but it is recommended to define an explicit mapping for better control over the indexing process.
To create an index with a custom mapping, you can use the following API call:
PUT /my_index
{
  "mappings": {
    "properties": {
      "message": {
        "type": "keyword"
      },
      "json_field": {
        "properties": {
          "field1": {
            "type": "keyword"
          },
          "field2": {
            "type": "integer"
          }
        }
      }
    }
  }
}
In this example, we create an index called my_index with a custom mapping for a JSON field named json_field, which contains two subfields: field1 (a keyword) and field2 (an integer).
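After creating the index, you can confirm that the mapping was applied as expected:

```
GET /my_index/_mapping
```

The response echoes back the properties defined above, which is a quick way to catch typos in field names or types before indexing any data.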
2. Using an Ingest Pipeline to parse JSON fields
If your JSON data is stored as a string within a field, you can use the Ingest Pipeline feature in Elasticsearch to parse the JSON string and extract the relevant fields. Ingest pipelines provide a set of built-in processors, including the json processor, which parses JSON data.
To create an ingest pipeline with the json processor, use the following API call:
PUT _ingest/pipeline/json_parser
{
  "description": "Parse JSON field",
  "processors": [
    {
      "json": {
        "field": "message",
        "target_field": "json_field"
      }
    }
  ]
}
In this example, we create an ingest pipeline called json_parser that parses the JSON string stored in the message field and stores the resulting JSON object in a new field called json_field.
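Before sending real documents through the pipeline, you can dry-run it with the simulate API, which executes the processors against sample documents without indexing anything:

```
POST _ingest/pipeline/json_parser/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "{\"field1\": \"value1\", \"field2\": 42}"
      }
    }
  ]
}
```

The response shows the transformed _source for each sample document, so you can verify that json_field is populated correctly before touching the index.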
To index a document using this pipeline, use the following API call:
POST /my_index/_doc?pipeline=json_parser
{
  "message": "{\"field1\": \"value1\", \"field2\": 42}"
}
The document will be indexed with the parsed JSON fields:
{
  "_index": "my_index",
  "_id": "1",
  "_source": {
    "message": "{\"field1\": \"value1\", \"field2\": 42}",
    "json_field": {
      "field1": "value1",
      "field2": 42
    }
  }
}
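If every document written to the index should be parsed this way, you can set the pipeline as the index's default pipeline instead of passing the pipeline query parameter on each request:

```
PUT /my_index/_settings
{
  "index": {
    "default_pipeline": "json_parser"
  }
}
```

With this setting in place, plain indexing requests against my_index are routed through json_parser automatically.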
3. Querying and aggregating JSON fields
Once the JSON fields are indexed, you can query and aggregate them using the Elasticsearch Query DSL. For example, to search for documents with a specific value in the field1 subfield, you can use the following query:
POST /my_index/_search
{
  "query": {
    "term": {
      "json_field.field1": "value1"
    }
  }
}
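Because field2 is mapped as an integer, it also supports numeric queries. For example, to find documents where field2 is greater than 10:

```
POST /my_index/_search
{
  "query": {
    "range": {
      "json_field.field2": {
        "gt": 10
      }
    }
  }
}
```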
To aggregate the values of the field2 subfield, you can use the following aggregation:
POST /my_index/_search
{
  "size": 0,
  "aggs": {
    "field2_sum": {
      "sum": {
        "field": "json_field.field2"
      }
    }
  }
}
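The same pattern works for bucket aggregations. For instance, to see which values of the field1 keyword subfield occur most often:

```
POST /my_index/_search
{
  "size": 0,
  "aggs": {
    "field1_values": {
      "terms": {
        "field": "json_field.field1"
      }
    }
  }
}
```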
Conclusion
Parsing JSON fields in Elasticsearch comes down to three building blocks: explicit mappings, the json ingest processor, and the Query DSL. With these in place, you can efficiently index, query, and aggregate JSON data in your Elasticsearch cluster.