├── .gitignore
├── Aggregations
│   ├── aggregating-nested-objects.md
│   ├── defining-bucket-rules-with-filters.md
│   ├── filtering-out-documents.md
│   ├── global-aggregation.md
│   ├── histograms.md
│   ├── introduction-to-aggregations.md
│   ├── introduction-to-bucket-aggregations.md
│   ├── metric-aggregations.md
│   ├── missing-field-values.md
│   ├── nested-aggregations.md
│   └── range-aggregations.md
├── Controlling Query Results
│   ├── filters.md
│   ├── sorting-by-multi-value-fields.md
│   ├── sorting-results.md
│   ├── source-filtering.md
│   ├── specifying-an-offset.md
│   ├── specifying-the-result-format.md
│   └── specifying-the-result-size.md
├── Getting Started
│   ├── adding-more-nodes-to-the-cluster.md
│   ├── inspecting-the-cluster.md
│   ├── overview-of-node-roles.md
│   ├── sending-queries-with-curl.md
│   ├── setting-up-elasticsearch-kibana-macos-linux.md
│   ├── setting-up-elasticsearch-kibana-windows.md
│   ├── sharding-and-scalability.md
│   └── understanding-replication.md
├── Improving Search Results
│   ├── adding-synonyms-from-file.md
│   ├── adding-synonyms.md
│   ├── affecting-relevance-scoring-with-proximity.md
│   ├── fuzzy-match-query.md
│   ├── fuzzy-query.md
│   ├── highlighting-matches-in-fields.md
│   ├── proximity-searches.md
│   └── stemming.md
├── Joining Queries
│   ├── add-departments-test-data.md
│   ├── adding-documents.md
│   ├── mapping-document-relationships.md
│   ├── multi-level-relations.md
│   ├── parent-child-inner-hits.md
│   ├── querying-by-parent-id.md
│   ├── querying-child-documents-by-parent.md
│   ├── querying-parent-by-child-documents.md
│   └── terms-lookup-mechanism.md
├── LICENSE.md
├── Managing Documents
│   ├── batch-processing.md
│   ├── creating-and-deleting-indices.md
│   ├── delete-by-query.md
│   ├── deleting-documents.md
│   ├── importing-data-with-curl.md
│   ├── indexing-documents.md
│   ├── optimistic-concurrency-control.md
│   ├── replacing-documents.md
│   ├── retrieving-documents-by-id.md
│   ├── scripted-updates.md
│   ├── update-by-query.md
│   ├── updating-documents.md
│   └── upserts.md
├── Mapping & Analysis
│   ├── adding-analyzers-to-existing-indices.md
│   ├── adding-explicit-mappings.md
│   ├── adding-mappings-to-existing-indices.md
│   ├── combining-explicit-and-dynamic-mapping.md
│   ├── configuring-dynamic-mapping.md
│   ├── creating-custom-analyzers.md
│   ├── defining-field-aliases.md
│   ├── dynamic-templates.md
│   ├── how-dates-work-in-elasticsearch.md
│   ├── how-the-keyword-data-type-works.md
│   ├── index-templates.md
│   ├── multi-field-mappings.md
│   ├── reindexing-documents-with-the-reindex-api.md
│   ├── retrieving-mappings.md
│   ├── understanding-arrays.md
│   ├── understanding-type-coercion.md
│   ├── updating-analyzers.md
│   ├── updating-existing-mappings.md
│   ├── using-dot-notation-in-field-names.md
│   └── using-the-analyze-api.md
├── README.md
├── Searching for Data
│   ├── boosting-query.md
│   ├── disjunction-max.md
│   ├── introduction-to-relevance-scoring.md
│   ├── nested-inner-hits.md
│   ├── phrase-searches.md
│   ├── prefixes-wildcards-regular-expressions.md
│   ├── querying-by-field-existence.md
│   ├── querying-nested-objects.md
│   ├── querying-with-boolean-logic.md
│   ├── range-searches.md
│   ├── retrieving-documents-by-ids.md
│   ├── searching-for-terms.md
│   ├── searching-multiple-fields.md
│   └── the-match-query.md
├── orders-bulk.json
├── products-bulk.json
└── recipes-bulk.json
/.gitignore:
--------------------------------------------------------------------------------
1 | .idea
2 | .DS_Store
3 | data
4 | quickstart.md
--------------------------------------------------------------------------------
/Aggregations/aggregating-nested-objects.md:
--------------------------------------------------------------------------------
1 | # Aggregating nested objects
2 |
3 | ```
4 | GET /department/_search
5 | {
6 | "size": 0,
7 | "aggs": {
8 | "employees": {
9 | "nested": {
10 | "path": "employees"
11 | }
12 | }
13 | }
14 | }
15 | ```
16 |
17 | ```
18 | GET /department/_search
19 | {
20 | "size": 0,
21 | "aggs": {
22 | "employees": {
23 | "nested": {
24 | "path": "employees"
25 | },
26 | "aggs": {
27 | "minimum_age": {
28 | "min": {
29 | "field": "employees.age"
30 | }
31 | }
32 | }
33 | }
34 | }
35 | }
36 | ```
--------------------------------------------------------------------------------
/Aggregations/defining-bucket-rules-with-filters.md:
--------------------------------------------------------------------------------
1 | # Defining bucket rules with filters
2 |
3 | ## Placing documents into buckets based on criteria
4 |
5 | ```
6 | GET /recipes/_search
7 | {
8 | "size": 0,
9 | "aggs": {
10 | "my_filter": {
11 | "filters": {
12 | "filters": {
13 | "pasta": {
14 | "match": {
15 | "title": "pasta"
16 | }
17 | },
18 | "spaghetti": {
19 | "match": {
20 | "title": "spaghetti"
21 | }
22 | }
23 | }
24 | }
25 | }
26 | }
27 | }
28 | ```
29 |
30 | ## Calculate average ratings for buckets
31 |
32 | ```
33 | GET /recipes/_search
34 | {
35 | "size": 0,
36 | "aggs": {
37 | "my_filter": {
38 | "filters": {
39 | "filters": {
40 | "pasta": {
41 | "match": {
42 | "title": "pasta"
43 | }
44 | },
45 | "spaghetti": {
46 | "match": {
47 | "title": "spaghetti"
48 | }
49 | }
50 | }
51 | },
52 | "aggs": {
53 | "avg_rating": {
54 | "avg": {
55 | "field": "ratings"
56 | }
57 | }
58 | }
59 | }
60 | }
61 | }
62 | ```
--------------------------------------------------------------------------------
/Aggregations/filtering-out-documents.md:
--------------------------------------------------------------------------------
1 | # Filtering out documents
2 |
3 | ## Filtering out documents with low `total_amount`
4 |
5 | ```
6 | GET /orders/_search
7 | {
8 | "size": 0,
9 | "aggs": {
10 | "low_value": {
11 | "filter": {
12 | "range": {
13 | "total_amount": {
14 | "lt": 50
15 | }
16 | }
17 | }
18 | }
19 | }
20 | }
21 | ```
22 |
23 | ## Aggregating on the bucket of remaining documents
24 |
25 | ```
26 | GET /orders/_search
27 | {
28 | "size": 0,
29 | "aggs": {
30 | "low_value": {
31 | "filter": {
32 | "range": {
33 | "total_amount": {
34 | "lt": 50
35 | }
36 | }
37 | },
38 | "aggs": {
39 | "avg_amount": {
40 | "avg": {
41 | "field": "total_amount"
42 | }
43 | }
44 | }
45 | }
46 | }
47 | }
48 | ```
--------------------------------------------------------------------------------
/Aggregations/global-aggregation.md:
--------------------------------------------------------------------------------
1 | # `global` aggregation
2 |
3 | ## Break out of the aggregation context
4 |
5 | ```
6 | GET /orders/_search
7 | {
8 | "query": {
9 | "range": {
10 | "total_amount": {
11 | "gte": 100
12 | }
13 | }
14 | },
15 | "size": 0,
16 | "aggs": {
17 | "all_orders": {
18 | "global": { },
19 | "aggs": {
20 | "stats_amount": {
21 | "stats": {
22 | "field": "total_amount"
23 | }
24 | }
25 | }
26 | }
27 | }
28 | }
29 | ```
30 |
31 | ## Adding aggregation without global context
32 |
33 | ```
34 | GET /orders/_search
35 | {
36 | "query": {
37 | "range": {
38 | "total_amount": {
39 | "gte": 100
40 | }
41 | }
42 | },
43 | "size": 0,
44 | "aggs": {
45 | "all_orders": {
46 | "global": { },
47 | "aggs": {
48 | "stats_amount": {
49 | "stats": {
50 | "field": "total_amount"
51 | }
52 | }
53 | }
54 | },
55 | "stats_expensive": {
56 | "stats": {
57 | "field": "total_amount"
58 | }
59 | }
60 | }
61 | }
62 | ```
--------------------------------------------------------------------------------
/Aggregations/histograms.md:
--------------------------------------------------------------------------------
1 | # Histograms
2 |
3 | ## Distribution of `total_amount` with interval `25`
4 |
5 | ```
6 | GET /orders/_search
7 | {
8 | "size": 0,
9 | "aggs": {
10 | "amount_distribution": {
11 | "histogram": {
12 | "field": "total_amount",
13 | "interval": 25
14 | }
15 | }
16 | }
17 | }
18 | ```
19 |
20 | ## Requiring minimum 1 document per bucket
21 |
22 | ```
23 | GET /orders/_search
24 | {
25 | "size": 0,
26 | "aggs": {
27 | "amount_distribution": {
28 | "histogram": {
29 | "field": "total_amount",
30 | "interval": 25,
31 | "min_doc_count": 1
32 | }
33 | }
34 | }
35 | }
36 | ```
37 |
38 | ## Specifying fixed bucket boundaries
39 |
40 | ```
41 | GET /orders/_search
42 | {
43 | "size": 0,
44 | "query": {
45 | "range": {
46 | "total_amount": {
47 | "gte": 100
48 | }
49 | }
50 | },
51 | "aggs": {
52 | "amount_distribution": {
53 | "histogram": {
54 | "field": "total_amount",
55 | "interval": 25,
56 | "min_doc_count": 0,
57 | "extended_bounds": {
58 | "min": 0,
59 | "max": 500
60 | }
61 | }
62 | }
63 | }
64 | }
65 | ```
66 |
67 | ## Aggregating by month with the `date_histogram` aggregation
68 |
69 | ```
70 | GET /orders/_search
71 | {
72 | "size": 0,
73 | "aggs": {
74 | "orders_over_time": {
75 | "date_histogram": {
76 | "field": "purchased_at",
77 | "calendar_interval": "month"
78 | }
79 | }
80 | }
81 | }
82 | ```
--------------------------------------------------------------------------------
/Aggregations/introduction-to-aggregations.md:
--------------------------------------------------------------------------------
1 | # Introduction to aggregations
2 |
3 | ## Adding `orders` index with field mappings
4 |
5 | ```
6 | PUT /orders
7 | {
8 | "mappings": {
9 | "properties": {
10 | "purchased_at": {
11 | "type": "date"
12 | },
13 | "lines": {
14 | "type": "nested",
15 | "properties": {
16 | "product_id": {
17 | "type": "integer"
18 | },
19 | "amount": {
20 | "type": "double"
21 | },
22 | "quantity": {
23 | "type": "short"
24 | }
25 | }
26 | },
27 | "total_amount": {
28 | "type": "double"
29 | },
30 | "status": {
31 | "type": "keyword"
32 | },
33 | "sales_channel": {
34 | "type": "keyword"
35 | },
36 | "salesman": {
37 | "type": "object",
38 | "properties": {
39 | "id": {
40 | "type": "integer"
41 | },
42 | "name": {
43 | "type": "text"
44 | }
45 | }
46 | }
47 | }
48 | }
49 | }
50 | ```
51 |
52 | ## Populating the `orders` index with test data
53 |
54 | If you are using a cloud-hosted Elasticsearch deployment, remove the `--cacert` argument.
55 |
56 | ```
57 | # macOS & Linux
58 | curl --cacert config/certs/http_ca.crt -u elastic -H "Content-Type:application/x-ndjson" -X POST https://localhost:9200/orders/_bulk --data-binary "@orders-bulk.json"
59 |
60 | # Windows
61 | curl --cacert config\certs\http_ca.crt -u elastic -H "Content-Type:application/x-ndjson" -X POST https://localhost:9200/orders/_bulk --data-binary "@orders-bulk.json"
62 | ```
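63 |
64 | For example, for a cloud-hosted deployment the command might look something like the following (a sketch; the endpoint is a placeholder for your own deployment's URL, and authentication works as described in the "Sending queries with cURL" section).
65 |
66 | ```
67 | curl -u elastic -H "Content-Type:application/x-ndjson" -X POST https://[YOUR_DEPLOYMENT_URL]/orders/_bulk --data-binary "@orders-bulk.json"
68 | ```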
--------------------------------------------------------------------------------
/Aggregations/introduction-to-bucket-aggregations.md:
--------------------------------------------------------------------------------
1 | # Introduction to bucket aggregations
2 |
3 | ## Creating a bucket for each `status` value
4 |
5 | ```
6 | GET /orders/_search
7 | {
8 | "size": 0,
9 | "aggs": {
10 | "status_terms": {
11 | "terms": {
12 | "field": "status"
13 | }
14 | }
15 | }
16 | }
17 | ```
18 |
19 | ## Including `20` terms instead of the default `10`
20 |
21 | ```
22 | GET /orders/_search
23 | {
24 | "size": 0,
25 | "aggs": {
26 | "status_terms": {
27 | "terms": {
28 | "field": "status",
29 | "size": 20
30 | }
31 | }
32 | }
33 | }
34 | ```
35 |
36 | ## Aggregating documents with missing field (or `NULL`)
37 |
38 | ```
39 | GET /orders/_search
40 | {
41 | "size": 0,
42 | "aggs": {
43 | "status_terms": {
44 | "terms": {
45 | "field": "status",
46 | "size": 20,
47 | "missing": "N/A"
48 | }
49 | }
50 | }
51 | }
52 | ```
53 |
54 | ## Changing the minimum document count for a bucket to be created
55 |
56 | ```
57 | GET /orders/_search
58 | {
59 | "size": 0,
60 | "aggs": {
61 | "status_terms": {
62 | "terms": {
63 | "field": "status",
64 | "size": 20,
65 | "missing": "N/A",
66 | "min_doc_count": 0
67 | }
68 | }
69 | }
70 | }
71 | ```
72 |
73 | ## Ordering the buckets
74 |
75 | ```
76 | GET /orders/_search
77 | {
78 | "size": 0,
79 | "aggs": {
80 | "status_terms": {
81 | "terms": {
82 | "field": "status",
83 | "size": 20,
84 | "missing": "N/A",
85 | "min_doc_count": 0,
86 | "order": {
87 | "_key": "asc"
88 | }
89 | }
90 | }
91 | }
92 | }
93 | ```
--------------------------------------------------------------------------------
/Aggregations/metric-aggregations.md:
--------------------------------------------------------------------------------
1 | # Metric aggregations
2 |
3 | ## Calculating statistics with `sum`, `avg`, `min`, and `max` aggregations
4 |
5 | ```
6 | GET /orders/_search
7 | {
8 | "size": 0,
9 | "aggs": {
10 | "total_sales": {
11 | "sum": {
12 | "field": "total_amount"
13 | }
14 | },
15 | "avg_sale": {
16 | "avg": {
17 | "field": "total_amount"
18 | }
19 | },
20 | "min_sale": {
21 | "min": {
22 | "field": "total_amount"
23 | }
24 | },
25 | "max_sale": {
26 | "max": {
27 | "field": "total_amount"
28 | }
29 | }
30 | }
31 | }
32 | ```
33 |
34 | ## Retrieving the number of distinct values
35 |
36 | ```
37 | GET /orders/_search
38 | {
39 | "size": 0,
40 | "aggs": {
41 | "total_salesmen": {
42 | "cardinality": {
43 | "field": "salesman.id"
44 | }
45 | }
46 | }
47 | }
48 | ```
49 |
50 | ## Retrieving the number of values
51 |
52 | ```
53 | GET /orders/_search
54 | {
55 | "size": 0,
56 | "aggs": {
57 | "values_count": {
58 | "value_count": {
59 | "field": "total_amount"
60 | }
61 | }
62 | }
63 | }
64 | ```
65 |
66 | ## Using `stats` aggregation for common statistics
67 |
68 | ```
69 | GET /orders/_search
70 | {
71 | "size": 0,
72 | "aggs": {
73 | "amount_stats": {
74 | "stats": {
75 | "field": "total_amount"
76 | }
77 | }
78 | }
79 | }
80 | ```
--------------------------------------------------------------------------------
/Aggregations/missing-field-values.md:
--------------------------------------------------------------------------------
1 | # Missing field values
2 |
3 | ## Adding test documents
4 |
5 | ```
6 | PUT /orders/_doc/1001
7 | {
8 | "total_amount": 100
9 | }
10 | ```
11 |
12 | ```
13 | PUT /orders/_doc/1002
14 | {
15 | "total_amount": 200,
16 | "status": null
17 | }
18 | ```
19 |
20 | ## Aggregating documents with missing field value
21 |
22 | ```
23 | GET /orders/_search
24 | {
25 | "size": 0,
26 | "aggs": {
27 | "orders_without_status": {
28 | "missing": {
29 | "field": "status"
30 | }
31 | }
32 | }
33 | }
34 | ```
35 |
36 | ## Combining `missing` aggregation with other aggregations
37 |
38 | ```
39 | GET /orders/_search
40 | {
41 | "size": 0,
42 | "aggs": {
43 | "orders_without_status": {
44 | "missing": {
45 | "field": "status"
46 | },
47 | "aggs": {
48 | "missing_sum": {
49 | "sum": {
50 | "field": "total_amount"
51 | }
52 | }
53 | }
54 | }
55 | }
56 | }
57 | ```
58 |
59 | ## Deleting test documents
60 |
61 | ```
62 | DELETE /orders/_doc/1001
63 | ```
64 |
65 | ```
66 | DELETE /orders/_doc/1002
67 | ```
--------------------------------------------------------------------------------
/Aggregations/nested-aggregations.md:
--------------------------------------------------------------------------------
1 | # Nested aggregations
2 |
3 | ## Retrieving statistics for each status
4 |
5 | ```
6 | GET /orders/_search
7 | {
8 | "size": 0,
9 | "aggs": {
10 | "status_terms": {
11 | "terms": {
12 | "field": "status"
13 | },
14 | "aggs": {
15 | "status_stats": {
16 | "stats": {
17 | "field": "total_amount"
18 | }
19 | }
20 | }
21 | }
22 | }
23 | }
24 | ```
25 |
26 | ## Narrowing down the aggregation context
27 |
28 | ```
29 | GET /orders/_search
30 | {
31 | "size": 0,
32 | "query": {
33 | "range": {
34 | "total_amount": {
35 | "gte": 100
36 | }
37 | }
38 | },
39 | "aggs": {
40 | "status_terms": {
41 | "terms": {
42 | "field": "status"
43 | },
44 | "aggs": {
45 | "status_stats": {
46 | "stats": {
47 | "field": "total_amount"
48 | }
49 | }
50 | }
51 | }
52 | }
53 | }
54 | ```
--------------------------------------------------------------------------------
/Aggregations/range-aggregations.md:
--------------------------------------------------------------------------------
1 | # Range aggregations
2 |
3 | ## `range` aggregation
4 |
5 | ```
6 | GET /orders/_search
7 | {
8 | "size": 0,
9 | "aggs": {
10 | "amount_distribution": {
11 | "range": {
12 | "field": "total_amount",
13 | "ranges": [
14 | {
15 | "to": 50
16 | },
17 | {
18 | "from": 50,
19 | "to": 100
20 | },
21 | {
22 | "from": 100
23 | }
24 | ]
25 | }
26 | }
27 | }
28 | }
29 | ```
30 |
31 | ## `date_range` aggregation
32 |
33 | ```
34 | GET /orders/_search
35 | {
36 | "size": 0,
37 | "aggs": {
38 | "purchased_ranges": {
39 | "date_range": {
40 | "field": "purchased_at",
41 | "ranges": [
42 | {
43 | "from": "2016-01-01",
44 | "to": "2016-01-01||+6M"
45 | },
46 | {
47 | "from": "2016-01-01||+6M",
48 | "to": "2016-01-01||+1y"
49 | }
50 | ]
51 | }
52 | }
53 | }
54 | }
55 | ```
56 |
57 | ## Specifying the date format
58 |
59 | ```
60 | GET /orders/_search
61 | {
62 | "size": 0,
63 | "aggs": {
64 | "purchased_ranges": {
65 | "date_range": {
66 | "field": "purchased_at",
67 | "format": "yyyy-MM-dd",
68 | "ranges": [
69 | {
70 | "from": "2016-01-01",
71 | "to": "2016-01-01||+6M"
72 | },
73 | {
74 | "from": "2016-01-01||+6M",
75 | "to": "2016-01-01||+1y"
76 | }
77 | ]
78 | }
79 | }
80 | }
81 | }
82 | ```
83 |
84 | ## Enabling keys for the buckets
85 |
86 | ```
87 | GET /orders/_search
88 | {
89 | "size": 0,
90 | "aggs": {
91 | "purchased_ranges": {
92 | "date_range": {
93 | "field": "purchased_at",
94 | "format": "yyyy-MM-dd",
95 | "keyed": true,
96 | "ranges": [
97 | {
98 | "from": "2016-01-01",
99 | "to": "2016-01-01||+6M"
100 | },
101 | {
102 | "from": "2016-01-01||+6M",
103 | "to": "2016-01-01||+1y"
104 | }
105 | ]
106 | }
107 | }
108 | }
109 | }
110 | ```
111 |
112 | ## Defining the bucket keys
113 |
114 | ```
115 | GET /orders/_search
116 | {
117 | "size": 0,
118 | "aggs": {
119 | "purchased_ranges": {
120 | "date_range": {
121 | "field": "purchased_at",
122 | "format": "yyyy-MM-dd",
123 | "keyed": true,
124 | "ranges": [
125 | {
126 | "from": "2016-01-01",
127 | "to": "2016-01-01||+6M",
128 | "key": "first_half"
129 | },
130 | {
131 | "from": "2016-01-01||+6M",
132 | "to": "2016-01-01||+1y",
133 | "key": "second_half"
134 | }
135 | ]
136 | }
137 | }
138 | }
139 | }
140 | ```
141 |
142 | ## Adding a sub-aggregation
143 |
144 | ```
145 | GET /orders/_search
146 | {
147 | "size": 0,
148 | "aggs": {
149 | "purchased_ranges": {
150 | "date_range": {
151 | "field": "purchased_at",
152 | "format": "yyyy-MM-dd",
153 | "keyed": true,
154 | "ranges": [
155 | {
156 | "from": "2016-01-01",
157 | "to": "2016-01-01||+6M",
158 | "key": "first_half"
159 | },
160 | {
161 | "from": "2016-01-01||+6M",
162 | "to": "2016-01-01||+1y",
163 | "key": "second_half"
164 | }
165 | ]
166 | },
167 | "aggs": {
168 | "bucket_stats": {
169 | "stats": {
170 | "field": "total_amount"
171 | }
172 | }
173 | }
174 | }
175 | }
176 | }
177 | ```
--------------------------------------------------------------------------------
/Controlling Query Results/filters.md:
--------------------------------------------------------------------------------
1 | # Filters
2 |
3 | ## Adding a `filter` clause to the `bool` query
4 |
5 | ```
6 | GET /recipes/_search
7 | {
8 | "query": {
9 | "bool": {
10 | "must": [
11 | {
12 | "match": {
13 | "title": "pasta"
14 | }
15 | }
16 | ],
17 | "filter": [
18 | {
19 | "range": {
20 | "preparation_time_minutes": {
21 | "lte": 15
22 | }
23 | }
24 | }
25 | ]
26 | }
27 | }
28 | }
29 | ```
--------------------------------------------------------------------------------
/Controlling Query Results/sorting-by-multi-value-fields.md:
--------------------------------------------------------------------------------
1 | # Sorting by multi-value fields
2 |
3 | ## Sorting by the average rating (descending)
4 |
5 | ```
6 | GET /recipes/_search
7 | {
8 | "_source": "ratings",
9 | "query": {
10 | "match_all": {}
11 | },
12 | "sort": [
13 | {
14 | "ratings": {
15 | "order": "desc",
16 | "mode": "avg"
17 | }
18 | }
19 | ]
20 | }
21 | ```
--------------------------------------------------------------------------------
/Controlling Query Results/sorting-results.md:
--------------------------------------------------------------------------------
1 | # Sorting results
2 |
3 | ## Sorting by ascending order (implicitly)
4 |
5 | ```
6 | GET /recipes/_search
7 | {
8 | "_source": false,
9 | "query": {
10 | "match_all": {}
11 | },
12 | "sort": [
13 | "preparation_time_minutes"
14 | ]
15 | }
16 | ```
17 |
18 | ## Sorting by descending order
19 |
20 | ```
21 | GET /recipes/_search
22 | {
23 | "_source": "created",
24 | "query": {
25 | "match_all": {}
26 | },
27 | "sort": [
28 | { "created": "desc" }
29 | ]
30 | }
31 | ```
32 |
33 | ## Sorting by multiple fields
34 |
35 | ```
36 | GET /recipes/_search
37 | {
38 | "_source": [ "preparation_time_minutes", "created" ],
39 | "query": {
40 | "match_all": {}
41 | },
42 | "sort": [
43 | { "preparation_time_minutes": "asc" },
44 | { "created": "desc" }
45 | ]
46 | }
47 | ```
--------------------------------------------------------------------------------
/Controlling Query Results/source-filtering.md:
--------------------------------------------------------------------------------
1 | # Source filtering
2 |
3 | ## Excluding the `_source` field altogether
4 |
5 | ```
6 | GET /recipes/_search
7 | {
8 | "_source": false,
9 | "query": {
10 | "match": { "title": "pasta" }
11 | }
12 | }
13 | ```
14 |
15 | ## Only returning the `created` field
16 |
17 | ```
18 | GET /recipes/_search
19 | {
20 | "_source": "created",
21 | "query": {
22 | "match": { "title": "pasta" }
23 | }
24 | }
25 | ```
26 |
27 | ## Only returning an object's key
28 |
29 | ```
30 | GET /recipes/_search
31 | {
32 | "_source": "ingredients.name",
33 | "query": {
34 | "match": { "title": "pasta" }
35 | }
36 | }
37 | ```
38 |
39 | ## Returning all of an object's keys
40 |
41 | ```
42 | GET /recipes/_search
43 | {
44 | "_source": "ingredients.*",
45 | "query": {
46 | "match": { "title": "pasta" }
47 | }
48 | }
49 | ```
50 |
51 | ## Returning the `ingredients` object with all keys, __and__ the `servings` field
52 |
53 | ```
54 | GET /recipes/_search
55 | {
56 | "_source": [ "ingredients.*", "servings" ],
57 | "query": {
58 | "match": { "title": "pasta" }
59 | }
60 | }
61 | ```
62 |
63 | ## Including all of the `ingredients` object's keys, except the `name` key
64 |
65 | ```
66 | GET /recipes/_search
67 | {
68 | "_source": {
69 | "includes": "ingredients.*",
70 | "excludes": "ingredients.name"
71 | },
72 | "query": {
73 | "match": { "title": "pasta" }
74 | }
75 | }
76 | ```
--------------------------------------------------------------------------------
/Controlling Query Results/specifying-an-offset.md:
--------------------------------------------------------------------------------
1 | # Specifying an offset
2 |
3 | ## Specifying an offset with the `from` parameter
4 |
5 | ```
6 | GET /recipes/_search
7 | {
8 | "_source": false,
9 | "size": 2,
10 | "from": 2,
11 | "query": {
12 | "match": {
13 | "title": "pasta"
14 | }
15 | }
16 | }
17 | ```
--------------------------------------------------------------------------------
/Controlling Query Results/specifying-the-result-format.md:
--------------------------------------------------------------------------------
1 | # Specifying the result format
2 |
3 | ## Returning results as YAML
4 |
5 | ```
6 | GET /recipes/_search?format=yaml
7 | {
8 | "query": {
9 | "match": { "title": "pasta" }
10 | }
11 | }
12 | ```
13 |
14 | ## Returning pretty JSON
15 |
16 | ```
17 | GET /recipes/_search?pretty
18 | {
19 | "query": {
20 | "match": { "title": "pasta" }
21 | }
22 | }
23 | ```
--------------------------------------------------------------------------------
/Controlling Query Results/specifying-the-result-size.md:
--------------------------------------------------------------------------------
1 | # Specifying the result size
2 |
3 | ## Using a query parameter
4 |
5 | ```
6 | GET /recipes/_search?size=2
7 | {
8 | "_source": false,
9 | "query": {
10 | "match": {
11 | "title": "pasta"
12 | }
13 | }
14 | }
15 | ```
16 |
17 | ## Using a parameter within the request body
18 |
19 | ```
20 | GET /recipes/_search
21 | {
22 | "_source": false,
23 | "size": 2,
24 | "query": {
25 | "match": {
26 | "title": "pasta"
27 | }
28 | }
29 | }
30 | ```
--------------------------------------------------------------------------------
/Getting Started/adding-more-nodes-to-the-cluster.md:
--------------------------------------------------------------------------------
1 | # Adding more nodes to the cluster (for development)
2 |
3 | ## Checking the cluster's health
4 |
5 | ```
6 | GET /_cluster/health
7 | ```
8 |
9 | ## Checking the shard distribution
10 |
11 | ```
12 | GET /_cat/shards?v
13 | ```
14 |
15 | ## Generating an enrollment token
16 | When adding a new node to an existing Elasticsearch cluster, we first need to generate an enrollment token.
17 |
18 | ```
19 | # macOS & Linux
20 | bin/elasticsearch-create-enrollment-token --scope node
21 |
22 | # Windows
23 | bin\elasticsearch-create-enrollment-token.bat -s node
24 | ```
25 |
26 | ## Adding a new node to the cluster
27 | To add a new node to an existing cluster, run the following command. Remember to have the working
28 | directory set to the new node's `$ES_HOME` directory (use the `cd` command for this).
29 |
30 | ```
31 | # macOS & Linux
32 | bin/elasticsearch --enrollment-token [INSERT_ENROLLMENT_TOKEN_HERE]
33 |
34 | # Windows
35 | bin\elasticsearch.bat --enrollment-token [INSERT_ENROLLMENT_TOKEN_HERE]
36 | ```
37 |
38 | Once the node has been added, starting up the node again in the future is as simple as
39 | running `bin/elasticsearch` (macOS & Linux) or `bin\elasticsearch.bat` (Windows).
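40 |
41 | For example (a sketch; `/path/to/node` is a placeholder for the new node's home directory):
42 |
43 | ```
44 | # macOS & Linux
45 | cd /path/to/node
46 | bin/elasticsearch
47 |
48 | # Windows
49 | cd C:\Path\To\Node
50 | bin\elasticsearch.bat
51 | ```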
--------------------------------------------------------------------------------
/Getting Started/inspecting-the-cluster.md:
--------------------------------------------------------------------------------
1 | # Inspecting the cluster
2 |
3 | ## Checking the cluster's health
4 |
5 | ```
6 | GET /_cluster/health
7 | ```
8 |
9 | ## Listing the cluster's nodes
10 |
11 | ```
12 | GET /_cat/nodes?v
13 | ```
14 |
15 | ## Listing the cluster's indices
16 |
17 | ```
18 | GET /_cat/indices?v&expand_wildcards=all
19 | ```
--------------------------------------------------------------------------------
/Getting Started/overview-of-node-roles.md:
--------------------------------------------------------------------------------
1 | # Overview of node roles
2 |
3 | ## Listing the cluster's nodes (and their roles)
4 |
5 | ```
6 | GET /_cat/nodes?v
7 | ```
--------------------------------------------------------------------------------
/Getting Started/sending-queries-with-curl.md:
--------------------------------------------------------------------------------
1 | # Sending queries with cURL
2 |
3 | ## Handling self-signed certificates
4 |
5 | Local deployments of Elasticsearch are protected with a self-signed certificate by default, which HTTP clients do not trust.
6 | Sending a request will therefore fail with a certificate error. To fix this, we have a couple of options.
7 | For cloud deployments, simply skip this step.
8 |
9 | ### 1. Skip certificate verification
10 |
11 | One option is to entirely skip the verification of the certificate. This is not exactly best practice,
12 | but if you are just developing with a local cluster, then it might be just fine. To ignore the
13 | certificate, use either the `--insecure` flag or `-k`.
14 |
15 | ```
16 | curl --insecure [...]
17 | curl -k [...]
18 | ```
19 |
20 | ### 2. Provide the CA certificate
21 |
22 | A better approach is to provide the CA certificate so that the TLS certificate is not just ignored.
23 | The path to the file can be supplied with the `--cacert` argument. The CA certificate is typically stored within
24 | the `config/certs` directory, although the `certs` directory may be at the root of your Elasticsearch
25 | home directory (`$ES_HOME`) depending on how you installed Elasticsearch.
26 |
27 | ```
28 | # macOS & Linux
29 | cd /path/to/elasticsearch
30 | curl --cacert config/certs/http_ca.crt [...]
31 |
32 | # Windows
33 | cd C:\Path\To\Elasticsearch
34 | curl --cacert config\certs\http_ca.crt [...]
35 | ```
36 |
37 | Alternatively, you can specify the absolute path to the file.
38 |
39 | ## Authentication
40 | All requests made to Elasticsearch must be authenticated by default.
41 |
42 | ### Local deployments
43 | For local deployments, use the password that was generated for the `elastic` user the first time Elasticsearch started up.
44 |
45 | ```
46 | curl -u elastic [...]
47 | ```
48 |
49 | The above will prompt you to enter the password when running the command. Alternatively, you can enter
50 | the password directly within the command as follows (without the brackets).
51 |
52 | ```
53 | curl -u elastic:[YOUR_PASSWORD_HERE] [...]
54 | ```
55 |
56 | Note that this exposes your password within the terminal, so this is not best practice from a security perspective.
57 |
58 | ### Elastic Cloud
59 | With Elastic Cloud, we should add an `Authorization` header to our requests and include an API key. API keys can be
60 | created within Kibana (Stack Management > Security > API keys). Replace `API_TOKEN` below with the base64 encoded API key.
61 |
62 | ```bash
63 | curl -H "Authorization:ApiKey API_TOKEN" [...]
64 | ```
65 |
66 | ## Adding a request body & `Content-Type` header
67 |
68 | To send data within the request, use the `-d` argument, e.g. for the `match_all` query. Note that using
69 | single quotes does not work on Windows, so each double quote within the JSON object must be escaped.
70 |
71 | ```
72 | # macOS & Linux
73 | curl [...] https://localhost:9200/products/_search -d '{ "query": { "match_all": {} } }'
74 |
75 | # Windows
76 | curl [...] https://localhost:9200/products/_search -d "{ \"query\": { \"match_all\": {} } }"
77 | ```
78 |
79 | When sending data (typically JSON), we need to tell Elasticsearch which type of data we are sending. This
80 | can be done with the `Content-Type` HTTP header. Simply add it with cURL's `-H` argument.
81 |
82 | ```
83 | curl -H "Content-Type:application/json" [...]
84 | ```
85 |
86 | ## Specifying the HTTP verb
87 |
88 | You may also specify the HTTP verb (e.g. `POST`). This is necessary for some endpoints, such as when
89 | indexing documents. `GET` is assumed by default.
90 |
91 | ```
92 | curl -X POST [...]
93 | ```
94 |
95 | ## All together now
96 |
97 | ```
98 | # macOS & Linux
99 | curl --cacert config/certs/http_ca.crt -u elastic -H "Content-Type:application/json" https://localhost:9200/products/_search -d '{ "query": { "match_all": {} } }'
100 |
101 | # Windows
102 | curl --cacert config\certs\http_ca.crt -u elastic -H "Content-Type:application/json" https://localhost:9200/products/_search -d "{ \"query\": { \"match_all\": {} } }"
103 | ```
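104 |
105 | For Elastic Cloud, an equivalent request might look something like the following (a sketch; replace the placeholder endpoint and API key with your own, as described above).
106 |
107 | ```
108 | curl -H "Authorization:ApiKey API_TOKEN" -H "Content-Type:application/json" https://[YOUR_DEPLOYMENT_URL]/products/_search -d '{ "query": { "match_all": {} } }'
109 | ```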
--------------------------------------------------------------------------------
/Getting Started/setting-up-elasticsearch-kibana-macos-linux.md:
--------------------------------------------------------------------------------
1 | # Setting up Elasticsearch & Kibana on macOS & Linux
2 |
3 | ## Extracting the archives
4 |
5 | Both the Elasticsearch and Kibana archives can be extracted by using the below commands.
6 | Alternatively, simply double-clicking them should do the trick.
7 |
8 | ```
9 | cd /path/to/archive/directory
10 | tar -zxf archive.tar.gz
11 | ```
12 |
13 | ## Starting up Elasticsearch
14 |
15 | ```
16 | cd /path/to/elasticsearch
17 | bin/elasticsearch
18 | ```
19 |
20 | ## Resetting the `elastic` user's password
21 |
22 | If you lose the password for the `elastic` user, it can be reset with the following commands.
23 |
24 | ```
25 | cd /path/to/elasticsearch
26 | bin/elasticsearch-reset-password -u elastic
27 | ```
28 |
29 | ## Generating a new Kibana enrollment token
30 |
31 | If you need to generate a new enrollment token for Kibana, this can be done with the following commands.
32 |
33 | ```
34 | cd /path/to/elasticsearch
35 | bin/elasticsearch-create-enrollment-token --scope kibana
36 | ```
37 |
38 | ## Disabling Gatekeeper for Kibana directory
39 |
40 | macOS contains a security feature named Gatekeeper, which prevents Kibana from starting up.
41 | We can disable it for just the Kibana directory, which allows Kibana to start up correctly.
42 | Simply use the following command to do so.
43 |
44 | ```
45 | xattr -d -r com.apple.quarantine /path/to/kibana
46 | ```
47 |
48 | ## Starting up Kibana
49 |
50 | ```
51 | cd /path/to/kibana
52 | bin/kibana
53 | ```
--------------------------------------------------------------------------------
/Getting Started/setting-up-elasticsearch-kibana-windows.md:
--------------------------------------------------------------------------------
1 | # Setting up Elasticsearch & Kibana on Windows
2 |
3 | ## Starting up Elasticsearch
4 |
5 | ```
6 | cd path\to\elasticsearch
7 | bin\elasticsearch.bat
8 | ```
9 |
10 | ## Resetting the `elastic` user's password
11 |
12 | If you lose the password for the `elastic` user, it can be reset with the following commands.
13 |
14 | ```
15 | cd path\to\elasticsearch
16 | bin\elasticsearch-reset-password.bat -u elastic
17 | ```
18 |
19 | ## Generating a new Kibana enrollment token
20 |
21 | If you need to generate a new enrollment token for Kibana, this can be done with the following commands.
22 |
23 | ```
24 | cd path\to\elasticsearch
25 | bin\elasticsearch-create-enrollment-token.bat -s kibana
26 | ```
27 |
28 | ## Starting up Kibana
29 |
30 | ```
31 | cd path\to\kibana
32 | bin\kibana.bat
33 | ```
--------------------------------------------------------------------------------
/Getting Started/sharding-and-scalability.md:
--------------------------------------------------------------------------------
1 | # Sharding and scalability
2 |
3 | ## Listing the cluster's indices
4 |
5 | ```
6 | GET /_cat/indices?v
7 | ```
--------------------------------------------------------------------------------
/Getting Started/understanding-replication.md:
--------------------------------------------------------------------------------
1 | # Understanding replication
2 |
3 | ## Creating a new index
4 |
5 | ```
6 | PUT /pages
7 | ```
8 |
9 | ## Checking the cluster's health
10 |
11 | ```
12 | GET /_cluster/health
13 | ```
14 |
15 | ## Listing the cluster's indices
16 |
17 | ```
18 | GET /_cat/indices?v
19 | ```
20 |
21 | ## Listing the cluster's shards
22 |
23 | ```
24 | GET /_cat/shards?v
25 | ```
--------------------------------------------------------------------------------
/Improving Search Results/adding-synonyms-from-file.md:
--------------------------------------------------------------------------------
1 | # Adding synonyms from file
2 |
3 | ## Adding index with custom analyzer
4 |
5 | ```
6 | PUT /synonyms
7 | {
8 | "settings": {
9 | "analysis": {
10 | "filter": {
11 | "synonym_test": {
12 | "type": "synonym",
13 | "synonyms_path": "analysis/synonyms.txt"
14 | }
15 | },
16 | "analyzer": {
17 | "my_analyzer": {
18 | "tokenizer": "standard",
19 | "filter": [
20 | "lowercase",
21 | "synonym_test"
22 | ]
23 | }
24 | }
25 | }
26 | },
27 | "mappings": {
28 | "properties": {
29 | "description": {
30 | "type": "text",
31 | "analyzer": "my_analyzer"
32 | }
33 | }
34 | }
35 | }
36 | ```
37 |
38 | ## Synonyms file (`config/analysis/synonyms.txt`)
39 |
40 | ```
41 | # This is a comment
42 |
43 | awful => terrible
44 | awesome => great, super
45 | elasticsearch, logstash, kibana => elk
46 | weird, strange
47 | ```
48 |
49 | ## Testing the analyzer
50 |
51 | ```
52 | POST /synonyms/_analyze
53 | {
54 | "analyzer": "my_analyzer",
55 | "text": "Elasticsearch"
56 | }
57 | ```
--------------------------------------------------------------------------------
/Improving Search Results/adding-synonyms.md:
--------------------------------------------------------------------------------
1 | # Adding synonyms
2 |
3 | ## Creating index with custom analyzer
4 |
5 | ```
6 | PUT /synonyms
7 | {
8 | "settings": {
9 | "analysis": {
10 | "filter": {
11 | "synonym_test": {
12 | "type": "synonym",
13 | "synonyms": [
14 | "awful => terrible",
15 | "awesome => great, super",
16 | "elasticsearch, logstash, kibana => elk",
17 | "weird, strange"
18 | ]
19 | }
20 | },
21 | "analyzer": {
22 | "my_analyzer": {
23 | "tokenizer": "standard",
24 | "filter": [
25 | "lowercase",
26 | "synonym_test"
27 | ]
28 | }
29 | }
30 | }
31 | },
32 | "mappings": {
33 | "properties": {
34 | "description": {
35 | "type": "text",
36 | "analyzer": "my_analyzer"
37 | }
38 | }
39 | }
40 | }
41 | ```
42 |
43 | ## Testing the analyzer (with synonyms)
44 |
45 | ```
46 | POST /synonyms/_analyze
47 | {
48 | "analyzer": "my_analyzer",
49 | "text": "awesome"
50 | }
51 | ```
52 |
53 | ```
54 | POST /synonyms/_analyze
55 | {
56 | "analyzer": "my_analyzer",
57 | "text": "Elasticsearch"
58 | }
59 | ```
60 |
61 | ```
62 | POST /synonyms/_analyze
63 | {
64 | "analyzer": "my_analyzer",
65 | "text": "weird"
66 | }
67 | ```
68 |
69 | ```
70 | POST /synonyms/_analyze
71 | {
72 | "analyzer": "my_analyzer",
73 | "text": "Elasticsearch is awesome, but can also seem weird sometimes."
74 | }
75 | ```
76 |
77 | ## Adding a test document
78 |
79 | ```
80 | POST /synonyms/_doc
81 | {
82 | "description": "Elasticsearch is awesome, but can also seem weird sometimes."
83 | }
84 | ```
85 |
86 | ## Searching the index for synonyms
87 |
88 | ```
89 | GET /synonyms/_search
90 | {
91 | "query": {
92 | "match": {
93 | "description": "great"
94 | }
95 | }
96 | }
97 | ```
98 |
99 | ```
100 | GET /synonyms/_search
101 | {
102 | "query": {
103 | "match": {
104 | "description": "awesome"
105 | }
106 | }
107 | }
108 | ```
--------------------------------------------------------------------------------
/Improving Search Results/affecting-relevance-scoring-with-proximity.md:
--------------------------------------------------------------------------------
1 | # Affecting relevance scoring with proximity
2 |
3 | ## A simple `match` query within a `bool` query
4 |
5 | ```
6 | GET /proximity/_search
7 | {
8 | "query": {
9 | "bool": {
10 | "must": [
11 | {
12 | "match": {
13 | "title": {
14 | "query": "spicy sauce"
15 | }
16 | }
17 | }
18 | ]
19 | }
20 | }
21 | }
22 | ```
23 |
24 | ## Boosting relevance based on proximity
25 |
26 | ```
27 | GET /proximity/_search
28 | {
29 | "query": {
30 | "bool": {
31 | "must": [
32 | {
33 | "match": {
34 | "title": {
35 | "query": "spicy sauce"
36 | }
37 | }
38 | }
39 | ],
40 | "should": [
41 | {
42 | "match_phrase": {
43 | "title": {
44 | "query": "spicy sauce"
45 | }
46 | }
47 | }
48 | ]
49 | }
50 | }
51 | }
52 | ```
53 |
54 | ## Adding the `slop` parameter
55 |
56 | ```
57 | GET /proximity/_search
58 | {
59 | "query": {
60 | "bool": {
61 | "must": [
62 | {
63 | "match": {
64 | "title": {
65 | "query": "spicy sauce"
66 | }
67 | }
68 | }
69 | ],
70 | "should": [
71 | {
72 | "match_phrase": {
73 | "title": {
74 | "query": "spicy sauce",
75 | "slop": 5
76 | }
77 | }
78 | }
79 | ]
80 | }
81 | }
82 | }
83 | ```
--------------------------------------------------------------------------------
/Improving Search Results/fuzzy-match-query.md:
--------------------------------------------------------------------------------
1 | # Fuzzy `match` query
2 |
3 | ## Searching with `fuzziness` set to `auto`
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "match": {
10 | "name": {
11 | "query": "l0bster",
12 | "fuzziness": "auto"
13 | }
14 | }
15 | }
16 | }
17 | ```
18 |
19 | ```
20 | GET /products/_search
21 | {
22 | "query": {
23 | "match": {
24 | "name": {
25 | "query": "lobster",
26 | "fuzziness": "auto"
27 | }
28 | }
29 | }
30 | }
31 | ```
32 |
33 | ## Fuzziness is per term (and specifying an integer)
34 |
35 | ```
36 | GET /products/_search
37 | {
38 | "query": {
39 | "match": {
40 | "name": {
41 | "query": "l0bster love",
42 | "operator": "and",
43 | "fuzziness": 1
44 | }
45 | }
46 | }
47 | }
48 | ```
49 |
50 | ## Switching letters around with transpositions
51 |
52 | ```
53 | GET /products/_search
54 | {
55 | "query": {
56 | "match": {
57 | "name": {
58 | "query": "lvie",
59 | "fuzziness": 1
60 | }
61 | }
62 | }
63 | }
64 | ```
65 |
66 | ## Disabling transpositions
67 |
68 | ```
69 | GET /products/_search
70 | {
71 | "query": {
72 | "match": {
73 | "name": {
74 | "query": "lvie",
75 | "fuzziness": 1,
76 | "fuzzy_transpositions": false
77 | }
78 | }
79 | }
80 | }
81 | ```
--------------------------------------------------------------------------------
/Improving Search Results/fuzzy-query.md:
--------------------------------------------------------------------------------
1 | # `fuzzy` query
2 |
3 | ```
4 | GET /products/_search
5 | {
6 | "query": {
7 | "fuzzy": {
8 | "name": {
9 | "value": "LOBSTER",
10 | "fuzziness": "auto"
11 | }
12 | }
13 | }
14 | }
15 | ```
16 |
17 | ```
18 | GET /products/_search
19 | {
20 | "query": {
21 | "fuzzy": {
22 | "name": {
23 | "value": "lobster",
24 | "fuzziness": "auto"
25 | }
26 | }
27 | }
28 | }
29 | ```
--------------------------------------------------------------------------------
/Improving Search Results/highlighting-matches-in-fields.md:
--------------------------------------------------------------------------------
1 | # Highlighting matches in fields
2 |
3 | ## Adding a test document
4 |
5 | ```
6 | PUT /highlighting/_doc/1
7 | {
8 | "description": "Let me tell you a story about Elasticsearch. It's a full-text search engine that is built on Apache Lucene. It's really easy to use, but also packs lots of advanced features that you can use to tweak its searching capabilities. Lots of well-known and established companies use Elasticsearch, and so should you!"
9 | }
10 | ```
11 |
12 | ## Highlighting matches within the `description` field
13 |
14 | ```
15 | GET /highlighting/_search
16 | {
17 | "_source": false,
18 | "query": {
19 | "match": { "description": "Elasticsearch story" }
20 | },
21 | "highlight": {
22 | "fields": {
23 | "description" : {}
24 | }
25 | }
26 | }
27 | ```
28 |
29 | ## Specifying a custom tag
30 |
31 | ```
32 | GET /highlighting/_search
33 | {
34 | "_source": false,
35 | "query": {
36 | "match": { "description": "Elasticsearch story" }
37 | },
38 | "highlight": {
39 | "pre_tags": [ "" ],
40 | "post_tags": [ "" ],
41 | "fields": {
42 | "description" : {}
43 | }
44 | }
45 | }
46 | ```
--------------------------------------------------------------------------------
/Improving Search Results/proximity-searches.md:
--------------------------------------------------------------------------------
1 | # Proximity searches
2 |
3 | ## Adding test documents
4 |
5 | ```
6 | PUT /proximity/_doc/1
7 | {
8 | "title": "Spicy Sauce"
9 | }
10 | ```
11 |
12 | ```
13 | PUT /proximity/_doc/2
14 | {
15 | "title": "Spicy Tomato Sauce"
16 | }
17 | ```
18 |
19 | ```
20 | PUT /proximity/_doc/3
21 | {
22 | "title": "Spicy Tomato and Garlic Sauce"
23 | }
24 | ```
25 |
26 | ```
27 | PUT /proximity/_doc/4
28 | {
29 | "title": "Tomato Sauce (spicy)"
30 | }
31 | ```
32 |
33 | ```
34 | PUT /proximity/_doc/5
35 | {
36 | "title": "Spicy and very delicious Tomato Sauce"
37 | }
38 | ```
39 |
40 | ## Adding the `slop` parameter to a `match_phrase` query
41 |
42 | ```
43 | GET /proximity/_search
44 | {
45 | "query": {
46 | "match_phrase": {
47 | "title": {
48 | "query": "spicy sauce",
49 | "slop": 1
50 | }
51 | }
52 | }
53 | }
54 | ```
55 |
56 | ```
57 | GET /proximity/_search
58 | {
59 | "query": {
60 | "match_phrase": {
61 | "title": {
62 | "query": "spicy sauce",
63 | "slop": 2
64 | }
65 | }
66 | }
67 | }
68 | ```
--------------------------------------------------------------------------------
/Improving Search Results/stemming.md:
--------------------------------------------------------------------------------
1 | # Stemming
2 |
3 | ## Creating a test index
4 |
5 | ```
6 | PUT /stemming_test
7 | {
8 | "settings": {
9 | "analysis": {
10 | "filter": {
11 | "synonym_test": {
12 | "type": "synonym",
13 | "synonyms": [
14 | "firm => company",
15 | "love, enjoy"
16 | ]
17 | },
18 | "stemmer_test" : {
19 | "type" : "stemmer",
20 | "name" : "english"
21 | }
22 | },
23 | "analyzer": {
24 | "my_analyzer": {
25 | "tokenizer": "standard",
26 | "filter": [
27 | "lowercase",
28 | "synonym_test",
29 | "stemmer_test"
30 | ]
31 | }
32 | }
33 | }
34 | },
35 | "mappings": {
36 | "properties": {
37 | "description": {
38 | "type": "text",
39 | "analyzer": "my_analyzer"
40 | }
41 | }
42 | }
43 | }
44 | ```
45 |
46 | ## Adding a test document
47 |
48 | ```
49 | PUT /stemming_test/_doc/1
50 | {
51 | "description": "I love working for my firm!"
52 | }
53 | ```
54 |
55 | ## Matching the document with the base word (`work`)
56 |
57 | ```
58 | GET /stemming_test/_search
59 | {
60 | "query": {
61 | "match": {
62 | "description": "enjoy work"
63 | }
64 | }
65 | }
66 | ```
67 |
68 | ## The query is stemmed, so the document still matches
69 |
70 | ```
71 | GET /stemming_test/_search
72 | {
73 | "query": {
74 | "match": {
75 | "description": "love working"
76 | }
77 | }
78 | }
79 | ```
80 |
81 | ## Synonyms and stemmed words are still highlighted
82 |
83 | ```
84 | GET /stemming_test/_search
85 | {
86 | "query": {
87 | "match": {
88 | "description": "enjoy work"
89 | }
90 | },
91 | "highlight": {
92 | "fields": {
93 | "description": {}
94 | }
95 | }
96 | }
97 | ```
--------------------------------------------------------------------------------
/Joining Queries/add-departments-test-data.md:
--------------------------------------------------------------------------------
1 | # Add departments test data
2 |
3 | ## Create a new index
4 |
5 | ```
6 | PUT /department
7 | {
8 | "mappings": {
9 | "properties": {
10 | "name": {
11 | "type": "text",
12 | "fields": {
13 | "keyword": {
14 | "type": "keyword"
15 | }
16 | }
17 | },
18 | "employees": {
19 | "type": "nested"
20 | }
21 | }
22 | }
23 | }
24 | ```
25 |
26 | ## Add two test documents
27 |
28 | ```
29 | PUT /department/_doc/1
30 | {
31 | "name": "Development",
32 | "employees": [
33 | {
34 | "name": "Eric Green",
35 | "age": 39,
36 | "gender": "M",
37 | "position": "Big Data Specialist"
38 | },
39 | {
40 | "name": "James Taylor",
41 | "age": 27,
42 | "gender": "M",
43 | "position": "Software Developer"
44 | },
45 | {
46 | "name": "Gary Jenkins",
47 | "age": 21,
48 | "gender": "M",
49 | "position": "Intern"
50 | },
51 | {
52 | "name": "Julie Powell",
53 | "age": 26,
54 | "gender": "F",
55 | "position": "Intern"
56 | },
57 | {
58 | "name": "Benjamin Smith",
59 | "age": 46,
60 | "gender": "M",
61 | "position": "Senior Software Engineer"
62 | }
63 | ]
64 | }
65 | ```
66 |
67 | ```
68 | PUT /department/_doc/2
69 | {
70 | "name": "HR & Marketing",
71 | "employees": [
72 | {
73 | "name": "Patricia Lewis",
74 | "age": 42,
75 | "gender": "F",
76 | "position": "Senior Marketing Manager"
77 | },
78 | {
79 | "name": "Maria Anderson",
80 | "age": 56,
81 | "gender": "F",
82 | "position": "Head of HR"
83 | },
84 | {
85 | "name": "Margaret Harris",
86 | "age": 19,
87 | "gender": "F",
88 | "position": "Intern"
89 | },
90 | {
91 | "name": "Ryan Nelson",
92 | "age": 31,
93 | "gender": "M",
94 | "position": "Marketing Manager"
95 | },
96 | {
97 | "name": "Kathy Williams",
98 | "age": 49,
99 | "gender": "F",
100 | "position": "Senior Marketing Manager"
101 | },
102 | {
103 | "name": "Jacqueline Hill",
104 | "age": 28,
105 | "gender": "F",
106 | "position": "Junior Marketing Manager"
107 | },
108 | {
109 | "name": "Donald Morris",
110 | "age": 39,
111 | "gender": "M",
112 | "position": "SEO Specialist"
113 | },
114 | {
115 | "name": "Evelyn Henderson",
116 | "age": 24,
117 | "gender": "F",
118 | "position": "Intern"
119 | },
120 | {
121 | "name": "Earl Moore",
122 | "age": 21,
123 | "gender": "M",
124 | "position": "Junior SEO Specialist"
125 | },
126 | {
127 | "name": "Phillip Sanchez",
128 | "age": 35,
129 | "gender": "M",
130 | "position": "SEM Specialist"
131 | }
132 | ]
133 | }
134 | ```
--------------------------------------------------------------------------------
/Joining Queries/adding-documents.md:
--------------------------------------------------------------------------------
1 | # Adding documents
2 |
3 | ## Adding departments
4 |
5 | ```
6 | PUT /department/_doc/1
7 | {
8 | "name": "Development",
9 | "join_field": "department"
10 | }
11 | ```
12 |
13 | ```
14 | PUT /department/_doc/2
15 | {
16 | "name": "Marketing",
17 | "join_field": "department"
18 | }
19 | ```
20 |
21 | ## Adding employees for departments
22 |
23 | ```
24 | PUT /department/_doc/3?routing=1
25 | {
26 | "name": "Bo Andersen",
27 | "age": 28,
28 | "gender": "M",
29 | "join_field": {
30 | "name": "employee",
31 | "parent": 1
32 | }
33 | }
34 | ```
35 |
36 | ```
37 | PUT /department/_doc/4?routing=2
38 | {
39 | "name": "John Doe",
40 | "age": 44,
41 | "gender": "M",
42 | "join_field": {
43 | "name": "employee",
44 | "parent": 2
45 | }
46 | }
47 | ```
48 |
49 | ```
50 | PUT /department/_doc/5?routing=1
51 | {
52 | "name": "James Evans",
53 | "age": 32,
54 | "gender": "M",
55 | "join_field": {
56 | "name": "employee",
57 | "parent": 1
58 | }
59 | }
60 | ```
61 |
62 | ```
63 | PUT /department/_doc/6?routing=1
64 | {
65 | "name": "Daniel Harris",
66 | "age": 52,
67 | "gender": "M",
68 | "join_field": {
69 | "name": "employee",
70 | "parent": 1
71 | }
72 | }
73 | ```
74 |
75 | ```
76 | PUT /department/_doc/7?routing=2
77 | {
78 | "name": "Jane Park",
79 | "age": 23,
80 | "gender": "F",
81 | "join_field": {
82 | "name": "employee",
83 | "parent": 2
84 | }
85 | }
86 | ```
87 |
88 | ```
89 | PUT /department/_doc/8?routing=1
90 | {
91 | "name": "Christina Parker",
92 | "age": 29,
93 | "gender": "F",
94 | "join_field": {
95 | "name": "employee",
96 | "parent": 1
97 | }
98 | }
99 | ```
--------------------------------------------------------------------------------
/Joining Queries/mapping-document-relationships.md:
--------------------------------------------------------------------------------
1 | # Mapping document relationships
2 |
3 | ```
4 | PUT /department/_mapping
5 | {
6 | "properties": {
7 | "join_field": {
8 | "type": "join",
9 | "relations": {
10 | "department": "employee"
11 | }
12 | }
13 | }
14 | }
15 | ```
16 |
--------------------------------------------------------------------------------
/Joining Queries/multi-level-relations.md:
--------------------------------------------------------------------------------
1 | # Multi-level relations
2 |
3 | ## Creating the index with mapping
4 |
5 | ```
6 | PUT /company
7 | {
8 | "mappings": {
9 | "properties": {
10 | "join_field": {
11 | "type": "join",
12 | "relations": {
13 | "company": ["department", "supplier"],
14 | "department": "employee"
15 | }
16 | }
17 | }
18 | }
19 | }
20 | ```
21 |
22 | ## Adding a company
23 |
24 | ```
25 | PUT /company/_doc/1
26 | {
27 | "name": "My Company Inc.",
28 | "join_field": "company"
29 | }
30 | ```
31 |
32 | ## Adding a department
33 |
34 | ```
35 | PUT /company/_doc/2?routing=1
36 | {
37 | "name": "Development",
38 | "join_field": {
39 | "name": "department",
40 | "parent": 1
41 | }
42 | }
43 | ```
44 |
45 | ## Adding an employee
46 |
47 | ```
48 | PUT /company/_doc/3?routing=1
49 | {
50 | "name": "Bo Andersen",
51 | "join_field": {
52 | "name": "employee",
53 | "parent": 2
54 | }
55 | }
56 | ```
57 |
58 | ## Adding some more test data
59 | ```
60 | PUT /company/_doc/4
61 | {
62 | "name": "Another Company, Inc.",
63 | "join_field": "company"
64 | }
65 | ```
66 |
67 | ```
68 | PUT /company/_doc/5?routing=4
69 | {
70 | "name": "Marketing",
71 | "join_field": {
72 | "name": "department",
73 | "parent": 4
74 | }
75 | }
76 | ```
77 |
78 | ```
79 | PUT /company/_doc/6?routing=4
80 | {
81 | "name": "John Doe",
82 | "join_field": {
83 | "name": "employee",
84 | "parent": 5
85 | }
86 | }
87 | ```
88 |
89 | ## Example of querying multi-level relations
90 |
91 | ```
92 | GET /company/_search
93 | {
94 | "query": {
95 | "has_child": {
96 | "type": "department",
97 | "query": {
98 | "has_child": {
99 | "type": "employee",
100 | "query": {
101 | "term": {
102 | "name.keyword": "John Doe"
103 | }
104 | }
105 | }
106 | }
107 | }
108 | }
109 | }
110 | ```
--------------------------------------------------------------------------------
/Joining Queries/parent-child-inner-hits.md:
--------------------------------------------------------------------------------
1 | # Parent/child inner hits
2 |
3 | ## Including inner hits for the `has_child` query
4 |
5 | ```
6 | GET /department/_search
7 | {
8 | "query": {
9 | "has_child": {
10 | "type": "employee",
11 | "inner_hits": {},
12 | "query": {
13 | "bool": {
14 | "must": [
15 | {
16 | "range": {
17 | "age": {
18 | "gte": 50
19 | }
20 | }
21 | }
22 | ],
23 | "should": [
24 | {
25 | "term": {
26 | "gender.keyword": "M"
27 | }
28 | }
29 | ]
30 | }
31 | }
32 | }
33 | }
34 | }
35 | ```
36 |
37 | ## Including inner hits for the `has_parent` query
38 |
39 | ```
40 | GET /department/_search
41 | {
42 | "query": {
43 | "has_parent": {
44 | "inner_hits": {},
45 | "parent_type": "department",
46 | "query": {
47 | "term": {
48 | "name.keyword": "Development"
49 | }
50 | }
51 | }
52 | }
53 | }
54 | ```
--------------------------------------------------------------------------------
/Joining Queries/querying-by-parent-id.md:
--------------------------------------------------------------------------------
1 | # Querying by parent ID
2 |
3 | ```
4 | GET /department/_search
5 | {
6 | "query": {
7 | "parent_id": {
8 | "type": "employee",
9 | "id": 1
10 | }
11 | }
12 | }
13 | ```
--------------------------------------------------------------------------------
/Joining Queries/querying-child-documents-by-parent.md:
--------------------------------------------------------------------------------
1 | # Querying child documents by parent
2 |
3 | ## Matching child documents by parent criteria
4 |
5 | ```
6 | GET /department/_search
7 | {
8 | "query": {
9 | "has_parent": {
10 | "parent_type": "department",
11 | "query": {
12 | "term": {
13 | "name.keyword": "Development"
14 | }
15 | }
16 | }
17 | }
18 | }
19 | ```
20 |
21 | ## Incorporating the parent documents' relevance scores
22 |
23 | ```
24 | GET /department/_search
25 | {
26 | "query": {
27 | "has_parent": {
28 | "parent_type": "department",
29 | "score": true,
30 | "query": {
31 | "term": {
32 | "name.keyword": "Development"
33 | }
34 | }
35 | }
36 | }
37 | }
38 | ```
--------------------------------------------------------------------------------
/Joining Queries/querying-parent-by-child-documents.md:
--------------------------------------------------------------------------------
1 | # Querying parent by child documents
2 |
3 | ## Finding parents with child documents matching a `bool` query
4 |
5 | ```
6 | GET /department/_search
7 | {
8 | "query": {
9 | "has_child": {
10 | "type": "employee",
11 | "query": {
12 | "bool": {
13 | "must": [
14 | {
15 | "range": {
16 | "age": {
17 | "gte": 50
18 | }
19 | }
20 | }
21 | ],
22 | "should": [
23 | {
24 | "term": {
25 | "gender.keyword": "M"
26 | }
27 | }
28 | ]
29 | }
30 | }
31 | }
32 | }
33 | }
34 | ```
35 |
36 | ## Taking relevance scores into account with `score_mode`
37 |
38 | ```
39 | GET /department/_search
40 | {
41 | "query": {
42 | "has_child": {
43 | "type": "employee",
44 | "score_mode": "sum",
45 | "query": {
46 | "bool": {
47 | "must": [
48 | {
49 | "range": {
50 | "age": {
51 | "gte": 50
52 | }
53 | }
54 | }
55 | ],
56 | "should": [
57 | {
58 | "term": {
59 | "gender.keyword": "M"
60 | }
61 | }
62 | ]
63 | }
64 | }
65 | }
66 | }
67 | }
68 | ```
69 |
70 | ## Specifying the minimum and maximum number of children
71 |
72 | ```
73 | GET /department/_search
74 | {
75 | "query": {
76 | "has_child": {
77 | "type": "employee",
78 | "score_mode": "sum",
79 | "min_children": 2,
80 | "max_children": 5,
81 | "query": {
82 | "bool": {
83 | "must": [
84 | {
85 | "range": {
86 | "age": {
87 | "gte": 50
88 | }
89 | }
90 | }
91 | ],
92 | "should": [
93 | {
94 | "term": {
95 | "gender.keyword": "M"
96 | }
97 | }
98 | ]
99 | }
100 | }
101 | }
102 | }
103 | }
104 | ```
--------------------------------------------------------------------------------
/Joining Queries/terms-lookup-mechanism.md:
--------------------------------------------------------------------------------
1 | # Terms lookup mechanism
2 |
3 | ## Adding test data
4 |
5 | ```
6 | PUT /users/_doc/1
7 | {
8 | "name": "John Roberts",
9 | "following" : [2, 3]
10 | }
11 | ```
12 |
13 | ```
14 | PUT /users/_doc/2
15 | {
16 | "name": "Elizabeth Ross",
17 | "following" : []
18 | }
19 | ```
20 |
21 | ```
22 | PUT /users/_doc/3
23 | {
24 | "name": "Jeremy Brooks",
25 | "following" : [1, 2]
26 | }
27 | ```
28 |
29 | ```
30 | PUT /users/_doc/4
31 | {
32 | "name": "Diana Moore",
33 | "following" : [3, 1]
34 | }
35 | ```
36 |
37 | ```
38 | PUT /stories/_doc/1
39 | {
40 | "user": 3,
41 | "content": "Wow look, a penguin!"
42 | }
43 | ```
44 |
45 | ```
46 | PUT /stories/_doc/2
47 | {
48 | "user": 1,
49 | "content": "Just another day at the office... #coffee"
50 | }
51 | ```
52 |
53 | ```
54 | PUT /stories/_doc/3
55 | {
56 | "user": 1,
57 | "content": "Making search great again! #elasticsearch #elk"
58 | }
59 | ```
60 |
61 | ```
62 | PUT /stories/_doc/4
63 | {
64 | "user": 4,
65 | "content": "Had a blast today! #rollercoaster #amusementpark"
66 | }
67 | ```
68 |
69 | ```
70 | PUT /stories/_doc/5
71 | {
72 | "user": 4,
73 | "content": "Yay, I just got hired as an Elasticsearch consultant - so excited!"
74 | }
75 | ```
76 |
77 | ```
78 | PUT /stories/_doc/6
79 | {
80 | "user": 2,
81 | "content": "Chilling at the beach @ Greece #vacation #goodtimes"
82 | }
83 | ```
84 |
85 | ## Querying stories from the users that a given user is following
86 |
87 | ```
88 | GET /stories/_search
89 | {
90 | "query": {
91 | "terms": {
92 | "user": {
93 | "index": "users",
94 | "id": "1",
95 | "path": "following"
96 | }
97 | }
98 | }
99 | }
100 | ```
101 |
--------------------------------------------------------------------------------
/LICENSE.md:
--------------------------------------------------------------------------------
1 | Apache License
2 | Version 2.0, January 2004
3 | http://www.apache.org/licenses/
4 |
5 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
6 |
7 | 1. Definitions.
8 |
9 | "License" shall mean the terms and conditions for use, reproduction,
10 | and distribution as defined by Sections 1 through 9 of this document.
11 |
12 | "Licensor" shall mean the copyright owner or entity authorized by
13 | the copyright owner that is granting the License.
14 |
15 | "Legal Entity" shall mean the union of the acting entity and all
16 | other entities that control, are controlled by, or are under common
17 | control with that entity. For the purposes of this definition,
18 | "control" means (i) the power, direct or indirect, to cause the
19 | direction or management of such entity, whether by contract or
20 | otherwise, or (ii) ownership of fifty percent (50%) or more of the
21 | outstanding shares, or (iii) beneficial ownership of such entity.
22 |
23 | "You" (or "Your") shall mean an individual or Legal Entity
24 | exercising permissions granted by this License.
25 |
26 | "Source" form shall mean the preferred form for making modifications,
27 | including but not limited to software source code, documentation
28 | source, and configuration files.
29 |
30 | "Object" form shall mean any form resulting from mechanical
31 | transformation or translation of a Source form, including but
32 | not limited to compiled object code, generated documentation,
33 | and conversions to other media types.
34 |
35 | "Work" shall mean the work of authorship, whether in Source or
36 | Object form, made available under the License, as indicated by a
37 | copyright notice that is included in or attached to the work
38 | (an example is provided in the Appendix below).
39 |
40 | "Derivative Works" shall mean any work, whether in Source or Object
41 | form, that is based on (or derived from) the Work and for which the
42 | editorial revisions, annotations, elaborations, or other modifications
43 | represent, as a whole, an original work of authorship. For the purposes
44 | of this License, Derivative Works shall not include works that remain
45 | separable from, or merely link (or bind by name) to the interfaces of,
46 | the Work and Derivative Works thereof.
47 |
48 | "Contribution" shall mean any work of authorship, including
49 | the original version of the Work and any modifications or additions
50 | to that Work or Derivative Works thereof, that is intentionally
51 | submitted to Licensor for inclusion in the Work by the copyright owner
52 | or by an individual or Legal Entity authorized to submit on behalf of
53 | the copyright owner. For the purposes of this definition, "submitted"
54 | means any form of electronic, verbal, or written communication sent
55 | to the Licensor or its representatives, including but not limited to
56 | communication on electronic mailing lists, source code control systems,
57 | and issue tracking systems that are managed by, or on behalf of, the
58 | Licensor for the purpose of discussing and improving the Work, but
59 | excluding communication that is conspicuously marked or otherwise
60 | designated in writing by the copyright owner as "Not a Contribution."
61 |
62 | "Contributor" shall mean Licensor and any individual or Legal Entity
63 | on behalf of whom a Contribution has been received by Licensor and
64 | subsequently incorporated within the Work.
65 |
66 | 2. Grant of Copyright License. Subject to the terms and conditions of
67 | this License, each Contributor hereby grants to You a perpetual,
68 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
69 | copyright license to reproduce, prepare Derivative Works of,
70 | publicly display, publicly perform, sublicense, and distribute the
71 | Work and such Derivative Works in Source or Object form.
72 |
73 | 3. Grant of Patent License. Subject to the terms and conditions of
74 | this License, each Contributor hereby grants to You a perpetual,
75 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable
76 | (except as stated in this section) patent license to make, have made,
77 | use, offer to sell, sell, import, and otherwise transfer the Work,
78 | where such license applies only to those patent claims licensable
79 | by such Contributor that are necessarily infringed by their
80 | Contribution(s) alone or by combination of their Contribution(s)
81 | with the Work to which such Contribution(s) was submitted. If You
82 | institute patent litigation against any entity (including a
83 | cross-claim or counterclaim in a lawsuit) alleging that the Work
84 | or a Contribution incorporated within the Work constitutes direct
85 | or contributory patent infringement, then any patent licenses
86 | granted to You under this License for that Work shall terminate
87 | as of the date such litigation is filed.
88 |
89 | 4. Redistribution. You may reproduce and distribute copies of the
90 | Work or Derivative Works thereof in any medium, with or without
91 | modifications, and in Source or Object form, provided that You
92 | meet the following conditions:
93 |
94 | (a) You must give any other recipients of the Work or
95 | Derivative Works a copy of this License; and
96 |
97 | (b) You must cause any modified files to carry prominent notices
98 | stating that You changed the files; and
99 |
100 | (c) You must retain, in the Source form of any Derivative Works
101 | that You distribute, all copyright, patent, trademark, and
102 | attribution notices from the Source form of the Work,
103 | excluding those notices that do not pertain to any part of
104 | the Derivative Works; and
105 |
106 | (d) If the Work includes a "NOTICE" text file as part of its
107 | distribution, then any Derivative Works that You distribute must
108 | include a readable copy of the attribution notices contained
109 | within such NOTICE file, excluding those notices that do not
110 | pertain to any part of the Derivative Works, in at least one
111 | of the following places: within a NOTICE text file distributed
112 | as part of the Derivative Works; within the Source form or
113 | documentation, if provided along with the Derivative Works; or,
114 | within a display generated by the Derivative Works, if and
115 | wherever such third-party notices normally appear. The contents
116 | of the NOTICE file are for informational purposes only and
117 | do not modify the License. You may add Your own attribution
118 | notices within Derivative Works that You distribute, alongside
119 | or as an addendum to the NOTICE text from the Work, provided
120 | that such additional attribution notices cannot be construed
121 | as modifying the License.
122 |
123 | You may add Your own copyright statement to Your modifications and
124 | may provide additional or different license terms and conditions
125 | for use, reproduction, or distribution of Your modifications, or
126 | for any such Derivative Works as a whole, provided Your use,
127 | reproduction, and distribution of the Work otherwise complies with
128 | the conditions stated in this License.
129 |
130 | 5. Submission of Contributions. Unless You explicitly state otherwise,
131 | any Contribution intentionally submitted for inclusion in the Work
132 | by You to the Licensor shall be under the terms and conditions of
133 | this License, without any additional terms or conditions.
134 | Notwithstanding the above, nothing herein shall supersede or modify
135 | the terms of any separate license agreement you may have executed
136 | with Licensor regarding such Contributions.
137 |
138 | 6. Trademarks. This License does not grant permission to use the trade
139 | names, trademarks, service marks, or product names of the Licensor,
140 | except as required for reasonable and customary use in describing the
141 | origin of the Work and reproducing the content of the NOTICE file.
142 |
143 | 7. Disclaimer of Warranty. Unless required by applicable law or
144 | agreed to in writing, Licensor provides the Work (and each
145 | Contributor provides its Contributions) on an "AS IS" BASIS,
146 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
147 | implied, including, without limitation, any warranties or conditions
148 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
149 | PARTICULAR PURPOSE. You are solely responsible for determining the
150 | appropriateness of using or redistributing the Work and assume any
151 | risks associated with Your exercise of permissions under this License.
152 |
153 | 8. Limitation of Liability. In no event and under no legal theory,
154 | whether in tort (including negligence), contract, or otherwise,
155 | unless required by applicable law (such as deliberate and grossly
156 | negligent acts) or agreed to in writing, shall any Contributor be
157 | liable to You for damages, including any direct, indirect, special,
158 | incidental, or consequential damages of any character arising as a
159 | result of this License or out of the use or inability to use the
160 | Work (including but not limited to damages for loss of goodwill,
161 | work stoppage, computer failure or malfunction, or any and all
162 | other commercial damages or losses), even if such Contributor
163 | has been advised of the possibility of such damages.
164 |
165 | 9. Accepting Warranty or Additional Liability. While redistributing
166 | the Work or Derivative Works thereof, You may choose to offer,
167 | and charge a fee for, acceptance of support, warranty, indemnity,
168 | or other liability obligations and/or rights consistent with this
169 | License. However, in accepting such obligations, You may act only
170 | on Your own behalf and on Your sole responsibility, not on behalf
171 | of any other Contributor, and only if You agree to indemnify,
172 | defend, and hold each Contributor harmless for any liability
173 | incurred by, or claims asserted against, such Contributor by reason
174 | of your accepting any such warranty or additional liability.
175 |
176 | END OF TERMS AND CONDITIONS
177 |
178 | APPENDIX: How to apply the Apache License to your work.
179 |
180 | To apply the Apache License to your work, attach the following
181 | boilerplate notice, with the fields enclosed by brackets "{}"
182 | replaced with your own identifying information. (Don't include
183 | the brackets!) The text should be enclosed in the appropriate
184 | comment syntax for the file format. We also recommend that a
185 | file or class name and description of purpose be included on the
186 | same "printed page" as the copyright notice for easier
187 | identification within third-party archives.
188 |
189 | Copyright 2020 Coding Explained
190 |
191 | Licensed under the Apache License, Version 2.0 (the "License");
192 | you may not use this file except in compliance with the License.
193 | You may obtain a copy of the License at
194 |
195 | http://www.apache.org/licenses/LICENSE-2.0
196 |
197 | Unless required by applicable law or agreed to in writing, software
198 | distributed under the License is distributed on an "AS IS" BASIS,
199 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
200 | See the License for the specific language governing permissions and
201 | limitations under the License.
--------------------------------------------------------------------------------
/Managing Documents/batch-processing.md:
--------------------------------------------------------------------------------
1 | # Batch processing
2 |
3 | ## Indexing documents
4 |
5 | ```
6 | POST /_bulk
7 | { "index": { "_index": "products", "_id": 200 } }
8 | { "name": "Espresso Machine", "price": 199, "in_stock": 5 }
9 | { "create": { "_index": "products", "_id": 201 } }
10 | { "name": "Milk Frother", "price": 149, "in_stock": 14 }
11 | ```
12 |
13 | ## Updating and deleting documents
14 |
15 | ```
16 | POST /_bulk
17 | { "update": { "_index": "products", "_id": 201 } }
18 | { "doc": { "price": 129 } }
19 | { "delete": { "_index": "products", "_id": 200 } }
20 | ```
21 |
22 | ## Specifying the index name in the request path
23 |
24 | ```
25 | POST /products/_bulk
26 | { "update": { "_id": 201 } }
27 | { "doc": { "price": 129 } }
28 | { "delete": { "_id": 200 } }
29 | ```
30 |
31 | ## Retrieving all documents
32 |
33 | ```
34 | GET /products/_search
35 | {
36 | "query": {
37 | "match_all": {}
38 | }
39 | }
40 | ```
--------------------------------------------------------------------------------
/Managing Documents/creating-and-deleting-indices.md:
--------------------------------------------------------------------------------
1 | # Creating & Deleting Indices
2 |
3 | ## Deleting an index
4 |
5 | ```
6 | DELETE /pages
7 | ```
8 |
9 | ## Creating an index (with settings)
10 |
11 | ```
12 | PUT /products
13 | {
14 | "settings": {
15 | "number_of_shards": 2,
16 | "number_of_replicas": 2
17 | }
18 | }
19 | ```
--------------------------------------------------------------------------------
/Managing Documents/delete-by-query.md:
--------------------------------------------------------------------------------
1 | # Delete by query
2 |
3 | ## Deleting documents that match a given query
4 |
5 | ```
6 | POST /products/_delete_by_query
7 | {
8 | "query": {
9 | "match_all": { }
10 | }
11 | }
12 | ```
13 |
14 | ## Ignoring (but counting) version conflicts
15 |
16 | The `conflicts` key may be added as a query parameter instead, i.e. `?conflicts=proceed`.
17 |
18 | ```
19 | POST /products/_delete_by_query
20 | {
21 | "conflicts": "proceed",
22 | "query": {
23 | "match_all": { }
24 | }
25 | }
26 | ```
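
## Supplying `conflicts` as a query parameter

For reference, here is a quick sketch of the same request with the `conflicts` option supplied as a query parameter instead of within the request body:

```
POST /products/_delete_by_query?conflicts=proceed
{
  "query": {
    "match_all": { }
  }
}
```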
--------------------------------------------------------------------------------
/Managing Documents/deleting-documents.md:
--------------------------------------------------------------------------------
1 | # Deleting documents
2 |
3 | ```
4 | DELETE /products/_doc/101
5 | ```
--------------------------------------------------------------------------------
/Managing Documents/importing-data-with-curl.md:
--------------------------------------------------------------------------------
1 | # Importing data with cURL
2 |
3 | ## Navigating to bulk file directory
4 |
5 | ```
6 | cd /path/to/data/file/directory
7 | ```
8 |
9 | ### Examples
10 | ```
11 | # macOS and Linux
12 | cd ~/Desktop
13 |
14 | # Windows
15 | cd C:\Users\[your_username]\Desktop
16 | ```
17 |
18 | ## Importing data into local clusters
19 |
20 | ```
21 | # Without CA certificate validation. This is fine for development clusters, but don't do this in production!
22 | curl -k -u elastic -H "Content-Type:application/x-ndjson" -XPOST https://localhost:9200/products/_bulk --data-binary "@products-bulk.json"
23 |
24 | # With CA certificate validation. The certificate is located at $ES_HOME/config/certs/http_ca.crt
25 | curl --cacert /path/to/http_ca.crt -u elastic -H "Content-Type:application/x-ndjson" -XPOST https://localhost:9200/products/_bulk --data-binary "@products-bulk.json"
26 | ```
27 |
28 | ## Importing data into Elastic Cloud
29 |
30 | First, create an API key within Kibana (Stack Management > Security > API keys). Replace `API_TOKEN` below with the base64 encoded API key.
31 |
32 | ```bash
33 | curl -H "Content-Type:application/x-ndjson" -H "Authorization:ApiKey API_TOKEN" -XPOST https://elastic-cloud-endpoint.com/products/_bulk --data-binary "@products-bulk.json"
34 | ```
35 |
--------------------------------------------------------------------------------
/Managing Documents/indexing-documents.md:
--------------------------------------------------------------------------------
1 | # Indexing documents
2 |
3 | ## Indexing a document with an auto-generated ID
4 |
5 | ```
6 | POST /products/_doc
7 | {
8 | "name": "Coffee Maker",
9 | "price": 64,
10 | "in_stock": 10
11 | }
12 | ```
13 |
14 | ## Indexing a document with a custom ID
15 |
16 | ```
17 | PUT /products/_doc/100
18 | {
19 | "name": "Toaster",
20 | "price": 49,
21 | "in_stock": 4
22 | }
23 | ```
--------------------------------------------------------------------------------
/Managing Documents/optimistic-concurrency-control.md:
--------------------------------------------------------------------------------
1 | # Optimistic concurrency control
2 |
3 | ## Retrieve the document (and its primary term and sequence number)
4 | ```
5 | GET /products/_doc/100
6 | ```
7 |
8 | ## Update the `in_stock` field only if the document has not been updated since retrieving it
9 | ```
10 | POST /products/_update/100?if_primary_term=X&if_seq_no=X
11 | {
12 | "doc": {
13 | "in_stock": 123
14 | }
15 | }
16 | ```
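
## Example with concrete values

The `X` placeholders above should be replaced with the `_primary_term` and `_seq_no` values returned by the retrieval request. As a purely hypothetical example, if the document was retrieved with a primary term of `1` and a sequence number of `5`, the update would look as follows:

```
POST /products/_update/100?if_primary_term=1&if_seq_no=5
{
  "doc": {
    "in_stock": 123
  }
}
```

If the document has been updated in the meantime, Elasticsearch rejects the request with a version conflict error, in which case the document should be retrieved again and the update retried.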
--------------------------------------------------------------------------------
/Managing Documents/replacing-documents.md:
--------------------------------------------------------------------------------
1 | # Replacing documents
2 |
3 | ```
4 | PUT /products/_doc/100
5 | {
6 | "name": "Toaster",
7 | "price": 79,
8 | "in_stock": 4
9 | }
10 | ```
--------------------------------------------------------------------------------
/Managing Documents/retrieving-documents-by-id.md:
--------------------------------------------------------------------------------
1 | # Retrieving documents by ID
2 |
3 | ```
4 | GET /products/_doc/100
5 | ```
--------------------------------------------------------------------------------
/Managing Documents/scripted-updates.md:
--------------------------------------------------------------------------------
1 | # Scripted updates
2 |
3 | ## Reducing the current value of `in_stock` by one
4 |
5 | ```
6 | POST /products/_update/100
7 | {
8 | "script": {
9 | "source": "ctx._source.in_stock--"
10 | }
11 | }
12 | ```
13 |
14 | ## Assigning an arbitrary value to `in_stock`
15 |
16 | ```
17 | POST /products/_update/100
18 | {
19 | "script": {
20 | "source": "ctx._source.in_stock = 10"
21 | }
22 | }
23 | ```
24 |
25 | ## Using parameters within scripts
26 |
27 | ```
28 | POST /products/_update/100
29 | {
30 | "script": {
31 | "source": "ctx._source.in_stock -= params.quantity",
32 | "params": {
33 | "quantity": 4
34 | }
35 | }
36 | }
37 | ```
38 |
39 | ## Conditionally setting the operation to `noop`
40 |
41 | ```
42 | POST /products/_update/100
43 | {
44 | "script": {
45 | "source": """
46 | if (ctx._source.in_stock == 0) {
47 | ctx.op = 'noop';
48 | }
49 |
50 | ctx._source.in_stock--;
51 | """
52 | }
53 | }
54 | ```
55 |
56 | ## Conditionally update a field value
57 |
58 | ```
59 | POST /products/_update/100
60 | {
61 | "script": {
62 | "source": """
63 | if (ctx._source.in_stock > 0) {
64 | ctx._source.in_stock--;
65 | }
66 | """
67 | }
68 | }
69 | ```
70 |
71 | ## Conditionally delete a document
72 |
73 | ```
74 | POST /products/_update/100
75 | {
76 | "script": {
77 | "source": """
78 | if (ctx._source.in_stock < 0) {
79 | ctx.op = 'delete';
80 | }
81 |
82 | ctx._source.in_stock--;
83 | """
84 | }
85 | }
86 | ```
--------------------------------------------------------------------------------
/Managing Documents/update-by-query.md:
--------------------------------------------------------------------------------
1 | # Update by query
2 |
3 | ## Updating documents matching a query
4 |
5 | Replace the `match_all` query with any query that you would like.
6 |
7 | ```
8 | POST /products/_update_by_query
9 | {
10 | "script": {
11 | "source": "ctx._source.in_stock--"
12 | },
13 | "query": {
14 | "match_all": {}
15 | }
16 | }
17 | ```
18 |
19 | ## Ignoring (but counting) version conflicts
20 |
21 | The `conflicts` key may be added as a query parameter instead, i.e. `?conflicts=proceed`.
22 |
23 | ```
24 | POST /products/_update_by_query
25 | {
26 | "conflicts": "proceed",
27 | "script": {
28 | "source": "ctx._source.in_stock--"
29 | },
30 | "query": {
31 | "match_all": {}
32 | }
33 | }
34 | ```
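
For reference, a sketch of the same request with the option supplied as a query parameter instead of in the request body:

```
POST /products/_update_by_query?conflicts=proceed
{
  "script": {
    "source": "ctx._source.in_stock--"
  },
  "query": {
    "match_all": {}
  }
}
```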
35 |
36 | ## Matches all of the documents within the `products` index
37 |
38 | ```
39 | GET /products/_search
40 | {
41 | "query": {
42 | "match_all": {}
43 | }
44 | }
45 | ```
--------------------------------------------------------------------------------
/Managing Documents/updating-documents.md:
--------------------------------------------------------------------------------
1 | # Updating documents
2 |
3 | ## Updating an existing field
4 |
5 | ```
6 | POST /products/_update/100
7 | {
8 | "doc": {
9 | "in_stock": 3
10 | }
11 | }
12 | ```
13 |
14 | ## Adding a new field
15 |
16 | _Yes, the syntax is the same as the above. ;-)_
17 |
18 | ```
19 | POST /products/_update/100
20 | {
21 | "doc": {
22 | "tags": ["electronics"]
23 | }
24 | }
25 | ```
--------------------------------------------------------------------------------
/Managing Documents/upserts.md:
--------------------------------------------------------------------------------
1 | # Upserts
2 |
3 | ```
4 | POST /products/_update/101
5 | {
6 | "script": {
7 | "source": "ctx._source.in_stock++"
8 | },
9 | "upsert": {
10 | "name": "Blender",
11 | "price": 399,
12 | "in_stock": 5
13 | }
14 | }
15 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/adding-analyzers-to-existing-indices.md:
--------------------------------------------------------------------------------
1 | # Adding analyzers to existing indices
2 |
3 | ## Close `analyzer_test` index
4 | ```
5 | POST /analyzer_test/_close
6 | ```
7 |
8 | ## Add new analyzer
9 | ```
10 | PUT /analyzer_test/_settings
11 | {
12 | "analysis": {
13 | "analyzer": {
14 | "my_second_analyzer": {
15 | "type": "custom",
16 | "tokenizer": "standard",
17 | "char_filter": ["html_strip"],
18 | "filter": [
19 | "lowercase",
20 | "stop",
21 | "asciifolding"
22 | ]
23 | }
24 | }
25 | }
26 | }
27 | ```
28 |
29 | ## Open `analyzer_test` index
30 | ```
31 | POST /analyzer_test/_open
32 | ```
33 |
34 | ## Retrieve index settings
35 | ```
36 | GET /analyzer_test/_settings
37 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/adding-explicit-mappings.md:
--------------------------------------------------------------------------------
1 | # Adding explicit mappings
2 |
3 | ## Add field mappings for `reviews` index
4 | ```
5 | PUT /reviews
6 | {
7 | "mappings": {
8 | "properties": {
9 | "rating": { "type": "float" },
10 | "content": { "type": "text" },
11 | "product_id": { "type": "integer" },
12 | "author": {
13 | "properties": {
14 | "first_name": { "type": "text" },
15 | "last_name": { "type": "text" },
16 | "email": { "type": "keyword" }
17 | }
18 | }
19 | }
20 | }
21 | }
22 | ```
23 |
24 | ## Index a test document
25 | ```
26 | PUT /reviews/_doc/1
27 | {
28 | "rating": 5.0,
29 | "content": "Outstanding course! Bo really taught me a lot about Elasticsearch!",
30 | "product_id": 123,
31 | "author": {
32 | "first_name": "John",
33 | "last_name": "Doe",
34 | "email": "johndoe123@example.com"
35 | }
36 | }
37 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/adding-mappings-to-existing-indices.md:
--------------------------------------------------------------------------------
1 | # Adding mappings to existing indices
2 |
3 | ## Add new field mapping to existing index
4 | ```
5 | PUT /reviews/_mapping
6 | {
7 | "properties": {
8 | "created_at": {
9 | "type": "date"
10 | }
11 | }
12 | }
13 | ```
14 |
15 | ## Retrieve the mapping
16 | ```
17 | GET /reviews/_mapping
18 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/combining-explicit-and-dynamic-mapping.md:
--------------------------------------------------------------------------------
1 | # Combining explicit and dynamic mapping
2 |
3 | ## Create index with one field mapping
4 | ```
5 | PUT /people
6 | {
7 | "mappings": {
8 | "properties": {
9 | "first_name": {
10 | "type": "text"
11 | }
12 | }
13 | }
14 | }
15 | ```
16 |
17 | ## Index a test document with an unmapped field
18 | ```
19 | POST /people/_doc
20 | {
21 | "first_name": "Bo",
22 | "last_name": "Andersen"
23 | }
24 | ```
25 |
26 | ## Retrieve mapping
27 | ```
28 | GET /people/_mapping
29 | ```
30 |
31 | ## Clean up
32 | ```
33 | DELETE /people
34 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/configuring-dynamic-mapping.md:
--------------------------------------------------------------------------------
1 | # Configuring dynamic mapping
2 |
3 | ## Disable dynamic mapping
4 | ```
5 | PUT /people
6 | {
7 | "mappings": {
8 | "dynamic": false,
9 | "properties": {
10 | "first_name": {
11 | "type": "text"
12 | }
13 | }
14 | }
15 | }
16 | ```
17 |
18 | ## Set dynamic mapping to `strict`
19 | ```
20 | PUT /people
21 | {
22 | "mappings": {
23 | "dynamic": "strict",
24 | "properties": {
25 | "first_name": {
26 | "type": "text"
27 | }
28 | }
29 | }
30 | }
31 | ```
32 |
33 | ## Index a test document
34 | ```
35 | POST /people/_doc
36 | {
37 | "first_name": "Bo",
38 | "last_name": "Andersen"
39 | }
40 | ```
41 |
42 | ## Retrieve mapping
43 | ```
44 | GET /people/_mapping
45 | ```
46 |
47 | ## Search `first_name` field
48 | ```
49 | GET /people/_search
50 | {
51 | "query": {
52 | "match": {
53 | "first_name": "Bo"
54 | }
55 | }
56 | }
57 | ```
58 |
59 | ## Search `last_name` field
60 | ```
61 | GET /people/_search
62 | {
63 | "query": {
64 | "match": {
65 | "last_name": "Andersen"
66 | }
67 | }
68 | }
69 | ```
70 |
71 | ## Inheritance for the `dynamic` parameter
72 | The following example sets the `dynamic` parameter to `"strict"` at the root level, but overrides it with a value of
73 | `true` for the `specifications.other` field mapping.
74 |
75 | ### Mapping
76 | ```
77 | PUT /computers
78 | {
79 | "mappings": {
80 | "dynamic": "strict",
81 | "properties": {
82 | "name": {
83 | "type": "text"
84 | },
85 | "specifications": {
86 | "properties": {
87 | "cpu": {
88 | "properties": {
89 | "name": {
90 | "type": "text"
91 | }
92 | }
93 | },
94 | "other": {
95 | "dynamic": true,
96 | "properties": { ... }
97 | }
98 | }
99 | }
100 | }
101 | }
102 | }
103 | ```
104 |
105 | ### Example document (invalid)
106 | ```
107 | POST /computers/_doc
108 | {
109 | "name": "Gamer PC",
110 | "specifications": {
111 | "cpu": {
112 | "name": "Intel Core i7-9700K",
113 | "frequency": 3.6
114 | }
115 | }
116 | }
117 | ```
118 |
119 | ### Example document (OK)
120 | ```
121 | POST /computers/_doc
122 | {
123 | "name": "Gamer PC",
124 | "specifications": {
125 | "cpu": {
126 | "name": "Intel Core i7-9700K"
127 | },
128 | "other": {
129 | "security": "Kensington"
130 | }
131 | }
132 | }
133 | ```
134 |
135 | ## Enabling numeric detection
136 | When enabling numeric detection, Elasticsearch will check the contents of strings to see if they contain only numeric
137 | values - and map the fields accordingly as either `float` or `long`.
138 |
139 | ### Mapping
140 | ```
141 | PUT /computers
142 | {
143 | "mappings": {
144 | "numeric_detection": true
145 | }
146 | }
147 | ```
148 |
149 | ### Example document
150 | ```
151 | POST /computers/_doc
152 | {
153 | "specifications": {
154 | "other": {
155 | "max_ram_gb": "32", # long
156 | "bluetooth": "5.2" # float
157 | }
158 | }
159 | }
160 | ```
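
### Retrieve mapping

Assuming the index and document above were added as shown, retrieving the mapping should show the two string values mapped as `long` and `float`, respectively:

```
GET /computers/_mapping
```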
161 |
162 | ## Date detection
163 |
164 | ### Disabling date detection
165 | ```
166 | PUT /computers
167 | {
168 | "mappings": {
169 | "date_detection": false
170 | }
171 | }
172 | ```
173 |
174 | ### Configuring dynamic date formats
175 | ```
176 | PUT /computers
177 | {
178 | "mappings": {
179 | "dynamic_date_formats": ["dd-MM-yyyy"]
180 | }
181 | }
182 | ```
183 |
184 | ## Clean up
185 | ```
186 | DELETE /people
187 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/creating-custom-analyzers.md:
--------------------------------------------------------------------------------
1 | # Creating custom analyzers
2 |
3 | ## Remove HTML tags and convert HTML entities
4 | ```
5 | POST /_analyze
6 | {
7 | "char_filter": ["html_strip"],
8 | "text": "I'm in a good mood - and I love açaí!"
9 | }
10 | ```
11 |
12 | ## Add the `standard` tokenizer
13 | ```
14 | POST /_analyze
15 | {
16 | "char_filter": ["html_strip"],
17 | "tokenizer": "standard",
18 | "text": "I'm in a good mood - and I love açaí!"
19 | }
20 | ```
21 |
22 | ## Add the `lowercase` token filter
23 | ```
24 | POST /_analyze
25 | {
26 | "char_filter": ["html_strip"],
27 | "tokenizer": "standard",
28 | "filter": [
29 | "lowercase"
30 | ],
31 | "text": "I'm in a good mood - and I love açaí!"
32 | }
33 | ```
34 |
35 | ## Add the `stop` token filter
36 |
37 | This removes English stop words by default.
38 | ```
39 | POST /_analyze
40 | {
41 | "char_filter": ["html_strip"],
42 | "tokenizer": "standard",
43 | "filter": [
44 | "lowercase",
45 | "stop"
46 | ],
47 | "text": "I'm in a good mood - and I love açaí!"
48 | }
49 | ```
50 |
51 | ## Add the `asciifolding` token filter
52 |
53 | Convert characters to their ASCII equivalent.
54 | ```
55 | POST /_analyze
56 | {
57 | "char_filter": ["html_strip"],
58 | "tokenizer": "standard",
59 | "filter": [
60 | "lowercase",
61 | "stop",
62 | "asciifolding"
63 | ],
64 | "text": "I'm in a good mood - and I love açaí!"
65 | }
66 | ```
67 |
68 | ## Create a custom analyzer named `my_custom_analyzer`
69 | ```
70 | PUT /analyzer_test
71 | {
72 | "settings": {
73 | "analysis": {
74 | "analyzer": {
75 | "my_custom_analyzer": {
76 | "type": "custom",
77 | "char_filter": ["html_strip"],
78 | "tokenizer": "standard",
79 | "filter": [
80 | "lowercase",
81 | "stop",
82 | "asciifolding"
83 | ]
84 | }
85 | }
86 | }
87 | }
88 | }
89 | ```
90 |
91 | ## Configure the analyzer to remove Danish stop words
92 |
93 | To run this query, change the index name to avoid a conflict. Remember to remove the comments. :wink:
94 | ```
95 | PUT /analyzer_test
96 | {
97 | "settings": {
98 | "analysis": {
99 | "filter": {
100 | "danish_stop": {
101 | "type": "stop",
102 | "stopwords": "_danish_"
103 | }
104 | },
105 | "char_filter": {
106 | # Add character filters here
107 | },
108 | "tokenizer": {
109 | # Add tokenizers here
110 | },
111 | "analyzer": {
112 | "my_custom_analyzer": {
113 | "type": "custom",
114 | "char_filter": ["html_strip"],
115 | "tokenizer": "standard",
116 | "filter": [
117 | "lowercase",
118 | "danish_stop",
119 | "asciifolding"
120 | ]
121 | }
122 | }
123 | }
124 | }
125 | }
126 | ```
127 |
128 | ## Test the custom analyzer
129 | ```
130 | POST /analyzer_test/_analyze
131 | {
132 | "analyzer": "my_custom_analyzer",
133 | "text": "I'm in a good mood - and I love açaí!"
134 | }
135 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/defining-field-aliases.md:
--------------------------------------------------------------------------------
1 | # Defining field aliases
2 |
3 | ## Add `comment` alias pointing to the `content` field
4 | ```
5 | PUT /reviews/_mapping
6 | {
7 | "properties": {
8 | "comment": {
9 | "type": "alias",
10 | "path": "content"
11 | }
12 | }
13 | }
14 | ```
15 |
16 | ## Using the field alias
17 | ```
18 | GET /reviews/_search
19 | {
20 | "query": {
21 | "match": {
22 | "comment": "outstanding"
23 | }
24 | }
25 | }
26 | ```
27 |
28 | ## Using the "original" field name still works
29 | ```
30 | GET /reviews/_search
31 | {
32 | "query": {
33 | "match": {
34 | "content": "outstanding"
35 | }
36 | }
37 | }
38 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/dynamic-templates.md:
--------------------------------------------------------------------------------
1 | # Dynamic templates
2 |
3 | ## Map whole numbers to `integer` instead of `long`
4 | ```
5 | PUT /dynamic_template_test
6 | {
7 | "mappings": {
8 | "dynamic_templates": [
9 | {
10 | "integers": {
11 | "match_mapping_type": "long",
12 | "mapping": {
13 | "type": "integer"
14 | }
15 | }
16 | }
17 | ]
18 | }
19 | }
20 | ```
21 |
22 | ## Test the dynamic template
23 | ```
24 | POST /dynamic_template_test/_doc
25 | {
26 | "in_stock": 123
27 | }
28 | ```
29 |
30 | ## Retrieve mapping (and dynamic template)
31 | ```
32 | GET /dynamic_template_test/_mapping
33 | ```
34 |
35 | ## Modify default mapping for strings (set `ignore_above` to 512)
36 | ```
37 | PUT /test_index
38 | {
39 | "mappings": {
40 | "dynamic_templates": [
41 | {
42 | "strings": {
43 | "match_mapping_type": "string",
44 | "mapping": {
45 | "type": "text",
46 | "fields": {
47 | "keyword": {
48 | "type": "keyword",
49 | "ignore_above": 512
50 | }
51 | }
52 | }
53 | }
54 | }
55 | ]
56 | }
57 | }
58 | ```
59 |
60 | ## Using `match` and `unmatch`
61 | ```
62 | PUT /test_index
63 | {
64 | "mappings": {
65 | "dynamic_templates": [
66 | {
67 | "strings_only_text": {
68 | "match_mapping_type": "string",
69 | "match": "text_*",
70 | "unmatch": "*_keyword",
71 | "mapping": {
72 | "type": "text"
73 | }
74 | }
75 | },
76 | {
77 | "strings_only_keyword": {
78 | "match_mapping_type": "string",
79 | "match": "*_keyword",
80 | "mapping": {
81 | "type": "keyword"
82 | }
83 | }
84 | }
85 | ]
86 | }
87 | }
88 |
89 | POST /test_index/_doc
90 | {
91 | "text_product_description": "A description.",
92 | "text_product_id_keyword": "ABC-123"
93 | }
94 | ```
95 |
96 | ## Setting `match_pattern` to `regex`
97 | ```
98 | PUT /test_index
99 | {
100 | "mappings": {
101 | "dynamic_templates": [
102 | {
103 | "names": {
104 | "match_mapping_type": "string",
105 | "match": "^[a-zA-Z]+_name$",
106 | "match_pattern": "regex",
107 | "mapping": {
108 | "type": "text"
109 | }
110 | }
111 | }
112 | ]
113 | }
114 | }
115 |
116 | POST /test_index/_doc
117 | {
118 | "first_name": "John",
119 | "middle_name": "Edward",
120 | "last_name": "Doe"
121 | }
122 | ```
123 |
124 | ## Using `path_match`
125 | ```
126 | PUT /test_index
127 | {
128 | "mappings": {
129 | "dynamic_templates": [
130 | {
131 | "copy_to_full_name": {
132 | "match_mapping_type": "string",
133 | "path_match": "employer.name.*",
134 | "mapping": {
135 | "type": "text",
136 | "copy_to": "full_name"
137 | }
138 | }
139 | }
140 | ]
141 | }
142 | }
143 |
144 | POST /test_index/_doc
145 | {
146 | "employer": {
147 | "name": {
148 | "first_name": "John",
149 | "middle_name": "Edward",
150 | "last_name": "Doe"
151 | }
152 | }
153 | }
154 | ```
155 |
156 | ## Using placeholders
157 | ```
158 | PUT /test_index
159 | {
160 | "mappings": {
161 | "dynamic_templates": [
162 | {
163 | "no_doc_values": {
164 | "match_mapping_type": "*",
165 | "mapping": {
166 | "type": "{dynamic_type}",
167 | "index": false
168 | }
169 | }
170 | }
171 | ]
172 | }
173 | }
174 |
175 | POST /test_index/_doc
176 | {
177 | "name": "John Doe",
178 | "age": 26
179 | }
180 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/how-dates-work-in-elasticsearch.md:
--------------------------------------------------------------------------------
1 | # How dates work in Elasticsearch
2 |
3 | ## Supplying only a date
4 | ```
5 | PUT /reviews/_doc/2
6 | {
7 | "rating": 4.5,
8 | "content": "Not bad. Not bad at all!",
9 | "product_id": 123,
10 | "created_at": "2015-03-27",
11 | "author": {
12 | "first_name": "Average",
13 | "last_name": "Joe",
14 | "email": "avgjoe@example.com"
15 | }
16 | }
17 | ```
18 |
19 | ## Supplying both a date and time
20 | ```
21 | PUT /reviews/_doc/3
22 | {
23 | "rating": 3.5,
24 | "content": "Could be better",
25 | "product_id": 123,
26 | "created_at": "2015-04-15T13:07:41Z",
27 | "author": {
28 | "first_name": "Spencer",
29 | "last_name": "Pearson",
30 | "email": "spearson@example.com"
31 | }
32 | }
33 | ```
34 |
35 | ## Specifying the UTC offset
36 | ```
37 | PUT /reviews/_doc/4
38 | {
39 | "rating": 5.0,
40 | "content": "Incredible!",
41 | "product_id": 123,
42 | "created_at": "2015-01-28T09:21:51+01:00",
43 | "author": {
44 | "first_name": "Adam",
45 | "last_name": "Jones",
46 | "email": "adam.jones@example.com"
47 | }
48 | }
49 | ```
50 |
51 | ## Supplying a timestamp (milliseconds since the epoch)
52 | ```
53 | # Equivalent to 2015-07-04T12:01:24Z
54 | PUT /reviews/_doc/5
55 | {
56 | "rating": 4.5,
57 | "content": "Very useful",
58 | "product_id": 123,
59 | "created_at": 1436011284000,
60 | "author": {
61 | "first_name": "Taylor",
62 | "last_name": "West",
63 | "email": "twest@example.com"
64 | }
65 | }
66 | ```
67 |
68 | ## Retrieving documents
69 | ```
70 | GET /reviews/_search
71 | {
72 | "query": {
73 | "match_all": {}
74 | }
75 | }
76 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/how-the-keyword-data-type-works.md:
--------------------------------------------------------------------------------
1 | # How the `keyword` data type works
2 |
3 | ## Testing the `keyword` analyzer
4 | ```
5 | POST /_analyze
6 | {
7 | "text": "2 guys walk into a bar, but the third... DUCKS! :-)",
8 | "analyzer": "keyword"
9 | }
10 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/index-templates.md:
--------------------------------------------------------------------------------
1 | # Index templates
2 |
3 | ## Adding/updating an index template
4 |
5 | This adds a new index template or updates an existing one.
6 |
7 | ```
8 | PUT /_index_template/access-logs
9 | {
10 | "index_patterns": ["access-logs-*"],
11 | "template": {
12 | "settings": {
13 | "number_of_shards": 2,
14 | "index.mapping.coerce": false
15 | },
16 | "mappings": {
17 | "properties": {
18 | "@timestamp": { "type": "date" },
19 | "url.original": { "type": "wildcard" },
20 | "url.path": { "type": "wildcard" },
21 | "url.scheme": { "type": "keyword" },
22 | "url.domain": { "type": "keyword" },
23 | "client.geo.continent_name": { "type": "keyword" },
24 | "client.geo.country_name": { "type": "keyword" },
25 | "client.geo.region_name": { "type": "keyword" },
26 | "client.geo.city_name": { "type": "keyword" },
27 | "user_agent.original": { "type": "keyword" },
28 | "user_agent.name": { "type": "keyword" },
29 | "user_agent.version": { "type": "keyword" },
30 | "user_agent.device.name": { "type": "keyword" },
31 | "user_agent.os.name": { "type": "keyword" },
32 | "user_agent.os.version": { "type": "keyword" }
33 | }
34 | }
35 | }
36 | }
37 | ```
38 |
39 | ## Indexing a document into a new index
40 |
41 | The index name matches the index pattern defined within the above index template.
42 | The index template's settings and mappings will therefore be applied to the new index.
43 |
44 | ```
45 | POST /access-logs-2023-01/_doc
46 | {
47 | "@timestamp": "2023-01-01T00:00:00Z",
48 | "url.original": "https://example.com/products",
49 | "url.path": "/products",
50 | "url.scheme": "https",
51 | "url.domain": "example.com",
52 | "client.geo.continent_name": "Europe",
53 | "client.geo.country_name": "Denmark",
54 | "client.geo.region_name": "Capital City Region",
55 | "client.geo.city_name": "Copenhagen",
56 | "user_agent.original": "Mozilla/5.0 (iPhone; CPU iPhone OS 12_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/12.0 Mobile/15E148 Safari/604.1",
57 | "user_agent.name": "Safari",
58 | "user_agent.version": "12.0",
59 | "user_agent.device.name": "iPhone",
60 | "user_agent.os.name": "iOS",
61 | "user_agent.os.version": "12.1.0"
62 | }
63 | ```
64 |
65 | ## Manually creating an index
66 |
67 | The index template's settings/mappings will also be applied to this index.
68 |
69 | ```
70 | PUT /access-logs-2023-02
71 | {
72 | "settings": {
73 | "number_of_shards": 1
74 | },
75 | "mappings": {
76 | "properties": {
77 | "url.query": {
78 | "type": "keyword"
79 | }
80 | }
81 | }
82 | }
83 | ```
84 |
85 | ## Inspecting the new indices
86 |
87 | ```
88 | GET /access-logs-2023-01
89 | GET /access-logs-2023-02
90 | ```
91 |
92 | ## Retrieving an index template
93 |
94 | ```
95 | GET /_index_template/access-logs
96 | ```
97 |
98 | ## Deleting an index template
99 |
100 | ```
101 | DELETE /_index_template/access-logs
102 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/multi-field-mappings.md:
--------------------------------------------------------------------------------
1 | # Multi-field mappings
2 |
3 | ## Add `keyword` mapping to a `text` field
4 | ```
5 | PUT /multi_field_test
6 | {
7 | "mappings": {
8 | "properties": {
9 | "description": {
10 | "type": "text"
11 | },
12 | "ingredients": {
13 | "type": "text",
14 | "fields": {
15 | "keyword": {
16 | "type": "keyword"
17 | }
18 | }
19 | }
20 | }
21 | }
22 | }
23 | ```
24 |
25 | ## Index a test document
26 | ```
27 | POST /multi_field_test/_doc
28 | {
29 | "description": "To make this spaghetti carbonara, you first need to...",
30 | "ingredients": ["Spaghetti", "Bacon", "Eggs"]
31 | }
32 | ```
33 |
34 | ## Retrieve documents
35 | ```
36 | GET /multi_field_test/_search
37 | {
38 | "query": {
39 | "match_all": {}
40 | }
41 | }
42 | ```
43 |
44 | ## Querying the `text` mapping
45 | ```
46 | GET /multi_field_test/_search
47 | {
48 | "query": {
49 | "match": {
50 | "ingredients": "Spaghetti"
51 | }
52 | }
53 | }
54 | ```
55 |
56 | ## Querying the `keyword` mapping
57 | ```
58 | GET /multi_field_test/_search
59 | {
60 | "query": {
61 | "term": {
62 | "ingredients.keyword": "Spaghetti"
63 | }
64 | }
65 | }
66 | ```
67 |
68 | ## Clean up
69 | ```
70 | DELETE /multi_field_test
71 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/reindexing-documents-with-the-reindex-api.md:
--------------------------------------------------------------------------------
1 | # Reindexing documents with the Reindex API
2 |
3 | ## Add new index with new mapping
4 | ```
5 | PUT /reviews_new
6 | {
7 | "mappings" : {
8 | "properties" : {
9 | "author" : {
10 | "properties" : {
11 | "email" : {
12 | "type" : "keyword",
13 | "ignore_above" : 256
14 | },
15 | "first_name" : {
16 | "type" : "text"
17 | },
18 | "last_name" : {
19 | "type" : "text"
20 | }
21 | }
22 | },
23 | "content" : {
24 | "type" : "text"
25 | },
26 | "created_at" : {
27 | "type" : "date"
28 | },
29 | "product_id" : {
30 | "type" : "keyword"
31 | },
32 | "rating" : {
33 | "type" : "float"
34 | }
35 | }
36 | }
37 | }
38 | ```
39 |
40 | ## Retrieve mapping
41 | ```
42 | GET /reviews/_mapping
43 | ```
44 |
45 | ## Reindex documents into `reviews_new`
46 | ```
47 | POST /_reindex
48 | {
49 | "source": {
50 | "index": "reviews"
51 | },
52 | "dest": {
53 | "index": "reviews_new"
54 | }
55 | }
56 | ```
57 |
58 | ## Delete all documents
59 | ```
60 | POST /reviews_new/_delete_by_query
61 | {
62 | "query": {
63 | "match_all": {}
64 | }
65 | }
66 | ```
67 |
68 | ## Convert `product_id` values to strings
69 | ```
70 | POST /_reindex
71 | {
72 | "source": {
73 | "index": "reviews"
74 | },
75 | "dest": {
76 | "index": "reviews_new"
77 | },
78 | "script": {
79 | "source": """
80 | if (ctx._source.product_id != null) {
81 | ctx._source.product_id = ctx._source.product_id.toString();
82 | }
83 | """
84 | }
85 | }
86 | ```
87 |
88 | ## Retrieve documents
89 | ```
90 | GET /reviews_new/_search
91 | {
92 | "query": {
93 | "match_all": {}
94 | }
95 | }
96 | ```
97 |
98 | ## Reindex specific documents
99 | ```
100 | POST /_reindex
101 | {
102 | "source": {
103 | "index": "reviews",
104 | "query": {
105 | "match_all": { }
106 | }
107 | },
108 | "dest": {
109 | "index": "reviews_new"
110 | }
111 | }
112 | ```
113 |
114 | ## Reindex only positive reviews
115 | ```
116 | POST /_reindex
117 | {
118 | "source": {
119 | "index": "reviews",
120 | "query": {
121 | "range": {
122 | "rating": {
123 | "gte": 4.0
124 | }
125 | }
126 | }
127 | },
128 | "dest": {
129 | "index": "reviews_new"
130 | }
131 | }
132 | ```
133 |
134 | ## Removing fields (source filtering)
135 | ```
136 | POST /_reindex
137 | {
138 | "source": {
139 | "index": "reviews",
140 | "_source": ["content", "created_at", "rating"]
141 | },
142 | "dest": {
143 | "index": "reviews_new"
144 | }
145 | }
146 | ```
147 |
148 | ## Changing a field's name
149 | ```
150 | POST /_reindex
151 | {
152 | "source": {
153 | "index": "reviews"
154 | },
155 | "dest": {
156 | "index": "reviews_new"
157 | },
158 | "script": {
159 | "source": """
160 |       // Rename "content" field to "comment"
161 | ctx._source.comment = ctx._source.remove("content");
162 | """
163 | }
164 | }
165 | ```
166 |
167 | ## Ignore reviews with ratings below 4.0
168 | ```
169 | POST /_reindex
170 | {
171 | "source": {
172 | "index": "reviews"
173 | },
174 | "dest": {
175 | "index": "reviews_new"
176 | },
177 | "script": {
178 | "source": """
179 | if (ctx._source.rating < 4.0) {
180 |         ctx.op = "noop"; // Can also be set to "delete"
181 | }
182 | """
183 | }
184 | }
185 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/retrieving-mappings.md:
--------------------------------------------------------------------------------
1 | # Retrieving mappings
2 |
3 | ## Retrieving mappings for the `reviews` index
4 | ```
5 | GET /reviews/_mapping
6 | ```
7 |
8 | ## Retrieving mapping for the `content` field
9 | ```
10 | GET /reviews/_mapping/field/content
11 | ```
12 |
13 | ## Retrieving mapping for the `author.email` field
14 | ```
15 | GET /reviews/_mapping/field/author.email
16 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/understanding-arrays.md:
--------------------------------------------------------------------------------
1 | # Understanding arrays
2 |
3 | ## Arrays of strings are concatenated when analyzed
4 | ```
5 | POST /_analyze
6 | {
7 | "text": ["Strings are simply", "merged together."],
8 | "analyzer": "standard"
9 | }
10 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/understanding-type-coercion.md:
--------------------------------------------------------------------------------
1 | # Understanding type coercion
2 |
3 | ## Supplying a floating point
4 | ```
5 | PUT /coercion_test/_doc/1
6 | {
7 | "price": 7.4
8 | }
9 | ```
10 |
11 | ## Supplying a floating point within a string
12 | ```
13 | PUT /coercion_test/_doc/2
14 | {
15 | "price": "7.4"
16 | }
17 | ```
18 |
19 | ## Supplying an invalid value
20 | ```
21 | PUT /coercion_test/_doc/3
22 | {
23 | "price": "7.4m"
24 | }
25 | ```
26 |
27 | ## Retrieve document
28 | ```
29 | GET /coercion_test/_doc/2
30 | ```
31 |
32 | ## Clean up
33 | ```
34 | DELETE /coercion_test
35 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/updating-analyzers.md:
--------------------------------------------------------------------------------
1 | # Updating analyzers
2 |
3 | ## Add `description` mapping using `my_custom_analyzer`
4 | ```
5 | PUT /analyzer_test/_mapping
6 | {
7 | "properties": {
8 | "description": {
9 | "type": "text",
10 | "analyzer": "my_custom_analyzer"
11 | }
12 | }
13 | }
14 | ```
15 |
16 | ## Index a test document
17 | ```
18 | POST /analyzer_test/_doc
19 | {
20 | "description": "Is that Peter's cute-looking dog?"
21 | }
22 | ```
23 |
24 | ## Search query using `keyword` analyzer
25 | ```
26 | GET /analyzer_test/_search
27 | {
28 | "query": {
29 | "match": {
30 | "description": {
31 | "query": "that",
32 | "analyzer": "keyword"
33 | }
34 | }
35 | }
36 | }
37 | ```
38 |
39 | ## Close `analyzer_test` index
40 | ```
41 | POST /analyzer_test/_close
42 | ```
43 |
44 | ## Update `my_custom_analyzer` (remove `stop` token filter)
45 | ```
46 | PUT /analyzer_test/_settings
47 | {
48 | "analysis": {
49 | "analyzer": {
50 | "my_custom_analyzer": {
51 | "type": "custom",
52 | "tokenizer": "standard",
53 | "char_filter": ["html_strip"],
54 | "filter": [
55 | "lowercase",
56 | "asciifolding"
57 | ]
58 | }
59 | }
60 | }
61 | }
62 | ```
63 |
64 | ## Open `analyzer_test` index
65 | ```
66 | POST /analyzer_test/_open
67 | ```
68 |
69 | ## Retrieve index settings
70 | ```
71 | GET /analyzer_test/_settings
72 | ```
73 |
74 | ## Reindex documents
75 | ```
76 | POST /analyzer_test/_update_by_query?conflicts=proceed
77 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/updating-existing-mappings.md:
--------------------------------------------------------------------------------
1 | # Updating existing mappings
2 |
3 | ## Generally, field mappings cannot be updated
4 |
5 | This query won't work.
6 | ```
7 | PUT /reviews/_mapping
8 | {
9 | "properties": {
10 | "product_id": {
11 | "type": "keyword"
12 | }
13 | }
14 | }
15 | ```
16 |
17 | ## Some mapping parameters can be changed
18 |
19 | The `ignore_above` mapping parameter _can_ be updated, for instance.
20 | ```
21 | PUT /reviews/_mapping
22 | {
23 | "properties": {
24 | "author": {
25 | "properties": {
26 | "email": {
27 | "type": "keyword",
28 | "ignore_above": 256
29 | }
30 | }
31 | }
32 | }
33 | }
34 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/using-dot-notation-in-field-names.md:
--------------------------------------------------------------------------------
1 | # Using dot notation in field names
2 |
3 | ## Using dot notation for the `author` object
4 | ```
5 | PUT /reviews_dot_notation
6 | {
7 | "mappings": {
8 | "properties": {
9 | "rating": { "type": "float" },
10 | "content": { "type": "text" },
11 | "product_id": { "type": "integer" },
12 | "author.first_name": { "type": "text" },
13 | "author.last_name": { "type": "text" },
14 | "author.email": { "type": "keyword" }
15 | }
16 | }
17 | }
18 | ```
19 |
20 | ## Retrieve mapping
21 | ```
22 | GET /reviews_dot_notation/_mapping
23 | ```
--------------------------------------------------------------------------------
/Mapping & Analysis/using-the-analyze-api.md:
--------------------------------------------------------------------------------
1 | # Using the Analyze API
2 |
3 | ## Analyzing a string with the `standard` analyzer
4 | ```
5 | POST /_analyze
6 | {
7 | "text": "2 guys walk into a bar, but the third... DUCKS! :-)",
8 | "analyzer": "standard"
9 | }
10 | ```
11 |
12 | ## Building the equivalent of the `standard` analyzer
13 | ```
14 | POST /_analyze
15 | {
16 | "text": "2 guys walk into a bar, but the third... DUCKS! :-)",
17 | "char_filter": [],
18 | "tokenizer": "standard",
19 | "filter": ["lowercase"]
20 | }
21 | ```
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | This repository contains all of the queries used within the [Complete Guide to Elasticsearch course](https://l.codingexplained.com/r/elasticsearch-course?src=github).
--------------------------------------------------------------------------------
/Searching for Data/boosting-query.md:
--------------------------------------------------------------------------------
1 | # Boosting query
2 |
3 | ## Matching juice products
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "size": 20,
9 | "query": {
10 | "match": {
11 | "name": "juice"
12 | }
13 | }
14 | }
15 | ```
16 |
17 | ## Match juice products, but deprioritize apple juice
18 |
19 | ```
20 | GET /products/_search
21 | {
22 | "size": 20,
23 | "query": {
24 | "boosting": {
25 | "positive": {
26 | "match": {
27 | "name": "juice"
28 | }
29 | },
30 | "negative": {
31 | "match": {
32 | "name": "apple"
33 | }
34 | },
35 | "negative_boost": 0.5
36 | }
37 | }
38 | }
39 | ```
40 |
41 | ## Without filtering (match everything, but deprioritize apples)
42 |
43 | ```
44 | GET /products/_search
45 | {
46 | "query": {
47 | "boosting": {
48 | "positive": {
49 | "match_all": {}
50 | },
51 | "negative": {
52 | "match": {
53 | "name": "apple"
54 | }
55 | },
56 | "negative_boost": 0.5
57 | }
58 | }
59 | }
60 | ```
61 |
62 | ## More examples
63 |
64 | ### "I like pasta"
65 |
66 | Boost the relevance scores of recipes containing pasta.
67 |
68 | ```
69 | GET /recipes/_search
70 | {
71 | "query": {
72 | "bool": {
73 | "must": [
74 | { "match_all": {} }
75 | ],
76 | "should": [
77 | {
78 | "term": {
79 | "ingredients.name.keyword": "Pasta"
80 | }
81 | }
82 | ]
83 | }
84 | }
85 | }
86 | ```
87 |
88 | ### "I don't like bacon"
89 |
90 | Reduce the relevance scores of recipes containing bacon.
91 |
92 | ```
93 | GET /recipes/_search
94 | {
95 | "query": {
96 | "boosting": {
97 | "positive": {
98 | "match_all": {}
99 | },
100 | "negative": {
101 | "term": {
102 | "ingredients.name.keyword": "Bacon"
103 | }
104 | },
105 | "negative_boost": 0.5
106 | }
107 | }
108 | }
109 | ```
110 |
111 | ### Pasta recipes, preferably without bacon
112 |
113 | ```
114 | GET /recipes/_search
115 | {
116 | "query": {
117 | "boosting": {
118 | "positive": {
119 | "term": {
120 | "ingredients.name.keyword": "Pasta"
121 | }
122 | },
123 | "negative": {
124 | "term": {
125 | "ingredients.name.keyword": "Bacon"
126 | }
127 | },
128 | "negative_boost": 0.5
129 | }
130 | }
131 | }
132 | ```
133 |
134 | ### "I like pasta, but not bacon"
135 |
136 | ```
137 | GET /recipes/_search
138 | {
139 | "query": {
140 | "boosting": {
141 | "positive": {
142 | "bool": {
143 | "must": [
144 | { "match_all": {} }
145 | ],
146 | "should": [
147 | {
148 | "term": {
149 | "ingredients.name.keyword": "Pasta"
150 | }
151 | }
152 | ]
153 | }
154 | },
155 | "negative": {
156 | "term": {
157 | "ingredients.name.keyword": "Bacon"
158 | }
159 | },
160 | "negative_boost": 0.5
161 | }
162 | }
163 | }
164 | ```
--------------------------------------------------------------------------------
/Searching for Data/disjunction-max.md:
--------------------------------------------------------------------------------
1 | # Disjunction max (`dis_max`)
2 |
3 | ## Basic usage
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "dis_max": {
10 | "queries": [
11 | { "match": { "name": "vegetable" } },
12 | { "match": { "tags": "vegetable" } }
13 | ]
14 | }
15 | }
16 | }
17 | ```
18 |
19 | ## Specifying a tie breaker
20 |
21 | ```
22 | GET /products/_search
23 | {
24 | "query": {
25 | "dis_max": {
26 | "queries": [
27 | { "match": { "name": "vegetable" } },
28 | { "match": { "tags": "vegetable" } }
29 | ],
30 | "tie_breaker": 0.3
31 | }
32 | }
33 | }
34 | ```
--------------------------------------------------------------------------------
/Searching for Data/introduction-to-relevance-scoring.md:
--------------------------------------------------------------------------------
1 | # Introduction to relevance scoring
2 |
3 | The below query is the same as in the previous lecture, but here it is anyway for your convenience.
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "match": {
10 | "name": "pasta chicken"
11 | }
12 | }
13 | }
14 | ```
--------------------------------------------------------------------------------
/Searching for Data/nested-inner-hits.md:
--------------------------------------------------------------------------------
1 | # Nested inner hits
2 |
3 | ## Enabling inner hits
4 |
5 | ```
6 | GET /recipes/_search
7 | {
8 | "query": {
9 | "nested": {
10 | "path": "ingredients",
11 | "inner_hits": {},
12 | "query": {
13 | "bool": {
14 | "must": [
15 | {
16 | "match": {
17 | "ingredients.name": "parmesan"
18 | }
19 | },
20 | {
21 | "range": {
22 | "ingredients.amount": {
23 | "gte": 100
24 | }
25 | }
26 | }
27 | ]
28 | }
29 | }
30 | }
31 | }
32 | }
33 | ```
34 |
35 | ## Specifying custom key and/or number of inner hits
36 |
37 | ```
38 | GET /recipes/_search
39 | {
40 | "query": {
41 | "nested": {
42 | "path": "ingredients",
43 | "inner_hits": {
44 | "name": "my_hits",
45 | "size": 10
46 | },
47 | "query": {
48 | "bool": {
49 | "must": [
50 | { "match": { "ingredients.name": "parmesan" } },
51 | { "range": { "ingredients.amount": { "gte": 100 } } }
52 | ]
53 | }
54 | }
55 | }
56 | }
57 | }
58 | ```
--------------------------------------------------------------------------------
/Searching for Data/phrase-searches.md:
--------------------------------------------------------------------------------
1 | # Phrase searches
2 |
3 | ## Basic usage
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "match_phrase": {
10 | "name": "mango juice"
11 | }
12 | }
13 | }
14 | ```
15 |
16 | ## More examples
17 |
18 | ```
19 | GET /products/_search
20 | {
21 | "query": {
22 | "match_phrase": {
23 | "name": "juice mango"
24 | }
25 | }
26 | }
27 | ```
28 |
29 | ```
30 | GET /products/_search
31 | {
32 | "query": {
33 | "match_phrase": {
34 | "name": "Juice (mango)"
35 | }
36 | }
37 | }
38 | ```
39 |
40 | ```
41 | GET /products/_search
42 | {
43 | "query": {
44 | "match_phrase": {
45 | "description": "browse the internet"
46 | }
47 | }
48 | }
49 | ```
--------------------------------------------------------------------------------
/Searching for Data/prefixes-wildcards-regular-expressions.md:
--------------------------------------------------------------------------------
1 | # Prefixes, wildcards & regular expressions
2 |
3 | ## Searching for a prefix
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "prefix": {
10 | "name.keyword": {
11 | "value": "Past"
12 | }
13 | }
14 | }
15 | }
16 | ```
17 |
18 | ## Wildcards
19 |
20 | ### Single character wildcard (`?`)
21 |
22 | ```
23 | GET /products/_search
24 | {
25 | "query": {
26 | "wildcard": {
27 | "tags.keyword": {
28 | "value": "Past?"
29 | }
30 | }
31 | }
32 | }
33 | ```
34 |
35 | ### Zero or more characters wildcard (`*`)
36 |
37 | ```
38 | GET /products/_search
39 | {
40 | "query": {
41 | "wildcard": {
42 | "tags.keyword": {
43 | "value": "Bee*"
44 | }
45 | }
46 | }
47 | }
48 | ```
49 |
50 | ## Regexp
51 |
52 | ```
53 | GET /products/_search
54 | {
55 | "query": {
56 | "regexp": {
57 | "tags.keyword": {
58 | "value": "Bee(f|r)+"
59 | }
60 | }
61 | }
62 | }
63 | ```
64 |
65 | ## Case insensitive searches
66 |
67 | All of the above queries can be made case insensitive by adding the `case_insensitive` parameter, e.g.:
68 |
69 | ```
70 | GET /products/_search
71 | {
72 | "query": {
73 | "prefix": {
74 | "name.keyword": {
75 | "value": "Past",
76 | "case_insensitive": true
77 | }
78 | }
79 | }
80 | }
81 | ```
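
For instance, a sketch of the wildcard example from above with the parameter added (the lowercase value still matches the capitalized tags):

```
GET /products/_search
{
  "query": {
    "wildcard": {
      "tags.keyword": {
        "value": "past?",
        "case_insensitive": true
      }
    }
  }
}
```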
--------------------------------------------------------------------------------
/Searching for Data/querying-by-field-existence.md:
--------------------------------------------------------------------------------
1 | # Querying by field existence
2 |
3 | ## Basic usage
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "exists": {
10 | "field": "tags.keyword"
11 | }
12 | }
13 | }
14 | ```
15 |
16 | **SQL:** `SELECT * FROM products WHERE tags IS NOT NULL`
17 |
18 | ## Inverting the query
19 |
20 | There is no dedicated query for this, so we invert the `exists` query with the `bool` query's `must_not` occurrence type.
21 |
22 | ```
23 | GET /products/_search
24 | {
25 | "query": {
26 | "bool": {
27 | "must_not": [
28 | {
29 | "exists": {
30 | "field": "tags.keyword"
31 | }
32 | }
33 | ]
34 | }
35 | }
36 | }
37 | ```
38 |
39 | **SQL:** `SELECT * FROM products WHERE tags IS NULL`
--------------------------------------------------------------------------------
/Searching for Data/querying-nested-objects.md:
--------------------------------------------------------------------------------
1 | # Querying nested objects
2 |
3 | ## Importing test data
4 |
5 | Follow [these instructions](/Managing%20Documents/importing-data-with-curl.md) and specify `recipes-bulk.json` as the file name.
6 |
7 | ## Searching arrays of objects (the wrong way)
8 |
9 | ```
10 | GET /recipes/_search
11 | {
12 | "query": {
13 | "bool": {
14 | "must": [
15 | {
16 | "match": {
17 | "ingredients.name": "parmesan"
18 | }
19 | },
20 | {
21 | "range": {
22 | "ingredients.amount": {
23 | "gte": 100
24 | }
25 | }
26 | }
27 | ]
28 | }
29 | }
30 | }
31 | ```
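
Not spelled out in the original notes, but the reason this is the wrong way: without the `nested` data type, Elasticsearch flattens arrays of objects into arrays of values per field, losing the association between `name` and `amount` within each ingredient. Conceptually (an illustration, not a real query), a recipe's ingredients end up indexed like this, so the two conditions above can be satisfied by two different ingredients:

```
{
  "ingredients.name": ["Dry pasta", "Cloves garlic", "Parmesan cheese", ...],
  "ingredients.amount": [450, 4, ...]
}
```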
32 |
33 | ## Creating the correct mapping (using the `nested` data type)
34 |
35 | ```
36 | DELETE /recipes
37 | ```
38 |
39 | ```
40 | PUT /recipes
41 | {
42 | "mappings": {
43 | "properties": {
44 | "title": { "type": "text" },
45 | "description": { "type": "text" },
46 | "preparation_time_minutes": { "type": "integer" },
47 | "steps": { "type": "text" },
48 | "created": { "type": "date" },
49 | "ratings": { "type": "float" },
50 | "servings": {
51 | "properties": {
52 | "min": { "type": "integer" },
53 | "max": { "type": "integer" }
54 | }
55 | },
56 | "ingredients": {
57 | "type": "nested",
58 | "properties": {
59 | "name": {
60 | "type": "text",
61 | "fields": {
62 | "keyword": {
63 | "type": "keyword"
64 | }
65 | }
66 | },
67 | "amount": { "type": "integer" },
68 | "unit": { "type": "keyword" }
69 | }
70 | }
71 | }
72 | }
73 | }
74 | ```
75 |
76 | [Import the test data again](#importing-test-data).
77 |
78 | ## Using the `nested` query
79 |
80 | ```
81 | GET /recipes/_search
82 | {
83 | "query": {
84 | "nested": {
85 | "path": "ingredients",
86 | "query": {
87 | "bool": {
88 | "must": [
89 | {
90 | "match": {
91 | "ingredients.name": "parmesan"
92 | }
93 | },
94 | {
95 | "range": {
96 | "ingredients.amount": {
97 | "gte": 100
98 | }
99 | }
100 | }
101 | ]
102 | }
103 | }
104 | }
105 | }
106 | }
107 | ```
--------------------------------------------------------------------------------
/Searching for Data/querying-with-boolean-logic.md:
--------------------------------------------------------------------------------
1 | # Querying with boolean logic
2 |
3 | ## `must`
4 |
5 | Query clauses added within the `must` occurrence type are required to match.
6 |
7 | ```
8 | GET /products/_search
9 | {
10 | "query": {
11 | "bool": {
12 | "must": [
13 | {
14 | "term": {
15 | "tags.keyword": "Alcohol"
16 | }
17 | }
18 | ]
19 | }
20 | }
21 | }
22 | ```
23 |
24 | **SQL:** `SELECT * FROM products WHERE tags IN ("Alcohol")`
25 |
26 | ## `must_not`
27 |
28 | Query clauses added within the `must_not` occurrence type are required to _not_ match.
29 |
30 | ```
31 | GET /products/_search
32 | {
33 | "query": {
34 | "bool": {
35 | "must": [
36 | {
37 | "term": {
38 | "tags.keyword": "Alcohol"
39 | }
40 | }
41 | ],
42 | "must_not": [
43 | {
44 | "term": {
45 | "tags.keyword": "Wine"
46 | }
47 | }
48 | ]
49 | }
50 | }
51 | }
52 | ```
53 |
54 | **SQL:** `SELECT * FROM products WHERE tags IN ("Alcohol") AND tags NOT IN ("Wine")`
55 |
56 | ## `should`
57 |
58 | Matching query clauses within the `should` occurrence type boost a matching document's relevance score.
59 |
60 | ```
61 | GET /products/_search
62 | {
63 | "query": {
64 | "bool": {
65 | "must": [
66 | {
67 | "term": {
68 | "tags.keyword": "Alcohol"
69 | }
70 | }
71 | ],
72 | "must_not": [
73 | {
74 | "term": {
75 | "tags.keyword": "Wine"
76 | }
77 | }
78 | ],
79 | "should": [
80 | {
81 | "term": {
82 | "tags.keyword": "Beer"
83 | }
84 | }
85 | ]
86 | }
87 | }
88 | }
89 | ```
90 |
91 | An example that adds a few more `should` query clauses:
92 |
93 | ```
94 | GET /products/_search
95 | {
96 | "query": {
97 | "bool": {
98 | "must": [
99 | {
100 | "term": {
101 | "tags.keyword": "Alcohol"
102 | }
103 | }
104 | ],
105 | "must_not": [
106 | {
107 | "term": {
108 | "tags.keyword": "Wine"
109 | }
110 | }
111 | ],
112 | "should": [
113 | {
114 | "term": {
115 | "tags.keyword": "Beer"
116 | }
117 | },
118 | {
119 | "match": {
120 | "name": "beer"
121 | }
122 | },
123 | {
124 | "match": {
125 | "description": "beer"
126 | }
127 | }
128 | ]
129 | }
130 | }
131 | }
132 | ```
133 |
134 | ## `minimum_should_match`
135 |
136 | Since only `should` query clauses are specified, at least one of them must match.
137 |
138 | ```
139 | GET /products/_search
140 | {
141 | "query": {
142 | "bool": {
143 | "should": [
144 | {
145 | "term": {
146 | "tags.keyword": "Beer"
147 | }
148 | },
149 | {
150 | "match": {
151 | "name": "beer"
152 | }
153 | }
154 | ]
155 | }
156 | }
157 | }
158 | ```
159 |
160 | Since a `must` query clause is specified, all of the `should` query clauses are optional.
161 | They are therefore only used to boost the relevance scores of matching documents.
162 |
163 | ```
164 | GET /products/_search
165 | {
166 | "query": {
167 | "bool": {
168 | "must": [
169 | {
170 | "term": {
171 | "tags.keyword": "Alcohol"
172 | }
173 | }
174 | ],
175 | "should": [
176 | {
177 | "term": {
178 | "tags.keyword": "Beer"
179 | }
180 | },
181 | {
182 | "match": {
183 | "name": "beer"
184 | }
185 | }
186 | ]
187 | }
188 | }
189 | }
190 | ```
191 |
192 | This behavior can be configured with the `minimum_should_match` parameter as follows.
193 |
194 | ```
195 | GET /products/_search
196 | {
197 | "query": {
198 | "bool": {
199 | "must": [
200 | {
201 | "term": {
202 | "tags.keyword": "Alcohol"
203 | }
204 | }
205 | ],
206 | "should": [
207 | {
208 | "term": {
209 | "tags.keyword": "Beer"
210 | }
211 | },
212 | {
213 | "match": {
214 | "name": "beer"
215 | }
216 | }
217 | ],
218 | "minimum_should_match": 1
219 | }
220 | }
221 | }
222 | ```
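
A side note not covered in the lecture: `minimum_should_match` also accepts other values, such as a percentage of the `should` query clauses. A sketch with three `should` clauses, where `"50%"` rounds down to one required match:

```
GET /products/_search
{
  "query": {
    "bool": {
      "should": [
        { "term": { "tags.keyword": "Beer" } },
        { "match": { "name": "beer" } },
        { "match": { "description": "beer" } }
      ],
      "minimum_should_match": "50%"
    }
  }
}
```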
223 |
224 | ## `filter`
225 |
226 | Query clauses defined within the `filter` occurrence type must match.
227 | This is similar to the `must` occurrence type. The difference is that
228 | `filter` query clauses do not affect relevance scores and may be cached.
229 |
230 | ```
231 | GET /products/_search
232 | {
233 | "query": {
234 | "bool": {
235 | "filter": [
236 | {
237 | "term": {
238 | "tags.keyword": "Alcohol"
239 | }
240 | }
241 | ]
242 | }
243 | }
244 | }
245 | ```
246 |
247 | ## Examples
248 |
249 | ### Example #1
250 |
251 | **SQL:** `SELECT * FROM products WHERE (tags IN ("Beer") OR name LIKE '%Beer%') AND in_stock <= 100`
252 |
253 | **Variation #1**
254 |
255 | ```
256 | GET /products/_search
257 | {
258 | "query": {
259 | "bool": {
260 | "filter": [
261 | {
262 | "range": {
263 | "in_stock": {
264 | "lte": 100
265 | }
266 | }
267 | }
268 | ],
269 | "must": [
270 | {
271 | "bool": {
272 | "should": [
273 | { "term": { "tags.keyword": "Beer" } },
274 | { "match": { "name": "Beer" } }
275 | ]
276 | }
277 | }
278 | ]
279 | }
280 | }
281 | }
282 | ```
283 |
284 | **Variation #2**
285 |
286 | ```
287 | GET /products/_search
288 | {
289 | "query": {
290 | "bool": {
291 | "filter": [
292 | {
293 | "range": {
294 | "in_stock": {
295 | "lte": 100
296 | }
297 | }
298 | }
299 | ],
300 | "should": [
301 | { "term": { "tags.keyword": "Beer" } },
302 | { "match": { "name": "Beer" } }
303 | ],
304 | "minimum_should_match": 1
305 | }
306 | }
307 | }
308 |
309 | ```
310 |
311 | ### Example #2
312 |
313 | **SQL:** `SELECT * FROM products WHERE tags IN ("Beer") AND (name LIKE '%Beer%' OR description LIKE '%Beer%') AND in_stock <= 100`
314 |
315 | **Variation #1**
316 |
317 | ```
318 | GET /products/_search
319 | {
320 | "query": {
321 | "bool": {
322 | "filter": [
323 | {
324 | "range": {
325 | "in_stock": {
326 | "lte": 100
327 | }
328 | }
329 | },
330 | {
331 | "term": {
332 | "tags.keyword": "Beer"
333 | }
334 | }
335 | ],
336 | "should": [
337 | { "match": { "name": "Beer" } },
338 | { "match": { "description": "Beer" } }
339 | ],
340 | "minimum_should_match": 1
341 | }
342 | }
343 | }
344 | ```
345 |
346 | **Variation #2**
347 |
348 | ```
349 | GET /products/_search
350 | {
351 | "query": {
352 | "bool": {
353 | "filter": [
354 | {
355 | "range": {
356 | "in_stock": {
357 | "lte": 100
358 | }
359 | }
360 | },
361 | {
362 | "term": {
363 | "tags.keyword": "Beer"
364 | }
365 | }
366 | ],
367 | "must": [
368 | {
369 | "multi_match": {
370 | "query": "Beer",
371 | "fields": ["name", "description"]
372 | }
373 | }
374 | ]
375 | }
376 | }
377 | }
378 | ```
--------------------------------------------------------------------------------
/Searching for Data/range-searches.md:
--------------------------------------------------------------------------------
1 | # Range searches
2 |
3 | ## Basic usage
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "range": {
10 | "in_stock": {
11 | "gte": 1,
12 | "lte": 5
13 | }
14 | }
15 | }
16 | }
17 | ```
18 |
19 | **SQL:** `SELECT * FROM products WHERE in_stock >= 1 AND in_stock <= 5`
20 |
21 | ```
22 | GET /products/_search
23 | {
24 | "query": {
25 | "range": {
26 | "in_stock": {
27 | "gt": 1,
28 | "lt": 5
29 | }
30 | }
31 | }
32 | }
33 | ```
34 |
35 | **SQL:** `SELECT * FROM products WHERE in_stock > 1 AND in_stock < 5`
36 |
37 | ## Querying dates
38 |
39 | ### Basic usage
40 |
41 | ```
42 | GET /products/_search
43 | {
44 | "query": {
45 | "range": {
46 | "created": {
47 | "gte": "2020/01/01",
48 | "lte": "2020/01/31"
49 | }
50 | }
51 | }
52 | }
53 | ```
54 |
55 | ### Specifying the time
56 |
57 | ```
58 | GET /products/_search
59 | {
60 | "query": {
61 | "range": {
62 | "created": {
63 | "gte": "2020/01/01 00:00:00",
64 | "lte": "2020/01/31 23:59:59"
65 | }
66 | }
67 | }
68 | }
69 | ```
70 |
71 | ### Specifying a UTC offset
72 |
73 | ```
74 | GET /products/_search
75 | {
76 | "query": {
77 | "range": {
78 | "created": {
79 | "time_zone": "+01:00",
80 | "gte": "2020/01/01 01:00:00",
81 | "lte": "2020/02/01 00:59:59"
82 | }
83 | }
84 | }
85 | }
86 | ```
87 |
88 | ### Specifying a date format
89 |
90 | ```
91 | GET /products/_search
92 | {
93 | "query": {
94 | "range": {
95 | "created": {
96 | "format": "dd/MM/yyyy",
97 | "gte": "01/01/2020",
98 | "lte": "31/01/2020"
99 | }
100 | }
101 | }
102 | }
103 | ```
--------------------------------------------------------------------------------
/Searching for Data/retrieving-documents-by-ids.md:
--------------------------------------------------------------------------------
1 | # Retrieving documents by IDs
2 |
3 | ```
4 | GET /products/_search
5 | {
6 | "query": {
7 | "ids": {
8 | "values": ["100", "200", "300"]
9 | }
10 | }
11 | }
12 | ```
--------------------------------------------------------------------------------
/Searching for Data/searching-for-terms.md:
--------------------------------------------------------------------------------
1 | # Searching for terms
2 |
3 | ## Basic usage
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "term": {
10 | "tags.keyword": "Vegetable"
11 | }
12 | }
13 | }
14 | ```
15 |
16 | ## Explicit syntax
17 |
18 | This is a more explicit syntax than the one above. Use it if you need to add parameters to the query.
19 |
20 | ```
21 | GET /products/_search
22 | {
23 | "query": {
24 | "term": {
25 | "tags.keyword": {
26 | "value": "Vegetable"
27 | }
28 | }
29 | }
30 | }
31 | ```
32 |
33 | ## Case insensitive search
34 |
35 | ```
36 | GET /products/_search
37 | {
38 | "query": {
39 | "term": {
40 | "tags.keyword": {
41 | "value": "Vegetable",
42 | "case_insensitive": true
43 | }
44 | }
45 | }
46 | }
47 | ```
48 |
49 | ## Searching for multiple terms
50 |
51 | ```
52 | GET /products/_search
53 | {
54 | "query": {
55 | "terms": {
56 | "tags.keyword": ["Soup", "Meat"]
57 | }
58 | }
59 | }
60 | ```
61 |
62 | ## Searching for booleans
63 |
64 | ```
65 | GET /products/_search
66 | {
67 | "query": {
68 | "term": {
69 | "is_active": true
70 | }
71 | }
72 | }
73 | ```
74 |
75 | ## Searching for numbers
76 |
77 | ```
78 | GET /products/_search
79 | {
80 | "query": {
81 | "term": {
82 | "in_stock": 1
83 | }
84 | }
85 | }
86 | ```
87 |
88 | ## Searching for dates
89 |
90 | ```
91 | GET /products/_search
92 | {
93 | "query": {
94 | "term": {
95 | "created": "2007/10/14"
96 | }
97 | }
98 | }
99 | ```
100 |
101 | ## Searching for timestamps
102 |
103 | ```
104 | GET /products/_search
105 | {
106 | "query": {
107 | "term": {
108 | "created": "2007/10/14 12:34:56"
109 | }
110 | }
111 | }
112 | ```
--------------------------------------------------------------------------------
/Searching for Data/searching-multiple-fields.md:
--------------------------------------------------------------------------------
1 | # Searching multiple fields
2 |
3 | ## Basic usage
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "multi_match": {
10 | "query": "vegetable",
11 | "fields": ["name", "tags"]
12 | }
13 | }
14 | }
15 | ```
16 |
17 | ## Per-field relevance boosting
18 |
19 | ```
20 | GET /products/_search
21 | {
22 | "query": {
23 | "multi_match": {
24 | "query": "vegetable",
25 | "fields": ["name^2", "tags"]
26 | }
27 | }
28 | }
29 | ```
30 |
31 | ## Specifying a tie breaker
32 |
33 | ```
34 | GET /products/_search
35 | {
36 | "query": {
37 | "multi_match": {
38 | "query": "vegetable broth",
39 | "fields": ["name", "description"],
40 | "tie_breaker": 0.3
41 | }
42 | }
43 | }
44 | ```
--------------------------------------------------------------------------------
/Searching for Data/the-match-query.md:
--------------------------------------------------------------------------------
1 | # The match query
2 |
3 | ## Basic usage
4 |
5 | ```
6 | GET /products/_search
7 | {
8 | "query": {
9 | "match": {
10 | "name": "pasta"
11 | }
12 | }
13 | }
14 | ```
15 |
16 | Full text queries are analyzed (and therefore case insensitive), so the below query yields the same results.
17 |
18 | ```
19 | GET /products/_search
20 | {
21 | "query": {
22 | "match": {
23 | "name": "PASTA"
24 | }
25 | }
26 | }
27 | ```
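
Not part of the original notes, but you can verify how a search term is analyzed with the Analyze API (assuming the `name` field uses the default `standard` analyzer). The sketch below returns the single lowercased term `pasta`, which is why the uppercase query above yields the same results:

```
POST /_analyze
{
  "analyzer": "standard",
  "text": "PASTA"
}
```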
28 |
29 | ## Searching for multiple terms
30 |
31 | ```
32 | GET /products/_search
33 | {
34 | "query": {
35 | "match": {
36 | "name": "PASTA CHICKEN"
37 | }
38 | }
39 | }
40 | ```
41 |
42 | ## Specifying the operator
43 |
44 | The operator defaults to `or`. The query below makes both terms required.
45 |
46 | ```
47 | GET /products/_search
48 | {
49 | "query": {
50 | "match": {
51 | "name": {
52 | "query": "pasta chicken",
53 | "operator": "and"
54 | }
55 | }
56 | }
57 | }
58 | ```
--------------------------------------------------------------------------------
/recipes-bulk.json:
--------------------------------------------------------------------------------
1 | {"index":{"_id":1}}
2 | {"title":"Fast and Easy Pasta With Blistered Cherry Tomato Sauce","description":"Cherry tomatoes are almost always sweeter, riper, and higher in pectin than larger tomatoes at the supermarket. All of these factors mean that cherry tomatoes are fantastic for making a rich, thick, flavorful sauce. Even better: It takes only four ingredients and about 10 minutes, start to finish — less time than it takes to cook the pasta you're gonna serve it with.","preparation_time_minutes":12,"servings":{"min":4,"max":6},"steps":["Place pasta in a large skillet or sauté pan and cover with water and a big pinch of salt. Bring to a boil over high heat, stirring occasionally. Boil until just shy of al dente, about 1 minute less than the package instructions recommend.","Meanwhile, heat garlic and 4 tablespoons (60ml) olive oil in a 12-inch skillet over medium heat, stirring frequently, until garlic is softened but not browned, about 3 minutes. Add tomatoes and cook, stirring, until tomatoes begin to burst. You can help them along by pressing on them with the back of a wooden spoon as they soften.","Continue to cook until sauce is rich and creamy, about 5 minutes longer. Stir in basil and season to taste with salt and pepper.","When pasta is cooked, drain, reserving 1 cup of pasta water. Add pasta to sauce and increase heat to medium-high. Cook, stirring and tossing constantly and adding reserved pasta water as necessary to adjust consistency to a nice, creamy flow. Remove from heat, stir in remaining 2 tablespoons (30ml) olive oil, and grate in a generous shower of Parmesan cheese. Serve immediately, passing extra Parmesan at the table."],"ingredients":[{"name":"Dry pasta","amount":450,"unit":"grams"},{"name":"Kosher salt"},{"name":"Cloves garlic","amount":4,"unit":"pcs"},{"name":"Extra-virgin olive oil","amount":90,"unit":"ml"},{"name":"Cherry tomatoes","amount":750,"unit":"grams"},{"name":"Fresh basil leaves","amount":30,"unit":"grams"},{"name":"Freshly ground black pepper"},{"name":"Parmesan cheese"}],"created":"2017-03-29T14:43:21Z","ratings":[4.5,5.0,3.0,4.5]}
3 | {"index":{"_id":2}}
4 | {"title":"Pasta With Butternut Squash and Sage Brown Butter","description":"Brown butter-based pasta sauces are some of the simplest things around. They're emulsions made with a flavorful fat and pasta cooking water that coats the pasta in a thin, creamy sheen of flavor. Throw in some sautéd squash and some sage and you've got yourself a great 30-minute meal. It's a classic fall and winter dish that can be made right on the stovetop.","preparation_time_minutes":30,"servings":{"min":4,"max":6},"steps":["Heat olive oil in a large stainless steel or cast-iron skillet over high heat until very lightly smoking. Immediately add squash, season with salt and pepper, and cook, stirring and tossing occasionally, until well-browned and squash is tender, about 5 minutes. Add butter and shallots and continue cooking, stirring frequently, until butter is lightly browned and smells nutty, about 1 minute longer. Add sage and stir to combine (sage should crackle and let off a great aroma). Remove from heat and stir in lemon juice. Set aside.","In a medium saucepan, combine pasta with enough room temperature or hot water to cover by about 2 inches. Season with salt. Set over high heat and bring to a boil while stirring frequently. Cook, stirring frequently, until pasta is just shy of al dente, about 2 minutes less than the package directions. Drain pasta, reserving a couple cups of the starchy cooking liquid.","Add pasta to skillet with squash along with a splash of pasta water. Bring to a simmer over high heat and cook until the pasta is perfectly al dente, stirring and tossing constantly and adding a splash of water as needed to keep the sauce loose and shiny. Off heat, stir in Parmigiano-Reggiano. Adjust seasoning with salt and pepper and texture with more pasta water as needed. Serve immediately, topped with more cheese at the table."],"ingredients":[{"name":"extra-virgin olive oil","amount":30,"unit":"ml"},{"name":"Butternut squash","amount":450,"unit":"grams"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Unsalted butter","amount":30,"unit":"grams"},{"name":"Shallot","amount":1,"unit":"pcs"},{"name":"Fresh sage leaves","amount":15,"unit":"grams"},{"name":"Lemon juice","amount":15,"unit":"ml"},{"name":"Penne","amount":450,"unit":"grams"},{"name":"Grated fresh Parmigiano-Reggiano cheese","amount":30,"unit":"grams"}],"created":"2014-12-19T10:21:01Z","ratings":[3.0,4.0,3.5,2.0]}
5 | {"index":{"_id":3}}
6 | {"title":"Ricotta Gnocchi With Asparagus and Prosciutto","description":"Fresh ricotta gnocchi may be the fastest fresh-pasta recipe I know. With a little practice, I've gotten it down to under ten minutes (8 minutes 53 seconds, to be precise). But the great part about this recipe is that it serves as a suitable base for a huge variety of sauces and flavors. For instance, last week a friend of mine brought over some delicious first-of-the-season fresh asparagus which we combined with prosciutto and an easy cream sauce to make a delicious impromptu (and fast!) meal on the spot.","preparation_time_minutes":20,"servings":{"min":3,"max":4},"steps":["Set a large pot of salted water over high heat. Meanwhile, heat olive oil in a large skillet or slope-sided saucepan over medium-high heat until shimmering. Add prosciutto and cook, stirring, until mostly crisp, about 2 minutes. Add scallions and garlic and cook, stirring, until fragrant, about 1 minute. Add asparagus and cook, tossing and stirring frequently, until asparagus is just starting to turn tender, about 2 minutes.","Add heavy cream and half of Parmesan. Cook, stirring, until cream thickens and coats the asparagus pieces, about 4 minutes. Season to taste with salt and pepper.","Add gnocchi to now-boiling pot of water, stir gently, and cook until gnocchi float for 30 seconds, about 3 minutes total. Drain gnocchi, reserving 1/4 cup of pasta cooking water. Add gnocchi, lemon juice, half of lemon zest, chives, and 2 tablespoons of cooking water to saucepan with sauce and bring to a hard boil, stirring gently. Add more pasta water to thin sauce to desired consistency. Serve immediately, topped with lemon zest and additional Parmesan cheese."],"ingredients":[{"name":"Extra-virgin olive oil","amount":1,"unit":"tbsp"},{"name":"Thinly sliced prosciutto","amount":115,"unit":"grams"},{"name":"Thinly sliced green garlic","amount":0.25,"unit":"cups"},{"name":"Cloves garlic","amount":2,"unit":"pcs"},{"name":"Asparagus","amount":450,"unit":"grams"},{"name":"Heavy cream","amount":1,"unit":"cups"},{"name":"Grated Parmigiano-Reggiano cheese","amount":55,"unit":"grams"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Lemon juice","amount":1,"unit":"tbsp"},{"name":"grated zest","amount":1,"unit":"tsp"},{"name":"Minced fresh chives","amount":2,"unit":"tbsp"},{"name":"Ricotta"}],"created":"2007-01-23T07:51:22Z","ratings":[5.0]}
7 | {"index":{"_id":4}}
8 | {"title":"Shrimp Fra Diavolo (Spaghetti With Spicy Tomato Sauce)","description":"Lobster fra diavolo is a classic Italian-American pasta dish, but the lobster version is a lot easier for restaurants than home cooks. Shrimp make an excellent stand-in, as long as you know how to infuse the spicy tomato sauce with some real shellfish flavor.","preparation_time_minutes":30,"servings":{"min":4,"max":4},"steps":["Bring a large pot of salted water to a boil. In a medium bowl, toss the shrimp well with the 1/2 teaspoon salt and the baking soda. Set aside.","In a large skillet or sauté pan, heat 4 tablespoons olive oil over medium-high heat until shimmering, Add reserved shrimp shells and cook, stirring constantly, until they've all turned a reddish color, about 4 minutes. Off the heat, remove the shells using tongs, a slotted spatula, and/or a slotted spoon, allowing any excess oil to drain back into skillet as you go; discard the shells. You should still have plenty of oil left in the skillet.","Return the skillet to medium-high heat, add the shrimp, and cook, stirring and turning occasionally, until shrimp are just starting to brown in spots and are almost fully cooked through, about 3 minutes. Off the heat, transfer shrimp to a plate and set aside.","Return the skillet to medium-low heat. Add garlic, oregano, and chili flakes and cook, stirring, until garlic is just beginning to turn golden, about 3 minutes. Add brandy, if using, and cook until almost fully evaporated. Add tomatoes and clam juice and bring to a simmer. Season with salt.","Boil the pasta in the pot of salted water until al dente. Drain, reserving about 1 cup of the pasta cooking water, and add pasta to the sauce along with a splash of the pasta cooking water. Add shrimp and cook over medium-high heat, stirring, until the sauce reduces and clings to pasta and the shrimp are fully heated through; add more pasta water as necessary if the sauce becomes too dry. Season with salt, if necessary.","Stir in parsley and remaining 2 tablespoons olive oil. Serve right away."],"ingredients":[{"name":"Kosher salt","amount":0.5,"unit":"tsp"},{"name":"Large shrimp","amount":340,"unit":"grams"},{"name":"Large pinch baking soda"},{"name":"Extra-virgin olive oil","amount":90,"unit":"ml"},{"name":"Cloves garlic","amount":4,"unit":"pcs"},{"name":"Dried oregano","amount":1.5,"unit":"tsp"},{"name":"Red chili flakes ","amount":1.5,"unit":"tsp"},{"name":"Brandy","amount":30,"unit":"ml"},{"name":"Whole peeled tomatoes","amount":800,"unit":"grams"},{"name":"Clam juice","amount":120,"unit":"ml"},{"name":"Spaghetti","amount":450,"unit":"grams"},{"name":"Minced flat-leaf parsley leaves","amount":0.25,"unit":"cups"}],"created":"2001-07-01T09:43:51Z","ratings":[1.5,3.0,3.5,2.0]}
9 | {"index":{"_id":5}}
10 | {"title":"Stovetop Macaroni and Cheese","description":"This macaroni and cheese — this pot of creamy, gooey, cheesy, glorious macaroni and cheese — was made with three ingredients in about 10 minutes. Seriously. That's one fewer ingredient than you need to add to the pot to make a box of Kraft macaroni and cheese. Not only that, but all three ingredients are staples, with shelf lives of weeks or months, which means that a simple lunch is always on hand.","preparation_time_minutes":8,"servings":{"min":2,"max":2},"steps":["Place macaroni in a medium saucepan or skillet and add just enough cold water to cover. Add a pinch of salt and bring to a boil over high heat, stirring frequently. Continue to cook, stirring, until water has been almost completely absorbed and macaroni is just shy of al dente, about 6 minutes.","Immediately add evaporated milk and bring to a boil. Add cheese. Reduce heat to low and cook, stirring continuously, until cheese is melted and liquid has reduced to a creamy sauce, about 2 minutes longer. Season to taste with more salt and serve immediately."],"ingredients":[{"name":"Elbow macaroni","amount":170,"unit":"grams"},{"name":"Evaporated milk","amount":180,"unit":"ml"},{"name":"Cheddar cheese","amount":170,"unit":"grams"}],"created":"2013-04-17T21:01:55Z","ratings":[5.0,5.0,4.5,5.0,4.0,5.0]}
11 | {"index":{"_id":6}}
12 | {"title":"Spaghetti Aglio e Olio Recipe","description":"One of the most basic pasta sauces, aglio e olio uses just garlic and olive oil (and maybe a pinch of red pepper flakes for heat). It sounds too simple to be good, but it's among the best.","preparation_time_minutes":10,"servings":{"min":4,"max":4},"steps":["In a pot of salted boiling water, cook spaghetti until just shy of al dente (about 1 minute less than the package directs). Reserve pasta cooking water.","Meanwhile, in a large skillet, combine 6 tablespoons oil and garlic. Add pinch of red pepper flakes, if using. Cook over medium heat until garlic is very lightly golden, about 5 minutes. (Adjust heat as necessary to keep it gently sizzling.)","Transfer pasta to skillet along with 1/2 cup pasta water, increase heat to high, and cook, stirring and tossing rapidly, until a creamy, emulsified sauce forms and coats the noodles. Remove from heat, add remaining 2 tablespoons olive oil, and stir well to combine. Mix in parsley, if using, and serve right away."],"ingredients":[{"name":"Kosher salt"},{"name":"Dried spaghetti","amount":450,"unit":"grams"},{"name":"Extra-virgin olive oil","amount":120,"unit":"ml"},{"name":"Cloves garlic","amount":4,"unit":"pcs"},{"name":"Red pepper flakes"},{"name":"Minced flat-leaf parsley"}],"created":"2016-09-23T22:34:00Z","ratings":[0.5,3.5,2.5,3.0,1.0,4.0]}
13 | {"index":{"_id":7}}
14 | {"title":"Cacio e Pepe (Spaghetti With Black Pepper and Pecorino Romano)","description":"If you were to watch a practiced hand make cacio e pepe, you might think the instructions were as simple as this: Cook spaghetti and drain. Toss with olive oil, butter, black pepper, and grated Pecorino Romano cheese. Serve. But we all know that the simplest recipes can often be the most confounding, and so it is with cacio e pepe. Follow those instructions and, if you're lucky, you'll get what you're after: a creamy, emulsified sauce that coats each strand of spaghetti with flavor. More likely, you're gonna get what I (and, from the stories I've heard, many others as well) got on the first few tries — spaghetti in a thin, greasy sauce, or spaghetti with clumps of cheese that refuse to melt. Or, worse, both at the same time. Here's how to make it perfectly every time.","preparation_time_minutes":15,"servings":{"min":2,"max":3},"steps":["Heat 3 tablespoons olive oil and about a teaspoon of black pepper in a medium skillet over medium-low heat until ingredients are fragrant and pepper is barely starting to sizzle, about 1 minute. Set aside.","Place spaghetti in a large skillet and cover with water. Season with a small pinch of salt, then bring to a boil over high heat, prodding spaghetti occasionally with a fork or wooden spoon to prevent it from clumping. Cook until spaghetti is al dente (typically about 1 minute less than the package recommends). Transfer 2 to 3 tablespoons of pasta cooking water to the skillet with the olive oil/pepper mixture. Stir in butter. Using tongs, lift spaghetti and transfer it to the oil/butter mixture.","Add cheese and remaining tablespoon olive oil to the skillet and stir with a fork until cheese is completely melted. Add a few more tablespoons of pasta water to the skillet to adjust consistency, reheating as necessary until the sauce is creamy and coats each strand of spaghetti. Season to taste with salt and more black pepper. Serve immediately, passing extra grated cheese and black pepper at the table."],"ingredients":[{"name":"Extra-virgin olive oil","amount":60,"unit":"ml"},{"name":"Coarsely ground black pepper"},{"name":"Kosher salt"},{"name":"Spaghetti","amount":225,"unit":"grams"},{"name":"Unsalted butter","amount":30,"unit":"grams"},{"name":"Pecorino Romano cheese","amount":55,"unit":"grams"}],"created":"2009-11-24T00:59:21Z","ratings":[4.0,3.0,4.5,3.5]}
15 | {"index":{"_id":8}}
16 | {"title":"Vegan Carbonara Pasta","description":"Carbonara may be one of the most difficult recipes to vegan-ify, since every major ingredient in the sauce is off-limits. But by eating lots of the real deal and getting mighty crafty with an array of unlikely ingredients, I managed to create a vegan carbonara that captures the essence of the original like no other: It's silky and rich, unctuous, and studded with meaty bits, with the sharp, lactic tang of Pecorino Romano (but, of course, no actual Pecorino Romano).","preparation_time_minutes":30,"servings":{"min":4,"max":4},"steps":["In a blender, combine tofu, sauerkraut brine, nutritional yeast, miso, cayenne or chili flakes, smoked paprika, black pepper, and vinegar or lemon juice. Blend at high speed, stopping to scrape down sides if necessary, until a very smooth, silky sauce forms. Season with salt. Add 1/4 cup (60ml) olive oil and blend in at low speed just until emulsified.","In a large sauté pan, heat remaining 1/4 cup (60ml) olive oil over medium-high heat until shimmering. Add mushrooms and cook, stirring, until browned, about 6 minutes.","In a pot of salted boiling water, cook pasta until just al dente. Transfer pasta to pan with mushrooms, reserving pasta-cooking water. Pour on just enough creamy sauce to coat all the pasta, then add about 1/4 cup (60ml) pasta-cooking water. Cook over medium heat, stirring, until sauce forms a silky glaze that coats pasta. Serve."],"ingredients":[{"name":"Silken tofu","amount":200,"unit":"grams"},{"name":"Sauerkraut brine","amount":120,"unit":"ml"},{"name":"Nutritional yeast","amount":15,"unit":"grams"},{"name":"White miso","amount":15,"unit":"ml"},{"name":"Cayenne pepper"},{"name":"Smoked paprika","amount":0.25,"unit":"tsp"},{"name":"Freshly ground black pepper","amount":2,"unit":"tsp"},{"name":"White wine vinegar","amount":1,"unit":"tsp"},{"name":"Kosher salt"},{"name":"Extra-virgin olive oil","amount":120,"unit":"ml"},{"name":"King oyster mushrooms","amount":115,"unit":"grams"},{"name":"Dry spaghetti or penne","amount":450,"unit":"grams"}],"created":"2002-01-04T12:55:42Z","ratings":[3.0,2.5,4.0,4.5,3.5,4.0]}
17 | {"index":{"_id":9}}
18 | {"title":"Bucatini all'Amatriciana","description":"Debate rages over the correct way to make a classic Roman amatriciana sauce of cured pork and tomatoes. We tested all the variables to come up with this ideal version, which packs a delicate heat, gentle black-pepper spice, sharp Pecorino Romano cheese, and the intriguing interplay of sweet-tart tomato sauce and rich, fatty cured pork.","preparation_time_minutes":20,"servings":{"min":4,"max":4},"steps":["In a large skillet, heat olive oil over medium-high heat until shimmering. Add guanciale and pepper flakes and cook, stirring, until lightly browned, about 5 minutes. Add wine and cook, scraping up any browned bits on bottom of pan, until nearly evaporated, about 3 minutes.","Add tomatoes and bring to a simmer. Season with salt and pepper.","Meanwhile, boil pasta in salted water until just shy of al dente, about 1 minute less than package recommends. Using tongs, transfer pasta to sauce, along with 1/4 cup pasta cooking water. Cook over high heat, stirring and tossing rapidly, until pasta is al dente and sauce has thickened and begins to coat noodles. Remove from heat, add cheese, and stir rapidly to incorporate. Season to taste with more salt and pepper. Serve right away, passing more cheese at the table."],"ingredients":[{"name":"Extra-virgin olive oil","amount":1,"unit":"tsp"},{"name":"Guanciale","amount":170,"unit":"grams"},{"name":"Pinch red pepper flakes"},{"name":"Dry white wine","amount":60,"unit":"ml"},{"name":"Can whole peeled tomatoes","amount":425,"unit":"grams"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Dried bucatini pasta","amount":450,"unit":"grams"},{"name":"Grated Pecorino Romano cheese","amount":30,"unit":"grams"}],"created":"2016-09-24T16:51:22Z","ratings":[5.0,5.0,4.5,5.0,4.0,5.0,3.5,4.5]}
19 | {"index":{"_id":10}}
20 | {"title":"Penne With Hot-As-You-Dare Arrabbiata Sauce","description":"Exceedingly simple in concept and execution, arrabbiata sauce is tomato sauce with the distinction of being spicy enough to earn its \"angry\" moniker. Here's how to make it, from start to finish.","preparation_time_minutes":15,"servings":{"min":4,"max":4},"steps":["In a medium saucepan of boiling salted water, cook penne until just short of al dente, about 1 minute less than the package recommends.","Meanwhile, in a large skillet, combine oil, garlic, and pepper flakes. Cook over medium heat until garlic is very lightly golden, about 5 minutes. (Adjust heat as necessary to keep it gently sizzling.)","Add tomatoes, stir to combine, and bring to a bare simmer. When pasta is ready, transfer it to sauce using a strainer or slotted spoon. (Alternatively, drain pasta through a colander, reserving 1 cup of cooking water. Add drained pasta to sauce.)","Add about 1/4 cup pasta water to sauce and increase heat to bring pasta and sauce to a vigorous simmer. Cook, stirring and shaking the pan and adding more pasta water as necessary to keep sauce loose, until pasta is perfectly al dente, 1 to 2 minutes longer. (The pasta will cook more slowly in the sauce than it did in the water.)","Continue cooking pasta until sauce thickens and begins to coat noodles, then remove from heat and toss in cheese and parsley, stirring vigorously to incorporate. Stir in a drizzle of fresh olive oil, if desired. Season with salt and serve right away, passing more cheese at the table."],"ingredients":[{"name":"Kosher salt"},{"name":"Penne pasta","amount":450,"unit":"grams"},{"name":"Extra-virgin olive oil","amount":3,"unit":"tbsp"},{"name":"Clove garlic","amount":1,"unit":"pcs"},{"name":"Crushed red pepper"},{"name":"Can whole peeled tomatoes","amount":400,"unit":"grams"},{"name":"Finely grated Parmesan cheese","amount":60,"unit":"grams"},{"name":"Minced flat-leaf parsley leaves","amount":1,"unit":"handful"}],"created":"2017-04-27T15:14:52Z","ratings":[1.5,2.0,4.0,3.5,3.0,5.0,1.5]}
21 | {"index":{"_id":11}}
22 | {"title":"Spaghetti Puttanesca (Pasta or Spaghetti With Capers, Olives, and Anchovies)","description":"\"Puttanesca\" literally translates to \"in the style of prostitutes,\" supposedly because the pungent aromas of garlic, anchovies, capers, and olives tossed with pasta were how Neapolitan prostitutes would lead customers to their doors. This is one of those stories that seem, in the words of Douglas Adams, apocryphal or at least wildly inaccurate. That said, it's a fitting title — puttanesca packs an aromatic punch and then some.","preparation_time_minutes":15,"servings":{"min":2,"max":3},"steps":["Place spaghetti in a large skillet, sauté pan, or saucepan and cover with water. Add a small pinch of salt. Bring to a boil over high heat, stirring occasionally to prevent pasta from sticking.","Meanwhile, in a medium skillet, combine 4 tablespoons (60ml) oil, garlic, anchovies, and red pepper flakes. Cook over medium heat until garlic is very lightly golden, about 5 minutes. (Adjust heat as necessary to keep it gently sizzling.) Add capers and olives and stir to combine.","Add tomatoes, stir to combine, and bring to a bare simmer. Continue to simmer until pasta is cooked to just under al dente (about 1 minute less than the package recommends).","Using tongs, transfer pasta to sauce. Alternatively, drain pasta through a colander, reserving 1 cup of the cooking water. Add drained pasta to sauce.","Add a few tablespoons of pasta water to sauce and increase heat to bring pasta and sauce to a vigorous simmer. Cook, stirring and shaking the pan and adding more pasta water as necessary to keep sauce loose, until pasta is perfectly al dente, 1 to 2 minutes longer. (The pasta will cook more slowly in the sauce than it did in the water.) Stir in remaining olive oil, parsley, and cheese.","Season with salt and pepper. (Be generous with the pepper and scant with the salt — the dish will be plenty salty from the other ingredients.) If using, stir in canned tuna and break it up with a fork. Serve immediately with more grated cheese at the table."],"ingredients":[{"name":"Dried spaghetti","amount":225,"unit":"grams"},{"name":"Kosher salt"},{"name":"Extra-virgin olive oil","amount":6,"unit":"tbsp"},{"name":"Cloves garlic","amount":4,"unit":"pcs"},{"name":"Anchovy fillets","amount":5,"unit":"pcs"},{"name":"Large pinch red pepper flakes"},{"name":"Capers","amount":0.25,"unit":"cups"},{"name":"Pitted black olives","amount":0.25,"unit":"cups"},{"name":"Whole peeled tomatoes","amount":225,"unit":"grams"},{"name":"Minced fresh parsley leaves","amount":1,"unit":"handful"},{"name":"Finely grated Pecorino Romano or Parmesan cheese","amount":30,"unit":"grams"},{"name":"Freshly ground black pepper"},{"name":"Can oil-packed tuna","amount":140,"unit":"grams"}],"created":"2011-02-21T11:01:26Z","ratings":[0.5,1.0,0.5,1.5,2.0]}
23 | {"index":{"_id":12}}
24 | {"title":"Penne With Melted-Vegetable Sauce","description":"Made with vegetables that have been cooked until meltingly soft, this penne pasta dish is one of those great examples of what makes classic rustic Italian cooking so special: It makes the most of humble and unassuming ingredients, turning them into something downright delicious.","preparation_time_minutes":30,"servings":{"min":4,"max":4},"steps":["In a medium pot of salted boiling water, cook potato until a piece is easily crushed between fingers, about 5 minutes. Using fine strainer, transfer to large mixing bowl. Working one vegetable at a time, continue by boiling carrots, string beans, fennel, and onion until each is well done, about 5 minutes each; add each vegetable to mixing bowl as it is ready.","Add penne to boiling water and cook until al dente, following timing on package. Drain, reserving 1 cup of cooking water.","Meanwhile, add garlic, olive oil, and parsley to vegetables, and mix thoroughly until potatoes have broken down for form a chunky puree. Season with salt and pepper.","Add penne and a healthy grating of Parmigiano-Reggiano to vegetable sauce and stir to combine, adding cooking water 1 tablespoon at a time if sauce is too thick. Spoon into bowls, top with additional grated cheese, and serve."],"ingredients":[{"name":"Kosher salt"},{"name":"Large russet potato","amount":1,"unit":"pcs"},{"name":"Medium carrots","amount":2,"unit":"pcs"},{"name":"String beans","amount":85,"unit":"grams"},{"name":"Small fennel bulb","amount":0.5,"unit":"pcs"},{"name":"Small red onion","amount":1,"unit":"pcs"},{"name":"Dried penne","amount":450,"unit":"grams"},{"name":"Cloves garlic","amount":3,"unit":"pcs"},{"name":"Extra-virgin olive oil","amount":0.75,"unit":"cups"},{"name":"Minced parsley","amount":0.25,"unit":"cups"},{"name":"Freshly ground black pepper"},{"name":"Grated Parmigiano-Reggiano"}],"created":"2017-05-20T13:15:27Z","ratings":[4.0,4.5,4.0,3.5,3.5,3.0,4.0]}
25 | {"index":{"_id":13}}
26 | {"title":"Pesto Pasta With Potatoes and Green Beans","description":"This classic Genovese method of preparing pasta with pesto includes cubes of potato and pieces of green bean, all cooked together in the pasta pot until tender.","preparation_time_minutes":15,"servings":{"min":4,"max":4},"steps":["In a large pot of salted boiling water, boil pasta, potato, and green beans until pasta is al dente and potato and green beans are very tender. Drain, reserving 1 cup cooking water, and transfer pasta, potato, and green beans to a large mixing or serving bowl.","Add pesto sauce to pasta along with 1/4 cup pasta cooking water. Toss well to emulsify pesto and pasta water into a creamy sauce. Add more pasta water, 1 tablespoon at a time, as needed, if pasta is too dry. Drizzle in fresh olive oil, if desired. Serve with Parmigiano Reggiano on the side."],"ingredients":[{"name":"Kosher salt"},{"name":"Dried pasta","amount":450,"unit":"grams"},{"name":"Peeled Yukon Gold potato","amount":140,"unit":"grams"},{"name":"Green beans","amount":110,"unit":"grams"},{"name":"Pesto sauce"},{"name":"Extra-virgin olive oil"},{"name":"Grated Parmigiano Reggiano"}],"created":"2010-12-21T08:52:29Z","ratings":[]}
27 | {"index":{"_id":14}}
28 | {"title":"Lighter Fettuccine Alfredo Recipe","description":"Don't get me wrong — I'm not a health nut or a calorie counter. But let's face it: The feeling you get after downing a bowl of creamy, cheesy fettuccine Alfredo ain't the best. Wouldn't it be great to have a quick and easy version that's just as good as the typical cream-packed rendition, but has a cleaner flavor and doesn't leave you in a food coma?","preparation_time_minutes":30,"servings":{"min":4,"max":4},"steps":["Combine cheese, heavy cream, egg, cornstarch, olive oil, and lemon zest (if using) in a large bowl. Season lightly with salt and heavily with black pepper and whisk to combine. Set aside.","In a large Dutch oven or saucepan, bring 2 quarts of water and 2 tablespoons (24g) of salt to a boil over high heat. Add pasta and cook, stirring frequently to prevent sticking, until cooked but still very firm (not quite al dente), about 45 seconds for fresh pasta or 1 minute less than package directions indicate for dried pasta. Drain pasta into a colander set over a large bowl. Transfer 2 cups (480ml) of cooking water to a liquid measuring cup and discard the rest. Transfer pasta to the now-empty bowl. Add garlic and butter and toss to coat.","Whisking constantly, slowly add 1 1/2 cups of pasta cooking water to bowl with cheese mixture. Transfer cheese mixture to the now-empty pasta cooking pot, scraping the bottom to make sure you get everything. Cook over medium-high heat, stirring constantly with a rubber spatula, until mixture comes to a boil and thickens, about 45 seconds. Season sauce to taste with more salt and pepper as desired. Transfer pasta to sauce mixture and turn to coat. Just before serving, stir in more pasta water to thin sauce out as necessary. Serve immediately, sprinkled with minced herbs, black pepper, and cheese, and drizzled with additional olive oil."],"ingredients":[{"name":"Grated Parmigiano Reggiano cheese","amount":140,"unit":"grams"},{"name":"Heavy cream","amount":30,"unit":"ml"},{"name":"Egg","amount":1,"unit":"pcs"},{"name":"Cornstarch","amount":1,"unit":"tsp"},{"name":"Extra-virgin olive oil","amount":2,"unit":"tbsp"},{"name":"Grated lemon zest","amount":0.5,"unit":"tsp"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Fresh fettuccine","amount":450,"unit":"grams"},{"name":"Minced garlic","amount":1,"unit":"tsp"},{"name":"Unsalted butter","amount":30,"unit":"grams"},{"name":"Minced fresh parsley"}],"created":"2008-06-04T14:13:56Z","ratings":[4.0]}
29 | {"index":{"_id":15}}
30 | {"title":"Lighter Tuna Noodle Casserole","description":"Pasta with a light and creamy sauce, tender chunks of tuna, and peas is ready in about 15 minutes start to finish. This is the kind of recipe that I wish I'd known in college. All it takes is a single large skillet or pot, one burner or hot plate, a bowl, and a fork. That's it. And on top of that, it turns out a dish that's not just good-given-the-constraints, but legitimately good-enough-that-I-would've-made-it-for-that-girl-I-was-trying-to-impress-in-college or even good-enough-for-a-mildly-romantic-weeknight-dinner-with-the-wife.","preparation_time_minutes":15,"servings":{"min":4,"max":4},"steps":["Place noodles in a large saucepan or skillet and cover with room temperature tap water by 1 inch. Season lightly with salt. Bring to a boil over high heat and cook, stirring occasionally, until almost done (follow package directions for cook times). Drain all but 1 cup of water. Return to high heat and simmer until water is almost completely reduced.","Meanwhile, combine créme fraiche, egg, cornstarch, and 1 tablespoon lemon juice in a medium bowl and whisk with a fork or whisk until homogenous.","When noodles are cooked, add créme fraiche mixture to skillet and cook, stirring and tossing constantly, until sauce thickens and coats noodles. Season generously with black pepper and salt. Off heat, gently fold in tuna, peas, parsley, and remaining lemon juice until warmed through. Serve immediately drizzled with extra-virgin olive oil and sprinkled with crushed potato chips, if desired."],"ingredients":[{"name":"Dried wide egg noodles","amount":170,"unit":"grams"},{"name":"Kosher salt"},{"name":"Créme fraiche","amount":170,"unit":"grams"},{"name":"Egg","amount":1,"unit":"pcs"},{"name":"Cornstarch","amount":2,"unit":"tsp"},{"name":"Freshly ground black pepper"},{"name":"Lemon juice","amount":2,"unit":"tbsp"},{"name":"Tuna","amount":1,"unit":"cans"},{"name":"Frozen peas","amount":1,"unit":"cups"},{"name":"Picked fresh parsley leaves","amount":0.25,"unit":"cups"},{"name":"Extra-virgin olive oil"},{"name":"Crushed potato chips"}],"created":"2010-08-21T12:51:36Z","ratings":[4.5,3.5]}
31 | {"index":{"_id":16}}
32 | {"title":"One-Skillet Orecchiette With Shrimp, Spinach, and Mushrooms","description":"Earthy, meaty mushrooms, tender shrimp, and silky strands of spinach are the stars of this easy, one-pot pasta dish. A perfect choice for a weeknight dinner, this recipe comes together in less than 30 minutes and makes minimal mess as the action occurs in a single pan.","preparation_time_minutes":25,"servings":{"min":4,"max":4},"steps":["Heat 1 tablespoon olive oil in a pot or 12-inch skillet over medium-high heat until shimmering. Add mushrooms, season with salt, and cook until golden brown, about 3 minutes. Set aside.","In the same pot, add the remaining 1 tablespoon oil and cook until shimmering. Add shallot and a pinch of salt and cook until softened, about 2 minutes. Add garlic and chili flakes and cook until fragrant, about 30 seconds. Then add white wine, scraping up any brown bits in the pan and allow it to reduce slightly, about 1 minute.","Add the pasta and stock, season with salt and bring to a boil. Cook orecchiette, stirring occasionally, until the broth is mostly absorbed and the pasta is al dente, about 10 minutes.","Stir in the spinach and shrimp and cook until the spinach is wilted and the shrimp are pink throughout, about 3 minutes. Stir in the mushrooms and lemon juice, and season with salt and pepper. Serve immediately."],"ingredients":[{"name":"Extra-virgin olive oil","amount":2,"unit":"tbsp"},{"name":"Oyster mushrooms","amount":2,"unit":"cups"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Shallot","amount":1,"unit":"pcs"},{"name":"Cloves garlic","amount":2,"unit":"pcs"},{"name":"Dried red chili flakes"},{"name":"Dry white wine","amount":0.25,"unit":"cups"},{"name":"Dried orecchiette","amount":450,"unit":"grams"},{"name":"Homemade chicken stock","amount":950,"unit":"ml"},{"name":"Spinach","amount":2,"unit":"cups"},{"name":"Medium shrimp","amount":450,"unit":"grams"},{"name":"Lemon juice","amount":2,"unit":"tbsp"}],"created":"2005-07-30T09:31:01Z","ratings":[]}
33 | {"index":{"_id":17}}
34 | {"title":"Skillet Pasta With Mushrooms, Pancetta, and Wilted Greens","description":"This easy one-pot pasta dish is filled with browned bits of pancetta, earthy Shiitake mushrooms and wilted greens, and comes together in just 25 minutes. Finished with nutty shavings of Parmesan and freshly cracked black pepper, it's a perfect weeknight meal.","preparation_time_minutes":30,"servings":{"min":4,"max":4},"steps":["Heat oil in a large saucepan over medium-high heat until shimmering. Add pancetta and cook until fat begins to render and pancetta is lightly browned, about 2 minutes. Stir in mushrooms, tossing until coated with oil. Cook, stirring, until mushrooms have browned and pancetta has rendered most of its fat, about 5 minutes longer. Stir in shallots and serrano and cook until softened, about 1 minute. Remove from heat and scrape pancetta mixture into a bowl. Set aside.","Add stock to now-empty saucepan and bring to a boil over high heat. Add pasta. Cook pasta, stirring occasionally, for 2 minutes less than package instructions, then stir in greens. Cook for 1 minute longer, then stir in pancetta mixture, along with all the rendered fat. Cook until the pasta is al dente, greens are wilted, and liquid has mostly evaporated and formed an emulsified, creamy sauce, about 1 minute (if necessary, add water a tablespoon at a time if liquid completely evaporates before pasta is tender).","Remove from heat and stir in lemon juice and extra-virgin olive oil. Season to taste with salt and pepper. Spoon onto plates and top with Parmesan cheese. Serve immediately."],"ingredients":[{"name":"Extra-virgin olive oil","amount":0.5,"unit":"tbsp"},{"name":"Panchetta","amount":115,"unit":"grams"},{"name":"Shiitake mushrooms","amount":2,"unit":"cups"},{"name":"Shallot","amount":1,"unit":"pcs"},{"name":"Serrano chile pepper","amount":0.5,"unit":"pcs"},{"name":"Homemade chicken or vegetable stock","amount":5,"unit":"cups"},{"name":"Dried fusilli","amount":450,"unit":"grams"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Fresh lemon juice","amount":2,"unit":"tbsp"},{"name":"Greens (e.g. kale)","amount":2,"unit":"handful"},{"name":"Parmesan"}],"created":"2014-03-03T17:21:08Z","ratings":[5.0,1.5,3.0,4.5]}
35 | {"index":{"_id":18}}
36 | {"title":"Vegetarian Citrus Pasta With Swiss Chard","description":"Sumac is a Middle Eastern spice that has a tart lemony flavor. You can find it in specialty shops or order it online. I used whole wheat fusilli, which required about a half cup more liquid than regular pasta and adds a denser mouthfeel to the dish. You can use regular as well — just decrease the cooking liquid by 1/2 cup.","preparation_time_minutes":20,"servings":{"min":4,"max":4},"steps":["Heat the oil in a 12-inch skillet over medium-high heat until shimmering. Add the chard stems and season with salt and cook until they begin to soften, about 3 minutes. Add the shallots and cook until they have softened, adding another pinch of salt and the red chile flakes, about 2 minutes.","Add the stock to the pan with the fusilli and adjust the heat to maintain a vigorous boil and cook according to the package directions until the pasta has about 4 minutes left in the cooking process. Then, stir in the chard leaves and cook until the pasta and chard are done. Stir in the lemon juice and zest and adjust the seasoning as needed. Divide among plates, sprinkle with the sumac and freshly grated cheese."],"ingredients":[{"name":"Olive oil","amount":2,"unit":"tbsp"},{"name":"Swiss chard","amount":3,"unit":"handful"},{"name":"Large shallot","amount":1,"unit":"pcs"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Dried red chile flakes"},{"name":"Homemade vegetable stock","amount":4.5,"unit":"cups"},{"name":"Whole wheat fusilli pasta","amount":450,"unit":"grams"},{"name":"Fresh lemon juice","amount":2,"unit":"tbsp"},{"name":"Sumac","amount":1,"unit":"tbsp"},{"name":"Parmigiano-Reggiano"}],"created":"2007-10-29T09:12:52Z","ratings":[5.0,5.0,4.5,3.0,2.5,4.0,4.5,3.5,4.0]}
37 | {"index":{"_id":19}}
38 | {"title":"Pasta With Mushrooms, Brussels Sprouts, and Parmesan","description":"A quick dinner of orecchiette pasta tossed in a clingy sauce made with mushrooms, shallots, thyme, and brussels sprouts leaves.","preparation_time_minutes":25,"servings":{"min":4,"max":4},"steps":["Bring a medium pot of salted water to a boil. Heat 1 tablespoon olive oil in a large skillet over high heat until smoking. Add brussels sprouts, toss to coat in oil, season with salt and pepper, cook without moving until well charred on one side. Toss and continue to cook until leaves are bright green and charred in spots, about 2 minutes total. Transfer to a bowl and set aside.","Heat remaining 2 tablespoons oil in the same skillet over high heat until lightly smoking. Add mushrooms and cook, tossing occasionally, until moisture has been evaporated and the mushrooms are well browned, about 4 minutes. Add shallots, garlic, and thyme and cook, stirring, until shallots are softened and fragrant, about 1 minute. Add butter, lemon juice, and stock. Simmer until sauce is reduced and emulsified, about 1 minute. Season to taste with salt and pepper (this may not be necessary if stock is store-bought). Set aside off heat.","Add orecchiette to pot and cook, stirring occasionally, until nearly al dente (about 1 minute less than the package instructions). Drain, reserving 1/2 cup cooking water. Add orecchiette, half of Parmesan, reserved pasta cooking water, and a generous amount of black pepper to mushrooms. Cook, stirring, over high heat until pasta is fully al dente and liquid has thickened into a sauce that coats the pasta, about 1 minute. If sauce looks greasy or broken, add 2 tablespoons of stock or water and stir vigorously to bring it back together. Stir in brussels sprouts leaves and serve, topping with more cheese at the table."],"ingredients":[{"name":"Olive oil","amount":3,"unit":"tbsp"},{"name":"Brussels sprouts","amount":115,"unit":"grams"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Mixed mushrooms","amount":225,"unit":"grams"},{"name":"Shallot","amount":1,"unit":"pcs"},{"name":"Cloves garlic","amount":2,"unit":"pcs"},{"name":"Fresh thyme leaves","amount":1,"unit":"tsp"},{"name":"Butter","amount":4,"unit":"tbsp"},{"name":"Freshly squeezed lemon juice","amount":2,"unit":"tsp"},{"name":"Homemade vegetable or chicken stock","amount":0.5,"unit":"cups"},{"name":"Dried orecchiette","amount":450,"unit":"grams"},{"name":"Freshly grated Parmesan cheese","amount":55,"unit":"grams"}],"created":"2015-02-14T11:01:52Z","ratings":[3.5,4.0,3.5,3.0,4.5,4.0]}
39 | {"index":{"_id":20}}
40 | {"title":"Easy Skillet Baked Ziti With Sausage and Ricotta","description":"Nothing says comfort like a baked pasta dish loaded with creamy sauce and cheese. If my full-fledged No-Boil Baked Ziti is the completist, Super Mario 3 version of the dish, this week's skillet ziti is like using the magic whistle to jump straight to World 8. Not quite as satisfying, but a great alternative if time is of the essence.","preparation_time_minutes":30,"servings":{"min":6,"max":8},"steps":["Place pasta in a large bowl and cover with hot water. Season generously with salt. Let rest, stirring twice during the first 10 minutes, while you prepare the other ingredients.","Use a hand blender or countertop blender to process tomatoes until mostly smooth, but still a little chunky. Set aside 3/4 cup of tomatoes. Combine remaining tomatoes, heavy cream, and chicken stock in a medium bowl. Season to taste with salt and set aside.","Heat oil and butter in a large straight-sided sauté pan or Dutch oven over medium-high heat, swirling, until butter is mostly melted. Add sausage and cook, mashing with a potato masher or a whisk, until sausage is no longer pink, about 5 minutes. Add onion and garlic, reduce heat to medium, and cook, stirring frequently, until softened but not browned, about 5 minutes. Add oregano, red pepper flakes, and half of parsley and cook, stirring, until fragrant, about 1 minute.","Add tomato and cream mixture to pan with sausage. Drain noodles in a large colander set in the sink, then add to pan and stir to combine. Stir in half of ricotta, then rapidly stir in half of mozzarella cheese. (Do not over-stir, or the mixture will stretch and stick to your spoon.) Spoon reserved 3/4 cup tomatoes over top of pasta. Dollop with remaining ricotta and scatter remaining mozzarella over top. Sprinkle with half of Parmigiano-Reggiano. Cover and cook over the lowest possible heat for 3 minutes. Remove from heat and let rest, covered, for 5 minutes.","Uncover, sprinkle with remaining Parmigiano-Reggiano and parsley, and serve immediately."],"ingredients":[{"name":"Dry ziti or penne","amount":450,"unit":"grams"},{"name":"Kosher salt"},{"name":"Whole peeled tomatoes","amount":1,"unit":"cans"},{"name":"Heavy cream","amount":1,"unit":"cups"},{"name":"Homemade chicken stock","amount":1,"unit":"cups"},{"name":"Extra-virgin olive oil","amount":2,"unit":"tbsp"},{"name":"Unsalted butter","amount":2,"unit":"tbsp"},{"name":"Italian sausage","amount":450,"unit":"grams"},{"name":"Large onion","amount":1,"unit":"pcs"},{"name":"Cloves garlic","amount":4,"unit":"pcs"},{"name":"Dried oregano","amount":1,"unit":"tbsp"},{"name":"Crushed red pepper flakes","amount":1,"unit":"tsp"},{"name":"Finely minced fresh parsley leaves","amount":0.25,"unit":"cups"},{"name":"Ricotta cheese","amount":350,"unit":"grams"},{"name":"Low-moisture whole-milk mozzarella cheese","amount":450,"unit":"grams"},{"name":"Roughly grated Parmigiano-Reggiano","amount":85,"unit":"grams"}],"created":"2016-01-29T14:21:01Z","ratings":[3.0]}
41 | {"index":{"_id":21}}
42 | {"title":"Crispy Baked Pasta With Mushrooms, Sausage, and Parmesan Cream Sauce","description":"This recipe starts off with crumbled Italian sausage cooked down in a bit of butter. I sauté a few types of mushrooms in the rendered fat, then flavor them with shallots, garlic, and a little bit of soy sauce and lemon juice (this helps bring out their savoriness while also lightening them up). They get finished in a simple creamy sauce flavored with Parmesan cheese. Add some pasta, top it all of with crisp bread crumbs, bake it directly in the cast iron pan you cooked it in, and you've got yourself a one-skillet meal fit for normal everyday folks who perhaps might occasionally feel like kings.","preparation_time_minutes":30,"servings":{"min":4,"max":6},"steps":["Bring a large pot of salted water to a boil and keep at a bare simmer. Combine bread crumbs, 2 ounces cheese, half of parsley, half of chives, 1/4 of shallots, 1/4 of garlic, and olive oil in a medium bowl and massage with hands until combined. Season to taste with salt and pepper.","Melt butter in a large cast iron skillet over medium-high heat until foaming. Add sausage and cook, mashing it with a potato masher or a wooden spoon until broken up and well browned, about 7 minutes. Use a slotted spoon to transfer sausage to a small bowl, leaving fat behind.","Increase heat to high, add mushrooms to skillet, and cook, stirring frequently, until moisture has evaporated and mushrooms are well-browned, about 10 minutes. Add shallots and garlic and cook, stirring, until fragrant, about 30 seconds. Add soy sauce and lemon juice and stir to combine.","Add flour and cook, stirring, until a thin film begins to form on the bottom of the pan, about 1 minute. Slowly whisk in chicken broth followed by heavy cream. Bring to a simmer and cook until thickened, about 2 minutes. Stir in remaining grated cheese until melted. Stir in remaining parsley and chives. Stir in sausage. Season to taste with salt and lots of black pepper.","Adjust rack to 10 inches below broiler element and preheat broiler to high. Cook pasta in salted water according to package directions, removing it when still just shy of al dente. Drain, reserving 1 cup of cooking liquid. Return to pot. Add mushroom mixture and stir to combine, adding liquid to adjust consistency. Pasta should be very loose but not soupy. Return to cast iron skillet and top with bread crumbs. Broil until golden brown, rotating pan as necessary, 2 to 3 minutes. Serve immediately."],"ingredients":[{"name":"Panko-style bread crumbs","amount":1,"unit":"cups"},{"name":"Grated Parmesan cheese","amount":175,"unit":"grams"},{"name":"Chopped fresh parsley leaves","amount":0.25,"unit":"cups"},{"name":"Finely minced fresh chives","amount":2,"unit":"tbsp"},{"name":"Shallots","amount":2,"unit":"pcs"},{"name":"Cloves garlic","amount":2,"unit":"pcs"},{"name":"Extra-virgin olive oil","amount":2,"unit":"tbsp"},{"name":"Kosher salt"},{"name":"Freshly ground black pepper"},{"name":"Unsalted butter","amount":2,"unit":"tbsp"},{"name":"Italian sausage","amount":225,"unit":"grams"},{"name":"Mixed mushrooms","amount":450,"unit":"grams"},{"name":"Soy sauce","amount":1,"unit":"tbsp"},{"name":"Fresh lemon juice","amount":1,"unit":"tbsp"},{"name":"Flour","amount":2.5,"unit":"tbsp"},{"name":"Homemade chicken stock","amount":2,"unit":"cups"},{"name":"Heavy cream","amount":1,"unit":"cups"},{"name":"Fresh ridged pasta","amount":350,"unit":"grams"}],"created":"2002-10-21T15:07:53Z","ratings":[4.0,3.5]}
43 |
--------------------------------------------------------------------------------