Chapter/chapter330:64_Geohash_grid_agg.asciidoc (elasticsearch-cn#281)
* geohash_grid_agg translate completed
javasgl authored and medcl committed Dec 1, 2016
[[geohash-grid-agg]]
=== Geohash Grid Aggregation

The number of results returned by a query may be far too many to display each
geo-point individually on a map.((("geohash_grid aggregation")))((("aggregations", "geohash_grid"))) The `geohash_grid` aggregation buckets nearby
geo-points together by calculating the geohash for each point, at the level of
precision that you define.

The result is a grid of cells--one cell per geohash--that can be
displayed on a map. By changing the precision of the geohash, you can
summarize information across the whole world, by country, or by city block.

The aggregation is _sparse_: it((("sparse aggregations"))) returns only cells that contain documents.
If your geohashes are too precise and too many buckets are generated, it will
return, by default, the 10,000 most populous cells--those containing the
most documents.((("buckets", "generated by geohash_grid aggregation, controlling"))) However, it still needs to generate _all_ the buckets in
order to figure out which are the most populous 10,000. You need to control
the number of buckets generated by doing the following:

1. Limit the result with a `geo_bounding_box` query.
2. Choose an appropriate `precision` for the size of your bounding box.

[source,json]
----------------------------
GET /attractions/restaurant/_search
...
}
}
----------------------------
<1> The bounding box limits the scope of the search to the greater New York area.
<2> Geohashes of precision `5` are approximately 5km x 5km.
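
The request body above is elided, so here is an illustrative sketch of what such a query looks like. The field name `location`, the aggregation name `new_york`, and the exact coordinates of the greater New York box are assumptions, not taken from the elided source:

[source,json]
----------------------------
GET /attractions/restaurant/_search
{
  "size": 0,
  "query": {
    "constant_score": {
      "filter": {
        "geo_bounding_box": {
          "location": {
            "top_left": {
              "lat": 40.8,
              "lon": -74.1
            },
            "bottom_right": {
              "lat": 40.4,
              "lon": -73.7
            }
          }
        }
      }
    }
  },
  "aggs": {
    "new_york": {
      "geohash_grid": {
        "field": "location",
        "precision": 5
      }
    }
  }
}
----------------------------

Setting `size` to `0` skips the search hits themselves, since only the bucket counts are of interest here.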

Geohashes with precision `5` measure about 25km^2^ each, so 10,000 cells at
this precision would cover 250,000km^2^. The bounding box that we specified
measures approximately 44km x 33km, or about 1,452km^2^, so we are well within
safe limits; we definitely won't create too many buckets in memory.

The response from the preceding request looks like this:

[source,json]
----------------------------
...
}
...
----------------------------
<1> Each bucket contains the geohash as the `key`.
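
Going by the callout above, the elided response body has roughly the following shape; the aggregation name, geohash keys, and document counts here are made up for illustration:

[source,json]
----------------------------
{
  ...
  "aggregations": {
    "new_york": {
      "buckets": [
        { "key": "dr5rs", "doc_count": 2 },
        { "key": "dr5re", "doc_count": 1 }
      ]
    }
  }
}
----------------------------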

Again, we didn't specify any sub-aggregations, so all we got back was the
document count. We could have asked for popular restaurant types, average
price, or other details.
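
For example, a hypothetical `avg` sub-aggregation on an assumed numeric `price` field would report the average price per geohash cell (both field and aggregation names are illustrative):

[source,json]
----------------------------
"aggs": {
  "new_york": {
    "geohash_grid": {
      "field":     "location",
      "precision": 5
    },
    "aggs": {
      "avg_price": {
        "avg": { "field": "price" }
      }
    }
  }
}
----------------------------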

[TIP]
====
To plot these buckets on a map, you need a library that
understands how to convert a geohash into the equivalent bounding box or
central point. Libraries exist in JavaScript and other languages
that will perform this conversion for you, but you can also use information from
<<geo-bounds-agg>> to perform a similar job.
====
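
One way to recover real cell extents on the Elasticsearch side, sketched here with assumed field and aggregation names, is to nest a `geo_bounds` sub-aggregation inside each geohash cell, so that each bucket also reports the bounding box of the points it actually contains:

[source,json]
----------------------------
"aggs": {
  "new_york": {
    "geohash_grid": {
      "field":     "location",
      "precision": 5
    },
    "aggs": {
      "cell_bounds": {
        "geo_bounds": { "field": "location" }
      }
    }
  }
}
----------------------------

This often draws tighter boxes than the raw geohash cells, because the reported bounds shrink to the data rather than the full cell.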
