Hi @Klavs,
You can use a geohash grid aggregation to cluster documents into buckets and return only the count of documents in each cluster. This significantly reduces latency because you are not fetching every document from OpenSearch.
For higher precision (i.e. higher zoom levels), make sure you provide a geo_bounding_box filter, as shown below; otherwise the aggregation creates many small cells and can produce millions of buckets.
POST /index_name/_search?size=0
{
  "aggregations": {
    "higher_zoom": {
      "filter": {
        "geo_bounding_box": {
          "location": {
            "top_left": {
              "lat": 83.76,
              "lon": -81.2
            },
            "bottom_right": {
              "lat": 42.2,
              "lon": 3.0
            }
          }
        }
      },
      "aggregations": {
        "1": {
          "geohash_grid": {
            "field": "location",
            "precision": 8
          }
        }
      }
    }
  }
}
The response will contain a list of buckets, where each bucket's key is a geohash value and each bucket carries the count of documents it contains.
If you want to render each bucket as a geo_point, add a geo_centroid sub-aggregation inside the bucket, as below. It calculates the centroid of all geo_points within the bucket, which you can then use to plot the cluster on the map.
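For illustration, a response to the request above might look like the following (the keys and counts are made-up example values):

```json
{
  "aggregations": {
    "higher_zoom": {
      "doc_count": 2241,
      "1": {
        "buckets": [
          { "key": "u09tvw0z", "doc_count": 1243 },
          { "key": "u09tvw0y", "doc_count": 998 }
        ]
      }
    }
  }
}
```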
{
  "size": 0,
  "aggs": {
    "filter_agg": {
      "filter": {
        "geo_bounding_box": {
          "ignore_unmapped": true,
          "location": {
            "top_left": {
              "lat": 90,
              "lon": -180
            },
            "bottom_right": {
              "lat": -90,
              "lon": 180
            }
          }
        }
      },
      "aggs": {
        "1": {
          "geohash_grid": {
            "field": "location",
            "precision": 3
          },
          "aggs": {
            "2": {
              "geo_centroid": {
                "field": "location"
              }
            }
          }
        }
      }
    }
  }
}
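If you consume this response in application code, extracting one point per cluster is a small traversal of the parsed JSON. A minimal sketch in Python, assuming your client has already parsed the response into a dict (the geohash key, coordinates, and count below are made-up example values):

```python
# Example aggregation response, shaped like the request above:
# filter_agg -> geohash_grid ("1") -> geo_centroid ("2").
response = {
    "aggregations": {
        "filter_agg": {
            "doc_count": 1243,
            "1": {
                "buckets": [
                    {
                        "key": "u09",
                        "doc_count": 1243,
                        "2": {"location": {"lat": 48.86, "lon": 2.35}},
                    }
                ]
            }
        }
    }
}

# Collect one (lat, lon, count) tuple per cluster for plotting on the map.
points = []
for bucket in response["aggregations"]["filter_agg"]["1"]["buckets"]:
    centroid = bucket["2"]["location"]
    points.append((centroid["lat"], centroid["lon"], bucket["doc_count"]))

print(points)
```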
The above is available as part of the Coordinate Map in OpenSearch Dashboards, so I would recommend experimenting with your use case there first. If you are interested in the query the Dashboard issues, you can use Inspect to see the request/response as well.
If your use case also includes fetching the documents for a given bucket and displaying them on the map, we can provide more details on that too.
Please let us know if you have any more questions.