I could successfully build a Logstash pipeline to copy data from Elasticsearch to OpenSearch, but I can see that some records are duplicated, so the total hits within the time frame show double the expected number. Can anyone comment on this?
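For reference, here is a minimal sketch of the kind of pipeline I am using (the hosts and index pattern are placeholders, not my exact settings). From what I have read, without a stable document_id each run or retry re-indexes documents under fresh auto-generated IDs, which would explain the doubled hit counts; carrying the source _id through docinfo should make the copy idempotent:

input {
  elasticsearch {
    hosts => ["https://source-es:9200"]        # placeholder source cluster
    index => "logstash-b2c*"                   # placeholder index pattern
    docinfo => true                            # expose _index/_id of each source doc
    docinfo_target => "[@metadata][doc]"
  }
}

output {
  opensearch {
    hosts => ["https://target-os:9200"]        # placeholder target cluster
    index => "%{[@metadata][doc][_index]}"     # keep the original index name
    document_id => "%{[@metadata][doc][_id]}"  # reuse the source _id so re-runs overwrite instead of duplicating
  }
}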
I have also defined the ISM policy as below:
{
  "id": "digital-npd-generic-ism-policy",
  "seqNo": 72546,
  "primaryTerm": 1,
  "policy": {
    "policy_id": "digital-npd-generic-ism-policy",
    "description": "A sample description of the policy",
    "last_updated_time": 1727687657753,
    "schema_version": 21,
    "error_notification": null,
    "default_state": "hot",
    "states": [
      {
        "name": "hot",
        "actions": [
          {
            "retry": {
              "count": 3,
              "backoff": "exponential",
              "delay": "1m"
            },
            "allocation": {
              "require": {
                "temp": "hot"
              },
              "include": {},
              "exclude": {},
              "wait_for": false
            }
          },
          {
            "retry": {
              "count": 3,
              "backoff": "exponential",
              "delay": "1m"
            },
            "rollover": {
              "min_primary_shard_size": "20gb",
              "copy_alias": false
            }
          }
        ],
        "transitions": [
          {
            "state_name": "warm",
            "conditions": {
              "min_index_age": "7d"
            }
          }
        ]
      },
      {
        "name": "warm",
        "actions": [
          {
            "retry": {
              "count": 3,
              "backoff": "exponential",
              "delay": "1m"
            },
            "allocation": {
              "require": {
                "temp": "warm"
              },
              "include": {},
              "exclude": {},
              "wait_for": false
            }
          }
        ],
        "transitions": [
          {
            "state_name": "cold",
            "conditions": {
              "min_index_age": "14d"
            }
          }
        ]
      },
      {
        "name": "cold",
        "actions": [
          {
            "retry": {
              "count": 3,
              "backoff": "exponential",
              "delay": "1m"
            },
            "allocation": {
              "require": {
                "temp": "cold"
              },
              "include": {},
              "exclude": {},
              "wait_for": false
            }
          }
        ],
        "transitions": [
          {
            "state_name": "delete",
            "conditions": {
              "min_index_age": "60d"
            }
          }
        ]
      },
      {
        "name": "delete",
        "actions": [
          {
            "retry": {
              "count": 3,
              "backoff": "exponential",
              "delay": "1m"
            },
            "delete": {}
          }
        ],
        "transitions": []
      }
    ],
    "ism_template": [
      {
        "index_patterns": [
          "*",
          "logstash-b2c*"
        ],
        "priority": 100,
        "last_updated_time": 1720447849482
      }
    ]
  }
}
===================
But even though my index has grown above 20 GB, it is not rolled over to a new index.
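As I understand the docs, the rollover action only works if the index knows which alias to roll over: the index needs the plugins.index_state_management.rollover_alias setting, the alias must exist with the index as its write index, and writes have to go through the alias rather than a concrete (e.g. dated) index name. Here is a sketch of the setup I believe is expected (the alias and template names below are just examples), plus the explain call to see why the action is stuck:

# Show ISM's view of the index, including any rollover error message
GET _plugins/_ism/explain/logstash-b2c-000001

# Tell new indices which alias the rollover action should use
PUT _index_template/logstash-b2c-template
{
  "index_patterns": ["logstash-b2c-*"],
  "template": {
    "settings": {
      "plugins.index_state_management.rollover_alias": "logstash-b2c"
    }
  }
}

# Create the first index with the alias as its write index;
# Logstash should then write to "logstash-b2c", not to the concrete index
PUT logstash-b2c-000001
{
  "aliases": {
    "logstash-b2c": { "is_write_index": true }
  }
}

Is that the missing piece here, or is something else preventing the rollover?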