Logstash Helm chart for copying data from ELK to OpenSearch

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
2.14.0

Describe the issue:
We have been using an ELK cluster for observability log monitoring in our current project, and we are now moving to OpenSearch. So we need to copy the indices data from the current ECK cluster to OpenSearch indices. Current Elasticsearch version: v8.8.2. OpenSearch version: v2.14.0.

Currently, in the ECK stack, we use a Helm chart to push data to Elasticsearch; the Logstash version used is 8.8.2.

For migrating the data, we plan to build a Logstash pipeline that reads from Elasticsearch via the elasticsearch input plugin and writes to OpenSearch via the opensearch output plugin. Can you help us identify the correct Logstash version that is compatible with both the Elasticsearch and OpenSearch versions above?
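For reference, a minimal sketch of such a pipeline is below. The hostnames, credentials, and index pattern are placeholders for your environment; it assumes the logstash-output-opensearch plugin is installed:

```
input {
  elasticsearch {
    hosts    => ["https://elasticsearch.example.com:9200"]   # placeholder host
    index    => "logstash-b2c*"                              # placeholder index pattern
    query    => '{ "query": { "match_all": {} } }'
    user     => "elastic"
    password => "${ES_PASSWORD}"
  }
}

output {
  opensearch {
    hosts    => ["https://opensearch.example.com:9200"]      # placeholder host
    index    => "migrated-%{+YYYY.MM.dd}"                    # placeholder target index
    user     => "admin"
    password => "${OS_PASSWORD}"
    ssl      => true
  }
}
```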

Also, please point me to proper documentation for building the Helm chart.
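As a sketch, assuming you deploy Logstash with the elastic/logstash Helm chart: the pipeline file can be supplied through the chart's `logstashPipeline` value and the custom image (with the OpenSearch plugin baked in) through `image`/`imageTag`. All names below are illustrative:

```yaml
# values.yaml (illustrative; assumes the elastic/logstash Helm chart)
image: "my-registry/logstash-opensearch"   # custom image with logstash-output-opensearch installed
imageTag: "8.8.2"

logstashPipeline:
  migrate.conf: |
    input { elasticsearch { hosts => ["https://elasticsearch.example.com:9200"] } }
    output { opensearch { hosts => ["https://opensearch.example.com:9200"] } }

extraEnvs:
  - name: ES_PASSWORD
    valueFrom:
      secretKeyRef:
        name: es-credentials   # hypothetical secret name
        key: password
```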

Configuration:

Relevant Logs or Screenshots:

@Deepa Please follow the OpenSearch documentation to deploy and build Logstash 8.8.2 with the OpenSearch output plugin.
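The build itself can be as small as a two-line Dockerfile layered on the official Logstash 8.8.2 image (image tag from your stated versions; registry path is the standard Elastic one):

```dockerfile
# Logstash 8.8.2 with the OpenSearch output plugin added
FROM docker.elastic.co/logstash/logstash:8.8.2
RUN bin/logstash-plugin install logstash-output-opensearch
```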

I could successfully build Logstash to copy data from Elasticsearch to OpenSearch, but I can see that some records are duplicated, so the total hits within a time frame show double the actual number. Can you comment on this?
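A common cause of this: if the opensearch output has no `document_id`, OpenSearch auto-generates an ID for every event, so any retry or re-run of the pipeline indexes the same source document twice. Reusing the original Elasticsearch `_id` makes the writes idempotent. A sketch of the relevant settings (other options elided):

```
input {
  elasticsearch {
    # expose source _index/_id in event metadata;
    # docinfo_target is set explicitly because its default location
    # changed with ECS compatibility in Logstash 8.x
    docinfo        => true
    docinfo_target => "[@metadata][doc]"
  }
}

output {
  opensearch {
    # reuse the source document ID so re-runs overwrite instead of duplicating
    document_id => "%{[@metadata][doc][_id]}"
  }
}
```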



Also, I have defined the ISM policy as below:

{
    "id": "digital-npd-generic-ism-policy",
    "seqNo": 72546,
    "primaryTerm": 1,
    "policy": {
        "policy_id": "digital-npd-generic-ism-policy",
        "description": "A sample description of the policy",
        "last_updated_time": 1727687657753,
        "schema_version": 21,
        "error_notification": null,
        "default_state": "hot",
        "states": [
            {
                "name": "hot",
                "actions": [
                    {
                        "retry": {
                            "count": 3,
                            "backoff": "exponential",
                            "delay": "1m"
                        },
                        "allocation": {
                            "require": {
                                "temp": "hot"
                            },
                            "include": {},
                            "exclude": {},
                            "wait_for": false
                        }
                    },
                    {
                        "retry": {
                            "count": 3,
                            "backoff": "exponential",
                            "delay": "1m"
                        },
                        "rollover": {
                            "min_primary_shard_size": "20gb",
                            "copy_alias": false
                        }
                    }
                ],
                "transitions": [
                    {
                        "state_name": "warm",
                        "conditions": {
                            "min_index_age": "7d"
                        }
                    }
                ]
            },
            {
                "name": "warm",
                "actions": [
                    {
                        "retry": {
                            "count": 3,
                            "backoff": "exponential",
                            "delay": "1m"
                        },
                        "allocation": {
                            "require": {
                                "temp": "warm"
                            },
                            "include": {},
                            "exclude": {},
                            "wait_for": false
                        }
                    }
                ],
                "transitions": [
                    {
                        "state_name": "cold",
                        "conditions": {
                            "min_index_age": "14d"
                        }
                    }
                ]
            },
            {
                "name": "cold",
                "actions": [
                    {
                        "retry": {
                            "count": 3,
                            "backoff": "exponential",
                            "delay": "1m"
                        },
                        "allocation": {
                            "require": {
                                "temp": "cold"
                            },
                            "include": {},
                            "exclude": {},
                            "wait_for": false
                        }
                    }
                ],
                "transitions": [
                    {
                        "state_name": "delete",
                        "conditions": {
                            "min_index_age": "60d"
                        }
                    }
                ]
            },
            {
                "name": "delete",
                "actions": [
                    {
                        "retry": {
                            "count": 3,
                            "backoff": "exponential",
                            "delay": "1m"
                        },
                        "delete": {}
                    }
                ],
                "transitions": []
            }
        ],
        "ism_template": [
            {
                "index_patterns": [
                    "*",
                    "logstash-b2c*"
                ],
                "priority": 100,
                "last_updated_time": 1720447849482
            }
        ]
    }
}
===================

But even though my index is above 20 GB, it is not rolled over to a new index.
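One thing worth checking: the ISM rollover action requires the managed index to carry the `plugins.index_state_management.rollover_alias` setting and to be the write index of that alias; without these, rollover fails (or silently never fires) even once the size condition is met. A sketch, with illustrative template, alias, and index names:

```
PUT _index_template/digital-npd-template
{
  "index_patterns": ["logstash-b2c*"],
  "template": {
    "settings": {
      "plugins.index_state_management.rollover_alias": "logstash-b2c"
    }
  }
}

PUT logstash-b2c-000001
{
  "aliases": {
    "logstash-b2c": { "is_write_index": true }
  }
}
```

You can also check `GET _plugins/_ism/explain/<index>` to see the exact error ISM reports for the rollover step.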