Migration from OpenSearch 1.1 to Elasticsearch 7.18 using Logstash

Hi Experts,

I am trying to migrate my OpenSearch 1.1 cluster to Elastic Cloud 7.18. I have created a Logstash pipeline for this, whose configuration looks like this:

input {
  opensearch {
    hosts => ["url"]
    user => ""
    password => ""
    index => "*"
    size => 100
    scroll => "1m"
    query => '{ "query": { "match_all": {} } }'
    docinfo => true
  }
}
filter {
}
output {
  elasticsearch {
    hosts => [""]
    user => ""
    password => ""
    index => "%{[@metadata][_index]}"
  }
  stdout { codec => rubydebug { metadata => true } }
}
Some of the indices contain around 50 million documents, and I am not sure how error handling should be done for such a large load if something goes wrong during migration, so I have the following questions:

  1. Does Logstash provide any error handling out of the box that will make sure the entire migration (say, around 60 GB of data) completes seamlessly?
  2. If the migration fails due to a network error, how does Logstash handle it? Will the migration resume from the record where it failed, and how does Logstash avoid duplication in this scenario?
  3. Can we do batch migration in Logstash, and if so, how?
  4. If any record in a batch encounters an issue during batch migration, does Logstash support rollback and retry to make sure everything gets migrated successfully?

I am new to Logstash, so can someone suggest an industry-accepted way to configure a pipeline to migrate clusters (~60 GB) seamlessly from one version to another?
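(For reference on the duplication concern in question 2: a common approach is to reuse each source document's original `_id` in the output, so a retried or re-run batch overwrites existing documents instead of duplicating them. A minimal sketch of the output block, with placeholder host and credentials, relying on the metadata exposed by `docinfo => true`:

```
output {
  elasticsearch {
    hosts    => ["https://your-elastic-cloud-url:9243"]  # placeholder
    user     => "elastic"                                # placeholder
    password => "changeme"                               # placeholder
    index    => "%{[@metadata][_index]}"
    # Reuse the source document's _id so re-running the pipeline
    # updates documents in place rather than creating duplicates.
    document_id => "%{[@metadata][_id]}"
  }
}
```

This makes the pipeline idempotent on re-runs, at the cost of an update rather than a pure append on each write.)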

Hey @gjain0586

Probably the best option is to create a snapshot and restore it into Elasticsearch. Take note: OpenSearch 1.1 is based on Elasticsearch 7.10, not 7.18, so you may run into compatibility issues.
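The usual snapshot flow is to register a shared repository (e.g. S3) on both clusters, snapshot on the source, and restore on the target. A rough sketch, assuming a bucket named `my-migration-bucket` and placeholder hostnames (Elastic Cloud manages repositories through its console, so the target-side steps differ there):

```
# On the OpenSearch cluster: register the repository and take a snapshot
curl -X PUT "https://opensearch-host:9200/_snapshot/migration_repo" \
  -H 'Content-Type: application/json' \
  -d '{ "type": "s3", "settings": { "bucket": "my-migration-bucket" } }'

curl -X PUT "https://opensearch-host:9200/_snapshot/migration_repo/snapshot_1?wait_for_completion=true"

# On the Elasticsearch cluster: register the same repository read-only, then restore
curl -X PUT "https://elasticsearch-host:9200/_snapshot/migration_repo" \
  -H 'Content-Type: application/json' \
  -d '{ "type": "s3", "settings": { "bucket": "my-migration-bucket", "readonly": true } }'

curl -X POST "https://elasticsearch-host:9200/_snapshot/migration_repo/snapshot_1/_restore"
```

Whether Elasticsearch will accept a snapshot taken by OpenSearch 1.1 is exactly the compatibility question raised above, so test with one small index first.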

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.