AWS OpenSearch to S3 bucket export: Logstash script issue

Hi everyone,
We need to export our data from AWS OpenSearch to an S3 bucket using a Logstash pipeline. When we run the script it does not show any errors, but we also cannot find any index data in the S3 bucket.

Script for reference:

input {
  elasticsearch {
    hosts => "#########host url#########"
    ssl => true
    user => "#####"
    password => "######"
    query => '
    {
      "query": {
        "match_all": {}
      }
    }
    '
    docinfo => true
  }
}

output {
  s3 {
    region => "#########"
    bucket => "bucketname"
    size_file => 99000000
    codec => "json"
    encoding => "gzip"
    prefix => "update"
  }
}
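One quick way to narrow this down is to confirm whether the input is producing events at all. A temporary stdout output alongside the s3 output will print every event the pipeline processes; if nothing prints, the problem is on the elasticsearch input side rather than in the S3 upload. A minimal sketch (keep your existing s3 block next to it):

```
output {
  # Temporary debug output: prints each event to the console
  stdout {
    codec => rubydebug
  }
  # ... existing s3 { } block stays here ...
}
```

If events do print to the console but never appear in the bucket, the next things to check are the S3 credentials and the plugin's rotation settings.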

Just to add: we are not using any role_arn or session settings, because the server where we run the script and our S3 bucket are in the same region.
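A note on this point: being in the same region does not remove the need for credentials. When none are set in the config, the s3 output plugin falls back to the default AWS credential chain (environment variables, `~/.aws/credentials`, or an attached instance profile); if none of those are available on the Logstash host, authentication can fail without an obvious error at INFO level. As a sketch, credentials can also be set explicitly in the output block (the key values here are placeholders, not real settings from the original post):

```
output {
  s3 {
    region => "#########"
    bucket => "bucketname"
    # Explicit credentials, if no instance profile is attached to the host
    access_key_id => "YOUR_ACCESS_KEY_ID"
    secret_access_key => "YOUR_SECRET_ACCESS_KEY"
    size_file => 99000000
    codec => "json"
    encoding => "gzip"
    prefix => "update"
  }
}
```

Running Logstash with `--log.level debug` should then show whether the plugin can validate the bucket on startup.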

Sample output logs:
tail -f logstash-plain.log
[2022-06-24T10:22:31,622][DEBUG][logstash.outputs.s3 ] config LogStash::Outputs::S3/@rotation_strategy = "size_and_time"
[2022-06-24T10:22:31,622][DEBUG][logstash.outputs.s3 ] config LogStash::Outputs::S3/@validate_credentials_on_root_bucket = true
[2022-06-24T10:22:57,317][DEBUG][logstash.outputs.s3 ] Start periodic rotation check
[2022-06-24T10:22:57,368][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x1c90ba81 run>"}
[2022-06-24T10:22:58,260][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2022-06-24T10:22:58,466][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-06-24T10:22:59,509][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2022-06-24T10:23:01,324][DEBUG][logstash.outputs.s3 ] Closing {:plugin=>“LogStash::Outputs::S3”}
[2022-06-24T10:23:01,382][DEBUG][logstash.outputs.s3 ] Uploading current workspace
[2022-06-24T10:23:01,793][INFO ][logstash.runner ] Logstash shut down.

Can anyone help us with this?