OpenSearch remote storage setup

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
OpenSearch 2.18

Describe the issue:
I’m currently debugging our OpenSearch remote storage setup. We run our own customized S3-compatible storage implementation.

Our S3 client configuration (an s3cmd-style config, included under Configuration below) is roughly as follows.

We would like to use this storage as the remote store backend. However, after reviewing the official documentation, we couldn’t find specific guidance on how to configure OpenSearch to use a custom S3-compatible endpoint for remote store.

Do you have any more detailed instructions or best practices for setting up remote store with a custom S3-compatible storage like ours?
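
For context, this is the node-level configuration we believe is needed to enable remote-backed storage, based on our reading of the documentation so far. It is only a sketch: the repository name remote-store-repo and the bucket, base_path, and region values are placeholders, and we are not sure they are correct for a custom endpoint.

# opensearch.yml (sketch, placeholder values only)
# Point segment and translog data at the same repository
node.attr.remote_store.segment.repository: remote-store-repo
node.attr.remote_store.translog.repository: remote-store-repo

# Repository definition (type s3, served by the repository-s3 plugin)
node.attr.remote_store.repository.remote-store-repo.type: s3
node.attr.remote_store.repository.remote-store-repo.settings.bucket: opensearch-remote
node.attr.remote_store.repository.remote-store-repo.settings.base_path: remote-store
node.attr.remote_store.repository.remote-store-repo.settings.region: us-east-1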

Here is the reference documentation we checked:

Configuration:

[default]
access_key = 123456
secret_key = 780
access_token =
add_encoding_exts =
add_headers =
bucket_location = US
ca_certs_file =
cache_file =
check_ssl_certificate = False
check_ssl_hostname = True
cloudfront_host = https://qiafan.nuobject.io
content_disposition =
content_type =
default_mime_type = application/octet-stream
delay_updates = False
delete_after = False
delete_after_fetch = False
delete_removed = False
dry_run = False
enable_multipart = True
encrypt = False
expiry_date =
expiry_days =
expiry_prefix =
follow_symlinks = False
force = False
get_continue = False
gpg_command = None
gpg_decrypt = %(gpg_command)s -d --verbose --no-use-agent --batch --yes --passphrase-fd %(passphrase_fd)s -o %(output_file)s %(input_file)s
gpg_encrypt = %(gpg_command)s -c --verbose --no-use-agent --batch --yes --passphrase-fd %(passphrase_fd)s -o %(output_file)s %(input_file)s
gpg_passphrase =
guess_mime_type = True
host_base = https://qifan.nuobject.io
host_bucket = https://qifan.nuobject.io/%(bucket)
human_readable_sizes = False
invalidate_default_index_on_cf = False
invalidate_default_index_root_on_cf = True
invalidate_on_cf = False
kms_key =
limit = -1
limitrate = 0
list_md5 = False
log_target_prefix =
long_listing = False
max_delete = -1
mime_type =
multipart_chunk_size_mb = 15
multipart_max_chunks = 10000
preserve_attrs = True
progress_meter = True
proxy_host =
proxy_port = 0
put_continue = False
recursive = False
recv_chunk = 65536
reduced_redundancy = False
requester_pays = False
restore_days = 1
restore_priority = Standard
send_chunk = 65536
server_side_encryption = False
signature_v2 = False
signurl_use_https = False
simpledb_host = sdb.amazonaws.com
skip_existing = False
socket_timeout = 300
stats = False
stop_on_error = False
storage_class =
throttle_max = 100
upload_id =
urlencoding_mode = normal
use_http_expect = False
use_https = True
use_mime_magic = True
verbosity = WARNING
website_endpoint = http://%(bucket)s.s3-website-%(location)s.amazonaws.com/
website_error =
website_index = index.html
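
If it helps, this is how we are assuming the s3cmd values above would map onto the repository-s3 plugin’s client settings and keystore entries. Again, only a sketch: we are using the default client name, and we are not sure whether path-style access is required for our endpoint.

# opensearch.yml (assumed repository-s3 client settings for our endpoint)
s3.client.default.endpoint: qifan.nuobject.io
s3.client.default.protocol: https
s3.client.default.path_style_access: true

# Credentials go into the OpenSearch keystore rather than opensearch.yml
./bin/opensearch-keystore add s3.client.default.access_key
./bin/opensearch-keystore add s3.client.default.secret_key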

Relevant Logs or Screenshots:

Could someone please take a look at this request? Any help would be appreciated.