Deleted snapshot repo but OpenSearch remembers it when I create a new one?

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):

2.14

Describe the issue:

After registering a snapshot repository with PUT /_snapshot/<repo_name> {"..."} backed by a local S3-compatible store (not AWS - I'm using SeaweedFS, but I don't think that's the problem), I can create a new snapshot fine and do a restore - all good. However, if I then wipe the S3 volume and recreate the bucket, I obviously wouldn't expect a restore to still work, but even listing the snapshots in that repo produces a 500 error response.
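
Concretely, the sequence after the repository is registered looks roughly like this (the snapshot name is just a placeholder, and auth/TLS curl flags are omitted):

# take a snapshot - works
curl -XPUT "https://localhost:9200/_snapshot/backup-repo/snapshot-1?wait_for_completion=true"

# restore it - works
curl -XPOST "https://localhost:9200/_snapshot/backup-repo/snapshot-1/_restore"

# wipe the S3 volume, recreate the (now empty) bucket, then even listing fails
curl "https://localhost:9200/_snapshot/backup-repo/_all?pretty"    # -> 500, see logs below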

Then - and this is the real problem - if I delete the snapshot repo and recreate it, even under a different name, I still get errors because it is looking for the bucket and path from before. Not only are the objects gone from S3, the repo has also been deleted from OpenSearch - so how and why is it remembering them?
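
To be explicit, this is the kind of sequence that still fails (the new repo name is illustrative, and repo-settings.json is just the Configuration body below saved to a file):

# delete the old repository definition (the S3 objects are already gone anyway)
curl -XDELETE "https://localhost:9200/_snapshot/backup-repo"

# register a brand new repository under a different name
curl -XPUT "https://localhost:9200/_snapshot/backup-repo-2" -H 'Content-Type: application/json' -d @repo-settings.json

# listing snapshots in the supposedly fresh repository still errors
curl "https://localhost:9200/_snapshot/backup-repo-2/_all?pretty"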

I think this is a problem for disaster recovery - how are we supposed to restore over the top of an OpenSearch cluster that keeps throwing errors because it is still querying for resources that are already gone? I imagine a lot of users would want to restore from a local or networked MinIO (or something similar) if they lose both the live data and the online backup.

Configuration:

{
  "type" : "s3",
  "settings" : {
    "bucket" : "backups-bucket",
    "base_path" : "my-backups",
    "endpoint" : "https://localhost:8333"
  }
}
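
That body is applied with a plain PUT, e.g. (saved as repo-settings.json for the examples above; auth/TLS flags omitted):

curl -XPUT "https://localhost:9200/_snapshot/backup-repo" -H 'Content-Type: application/json' -d @repo-settings.json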

Relevant Logs or Screenshots:

"https://localhost:9200/_snapshot/backup-repo/_all?pretty"
{
  "error" : {
    "root_cause" : [
      {
        "type" : "repository_exception",
        "reason" : "[backup-repo] Unexpected exception when loading repository data"
      }
    ],
    "type" : "repository_exception",
    "reason" : "[backup-repo] Unexpected exception when loading repository data",
    "caused_by" : {
      "type" : "s3_exception",
      "reason" : "We encountered an internal error, please try again. (Service: S3, Status Code: 500, Request ID: 1718043213887023594)",
      "suppressed" : [
        {
          "type" : "sdk_client_exception",
          "reason" : "Request attempt 1 failure: We encountered an internal error, please try again. (Service: S3, Status Code: 500, Request ID: 1718043173920318472)"
        },
        {
          "type" : "sdk_client_exception",
          "reason" : "Request attempt 2 failure: We encountered an internal error, please try again. (Service: S3, Status Code: 500, Request ID: 1718043187215236136)"
        },
        {
          "type" : "sdk_client_exception",
          "reason" : "Request attempt 3 failure: We encountered an internal error, please try again. (Service: S3, Status Code: 500, Request ID: 1718043200572097369)"
        }
      ]
    }
  },
  "status" : 500
}