Trying to improve OpenSearch performance on vector search

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):

AWS OpenSearch_2_11_R20240502-P1 (latest)

Describe the issue:

We are trying to migrate from Elastic Cloud to AWS OpenSearch, but for some reason we cannot reach the same level of vector search performance. Here’s a brief example of the search query we are running:

  "query": {
    "nested": {
      "path": "record_embeddings",
      "query": {
        "function_score": {
          "query": {
            "match_all": {}  // You can replace this with your nested query if needed
          "functions": [
              "script_score": {
                "script": {
                  "source": "if (doc['record_embeddings.main_description_embedding'].size() > 0) {return (1.0 + cosineSimilarity(params.prompt_embedding, doc['record_embeddings.main_description_embedding'])) / 2.0;}return 0.0;",
                  "params": {
                                    "prompt_embedding": [

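For reference, I believe the k-NN plugin’s pre-built exact scoring script would express the same search without interpreting Painless per document — this is a sketch on my side (field name taken from my mapping, query vector elided), not something I have benchmarked:

```json
{
  "query": {
    "nested": {
      "path": "record_embeddings",
      "query": {
        "script_score": {
          "query": { "match_all": {} },
          "script": {
            "source": "knn_score",
            "lang": "knn",
            "params": {
              "field": "record_embeddings.main_description_embedding",
              "query_value": [ ... ],
              "space_type": "cosinesimil"
            }
          }
        }
      }
    }
  }
}
```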
This query runs in ~80 ms on our ES setup but takes ~400 ms on OS, even though we migrated the same index using reindex and only renamed the dense_vector fields to knn_vector in the index mapping.

I think I’ve tried everything, including vertical scaling of the search nodes, horizontal scaling with more shards/nodes, etc. The best time I can get with OS is still ~400 ms. Any ideas what else I can try?


ES: 8 CPU, 64 GB RAM
OS: tried 4 CPU/32 GB/3 nodes, 8 CPU/64 GB/3 nodes, 16 CPU/128 GB/3 nodes; also tried 6 nodes with more shards, etc.

My index contains about 400k records, and I’m filtering on the main_description_embedding knn_vector field with the following mapping:

"main_description_embedding": {
                        "type": "knn_vector",
                        "dimension": 3072,
                        "index": true,
                        "similarity": "cosine"

Relevant Logs or Screenshots: