How can OpenSearch integrate with other models?

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
2.13.0

Describe the issue:
How can OpenSearch integrate with other models?

Configuration:

I am a student and I want to connect OpenSearch to Alibaba Cloud’s Qwen model, for example:

https://help.aliyun.com/zh/dashscope/developer-reference/tongyi-qianwen-7b-14b-72b-api-detailes

Steps to reproduce (following the reference documents):

1. Add the connector endpoint to the trusted URLs:
PUT /_cluster/settings
{
    "persistent": {
        "plugins.ml_commons.trusted_connector_endpoints_regex": [
          "^https://runtime\\.sagemaker\\..*[a-z0-9-]\\.amazonaws\\.com/.*$",
          "^https://api\\.openai\\.com/.*$",
          "^https://api\\.cohere\\.ai/.*$",
          "^https://bedrock-runtime\\..*[a-z0-9-]\\.amazonaws\\.com/.*$",
          "^https?://.*$"
        ]
    }
}

result

{
  "acknowledged": true,
  "persistent": {
    "plugins": {
      "ml_commons": {
        "trusted_connector_endpoints_regex": [
          """^https://runtime\.sagemaker\..*[a-z0-9-]\.amazonaws\.com/.*$""",
          """^https://api\.openai\.com/.*$""",
          """^https://api\.cohere\.ai/.*$""",
          """^https://bedrock-runtime\..*[a-z0-9-]\.amazonaws\.com/.*$""",
          "^https?://.*$"
        ]
      }
    }
  },
  "transient": {}
}
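
As a quick sanity check (plain Python, outside OpenSearch), the DashScope endpoint should be covered by the trusted list above; the catch-all `^https?://.*$` pattern matches it:

```python
import re

# Trusted endpoint patterns from the cluster settings above.
patterns = [
    r"^https://runtime\.sagemaker\..*[a-z0-9-]\.amazonaws\.com/.*$",
    r"^https://api\.openai\.com/.*$",
    r"^https://api\.cohere\.ai/.*$",
    r"^https://bedrock-runtime\..*[a-z0-9-]\.amazonaws\.com/.*$",
    r"^https?://.*$",
]
url = "https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation"

# The DashScope URL is matched by the catch-all pattern.
print(any(re.match(p, url) for p in patterns))  # True
```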

2. Create a connector:

POST /_plugins/_ml/connectors/_create
{
    "name": "qwen Chat Connector",
    "description": "The connector to public qwen model service for qwen-72b-chat",
    "version": 1,
    "protocol": "http",
    "parameters": {
        "model": "qwen-72b-chat"
    },
    "credential": {
        "qwen_key": "sk-*********************"
    },
    "actions": [
        {
            "action_type": "predict",
            "method": "POST",
            "url": "https://dashscope.aliyuncs.com/api/v1/services/aigc/text-generation/generation",
            "headers": {
                "Authorization": "Bearer ${credential.qwen_key}"
            },
            "request_body": """{ 
              "model": "${parameters.model}", 
              "input": { 
                "messages": ${parameters.messages} 
              } 
            }"""
        }
    ]
}
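
For reference, a sketch (plain Python, assuming simple string substitution of the `${parameters.*}` placeholders) of the body the connector should send to DashScope:

```python
import json

# Values from the connector definition above.
model = "qwen-72b-chat"
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]

# ${parameters.messages} must expand to a JSON array, so it is
# serialized with json.dumps before substitution into the template.
body = '{ "model": "%s", "input": { "messages": %s } }' % (
    model, json.dumps(messages))

payload = json.loads(body)  # the substituted body is valid JSON
print(payload["input"]["messages"][1]["content"])  # Hello!
```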

result

{
  "connector_id": "tDIq8I4B9O1XISSYmOLW"
}
3. Register the model:
POST /_plugins/_ml/models/_register
{
    "name": "qwen-72b-chat",
    "function_name": "remote",
    "description": "test model",
    "connector_id": "tDIq8I4B9O1XISSYmOLW"
}

result

{
  "task_id": "vzIs8I4B9O1XISSYoeJK",
  "status": "CREATED",
  "model_id": "wDIs8I4B9O1XISSYoeJu"
}

4. Deploy the model:

POST /_plugins/_ml/models/wDIs8I4B9O1XISSYoeJu/_deploy

result

{
  "task_id": "yjIu8I4B9O1XISSY-OJj",
  "task_type": "DEPLOY_MODEL",
  "status": "COMPLETED"
}
5. Test model inference:
POST /_plugins/_ml/models/wDIs8I4B9O1XISSYoeJu/_predict
{
  "parameters": {
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Hello!"
      }
    ]
  }
}

result

{
  "inference_results": [
    {
      "output": [
        {
          "name": "response",
          "dataAsMap": {
            "output": {
              "finish_reason": "stop",
              "text": "Hello there! How can I assist you today?"
            },
            "usage": {
              "total_tokens": 18,
              "output_tokens": 10,
              "input_tokens": 8
            },
            "request_id": "24f11e08-ccd0-9657-97ba-261033a8f143"
          }
        }
      ],
      "status_code": 200
    }
  ]
}
6. Create a search pipeline:
PUT /_search/pipeline/rag_pipeline
{
  "response_processors": [
    {
      "retrieval_augmented_generation": {
        "tag": "qwen_pipeline_demo",
        "description": "Demo pipeline Using qwen Connector",
        "model_id": "wDIs8I4B9O1XISSYoeJu",
        "context_field_list": ["text"],
        "system_prompt": "You are a helpful assistant",
        "user_instructions": "Generate a concise and informative answer in less than 100 words for the given question"
      }
    }
  ]
}

result

{
  "acknowledged": true
}
7. Create an index for the RAG data:
PUT /my_rag_test_data
{
  "settings": {
    "index.search.default_pipeline" : "rag_pipeline"
  },
  "mappings": {
    "properties": {
      "text": {
        "type": "text"
      }
    }
  }
}

result

{
  "acknowledged": true,
  "shards_acknowledged": true,
  "index": "my_rag_test_data"
}

8. Ingest the supplementary data into the index:

POST _bulk
{"index": {"_index": "my_rag_test_data", "_id": "1"}}
{"text": "Abraham Lincoln was born on February 12, 1809, the second child of Thomas Lincoln and Nancy Hanks Lincoln, in a log cabin on Sinking Spring Farm near Hodgenville, Kentucky.[2] He was a descendant of Samuel Lincoln, an Englishman who migrated from Hingham, Norfolk, to its namesake, Hingham, Massachusetts, in 1638. The family then migrated west, passing through New Jersey, Pennsylvania, and Virginia.[3] Lincoln was also a descendant of the Harrison family of Virginia; his paternal grandfather and namesake, Captain Abraham Lincoln and wife Bathsheba (née Herring) moved the family from Virginia to Jefferson County, Kentucky.[b] The captain was killed in an Indian raid in 1786.[5] His children, including eight-year-old Thomas, Abraham's father, witnessed the attack.[6][c] Thomas then worked at odd jobs in Kentucky and Tennessee before the family settled in Hardin County, Kentucky, in the early 1800s."}
{"index": {"_index": "my_rag_test_data", "_id": "2"}}
{"text": "Chart and table of population level and growth rate for the New York City metro area from 1950 to 2023. United Nations population projections are also included through the year 2035.\\nThe current metro area population of New York City in 2023 is 18,937,000, a 0.37% increase from 2022.\\nThe metro area population of New York City in 2022 was 18,867,000, a 0.23% increase from 2021.\\nThe metro area population of New York City in 2021 was 18,823,000, a 0.1% increase from 2020.\\nThe metro area population of New York City in 2020 was 18,804,000, a 0.01% decline from 2019."}

result

{
  "took": 7,
  "errors": false,
  "items": [
    {
      "index": {
        "_index": "my_rag_test_data",
        "_id": "1",
        "_version": 1,
        "result": "created",
        "_shards": {
          "total": 2,
          "successful": 2,
          "failed": 0
        },
        "_seq_no": 0,
        "_primary_term": 1,
        "status": 201
      }
    },
    {
      "index": {
        "_index": "my_rag_test_data",
        "_id": "2",
        "_version": 1,
        "result": "created",
        "_shards": {
          "total": 2,
          "successful": 2,
          "failed": 0
        },
        "_seq_no": 1,
        "_primary_term": 1,
        "status": 201
      }
    }
  ]
}
9. Create a conversation memory:
POST /_plugins/_ml/memory/
{
  "name": "Conversation about NYC population"
}

result

{
  "memory_id": "5DI08I4B9O1XISSYMuJU"
}

10. Search with the RAG pipeline:

GET /my_rag_test_data/_search
{
  "query": {
    "match": {
      "text": "What's the population of NYC metro area in 2023"
    }
  },
  "ext": {
    "generative_qa_parameters": {
      "llm_model": "qwen-72b-chat",
      "llm_question": "What's the population of NYC metro area in 2023",
      "memory_id": "5DI08I4B9O1XISSYMuJU",
      "context_size": 5,
      "message_size": 5,
      "timeout": 15
    }
  }
}

error

{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Please check the provided generative_qa_parameters are complete and non-null(https://opensearch.org/docs/latest/search-plugins/conversational-search/#rag-pipeline). Messages in the memory can not have Null value for input and response"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "Please check the provided generative_qa_parameters are complete and non-null(https://opensearch.org/docs/latest/search-plugins/conversational-search/#rag-pipeline). Messages in the memory can not have Null value for input and response"
  },
  "status": 400
}
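
The error complains about messages in the memory having null input and response. One thing that could be checked (assuming the ml-commons Get Messages API, `GET /_plugins/_ml/memory/<memory_id>/messages`) is whether the memory already contains such broken messages; a small helper sketch:

```python
import json
import urllib.request

MEMORY_ID = "5DI08I4B9O1XISSYMuJU"
URL = f"http://localhost:9200/_plugins/_ml/memory/{MEMORY_ID}/messages"

def null_messages(response_json):
    """Return messages whose input or response is null (None)."""
    return [m for m in response_json.get("messages", [])
            if m.get("input") is None or m.get("response") is None]

# Uncomment against a live cluster (host/port and auth are assumptions):
# with urllib.request.urlopen(URL) as r:
#     print(null_messages(json.load(r)))
```

If any such messages exist, deleting them or starting with a fresh memory might clear the error.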

After reading some documents, I still don’t understand what causes this error.

Could you please provide an answer? Thank you very much.