Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
OpenSearch: 2.11
Describe the issue:
I am trying to use an external model (deployed on SageMaker). I created a connector like this:
{
  "name": "sagemaker_connector_test_10_19_01",
  "description": "The testing connection to sagemaker endpoint",
  "version": "1.0.0",
  "protocol": "aws_sigv4",
  "credential": {
    "access_key": "{{aws_access_key_id}}",
    "secret_key": "{{aws_secret_access_key}}",
    "session_token": "{{aws_session_token}}"
  },
  "parameters": {
    "region": "us-west-2",
    "service_name": "sagemaker"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "headers": {
        "content-type": "application/json"
      },
      "url": "https://runtime.sagemaker.us-west-2.amazonaws.com/endpoints/huggingface-pytorch-inference-2023-10-17-06-37-51-037/invocations",
      "request_body": "{ \"text_doc\": \"${parameters.text_doc}\" }"
    }
  ]
}
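For context, I then registered and deployed a remote model that uses this connector. The calls below are only a rough sketch of that step; the model name, connector ID, and model ID are placeholders rather than my actual values.

POST /_plugins/_ml/models/_register
{
  "name": "sagemaker-embedding-remote",
  "function_name": "remote",
  "description": "Remote model backed by the SageMaker endpoint above",
  "connector_id": "<connector_id>"
}

POST /_plugins/_ml/models/<model_id>/_deploy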
I can make predictions by sending a request body like this:
{
  "parameters": { "text_doc": "dog" }
}
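To be precise, the full predict call goes to the deployed remote model, roughly like this (the model ID is a placeholder):

POST /_plugins/_ml/models/<model_id>/_predict
{
  "parameters": { "text_doc": "dog" }
}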
How do I change the connector so that I can send the request body like this:
{
  "text_doc": "dog"
}
In the official documentation, the request is made like this; there is no additional "parameters" wrapper:
POST /_plugins/_ml/_predict/text_embedding/WWQI44MBbzI2oUKAvNUt
{
"text_docs":[ "today is sunny"],
"return_number": true,
"target_response": ["sentence_embedding"]
}
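For comparison, the documentation's URL embeds the algorithm name and model ID directly, while my remote model is called through the models API (the IDs below are placeholders):

POST /_plugins/_ml/_predict/text_embedding/<model_id>
vs.
POST /_plugins/_ml/models/<model_id>/_predict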
Configuration:
Relevant Logs or Screenshots: