Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
12.2
Describe the issue:
I created a connector to Amazon Bedrock as an external ML engine and registered an agent on top of it. However, when I try to execute the agent, it fails with the following error:
Cannot invoke "org.opensearch.ml.common.output.Output.toXContent(...)" because "this.output" is null
This only happens when I go through the agent; if I query the model directly:
POST /_plugins/_ml/models/-EoBH44BwS0-eTAGiweE/_predict
{
  "parameters": {
    "question": "Que dia fue ayer"
  }
}
it works.
Configuration:
Connector:
POST /_plugins/_ml/connectors/_create
{
  "name": "Amazon Bedrock",
  "description": "Connector for Amazon Bedrock (Llama 2 13B)",
  "version": 1,
  "protocol": "aws_sigv4",
  "credential": {
    "access_key": "////////////",
    "secret_key": "////////////"
  },
  "parameters": {
    "region": "us-east-1",
    "service_name": "bedrock",
    "model_name": "meta.llama2-13b-chat-v1"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "headers": {
        "content-type": "application/json"
      },
      "url": "https://bedrock-runtime.${parameters.region}.amazonaws.com/model/${parameters.model_name}/invoke",
      "request_body": "{\"prompt\":\"${parameters.question}\"}"
    }
  ]
}
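One thing I have not been able to rule out: Bedrock's Llama 2 invoke API returns the generated text under a `generation` field, and some Bedrock connector blueprints map the model output with a `response_filter` on the predict action. I have not confirmed that this applies here, so treat the `response_filter` line and the `$.generation` path below as assumptions, but the action would then look like:

```json
{
  "action_type": "predict",
  "method": "POST",
  "headers": {
    "content-type": "application/json"
  },
  "url": "https://bedrock-runtime.${parameters.region}.amazonaws.com/model/${parameters.model_name}/invoke",
  "request_body": "{\"prompt\":\"${parameters.question}\"}",
  "response_filter": "$.generation"
}
```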
Agents:
POST /_plugins/_ml/agents/_register
{
  "name": "prod agent for embedding model",
  "type": "flow",
  "description": "this is a test agent for production",
  "tools": [
    {
      "type": "MLModelTool",
      "description": "A general tool to answer any question",
      "parameters": {
        "model_id": "-EoBH44BwS0-eTAGiweE",
        "prompt": "\n\nHuman:${parameters.question}\n\nAssistant:"
      }
    }
  ]
}
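A possible mismatch I noticed while writing this up (not confirmed as the cause): the tool builds a `prompt` parameter from the template above, but the connector's `request_body` references `${parameters.question}`, so the tool's prompt template may never reach Bedrock. If the connector is meant to consume the tool's rendered template, the body would presumably reference `prompt` instead:

```json
"request_body": "{\"prompt\":\"${parameters.prompt}\"}"
```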
Agent test:
POST /_plugins/_ml/agents/J0oHH44BwS0-eTAGYggn/_execute
{
  "parameters": {
    "question": "que dia fue ayer"
  }
}
Relevant Logs or Screenshots:
{
  "error": {
    "reason": "System Error",
    "details": "Cannot invoke \"org.opensearch.ml.common.output.Output.toXContent(org.opensearch.core.xcontent.XContentBuilder, org.opensearch.core.xcontent.ToXContent$Params)\" because \"this.output\" is null",
    "type": "NullPointerException"
  },
  "status": 500
}