Create a connector to connect to remote models

POST /_plugins/_ml/connectors/_create
{
  "name": "OpenAI Chat Connector",
  "description": "The connector to public OpenAI model service for GPT 3.5",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "endpoint": "api.openai.com",
    "model": "gpt-3.5-turbo"
  },
  "credential": {
    "openAI_key": "api-key"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "https://${parameters.endpoint}/v1/chat/completions",
      "headers": {
        "Authorization": "Bearer ${credential.openAI_key}"
      },
      "request_body": "{ \"model\": \"${parameters.model}\", \"messages\": ${parameters.messages} }"
    }
  ]
}

When I send this request, I get the following error:
"Request failed to get to the server (status code: 502)"
Please help me sort this out.
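For context, the `${parameters.*}` and `${credential.*}` tokens in the connector body are placeholders that ml-commons fills in at predict time. A minimal sketch of that substitution in Python (illustrative only, not the actual ml-commons code):

```python
import re

def render(template: str, values: dict) -> str:
    # Replace ${...} placeholders with values from a flat lookup table.
    # Purely illustrative of how the connector URL and headers are expanded.
    return re.sub(r"\$\{([^}]+)\}", lambda m: str(values[m.group(1)]), template)

values = {
    "parameters.endpoint": "api.openai.com",
    "parameters.model": "gpt-3.5-turbo",
    "credential.openAI_key": "api-key",
}

url = render("https://${parameters.endpoint}/v1/chat/completions", values)
auth = render("Bearer ${credential.openAI_key}", values)
print(url)   # https://api.openai.com/v1/chat/completions
print(auth)  # Bearer api-key
```

If a placeholder resolves to the wrong value (e.g. a bad `endpoint`), the plugin ends up calling the wrong URL, which can surface as a 502 from the proxy layer.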


OpenSearch returns 502? Anything interesting in the logs?

The request itself looks good to me. I just compared it with a tutorial that I wrote and tested. Maybe it’s useful: Tutorial: RAG with OpenSearch via ml-commons - Sematext


PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.memory_feature_enabled": true,
    "plugins.ml_commons.rag_pipeline_feature_enabled": true
  }
}

When I run this, I get a 401 response:
{
  "message": "your request: '/_cluster/settings' payload is not allowed"
}

Yesterday I ran the query below:
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.connector_access_control_enabled": true
  }
}
and it returned 200:

{
  "acknowledged": true,
  "persistent": {
    "plugins": {
      "ml_commons": {
        "connector_access_control_enabled": "true"
      }
    }
  },
  "transient": {}
}

Sounds like a permission issue?

I ran "POST /_plugins/_ml/connectors/_create" in OpenSearch Dashboards (Dev Tools), but it does not work there.

Note that you can’t use a POST request in the Kibana console.

So I used the following Python code instead:
import boto3
import requests
from requests_aws4auth import AWS4Auth

host = 'domain-endpoint/'
region = 'region'
service = 'es'
credentials = boto3.Session().get_credentials()
awsauth = AWS4Auth(credentials.access_key, credentials.secret_key, region, service, session_token=credentials.token)

# Create the connector

path = '_plugins/_ml/connectors/_create'
url = host + path

payload = {
    "name": "sagemaker: embedding",
    "description": "Test connector for Sagemaker embedding model",
    "version": 1,
    "protocol": "aws_sigv4",
    "credential": {
        "roleArn": "arn:aws:iam::account-id:role/opensearch-sagemaker-role"
    },
    "parameters": {
        "region": "region",
        "service_name": "sagemaker"
    },
    "actions": [
        {
            "action_type": "predict",
            "method": "POST",
            "headers": {
                "content-type": "application/json"
            },
            "url": "https://runtime.sagemaker.region.amazonaws.com/endpoints/endpoint-id/invocations",
            "request_body": '{ "inputs": { "question": "${parameters.question}", "context": "${parameters.context}" } }'
        }
    ]
}
headers = {"Content-Type": "application/json"}

r = requests.post(url, auth=awsauth, json=payload, headers=headers)
print(r.status_code)
print(r.text)

There are two reasons your code worked:

  • The Dev Tools user is treated as anonymous, so you have to use an AWS user (or role) that has access to the secrets and roles.
  • The payload, if you notice, is different: it carries a role ARN instead of an API key.
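To make the second point concrete, here are the two credential blocks from the payloads above side by side (values copied from the thread; sketch only):

```python
# "http" protocol: a long-lived secret is stored in the connector and
# injected into the Authorization header at request time.
openai_credential = {"openAI_key": "api-key"}

# "aws_sigv4" protocol: no static secret is stored; the connector assumes
# an IAM role and each outgoing request is signed with temporary credentials.
sagemaker_credential = {"roleArn": "arn:aws:iam::account-id:role/opensearch-sagemaker-role"}

# The key names alone show which auth path each connector takes.
print(sorted(openai_credential))     # ['openAI_key']
print(sorted(sagemaker_credential))  # ['roleArn']
```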

Alternatively, you can use an SSH client to connect securely to the remote machine. For example, PuTTY lets you establish an SSH connection and perform tasks through a command-line interface.