Connecting to externally hosted models error "illegal_argument_exception"

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
Latest

Describe the issue:
Hello, I followed the steps here: Open Search External Model. After completing them, I sent a test request:

POST /_plugins/_ml/models/r4JTNI8BptvQ_tonNlLA/_predict
{
  "parameters": {
    "prompt": "\n\nHuman:hello\n\nAssistant:"
  }
}

I get this error back:

{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "localhost"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "localhost"
  },
  "status": 400
}

This is how I created the connector:

POST /_plugins/_ml/connectors/_create
{
  "name": "Local Ollama Connector",
  "description": "The connector to local Ollama service",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "service_name": "ollama",
    "model": "llama3",
    "content_type": "application/json",
    "max_tokens_to_sample": 8000,
    "temperature": 0.0001,
    "response_filter": "$.completion"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "http://localhost:11434/api/generate",
      "headers": {
        "content-type": "application/json"
      },
      "request_body": "{\"model\": \"${parameters.model}\", \"prompt\": \"${parameters.prompt}\", \"max_tokens_to_sample\": ${parameters.max_tokens_to_sample}, \"temperature\": ${parameters.temperature}}"
    }
  ]
}
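For context on what the connector does with that definition: `request_body` is a template, and before the HTTP call is made each `${parameters.x}` placeholder is filled in from the `parameters` map (including the `prompt` passed in the `_predict` request). The actual substitution happens inside ml-commons in Java; the sketch below is only an illustrative Python approximation of that templating step, not the plugin's real code:

```python
import re

# Parameters as defined in the connector, plus the per-request "prompt".
params = {
    "model": "llama3",
    "prompt": "hello",
    "max_tokens_to_sample": 8000,
    "temperature": 0.0001,
}

# The request_body template from the connector definition
# (inner quotes are escaped in the real JSON payload).
template = (
    '{"model": "${parameters.model}", "prompt": "${parameters.prompt}", '
    '"max_tokens_to_sample": ${parameters.max_tokens_to_sample}, '
    '"temperature": ${parameters.temperature}}'
)

def fill(template: str, params: dict) -> str:
    # Replace each ${parameters.key} with the corresponding value.
    return re.sub(
        r"\$\{parameters\.(\w+)\}",
        lambda m: str(params[m.group(1)]),
        template,
    )

body = fill(template, params)
print(body)
```

The resulting string is what the connector would POST to `http://localhost:11434/api/generate` — which is where the request is rejected, because the endpoint host resolves to a private address.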

The tutorial shows how to set up a connector for an AWS-hosted model, but I want to set one up for a locally hosted Ollama.

Any ideas?

Thanks

In the end I think it is this bug / feature request:

“Cannot use private ip address for model”

Yes, it’s by design to block local IPs, mainly for security reasons. We can continue the discussion on the GitHub issue.


I saw your post on GitHub; thank you for the reply and the explanation. The idea is good, but as others have said, there needs to be an option to disable the check so that we can test locally and show a proof of concept before moving things to the cloud. Thanks

Makes sense; we will check with the security team first.
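For anyone finding this thread later: if I understand the outcome of that GitHub discussion correctly, later ml-commons releases added a cluster setting to relax the private-IP check for exactly this local-testing use case. The setting name below is my understanding and may not exist in your release, so please verify against your version's documentation before relying on it:

```
PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.connector.private_ip_enabled": true
  }
}
```

Enabling it on a production cluster reintroduces the SSRF-style risk the check was designed to prevent, so it should only be turned on for local experimentation.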