Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
Latest
Describe the issue:
Hello, I followed the steps in the OpenSearch externally hosted model tutorial, and after completing them I sent a test request:
POST /_plugins/_ml/models/r4JTNI8BptvQ_tonNlLA/_predict
{
  "parameters": {
    "prompt": "\n\nHuman:hello\n\nAssistant:"
  }
}
I get this error back:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "localhost"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "localhost"
  },
  "status": 400
}
This is how I created the connector:
POST /_plugins/_ml/connectors/_create
{
  "name": "Local Ollama Connector",
  "description": "The connector to local Ollama service",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "service_name": "ollama",
    "model": "llama3",
    "content_type": "application/json",
    "max_tokens_to_sample": 8000,
    "temperature": 0.0001,
    "response_filter": "$.completion"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "http://localhost:11434/api/generate",
      "headers": {
        "content-type": "application/json"
      },
      "request_body": "{ \"model\": \"${parameters.model}\", \"prompt\": \"${parameters.prompt}\", \"max_tokens_to_sample\": ${parameters.max_tokens_to_sample}, \"temperature\": ${parameters.temperature} }"
    }
  ]
}
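For completeness, after the connector create call returned an id, I registered and deployed the model against it roughly like this (the connector_id below is a placeholder for the id returned by the create call):

POST /_plugins/_ml/models/_register?deploy=true
{
  "name": "ollama-llama3",
  "function_name": "remote",
  "description": "llama3 served by a local Ollama instance",
  "connector_id": "<connector_id from the create response>"
}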
The tutorial shows how to set up a connector for an AWS-hosted model, but I want to point it at a locally hosted Ollama instance.
Any ideas?
Thanks
In the end I think it is this bug / feature request:
GitHub issue (opened 20 Feb 2024, labels: enhancement, v2.14.0):
I have a use case as well that involves using an "externally hosted model" that is self-hosted and located within a private network (or, more simply, another use case is if I'm using an API gateway that has a private IP address); however, it seems there is a hard-coded requirement that externally hosted models cannot have a private IP address:
https://github.com/opensearch-project/ml-commons/blob/0903d5da4bc9fb8051621de05759dbdd36613972/ml-algorithms/src/main/java/org/opensearch/ml/engine/httpclient/MLHttpClientFactory.java#L77-L84
This seems like an arbitrary restriction, which I think should either be removed or only applied when a config flag is provided.
“Cannot use private ip address for model”
ylwu (May 1, 2024, 8:38pm):
Yes, it's by design to block local IPs, mainly for security considerations. We can continue the discussion on the GitHub issue.
I saw your post on GitHub, thank you for the reply and the explanation. The idea is good, but as others have said, there needs to be an option to disable the check so that we can test locally and show a proof of concept before moving things to the cloud. Thanks.
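For anyone else testing locally, the cluster settings that look relevant are sketched below. The setting names are taken from the ml-commons cluster settings documentation, but the private-IP toggle only exists in newer releases, so verify both against your version before relying on this:

PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.trusted_connector_endpoints_regex": [
      "^http://localhost:11434/.*$"
    ],
    "plugins.ml_commons.connector.private_ip_enabled": true
  }
}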
ylwu (May 1, 2024, 10:01pm):
Makes sense, we will check with the security team first.
system (June 30, 2024, 10:02pm):
This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.