Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
Latest
Describe the issue:
Hello, I followed the steps in the Open Search External Model tutorial, and after completing them I sent a test request:
POST /_plugins/_ml/models/r4JTNI8BptvQ_tonNlLA/_predict
{
  "parameters": {
    "prompt": "\n\nHuman:hello\n\nAssistant:"
  }
}
I get this error back:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "localhost"
      }
    ],
    "type": "illegal_argument_exception",
    "reason": "localhost"
  },
  "status": 400
}
This is how I created the connector:
POST /_plugins/_ml/connectors/_create
{
  "name": "Local Ollama Connector",
  "description": "The connector to local Ollama service",
  "version": 1,
  "protocol": "http",
  "parameters": {
    "service_name": "ollama",
    "model": "llama3",
    "content_type": "application/json",
    "max_tokens_to_sample": 8000,
    "temperature": 0.0001,
    "response_filter": "$.completion"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "http://localhost:11434/api/generate",
      "headers": {
        "content-type": "application/json"
      },
      "request_body": "{\"model\": \"${parameters.model}\",\"prompt\":\"${parameters.prompt}\", \"max_tokens_to_sample\":${parameters.max_tokens_to_sample}, \"temperature\":${parameters.temperature}}"
    }
  ]
}
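
For reference, this is the raw Ollama request the connector is meant to reproduce (assuming Ollama's default port 11434 and that the llama3 model has already been pulled; "stream": false is set so the reply comes back as a single JSON object):

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "\n\nHuman:hello\n\nAssistant:",
  "stream": false
}'

One thing I am unsure about: Ollama's generate response puts the text in a "response" field rather than "completion", so my response_filter may also need changing once the connection itself works.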
The tutorial shows how to set up a connector for an AWS-hosted model, but I want to set one up for a locally hosted Ollama instance.
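
My guess is that the cluster is rejecting the localhost URL up front rather than failing to reach it. If so, something like the following might be needed first (a sketch only; the setting names are taken from the ML Commons docs and I have not verified that both apply to my version):

PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.connector.private_ip_enabled": true,
    "plugins.ml_commons.trusted_connector_endpoints_regex": [
      "^http://localhost:11434/.*$"
    ]
  }
}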
Any ideas?
Thanks