No handler found for uri [/_plugins/_ml/models] and method [GET]

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
image: opensearchproject/opensearch:latest # This should be the same image used for opensearch-node1 to avoid issues

Describe the issue:

I am developing with OpenSearch locally and want to register a model according to the documentation here: Register model - OpenSearch Documentation

Like this:

def setup_neural_sparse_model(self):
    """Registers and deploys a SPLADE model using the ML Commons plugin."""
    model_registration = {
        "name": "amazon/neural-sparse/opensearch-neural-sparse-encoding-doc-v3-distill",
        "version": "1.0.0",
        "model_group_id": "Z1eQf4oB5Vm0Tdw8EIP2",
        "model_format": "TORCH_SCRIPT"
    }
    try:
        response = self.os.transport.perform_request(
            method="POST",
            url="/_plugins/_ml/models/_register",
            body=model_registration
        )
        model_id = response.get("model_id")
        print(f"Model registered with ID: {model_id}")
    except Exception as e:
        print(f"Error registering model: {e}")
        return None

But I get this error:

Error: RequestError(400, 'no handler found for uri [/_plugins/_ml/models] and method [GET]', 'no handler found for uri [/_plugins/_ml/models] and method [GET]')

Configuration:

If I exec into the container I can see that the ML plugin is there.
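The same check can also be done over the REST API instead of exec'ing into the container. A rough, untested sketch, assuming the same self.os client as in the code above (the _cat/plugins column names are what I'd expect, but treat them as an assumption):

# Sketch: confirm installed plugins via GET /_cat/plugins rather than docker exec.
# Assumes self.os is the same opensearch-py client used in setup_neural_sparse_model.
plugins = self.os.transport.perform_request(
    method="GET",
    url="/_cat/plugins?format=json",
)
for entry in plugins:
    # Look for the ML Commons plugin entry on each node.
    if "ml" in entry.get("component", ""):
        print(entry.get("name"), entry.get("component"), entry.get("version"))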

Relevant Logs or Screenshots:

@ithompson That is the expected response. To list installed models, try the request below:

POST /_plugins/_ml/models/_search
{
  "query": {
    "match_all": {}
  },
  "size": 1000
}
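From Python that would be roughly the following (untested sketch, reusing the self.os transport pattern from your snippet; the fields printed from _source are assumptions about what the model documents contain):

# Sketch: run the same model search from Python instead of Dev Tools.
response = self.os.transport.perform_request(
    method="POST",
    url="/_plugins/_ml/models/_search",
    body={"query": {"match_all": {}}, "size": 1000},
)
for hit in response["hits"]["hits"]:
    print(hit["_id"], hit["_source"].get("name"), hit["_source"].get("model_state"))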

@pablo

Thanks for the quick reply. The model search endpoint returns 0 results:

{'took': 0, 'timed_out': False, '_shards': {'total': 0, 'successful': 0, 'skipped': 0, 'failed': 0}, 'hits': {'total': {'value': 0, 'relation': 'eq'}, 'max_score': 0.0, 'hits': []}}

My question would be: how do I register a model with this endpoint? I'm sending a POST, but the error says I was using GET:

response = self.os.transport.perform_request(
    method="POST",
    url="/_plugins/_ml/models/_register",
    body=model_registration
)

Ah, I think I see. I'm porting over from Elasticsearch, so the get("model_id") call was incorrect. The register model endpoint returned a task_id, and I'm guessing I then need to deploy the model once it's registered, and then set up the ingest pipeline.
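For anyone who finds this later, the flow I'm now aiming for looks roughly like the sketch below. It's untested: it reuses the perform_request pattern from the snippets above (client stands for whatever self.os is), the endpoint paths are the ML Commons register/task/deploy APIs, and the polling interval, timeout, and error handling are arbitrary placeholder choices.

import time

def register_and_deploy_model(client, model_registration, poll_interval=5, timeout=600):
    """Register a model, wait for the registration task, then deploy it.

    Sketch only: `client` is an opensearch-py client (self.os in my code);
    poll_interval and timeout are placeholder values.
    """
    # 1. Register: this returns a task_id, not a model_id.
    register_resp = client.transport.perform_request(
        method="POST",
        url="/_plugins/_ml/models/_register",
        body=model_registration,
    )
    task_id = register_resp["task_id"]

    # 2. Poll the registration task until it completes and exposes the model_id.
    model_id = None
    deadline = time.time() + timeout
    while time.time() < deadline:
        task = client.transport.perform_request(
            method="GET",
            url=f"/_plugins/_ml/tasks/{task_id}",
        )
        if task.get("state") == "COMPLETED":
            model_id = task["model_id"]
            break
        if task.get("state") in ("FAILED", "COMPLETED_WITH_ERROR"):
            raise RuntimeError(f"Model registration failed: {task}")
        time.sleep(poll_interval)
    if model_id is None:
        raise TimeoutError("Model registration did not complete in time")

    # 3. Deploy the registered model (this also returns a task_id that can be
    #    polled the same way before using the model).
    client.transport.perform_request(
        method="POST",
        url=f"/_plugins/_ml/models/{model_id}/_deploy",
    )
    return model_id

Once the deploy task finishes, that model_id should be what the sparse_encoding processor in the ingest pipeline references.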