Add Custom CA for internal HTTPS endpoints

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser): 3.3.0

Describe the issue: I am trying to connect my local (Docker) OpenSearch cluster to an LLM that can only be accessed through our internal OpenRouter installation, which is secured by our internal CA. I am unable to provide the CA to OpenSearch for outgoing HTTPS connections and end up with:

Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target:

Configuration:

My opensearch.yml

plugins.security.ssl.transport.pemcert_filepath: /usr/share/opensearch/config/node.pem

plugins.security.ssl.transport.pemkey_filepath: /usr/share/opensearch/config/node-key.pem

plugins.security.ssl.transport.pemtrustedcas_filepath: /usr/share/opensearch/config/root-ca.pem

plugins.security.ssl.transport.enforce_hostname_verification: false

plugins.security.ssl.http.enabled: true

plugins.security.ssl.http.pemcert_filepath: /usr/share/opensearch/config/node.pem

plugins.security.ssl.http.pemkey_filepath: /usr/share/opensearch/config/node-key.pem

plugins.security.ssl.http.pemtrustedcas_filepath: /usr/share/opensearch/config/root-ca.pem

plugins.security.allow_default_init_securityindex: true

plugins.security.authcz.admin_dn:

  - CN=A,OU=UNIT,O=ORG,L=TORONTO,ST=ONTARIO,C=CA

plugins.security.nodes_dn:

  - 'CN=node1.dns.a-record,OU=UNIT,O=ORG,L=TORONTO,ST=ONTARIO,C=CA'

  - 'CN=node2.dns.a-record,OU=UNIT,O=ORG,L=TORONTO,ST=ONTARIO,C=CA'

plugins.security.audit.type: internal_opensearch

plugins.security.enable_snapshot_restore_privilege: true

plugins.security.check_snapshot_restore_write_privileges: true

plugins.security.restapi.roles_enabled: ["all_access", "security_rest_api_access"]

plugins.security.ssl.transport.truststore_filepath: truststore.jks


cluster.routing.allocation.disk.threshold_enabled: false

opendistro_security.audit.config.disabled_rest_categories: NONE

opendistro_security.audit.config.disabled_transport_categories: NONE



network.host: 0.0.0.0

http.host: 0.0.0.0

transport.host: 0.0.0.0

I've also followed What CA is used by notifications channels? - #4 by AMKIO, but no luck.

How exactly can I provide a CA to OpenSearch to connect to remote HTTPS endpoints?

Relevant Logs or Screenshots:

@FalcoSuessgott I've tested the solution provided in the shared link and was able to connect to my remote Ollama instance over HTTPS with a self-signed certificate.

These were my steps.

  1. Copy the cacerts keystore from the running OpenSearch Docker container:
docker cp opensearch-node1_3.3.0:/usr/share/opensearch/jdk/lib/security/cacerts .
  2. Get the certificate from the Ollama instance:
openssl s_client -connect ollama.pablo.local:443
  3. Save the certificate into ollama.crt.
  4. Add ollama.crt to the cacerts keystore:
keytool -import -noprompt -trustcacerts -alias ollama -file ollama.crt -keystore cacerts -storepass changeit
  5. Confirm that the Ollama certificate is in the keystore:
keytool -keystore cacerts -storepass changeit -list | grep ollama

output:

ollama, Nov 3, 2025, trustedCertEntry,
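As a side note, the interactive `openssl s_client` session in steps 2–3 can be done non-interactively in one pipeline (a sketch using the hostname from this thread; substitute your own endpoint):

```shell
# Fetch the server certificate and write only the PEM block to ollama.crt.
# </dev/null closes the connection immediately; -servername sets SNI.
openssl s_client -connect ollama.pablo.local:443 -servername ollama.pablo.local </dev/null 2>/dev/null \
  | openssl x509 -outform PEM > ollama.crt
```

Note that `openssl x509` keeps only the first (leaf) certificate in the input; if the server presents a chain and you want an intermediate or root CA instead, add `-showcerts` to `s_client` and pick the right PEM block.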

  6. Map the cacerts keystore into each OpenSearch Docker container. I also added it to OpenSearch Dashboards.

OpenSearch:

    volumes:
      - opensearch-data1:/usr/share/opensearch/data
      ...
      - ./certs/cacerts:/usr/share/opensearch/jdk/lib/security/cacerts

OpenSearch Dashboards

    volumes:
      ...
      - ./certs/cacerts:/usr/share/opensearch-dashboards/jdk/lib/security/cacerts
  7. Start Docker Compose.
  8. Create a connector with the Ollama HTTPS address:
POST /_plugins/_ml/connectors/_create
{
  "name": "Llama-3.3-70B-Instruct Connector",
  "description": "Connector for Llama-3.3-70B-Instruct",
  "protocol": "http",
  "version": 1,
  "parameters": {
    "model": "llama3.1:8b",
    "temperature": 0.7,
    "max_tokens": 500,
    "endpoint": "ollama.pablo.local:443"
  },
  "credential": {
    "api_key": "123456789123456789123456789"
  },
  "client_config" : {
    "read_timeout": 60000,
    "connection_timeout": 30000,
    "max_connection": 256,
    "max_retry_times": 3,
    "retry_backoff_policy": "exponential_full_jitter"
  },
  "actions": [
    {
      "action_type": "predict",
      "method": "POST",
      "url": "https://${parameters.endpoint}/v1/chat/completions",
      "headers": {
        "Content-Type": "application/json",
        "Authorization": "Bearer ${credential.api_key}"
      },
      "request_body": "{ \"model\": \"${parameters.model}\",  \"messages\": ${parameters.messages}, \"temperature\": ${parameters.temperature}, \"max_tokens\": ${parameters.max_tokens}, \"stream\": false }"
    }
  ]
}
  9. Test the model:
POST /_plugins/_ml/models/es3XSZoBLdsWuLU2kDy5/_predict
{
 "parameters": {
   "messages": [
     {
       "role": "system",
       "content": "You are a helpful assistant."
     },
     {
       "role": "user",
       "content": "Hello!What is the name of this model??"
     }
   ]
 }
}

output:

{
  "inference_results": [
    {
      "output": [
        {
          "name": "response",
          "dataAsMap": {
            "id": "chatcmpl-358",
            "object": "chat.completion",
            "created": 1762175496,
            "model": "llama3.1:8b",
            "system_fingerprint": "fp_ollama",
            "choices": [
              {
                "index": 0,
                "message": {
                  "role": "assistant",
                  "content":
...
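An alternative to overwriting the JDK's bundled `cacerts` file (a sketch only, not tested in this thread; file names and the password are placeholders): mount a dedicated truststore and point the JVM at it through the standard `javax.net.ssl` system properties, using the `OPENSEARCH_JAVA_OPTS` environment variable that the OpenSearch Docker image reads.

```yaml
# Hypothetical docker-compose fragment. -Djavax.net.ssl.trustStore makes the
# JVM use this truststore for outgoing TLS connections instead of the
# bundled cacerts file.
services:
  opensearch-node1:
    environment:
      - "OPENSEARCH_JAVA_OPTS=-Djavax.net.ssl.trustStore=/usr/share/opensearch/config/custom-truststore.jks -Djavax.net.ssl.trustStorePassword=changeit"
    volumes:
      - ./certs/custom-truststore.jks:/usr/share/opensearch/config/custom-truststore.jks
```

One caveat: setting `javax.net.ssl.trustStore` replaces the default trust store entirely, so the custom truststore must also contain any public CAs the node still needs to reach other HTTPS endpoints.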

Thanks so much, that worked like a charm! I was so close haha
