Configure Kafka sink for SASL_SSL

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):

Data Prepper 2.12.0

Describe the issue:

There is barely any documentation on the Kafka sink. How do I configure the plugin to send messages over the SASL_SSL protocol? Is this feature supported?

For context, we are hosting Data Prepper on ECS as part of an effort to migrate away from a Logstash pipeline that sends to Kafka with security_protocol => "SASL_SSL".

Configuration:

What I have tried:

  sink:
  - kafka:
      bootstrap_servers:
      - "server1.domain.net:9094"
      - "server2.domain.net:9094"
      - "server3.domain.net:9094"
      - "server4.domain.net:9094"
      - "server5.domain.net:9094"
      - "server6.domain.net:9094"
      topic:
        name: TOPIC_NAME
      serde_format: "json"
      partition_key: "messageTimestamp"
      producer_properties:
        enable_idempotence: true
        retries: 1
      encryption:
        trust_store_file_path: /usr/share/data-prepper/certs/truststore.jks
        trust_store_password: ********
      authentication:
        sasl:
          scram:
            username: "user"
            password: "pass"
            mechanism: "SCRAM-SHA-256"

Relevant Logs or Screenshots:

Error:

2025-09-03T00:08:46,255 [kafka-producer-network-thread | producer-1] WARN  org.apache.kafka.clients.NetworkClient$DefaultMetadataUpdater - [Producer clientId=producer-1] Bootstrap broker server4.domain.net:9094 (id: -4 rack: null) disconnected

2025-09-02T21:13:56,164 [pool-5-thread-1] ERROR org.opensearch.dataprepper.plugins.kafka.producer.KafkaCustomProducer - Error occurred while publishing Topic TOPIC_NAME not present in metadata after 60000 ms.

According to Solved: Problem kafka producer - WARN clients.NetworkClien… - Cloudera Community - 347167, these error messages indicate a security misconfiguration.
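One way to isolate a security misconfiguration is to reproduce the connection outside Data Prepper with the stock Kafka console producer, using the same credentials and truststore the sink is given. A minimal sketch, assuming the Kafka CLI tools are available on a host with network access to the brokers (the username, password, and truststore password below are placeholders, echoing the values in the config above):

```shell
# Reproduce the sink's SASL_SSL settings as plain Kafka client properties.
# Credentials and truststore path are placeholders copied from the pipeline
# config above; substitute the real values before testing.
cat > client.properties <<'EOF'
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-256
sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required \
  username="user" \
  password="pass";
ssl.truststore.location=/usr/share/data-prepper/certs/truststore.jks
ssl.truststore.password=changeit
EOF

# Sanity-check that the protocol settings were written as expected.
grep 'security.protocol' client.properties

# On a host with the Kafka CLI tools installed, try producing a test message
# with the same credentials the sink uses:
#   kafka-console-producer.sh \
#     --bootstrap-server server1.domain.net:9094 \
#     --topic TOPIC_NAME \
#     --producer.config client.properties
```

If the console producer fails with the same "Bootstrap broker … disconnected" warnings, the problem is in the credentials, truststore, or broker listener configuration rather than in Data Prepper itself.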

Hi @nanochristal ,

Are you still seeing an issue here, or did you manage to get it working?

Leeroy.