Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
Describe the issue:
I’ve configured OpenSearch and Data Prepper locally and set up a pipeline to test the otel_logs_source plugin. Although the setup appears to run without errors and I’m sending data to it, no logs are being ingested into the OpenSearch index.
Configuration:
OpenSearch docker-compose.yaml:
services:
  opensearch-node1:
    image: opensearchproject/opensearch:latest
    container_name: opensearch-node1
    environment:
      - cluster.name=opensearch-cluster
      - node.name=opensearch-node1
      - discovery.seed_hosts=opensearch-node1,opensearch-node2
      - cluster.initial_cluster_manager_nodes=opensearch-node1,opensearch-node2
      - bootstrap.memory_lock=true
      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
      - OPENSEARCH_INITIAL_ADMIN_PASSWORD=Aravinth@31
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536
        hard: 65536
    volumes:
      - opensearch-data1:/usr/share/opensearch/data
    ports:
      - 9200:9200
      - 9600:9600
    networks:
      - opensearch-net
  opensearch-node2:
    image: opensearchproject/opensearch:latest
    container_name: opensearch-node2
    ports:
      - 9201:9200
    environment:
      - cluster.name=opensearch-cluster
      - node.name=opensearch-node2
      - discovery.seed_hosts=opensearch-node1,opensearch-node2
      - cluster.initial_cluster_manager_nodes=opensearch-node1,opensearch-node2
      - bootstrap.memory_lock=true
      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
      - OPENSEARCH_INITIAL_ADMIN_PASSWORD=Aravinth@31
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536
        hard: 65536
    volumes:
      - opensearch-data2:/usr/share/opensearch/data
    networks:
      - opensearch-net
  opensearch-dashboards:
    image: opensearchproject/opensearch-dashboards:latest
    container_name: opensearch-dashboards
    ports:
      - 5601:5601
    expose:
      - "5601"
    environment:
      OPENSEARCH_HOSTS: '["https://opensearch-node1:9200","https://opensearch-node2:9200"]'
    networks:
      - opensearch-net
volumes:
  opensearch-data1:
  opensearch-data2:
networks:
  opensearch-net:
    external: true
Data Prepper docker-compose.yaml:
services:
  data-prepper:
    image: opensearchproject/data-prepper:latest
    container_name: data-prepper
    platform: linux/amd64
    ports:
      - "2021:2021"
      - "21892:21892"
      - "4900:4900"
    volumes:
      - ./pipelines.yaml:/usr/share/data-prepper/pipelines/pipelines.yaml
      - ./data-prepper-config.yaml:/usr/share/data-prepper/config/data-prepper-config.yaml
    networks:
      - opensearch-net
volumes:
  data-prepper:
networks:
  opensearch-net:
    external: true
data-prepper-config.yaml:
ssl: false
serverPort: 4900
authentication:
  http_basic:
    username: admin
    password: Aravinth@31
pipelines.yaml:
version: "2"
log-pipeline:
  source:
    otel_logs_source:
      path: /events/ingest
      ssl: false
  processor:
    - add_entries:
        entries:
          - metadata_key: "doc_id"
            value_expression: "/log.attributes.EventUuId"
    - rename_keys:
        entries:
          - from_key: "body"
            to_key: "eventName"
          - from_key: "log.attributes.ExternalIp"
            to_key: "resource.attributes.ExternalIp"
          - from_key: "log.attributes.EndpointId"
            to_key: "resource.attributes.EndpointId"
          - from_key: "log.attributes.LocalIp"
            to_key: "resource.attributes.LocalIp"
          - from_key: "log.attributes.TenantId"
            to_key: "resource.attributes.TenantId"
          - from_key: "log.attributes.DeviceName"
            to_key: "resource.attributes.DeviceName"
          - from_key: "log.attributes.AgentVersion"
            to_key: "instrumentationScope.version"
    - convert_entry_type:
        key: "eventName"
        type: "integer"
    - grok:
        match:
          log.attributes.Hashes:
            - "SHA1=%{NOTSPACE:log.attributes.HashSHA1},MD5=%{NOTSPACE:log.attributes.HashMD5},SHA256=%{NOTSPACE:log.attributes.HashSHA256},IMPHASH=%{NOTSPACE:log.attributes.ImpHash}"
    - delete_entries:
        with_keys: ["log.attributes.EventUuId", "log.attributes.Hashes"]
  sink:
    - opensearch:
        hosts:
          - "https://opensearch-node1:9200"
        index: "data-prepper-0001"
        username: admin
        password: Aravinth@31
        document_id: '${getMetadata("doc_id")}'
        insecure: true
Relevant Logs or Screenshots:
2025-09-02 16:16:46 Reading pipelines and data-prepper configuration files from Data Prepper home directory.
2025-09-02 16:16:46 /usr/bin/java
2025-09-02 16:16:46 Found openjdk version of 17.0
2025-09-02 16:16:49 2025-09-02T10:46:49,826 [main] INFO org.opensearch.dataprepper.pipeline.parser.transformer.DynamicConfigTransformer - No transformation needed
2025-09-02 16:16:51 2025-09-02T10:46:51,473 [main] INFO org.opensearch.dataprepper.plugins.kafka.extension.KafkaClusterConfigExtension - Applying Kafka Cluster Config Extension.
2025-09-02 16:16:52 2025-09-02T10:46:52,362 [main] WARN org.opensearch.dataprepper.plugins.source.otellogs.OTelLogsSource - Creating otel-logs-source without authentication. This is not secure.
2025-09-02 16:16:52 2025-09-02T10:46:52,362 [main] WARN org.opensearch.dataprepper.plugins.source.otellogs.OTelLogsSource - In order to set up Http Basic authentication for the otel-logs-source, go here: https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/otel-logs-source#authentication-configurations
2025-09-02 16:16:52 2025-09-02T10:46:52,727 [log-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initializing OpenSearch sink
2025-09-02 16:16:52 2025-09-02T10:46:52,734 [log-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the username provided in the config.
2025-09-02 16:16:52 2025-09-02T10:46:52,752 [main] WARN org.opensearch.dataprepper.core.pipeline.server.HttpServerProvider - Creating Data Prepper server without TLS. This is not secure.
2025-09-02 16:16:52 2025-09-02T10:46:52,752 [main] WARN org.opensearch.dataprepper.core.pipeline.server.HttpServerProvider - In order to set up TLS for the Data Prepper server, go here: https://github.com/opensearch-project/data-prepper/blob/main/docs/configuration.md#server-configuration
2025-09-02 16:16:53 2025-09-02T10:46:53,032 [log-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the trust all strategy
2025-09-02 16:16:53 2025-09-02T10:46:53,534 [log-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initialized OpenSearch sink
2025-09-02 16:16:54 2025-09-02T10:46:54,226 [log-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.plugins.server.CreateServer - Adding service with path: /events/ingest
2025-09-02 16:16:54 2025-09-02T10:46:54,644 [log-pipeline-sink-worker-2-thread-1] WARN org.opensearch.dataprepper.plugins.server.CreateServer - Creating otel_logs_source without SSL/TLS. This is not secure.
2025-09-02 16:16:54 2025-09-02T10:46:54,645 [log-pipeline-sink-worker-2-thread-1] WARN org.opensearch.dataprepper.plugins.server.CreateServer - In order to set up TLS for the otel_logs_source, go here: https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/otel-trace-source#ssl
2025-09-02 16:16:54 2025-09-02T10:46:54,885 [log-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.plugins.source.otellogs.OTelLogsSource - Started otel_logs_source...
There are no logs from the pipeline either.
curl --request POST \
  --url http://localhost:21892/events/ingest \
  --header 'Content-Type: application/json' \
  --data '{
    "resourceLogs": [
      {
        "resource": {},
        "scopeLogs": [
          {
            "scope": {},
            "logRecords": [
              {
                "timeUnixNano": "1756457718527272600",
                "body": { "stringValue": "0" },
                "attributes": [
                  { "key": "DeviceName", "value": { "stringValue": "mitsdevice" } },
                  { "key": "TenantId", "value": { "stringValue": "-99" } },
                  { "key": "EndpointId", "value": { "stringValue": "htut89" } },
                  { "key": "AgentVersion", "value": { "stringValue": "1.0.0.0" } },
                  { "key": "ExternalIp", "value": { "stringValue": "100.0.0.1" } },
                  { "key": "LocalIp", "value": { "stringValue": "100.0.0.1" } },
                  { "key": "Image", "value": { "stringValue": "C:/Program Files/PostgreSQL/17/bin/postgres.exe" } },
                  { "key": "ParentProcessId", "value": { "intValue": "6148" } },
                  { "key": "CommandLine", "value": { "stringValue": "\"C:/Program Files/PostgreSQL/17/bin/postgres.exe\" --forkchild=\"autovacuum worker\" 1564" } },
                  { "key": "ParentImage", "value": { "stringValue": "C:\\Program Files\\PostgreSQL\\17\\bin\\postgres.exe" } },
                  { "key": "ParentCommandLine", "value": { "stringValue": "\"C:\\Program Files\\PostgreSQL\\17\\bin\\postgres.exe\" -D \"C:\\Program Files\\PostgreSQL\\17\\data\" " } },
                  { "key": "IntegrityLevel", "value": { "stringValue": "" } },
                  { "key": "UserName", "value": { "stringValue": "" } },
                  { "key": "Hashes", "value": { "stringValue": "1B17FE58BDAAB5BC62A2410771CFBF61" } },
                  { "key": "FileVersion", "value": { "stringValue": "17.2" } },
                  { "key": "Description", "value": { "stringValue": "PostgreSQL Server" } },
                  { "key": "Product", "value": { "stringValue": "PostgreSQL" } },
                  { "key": "Company", "value": { "stringValue": "PostgreSQL Global Development Group" } },
                  { "key": "ProviderName", "value": { "stringValue": "Windows Kernel" } },
                  { "key": "ParentUser", "value": { "stringValue": "NT AUTHORITY\\NETWORK SERVICE" } },
                  { "key": "CurrentDirectory", "value": {} },
                  { "key": "OriginalFileName", "value": { "stringValue": "postgres.exe" } },
                  { "key": "LogonId", "value": { "stringValue": "" } },
                  { "key": "ThreadId", "value": { "intValue": "0" } },
                  { "key": "EventUuId", "value": { "stringValue": "b0749eac-290b-473d-bcca-9bd6401453e3" } },
                  { "key": "traceId", "value": { "stringValue": "4fd0c4f0e3f64b7d9b8c2c4a9d2a1fcd" } },
                  { "key": "spanId", "value": { "stringValue": "6a7c12345de34c92" } }
                ],
                "traceId": "4fd0c4f0e3f64b7d9b8c2c4a9d2a1fcd",
                "spanId": "6a7c12345de34c92"
              }
            ]
          }
        ]
      }
    ]
  }'
With Content-Type: application/json, the curl request fails with "Missing or invalid Content-Type header". If I send the same request with Content-Type: application/grpc, I get a 200, but I still see no logs in the pipeline and no data in the index.
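From the otel-logs-source README, it looks like the source speaks gRPC by default and plain HTTP/JSON posts are only accepted when unframed requests are enabled, so I suspect my source section may need something like the sketch below (untested; the `unframed_requests` flag is taken from the plugin documentation, and I'm assuming it combines with `path` this way):

```yaml
log-pipeline:
  source:
    otel_logs_source:
      path: /events/ingest
      ssl: false
      # Assumption: accept unframed HTTP/JSON requests in addition to gRPC.
      unframed_requests: true
```

If that's the right knob, it would explain why the JSON curl request is rejected while the application/grpc one is not.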
I also tried grpcurl:
grpcurl -plaintext \
  -proto opentelemetry/proto/collector/logs/v1/logs_service.proto \
  -proto opentelemetry/proto/logs/v1/logs.proto \
  -proto opentelemetry/proto/common/v1/common.proto \
  -proto opentelemetry/proto/resource/v1/resource.proto \
  -d '{
    "resourceLogs": [{
      "resource": {
        "attributes": [{
          "key": "service.name",
          "value": { "stringValue": "test-service" }
        }]
      },
      "scopeLogs": [{
        "logRecords": [{
          "timeUnixNano": "1756457718527272600",
          "body": { "stringValue": "Hello from grpcurl!" },
          "attributes": [{
            "key": "DeviceName",
            "value": { "stringValue": "mitsdevice" }
          }]
        }]
      }]
    }]
  }' \
  localhost:21892 opentelemetry.proto.collector.logs.v1.LogsService/Export
ERROR:
Code: Unimplemented
Message: unexpected HTTP status code received from server: 404 (Not Found); transport: received unexpected content-type "text/plain; charset=utf-8"
I’m using the otel_logs_source plugin to ingest logs via an AWS pipeline, where I’m able to send data in JSON format. Now I want to replicate that setup locally and need to confirm whether JSON-formatted logs can be sent locally as well. I also need to understand the structure of the incoming JSON once it’s received by the otel_logs_source plugin; I’m having trouble configuring processors because the schema isn’t clear to me. Could someone share an example or documentation showing how the incoming JSON looks after ingestion via otel_logs_source, or guide me on how to inspect it?
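In the meantime, my plan for inspecting the event structure is to add a stdout sink next to the opensearch sink so that each processed event is printed to the container logs as Data Prepper sees it (a sketch, assuming the stdout sink plugin works this way; the `log.attributes.*` / `resource.attributes.*` key names are the ones my processors already reference):

```yaml
  sink:
    # Print each processed event to the container logs for inspection.
    - stdout:
    - opensearch:
        hosts:
          - "https://opensearch-node1:9200"
        index: "data-prepper-0001"
        username: admin
        password: Aravinth@31
        insecure: true
```

Is this a reasonable way to see the post-ingestion schema, or is there a better-documented reference for it?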