com.amazon.dataprepper.pipeline.ProcessWorker - raw-pipeline Worker: No records received from buffer

My data collector is up and running, but I don't see any traces being received from the collector in Data Prepper. Both the collector and Data Prepper are up and running. How can I debug this issue?

Here is my otel-localconfig.yml:

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:55681

processors:
  batch:

exporters:
  otlp/data-prepper:
    endpoint: 0.0.0.0:21890
    insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/data-prepper]
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:55681

processors:
  batch:

exporters:
  otlp/data-prepper:
    endpoint: 0.0.0.0:21890
    insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/data-prepper] 

Here is the pipeline.yml:

entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        insecure: true
        username: admin
        password: admin
        trace_analytics_raw: true
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        insecure: true
        username: admin
        password: admin
        trace_analytics_service_map: true

Here is the data-prepper-config.yml:

ssl: false

I have tried changing the sink to stdout and the result is still the same. It looks like the collector is not sending data to Data Prepper.
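For reference, the stdout attempt replaced the opensearch sink in raw-pipeline with Data Prepper's stdout sink, roughly like this (a sketch only, not the exact file):

raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
  sink:
    - stdout: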
Here is the log output from the collector:

Value: 38.000000
Metric #16
Descriptor:
     -> Name: scrape_series_added
     -> Description: The approximate number of new series in this scrape
     -> Unit:
     -> DataType: Gauge
NumberDataPoints #0
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.3 +0000 UTC
Value: 38.000000

2022-03-14T10:27:40.050Z        INFO    loggingexporter/logging_exporter.go:56  MetricsExporter {"#metrics": 5}
2022-03-14T10:27:40.051Z        DEBUG   loggingexporter/logging_exporter.go:66  ResourceMetrics #0
Resource labels:
     -> host.arch: STRING(amd64)
     -> host.name: STRING(DESKTOP-GK9AKCD)
     -> os.description: STRING(Windows 10 10.0)
     -> os.type: STRING(windows)
     -> process.command_line: STRING(C:\Program Files\Java\jdk-11.0.9;bin;java.exe -Xms128m -Xmx600m -javaagent:C:\Users\AnjanaAsok\Downloads\opentelemetry-javaagent-all.jar -Dotel.traces.exporter=otlp -Dotel.metrics.exporter=otlp -Dotel.exporter.otlp.endpoint=http://localhost:4317 -Dotel.javaagent.debug=true -Dfile.encoding=Cp1252)
     -> process.executable.path: STRING(C:\Program Files\Java\jdk-11.0.9;bin;java.exe)
     -> process.pid: INT(25400)
     -> process.runtime.description: STRING(Oracle Corporation Java HotSpot(TM) 64-Bit Server VM 11.0.9+7-LTS)
     -> process.runtime.name: STRING(Java(TM) SE Runtime Environment)
     -> process.runtime.version: STRING(11.0.9+7-LTS)
     -> service.name: STRING(unknown_service:java)
     -> telemetry.auto.version: STRING(1.6.2)
     -> telemetry.sdk.language: STRING(java)
     -> telemetry.sdk.name: STRING(opentelemetry)
     -> telemetry.sdk.version: STRING(1.6.0)
InstrumentationLibraryMetrics #0
InstrumentationLibrary io.opentelemetry.sdk.trace
Metric #0
Descriptor:
     -> Name: queueSize
     -> Description: The number of spans queued
     -> Unit: 1
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> spanProcessorType: STRING(BatchSpanProcessor)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0952864 +0000 UTC
InstrumentationLibraryMetrics #1
InstrumentationLibrary io.opentelemetry.javaagent.shaded.instrumentation.runtimemetrics.GarbageCollector
Metric #0
Descriptor:
     -> Name: runtime.jvm.gc.count
     -> Description: The number of collections that have occurred for a given JVM garbage collector.
     -> Unit: collections
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> gc: STRING(G1 Old Generation)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0942868 +0000 UTC
NumberDataPoints #1
Data point attributes:
     -> gc: STRING(G1 Young Generation)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0942868 +0000 UTC
Value: 5
Metric #1
Descriptor:
     -> Name: runtime.jvm.gc.time
     -> Description: Time spent in a given JVM garbage collector in milliseconds.
     -> Unit: ms
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> gc: STRING(G1 Old Generation)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0942868 +0000 UTC
NumberDataPoints #1
Data point attributes:
     -> gc: STRING(G1 Young Generation)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0942868 +0000 UTC
Value: 28
InstrumentationLibraryMetrics #2
InstrumentationLibrary io.opentelemetry.javaagent.shaded.instrumentation.runtimemetrics.MemoryPools
Metric #0
Descriptor:
     -> Name: runtime.jvm.memory.pool
     -> Description: Bytes of a given JVM memory pool.
     -> Unit: By
     -> DataType: Gauge
NumberDataPoints #0
Data point attributes:
     -> pool: STRING(CodeHeap 'profiled nmethods')
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 9371648
NumberDataPoints #1
Data point attributes:
     -> pool: STRING(Compressed Class Space)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 4194304
NumberDataPoints #2
Data point attributes:
     -> pool: STRING(G1 Old Gen)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 9418352
NumberDataPoints #3
Data point attributes:
     -> pool: STRING(G1 Eden Space)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 13631488
NumberDataPoints #4
Data point attributes:
     -> pool: STRING(CodeHeap 'non-profiled nmethods')
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 2555904
NumberDataPoints #5
Data point attributes:
     -> pool: STRING(CodeHeap 'profiled nmethods')
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 9355648
NumberDataPoints #6
Data point attributes:
     -> pool: STRING(Compressed Class Space)
     -> type: STRING(max)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 1073741824
NumberDataPoints #7
Data point attributes:
     -> pool: STRING(G1 Eden Space)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 77594624
NumberDataPoints #8
Data point attributes:
     -> pool: STRING(CodeHeap 'non-nmethods')
     -> type: STRING(max)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 5898240
NumberDataPoints #9
Data point attributes:
     -> pool: STRING(Compressed Class Space)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 3960272
NumberDataPoints #10
Data point attributes:
     -> pool: STRING(CodeHeap 'non-nmethods')
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 1198592
NumberDataPoints #11
Data point attributes:
     -> pool: STRING(CodeHeap 'profiled nmethods')
     -> type: STRING(max)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 122880000
NumberDataPoints #12
Data point attributes:
     -> pool: STRING(CodeHeap 'non-profiled nmethods')
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 2414080
NumberDataPoints #13
Data point attributes:
     -> pool: STRING(G1 Survivor Space)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 6291456
NumberDataPoints #14
Data point attributes:
     -> pool: STRING(G1 Old Gen)
     -> type: STRING(max)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 629145600
NumberDataPoints #15
Data point attributes:
     -> pool: STRING(CodeHeap 'non-profiled nmethods')
     -> type: STRING(max)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 122880000
NumberDataPoints #16
Data point attributes:
     -> pool: STRING(Metaspace)
     -> type: STRING(used)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 31779184
NumberDataPoints #17
Data point attributes:
     -> pool: STRING(G1 Old Gen)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 50331648
NumberDataPoints #18
Data point attributes:
     -> pool: STRING(CodeHeap 'non-nmethods')
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 2555904
NumberDataPoints #19
Data point attributes:
     -> pool: STRING(G1 Survivor Space)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 6291456
NumberDataPoints #20
Data point attributes:
     -> pool: STRING(Metaspace)
     -> type: STRING(committed)
StartTimestamp: 1970-01-01 00:00:00 +0000 UTC
Timestamp: 2022-03-14 10:27:39.0872878 +0000 UTC
Value: 32423936
Metric #1
Descriptor:
     -> Name: runtime.jvm.memory.area
     -> Description: Bytes of a given JVM memory area.
     -> Unit: By

Any help is much appreciated. Thanks!

Hi @ammujgd,

I am a little confused by your otel-localconfig.yml. It appears that you have duplicate service, processors, receivers, and exporters sections. Is this just a copy-paste error?

Additionally, how are you running both data prepper and the otel-collector? Through Docker? What commands are you using to run these? I also want to confirm that you are trying to send OpenTelemetry trace data to Data Prepper, and not OpenTelemetry metric data.

Another thing that would be helpful is to get some logs from the otel-collector. You can do this by adding the logging exporter to the configuration like this:

exporters:
  otlp/data-prepper:
    endpoint: 0.0.0.0:21890
    insecure: true
  logging:

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, otlp/data-prepper]

The otel collector config had a copy-paste error. Here is the updated one:

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:55681

processors:
  batch:

exporters:
  otlp/data-prepper:
    endpoint: 0.0.0.0:21890
    insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/data-prepper] 

Copying your questions here:

Additionally, how are you running both data prepper and the otel-collector? Through Docker?

I am starting Data Prepper using the command below:

docker run --expose 21890 -v '/c/program files/docker/cli-plugins/pipeline.yaml:/usr/share/data-prepper/pipelines.yaml' -v '/c/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' --network "host" opensearchproject/data-prepper:latest

I am starting the otel-collector using this command:

docker run --rm -p 13133:13133 -p 14250:14250 -p 14268:14268 -p 55678-55679:55678-55679 -p 4317:4317 -p 8888:8888 -p 9411:9411  --name otelcol99999 otel/opentelemetry-collector 

I am sending the OpenTelemetry trace data from Java (from my Eclipse workspace) with the following JVM flags:

-Dotel.traces.exporter=otlp
-Dotel.metrics.exporter=otlp
-Dotel.exporter.otlp.endpoint=http://localhost:4317
-Dotel.javaagent.debug=true
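
For completeness, these flags are passed on the JVM command line together with the Java agent, roughly like this (the agent jar path is the one visible in the collector log above; the application jar name is a placeholder):

java -javaagent:C:\Users\AnjanaAsok\Downloads\opentelemetry-javaagent-all.jar -Dotel.traces.exporter=otlp -Dotel.metrics.exporter=otlp -Dotel.exporter.otlp.endpoint=http://localhost:4317 -Dotel.javaagent.debug=true -jar my-app.jar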

Hi, does anybody have any updates on this? Thanks.

@ammujgd This issue is likely directly related to the other issues you have created. I recommend reworking your Docker networking scheme away from using --network "host".
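
For reference, here is a minimal sketch of one way to do that with a user-defined bridge network instead of --network "host" (the container names, host file paths, and the config mount location inside the collector image are assumptions, not a verified setup):

docker network create otel-net

docker run --name data-prepper --network otel-net -p 21890:21890 -v '/c/program files/docker/cli-plugins/pipeline.yaml:/usr/share/data-prepper/pipelines.yaml' -v '/c/program files/docker/cli-plugins/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml' opensearchproject/data-prepper:latest

docker run --rm --name otelcol --network otel-net -p 4317:4317 -v '/c/program files/docker/cli-plugins/otel-localconfig.yml:/etc/otelcol/config.yaml' otel/opentelemetry-collector --config /etc/otelcol/config.yaml

With this setup, the otlp/data-prepper exporter endpoint in otel-localconfig.yml would point at the Data Prepper container by name rather than 0.0.0.0, e.g. endpoint: data-prepper:21890.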