Hi,

I am trying to run OpenSearch and Logstash in Docker, but I always get an error.
The content of my docker-compose file is as follows:

version: '3'
services:
  opensearch-node1:
    image: opensearchproject/opensearch:latest
    container_name: opensearch-node1
    environment:
      - cluster.name=opensearch-cluster
      - node.name=opensearch-node1
      - discovery.seed_hosts=opensearch-node1,opensearch-node2
      - cluster.initial_master_nodes=opensearch-node1,opensearch-node2
      - bootstrap.memory_lock=true # along with the memlock settings below, disables swapping
      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m" # minimum and maximum Java heap size, recommend setting both to 50% of system RAM
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536 # maximum number of open files for the OpenSearch user, set to at least 65536 on modern systems
        hard: 65536
    volumes:
      - opensearch-data1:/usr/share/opensearch/data
    ports:
      - 9200:9200
      - 9600:9600 # required for Performance Analyzer
    networks:
      - opensearch-net
  opensearch-node2:
    image: opensearchproject/opensearch:latest
    container_name: opensearch-node2
    environment:
      - cluster.name=opensearch-cluster
      - node.name=opensearch-node2
      - discovery.seed_hosts=opensearch-node1,opensearch-node2
      - cluster.initial_master_nodes=opensearch-node1,opensearch-node2
      - bootstrap.memory_lock=true
      - "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 65536
        hard: 65536
    volumes:
      - opensearch-data2:/usr/share/opensearch/data
    networks:
      - opensearch-net
  opensearch-dashboards:
    image: opensearchproject/opensearch-dashboards:latest
    container_name: opensearch-dashboards
    ports:
      - 5601:5601
    expose:
      - "5601"
    environment:
      OPENSEARCH_HOSTS: '["https://opensearch-node1:9200","https://opensearch-node2:9200"]'
    networks:
      - opensearch-net
  logstash-oss:
    image: opensearchproject/logstash-oss-with-opensearch-output-plugin:latest
    container_name: logstash
    volumes:
      - ./logstash-config/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      - ./session_data:/sample_data
    ports:
      - "8080:8080"
    networks:
      - opensearch-net
volumes:
  opensearch-data1:
  opensearch-data2:
networks:
  opensearch-net:

Also, my Logstash pipeline config is simple and looks like this:

input {
    file {
        start_position => "beginning"
        path => " C:/Users/user_id/docker_test/logstash-config/sessions_data.json"

        sincedb_path => "/dev/null"     
    }
}
filter {
    json{
        source => "message"
    }
}
output {
    stdout{}
}

When I run my docker-compose file, I get this error:

Pipeline error {:pipeline_id=>"main", :exceptir_test/logstash-config/sessions_data.json>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-input-file-4.4.3/lib/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-input-file-4.4.3/lib/logstash/inputs/file.rb:284:in `register'", "/usr/share/logRubyArray.java:1865:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:232:in `register_plugins'", "/usr/share/logstash-core/lib/logstash/java_pipeline.rb:316:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:190:in , "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x239d6b0 run>"}

Could someone please help?

Hi @MKH,
I’m not sure whether Logstash has a problem with the leading space at the beginning of your path.
But I am more interested in why you are specifying a Windows path in the first place.
I’m confused because the Logstash and OpenSearch containers both run Linux (the OpenSearch images are based on Amazon Linux), so even if you mounted an SMB share, the path would still look like /var/log/docker_test/… .
Is it possible that you are running Docker on WSL 2?
If so, you would need to change the path to something like /mnt/c/Users/user_id/docker_test/logstash-config/sessions_data.json.
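
Alternatively, since your compose file already mounts ./session_data into the Logstash container at /sample_data, you could avoid host paths in the pipeline altogether: put sessions_data.json into ./session_data on the host and point the file input at the container-side path. A minimal sketch, assuming the file then shows up at /sample_data/sessions_data.json inside the container:

input {
    file {
        start_position => "beginning"
        path => "/sample_data/sessions_data.json"
        sincedb_path => "/dev/null"
    }
}
filter {
    json {
        source => "message"
    }
}
output {
    stdout {}
}

That way the path Logstash sees is always a Linux path inside its own container, and it no longer matters whether Docker runs on Windows, WSL 2, or native Linux.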