Nginx module does not parse logs

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):

2.9.0

Describe the issue:

I found that when using Filebeat OSS 7.12.1, the Nginx module does not parse the logs. What could be causing this, and how can I fix it? Thank you.
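For reference, this is roughly how I have the module set up (a minimal sketch; the log paths are assumptions for a default package install):

# /etc/filebeat/modules.d/nginx.yml, after running: filebeat modules enable nginx
- module: nginx
  # Parse the access log
  access:
    enabled: true
    var.paths: ["/var/log/nginx/access.log*"]
  # Parse the error log
  error:
    enabled: true
    var.paths: ["/var/log/nginx/error.log*"]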

Hey @Kruzerson

Correct me if I’m wrong, but are you using Filebeat to collect Nginx logs with the nginx module and send them to OpenSearch?

If so, another option would be to put Data Prepper or Logstash in front of OpenSearch and not worry about the nginx module. Just an idea.

There is a good example of Logstash parsing Nginx logs linked here.
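For the default Nginx “combined” access log format, a Logstash filter along these lines should get you started (a minimal sketch, not tested against your logs):

filter {
  # Default Nginx access logs use the Apache "combined" format,
  # so the stock COMBINEDAPACHELOG grok pattern can parse them.
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # Use the parsed request time as the event @timestamp.
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    remove_field => [ "timestamp" ]
  }
}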

Here is an example below of a Logstash syslog input with grok. I also had to use a key-value (kv) filter; the grok filter you see extracts fields from each log line using a regular expression.

Log_shipper --> Logstash --> Opensearch

input {
  syslog {
    port => 5544
    tags => [ 'syslog' ]
  }
}

filter {
  if [@metadata][input-http] {
    date {
      match => [ "date", "UNIX" ]
      remove_field => [ "date" ]
    }
    mutate {
      remove_field => ["headers","host"]
    }
  }
}


filter {
  if "syslog" in [tags] {
    grok {
      match => ["message", "%{SYSLOG5424PRI}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
    }
    kv {
      source => "message"
      value_split => "="
    }
  }
}
output {
  if "syslog" in [tags] {
    opensearch {
      hosts => ["https://opensearch-node1.net:9200"]
      auth_type => {
        type => 'basic'
        user => 'admin'
        password => 'changeit'
      }
      ecs_compatibility => disabled
      ssl => true
      ssl_certificate_verification => false
      cacert => "/opt/logstash-8.6.1/root-ca.pem"
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    }
  }
}

As for Filebeat, I haven’t had an issue parsing the logs, and again, I used Logstash in front of OpenSearch.

Are you receiving any logs in OpenSearch, or does Filebeat not send them at all? If not, is there anything in the log files on the host where Filebeat is running that might state what the issue is? Some more info on what is going on would help.

Hello @Gsmitt.

Correct me if I’m wrong, but are you using Filebeat to collect Nginx logs with the nginx module and send them to OpenSearch?

So I understand that I can use an additional layer to process the logs. It’s just that I have an old cluster built on ELK with the free features, and there the logs are parsed natively. So I was surprised, because Filebeat OSS does include the Nginx module.

As for Filebeat, I haven’t had an issue parsing the logs, and again, I used Logstash in front of OpenSearch.

Do you use a single Logstash instance, or do you run it in a cluster or Docker Swarm? I’m using Docker Swarm, and for some reason the load balancing doesn’t work properly.

Are you receiving any logs in OpenSearch, or does Filebeat not send them at all? If not, is there anything in the log files on the host where Filebeat is running that might state what the issue is? Some more info on what is going on would help.

I do get the logs, but the message is not parsed. Yes, I have ready-made filters for Logstash, but if the standard tools can do the job, I think it’s worth using them.

Hey @Kruzerson

Just a heads up: eventually you’re going to need to pick a side or update :wink: Elastic does have licensing on their software, so you may run into a situation where some of their software no longer works with OpenSearch.

I don’t use Docker/K8s, just virtual machine(s). Logstash sits on the same VM as OpenSearch, and yes, it works in a cluster. I just started working with Data Prepper and found it just as useful as Logstash.

Yes, I concur. Over the past few years I’ve found that software changes and modules break on new releases and/or are no longer maintained. For production I try to avoid additional modules as much as possible, but that’s just me though.

Just a heads up: eventually you’re going to need to pick a side or update :wink: Elastic does have licensing on their software, so you may run into a situation where some of their software no longer works with OpenSearch.

Well, the decision has already been made :wink:

Yes, I concur. Over the past few years I’ve found that software changes and modules break on new releases and/or are no longer maintained. For production I try to avoid additional modules as much as possible, but that’s just me though.

Yes, I will eventually use Logstash, but I need to figure out how to balance the load.
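One thing I might try is letting Filebeat balance across several Logstash instances itself rather than relying on Swarm; something like this (just a sketch, the hostnames are placeholders):

# filebeat.yml: ship to several Logstash instances and spread events across them
output.logstash:
  hosts: ["logstash1:5044", "logstash2:5044", "logstash3:5044"]
  loadbalance: true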

