Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
opensearch-2.6.0-linux-x64.rpm
logstash-oss-with-opensearch-output-plugin-8.6.1-linux-x64.tar.gz
filebeat-8.6.2-x86_64.rpm
Redis 5.0.3
Red Hat Enterprise Linux release 8.9 (Ootpa)
ruby 2.5.9p229 (2021-04-05 revision 67939) [x86_64-linux]
Describe the issue:
Logstash had been ingesting logs into OpenSearch without any issues for more than a year, but immediately after OS patching we started seeing the error below:
[2024-01-30T12:35:31,071][ERROR][logstash.javapipeline ][applogs] Pipeline error {:pipeline_id=>"applogs", :exception=>#<ArgumentError: invalid byte sequence in UTF-8>, :backtrace=>["org/jruby/RubyRegexp.java:1145:in `=~'", "org/jruby/RubyString.java:1684:in `=~'", "/var/lib/data/logstash/vendor/bundle/jruby/2.6.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:72:in `block in add_patterns_from_file'", "org/jruby/RubyIO.java:3533:in `each'", "/var/lib/data/logstash/vendor/bundle/jruby/2.6.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:70:in `add_patterns_from_file'", "/var/lib/data/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-grok-4.4.3/lib/logstash/filters/grok.rb:471:in `block in add_patterns_from_files'", "org/jruby/RubyArray.java:1865:in `each'", "/var/lib/data/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-grok-4.4.3/lib/logstash/filters/grok.rb:467:in `add_patterns_from_files'", "/var/lib/data/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-grok-4.4.3/lib/logstash/filters/grok.rb:280:in `block in register'", "org/jruby/RubyArray.java:1865:in `each'", "/var/lib/data/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-grok-4.4.3/lib/logstash/filters/grok.rb:276:in `block in register'", "org/jruby/RubyHash.java:1519:in `each'", "/var/lib/data/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-grok-4.4.3/lib/logstash/filters/grok.rb:271:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/var/lib/data/logstash/logstash-core/lib/logstash/java_pipeline.rb:234:in `block in register_plugins'", "org/jruby/RubyArray.java:1865:in `each'", "/var/lib/data/logstash/logstash-core/lib/logstash/java_pipeline.rb:233:in `register_plugins'", "/var/lib/data/logstash/logstash-core/lib/logstash/java_pipeline.rb:601:in `maybe_setup_out_plugins'", "/var/lib/data/logstash/logstash-core/lib/logstash/java_pipeline.rb:246:in `start_workers'", "/var/lib/data/logstash/logstash-core/lib/logstash/java_pipeline.rb:191:in `run'", "/var/lib/data/logstash/logstash-core/lib/logstash/java_pipeline.rb:143:in `block in start'"], "pipeline.sources"=>["/var/lib/data/logstash/config/pipeline_app.conf"], :thread=>"#<Thread:0x24c1040e@/var/lib/data/logstash/logstash-core/lib/logstash/java_pipeline.rb:131 run>"}
[2024-01-30T12:35:31,072][INFO ][logstash.javapipeline ][applogs] Pipeline terminated {"pipeline.id"=>"applogs"}
[2024-01-30T12:35:31,080][INFO ][logstash.outputs.opensearch][applogs] Using a default mapping template {:version=>2, :ecs_compatibility=>:v8}
[2024-01-30T12:35:31,081][ERROR][logstash.agent ] Failed to execute action {:id=>:applogs, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
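Note that the backtrace fails inside jls-grok's `add_patterns_from_file`, i.e. while the grok filter is *loading* a patterns file at pipeline startup, not while matching events. A minimal sketch to locate the offending line(s) — the `patterns_dir` path is an assumption, point it at your own patterns directory or the gem's bundled patterns:

```ruby
# Report lines that fail the same UTF-8 validity check the grok
# pattern loader trips over (Regexp#=~ on an invalid byte sequence).
def invalid_utf8_lines(path)
  bad = []
  File.open(path, "rb") do |f|                 # binary read: no transcoding
    f.each_line.with_index(1) do |line, lineno|
      bad << lineno unless line.force_encoding("UTF-8").valid_encoding?
    end
  end
  bad
end

# Example usage — this directory is an assumption; adjust to your install.
patterns_dir = "/var/lib/data/logstash/patterns"
Dir.glob(File.join(patterns_dir, "*")).each do |f|
  lines = invalid_utf8_lines(f)
  puts "#{f}: invalid UTF-8 on line(s) #{lines.join(', ')}" unless lines.empty?
end
```

Any file this flags would reproduce the `ArgumentError: invalid byte sequence in UTF-8` the pipeline logs at startup.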
Configuration:
Logstash pipelines.yml
- pipeline.id: applogs
  path.config: "/var/lib/data/logstash/config/pipeline_app.conf"
- pipeline.id: dblogs
  path.config: "/var/lib/data/logstash/config/pipeline_db.conf"
Logstash pipeline_app.conf
input {
  redis {
    host => "127.0.0.1"
    port => '6379'
    data_type => "list"
    key => "redisqueue"
    password => "filebeat123"
  }
}
filter {
  if ([fields][logtype] == "UNITS") {
    grok {
      tag_on_failure => ["_field_not_found"]
      match => { "message" => "\s{1,2}(?m)%{DATESTAMP:logtimestamp}%{GREEDYDATA}" }
    }
    date {
      match => ["logtimestamp", "dd.MM.yyyy HH:mm:ss.SSS"]
      target => "@timestamp"
    }
  }
  else if ([fields][logtype] == "PORTS") {
    grok {
      tag_on_failure => ["_field_not_found"]
      match => { "message" => "\s{1,2}%{DATESTAMP:logtimestamp}%{GREEDYDATA}" }
    }
    date {
      match => ["logtimestamp", "dd.MM.yyyy HH:mm:ss.SSS"]
      target => "@timestamp"
    }
  }
  else if ([fields][logtype] == "UNITHOST") {
    grok {
      tag_on_failure => ["_field_not_found"]
      match => { "message" => "\s{1,2}(?m)%{DATESTAMP:logtimestamp}%{GREEDYDATA}" }
    }
    date {
      match => ["logtimestamp", "dd.MM.yyyy HH:mm:ss.SSS"]
      target => "@timestamp"
    }
  }
  if "NORIS" in [tags] {
    mutate {
      add_field => { "DataCenter" => "NORIS" }
    }
  }
  if "EIP" in [tags] {
    mutate {
      add_field => { "DataCenter" => "EIP" }
    }
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
  if "FIMI_PORT_LOG" in [tags] and "CDE" in [tags] {
    opensearch {
      hosts => ["http://OPENSEARCH_LB_IP:PORT"]
      index => "two-fimiport-logs-%{+YYYY.MM.dd}"
      user => "admin"
      password => "PASSWORD"
    }
  }
  if "FIMI_UNIT_LOG" in [tags] and "CDE" in [tags] {
    opensearch {
      hosts => ["http://OPENSEARCH_LB_IP:PORT"]
      index => "two-fimiunit-logs-%{+YYYY.MM.dd}"
      user => "admin"
      password => "PASSWORD"
    }
  }
  if "FIMI_UNIT_HOST_LOG" in [tags] and "CDE" in [tags] {
    opensearch {
      hosts => ["http://OPENSEARCH_LB_IP:PORT"]
      index => "two-fimiunithost-logs-%{+YYYY.MM.dd}"
      user => "admin"
      password => "PASSWORD"
    }
  }
}
Relevant Logs or Screenshots: