Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
opensearch 2.3.0
Describe the issue:
Hi team,
I'm using the logstash-input-s3-sns-sqs plugin to read logs from S3 via SQS, but it throws the exception below, which I haven't been able to figure out.
Logstash runs normally in some containers, but in other containers the process is killed with the error message below.
The error arises while the plugin is registering the SQS queues during Logstash startup.
Timeout exception:

```
Pipeline error {:pipeline_id=>"main", :exception=>#<Seahorse::Client::NetworkingError: execution expired>, :backtrace=>[
"org/jruby/ext/socket/RubyTCPSocket.java:134:in `initialize'",
"org/jruby/RubyClass.java:911:in `new'",
"org/jruby/RubyIO.java:1146:in `open'",
"/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:951:in `block in connect'",
"org/jruby/ext/timeout/Timeout.java:118:in `timeout'",
"org/jruby/ext/timeout/Timeout.java:92:in `timeout'",
```
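`Seahorse::Client::NetworkingError: execution expired` is raised by the AWS SDK for Ruby when the TCP connection to the AWS endpoint times out, so the failing containers most likely cannot reach SQS over the network at all. As a quick sanity check (a sketch, assuming the standard `sqs.<region>.amazonaws.com` endpoint name for ap-southeast-2), something like this can be run inside a failing container:

```shell
# Can this container even open an HTTPS connection to the regional SQS
# endpoint? An unauthenticated request is fine here: we only care whether
# the TCP/TLS connection succeeds before the 5-second deadline.
if curl -sS --max-time 5 -o /dev/null https://sqs.ap-southeast-2.amazonaws.com/; then
  echo "SQS endpoint reachable"
else
  echo "SQS endpoint unreachable - check security groups, NACLs, VPC endpoints, or proxy settings"
fi
```

If this fails only in the containers where Logstash dies, the problem is the container's network path to AWS (routing, VPC endpoint, firewall, or proxy configuration) rather than the plugin configuration itself.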
Configuration:
Here's the sample input configuration:
```
s3snssqs {
  region => "ap-southeast-2"
  from_sns => false
  consumer_threads => 2
  s3_default_options => { "endpoint_discovery" => true }
  queue => "backup-log-messages-queue"
  #access_key_id => "###############"
  #secret_access_key => "#########"
  role_arn => "arn:aws:iamrole"
  type => "sqs-logs"
  tags => ["backup-log"]
  sqs_skip_delete => false
  #codec => json
  sqs_wait_time_seconds => 20
  s3_options_by_bucket => [
    {
      bucket_name => "backup-logs"
      folders => [
        {
          key => ".*/backup.*"
          codec => "json_stream"
          type => "backuplogs"
        }
      ]
    }
  ]
}
```
Can you please help me understand why it says "execution expired" and throws a timeout error?