I am trying to ingest data from an S3 bucket into OpenSearch Dashboards using Logstash.

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):
-bash-4.2# /opt/logstash-8.13.4/bin/logstash -f /opt/logstash-8.13.4/config/pipelines.yml
Using bundled JDK: /opt/logstash-8.13.4/jdk
/opt/logstash-8.13.4/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
/opt/logstash-8.13.4/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
Sending Logstash logs to /opt/logstash-8.13.4/logs which is now configured via log4j2.properties
[2024-05-15T15:12:10,417][INFO ][logstash.runner ] Log4j configuration path used is: /opt/logstash-8.13.4/config/log4j2.properties
[2024-05-15T15:12:10,419][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.13.4", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.11+9 on 17.0.11+9 +indy +jit [x86_64-linux]"}
[2024-05-15T15:12:10,422][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-05-15T15:12:10,424][INFO ][logstash.runner ] Jackson default value override logstash.jackson.stream-read-constraints.max-string-length configured to 200000000
[2024-05-15T15:12:10,424][INFO ][logstash.runner ] Jackson default value override logstash.jackson.stream-read-constraints.max-number-length configured to 10000
[2024-05-15T15:12:10,626][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-05-15T15:12:10,635][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[2024-05-15T15:12:10,639][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:808) ~[jruby.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:767) ~[jruby.jar:?]
at opt.logstash_minus_8_dot_13_dot_4.lib.bootstrap.environment.(/opt/logstash-8.13.4/lib/bootstrap/environment.rb:90) ~[?:?]
-bash-4.2#

Describe the issue:

Configuration:

Relevant Logs or Screenshots:
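For reference, a minimal S3-to-OpenSearch pipeline sketch. The bucket name, region, endpoint, index name, and credentials below are placeholders, not values from the original post, and the `opensearch` output assumes the `logstash-output-opensearch` plugin is installed:

```conf
input {
  s3 {
    bucket => "my-example-bucket"   # placeholder bucket name
    region => "us-east-1"           # placeholder region
    # credentials may also come from the environment or an instance profile
  }
}

output {
  opensearch {
    hosts    => ["https://localhost:9200"]   # placeholder OpenSearch endpoint
    index    => "s3-logs-%{+YYYY.MM.dd}"     # placeholder index pattern
    user     => "admin"                      # placeholder credentials
    password => "admin"
    ssl_certificate_verification => false    # only for testing with self-signed certs
  }
}
```

A pipeline file like this is what the `-f` flag expects; `pipelines.yml` is read automatically from the config directory and should not be passed with `-f` (the WARN line in the log above shows it being ignored for that reason).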

Hi @rathiga ,

Please check whether the Logstash service is already running on your server by executing the following command:

systemctl status logstash
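If an instance is already running, the FATAL "another instance using the configured data directory" error above is expected. A sketch of the usual follow-up steps, assuming a systemd install and the default paths from the log (the pipeline file name is a hypothetical placeholder):

```shell
# Look for any Logstash process that may be holding the data directory
systemctl status logstash
pgrep -af logstash

# Option 1: stop the service, then retry the manual run
systemctl stop logstash

# Option 2: give the manual run its own data directory so both can coexist
# (note: -f expects a pipeline config file, not pipelines.yml)
/opt/logstash-8.13.4/bin/logstash \
  -f /opt/logstash-8.13.4/config/s3-pipeline.conf \
  --path.data /tmp/logstash-manual-data

# If no process is running but the error persists, check for a stale lock file
ls /opt/logstash-8.13.4/data/.lock
```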