please guide me step by step to push the data /simple file from logstash to opensearch and provide me the docker-compose.yml file
Hi Prem, I'd start by copying the example docker-compose file from the documentation and adding something like the setup below:
```yaml
logstash:
  image: "public.ecr.aws/opensearchproject/logstash-oss-with-opensearch-output-plugin:8.4.0"
  container_name: logstash
  ports:
    - "5044:5044"
  expose:
    - "5044"
  volumes:
    # note the leading ./ — without it, compose treats this as a named volume
    - "./logstash.conf:/usr/share/logstash/config/logstash-opensearch-sample.conf"
  networks:
    - {{ opensearch_network_name }}
```
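The `{{ opensearch_network_name }}` placeholder has to match a network defined at the top level of the compose file, the same one your OpenSearch containers join. A minimal sketch, assuming a network called `opensearch-net` (a hypothetical name, substitute your own):

```yaml
# Top-level networks section of docker-compose.yml (hypothetical network name);
# every service that should talk to OpenSearch lists it under its own `networks:` key.
networks:
  opensearch-net:
    driver: bridge
```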
Then create a logstash.conf file (the one mounted above) that has an input, filter, and output:
```
input {
  # I'm using beats, so your inputs could be different
  beats {
    port => 5044
  }
}

filter {
  # example of a drop filter: drop logs without a message
  if [message] == "" { drop {} }
}

output {
  opensearch {
    hosts => ["https://opensearch-node1:9200"]
    index => "my-index"   # you can use meta info to make this dynamic
    user => "admin"
    password => "admin"
    ssl => true
    ssl_certificate_verification => false
  }
}
```
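If you'd rather push a plain file instead of using Beats, a file input is a minimal alternative. This is a sketch, assuming a JSON-lines file at `/usr/share/logstash/data/sample.json` (a hypothetical path; mount your own file there via the compose `volumes` section):

```
# Hypothetical file-input variant; the path and codec are assumptions.
input {
  file {
    path => "/usr/share/logstash/data/sample.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # forget read position so the file is re-read on restart (handy for testing)
    codec => "json"               # one JSON object per line
  }
}
```

For a CSV file, you would keep the same file input (without the json codec) and parse the lines in the filter block with Logstash's csv filter instead.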
I didn't try this on my machine, but the settings should be close.
Kind regards
Hi @drBenway, I have tried it, but it didn't work out. Can you explain step by step how a sample input file (JSON/CSV) can be sent to OpenSearch from Logstash? And if possible, can you explain how the Kafka integration works with Logstash, and from Logstash to OpenSearch?
Thanks in advance.
That's not something that can easily be explained via a forum. I would have a look at one of the Elasticsearch trainings on Udemy; they should give you a good intro to how this all works.
See if this link helps you.