By chance, are you using Logstash, or did you configure Filebeat to send to OpenSearch?
If you're using Logstash, a simple config like this may help. It will create your index set and rotate it every day. You can also make an index template for your custom settings; be sure to create an index pattern and an alias, so that when you send logs to OpenSearch it will pick up that template with your custom configuration.
output.logstash:
  # The Logstash hosts
  hosts: ["8.8.8.8:5044"]
Then the Logstash configuration.
root@ansible:/opt/logstash-8.6.1/config# cat logstash.conf
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  beats {
    port => 5044
  }
}

output {
  opensearch {
    hosts => ["http://8.8.8.8:9200"]
    auth_type => {
      type => 'basic'
      user => 'admin'
      password => 'changeit'
    }
    ecs_compatibility => disabled
    index => "filebeat-%{+YYYY.MM.dd}"
  }
}
root@ansible:/opt/logstash-8.6.1/config#
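To go with the template/alias suggestion above, here is a rough sketch of registering an index template with an alias in OpenSearch via curl. The template name `filebeat-template`, the shard/replica settings, and the alias `filebeat` are all illustrative assumptions; adjust them to your own setup.

```shell
# Hypothetical example: register a composable index template so that new
# filebeat-* indices pick up your settings and an alias automatically.
# Names, settings, and credentials here are assumptions.
curl -u admin:changeit -XPUT -H 'Content-Type: application/json' \
  'http://8.8.8.8:9200/_index_template/filebeat-template' -d '
{
  "index_patterns": ["filebeat-*"],
  "template": {
    "settings": { "number_of_shards": 1, "number_of_replicas": 1 },
    "aliases": { "filebeat": {} }
  }
}'
```

With that in place, the daily `filebeat-%{+YYYY.MM.dd}` indices created by the Logstash output above should match the `filebeat-*` pattern and inherit the template.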
That worked for me. As for not using Logstash, I found this, so I'm not 100% sure.
Sorry, I haven't done the Graylog encryption configuration yet. At my job they told me to first get it working, then test OpenSearch vs. Elasticsearch, backups, dashboards, etc., and the last step will be encrypting the communications. That's where things stand.
I am thinking of putting Graylog 5.0.6 with Elasticsearch 7.10.2 and MongoDB 6.0 into production, but I don't know how many logs it will collect or how many machines it will take inputs from. The client is strange: they want everything but won't tell me anything, and afterwards any problems will all be on me.
Can you tell me how to make a backup of Elasticsearch? I am trying to do it with curl but getting nowhere.
root@GRAYLOGDEBIANSERVER:/var/log/graylog-server# curl -XPUT -H "content-type:application/json" 'http://localhost:9200/_snapshot/snapshot' -d '{"type":"fs","settings":{"location":"/backup","compress":true}}'
<HTML><HEAD>
<TITLE>Access Denied</TITLE>
</HEAD>
<BODY>
<big>Access Denied (authentication_failed)</big>
Your credentials could not be authenticated: "Credentials are missing.". You will not be permitted access until your credentials can be verified.
This is typically caused by an incorrect username and/or password, but could also be caused by network problems.
For assistance, contact your network support team.
</BODY></HTML>
root@GRAYLOGDEBIANSERVER:/var/log/graylog-server#
This is the result of executing the command… What username and password do I have to use?
I would look at switching from Elasticsearch to OpenSearch if you can. Graylog is moving to OpenSearch, so I'm just assuming, but Graylog may not support Elasticsearch for much longer.
You would need to configure a couple of things, and this goes for both OpenSearch and Elasticsearch.
Adjust your elasticsearch.yml file by adding the path to your snapshot repository, e.g. `path.repo: ["/mnt/repo_set"]` (it ships commented out as `#path.repo: ...`), then uncomment it and restart the service to enable it.
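The steps above can be sketched as follows. The repository path `/mnt/repo_set`, the repository name `my_backup`, and the `admin:changeit` credentials are all assumptions; substitute your own values.

```shell
# 1) In elasticsearch.yml (or opensearch.yml), whitelist the repo path:
#      path.repo: ["/mnt/repo_set"]
#    then restart the service so the setting takes effect.

# 2) Register the filesystem snapshot repository, passing credentials
#    with -u since the cluster rejects unauthenticated requests:
curl -u admin:changeit -XPUT -H 'Content-Type: application/json' \
  'http://localhost:9200/_snapshot/my_backup' -d '
{
  "type": "fs",
  "settings": { "location": "/mnt/repo_set", "compress": true }
}'

# 3) Take a snapshot of the cluster into that repository:
curl -u admin:changeit -XPUT \
  'http://localhost:9200/_snapshot/my_backup/snapshot_1?wait_for_completion=true'
```

Note that the "Access Denied" HTML page in your output looks like it came from a proxy, not from Elasticsearch itself, so you may also need to bypass any proxy for localhost.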
Just an FYI: Graylog uses a plain-text username:password to connect to Elasticsearch unless it's local, whereas OpenSearch can use TLS.
Here is the documentation for both Elastic and OpenSearch; you will notice they are the same. This also works with Graylog.