Install Help Ubuntu 18/20

Hello,

I have an ELK stack and am moving to OpenSearch. I am trying to follow the documentation but am having issues with it. I am using Ubuntu 18 and have tried Ubuntu 20. I have tried the tarball install, but after running the install script I can't find any directories to modify any files. The documentation says the OpenSearch home directory is /usr/share/opensearch, but that directory doesn't exist.

I have had more success with the docker-compose file. I can get to the dashboards and log in. I have installed the opensearch plugin on my existing Logstash host, but Logstash doesn't seem to be sending the logs over. I am not seeing anything helpful in the OpenSearch or Logstash logs.

Any assistance would be appreciated; I am pretty lost.

Hi @michael.anderton - I think I might be able to point you in the right direction.

The OpenSearch and OpenSearch Dashboards Docker containers have OpenSearch installed in /usr/share/opensearch; that much is correct. When dealing with the tarball, I believe some assembly is required.

There's some good starting documentation in the tarball install guide that describes it as follows:

The tarball installation provides a self-contained directory with everything you need to run OpenSearch, including an integrated Java Development Kit (JDK). The tarball is a good option for testing and development.

That is to say, if you'd like to make your Ubuntu environment mimic the Docker container, you can simply move the contents of that tarball to /usr/share/opensearch. Since the directory is self-contained, you can put it anywhere you like.
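For reference, the steps look roughly like this. It's just a sketch; the exact file names depend on the version you downloaded:

# extract the tarball you downloaded (replace <version> with your actual version)
tar -xzf opensearch-<version>-linux-x64.tar.gz
sudo mv opensearch-<version> /usr/share/opensearch
cd /usr/share/opensearch
# the bundled install script the docs refer to; it should set up the demo security config and start OpenSearch
./opensearch-tar-install.sh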

Happy to help with any other questions!

Nate

@nateynate

Thanks for the response. I think I am going to stick with the docker-compose route as it is the farthest I have gotten.

Maybe you can help with an additional point? I have Logstash running on a separate host and have installed the opensearch output plugin. I have commented out the security portion of config/opensearch.yml, and Logstash is reporting no errors, but I am not seeing any data coming in. Any ideas?

Sure, @michael.anderton, although there might be a bit of troubleshooting required here depending on what your security configuration is. Is Logstash logging any errors? Some direct log output would help.

If you're using the self-signed certificates that came with OpenSearch, you may have to configure Logstash not to verify the certificates returned by the search API on port 9200. This is usually what people bump into.
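One quick way to tell whether it's a certificate-verification problem rather than a connectivity problem is to hit the API directly with curl. This assumes the default demo admin credentials and port 9200 reachable from wherever you run it:

# without -k, curl should reject the self-signed certificate
curl -u admin:admin https://localhost:9200
# with -k, verification is skipped and you should get the cluster info JSON back
curl -k -u admin:admin https://localhost:9200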

You can double-check whether data is actually coming in by checking the 'Discover' menu item in OpenSearch Dashboards.

Are you checking for data by using the Discover tool or by submitting queries? Before you can query data, you'll have to create an index pattern for it under Stack Management => Index Patterns.

Let's see if we can verify that information is actually coming in first.
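If you prefer the command line, the cat indices API will also show whether your index exists and how many documents it holds (same assumption about the demo admin credentials):

curl -k -u admin:admin "https://localhost:9200/_cat/indices?v"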

Nate

@nateynate

I had to restore a snapshot and now I can't even pull the image. I'm reading it's an MTU issue, but I've changed it to 1500 and it keeps failing.

@nateynate

I have finally had time to get back into OpenSearch. I have it running with the docker-compose.yml file, and I added the Logstash plugin image. The issue I am facing now is with index policies. When I try to save, I am getting a "failed to fetch" error. I have yet to find documentation or anything in the Google universe that helps. Any tips?

Hi @michael.anderton - I don't think I'll be of much help without seeing a specific error. Any chance you could grab a screenshot and/or paste any errors coming up in the log?

@nateynate

I brought the whole thing down and back up again and no more error! Thanks for the reply!


Sorry you had to nuke from orbit, @michael.anderton. I hope you didn't have too much data in there to lose.

Sometimes it's the only way to be sure. :slight_smile:

Nate

@nateynate

I've got everything running again with Docker, including the logstash-oss image, but I cannot seem to figure out how to get data into it. I am working two routes: Logstash and a Beats agent.

I have modified /usr/share/logstash/pipeline/logstash.conf to collect data from one host, but I am not seeing the index populate in OpenSearch.

I have installed auditbeat on the Ubuntu host where OpenSearch is running and modified the /etc/auditbeat/auditbeat.yml file a few times to point it at opensearch-node1, but running auditbeat setup cannot connect. I have tried opensearch-node1, 0.0.0.0, localhost, and the IP of the host.

I am going back over the documentation with no luck.

Any ideas for me to try?

I'm afraid I'm not familiar with the Beats agent, although I do aspire to take on more knowledge about all the various ingestion tools out there. Logstash I'm a bit more intimate with. Mind posting your logstash.conf?

What helped me with logstash-oss was to start with the most basic configuration I could: take input from STDIN (the keyboard) and print the JSON messages to STDOUT. Run it, type random stuff in, and see if it spits out a JSON message. From there I usually iterate by adding things to my input{} section and checking that the messages still print to the screen when I run it. At that point you know the only thing left to configure is the output{} section to send it to OpenSearch. Either way, I'd love to see the output{} section of your logstash.conf.
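Something like this is what I mean; it's just a sketch and assumes you're running it from the Logstash home directory (inside the container that's /usr/share/logstash):

# read events from the keyboard and print them back as structured output
bin/logstash -e 'input { stdin {} } output { stdout { codec => rubydebug } }'
# type a line, press enter, and you should see it echoed back as an event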

@nateynate

I need to figure out how to make this config persist through docker-compose down/up, but this is what I am using:

input {
  syslog {
    port => 5515
  }
}

output {
  opensearch {
    index => "syslog-%{+YYY.MM}"
    hosts => ["http://localhost:9200"]
    ssl_certificate_verification => false
    user => "admin"
    password => "admin"
  }
}

I do have a system pointing logs to 5515. This is the same input block I used in the ELK stack I am moving away from. With this config, when I attach to the Logstash logs I am getting a connection refused error.

Full error:

Attempted to resurrect connection to dead OpenSearch instance, but got an error {:url=>"http://admin:xxxxxx@localhost:9200/", :exception=>LogStash::Outputs::OpenSearch::HttpClient::Pool::HostUnreachableError, :message=>"OpenSearch Unreachable: [http://admin:xxxxxx@localhost:9200/][Manticore::SocketException] Connect to localhost:9200 [localhost/127.0.0.1] failed: Connection refused (Connection refused)"}

Thanks @michael.anderton

This likely has to do with the http in your hosts section. Mind changing that to https to see if you have better luck?

@nateynate

Thanks for pointing that out. Simple mistakes sure can be frustrating. That seems to have fixed the connection issue; I do not see the error in the logs anymore. Did my input block look okay? The logs show Logstash listening on 5044, which is the default, and not on port 5515.


Looked good to me. The best way to check is to change your output section to use just the stdout plugin:

output { stdout {} }

That way you can see if it spits the JSON messages out to the screen.
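If you want to generate a quick test event against your syslog input, the logger utility can send one straight at the port. This assumes port 5515 is reachable from wherever you run it; even if the message doesn't parse perfectly, it should still show up as an event:

# send a single test message to the syslog input over TCP
logger -n 127.0.0.1 -P 5515 -T "test message for logstash"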

Additionally, you can use the 'Discover' button in your OpenSearch Dashboards instance to make sure that records are being added to your index.

Glad I was able to help!

Nate