JDBC data access issue

Hi Team,
I am trying to import an Elasticsearch index into a third-party application as a SQL table using the JDBC driver, with the connection URL below, but only the columns are imported. The data (records) is not imported and not visible.
conn_url = "jdbc:elasticsearch://https://127.0.0.1:9200?trustSelfSigned=true"
table = "sarvindex"
username = "admin"
password = "admin"

Can you please advise whether the above connection URL is correct? In your documentation many property key/value pairs are suggested for the connection URL, like below. Am I missing anything here?

jdbc:elasticsearch://[scheme://][host][:port][/context-path]?[property-key=value]&[property-key2=value2]…&[property-keyN=valueN]

Hi @sarvendras, thanks for reporting the issue! So you were able to see the column names. Does that mean there was no connection and authentication issue?

Hi @daichen
The table schema (the columns of the Elasticsearch index) was visible in the external application, but no data was visible, so I think this means there is no connection or authentication issue… I have attached screenshots, please have a look.
Is my JDBC connection URL correct, or am I missing anything here? Please advise.

conn_url = "jdbc:elasticsearch://https://127.0.0.1:9200?trustSelfSigned=true"
table = "sarvindex"
username = "admin"
password = "admin"

In the Open Distro JDBC documentation there are many property key/value pairs mentioned. I am not able to understand those. Could the issue be because of these?

    jdbc:elasticsearch://[scheme://][host][:port][/context-path]?[property-key=value]&[property-key2=value2]..&[property-keyN=valueN]




![columns imported|690x184](upload://z4QirV0FSFLQ8sRYMxwMkeXShCk.png) ![index data|690x245](upload://leHHsclcuUpJqCFWJf6JhvWB1FQ.png) ![jdbc conn1|690x338](upload://1MdOK5fyH6AxKVt1dX2JBMxyy7C.png) ![table import|690x351](upload://dFM7M1yUKFndGIkOTU2Q0axAHcz.png) 
Thanks
Sarvendra

Hi @daichen, please suggest…

Hi @daichen, please suggest, as this is critical for our project. We have tried many things but are unable to import the Elasticsearch index data. Only the schema columns are imported.

Hi @sarvendras, I don’t see a problem in your connection info. And I guess if there were a connectivity issue you wouldn’t be able to see anything, including the schema. Could you check or share your code that loads the data?

@daichen Thanks for the reply, sir…
I am using h2o (a machine learning framework application) to import the Elasticsearch index table using JDBC. Below is the code.

conn_url = "jdbc:elasticsearch://https://127.0.0.1:9200?trustSelfSigned=true"
table = "esindextable"
username = "admin"
password = "admin"
h2o_importtable = h2o.import_sql_table(conn_url, table, username, password)

Below is the h2o documentation link for SQL table import.
https://docs.h2o.ai/h2o/latest-stable/h2o-py/docs/h2o.html

h2o.import_sql_table(connection_url, table, username, password, columns=None, optimize=True, fetch_mode=None, num_chunks_hint=None)
Import SQL table to H2OFrame in memory.

Assumes that the SQL table is not being updated and is stable. Runs multiple SELECT SQL queries concurrently for parallel ingestion. Be sure to start the h2o.jar in the terminal with your downloaded JDBC driver in the classpath:

java -cp <path_to_h2o_jar>:<path_to_jdbc_driver_jar> water.H2OApp
Also see import_sql_select(). Currently supported SQL databases are MySQL, PostgreSQL, MariaDB, Hive, Oracle and Microsoft SQL.

Parameters
connection_url – URL of the SQL database connection as specified by the Java Database Connectivity (JDBC) Driver. For example, “jdbc:mysql://localhost:3306/menagerie?&useSSL=false”

table – name of SQL table

columns – a list of column names to import from SQL table. Default is to import all columns.

username – username for SQL server

password – password for SQL server

optimize – DEPRECATED. Ignored - use fetch_mode instead. Optimize import of SQL table for faster imports.

fetch_mode – Set to DISTRIBUTED to enable distributed import. Set to SINGLE to force a sequential read by a single node from the database.

num_chunks_hint – Desired number of chunks for the target Frame.

Returns
an H2OFrame containing data of the specified SQL table.
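
For completeness, this is roughly my full flow end to end (the driver jar path in the comment and the fetch_mode value are just illustrations of what I tried; the connection URL, table, and credentials are the same as above):

    # H2O was started beforehand with the Open Distro SQL JDBC driver on the classpath,
    # e.g. java -cp /path/to/h2o.jar:/path/to/opendistro-sql-jdbc.jar water.H2OApp
    import h2o

    h2o.init()  # attaches to the already-running local H2O instance

    conn_url = "jdbc:elasticsearch://https://127.0.0.1:9200?trustSelfSigned=true"
    frame = h2o.import_sql_table(conn_url, "esindextable", "admin", "admin",
                                 fetch_mode="SINGLE")
    print(frame.head())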

Thanks
Sarvendra

@daichen I just created a sample index in Kibana with 2-3 records to import into h2o… I hope I am not missing anything.
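
I created the test documents in Kibana Dev Tools; the equivalent in Python would be roughly the following (the index name and field names here are just placeholders for my test data):

    import requests

    es = "https://127.0.0.1:9200"
    auth = ("admin", "admin")

    # a couple of tiny test documents so the table has something to return
    docs = [{"name": "first", "value": 1}, {"name": "second", "value": 2}]
    for i, doc in enumerate(docs):
        # verify=False because the cluster uses a self-signed certificate
        requests.put(f"{es}/esindextable/_doc/{i}", json=doc, auth=auth, verify=False)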

This is the h2o screenshot after the SQL import function… only columns showing, no records…

Thanks for the information! Will try to figure it out.

@daichen

Hi, have you had a chance to look into this?

Hi @sarvendras, one thing I noticed is there is an auth option in our JDBC driver: NONE (no auth), BASIC (HTTP Basic), AWS_SIGV4 (AWS SIGV4). Could you give it a try by setting it to basic? e.g. jdbc:elasticsearch://https://127.0.0.1:9200?trustSelfSigned=true&auth=basic. And are you using our ODFE security plugin for auth?
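
With your h2o snippet that would look something like this (only the auth property added; everything else unchanged):

    conn_url = "jdbc:elasticsearch://https://127.0.0.1:9200?trustSelfSigned=true&auth=basic"
    h2o_importtable = h2o.import_sql_table(conn_url, table, username, password)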

And if possible, could you check elasticsearch.log to see the query it generated and whether there was any error? You should be able to find logging at INFO level like “Incoming request”: sql/RestSqlAction.java at 893cd18fe5ce1fc7d3016c76567ce6e5ca834182 · opendistro-for-elasticsearch/sql · GitHub