Unable to see Prometheus data source index on the dashboard

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser):

OpenSearch 2.12, Ubuntu, Chrome

Describe the issue:

The indexing on the data store for Prometheus doesn’t seem to work. Is there any config that might be missing? I am using a Helm chart to install OpenSearch and OpenSearch Dashboards.

Relevant Logs or Screenshots:

Hey @Akshat

I’m testing this as well and I think I’m hitting the same issue.

I’m running OS/OSD v2.12.0 on Ubuntu 22.04.

I connected to Prometheus in the Data sources section; no issues were shown.

Checked my datasource index set…

On the left pane, under Observability → Metrics, I see this:

If I click on any one of those metrics I see this…

Is this the issue you are having as well?

@Akshat The provided screenshot shows only index patterns. An index pattern is not the same as an index.

You can list indices by running the following API call in Dev Tools.

GET _cat/indices

The index pattern must be created manually.
Take a look at one of the screenshots provided by @Gsmitt
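Separately, if you want to double-check the Prometheus connection itself (not just the indices), I believe the SQL/PPL plugin also exposes a data sources endpoint you can call from Dev Tools. The data source name below is just a placeholder:

GET _plugins/_query/_datasources
GET _plugins/_query/_datasources/my_prometheus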

I used Dev Tools and confirmed ql-datasources is in the list. The issue is that I can see the metrics visualization on the Observability → Metrics page, but when I try to save the visualization and add it to the dashboard, it shows me errors like the one below:
Failed to parse query due to offending symbol [ceph_bluestore_pricache:] at: 'source = cephprometheus.ceph_bluestore_pricache:' <--- HERE... More details: Expecting tokens in {'SEARCH', 'DESCRIBE', 'SHOW', 'FROM', 'WHERE', 'FIELDS', 'RENAME', 'STATS', 'DEDUP', 'SORT', 'EVAL', 'HEAD', 'TOP', 'RARE', 'PARSE', 'METHOD', 'REGEX', 'PUNCT', 'GROK', 'PATTERN', 'PATTERNS', 'NEW_FIELD', 'KMEANS', 'AD', 'ML', 'SOURCE', 'INDEX', 'D', 'DESC', 'DATASOURCES', 'SORTBY', 'STR', 'IP', 'NUM', 'KEEPEMPTY', 'CONSECUTIVE', 'DEDUP_SPLITVALUES', 'PARTITIONS', 'ALLNUM', 'DELIM', 'CENTROIDS', 'ITERATIONS', 'DISTANCE_TYPE', 'NUMBER_OF_TREES', 'SHINGLE_SIZE', 'SAMPLE_SIZE', 'OUTPUT_AFTER', 'TIME_DECAY', 'ANOMALY_RATE', 'CATEGORY_FIELD', 'TIME_FIELD', 'TIME_ZONE', 'TRAINING_DATA_SIZE', 'ANOMALY_SCORE_THRESHOLD', 'NOT', 'TRUE', 'FALSE', 'CONVERT_TZ', 'DATETIME', 'DAY', 'DAY_HOUR', 'DAY_MICROSECOND', 'DAY_MINUTE', 'DAY_OF_YEAR', 'DAY_SECOND', 'HOUR', 'HOUR_MICROSECOND', 'HOUR_MINUTE', 'HOUR_OF_DAY', 'HOUR_SECOND', 'INTERVAL', 'MICROSECOND', 'MILLISECOND', 'MINUTE', 'MINUTE_MICROSECOND', 'MINUTE_OF_DAY', 'MINUTE_OF_HOUR', 'MINUTE_SECOND', 'MONTH', 'MONTH_OF_YEAR', 'QUARTER', 'SECOND', 'SECOND_MICROSECOND', 'SECOND_OF_MINUTE', 'WEEK', 'WEEK_OF_YEAR', 'YEAR', 'YEAR_MONTH', '.', '+', '-', '(', '', ‘AVG’, ‘COUNT’, ‘DISTINCT_COUNT’, ‘ESTDC’, ‘ESTDC_ERROR’, ‘MAX’, ‘MEAN’, ‘MEDIAN’, ‘MIN’, ‘MODE’, ‘RANGE’, ‘STDEV’, ‘STDEVP’, ‘SUM’, ‘SUMSQ’, ‘VAR_SAMP’, ‘VAR_POP’, ‘STDDEV_SAMP’, ‘STDDEV_POP’, ‘PERCENTILE’, ‘TAKE’, ‘FIRST’, ‘LAST’, ‘LIST’, ‘VALUES’, ‘EARLIEST’, ‘EARLIEST_TIME’, ‘LATEST’, ‘LATEST_TIME’, ‘PER_DAY’, ‘PER_HOUR’, ‘PER_MINUTE’, ‘PER_SECOND’, ‘RATE’, ‘SPARKLINE’, ‘C’, ‘DC’, ‘ABS’, ‘CBRT’, ‘CEIL’, ‘CEILING’, ‘CONV’, ‘CRC32’, ‘E’, ‘EXP’, ‘FLOOR’, ‘LN’, ‘LOG’, ‘LOG10’, ‘LOG2’, ‘MOD’, ‘PI’, ‘POSITION’, ‘POW’, ‘POWER’, ‘RAND’, ‘ROUND’, ‘SIGN’, ‘SQRT’, ‘TRUNCATE’, ‘ACOS’, ‘ASIN’, ‘ATAN’, ‘ATAN2’, ‘COS’, ‘COT’, ‘DEGREES’, ‘RADIANS’, ‘SIN’, ‘TAN’, ‘ADDDATE’, ‘ADDTIME’, ‘CURDATE’, ‘CURRENT_DATE’, ‘CURRENT_TIME’, ‘CURRENT_TIMESTAMP’, ‘CURTIME’, ‘DATE’, ‘DATEDIFF’, ‘DATE_ADD’, ‘DATE_FORMAT’, ‘DATE_SUB’, ‘DAYNAME’, ‘DAYOFMONTH’, ‘DAYOFWEEK’, ‘DAYOFYEAR’, ‘DAY_OF_MONTH’, ‘DAY_OF_WEEK’, ‘EXTRACT’, ‘FROM_DAYS’, ‘FROM_UNIXTIME’, ‘GET_FORMAT’, ‘LAST_DAY’, ‘LOCALTIME’, ‘LOCALTIMESTAMP’, ‘MAKEDATE’, ‘MAKETIME’, ‘MONTHNAME’, ‘NOW’, ‘PERIOD_ADD’, ‘PERIOD_DIFF’, ‘SEC_TO_TIME’, ‘STR_TO_DATE’, ‘SUBDATE’, ‘SUBTIME’, ‘SYSDATE’, ‘TIME’, ‘TIMEDIFF’, ‘TIMESTAMP’, ‘TIMESTAMPADD’, ‘TIMESTAMPDIFF’, ‘TIME_FORMAT’, ‘TIME_TO_SEC’, ‘TO_DAYS’, ‘TO_SECONDS’, ‘UNIX_TIMESTAMP’, ‘UTC_DATE’, ‘UTC_TIME’, ‘UTC_TIMESTAMP’, ‘WEEKDAY’, ‘YEARWEEK’, ‘SUBSTR’, ‘SUBSTRING’, ‘LTRIM’, ‘RTRIM’, ‘TRIM’, ‘LOWER’, ‘UPPER’, ‘CONCAT’, ‘CONCAT_WS’, ‘LENGTH’, ‘STRCMP’, ‘RIGHT’, ‘LEFT’, ‘ASCII’, ‘LOCATE’, ‘REPLACE’, ‘REVERSE’, ‘CAST’, ‘LIKE’, ‘ISNULL’, ‘ISNOTNULL’, ‘IFNULL’, ‘NULLIF’, ‘IF’, ‘TYPEOF’, ‘MATCH’, ‘MATCH_PHRASE’, ‘MATCH_PHRASE_PREFIX’, ‘MATCH_BOOL_PREFIX’, ‘SIMPLE_QUERY_STRING’, ‘MULTI_MATCH’, ‘QUERY_STRING’, ‘ALLOW_LEADING_WILDCARD’, ‘ANALYZE_WILDCARD’, ‘ANALYZER’, ‘AUTO_GENERATE_SYNONYMS_PHRASE_QUERY’, ‘BOOST’, ‘CUTOFF_FREQUENCY’, ‘DEFAULT_FIELD’, ‘DEFAULT_OPERATOR’, ‘ENABLE_POSITION_INCREMENTS’, ‘ESCAPE’, ‘FLAGS’, ‘FUZZY_MAX_EXPANSIONS’, ‘FUZZY_PREFIX_LENGTH’, ‘FUZZY_TRANSPOSITIONS’, ‘FUZZY_REWRITE’, ‘FUZZINESS’, ‘LENIENT’, ‘LOW_FREQ_OPERATOR’, ‘MAX_DETERMINIZED_STATES’, ‘MAX_EXPANSIONS’, ‘MINIMUM_SHOULD_MATCH’, ‘OPERATOR’, ‘PHRASE_SLOP’, ‘PREFIX_LENGTH’, ‘QUOTE_ANALYZER’, ‘QUOTE_FIELD_SUFFIX’, ‘REWRITE’, ‘SLOP’, ‘TIE_BREAKER’, ‘TYPE’, ‘ZERO_TERMS_QUERY’, ‘SPAN’, 
‘MS’, ‘S’, ‘M’, ‘H’, ‘W’, ‘Q’, ‘Y’, ID, INTEGER_LITERAL, DECIMAL_LITERAL, DQUOTA_STRING, SQUOTA_STRING, BQUOTA_STRING}`
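For reference, the kind of PPL query I would expect against a Prometheus data source looks something like the line below (the data source and metric names are placeholders); the generated query in the error ends with a colon after the metric name, which is the token the parser points at:

source = my_prometheus.prometheus_http_requests_total | stats avg(@value) by span(@timestamp, 15s)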

Did you create the data source from the Dashboards UI or with the API endpoint? Let’s see if this solves the issue: Opensearch dashboard goes blank with Json parse error - #6 by Akshat
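For reference, creating the Prometheus data source through the API endpoint looks roughly like this (the name, URI, and properties below are placeholders and may vary by version):

POST _plugins/_query/_datasources
{
  "name": "my_prometheus",
  "connector": "prometheus",
  "properties": {
    "prometheus.uri": "http://localhost:9090"
  }
}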

I did both. Or I should say I tried both; right now I’m just using the API.

EDIT: I do have a key made; I saw that when I was troubleshooting.

I have the same issue showing up again. I tried 2.11.1, used the default admin/admin user, added the master key, and it was working. Now that I have upgraded to 2.12.0 the page goes blank again. I see that a new version, 2.13, came out yesterday, and I am going to give this one more go. I am running out of ideas and I hope someone from the community or the contributors looks into this soon.
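For context, the master key I am referring to is the data source encryption key in opensearch.yml; it looks roughly like the line below (the value is just a placeholder and, as far as I know, has to be a 16, 24, or 32 character string):

plugins.query.datasources.encryption.masterkey: "0123456789abcdef"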

Hey @Akshat

Yeah, there is an issue; I posted about it in Slack and on GitHub.


Hey @Akshat

I tried version 2.13; same issue.

On the OpenSearch Slack I was just talking to someone about this. I think this may be fixed in version 2.14.

You can see it here…


Does anyone over there have an idea of when this bug was introduced? It would be great to get an invite to the Slack channel.

Try this:

Do we have any idea if this will be fixed in 2.14? I was reading the channel and it looks like there is going to be a new release soon.

Not sure; you can check GitHub. I think they have a roadmap.

Check here…