DNS SAP monitor throws error on create

Just updated a cluster to 2.7 and attempted to create a new DNS SAP monitor:

[2023-05-03T21:13:00,010][ERROR][o.o.a.u.AlertingException] [2.aws] Alerting error: RemoteTransportException[[2.aws][0.0.0.0:9300][indices:admin/mapping/put]]; nested: IllegalArgumentException[mapper [monitor.schedule.period.unit] cannot be changed from type [text] to [keyword]];
[2023-05-03T21:13:00,011][ERROR][o.o.s.u.SecurityAnalyticsException] [2.aws] Security Analytics error:
org.opensearch.alerting.util.AlertingException: [2.aws][0.0.0.0:9300][indices:admin/mapping/put]
	at org.opensearch.alerting.util.AlertingException$Companion.wrap(AlertingException.kt:70) ~[?:?]
	at org.opensearch.alerting.transport.TransportIndexMonitorAction$IndexMonitorHandler$start$2.onFailure(TransportIndexMonitorAction.kt:350) ~[?:?]
	at org.opensearch.action.support.TransportAction$1.onFailure(TransportAction.java:122) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.action.support.RetryableAction$RetryingListener.onFinalFailure(RetryableAction.java:218) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.action.support.RetryableAction$RetryingListener.onFailure(RetryableAction.java:210) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.action.support.clustermanager.TransportClusterManagerNodeAction$AsyncSingleAction$1.handleException(TransportClusterManagerNodeAction.java:300) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.TransportService$6.handleException(TransportService.java:794) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.security.transport.SecurityInterceptor$RestoringTransportResponseHandler.handleException(SecurityInterceptor.java:312) ~[?:?]
	at org.opensearch.transport.TransportService$ContextRestoreResponseHandler.handleException(TransportService.java:1414) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundHandler.lambda$handleException$3(InboundHandler.java:420) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.common.util.concurrent.OpenSearchExecutors$DirectExecutorService.execute(OpenSearchExecutors.java:343) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundHandler.handleException(InboundHandler.java:418) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundHandler.handlerResponseError(InboundHandler.java:410) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundHandler.messageReceived(InboundHandler.java:158) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundHandler.inboundMessage(InboundHandler.java:114) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.TcpTransport.inboundMessage(TcpTransport.java:769) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundPipeline.forwardFragments(InboundPipeline.java:175) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundPipeline.doHandleBytes(InboundPipeline.java:150) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.InboundPipeline.handleBytes(InboundPipeline.java:115) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.transport.netty4.Netty4MessageChannelHandler.channelRead(Netty4MessageChannelHandler.java:94) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[?:?]
	at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:280) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[?:?]
	at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1383) ~[?:?]
	at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1246) ~[?:?]
	at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1295) ~[?:?]
	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:529) ~[?:?]
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:468) ~[?:?]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[?:?]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440) ~[?:?]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[?:?]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[?:?]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[?:?]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788) ~[?:?]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:689) ~[?:?]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:652) ~[?:?]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) ~[?:?]
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) ~[?:?]
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[?:?]
	at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: java.lang.Exception: org.opensearch.transport.RemoteTransportException: [2.aws][0.0.0.0:9300][indices:admin/mapping/put]
	... 48 more
[2023-05-03T21:13:00,012][WARN ][r.suppressed             ] [2.aws] path: /_plugins/_security_analytics/detectors, params: {}
org.opensearch.securityanalytics.util.SecurityAnalyticsException: [2.aws][0.0.0.0:9300][indices:admin/mapping/put]
	at org.opensearch.securityanalytics.util.SecurityAnalyticsException.wrap(SecurityAnalyticsException.java:51) ~[?:?]
	at org.opensearch.securityanalytics.transport.TransportIndexDetectorAction$AsyncIndexDetectorsAction.lambda$finishHim$0(TransportIndexDetectorAction.java:1168) ~[?:?]
	at org.opensearch.action.ActionRunnable.lambda$supply$0(ActionRunnable.java:73) [opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.action.ActionRunnable$2.doRun(ActionRunnable.java:88) ~[opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:806) [opensearch-2.7.0.jar:2.7.0]
	at org.opensearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:52) [opensearch-2.7.0.jar:2.7.0]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
	at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: java.lang.Exception: org.opensearch.alerting.util.AlertingException: [2.aws][0.0.0.0:9300][indices:admin/mapping/put]
	... 9 more

Any ideas what index this would be erroring on when updating the template? The index with the data obviously does not have the “monitor.schedule.period.unit” field, so I am assuming it's an internal index?

I think I figured it out. It looks like it is trying to write to .opendistro-alerting-config.
Once I looked at the mapping of that index I realized something was up.
Backstory: after an upgrade we had some weird issues, and I basically had to restore-delete-reindex “.opendistro-alerting-config”. It looks like that process added unwanted mappings (a plain old index restore of “.opendistro-alerting-config” was not an option at the time).
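
For anyone hitting the same thing, one way to check the field type on the config index (assuming the same admin cert/key used in the fix below) is the field-mapping API:

# Ask the field-mapping API what type the config index currently has for the schedule unit
curl -k --cert ./admin.pem --key ./admin-key.pem -XGET "https://cluster:9200/.opendistro-alerting-config/_mapping/field/monitor.schedule.period.unit?pretty"
# The error above implies this comes back as "text" instead of the expected "keyword"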

How I fixed the bad mapping:

# 1. Delete the broken config index, THEN create a dummy monitor in Alerting so the plugin re-creates .opendistro-alerting-config with the proper mappings
curl -k --cert ./admin.pem --key ./admin-key.pem -XDELETE "https://cluster:9200/.opendistro-alerting-config"
# 2. Dump the monitors out of the temporary reindexed copy
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump --cert=./admin.pem --key=./admin.key --ca=./ca.pem --tlsAuth --input=https://cluster:9200/.opendistro-alerting-config-jrojas-reindex --output=dump.json
# 3. Strip the temporary index suffix so the documents point back at the real index name
perl -p -i -e 's/-jrojas-reindex//go' dump.json # Old habits die hard.
# 4. Restore the data into the freshly re-created config index
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump --cert=./admin.pem --key=./admin.key --ca=./ca.pem --tlsAuth --input=./dump.json --output=https://cluster:9200/.opendistro-alerting-config
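
To sanity-check the restore, comparing document counts between the temporary copy and the re-created index is a reasonable extra step (same cert assumptions as above):

# Compare document counts; the two numbers should match after the restore
curl -k --cert ./admin.pem --key ./admin-key.pem -XGET "https://cluster:9200/_cat/count/.opendistro-alerting-config-jrojas-reindex?v"
curl -k --cert ./admin.pem --key ./admin-key.pem -XGET "https://cluster:9200/_cat/count/.opendistro-alerting-config?v"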

After sleeping on this a bit more and looking at the new index template management in 2.7, I am curious whether there are any plans to include these index templates as part of the setup to prevent these indices from ending up with bad mappings (alerting config index, etc.)?

What actually looks to have happened here is that the monitor document got indexed into .opendistro-alerting-config before the index was created with the proper mappings. In that case, the monitor.schedule.period.unit type was automatically set to “text” during auto-creation of the index and its mappings.
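
To illustrate that dynamic-mapping behavior in general (a throwaway index with a made-up name, assuming auto index creation is enabled; not the real config index):

# Index a document into an index that does not exist yet; dynamic mapping creates it
curl -k --cert ./admin.pem --key ./admin-key.pem -XPOST "https://cluster:9200/dynamic-mapping-test/_doc" -H 'Content-Type: application/json' -d '{"monitor": {"schedule": {"period": {"interval": 1, "unit": "MINUTES"}}}}'
# The string field is dynamically mapped as "text" (with a .keyword sub-field), not "keyword",
# which is exactly the type conflict in the error above
curl -k --cert ./admin.pem --key ./admin-key.pem -XGET "https://cluster:9200/dynamic-mapping-test/_mapping/field/monitor.schedule.period.unit?pretty"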

Did this start right after upgrade?

This issue only showed up post-upgrade because I had no SAP monitors defined until then.
The actual problem started when my alerting config index was created on an older version and I had to “reindex” it to create the proper version stamp in the metadata.
That re-indexing process seems to have been the culprit for the broken mapping of the alerting config index.

That makes sense. In the future, if you want to reindex this alerting config index, make sure the target index has these mappings applied: link
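
A minimal sketch of that flow, assuming the mappings JSON from the link is saved locally as alerting-mappings.json (hypothetical filename, containing the object that goes under "mappings") and using a made-up target index name:

# Create the target index with the correct alerting-config mappings up front
curl -k --cert ./admin.pem --key ./admin-key.pem -XPUT "https://cluster:9200/.opendistro-alerting-config-new" -H 'Content-Type: application/json' -d "{\"mappings\": $(cat alerting-mappings.json)}"
# Then reindex; documents now land on explicit mappings instead of dynamically guessed ones
curl -k --cert ./admin.pem --key ./admin-key.pem -XPOST "https://cluster:9200/_reindex" -H 'Content-Type: application/json' -d '{"source": {"index": ".opendistro-alerting-config"}, "dest": {"index": ".opendistro-alerting-config-new"}}'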

I guess that gets to my follow-up question: now that index template management is part of the dashboard, are there any plans to include the default system-level index mappings there as well?

In SA, index template management was introduced solely to support data streams/aliases/index patterns as source index inputs in detectors.

However, there is one case where SA creates an index template to enrich Alerting's query indices.
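
For context, an index template generally looks like this (a generic illustration with made-up names, not the actual SA or Alerting template):

# Generic index template that pins an explicit field mapping for matching indices
curl -k --cert ./admin.pem --key ./admin-key.pem -XPUT "https://cluster:9200/_index_template/example-template" -H 'Content-Type: application/json' -d '{"index_patterns": ["example-index-*"], "template": {"mappings": {"properties": {"some_field": {"type": "keyword"}}}}}'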
