Issue: Sending specific fields to Slack when an alert is triggered

Versions (relevant - OpenSearch/Dashboard/Server OS/Browser): 2.11.1

Describe the issue:
Dear forum mates,
I trigger an alert when specific conditions are met.
I have this JSON document as an example:

{
   "@timestamp": "2024-02-07T09:26:48.954Z",
   "@version": "1",
   "id": "38074011966970165006522121143354900108793948976928392031",

   "timestamp": "2024-02-07T09:22:24",
   "event_timestamp": "1707297744",
   "log_stream": "My_Log_Stream",
   "event": {
      "app_proto": "tls",
      "src_port": 63185,
      "src_ip": "My Src IP",
      "dest_port": 443,
      "alert": {
         "category": "",
         "signature": "aws:alert_established action",
         "severity": 3,
         "signature_id": 5,
         "rev": 0,
         "action": "blocked"
      },
      "proto": "TCP",
      "tls": {
         "version": "UNDETERMINED",
         "sni": "optimizationguide-pa.googleapis.com",
         "ja3": {},
         "ja3s": {}
      },
      "timestamp": "2024-02-07T09:22:24.583876+0000",
      "event_type": "alert",
      "dest_ip": "Dest_Ip",
      "flow_id": 752900448361541
   }
}

and I would like to export some fields to my Slack when the alert is triggered.
Right now I get only these fields in my Slack notification:
[Screenshot: current Slack notification fields]
I would like to get these fields in my Slack notification:
- sni
- app_proto
- src_ip

I tried this syntax:
- Used Protocol: {{ctx.app_proto}}
- Associated Source IP: {{ctx.src_ip}}

But in my Slack I got blank data. What could be the problem?
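
A likely cause, for reference: document fields are not exposed at the top level of ctx, so {{ctx.app_proto}} resolves to nothing and renders blank; they are reachable through the query results instead. A minimal sketch of the message lines, assuming the monitor's extraction query returns the documents themselves (i.e., size greater than 0) and the document structure shown above:

- Used Protocol: {{ctx.results.0.hits.hits.0._source.event.app_proto}}
- Associated Source IP: {{ctx.results.0.hits.hits.0._source.event.src_ip}}
- SNI: {{ctx.results.0.hits.hits.0._source.event.tls.sni}}

With more than one matching document, a Mustache section such as {{#ctx.results.0.hits.hits}}…{{/ctx.results.0.hits.hits}} iterates over the hits, as the aggregation example further down does with buckets.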

@Dexter_96 Does the reported JSON represent a single document from your index?
Could you share a full JSON representation of your monitor?

Yes.

I created an alert for this specific index, and this is the kind of document that I get.
I want to extract these fields into my Slack notification.

I’m trying to achieve the same. It would be highly appreciated if you could let us know once you find a way to fix it.

I haven’t found any clue yet.

@Dexter_96 @hm21 This is my working example. I’ve used security-auditlog as a test index.

Query:

{
    "size": 0,
    "query": {
        "bool": {
            "must": [
                {
                    "match": {
                        "audit_category": {
                            "query": "FAILED_LOGIN"
                        }
                    }
                },
                {
                    "range": {
                        "@timestamp": {
                            "gte": "now-50m"
                        }
                    }
                }
            ]
        }
    },
    "aggs": {
        "failed_logins": {
            "terms": {
                "field": "audit_request_effective_user.keyword"
            }
        }
    }
}

Query response:

{
    "_shards": {
        "total": 1,
        "failed": 0,
        "successful": 1,
        "skipped": 0
    },
    "hits": {
        "hits": [],
        "total": {
            "value": 30,
            "relation": "eq"
        },
        "max_score": null
    },
    "took": 6,
    "timed_out": false,
    "aggregations": {
        "failed_logins": {
            "doc_count_error_upper_bound": 0,
            "sum_other_doc_count": 0,
            "buckets": [
                {
                    "doc_count": 18,
                    "key": "admin"
                },
                {
                    "doc_count": 7,
                    "key": "elastic"
                },
                {
                    "doc_count": 5,
                    "key": "pablo"
                }
            ]
        }
    }
}

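For anyone who wants to reproduce this, the same body can be run from Dashboards Dev Tools before it is pasted into the monitor. A minimal sketch, assuming the audit log lands in indices matching the default security-auditlog* pattern (the exact index name may differ per setup):

GET security-auditlog*/_search
{
  "size": 0,
  "query": {
    "bool": {
      "must": [
        { "match": { "audit_category": { "query": "FAILED_LOGIN" } } },
        { "range": { "@timestamp": { "gte": "now-50m" } } }
      ]
    }
  },
  "aggs": {
    "failed_logins": {
      "terms": { "field": "audit_request_effective_user.keyword" }
    }
  }
}
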
Action message:

Monitor {{ctx.monitor.name}} just entered alert status. Please investigate the issue.
  - Trigger: {{ctx.trigger.name}}
  - Severity: {{ctx.trigger.severity}}
  - Period start: {{ctx.periodStart}}
  - Period end: {{ctx.periodEnd}}  

Failed {{ctx.results.0.hits.total.value}}  login attempts.
{{#ctx.results.0.aggregations.failed_logins.buckets}} user {{key}} had {{doc_count}} failed attempts,  {{/ctx.results.0.aggregations.failed_logins.buckets}}

Result:

Monitor test1 just entered alert status. Please investigate the issue.
  - Trigger: trigger1
  - Severity: 1
  - Period start: 2024-02-16T00:40:56Z
  - Period end: 2024-02-16T00:41:56Z  

Failed 20  login attempts.
 user admin had 8 failed attempts,   user elastic had 7 failed attempts,   user pablo had 5 failed attempts,  

I tried to send it to PagerDuty and it seemed like I had a problem.
I talked with the OpenSearch team and they offered this solution:

{
    "event_action": "trigger",
    "payload": {
        "summary": "{{ctx.trigger.name}} - Severity: {{ctx.trigger.severity}}\n - Period start: {{ctx.periodStart}}\n - Period end: {{ctx.periodEnd}}\n - Involved User: {{ctx.Username}}",
        "source": "{{ctx.monitor.name}}",
        "severity": "critical"
    }
}

And this works.
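
For what it's worth, this payload matches the PagerDuty Events API v2 trigger format, so it can be sanity-checked outside OpenSearch before being attached to the channel. A minimal sketch with curl, where ROUTING_KEY is a placeholder for your integration key and the Mustache variables are replaced with literal example values (the monitor fills them in at send time):

curl -s -X POST https://events.pagerduty.com/v2/enqueue \
  -H "Content-Type: application/json" \
  -d '{
        "routing_key": "ROUTING_KEY",
        "event_action": "trigger",
        "payload": {
          "summary": "trigger1 - Severity: 1",
          "source": "test1",
          "severity": "critical"
        }
      }'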

I answered down there, check this out.