AWS OpenSearch field limits

I am new to AWS OpenSearch, and I am having issues with field limits. Within a month, the field limit, which is already quite small, has been exceeded and raised more than four times, causing indexing problems.
Note that I don't want to raise the limit any further.

A snapshot from my index:

GET /zonade/_mapping

returned the following:

{
    "events": {
        "mappings": {
            "properties": {
                "456": {
                    "type": "text",
                    "fields": {
                        "keyword": {
                            "type": "keyword",
                            "ignore_above": 256
                        }
                    }
                },
                "Artthee": {
                    "type": "text",
                    "fields": {
                        "keyword": {
                            "type": "keyword",
                            "ignore_above": 256
                        }
                    }
                },
                "Movie Name": {
                    "type": "text",
                    "fields": {
                        "keyword": {
                            "type": "keyword",
                            "ignore_above": 256
                        }
                    }
                }
            }
        }
    }
}

For normalization, I need to implement the following rules:

all words should be lowercase

all_spaces_should_be_replaced_by_underscores
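A minimal sketch of those two rules in Python, applied to field names before the documents are sent to OpenSearch (the helper names `normalize_field` and `normalize_doc` are my own, and I am assuming flat documents):

```python
def normalize_field(name: str) -> str:
    """Lowercase a field name and replace spaces with underscores."""
    return name.lower().replace(" ", "_")


def normalize_doc(doc: dict) -> dict:
    """Return a copy of the document with every key normalized.

    Variants such as "Movie Name" and "movie name" collapse into the
    single key "movie_name", so they create only one field in the
    index mapping instead of several.
    """
    return {normalize_field(key): value for key, value in doc.items()}
```

For example, `normalize_doc({"Movie Name": "Dune", "Artthee": 1})` returns `{"movie_name": "Dune", "artthee": 1}`. Note that if two original keys normalize to the same name, the later one wins, which is usually what you want here since the whole point is to merge those variants.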

I would be grateful for any suggestions on how to solve this issue.

Hey @bashdex

By default, the maximum number of fields in an index is 1000. How much higher did you set it?
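The relevant setting is `index.mapping.total_fields.limit`. You can check what your index is currently set to with the settings API (using your index name from above):

```
GET /zonade/_settings/index.mapping.total_fields.limit
```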

There are a couple of ways to keep the field count down:

  1. Separate logs into different index sets (e.g., firewall, Windows devices, Linux devices, etc.)
  2. Drop fields that are not needed
  3. Only send data that you want.
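For option 2, one way to drop fields server-side is an ingest pipeline with a `remove` processor (the pipeline name and field list here are just examples; I'm assuming your domain version supports ingest pipelines):

```json
PUT /_ingest/pipeline/drop-unused
{
  "description": "Drop fields we never query",
  "processors": [
    { "remove": { "field": ["456"], "ignore_missing": true } }
  ]
}
```

You would then index with `?pipeline=drop-unused`, or set it as the index's `index.default_pipeline`.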

Another solution could be to use a flattened field type.
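In OpenSearch the type is called `flat_object` (the analogue of Elasticsearch's `flattened`). Assuming your domain runs a version that supports it (OpenSearch 2.7+), an object mapped this way is stored as a single field and its subfields are not added to the mapping, so they don't count toward the field limit. A sketch, with a hypothetical index and field name:

```json
PUT /zonade-flat
{
  "mappings": {
    "properties": {
      "metadata": { "type": "flat_object" }
    }
  }
}
```

The trade-off is that `flat_object` subfields are not individually analyzed, so this fits best for fields you only filter or retrieve rather than full-text search.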