Glad to hear you share in the excitement, and thanks for your question, Kiran. Could you please expand on your use case or the problem you are trying to solve? Also, can you elaborate on what you mean by tokenizing / de-tokenizing data?
Hi - We have a security requirement to tokenize data such as personal identity data when it is stored on disk, so that anyone who gains access to the disk cannot read it, because the data is scrambled/tokenized using a key. At the same time, when the data needs to be read via OpenSearch Dashboards, we need the ability to de-tokenize/descramble it back to the original text using the key and present it to the user (a minimal sketch of the idea follows below).
In addition, it would also help if OpenSearch could store data on disk at rest in an encrypted format.
I see the Amazon OpenSearch Service providing this capability; how can we get encryption of data at rest for OpenSearch on Azure? Any inputs would be appreciated.
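To make the tokenize/de-tokenize requirement concrete, here is a minimal sketch of reversible field-level tokenization, assuming AES-GCM from the Python `cryptography` package; the in-process key generation and the sample values are illustrative only, and a real deployment would pull keys from a KMS/HSM:

```python
# Minimal illustration of tokenize/de-tokenize at the field level using
# AES-GCM. Key handling here is illustrative; production systems should
# source keys from a KMS/HSM rather than generating them in process.
import base64
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # illustrative key
aead = AESGCM(key)

def tokenize(plaintext: str) -> str:
    """Scramble a PII value before it is written to disk/index."""
    nonce = os.urandom(12)
    ciphertext = aead.encrypt(nonce, plaintext.encode("utf-8"), None)
    return base64.urlsafe_b64encode(nonce + ciphertext).decode("ascii")

def detokenize(token: str) -> str:
    """Recover the original value when presenting it in a dashboard."""
    raw = base64.urlsafe_b64decode(token)
    nonce, ciphertext = raw[:12], raw[12:]
    return aead.decrypt(nonce, ciphertext, None).decode("utf-8")

token = tokenize("jane.doe@example.com")
assert detokenize(token) == "jane.doe@example.com"
```

The token that lands on disk is opaque without the key, while anyone holding the key can recover the original value for display.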
We have a solution (an OpenSearch plugin) that encrypts all sensitive data before OpenSearch indexes it, and it fulfills all searches normally with <8% overhead on ingest and <3% overhead on search.
Check us out. If you like what you see, you can schedule a consultation on our website or email me at email@example.com.
We have a proxy that sits in front of OpenSearch and handles tokenization of fields that are configured as protected. It is primarily targeted at multi-tenant applications, so the tokenization uses different keys per tenant to prevent data leakage across tenants.
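As a rough sketch of that per-tenant idea (the names PROTECTED_FIELDS, TENANT_KEYS, and protect_document are hypothetical, not the proxy's actual API):

```python
# Illustrative sketch of per-tenant field tokenization: fields configured
# as protected are tokenized with a key chosen by tenant before the
# document is forwarded to OpenSearch, so tokens from one tenant cannot
# be de-tokenized with another tenant's key. All names are hypothetical.
import base64
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

PROTECTED_FIELDS = {"email", "national_id"}
TENANT_KEYS = {t: AESGCM.generate_key(bit_length=256) for t in ("tenant-a", "tenant-b")}

def tokenize_field(key: bytes, value: str) -> str:
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, value.encode("utf-8"), None)
    return base64.urlsafe_b64encode(nonce + ciphertext).decode("ascii")

def protect_document(tenant_id: str, doc: dict) -> dict:
    """Tokenize protected fields before forwarding the document to OpenSearch."""
    key = TENANT_KEYS[tenant_id]  # per-tenant key prevents cross-tenant leakage
    return {field: tokenize_field(key, value) if field in PROTECTED_FIELDS else value
            for field, value in doc.items()}

print(protect_document("tenant-a", {"user": "jdoe", "email": "jdoe@example.com"}))
```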