Question Answering Model in 2.13.0

Hi all,

I saw the below in the documentation. I have three questions:

  1. Which model is this? Is it available on Hugging Face so that we can easily fine-tune it?
  2. How do we register other Hugging Face Q/A models?
  3. Calling the model only returns the answer text without the position of the answer in the context. Is there a way to return the position?

Thanks

POST /_plugins/_ml/models/_register
{
    "name": "question_answering",
    "version": "1.0.0",
    "function_name": "QUESTION_ANSWERING",
    "description": "test model",
    "model_format": "TORCH_SCRIPT",
    "model_group_id": "lN4AP40BKolAMNtR4KJ5",
    "model_content_hash_value": "e837c8fc05fd58a6e2e8383b319257f9c3859dfb3edc89b26badfaf8a4405ff6",
    "model_config": { 
        "model_type": "bert",
        "framework_type": "huggingface_transformers"
    },
    "url": "https://github.com/opensearch-project/ml-commons/blob/main/ml-algorithms/src/test/resources/org/opensearch/ml/engine/algorithms/question_answering/question_answering_pt.zip?raw=true"
}

Hi @asfoorial, to answer your questions:

  1. Yes, the model is available on Hugging Face; here is the link to it: mrm8488/electra-small-finetuned-squadv2 · Hugging Face
    It is a small model, though, so using a larger QA model such as distilbert/distilbert-base-cased-distilled-squad · Hugging Face would yield more accurate answers.
  2. Yes. To use any Hugging Face QA model, it first needs to be traced to TorchScript or ONNX format and then registered to the OpenSearch cluster. We are planning to make a few pre-trained QA models available for direct use by the next release.
  3. In the current implementation, we do not return the position of the answer. It could be a good enhancement to consider in future releases.
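For anyone curious what "traced to TorchScript" involves in practice, here is a minimal `torch.jit.trace` sketch. It uses a tiny stand-in module (not a real QA model, and not the exact steps the OpenSearch team used) so it runs without downloading anything; for an actual model you would instead load it with `AutoModelForQuestionAnswering.from_pretrained(..., torchscript=True)` and trace it with tokenized example inputs. The resulting `.pt` file is what goes into the zip that gets registered to the cluster.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a QA model: takes token ids and returns
# start/end logits, the same output shape HF QA models produce.
class ToyQAModel(nn.Module):
    def __init__(self, vocab_size=100, hidden=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.qa_outputs = nn.Linear(hidden, 2)  # one column each for start/end

    def forward(self, input_ids):
        hidden_states = self.embed(input_ids)
        logits = self.qa_outputs(hidden_states)
        start_logits, end_logits = logits.split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)

model = ToyQAModel().eval()
example_input = torch.randint(0, 100, (1, 32))  # a batch of token ids

# Trace with example inputs and save the TorchScript artifact.
traced = torch.jit.trace(model, example_input)
traced.save("question_answering.pt")

# Sanity check: the reloaded traced model matches the eager model.
loaded = torch.jit.load("question_answering.pt")
with torch.no_grad():
    eager_start, eager_end = model(example_input)
    traced_start, traced_end = loaded(example_input)
```

After tracing, the zip for registration would also bundle the tokenizer files so the cluster can preprocess text, and the `model_content_hash_value` in the register request is the SHA-256 of that zip.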

Thanks @rbhavna. Good to know that more models will be supported. Would you mind sharing the steps you followed to trace the model?