Default search pipeline support with sparse encoding indices

OpenSearch 2.11.0

Hi all,

I was wondering if sparse neural search has any support for a default search pipeline (neural_query_enricher)? I created the ingest and search pipelines shown below and tried to search, but got an error requiring me to pass a model ID.

PUT /_ingest/pipeline/nlp-ingest-pipeline-sparse
{
  "description": "A sparse encoding ingest pipeline",
  "processors": [
    {
      "sparse_encoding": {
        "model_id": "7v52UYsBsF8TdmF-f-F3",
        "field_map": {
          "passage_text": "passage_embedding"
        }
      }
    }
  ]
}

PUT /_search/pipeline/default_model_pipeline
{
  "request_processors": [
    {
      "neural_query_enricher": {
        "default_model_id": "7v52UYsBsF8TdmF-f-F3"
      }
    }
  ]
}

PUT /sparse_index2
{
  "settings": {
    "default_pipeline": "nlp-ingest-pipeline-sparse",
    "index.search.default_pipeline": "default_model_pipeline"
  },
  "mappings": {
    "properties": {
      "id": {
        "type": "text"
      },
      "passage_embedding": {
        "type": "rank_features"
      },
      "passage_text": {
        "type": "text"
      }
    }
  }
}

PUT /sparse_index2/_doc/1
{
  "passage_text": "Arya Stark is a key character in GOT",
  "id": "s1"
}

GET sparse_index2/_search
{
  "_source": "passage_text",
  "query": {
    "neural_sparse": {
      "passage_embedding": {
        "query_text": "who killed the night king",
        "max_token_score": 2
      }
    }
  }
}

But I got the following error:

model_id field must be provided for [neural_sparse] query

@modelcollapse Charlie, can you help with this?

@asfoorial One small suggestion: you can format the request bodies for easier reading.

Hi @asfoorial, in 2.11 we only released search pipeline (neural_query_enricher) support for neural (k-NN) search; search pipeline support for neural sparse queries will be added in 2.12.
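
Until then, a workaround should be to pass the model_id explicitly inside the neural_sparse query clause on 2.11, for example (reusing the model ID from your ingest pipeline above):

GET sparse_index2/_search
{
  "_source": "passage_text",
  "query": {
    "neural_sparse": {
      "passage_embedding": {
        "query_text": "who killed the night king",
        "model_id": "7v52UYsBsF8TdmF-f-F3",
        "max_token_score": 2
      }
    }
  }
}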
