Abstract
Recent advances in deep learning-based retrieval have established models such as SPLADE, a prominent variant of Learned Sparse Retrieval (LSR), as state-of-the-art approaches, particularly on benchmarks such as TREC Product Search. However, conventional LSR models (e.g., SPLADE) often fail to capture the semantics of negated query terms and their lexical variations, producing retrieval results that misinterpret user intent, especially for queries containing negations such as 'without'. To overcome this challenge, we propose a novel activation function, the Negative Gaussian Linear Unit (NeGLU), designed to assign negative weights that better represent the semantics of negated terms. Integrating NeGLU into an LSR framework improves the model's ability to identify and suppress negatively influential terms, thereby enhancing ranking performance. Extensive experiments on the TREC 2023 Product Search test collection demonstrate the effectiveness of our approach, which achieves an NDCG@100 score of 0.745, significantly outperforming existing methods. To further validate its robustness, we evaluated NeGLU on 186 test queries, including 54 involving negation, where it showed consistent gains, with an average improvement rate of 26.95% overall and 35.
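The abstract contrasts conventional LSR term weighting, which (as in SPLADE) is non-negative by construction, with an activation that can emit negative weights for negated terms. The exact NeGLU formula is not given in this excerpt; the sketch below is purely illustrative, using SPLADE's standard log(1 + ReLU(x)) weighting and a hypothetical Gaussian-gated unit that, unlike ReLU, preserves sign so that a negated term ("without X") can actively penalize documents containing X.

```python
import numpy as np

def splade_weight(x):
    # SPLADE-style term weight: log(1 + ReLU(x)).
    # Always >= 0, so a negated term can at best be ignored, never penalized.
    return np.log1p(np.maximum(x, 0.0))

def neglu_like_weight(x, sigma=1.0):
    # Hypothetical NeGLU-like activation (assumption, not the paper's
    # definition): a Gaussian-gated linear unit x * exp(-x^2 / (2*sigma^2)).
    # It keeps the sign of x, so a term the encoder scores negatively
    # (e.g. 'sugar' in the query 'snacks without sugar') receives a
    # negative weight and suppresses matching documents at ranking time.
    return x * np.exp(-(x ** 2) / (2.0 * sigma ** 2))
```

With this form, a negative encoder logit yields a negative term weight (e.g. `neglu_like_weight(-1.0)` is about `-0.61`), whereas `splade_weight(-1.0)` is exactly `0`, illustrating why a ReLU-based model can only ignore, not counteract, negated terms.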
| | |
|---|---|
| Original language | English |
| Pages (from-to) | 144492-144504 |
| Number of pages | 13 |
| Journal | IEEE Access |
| Volume | 13 |
| DOIs | |
| State | Published - 2025 |
Keywords
- deep learning-based ranking
- Learned sparse retrieval
- negation-aware retrieval
- negative weighting in retrieval
- product search optimization
Title
NeGLU: Negation-Aware Sparse Retrieval With Negative Weights for Product Search