The Power of Selecting Key Blocks with Local Pre-ranking for Long Document Information Retrieval
On a wide range of natural language processing and information retrieval tasks, transformer-based models, particularly pre-trained language models like BERT, have demonstrated tremendous effectiveness. Due to the quadratic complexity of the self-attention mechanism, however, such models have difficulty processing long documents...
Main Authors: Minghan Li, Diana Nicoleta Popa, Johan Chagnon, Yagmur Gizem Cinar, Éric Gaussier
Format: Article
Language: English
Published: Association for Computing Machinery, 2023
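The abstract breaks off before describing the method, but the title names the core technique: split a long document into blocks, pre-rank the blocks locally against the query with a cheap scorer, and pass only the selected key blocks to an expensive transformer reranker, sidestepping self-attention's quadratic cost on the full document. Below is a minimal sketch of that idea, assuming a simple term-frequency overlap as the local pre-ranker; the block size, the `top_k` value, and all function names are illustrative placeholders, not the paper's actual configuration.

```python
# Sketch: key-block selection with a cheap local pre-ranker.
# The overlap scorer stands in for whatever lightweight ranking
# function precedes the expensive BERT pass; it is an assumption,
# not the paper's scorer.
from collections import Counter


def split_into_blocks(tokens, block_size=64):
    """Cut a token sequence into consecutive fixed-size blocks."""
    return [tokens[i:i + block_size] for i in range(0, len(tokens), block_size)]


def local_score(query_tokens, block):
    """Cheap pre-ranking score: how often query terms occur in the block."""
    tf = Counter(block)
    return sum(tf[t] for t in set(query_tokens))


def select_key_blocks(query, document, block_size=64, top_k=4):
    """Return the top_k highest-scoring blocks, kept in document order."""
    q = query.lower().split()
    blocks = split_into_blocks(document.lower().split(), block_size)
    ranked = sorted(range(len(blocks)),
                    key=lambda i: local_score(q, blocks[i]),
                    reverse=True)
    kept = sorted(ranked[:top_k])  # restore reading order for the reranker
    return [" ".join(blocks[i]) for i in kept]


if __name__ == "__main__":
    long_doc = ("filler text about unrelated topics " * 40
                + "selecting key blocks keeps transformer input short "
                + "filler text about unrelated topics " * 40)
    for block in select_key_blocks("key blocks transformer",
                                   long_doc, block_size=16, top_k=2):
        print(block)
```

The selected blocks would then be concatenated and fed to a BERT-style reranker, so the transformer only ever sees a short, query-relevant slice of the original long document.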