[Survey Column] A Detailed Look at Recent Advances in Neural Information Retrieval
In scientific research, sound methodology means "seeing the forest before the trees." AI research today is flourishing and the technology is advancing rapidly; for practitioners, systematically mapping out the landscape of this vast forest of knowledge is the only way to grasp where the field is heading. To that end, we curate outstanding survey articles from China and abroad in this "Survey Column" — stay tuned.
Source: Zhihu — Gordon Lee
Translated from a Medium blog post by
https://medium.com/@mhammadkhan?source=post_page-----c0a67278f626--------------------------------
Original article:
https://medium.com/@mhammadkhan/neural-re-ranking-models-c0a67278f626
Translator's note (@Gordon Lee): the title has been changed; the original post is titled "Neural IR Models". It surveys several families of classic and recent methods in neural information retrieval, and the papers it covers are worth reading closely.
01
02
Dense Passage Retriever (DPR)
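DPR scores a passage by the dot product between a query vector and a passage vector produced by two separate BERT encoders, and trains with in-batch negatives: every other passage in the batch serves as a negative for a given query. A minimal runnable sketch of that scoring and loss, with random vectors standing in for the encoder outputs:

```python
import numpy as np

rng = np.random.default_rng(0)
B, DIM = 4, 16  # batch of 4 (query, positive passage) pairs

# Stand-in embeddings; in DPR these come from two BERT encoders
# (one for queries, one for passages), typically the [CLS] vector.
q_emb = rng.normal(size=(B, DIM))   # query embeddings
p_emb = rng.normal(size=(B, DIM))   # positive-passage embeddings

# Relevance is the dot product; with in-batch negatives, scores[i, j]
# treats passage j as a candidate for query i, positives on the diagonal.
scores = q_emb @ p_emb.T            # (B, B)

# Softmax cross-entropy with the diagonal as the correct class.
log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))
print(round(float(loss), 4))
```

At inference time only the dot product matters, so passage vectors can be pre-computed and indexed once.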
Approximate Nearest Neighbor Negative Contrastive Estimation (ANCE)
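ANCE's central change is where the negatives come from: it periodically retrieves each training query's top-ranked passages from the full corpus using an approximate-nearest-neighbor index over the current encoder's embeddings, and takes the non-positive ones as hard negatives. A minimal sketch of that selection step (exact dot-product search stands in for the ANN index; the corpus vectors and the gold label are made-up stand-ins):

```python
import numpy as np

rng = np.random.default_rng(1)
DIM, N_CORPUS = 16, 100

# Corpus passage embeddings as produced by the current encoder
# checkpoint (random stand-ins here).
corpus = rng.normal(size=(N_CORPUS, DIM))
query = rng.normal(size=(DIM,))
positive_id = 7  # assumed gold passage id for this query

# Retrieve the query's current nearest neighbors over the whole corpus
# and keep the top-scoring non-positives as hard negatives for training.
scores = corpus @ query            # dense retrieval scores
ranked = np.argsort(-scores)       # best-first
hard_negatives = [i for i in ranked[:10] if i != positive_id][:5]
print(hard_negatives)
```

Because the encoder keeps changing during training, ANCE refreshes the corpus index asynchronously so the mined negatives stay hard for the current model.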
03
Contextualized Late Interaction over BERT (ColBERT)
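ColBERT keeps one embedding per token rather than one per sequence, and scores with "late interaction": each query token takes its maximum similarity over all document tokens (MaxSim), and those maxima are summed. A minimal sketch with random unit vectors standing in for the BERT token embeddings:

```python
import numpy as np

rng = np.random.default_rng(2)
DIM = 16
Lq, Ld = 5, 20  # number of query tokens, document tokens

def normalize(x):
    """L2-normalize the last axis, as ColBERT does before scoring."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Stand-ins for per-token contextualized embeddings from BERT.
Q = normalize(rng.normal(size=(Lq, DIM)))   # query token embeddings
D = normalize(rng.normal(size=(Ld, DIM)))   # document token embeddings

# Late interaction: all-pairs token similarities, then MaxSim per
# query token, then sum over the query.
sim = Q @ D.T                 # (Lq, Ld) cosine similarities
score = sim.max(axis=1).sum()
print(round(float(score), 4))
```

Document token embeddings can still be pre-computed offline, so the model keeps much of a dual encoder's efficiency while recovering token-level matching.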
04
05
Condenser
TaCL
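TaCL adds a token-level contrastive objective during BERT pre-training: each token's student representation is pulled toward the teacher's representation of the same token and pushed away from the teacher's representations of the other tokens in the sequence. A minimal sketch of that loss, with random vectors standing in for the two models' token states (the temperature here is an illustrative choice, not the paper's setting):

```python
import numpy as np

rng = np.random.default_rng(3)
L, DIM = 6, 16  # sequence length, hidden size

# Stand-ins for per-token hidden states of the student and teacher.
student = rng.normal(size=(L, DIM))
teacher = rng.normal(size=(L, DIM))

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

# Token-aware contrast: position i's positive is the teacher's state at
# the same position i; the other positions are in-sequence negatives.
sim = normalize(student) @ normalize(teacher).T   # (L, L) cosines
temperature = 0.05                                # illustrative value
logits = sim / temperature
log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -np.mean(np.diag(log_probs))               # positives on diagonal
print(round(float(loss), 4))
```

The motivation is that plain BERT token representations become anisotropic and overly similar to one another; this objective encourages tokens to stay discriminable, which helps token-level matching in retrieval.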
06
References
[1] Yixuan Su, Fangyu Liu, Zaiqiao Meng, Lei Shu, Ehsan Shareghi, and Nigel Collier. TaCL: Improving BERT pre-training with token-aware contrastive learning, 2021.
[2] Lee Xiong, Chenyan Xiong, Ye Li, Kwok-Fung Tang, Jialin Liu, Paul N. Bennett, Junaid Ahmed, and Arnold Overwijk. Approximate nearest neighbor negative contrastive learning for dense text retrieval. CoRR, abs/2007.00808, 2020.
[3] Rodrigo Nogueira and Kyunghyun Cho. Passage re-ranking with BERT. CoRR, abs/1901.04085, 2019.
[4] Jimmy Lin, Rodrigo Nogueira, and Andrew Yates. Pretrained transformers for text ranking: BERT and beyond. CoRR, abs/2010.06467, 2020.
[5] Sebastian Hofstätter, Sheng-Chieh Lin, Jheng-Hong Yang, Jimmy Lin, and Allan Hanbury. Efficiently teaching an effective dense retriever with balanced topic-aware sampling. CoRR, abs/2104.06967, 2021.
[6] Omar Khattab and Matei Zaharia. ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT, pages 39–48. Association for Computing Machinery, New York, NY, USA, 2020.
[7] Luyu Gao and Jamie Callan. Is your language model ready for dense representation fine-tuning? CoRR, abs/2104.08253, 2021.
[8] Sebastian Hofstätter, Sophia Althammer, Michael Schröder, Mete Sertkan, and Allan Hanbury. Improving efficient neural ranking models with cross-architecture knowledge distillation. CoRR, abs/2010.02666, 2020.
[9] Google. Google search understanding using BERT. https://blog.google/products/search/search-language-understanding-bert/.
This article is shared for academic exchange only; it does not imply that this account endorses its views or takes responsibility for the accuracy of its content. Copyright remains with the original author; if there is any infringement, please contact us for removal.