ColBERT Build from Sentence Transformers

Neural-Cherche is a versatile library for fine-tuning and running neural search models such as Splade, ColBERT, and SparseEmbed on domain-specific datasets. It provides classes for efficient inference with a fine-tuned retriever or ranker, and it can save computed embeddings to avoid redundant work. Installation is as simple as pip install neural-cherche, with optional evaluation extras available via pip install "neural-cherche[eval]". The library is comprehensively documented and ships with a quick start guide, including a code example that fine-tunes ColBERT from a Sentence Transformer pre-trained checkpoint. References to the related research papers are also provided.

https://github.com/raphaelsty/neural-cherche
