Go library for in-process vector search and embeddings with llama.cpp

This semantic search library simplifies embedding generation and vector search for small to medium-sized projects. It uses brute-force search accelerated with SIMD optimizations, and its key features include calling llama.cpp without cgo, support for BERT-style embedding models, precompiled binaries with GPU acceleration, and search index creation. The trade-off is that brute-force search scales poorly to large datasets, and the library lacks advanced query capabilities. The documentation walks through loading a model, generating text embeddings, creating a search index, and performing a search, and explains how to compile the library on Linux or Windows, with optional GPU support. Notable details include the library's focus on simplicity despite supporting sophisticated embedding models, along with template loading and a simplified C interop layer.
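To illustrate the core technique the library relies on, here is a minimal, self-contained Go sketch of brute-force vector search with cosine similarity. This is not the library's actual API (its real interface and SIMD internals live in the repository below); the `cosine` and `search` functions and the toy 3-dimensional "embeddings" are illustrative stand-ins for real model output.

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// result pairs an indexed item with its similarity score for the query.
type result struct {
	Text  string
	Score float64
}

// cosine returns the cosine similarity of two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// search scans every entry (brute force) and returns the top-k matches,
// sorted by descending similarity. Libraries like kelindar/search speed up
// exactly this kind of scan with SIMD instructions.
func search(entries map[string][]float64, query []float64, k int) []result {
	results := make([]result, 0, len(entries))
	for text, vec := range entries {
		results = append(results, result{text, cosine(vec, query)})
	}
	sort.Slice(results, func(i, j int) bool { return results[i].Score > results[j].Score })
	if k < len(results) {
		results = results[:k]
	}
	return results
}

func main() {
	// Toy 3-dimensional vectors stand in for real text embeddings.
	entries := map[string][]float64{
		"cat": {0.9, 0.1, 0.0},
		"dog": {0.8, 0.2, 0.1},
		"car": {0.0, 0.1, 0.9},
	}
	for _, r := range search(entries, []float64{1, 0, 0}, 2) {
		fmt.Printf("%s %.3f\n", r.Text, r.Score)
	}
}
```

In a real deployment the vectors would come from an embedding model (e.g. a BERT-style GGUF model run through llama.cpp), and the linear scan is exactly why brute-force search becomes a bottleneck on large datasets.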

https://github.com/kelindar/search
