llama.vim – Local LLM-assisted text completion

llama.vim provides local LLM-assisted text completion in Vim, backed by a llama.cpp server running on the user's own machine. Suggestions can be toggled on and off, accepted in full or in part with keyboard shortcuts, and tuned through the plugin's text generation settings. The amount of surrounding context sent to the server is configurable, and context is cached and reused across requests to keep latency low even on low-end hardware. Installation is straightforward with popular plugin managers (a setup sketch follows below), and the recommended llama.cpp server settings depend on available VRAM: larger models for machines with plenty of memory, smaller ones for constrained setups. Because completion works by infilling, the plugin requires models trained with fill-in-the-middle (FIM) support. Even on consumer-grade hardware, llama.vim can deliver high-quality completions, and the plugin is lightweight enough to stay responsive in large codebases. Implementation details and companion plugins for other editors are linked from the repository below.
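A minimal setup sketch, assuming vim-plug as the plugin manager. The g:llama_config dictionary, its endpoint key, and the server invocation follow the repository's README at the time of writing; the exact flags, model, and default port 8012 may differ between versions, so treat them as illustrative:

```vim
" Minimal llama.vim setup (illustrative; option names follow the repository
" README at the time of writing, verify against your plugin version).

" Install with vim-plug:
call plug#begin()
Plug 'ggml-org/llama.vim'
call plug#end()

" The plugin talks to a local llama.cpp server, started separately with a
" FIM-capable model, for example:
"   llama-server -hf ggml-org/Qwen2.5-Coder-1.5B-Q8_0-GGUF \
"                --port 8012 -ngl 99 --ctx-size 0 --cache-reuse 256
" Point llama.vim at that server's infill endpoint:
let g:llama_config = {
    \ 'endpoint': 'http://127.0.0.1:8012/infill',
    \ }
```

Keymaps for accepting, rejecting, or manually triggering suggestions are likewise set through g:llama_config; the defaults are documented in the README.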

https://github.com/ggml-org/llama.vim
