I’ve built a locally running Perplexity clone

LLocalSearch is a locally running search engine that uses LLM agents to answer user questions without requiring OpenAI or Google API keys. It runs on low-end LLM hardware and shows progress logs, so you can see what the agents are doing at each step. The web interface is mobile friendly and supports follow-up questions. Setup requires a running Ollama server and Docker Compose: clone either the latest release or the current git version and follow the instructions in the repository. The project is still in early development, so expect bugs, and newer features may be less stable.

https://github.com/nilsherzig/LLocalSearch
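
For reference, deployment usually boils down to a few commands. This is a minimal sketch rather than the authoritative instructions: the exact environment variables, the compose command variant, and the frontend port are assumptions here, so check the repository README for the real steps.

    # assumes Docker Compose is installed and an Ollama server is already
    # running and reachable from inside the containers
    git clone https://github.com/nilsherzig/LLocalSearch.git
    cd LLocalSearch
    # review the env vars in the compose file first (e.g. the Ollama host
    # address), then start the stack; use "docker compose up" on newer
    # Docker installs
    docker-compose up
    # open the web interface in a browser afterwards (commonly
    # http://localhost:3000 -- the README states the actual port)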
