Guide to running Llama 2 locally

Llama 2 can run locally on a range of devices: M1/M2 Macs, Windows, Linux, and even smartphones. One advantage of running it locally is that no internet connection is required. Several open-source tools already support this:

- llama.cpp, which runs on Mac, Windows, and Linux; on an M1/M2 Mac it can be installed with a single one-liner command.
- Ollama, a macOS app for running and interacting with Llama 2 through a command-line interface.
- MLC LLM, which runs Llama 2 on iOS and Android, though it is still in beta.

For those who would rather not run models on their own hardware, Replicate provides a platform for running machine learning models, including Llama 2, in the cloud. Users are also encouraged to join the Discord community and share what they build.
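As a rough sketch, the first two local options come down to a couple of terminal commands. The installer URL and model name below are taken on the assumption they match the Replicate post's instructions, so treat them as illustrative rather than verified:

```shell
# llama.cpp on an M1/M2 Mac: a one-liner (as described in the Replicate
# post) that clones the repo, builds it, and fetches a 7B chat model.
curl -L "https://replicate.fyi/install-llama-cpp" | bash

# Ollama on macOS: after installing the app, pull and chat with Llama 2
# from the command line ("llama2" is Ollama's name for the model).
ollama run llama2
```

Both commands end in an interactive prompt where you can chat with the model directly in the terminal.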

https://replicate.com/blog/run-llama-locally
