Ollama is now available as an official Docker image

We are excited to announce that Ollama is now available as an official Docker image, sponsored by Docker. This makes it much easier to run Ollama's large language models inside Docker containers. All interactions with the models happen locally, so no private data is ever sent to third-party services.

On Mac, Ollama runs models with GPU acceleration when installed natively; to get started, download and install Ollama and run it alongside Docker Desktop for macOS. On Linux, Ollama can run with GPU acceleration inside Docker containers on Nvidia GPUs (see the example below).

You can find more models in the Ollama library, join the Discord, and follow Ollama on Twitter for updates.
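
On Linux with an Nvidia GPU, the typical workflow looks roughly like this (a minimal sketch assuming the NVIDIA Container Toolkit is already installed; the volume name, container name, and the llama2 model are illustrative):

    # Start the Ollama container with GPU access, persisting downloaded models in a named volume
    docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

    # Pull and chat with a model inside the running container
    docker exec -it ollama ollama run llama2

The API is then served locally on port 11434, so prompts and model outputs never leave your machine.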

https://ollama.ai/blog/ollama-is-now-available-as-an-official-docker-image
