Ollama is now available on Windows in preview

Ollama is now available on Windows in preview, bringing large language models to Windows with built-in GPU acceleration and access to the full model library, with no virtualization required. Vision models can be run by dragging and dropping images into the Ollama interface. The Ollama API runs automatically in the background, so tools and applications can connect to it seamlessly, and OpenAI-compatible endpoints let existing OpenAI tooling work with local models. To get started, download and install Ollama on Windows, then run a model from the terminal. Feedback is welcome via Discord or issue reports.
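As a rough illustration of talking to the background API, here is a minimal Python sketch that builds a request for Ollama's `/api/generate` endpoint on the default local port 11434; the model name `llama2` and the prompt are placeholders, and the actual network call is left commented out so it only runs against a live Ollama install.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API address


def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,     # any locally pulled model, e.g. "llama2"
        "prompt": prompt,
        "stream": False,    # ask for a single JSON response, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


req = build_generate_request("llama2", "Why is the sky blue?")
# With Ollama running locally, the reply could be read like this:
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```

Because the API also exposes OpenAI-compatible endpoints, the same server can be pointed at by existing OpenAI client libraries by changing their base URL to the local address.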

https://ollama.com/blog/windows-preview
