New package for factory: ollama
Hi all!

I've created a package[0] for Ollama[1], and I'd like to submit it to Factory. Ollama is a wrapper around the llama.cpp LLM inference engine, and IMO it's the easiest way to get started with running local LLMs.

This package does *not* include any models; those must be downloaded manually after Ollama is installed. Furthermore, since neither CUDA nor ROCm is in Factory (and CUDA cannot be added, as it's nonfree), this package has no GPU acceleration.

Maintenance should be fairly straightforward: since Ollama is written in Go, it has very few dependencies, which reduces the chance of the build breaking. It does see new releases on roughly a weekly schedule, but most of those releases contain only minor changes or simply update the embedded llama.cpp build.

This is my first time seriously packaging and maintaining software for openSUSE, and I just want to say I'm already enjoying it!

Have a lot of fun,
Loren

[0]: https://build.opensuse.org/package/show/science:machinelearning/ollama
[1]: https://ollama.com
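For anyone trying the package, fetching a model after installation looks roughly like this (the model name below is illustrative, not something shipped by the package; check the Ollama model library for what's actually available):

```
$ ollama pull llama3.2        # download a model (name illustrative)
$ ollama run llama3.2         # start an interactive chat with it
```

The `ollama` service must be running before these commands will work.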
participants (1)
-
Loren Burkholder