After months of Ollama only seeing my CPU (maybe due to my Vulkan or ROCm setup), it finally occurred to me to try the Docker image, and it's working perfectly on my GPU now! https://docs.ollama.com/docker
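For anyone else trying this, the AMD invocation from the Ollama Docker docs looks roughly like this (the device paths and image tag are as documented; the volume and container names are just examples):

```shell
# Run Ollama with AMD GPU support via the dedicated ROCm image.
# /dev/kfd and /dev/dri are the AMD compute and render device nodes.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```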
It could be permissions: my user is in the render and video groups, but the Docker container runs as root.
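A quick way to check that theory on the host (the device paths assume an AMD setup, and the render node number can differ per machine):

```shell
# Show which groups own the GPU device nodes (typically video/render on AMD);
# suppress the error if the nodes don't exist on this machine
ls -l /dev/kfd /dev/dri/renderD* 2>/dev/null || true
# Show the supplementary groups your user actually belongs to
id -nG
```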
I’m running my containers in Podman Desktop (Bazzite, Plasma). It’s daemonless and rootless. Ollama + Open WebUI + SearXNG. On Ollama container creation I had to explicitly specify the (AMD) GPU. Podman is awesome, highly recommend!
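For reference, a rootless Podman run for AMD might look something like this; `--group-add keep-groups` carries your host's supplementary groups (render/video) into the rootless container, and the `:Z` relabels the volume for SELinux on Fedora-based systems like Bazzite. Names are just examples, not a definitive setup:

```shell
podman run -d \
  --device /dev/kfd \
  --device /dev/dri \
  --group-add keep-groups \
  -v ollama:/root/.ollama:Z \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```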
Interesting, I wasn’t prompted or anything. From the documentation there’s the ollama/ollama:latest image and the ollama/ollama:rocm image, but either way Ollama does its own hardware detection and silently falls back to CPU if anything even smells wrong, haha.
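One way to catch the silent fallback (the exact log wording varies across Ollama versions, so treat the grep pattern as a sketch):

```shell
# The startup logs report what compute Ollama detected
docker logs ollama 2>&1 | grep -iE 'gpu|rocm|cuda|cpu' | head
# With a model loaded, `ollama ps` shows whether it landed on GPU or CPU
docker exec ollama ollama ps
```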
I’ve noticed a few people are not using the official Ollama images 🤷
I wonder if rootless Podman would have the same permissions problem… something to try out.
RamaLama also works pretty well if you’re going the container route.
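If anyone wants to try it, the basic flow is something like this (model name is just an example; RamaLama pulls a runtime container matched to your GPU and runs the model inside it):

```shell
# Fetch a model (Ollama's registry is the default source)
ramalama pull tinyllama
# Chat with it in a container
ramalama run tinyllama
```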
Dingdong. Sorry. I had to.
Thanks for mentioning ramalama, I was not aware of that one yet!