after months of ollama seeing only my CPU (maybe due to my Vulkan or ROCm setup), it finally occurred to me to try the Docker image, and it’s working perfectly on my GPU now! https://docs.ollama.com/docker
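for reference, the linked docs suggest running the ROCm image with the AMD device nodes passed through; something like this (a sketch based on the docs, not my exact invocation):

```shell
# ROCm needs the AMD compute (/dev/kfd) and display (/dev/dri) devices
# passed into the container; data persists in the "ollama" volume
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```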

it could be permissions: my user is in the render and video groups, but the Docker container runs as root
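a quick way to check the permissions theory on the host (just a diagnostic sketch; your render node may have a different number than renderD128):

```shell
# which groups own the GPU device nodes on the host?
# typically /dev/kfd is root:render and /dev/dri/* is root:video or root:render
ls -l /dev/kfd /dev/dri/renderD* 2>/dev/null || true

# and does your user actually have those groups in its current session?
# (group changes only apply after re-login)
id -nG
```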

  • bobbbu@lemmy.dbzer0.com · 21 hours ago

    I’m running my containers in Podman Desktop (on Bazzite, Plasma). It’s daemonless and rootless. Ollama + Open WebUI + SearXNG. On Ollama container creation I had to specify using the (AMD) GPU. Podman is awesome, highly recommend!
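    on the CLI, the rough equivalent of what Podman Desktop sets up would be something like this (a sketch; device paths assume an AMD GPU with ROCm):

    ```shell
    # same device passthrough as with docker, but podman wants the
    # fully-qualified image name
    podman run -d \
      --device /dev/kfd \
      --device /dev/dri \
      -v ollama:/root/.ollama \
      -p 11434:11434 \
      --name ollama \
      docker.io/ollama/ollama:rocm
    ```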

    • jokeyrhyme@lemmy.mlOP (English) · 7 hours ago

      interesting, i wasn’t prompted or anything. from the documentation there’s the ollama/ollama:latest image and the ollama/ollama:rocm image, but either way ollama does its own detection and silently falls back to CPU if anything even smells wrong, haha

      i’ve noticed a few people are not using the official ollama images 🤷
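      because of that silent fallback, it’s worth verifying where a model actually landed rather than trusting the container started cleanly; a couple of checks i’d reach for (assuming the container is named “ollama”):

      ```shell
      # "ollama ps" lists loaded models with a PROCESSOR column
      # showing whether they're on GPU or CPU
      ollama ps

      # the startup logs also say which backend was detected
      docker logs ollama 2>&1 | grep -iE 'rocm|amdgpu|gpu'
      ```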

    • jokeyrhyme@lemmy.mlOP (English) · 7 hours ago

      i wonder if rootless podman would have the same permissions problem … something to try out
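      one thing worth trying there: rootless podman drops the host user’s supplementary groups by default, but there’s a keep-groups option (crun-specific, per the podman-run docs) that leaks them into the container, which should cover the render/video case. a sketch:

      ```shell
      # --group-add keep-groups carries the host user's supplementary
      # groups (e.g. render, video) into the rootless container
      podman run -d \
        --group-add keep-groups \
        --device /dev/kfd \
        --device /dev/dri \
        -p 11434:11434 \
        docker.io/ollama/ollama:rocm
      ```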