petsoi@discuss.tchncs.de to Linux@lemmy.ml · 1 month ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
lelgenio@lemmy.ml · 1 month ago
Ollama runs on the 6700 XT, but you need to add an environment variable for it to work… I just don’t remember what it was and am away from my computer right now.
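The commenter does not recall the exact variable, so the following is only a commonly reported candidate, not a confirmation of what they used: the RX 6700 XT identifies as the gfx1031 ROCm target, which ROCm does not officially support, and the widely cited workaround is to override it to the supported gfx1030 target before starting Ollama.

```shell
# Assumption: this is the frequently reported fix, not the commenter's confirmed one.
# The RX 6700 XT is gfx1031; spoof the officially supported gfx1030 target
# so ROCm (and therefore Ollama) will use the GPU.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Then restart Ollama so it picks up the variable, e.g.:
# systemctl restart ollama
```

If Ollama runs as a systemd service, the variable would instead go in the service's environment (e.g. via a drop-in file) rather than an interactive shell.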