turkishdelight@lemmy.ml to LocalLLaMA@sh.itjust.works · English · 11 months ago
Ollama now supports AMD graphics cards (ollama.com)
But in all fairness, it’s really llama.cpp that supports AMD. Now looking forward to the Vulkan support!
Alex@lemmy.ml · 11 months ago
I was sadly stymied by the fact that the ROCm driver install is very much x86-only.
turkishdelight@lemmy.ml (OP) · 11 months ago
It's improving very fast. Give it a little time.