
Running LLMs on AMD GPUs is fucking easy now, performance same as Nvidia


Date: June 2nd, 2025 8:58 PM
Author: mauve adventurous giraffe

I snagged a 7600 XT with 16 GB of VRAM for $300. Stuck it in a Proxmox server, blacklisted the AMD drivers, passed the GPU through to an OpenWebUI LXC, installed the AMD drivers in the container, installed ollama in the container, and now the shit just works. Need to get ComfyUI going next.
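
Not claiming this is OP's exact setup, but here's a minimal sketch of how you'd sanity-check it from inside the container once everything is installed. It assumes ollama is listening on its default port (11434) and that you've already pulled some model (llama3.1:8b below is just a placeholder).

```python
# Smoke test for the setup described above: checks that the GPU device nodes
# made it into the LXC and that ollama answers on its default port.
import json
import os
import urllib.request

# ROCm needs /dev/kfd and /dev/dri visible inside the container; if they're
# missing, ollama quietly falls back to CPU.
for dev in ("/dev/kfd", "/dev/dri"):
    print(f"{dev}: {'present' if os.path.exists(dev) else 'MISSING'}")

# Ollama's REST API listens on localhost:11434 by default.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3.1:8b",  # placeholder; use whatever model you pulled
        "prompt": "Say hi in one word.",
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

If the device nodes show up and the generate call comes back fast, the passthrough is doing its job; if it crawls, it's probably running on CPU.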

(http://www.autoadmit.com/thread.php?thread_id=5732696&forum_id=2...id.#48982000)