Date: June 3rd, 2025 5:04 PM
Author: https://imgur.com/a/o2g8xYK
This is the 7600 XT with 16 GB of VRAM. For ~$330, this is probably the cheapest way to get 16 GB of VRAM on anything that can run LLMs.
The only reason this isn't discussed is because AMD didn't release good drivers for it until about a year ago. AMD's compute platform, ROCm, is essentially their answer to CUDA, but it took a while for it to become mature.
Now, not only is it mature, it arguably works better than CUDA. More importantly, it's open source and easy to get running on Linux. Linux users doing PCIe passthrough still have to install the AMD vendor-reset patch, and power management isn't great in Linux VMs: the card will keep drawing 200W for a good 15 minutes after you ask it to do something. Even if you unload the LLM from VRAM, it still draws 200W for 15 minutes, just generating waste heat for no reason.
But Windows in a VM manages power just fine with PCIe passthrough. At idle it only consumes about 20W, the same as Nvidia. For all I know it's managed better in some distro like Fedora, but I don't care to find out. That's where you come in. Buy this cheap-ass graphics card and run big(-ish) LLMs offline in whatever PC you've got. For <$350 it's unbeatable.
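To get a sense of what "big(-ish)" means on 16 GB, here's a rough back-of-envelope estimate. This is a sketch, not a benchmark: the 20% overhead factor for KV cache and runtime buffers is an assumption, and real usage varies with context length and runtime.

```python
def vram_needed_gb(params_billion, bits_per_weight, overhead=1.2):
    """Rough VRAM estimate for running a quantized LLM.

    weights take (params * bits / 8) bytes; the 1.2x overhead factor
    (assumed) covers KV cache and runtime buffers at modest context.
    Note 1B params at 1 byte each is ~1 GB, so the math stays in GB.
    """
    weight_gb = params_billion * bits_per_weight / 8
    return weight_gb * overhead

# A 14B model at 4-bit quantization:
print(round(vram_needed_gb(14, 4), 1))  # 8.4 -> fits comfortably in 16 GB
# A 32B model at 4-bit:
print(round(vram_needed_gb(32, 4), 1))  # 19.2 -> too big for 16 GB
```

By this estimate, 16 GB comfortably fits 4-bit models in the 13B-20B range, which is exactly the tier most consumer cards with 8-12 GB can't touch.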
(http://www.autoadmit.com/thread.php?thread_id=5733030&forum_id=2#48984369)